National Undergraduate Electronic Design Contest

Overview

In the National Undergraduate Electronic Design Contest (Aug 2023), our team designed and built a laser motion-control and auto-tracking system, earning a provincial second prize. The contest required teams to design, build, and demonstrate a working system within four days. Our solution combined precise gimbal motion control with real-time laser-spot tracking, using on-board image processing and closed-loop feedback algorithms.

Results

  • Achievement: Provincial Second Prize; ranked second among our school's teams.
  • Performance Metrics:
    • Successfully implemented precise motion control with an average error of 1.1 cm during testing.
    • Achieved stable laser tracking performance with a maximum tracking error of 3.6 cm under moderate speeds.
  • Presentation: Delivered a working prototype meeting the core requirements under competitive time constraints.

[Report PDF (Chinese version)]

On-site assembly system demonstration.

Technical Details

  • System Architecture:
    • Two independent servo-driven gimbals controlled by an Arduino Mega2560 microcontroller.
    • Image processing ran on an OpenMV H7 camera module, which identified and tracked the red and green laser spots.
  • Algorithms:
    • Coordinate Mapping: Developed a nonlinear mapping between screen coordinates and gimbal angles using MATLAB-based regression.
    • Motion Control: Employed discrete PID algorithms for trajectory interpolation and servo adjustments.
    • Tracking: Enhanced tracking accuracy with Kalman filtering to predict and correct laser positions.
  • Challenges:
    • Overcame mechanical inaccuracies by calibrating servo feedback with additional correction factors and replacing faulty servos to improve trajectory precision.
    • Improved image processing robustness under varying light conditions through optimized exposure settings and LAB color space filtering.
    • Improved tracking success for high-speed laser movements by adjusting the control frequency and refining PID parameters.
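
As a concrete illustration of the discrete PID scheme above, here is a minimal positional PID loop in Python. The gains, timestep, and output limit are illustrative placeholders, not the values tuned during the contest.

```python
class DiscretePID:
    """Minimal discrete (positional) PID controller for servo angle commands."""

    def __init__(self, kp, ki, kd, dt, output_limit):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.dt = dt                      # control period in seconds
        self.output_limit = output_limit  # clamp to protect the servos
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        output = (self.kp * error
                  + self.ki * self.integral
                  + self.kd * derivative)
        return max(-self.output_limit, min(self.output_limit, output))


# Drive one gimbal axis from 0 toward a 30-degree target (gains illustrative).
pid = DiscretePID(kp=0.5, ki=0.05, kd=0.002, dt=0.02, output_limit=10.0)
angle = 0.0
for _ in range(200):
    angle += pid.update(setpoint=30.0, measurement=angle)
print(round(angle, 1))
```

In the real system, the measurement would come from the OpenMV spot coordinates mapped to gimbal angles, and the output would be sent as a servo adjustment.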
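
The Kalman tracking step can be sketched as a one-dimensional constant-velocity filter applied per image axis. The noise parameters and the simplified scalar model below are assumptions for illustration; the actual filter tuning used in competition is not reproduced here.

```python
class Kalman1D:
    """Constant-velocity Kalman filter for one axis of the laser-spot position."""

    def __init__(self, dt, q=1.0, r=4.0):
        self.dt = dt
        self.x = [0.0, 0.0]                      # state: position, velocity
        self.P = [[100.0, 0.0], [0.0, 100.0]]    # large initial uncertainty
        self.q, self.r = q, r                    # process / measurement noise

    def step(self, z):
        dt, (x, v) = self.dt, self.x
        # Predict: x' = F x with F = [[1, dt], [0, 1]], P' = F P F^T + Q.
        x = x + v * dt
        P = self.P
        p00 = P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + self.q
        p01 = P[0][1] + dt * P[1][1]
        p10 = P[1][0] + dt * P[1][1]
        p11 = P[1][1] + self.q
        # Update with the measured spot position z (H = [1, 0]).
        s = p00 + self.r
        k0, k1 = p00 / s, p10 / s
        y = z - x
        x, v = x + k0 * y, v + k1 * y
        self.x = [x, v]
        self.P = [[p00 - k0 * p00, p01 - k0 * p01],
                  [p10 - k1 * p00, p11 - k1 * p01]]
        return x + v * dt                        # one-frame-ahead prediction


# Track a noise-free spot moving 2 pixels per frame along one axis.
kf = Kalman1D(dt=1.0)
for n in range(21):
    predicted = kf.step(2.0 * n)
print(round(predicted, 2))
```

Because the constant-velocity model matches a uniformly moving spot exactly, the prediction converges to the true next position, which is what lets the gimbal lead the target rather than lag it.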

Reflection and Insights

This competition was a test of endurance, adaptability, and teamwork. The intense four-day schedule required rapid problem-solving and collaboration. The experience highlighted the importance of robust system design and precise calibration in achieving high-performance results. It also reinforced the value of integrating advanced algorithms, such as Kalman filtering, to address real-world constraints.

Team and Role

  • Team: A three-member team collaboratively handled hardware design, software development, and optimization.
  • My Role:
    • Led the implementation of image processing algorithms and Kalman filtering.
    • Developed and tested the PID control system for trajectory tracking.
    • Conducted system debugging and parameter tuning under competition constraints.

National Undergraduate Embedded Chip and System Design Competition

Overview

In the National Undergraduate Embedded Chip and System Design Competition (July 2023), I developed a dual-quadcopter control system built around an STM32F4 microcontroller and Nordic nRF-series radio chips. The project demonstrated motion synchronization between a primary and a secondary quadcopter through custom wireless communication and control algorithms. This work earned us district-level recognition for its technical complexity and application potential.

Results

  • Achievement: District-level recognition for innovative quadcopter motion synchronization.
  • Performance Metrics:
    • Wireless communication demonstrated stability over distances of up to 100 meters in controlled testing conditions.
    • Achieved synchronized motion control with precise PID tuning for smooth coordination.
  • Deliverables:
    • A functional dual-quadcopter prototype capable of real-time motion matching.
    • System robustness tested under various environmental conditions.

[Report PDF (Chinese version)]

Two quadcopters used in the competition.

Technical Details

  • System Architecture:
    • Developed the control system with an STM32F411 as the primary controller and an nRF51822 for extended-range communication.
    • Integrated sensors, including MPU9250 for attitude measurement and BMP280 for altitude control.
    • Real-time task management implemented using FreeRTOS for multitasking.
  • Wireless Communication:
    • Leveraged the nRF51822 and nRF24L01+ for long-range, low-latency communication.
    • Designed custom communication protocols to support bidirectional data flow and address synchronization issues.
  • Control Algorithms:
    • Implemented a closed-loop PID control system for stable quadcopter motion.
    • Enhanced robustness with signal filtering and fallback mechanisms to prevent flight instability.
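
The custom protocol is described only at a high level above; the sketch below shows one plausible frame layout (start byte, sequence number, attitude floats, throttle, additive checksum) for a bidirectional attitude link. All field names, sizes, and values are assumptions for illustration, not the frame format actually used.

```python
import struct

HEADER = 0xA5  # illustrative start-of-frame marker


def checksum(payload: bytes) -> int:
    """Simple additive checksum truncated to one byte."""
    return sum(payload) & 0xFF


def pack_frame(seq, roll, pitch, yaw, throttle) -> bytes:
    """Pack one attitude frame: header, sequence, three floats, throttle, checksum."""
    body = struct.pack("<BB3fH", HEADER, seq & 0xFF, roll, pitch, yaw, throttle)
    return body + bytes([checksum(body)])


def unpack_frame(frame: bytes):
    """Validate and decode a frame; raises ValueError on corruption."""
    body, chk = frame[:-1], frame[-1]
    if checksum(body) != chk:
        raise ValueError("checksum mismatch")
    header, seq, roll, pitch, yaw, throttle = struct.unpack("<BB3fH", body)
    if header != HEADER:
        raise ValueError("bad header")
    return seq, roll, pitch, yaw, throttle


frame = pack_frame(seq=7, roll=1.5, pitch=-0.25, yaw=90.0, throttle=1200)
print(unpack_frame(frame))
```

A per-frame checksum and sequence number are the kind of fields that make corrupted or out-of-order radio packets easy to detect and discard, which is the "filtering and fallback" idea mentioned above.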

Challenges

  • Signal Interference: Addressed RF signal collisions by adopting Nordic's Enhanced ShockBurst protocol and adding custom packet filtering.
  • Synchronization Complexity: Fine-tuned PID parameters to handle discrepancies in quadcopter structure, weight, and motion dynamics.
  • Hardware Limitations: Overcame initial instability in the flight control system by adapting open-source flight-control firmware.

Reflection and Insights

This project offered valuable insights into embedded system design and wireless communication. Working through challenges such as signal interference and parameter tuning enhanced my problem-solving skills. The experience also underscored the importance of collaboration and iterative debugging in achieving stable and synchronized quadcopter control.

SUSTech Electronic Design Competition

Overview

In the SUSTech College Student Electronic Design Competition, our team developed an embedded system for a smart medicine delivery vehicle. The system utilized modular design to complete tasks such as path tracking, room number recognition, and automatic medicine delivery and return. Using Arduino Mega2560 as the main controller and OpenMV Cam H7 Plus for vision-based recognition, we successfully implemented a functional prototype that met the core requirements.

Results

  • Achievements:
    • Designed and built a smart medicine-delivery vehicle that autonomously transports 200 g of medicine to specified rooms and returns to the starting point.
    • Achieved stable path tracking and room number recognition using color sensors and OpenMV-based vision modules.
  • Performance Metrics:
    • Accuracy: Room number recognition improved from roughly 80% to over 95% after expanding the training dataset.
    • Stability: Optimized dual-motor control for smooth motion and reduced trajectory deviation.

[Report PDF (Chinese version)]

Datasets collected for training visual models.

Number recognition test.

Technical Details

  • System Architecture:
    • Controllers: Used Arduino Mega2560 for motion control and OpenMV Cam H7 Plus for visual recognition.
    • Sensors: Incorporated TCS230 color sensors for red-white line tracking and ultrasonic sensors for medicine placement detection.
    • Power Management: Employed LM2596 DC-DC modules for stable power supply to the motors and sensors.
  • Vision-Based Recognition:
    • Trained a MobileNet V2 model on OpenMV for room number detection, improving accuracy with an expanded dataset of over 1,000 images.
    • Added servo-driven camera panning to widen the effective field of view for recognizing the larger room numbers.
  • Motion Control:
    • Implemented dual-motor PWM control for precise trajectory adjustments.
    • Addressed power distribution imbalances to enhance stability and reduce deviation during turns.
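
The dual-motor PWM steering idea can be sketched as a simple proportional correction from a normalised line-position error. The gain, base duty, and 8-bit PWM range below are illustrative assumptions, not the tuned competition values.

```python
def motor_pwm(base_pwm, line_error, k_steer=60.0, pwm_max=255):
    """Left/right PWM duty from a line-tracking error in [-1, 1].

    Positive error means the line is to the right of centre, so the
    left wheel speeds up and the right wheel slows to steer back.
    """
    correction = k_steer * line_error
    left = base_pwm + correction
    right = base_pwm - correction
    clamp = lambda duty: int(max(0, min(pwm_max, duty)))
    return clamp(left), clamp(right)


print(motor_pwm(180, 0.0))   # → (180, 180): drive straight
print(motor_pwm(180, 0.5))   # → (210, 150): steer right
```

On the vehicle, the error would come from the TCS230 line-sensor readings, and the two duties would go to the independent motor drivers mentioned below.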

Challenges

  • Path Tracking: Grayscale sensors could not reliably distinguish the red line from the white background, so we switched to TCS230 color sensors.
  • Recognition Precision: Initial low accuracy in room number detection improved through iterative dataset expansion and neural network fine-tuning.
  • Motor Synchronization: Addressed power discrepancies by using independent motor drivers for improved stability.

Reflection and Insights

This competition underscored the importance of modular system design in solving complex real-world problems. From refining vision models to debugging hardware components, the experience deepened my understanding of embedded systems and control algorithms. It also highlighted the critical role of interdisciplinary approaches in achieving reliable system performance.

Team and Role

  • Team: Collaborated with two teammates on design, implementation, and testing.
  • My Role:
    • Designed and implemented the vision system for room number recognition.
    • Led the integration of the motion control and tracking subsystems.
    • Conducted iterative testing and fine-tuning for system optimization.

Mathematical Contest in Modeling

Overview

As part of the Mathematical Contest in Modeling (MCM), our team developed models to analyze and predict trends in the game “Wordle.” Leveraging ARIMA for time-series forecasting, BP neural networks for player behavior prediction, and K-means clustering for difficulty classification, we provided actionable insights for game developers and researchers. The project demonstrated the applicability of machine learning techniques to analyze complex patterns in player data.

Results

  • Achievements:
    • Modeled and predicted participant numbers with ARIMA, achieving high accuracy (R² = 0.982).
    • Classified Wordle words into “easy,” “medium,” and “difficult” categories using K-means clustering.
    • Predicted distribution of attempts for words based on their features with BP neural networks.
  • Model Outputs:
    • Predicted the number of participants on March 1, 2023, to be between 10,288 and 10,624.
    • Classified the word EERIE as medium difficulty based on the clustering results.
    • Visualized player behavior with accurate predictions for multiple words.

[Report PDF]

Technical Details

  • Time-Series Forecasting (ARIMA):
    • Applied ARIMA(1,1,0) to model participant trends over time.
    • Conducted statistical tests (ADF and Ljung-Box) to ensure model validity.
    • Predicted the number of participants with high reliability.
  • Attempts Percentage Prediction (BP Neural Network):
    • Extracted word features, including isolation level, priority, and elimination value.
    • Designed and trained a BP (backpropagation) neural network on a 70% training split to predict attempt distributions.
    • Key Metrics: RMSE = 3.57, MAPE = 45.28%.
  • Difficulty Classification (K-means):
    • Clustered words into three difficulty levels using player attempt distributions.
    • Evaluated and classified the word EERIE as medium difficulty based on clustering results.
    • Improved model robustness by excluding outliers and noisy data.
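
To illustrate the difficulty-classification step, here is a minimal one-dimensional k-means sketch that clusters words by their mean number of attempts. The toy data and the single scalar feature are stand-ins for the full attempt distributions used in the actual model, and the initialisation scheme is a simplification.

```python
def kmeans_1d(values, k=3, iters=20):
    """Minimal 1-D k-means with deterministic (quantile) initialisation."""
    s = sorted(values)
    centroids = [s[i * (len(s) - 1) // (k - 1)] for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda i: abs(v - centroids[i]))
            clusters[nearest].append(v)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return sorted(centroids)


# Toy data: mean attempts per word (lower = easier). Values are illustrative.
mean_attempts = [3.1, 3.3, 3.2, 4.0, 4.2, 4.1, 5.0, 5.2, 5.1, 3.0, 4.3, 5.3]
easy, medium, hard = kmeans_1d(mean_attempts)

word_score = 4.1  # a word with a mid-range attempt average
label = min([("easy", easy), ("medium", medium), ("difficult", hard)],
            key=lambda t: abs(word_score - t[1]))[0]
print(label)
```

A new word is labelled by its nearest centroid, which is how a word such as EERIE would fall into the "medium" cluster.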

Challenges

  • Data limitations: a small dataset (~359 samples) constrained model generalization.
  • Interpretability: non-linear relationships between word features and outcomes were difficult to explain.
  • Overfitting: the ARIMA model required careful parameter tuning and validation to avoid overfitting the short series.

Reflection and Insights

This competition provided valuable experience in applying statistical and machine learning techniques to real-world problems. It reinforced the importance of data cleaning, feature extraction, and model validation. The integration of multiple models (ARIMA, BP neural networks, and K-means) highlighted the versatility of these methods in addressing diverse challenges.

Team and Role

  • Team: Collaborated with two teammates on data analysis, modeling, and report writing.
  • My Role:
    • Played a leading role in modeling tasks, including the design and implementation of ARIMA, BP neural networks, and K-means clustering.
    • Assisted in programming tasks, focusing on feature engineering and model optimization.
    • Contributed significantly to the report writing by summarizing results, interpreting findings, and drafting technical sections.