eMeritBox

Overview

The eMeritBox project combines traditional Buddhist cultural elements with modern technology in an interactive, gravity-sensing electronic donation box. Built around a Raspberry Pi, the system integrates PWM control, motion sensing, and a web-based interface to modernize the traditional donation box, bridging a cultural practice with a digital, interactive experience.

Results

  • System Features:
    • Automatic wooden fish strikes with real-time donation ball accumulation.
    • Gravity-sensing motion control for dynamic donation ball movement.
    • Dual operational modes: manual and auto donation switching.
  • Achievements:
    • Successfully implemented a complete hardware-software system using Raspberry Pi and Flask.
    • Developed reusable classes for matrix display and gravity sensing, enabling future adaptations.

GitHub (Chinese README) | Presentation PDF (Chinese version)

eMeritBox system overview

eMeritBox functional demonstration

Technical Details

  • System Architecture:
    • Controller: Raspberry Pi handles signal processing, PWM control, and web server operations.
    • Modules:
      • MG-90 servo for wooden fish strikes.
      • GY-25 gyroscope for motion sensing.
      • MAX7219 matrix display for donation ball visualization.
  • Key Functionalities:
    • Gravity-Sensing Donation: Balls dynamically move based on the box’s tilt angle.
    • Flask Web Server: Supports browser-based remote operation of wooden fish strikes.
    • Matrix Display: Visualizes donation balls in real-time, reflecting their position and state.
  • Software Implementation:
    • Developed Python classes for modular control:
      • GY25Ctrl for gyroscope data processing.
      • MatrixCtrl for donation ball display updates.
      • BGMPlayer for background music playback.
    • Solved hardware conflicts by reconfiguring UART ports and enabling additional I²C channels.
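The gravity-sensing behavior above can be sketched as a small piece of pure logic: map a tilt angle from the gyroscope to a horizontal drift of the donation balls on the 8×8 matrix. This is a hypothetical simplification of what `GY25Ctrl` and `MatrixCtrl` coordinate; the function name, dead-zone threshold, and coordinate convention are illustrative assumptions, not the project's actual API.

```python
def step_balls(balls, roll_deg, width=8):
    """Shift each donation ball one column toward the low side of the box.

    balls    -- list of (col, row) positions on the LED matrix
    roll_deg -- roll angle in degrees from the gyroscope; positive tilts right

    A small dead zone keeps the balls still when the box is roughly level;
    positions are clamped so balls pile up at the matrix edge.
    """
    if abs(roll_deg) < 5:                 # dead zone: treat the box as level
        return balls
    dx = 1 if roll_deg > 0 else -1
    moved = []
    for col, row in balls:
        new_col = min(max(col + dx, 0), width - 1)   # clamp at matrix edge
        moved.append((new_col, row))
    return moved
```

In the real system, a loop would read the gyroscope, call a routine like this, and push the updated frame to the MAX7219 driver.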

Challenges

  • UART hardware resource conflicts: Raspberry Pi’s default UART settings caused resource contention.
    • Solution: Re-mapped hardware and mini UARTs (ttyAMA0 ↔ ttyS0) and configured multiple UART ports (+ttyAMA1, 2, …) for simultaneous operation.
  • I²C channel conflicts: Dual I²C channels on Raspberry Pi conflicted with camera usage.
    • Solution: Disabled the camera function and enabled additional I²C channels with dtparam=i2c_vc=on.
  • SPI and I²C competing with UART ports: Enabling SPI and I²C modules on the Raspberry Pi caused UART port contention.
    • Solution: Adjusted hardware configurations to optimize resource allocation.
  • Synchronization of Multiple Modules: Managing the simultaneous operation of PWM, matrix display, and motion sensing.
    • Solution: Utilized multi-threading to ensure real-time responsiveness and system stability.
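The UART and I²C fixes above live in the Raspberry Pi boot configuration. The fragment below is a sketch of what those changes might look like; `dtparam=i2c_vc=on` is the setting named above, while the overlay names follow Raspberry Pi 4 conventions and may differ on other board revisions.

```ini
# /boot/config.txt -- sketch of the overlays described above (Pi 4 overlay
# names; exact overlays depend on the board revision)

enable_uart=1            # expose the primary UART
dtoverlay=miniuart-bt    # move Bluetooth to the mini UART, freeing ttyAMA0
dtoverlay=uart2          # enable additional PL011 UARTs (ttyAMA1, ...)
dtoverlay=uart3
dtparam=i2c_arm=on       # main I2C bus
dtparam=i2c_vc=on        # extra I2C channel (camera disabled to avoid conflict)
```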

Reflection and Insights

The eMeritBox reimagines a traditional Buddhist donation practice with dynamic visuals and interactive controls, showing how technology can both preserve and extend cultural traditions. The hardware-software integration challenges also underscored the value of modular design and multi-threaded programming in building robust embedded systems.

National Undergraduate Electronic Design Contest

Overview

In the National Undergraduate Electronic Design Contest (Aug 2023), our team designed and developed a laser motion control and auto-tracking system, earning a provincial second prize. The competition required teams to design, build, and demonstrate a working system within a four-day window. Our solution featured precise motion control and real-time tracking using image processing and feedback algorithms.

Results

  • Achievement: Provincial Second Prize, ranked second among teams in our school.
  • Performance Metrics:
    • Successfully implemented precise motion control with an average error of 1.1 cm during testing.
    • Achieved stable laser tracking performance with a maximum tracking error of 3.6 cm under moderate speeds.
  • Presentation: Delivered a working prototype meeting the core requirements under competitive time constraints.

[Report PDF (Chinese version)]

On-site assembly system demonstration.

Technical Details

  • System Architecture:
    • Two independent servo-driven gimbals controlled by an Arduino Mega2560 microcontroller.
    • Image processing conducted via OpenMV H7 to identify and track red and green laser spots.
  • Algorithms:
    • Coordinate Mapping: Developed a nonlinear mapping between screen coordinates and gimbal angles using MATLAB-based regression.
    • Motion Control: Employed discrete PID algorithms for trajectory interpolation and servo adjustments.
    • Tracking: Enhanced tracking accuracy with Kalman filtering to predict and correct laser positions.
  • Challenges:
    • Overcame mechanical inaccuracies by calibrating servo feedback with additional correction factors and replacing faulty servos to improve trajectory precision.
    • Improved image processing robustness under varying light conditions through optimized exposure settings and LAB color space filtering.
    • Enhanced tracking success rates for high-speed laser movements by adjusting control frequencies and refining PID parameters for smoother tracking.
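A discrete PID loop of the kind used for the servo adjustments above can be sketched as follows. The gains and the output clamp here are placeholders, not the competition's tuned values, and the class name is ours for illustration.

```python
class PID:
    """Minimal discrete PID controller for gimbal-angle corrections."""

    def __init__(self, kp, ki, kd, out_limit=90.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_limit = out_limit     # clamp output to the servo's range
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        """Return a control output for the current tracking error.

        error -- e.g. pixel offset between the laser spot and its target
        dt    -- loop period in seconds
        """
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        return max(-self.out_limit, min(self.out_limit, out))
```

In practice each gimbal axis would run its own controller, with the error computed from the OpenMV spot coordinates after the screen-to-angle mapping.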

Reflection and Insights

This competition was a test of endurance, adaptability, and teamwork. The intense four-day schedule required rapid problem-solving and collaboration. The experience highlighted the importance of robust system design and precise calibration in achieving high-performance results. It also reinforced the value of integrating advanced algorithms, such as Kalman filtering, to address real-world constraints.

Team and Role

  • Team: A three-member team collaboratively handled hardware design, software development, and optimization.
  • My Role:
    • Led the implementation of image processing algorithms and Kalman filtering.
    • Developed and tested the PID control system for trajectory tracking.
    • Conducted system debugging and parameter tuning under competition constraints.

National Undergraduate Embedded Chip and System Design Competition

Overview

In the National Undergraduate Embedded Chip and System Design Competition (July 2023), I developed a dual-quadcopter control system using an STM32F4 microcontroller and Nordic nRF-series radio chips. The project demonstrated motion synchronization between a primary and a secondary quadcopter through custom wireless communication and control algorithms. This work earned us district-level recognition for its technical complexity and application potential.

Results

  • Achievement: District-level recognition for innovative quadcopter motion synchronization.
  • Performance Metrics:
    • Wireless communication demonstrated stability over distances of up to 100 meters in controlled testing conditions.
    • Achieved synchronized motion control with precise PID tuning for smooth coordination.
  • Deliverables:
    • A functional dual-quadcopter prototype capable of real-time motion matching.
    • System robustness tested under various environmental conditions.

[Report PDF (Chinese version)]

Two quadcopters used in the competition.

Technical Details

  • System Architecture:
    • Developed a control system with STM32F411 as the primary controller and NRF51822 for extended communication.
    • Integrated sensors, including MPU9250 for attitude measurement and BMP280 for altitude control.
    • Real-time task management implemented using FreeRTOS for multitasking.
  • Wireless Communication:
    • Leveraged NRF51822 and NRF24L01+ for long-range and low-latency communication.
    • Designed custom communication protocols to support bidirectional data flow and address synchronization issues.
  • Control Algorithms:
    • Implemented a closed-loop PID control system for stable quadcopter motion.
    • Enhanced robustness with signal filtering and fallback mechanisms to prevent flight instability.
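The custom bidirectional protocol mentioned above can be illustrated with a small fixed-size frame: a header byte, an attitude setpoint, and a one-byte checksum so corrupted frames are dropped rather than flown. The field layout, names, and checksum scheme here are assumptions for illustration, not the protocol actually used.

```python
import struct

HEADER = 0xA5   # illustrative frame-start marker

def pack_attitude(roll, pitch, yaw, throttle):
    """Serialize one control frame: header, three float32s, uint16, checksum."""
    body = struct.pack("<fffH", roll, pitch, yaw, throttle)
    checksum = (HEADER + sum(body)) & 0xFF     # simple 8-bit additive checksum
    return bytes([HEADER]) + body + bytes([checksum])

def unpack_attitude(frame):
    """Return (roll, pitch, yaw, throttle), or None for a corrupted frame."""
    if len(frame) != 16 or frame[0] != HEADER:
        return None
    if (sum(frame[:-1]) & 0xFF) != frame[-1]:
        return None                            # checksum mismatch -> drop frame
    return struct.unpack("<fffH", frame[1:-1])
```

A frame this small fits comfortably in a single NRF24L01+ payload (32 bytes max), which keeps the link low-latency.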

Challenges

  • Signal Interference: Addressed challenges with RF signal collisions by implementing Enhanced ShockBurst protocols and custom filtering methods.
  • Synchronization Complexity: Fine-tuned PID parameters to handle discrepancies in quadcopter structure, weight, and motion dynamics.
  • Hardware Limitations: Overcame initial instability in the flight control system by incorporating open-source flight control adaptations.

Reflection and Insights

This project offered valuable insights into embedded system design and wireless communication. Working through challenges such as signal interference and parameter tuning enhanced my problem-solving skills. The experience also underscored the importance of collaboration and iterative debugging in achieving stable and synchronized quadcopter control.

SUSTech Electronic Design Competition

Overview

In the SUSTech College Student Electronic Design Competition, our team developed an embedded system for a smart medicine delivery vehicle. The system utilized modular design to complete tasks such as path tracking, room number recognition, and automatic medicine delivery and return. Using Arduino Mega2560 as the main controller and OpenMV Cam H7 Plus for vision-based recognition, we successfully implemented a functional prototype that met the core requirements.

Results

  • Achievements:
    • Designed and built a smart medicine delivery vehicle capable of transporting 200g medicines to specific rooms and returning to the starting point autonomously.
    • Achieved stable path tracking and room number recognition using color sensors and OpenMV-based vision modules.
  • Performance Metrics:
    • Accuracy: Room number recognition improved from roughly 80% to over 95% after expanding the training data.
    • Stability: Optimized dual-motor control for smooth motion and reduced trajectory deviation.

[Report PDF (Chinese version)]

Datasets collected for training visual models.

Number recognition test.

Technical Details

  • System Architecture:
    • Controllers: Used Arduino Mega2560 for motion control and OpenMV Cam H7 Plus for visual recognition.
    • Sensors: Incorporated TCS230 color sensors for red-white line tracking and ultrasonic sensors for medicine placement detection.
    • Power Management: Employed LM2596 DC-DC modules for stable power supply to the motors and sensors.
  • Vision-Based Recognition:
    • Trained a MobileNet V2 model on OpenMV for room number detection, improving accuracy with an expanded dataset of over 1,000 images.
    • Integrated servo-driven camera adjustments to widen the field of view for larger room number recognition.
  • Motion Control:
    • Implemented dual-motor PWM control for precise trajectory adjustments.
    • Addressed power distribution imbalances to enhance stability and reduce deviation during turns.
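The dual-motor steering described above can be sketched as differential PWM: a line-position error from the color sensors speeds up one wheel and slows the other. The base speed, gain, and duty range below are illustrative placeholders, not the tuned competition values.

```python
def motor_duties(line_error, base=60, gain=15, max_duty=100):
    """Compute (left_duty, right_duty) PWM percentages for the two motors.

    line_error -- -1.0 (line far to the left) .. +1.0 (line far to the right),
                  derived from the TCS230 color-sensor readings
    Steering toward the line raises one wheel's duty and lowers the other's.
    """
    left = base + gain * line_error
    right = base - gain * line_error
    clamp = lambda d: max(0, min(max_duty, d))
    return clamp(left), clamp(right)
```

Running both wheels from independently computed duties (and, per the challenge below, independent drivers) keeps the trajectory straight even when the motors are not perfectly matched.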

Challenges

  • Path Tracking: Grayscale sensors could not distinguish the red and white lines because their reflected intensities were too similar; resolved by switching to TCS230 color sensors.
  • Recognition Precision: Initial low accuracy in room number detection improved through iterative dataset expansion and neural network fine-tuning.
  • Motor Synchronization: Addressed power discrepancies by using independent motor drivers for improved stability.

Reflection and Insights

This competition underscored the importance of modular system design in solving complex real-world problems. From refining vision models to debugging hardware components, the experience deepened my understanding of embedded systems and control algorithms. It also highlighted the critical role of interdisciplinary approaches in achieving reliable system performance.

Team and Role

  • Team: Collaborated with two teammates on design, implementation, and testing.
  • My Role:
    • Designed and implemented the vision system for room number recognition.
    • Led the integration of the motion control and tracking subsystems.
    • Conducted iterative testing and fine-tuning for system optimization.