Project Work: Optimization Study of a Head-Camera Stabilization Mechanism Using Machine Learning and Different Control Strategies

Project Description:

The project aims to enhance the functionality of the pre-designed and fabricated head-camera mechanism by integrating closed-loop feedback control and a machine-learning algorithm. The mechanism is capable of mimicking human head movement in extension, flexion, and lateral motion. A multilayer-perceptron Artificial Neural Network (ANN) is used to build an identification system that maps desired camera poses to the angular configuration of the three motors driving the mechanism, i.e., it solves the inverse kinematic model. Proportional-Integral-Derivative (PID), Fractional-Order Proportional-Integral (FOPI), and Fractional-Order PID (FOPID) controllers are considered as control algorithms. To prevent gimbal lock in the 3-DOF actuator, quaternion-based techniques should be employed, interfacing with a VIVE VR headset for immersive interaction. Quaternion representations describe 3D rotations without the singularities of Euler angles, thereby mitigating gimbal lock while ensuring smooth and accurate translation of head movements into actuator rotations. Additionally, incorporating voice control for user communication in the VR environment is desirable.
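As background for the quaternion-based approach mentioned above, the following is a minimal sketch, in plain Python with no external dependencies, of how a rotation can be represented as a unit quaternion and applied to a vector via the Hamilton product. All function names are illustrative and not part of any existing project codebase.

```python
import math

def quat_from_axis_angle(axis, angle_rad):
    """Unit quaternion (w, x, y, z) for a rotation about a unit axis."""
    s = math.sin(angle_rad / 2.0)
    return (math.cos(angle_rad / 2.0), axis[0] * s, axis[1] * s, axis[2] * s)

def quat_multiply(a, b):
    """Hamilton product a * b (composition of rotations)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def quat_conjugate(q):
    """Inverse of a unit quaternion."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def rotate_vector(q, v):
    """Rotate vector v by unit quaternion q: q * (0, v) * q_conjugate."""
    p = (0.0, v[0], v[1], v[2])
    w, x, y, z = quat_multiply(quat_multiply(q, p), quat_conjugate(q))
    return (x, y, z)

# Example: a 90-degree yaw about the z-axis maps the x-axis onto the y-axis,
# with no singular configuration of the kind Euler angles exhibit.
q_yaw = quat_from_axis_angle((0.0, 0.0, 1.0), math.pi / 2.0)
print(rotate_vector(q_yaw, (1.0, 0.0, 0.0)))  # ≈ (0.0, 1.0, 0.0)
```

In practice a library such as ROS tf2 provides these operations, but the algebra above is what runs underneath.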

Tasks and Duties:

Literature Review:

  • Conduct a thorough review of the literature on gimbal lock mitigation techniques, camera control systems, quaternion-based orientation, and voice recognition technologies in VR environments.

  • Identify existing methods and solutions for addressing gimbal lock, integrating multiple cameras, implementing voice commands, applying machine-learning algorithms, and utilizing quaternions for orientation representation.

System Design:

  • Extend the system architecture to incorporate camera control mechanisms, voice recognition capabilities, and quaternion-based orientation techniques.

  • Define the specifications for integrating the camera on the 3-DOF mechanism.

  • Design the voice-command interface for controlling zoom functions.

Camera Integration:

  • Implement hardware and software components to integrate the cameras with the VR headset, and read the IMU/temperature sensor for monitoring purposes in VR.

  • Develop communication protocols and interfaces to ensure seamless interaction between the cameras and the control system.

  • Implement the required sensor for feedback control (External IMU or embedded camera sensor).

  • Utilize the ANN algorithm to solve the inverse kinematics and compare it to the analytical solution.

  • Implement the PID/FOPI/FOPID controllers for the closed-loop system and compare the results.

  • Calibrate the cameras and adjust their positioning for optimal viewing angles.
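To illustrate the closed-loop control task listed above, here is a minimal sketch of a discrete PID controller run against a simple first-order motor model. The gains, sample time, and time constant are placeholder values chosen only to make the example converge; they are not tuned parameters for the actual mechanism, and the FOPI/FOPID variants would replace the integer-order integral and derivative terms with fractional-order ones.

```python
class PID:
    """Discrete PID controller (parallel form) with a fixed sample time."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt               # accumulate I-term
        derivative = (error - self.prev_error) / self.dt  # backward difference
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Closed-loop test against a hypothetical first-order motor model
# d(angle)/dt = (u - angle) / tau, integrated with forward Euler.
dt, tau = 0.01, 0.1
pid = PID(kp=5.0, ki=10.0, kd=0.05, dt=dt)
angle, target = 0.0, 30.0  # degrees
for _ in range(1000):      # 10 s of simulated time
    u = pid.update(target, angle)
    angle += (u - angle) / tau * dt
print(round(angle, 2))  # ≈ 30.0
```

Comparing PID against FOPI and FOPID on the real plant would then amount to swapping the controller class while keeping the same loop and logging structure.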

Quaternion-based Orientation:

  • Develop algorithms for quaternion-based representation of head movements captured by the VR headset.

  • Implement quaternion transformations to translate head orientations into precise actuator rotations.

  • Ensure compatibility and synchronization with ROS middleware.
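One common way to translate a headset orientation into an actuator command, consistent with the tasks above, is to form the relative (error) quaternion between the target and current poses and convert it to an axis-angle pair. The sketch below assumes unit quaternions in (w, x, y, z) order and restates the quaternion helpers so it is self-contained; names and conventions are illustrative.

```python
import math

def quat_conjugate(q):
    """Inverse of a unit quaternion."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def quat_multiply(a, b):
    """Hamilton product a * b."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def orientation_error(q_target, q_current):
    """Axis-angle rotation taking the mechanism from q_current to q_target."""
    w, x, y, z = quat_multiply(q_target, quat_conjugate(q_current))
    if w < 0.0:  # q and -q encode the same rotation: take the short way around
        w, x, y, z = -w, -x, -y, -z
    angle = 2.0 * math.acos(max(-1.0, min(1.0, w)))
    n = math.sqrt(x*x + y*y + z*z)
    axis = (x/n, y/n, z/n) if n > 1e-9 else (1.0, 0.0, 0.0)
    return axis, angle

# Example: headset yawed 30 degrees about z, mechanism still at identity.
q_head = (math.cos(math.pi/12), 0.0, 0.0, math.sin(math.pi/12))
axis, angle = orientation_error(q_head, (1.0, 0.0, 0.0, 0.0))
print(axis, math.degrees(angle))  # ≈ (0, 0, 1), 30.0
```

The resulting axis-angle error can be fed to the joint-space controller, and the same (w, x, y, z) quaternions map directly onto ROS `geometry_msgs/Quaternion` fields (which ROS serializes as x, y, z, w).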

Voice Control System:

  • Research and select appropriate voice recognition technologies compatible with the VR headset and ROS.

  • Develop algorithms for processing voice commands related to camera switching, zooming in, zooming out, and orientation control (scale control).

  • Integrate the voice control system within the VR headset, ensuring accuracy and responsiveness to user commands.
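Downstream of whichever speech-recognition engine is selected, command handling can start as simply as matching recognized phrases to actions. The sketch below is a hypothetical parser: the phrases, action names, and step sizes are placeholders, not the project's actual command set, and a real system would feed in transcripts from the recognizer running on the VR host.

```python
# Hypothetical command table: phrase fragment -> (action, argument).
COMMANDS = {
    "zoom in":      ("zoom", +1),
    "zoom out":     ("zoom", -1),
    "switch camera": ("switch", None),
    "scale up":     ("scale", +0.1),
    "scale down":   ("scale", -0.1),
}

def parse_command(transcript):
    """Map a recognized utterance to an (action, argument) pair, else None."""
    text = transcript.lower().strip()
    for phrase, action in COMMANDS.items():
        if phrase in text:
            return action
    return None

print(parse_command("Please zoom in a little"))  # → ('zoom', 1)
print(parse_command("unrecognized mumbling"))    # → None
```

Keeping recognition and command dispatch separated this way makes it easy to swap recognizers while reusing the same action table, and responsiveness can be measured per stage during usability testing.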

Testing and Validation:

  • Conduct comprehensive testing of the integrated system to verify the functionality of the camera, zoom control, and quaternion-based orientation.

  • Gather user feedback through usability testing to identify areas for improvement and refinement.

Documentation and Presentation:

  • Document the design, implementation, and testing processes.

  • Prepare user manuals or guides for operating the VR headset and utilizing voice commands.

  • Create presentation materials summarizing the project objectives, methodologies, and outcomes for final reporting.

 

Requirements:

o   Programming skills (Arduino, C++, or Python)

o   Good mathematical background (quaternions)

o   Basic knowledge of machine learning (e.g., ANNs)

o   Understanding of Control Systems (PID)

o   Familiarity with ROS would be an advantage

 

Contact Person: Mohammad Sadeghi (mohammad.sadeghi@tuhh.de)

Desired starting date: As soon as possible

 

Institut für Mechatronik im Maschinenbau (iMEK), Eißendorfer Straße 38, 21073 Hamburg