For more information, contact me at lucas.fonseca@nottingham.ac.uk
Develop a simulation of a single joint, such as the knee, using OpenSim. The project involves estimating joint torque based on angles measured by an inertial measurement unit (IMU). Students will analyze angular velocity, acceleration, and position to create a comprehensive model of joint dynamics. Validation will be conducted by comparing simulation results with real-world data, potentially involving the use of physical weights or motion capture systems. This project offers opportunities to explore musculoskeletal modeling and motion analysis, with applications in rehabilitation and sports science. It is ideal for students interested in biomechanics and wearable sensor technologies.
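As a rough illustration of the kind of calculation involved (not the project's prescribed method, which would use OpenSim's musculoskeletal models), the sketch below estimates net knee torque from an IMU-derived flexion angle using a single-segment inverse-dynamics approximation. The segment mass, centre-of-mass distance, and inertia values are placeholder assumptions.

```python
import numpy as np

def estimate_knee_torque(theta, dt, mass=3.5, com_dist=0.25, inertia=0.06, g=9.81):
    """Rough inverse-dynamics torque for a single shank segment rotating about the knee.

    theta    : array of knee flexion angles in radians (from the IMU)
    dt       : sampling interval in seconds
    mass     : shank+foot mass in kg (placeholder value)
    com_dist : knee-to-centre-of-mass distance in m (placeholder value)
    inertia  : segment moment of inertia about the knee in kg*m^2 (placeholder value)
    """
    omega = np.gradient(theta, dt)           # angular velocity
    alpha = np.gradient(omega, dt)           # angular acceleration
    gravity_torque = mass * g * com_dist * np.sin(theta)
    return inertia * alpha + gravity_torque  # net joint torque estimate

# Example: a synthetic 1 Hz flexion-extension cycle sampled at 100 Hz
t = np.arange(0, 2, 0.01)
theta = 0.5 * np.sin(2 * np.pi * 1.0 * t)
tau = estimate_knee_torque(theta, dt=0.01)
print(tau[:5])
```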
This project focuses on using IMU and/or EMG data to control a Spot robot. Students will develop algorithms that translate human body movements or muscle activity into commands for the robot, enabling it to perform tasks such as walking, turning, or crouching. The work will involve signal processing, sensor integration, and robotics programming to create a functional and intuitive control system. This project allows exploration of human-robot interaction and wearable technology while applying AI techniques for mapping signals to robot actions. It is well-suited for students interested in robotics applications and assistive technologies.
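A minimal sketch of one possible signal-to-command mapping is shown below: torso lean measured by a wearable IMU is translated into forward speed and turning rate. The scaling, dead zone, and the send_velocity function are all illustrative assumptions; the real project would use the robot manufacturer's command interface.

```python
import math

def imu_to_velocity(pitch_rad, roll_rad, dead_zone=0.1, max_speed=0.6, max_turn=0.8):
    """Map torso lean from a wearable IMU to base velocities.

    Leaning forward/backward (pitch) drives forward speed; leaning sideways
    (roll) drives turning rate. Angles inside the dead zone are ignored so
    small postural sway does not move the robot.
    """
    def scaled(angle, limit):
        if abs(angle) < dead_zone:
            return 0.0
        # linear scaling (full output at a 30-degree lean), saturated at +/- limit
        return max(-limit, min(limit, angle * limit / (math.pi / 6)))

    forward = scaled(pitch_rad, max_speed)   # m/s
    turn = scaled(roll_rad, max_turn)        # rad/s
    return forward, turn

# send_velocity(...) is a hypothetical placeholder for whatever command
# interface the chosen robot SDK exposes.
def send_velocity(forward, turn):
    print(f"cmd: forward={forward:.2f} m/s, turn={turn:.2f} rad/s")

send_velocity(*imu_to_velocity(pitch_rad=0.3, roll_rad=-0.05))
```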
Design and build a robot that responds to hand gestures tracked by IMU and/or EMG sensors. The project involves programming the robot to interpret gestures such as swiping, tilting, or clenching into specific actions like moving forward, turning, or stopping. Students will focus on real-time signal processing, gesture recognition algorithms, and robotics control systems. This project provides opportunities to explore intuitive human-machine interfaces and wearable sensor applications. It is ideal for students interested in creating interactive robotic systems that enhance user experience and accessibility.
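To give a flavour of the gesture-recognition step, the sketch below classifies a short IMU window with simple threshold rules and maps the result to a robot action. The thresholds, axes, and gesture set are placeholder assumptions; a real system would likely use learned classifiers tuned on recorded data.

```python
import numpy as np

def classify_gesture(accel, gyro, swipe_thresh=8.0, tilt_thresh=1.2):
    """Rule-based gesture recognition over a short window of IMU samples.

    accel and gyro are (N, 3) arrays in m/s^2 and rad/s.
    Thresholds are placeholders that would need tuning on real data.
    """
    peak_accel = np.max(np.abs(accel), axis=0)   # per-axis acceleration peak
    peak_gyro = np.max(np.abs(gyro), axis=0)     # per-axis rotation peak

    if peak_accel[0] > swipe_thresh:
        return "swipe"          # sharp lateral acceleration
    if peak_gyro[1] > tilt_thresh:
        return "tilt"           # fast rotation about the pitch axis
    return "rest"

GESTURE_TO_ACTION = {"swipe": "move_forward", "tilt": "turn", "rest": "stop"}

window_accel = np.random.normal(0, 1, (50, 3))   # stand-in for a real IMU window
window_gyro = np.random.normal(0, 0.2, (50, 3))
gesture = classify_gesture(window_accel, window_gyro)
print(gesture, "->", GESTURE_TO_ACTION[gesture])
```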
Design a telepresence robot system aimed at assisting telerehabilitation by remotely evaluating user movements. The system will integrate video-based motion tracking or wearable sensors (e.g., IMUs or EMG) to assess patient exercises and provide real-time feedback. Students will work on developing features like automated movement analysis, therapist-patient interaction tools, and data logging for progress monitoring. This project combines robotics, healthcare technology, and AI to address challenges in remote rehabilitation practices. It is ideal for students interested in rehabilitation engineering, human-robot interaction, or telemedicine applications.
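One small example of automated movement analysis is repetition counting from a tracked joint angle, sketched below. The angle thresholds and the synthetic exercise signal are illustrative assumptions; in practice they would be set per exercise and per patient.

```python
import numpy as np

def count_repetitions(joint_angle_deg, low=40.0, high=110.0):
    """Count exercise repetitions from a joint-angle time series by detecting
    excursions above `high` degrees followed by a return below `low` degrees.
    """
    reps, extended = 0, False
    for angle in joint_angle_deg:
        if not extended and angle > high:
            extended = True            # reached the top of the movement
        elif extended and angle < low:
            extended = False           # returned to start: one full repetition
            reps += 1
    return reps

# Synthetic knee-extension exercise: five repetitions over ten seconds
t = np.linspace(0, 10, 1000)
angle = 75 + 45 * np.sin(2 * np.pi * 0.5 * t)
print(count_repetitions(angle), "repetitions detected")
```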
This highly practical project involves collecting IMU and optionally EMG data from able-bodied participants performing various tasks such as walking, sitting, standing, or transitioning between activities. Students will design the data collection protocol, recruit participants, ensure proper sensor placement, and synchronize data streams effectively. The resulting dataset can be used for research in motion analysis, activity recognition, or rehabilitation applications. This project emphasizes experimental design and hands-on work with wearable sensors while providing opportunities for preliminary data analysis. It is ideal for students interested in biomechanics or building datasets for machine learning studies.
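Synchronising independently clocked sensor streams is a core part of this work; the sketch below resamples a toy IMU stream and a toy EMG stream onto a shared timeline by interpolation. Column names, sample rates, and the DataFrame layout are placeholder assumptions.

```python
import numpy as np
import pandas as pd

def synchronise_streams(imu_df, emg_df, rate_hz=100):
    """Resample two independently clocked sensor streams onto a shared,
    uniformly sampled timeline. Both inputs are assumed to have a 'time'
    column in seconds (column names are placeholders).
    """
    t0 = max(imu_df["time"].iloc[0], emg_df["time"].iloc[0])
    t1 = min(imu_df["time"].iloc[-1], emg_df["time"].iloc[-1])
    common_t = np.arange(t0, t1, 1.0 / rate_hz)

    out = pd.DataFrame({"time": common_t})
    for df, cols in ((imu_df, ["acc_x"]), (emg_df, ["emg_1"])):
        for col in cols:
            out[col] = np.interp(common_t, df["time"], df[col])
    return out

# Toy streams with different sample rates and start times
imu = pd.DataFrame({"time": np.arange(0.00, 5, 0.01), "acc_x": np.random.randn(500)})
emg = pd.DataFrame({"time": np.arange(0.05, 5, 0.001), "emg_1": np.random.randn(4950)})
print(synchronise_streams(imu, emg).head())
```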
This project focuses on utilizing the LEA personal assistance robot, designed for rehabilitation and mobility support, to collect, compile, and analyze its sensor data. The LEA robot is equipped with dozens of sensors, including those for navigation, obstacle detection, and user interaction. Students will design experiments to gather data during various tasks (e.g., walking assistance or obstacle avoidance) and analyze it to identify patterns or improve performance. This project provides hands-on experience with robotics and sensor data processing and offers opportunities to apply machine learning for insights or optimizations.
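As a generic illustration of the analysis step (the LEA robot's actual log formats and sensor names are not assumed here), the sketch below summarises a logged sensor stream into per-window features and clusters them to look for recurring operating patterns.

```python
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans

def window_features(df, cols, window=100):
    """Summarise a logged sensor stream into per-window mean/std features.
    Column names and window length are placeholders; the real log format
    would come from the robot's own export tools.
    """
    feats = []
    for start in range(0, len(df) - window + 1, window):
        chunk = df.iloc[start:start + window][cols]
        feats.append(np.concatenate([chunk.mean().values, chunk.std().values]))
    return np.array(feats)

# Toy log standing in for an exported sensor file
log = pd.DataFrame({
    "range_front": np.random.rand(2000),
    "wheel_speed": np.random.rand(2000),
})
X = window_features(log, ["range_front", "wheel_speed"])
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print(labels)
```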
This project aims to compare machine learning techniques with the Fréchet distance for classifying human movement data. Students will preprocess movement trajectories (e.g., from IMU sensors) and implement classification models using both approaches. The goal is to evaluate their accuracy, computational efficiency, and practicality in real-world scenarios. This project offers opportunities to explore advanced similarity measures like the Fréchet distance while applying machine learning algorithms to motion data. It is ideal for students interested in combining theoretical and practical aspects of data analysis.
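A minimal sketch of the Fréchet-distance side of the comparison is given below: the standard discrete Fréchet distance computed by dynamic programming, used as a nearest-template classifier on toy 2D trajectories. The template labels and trajectories are illustrative stand-ins for processed IMU traces.

```python
import numpy as np

def discrete_frechet(P, Q):
    """Discrete Fréchet distance between trajectories P and Q, each an (N, d)
    array of points, computed with the standard dynamic programme."""
    n, m = len(P), len(Q)
    d = np.linalg.norm(P[:, None, :] - Q[None, :, :], axis=-1)  # pairwise distances
    ca = np.full((n, m), np.inf)
    ca[0, 0] = d[0, 0]
    for i in range(1, n):
        ca[i, 0] = max(ca[i - 1, 0], d[i, 0])
    for j in range(1, m):
        ca[0, j] = max(ca[0, j - 1], d[0, j])
    for i in range(1, n):
        for j in range(1, m):
            ca[i, j] = max(min(ca[i - 1, j], ca[i - 1, j - 1], ca[i, j - 1]), d[i, j])
    return ca[-1, -1]

def classify_by_frechet(query, templates):
    """Assign the label of the template trajectory closest in Fréchet distance."""
    return min(templates, key=lambda item: discrete_frechet(query, item[1]))[0]

# Toy 2D trajectories standing in for processed IMU traces
t = np.linspace(0, 1, 50)
templates = [
    ("reach", np.column_stack([t, t**2])),
    ("circle", np.column_stack([np.cos(2 * np.pi * t), np.sin(2 * np.pi * t)])),
]
query = np.column_stack([t, t**2 + 0.02 * np.random.randn(50)])
print(classify_by_frechet(query, templates))
```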