Project ID BE-MI2024_06


Co Supervisor 1A Faculty of Natural, Mathematical & Engineering Sciences, Department of Engineering

Co Supervisor 1B Faculty of Natural, Mathematical & Engineering Sciences, Department of Informatics

Additional Supervisor Professor Louise Rose

Vision in soft tactile sensing for context-aware activity recognition in clinical rehabilitation

Activity recognition plays a crucial role in wearable robotics for clinical rehabilitation. Recent technological advances have enabled the use of soft textile sensors for activity recognition in high-level controllers. These sensors offer several advantages: they are low-cost, unobtrusive, and wearable like ordinary, fashionable clothing. Compared with rigidly attached sensors, comfortable fabric-attached sensors have demonstrated increased accuracy in detecting human activities. However, to broaden the range of tasks and improve the quality of assistance provided by soft wearables, it is essential to gather information about the context and environment in which motor actions occur. This is where computer vision becomes instrumental: it can provide wearable systems with valuable information about the user, fabric features, and the surrounding environment. By combining computer vision with soft textile sensors, the project aims to exploit vision's capacity to capture the environment, task context, and user intent. This integration will support the development of context-aware fabric sensing that improves movement detection in clinical rehabilitation, enhancing quality of life for individuals with motor impairments, with translational potential for recognising activities of daily living outside clinical settings. Skills to be learned include practical experience in computer vision and robotics, particularly with deep learning libraries such as PyTorch and TensorFlow, and the opportunity to work with novel sensing technologies that capture human motion from clothing-embedded sensors.
The objectives for each year are: (1) embed cameras in soft fabric for multidimensional movement analyses; (2) categorise discrete activities (e.g., walking, running) and terrains (e.g., sand, tarmac) based on motion data; (3) evaluate parametric and nonparametric vision techniques to account for external factors that might affect fabric movement. The overarching goal is to enable assessment of motion behaviour and intention for tailored assistance.
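To give a flavour of objective (2), the sketch below shows how windows of clothing-embedded sensor time series might be classified into discrete activities with a small 1D convolutional network in PyTorch (one of the libraries named above). This is an illustrative example only, not the project's actual model: the channel count, window length, and number of classes are hypothetical placeholders.

```python
# Illustrative sketch (not the project's model): classify windows of
# fabric-sensor time series into discrete activities (e.g. walking vs.
# running) with a small 1D CNN. Channel count, window length, and class
# count are hypothetical placeholders.
import torch
import torch.nn as nn

class ActivityClassifier(nn.Module):
    def __init__(self, n_channels: int = 6, n_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # average over time -> fixed-size feature
        )
        self.head = nn.Linear(64, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, sensor channels, time samples)
        return self.head(self.features(x).squeeze(-1))

model = ActivityClassifier(n_channels=6, n_classes=2)
window = torch.randn(8, 6, 100)  # 8 windows, 6 sensor channels, 100 samples
logits = model(window)           # one score per activity class per window
```

In practice the same backbone could be extended with a second output head for terrain classes, or fused with vision-derived context features, but those design choices would depend on the sensors and data actually collected in the project.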

Representative Publications

1. Shen, Tianchen, Irene Di Giulio, and Matthew Howard. “A Probabilistic Model of Human Activity Recognition with Loose Clothing.” Sensors 23.10 (2023): 4669.

2. Shen, Tianchen, et al. “Identification of the Design Parameters for a Spacer Fabric Pressure-Mapping Sensor.” Proceedings. Vol. 68. No. 1. MDPI, 2021.

3. Kamavuako, Ernest N., et al. “Affordable embroidered EMG electrodes for myoelectric control of prostheses: A pilot study.” Sensors 21.15 (2021): 5245.

4. Gionfrida, Letizia, et al. “Age-Related Reliability of B-Mode Analysis for Tailored Exosuit Assistance.” Sensors 23.3 (2023): 1670.

5. Gionfrida, Letizia, et al. “Validation of two-dimensional video-based inference of finger kinematics with pose estimation.” PLoS ONE 17.11 (2022): e0276799.

6. Gionfrida, Letizia, et al. “A 3DCNN-LSTM multi-class temporal segmentation for hand gesture recognition.” Electronics 11.15 (2022): 2427.