Occupational Ergonomics and Biomechanics Laboratory           


The following research projects are being conducted in the Occupational Ergonomics and Biomechanics Laboratory:

Video Motion Analysis of Repetitive Exertions 

Upper extremity musculoskeletal injuries are common in hand-intensive work involving highly repetitive motions and exertions. Previous exposure assessment methods involve either direct measurements using instruments attached to a worker's hands or arms, or indirect observations; both are highly impractical for occupational health practice. Compared to instruments, indirect observation lacks precision and accuracy, is not suitable for long observation periods, and requires considerable analyst time. Conversely, attaching sensors to working hands is too time consuming, and sensors may interfere with normal work. We are developing novel real-time video processing algorithms to automatically and unobtrusively measure upper limb kinematics from conventional single-camera digital video. In this exploratory research, we use a video feature extraction method to indirectly quantify hand activity using marker-less video motion analysis. The real-time video algorithms are designed for robustness, using sequential Bayesian estimation to track the motion of a general region of interest (ROI) on the upper extremities selected by the camera operator or analyst, and statistically estimating its velocity and acceleration. The long-term goal is to develop a video-based direct-reading exposure assessment instrument for upper limb repetitive motion activities, useful for evaluating the risk of injuries in the workplace and for primary prevention.  [VIDEO]
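As a rough illustration of the tracking approach described above, the sketch below applies a constant-acceleration Kalman filter (one common form of sequential Bayesian estimation) to noisy per-frame position measurements of a single ROI coordinate, jointly estimating its position, velocity, and acceleration. The one-dimensional simplification, function name, and noise parameters are our own illustrative assumptions, not the laboratory's actual algorithm.

```python
import numpy as np

def kalman_track(measurements, dt=1/30.0, meas_var=1.0, proc_var=1e-2):
    """Track a 1-D ROI coordinate with a constant-acceleration Kalman
    filter, estimating position, velocity, and acceleration from noisy
    per-frame position measurements (illustrative parameters)."""
    # State transition for state vector [position, velocity, acceleration]
    F = np.array([[1.0, dt, 0.5 * dt**2],
                  [0.0, 1.0, dt],
                  [0.0, 0.0, 1.0]])
    H = np.array([[1.0, 0.0, 0.0]])   # only position is observed
    Q = proc_var * np.eye(3)          # process noise (model uncertainty)
    R = np.array([[meas_var]])        # measurement noise
    x = np.zeros((3, 1))              # initial state estimate
    P = np.eye(3) * 10.0              # initial state covariance
    estimates = []
    for z in measurements:
        # Predict step: propagate state and covariance through the model
        x = F @ x
        P = F @ P @ F.T + Q
        # Update step: correct with the new measurement
        y = np.array([[z]]) - H @ x             # innovation
        S = H @ P @ H.T + R                     # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
        x = x + K @ y
        P = (np.eye(3) - K @ H) @ P
        estimates.append(x.flatten().copy())
    # Columns: estimated position, velocity, acceleration per frame
    return np.array(estimates)
```

For an ROI moving at a constant 60 pixels/s sampled at 30 frames/s, the velocity column converges toward 60 after a few dozen frames, which is the quantity of interest for repetitive-motion exposure metrics.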

Sponsor:  National Institute of Biomedical Imaging & Bioengineering, National Institutes of Health 


Video Exposure Assessment of Hand Activity Level

This exploratory research investigates the feasibility of automatically evaluating the American Conference of Governmental Industrial Hygienists (ACGIH) Hand Activity Level (HAL) using digital video processing. We are developing feature extraction algorithms for quantifying HAL in real time from conventional digital video focused on the upper extremities. There is currently no practical instrument for objectively, unobtrusively, and efficiently measuring repetitive motion exposure to evaluate the risk of musculoskeletal injuries in the workplace. Current methods involve either direct measurements using instruments attached to a worker's hands or arms, or indirect observations. Both instrument and observation methods are mostly limited to research studies and are highly impractical for industry practitioners. This approach uses digital video processing to measure repetitive motion exposure. Ultimately this research may lead to a video-based direct-reading exposure assessment instrument with broad applications for occupational health and safety. 
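Once exertion frequency and duty cycle have been extracted from video, HAL can be estimated numerically. The sketch below uses the published regression reported in the ACGIH TLV documentation, HAL = 6.56 ln(D) [F^1.31 / (1 + 3.18 F^1.31)], where F is exertion frequency (exertions/s) and D is duty cycle (%); the function name and clamping are our own illustrative choices.

```python
import math

def hal_estimate(freq_hz, duty_pct):
    """Estimate ACGIH Hand Activity Level (0-10 scale) from exertion
    frequency F (exertions/s) and duty cycle D (% of cycle exerting),
    using the published regression
    HAL = 6.56 ln(D) [F^1.31 / (1 + 3.18 F^1.31)].
    The result is clamped to the 0-10 HAL rating scale."""
    f = freq_hz ** 1.31
    hal = 6.56 * math.log(duty_pct) * f / (1.0 + 3.18 * f)
    return max(0.0, min(10.0, hal))
```

For example, one exertion every two seconds at a 50% duty cycle yields a HAL of about 4.5, near the middle of the scale.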

Chen, C-H., Hu, Y. H., Yen, T. Y., and Radwin, R. G., Automated Video Exposure Assessment of Repetitive Hand Motion, Human Factors, 55(2), 298-308, 2013.

Sponsor:  Department of Health and Human Services, National Institute for Occupational Safety and Health

Virtual Exertions

Simulation of physical activities is important not only for research, testing, and development, but also for training participants to perform physically demanding tasks safely, preventing overexertions, and avoiding the risks of falling.  While simulation studies in VR provide ideal and safe experimental conditions to mimic images and situations and to explore human behavior, virtual objects, unlike physical objects, are created by image projections and therefore lack weight and haptic feedback.  It is therefore important that users overcome the natural forces of gravity, friction, and inertia in order to directly manipulate virtual objects without control devices.

In this study we explore whether we can create more natural virtual experiences through the concept of virtual exertions, a method utilizing biofeedback from electromyograms (EMG) as well as gestures and movements for interacting with virtual environments.  We define virtual exertions as a mapping of human-generated forceful actions, postures, and movements that are generally used to manipulate physical objects against projections of objects in the hands, serving as an interface into the virtual environment.

To create virtual exertions, EMG activity is monitored during rehearsed co-contractions of the agonist/antagonist muscles used for specific actions, and contraction patterns and levels are combined with tracked motion of the body and hands to identify when the participant is exerting sufficient force to displace the intended object in a high-fidelity 3D CAVE virtual environment.  EMG electrodes affixed to the forearm and arm muscles feed this information back to the computer, causing the virtual object to move according to physical laws so that it visually reacts as it would if it were a massive object.

Resistance to forces comes directly from the co-contracting antagonist muscles, which stiffen the joints and are naturally involved in forceful exertions for joint stabilization and posture maintenance.  Continuous visual feedback displays the participant's mechanical work against virtual objects with simulated inertial properties.
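The EMG-gated grasp logic described above can be caricatured as a threshold test. The following hypothetical sketch (the function name, thresholds, and one-dimensional positions are our own simplifications, not the study's implementation) lets a virtual object follow the hand only when agonist/antagonist co-contraction exceeds a normalized level while the hand overlaps the object.

```python
def virtual_exertion_step(emg_agonist, emg_antagonist, hand_pos, obj_pos,
                          grasp_radius=0.1, threshold=0.3):
    """One update of an EMG-gated virtual grasp (hypothetical
    simplification). EMG inputs are normalized to [0, 1] as a fraction
    of maximum voluntary contraction. The object tracks the hand only
    when both muscles co-contract above threshold and the hand is
    within grasp_radius of the object; otherwise it stays put."""
    co_contraction = min(emg_agonist, emg_antagonist)   # weakest of the pair
    near = abs(hand_pos - obj_pos) <= grasp_radius      # hand overlaps object
    if co_contraction >= threshold and near:
        return hand_pos   # sufficient exertion: the object is "held"
    return obj_pos        # insufficient exertion: the object does not move
```

Using the co-contraction minimum means a participant must stiffen both the agonist and antagonist, mirroring the joint-stabilizing exertion pattern that supplies the sense of resistance.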

Virtual Exertions’ Research Uses Muscle Activity to Move Virtual Objects  [VIDEO]


Ponto, K., Gleicher, M., Radwin, R. G., and Shin, H. J., Perceptual calibration for immersive display environments, IEEE Transactions on Visualization and Computer Graphics, 19(4), 691-700, 2013. [PMC]

Ponto, K., Kimmel, R., Kohlmann, J., Bartholomew, A., and Radwin, R. G. Virtual Exertions: a user interface combining visual information, kinesthetics and biofeedback for virtual object manipulation. In 3D User Interfaces (3DUI), 2012 IEEE Symposium on (pp. 85-88). IEEE., 2012. [PDF]

Radwin, R. G., Chen, K. B., Ponto, K., and Tredinnick, R. D., Virtual Exertions: Physical Interactions in a Virtual Reality CAVE for Simulating Forceful Tasks, Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 57(1), 967-971, 2013.

Hand Tool Operator Reaction Force Prediction Computer Model

Our laboratory has developed a novel biodynamic tool operator model and associated tool test instruments for evaluating tool and operator response to various tool spindle conditions.  This model predicts static and dynamic handle reaction forces for a population of male and female tool operators at different horizontal and vertical locations and orientations.  Instruments for measuring tool inertial properties, including center of mass and mass moment of inertia, have been developed for obtaining tool parameters used in the model.  The model is used for predicting operator reaction to pneumatic, programmed electric, and pulse tools, in order to select the tool and tool installation that best enhances tool performance and minimizes operator physical stress. 
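As a simplified illustration of this kind of biodynamic prediction, the sketch below integrates a single-degree-of-freedom mass-spring-damper "operator" driven by a handle force time series and returns the resulting hand/handle displacement. The parameter values and the one-degree-of-freedom reduction are placeholders for illustration, not those identified in the laboratory's model.

```python
def handle_displacement(force, m=1.0, c=20.0, k=100.0, dt=1e-3):
    """Simulate hand/handle displacement for a single-degree-of-freedom
    mass-spring-damper operator model driven by a handle force time
    series (N), sampled every dt seconds. Parameters are illustrative:
    m = effective hand/arm mass (kg), c = damping (N*s/m),
    k = stiffness (N/m). Uses semi-implicit Euler integration."""
    x, v = 0.0, 0.0          # displacement (m) and velocity (m/s)
    xs = []
    for f in force:
        a = (f - c * v - k * x) / m   # Newton's second law for the hand mass
        v += a * dt                   # update velocity first (semi-implicit)
        x += v * dt                   # then position, for numerical stability
        xs.append(x)
    return xs
```

A constant 10 N handle force settles toward the static deflection F/k = 0.1 m, while a rapid torque pulse would instead produce a transient peak, which is the kind of operator response the model is meant to compare across tools.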

Lin, J. H., Radwin, R. G., and Richard, T. G., Handle dynamics predictions for selected power hand tool applications, Human Factors, 45(4), 645-656, 2003.  [PDF]

Radwin, R. G., Chourasia, A. O., Howery, R. S., Fronczak, F. J., Yen, T. Y., Subedi, Y., and Sesto, M. E., Factors Influencing Power Hand Tool Fastening Accuracy and Reaction Forces, Human Factors, 2014.  [Request Reprint]

A Virtual Reality Rehabilitative System for Mitigating Musculoskeletal Pain-Related Fear Avoidance

This project investigates altering visual feedback through control-display gain in a head-mounted display (HMD) to encourage individuals with kinesiophobia, i.e., fear avoidance of movement due to chronic pain, to perform therapeutic neck exercises.  The VR rehabilitation system consists of three commercially available hardware devices: the Microsoft Kinect™ for translational motion and position tracking, the Oculus Rift™ HMD for visual feedback and rotational movement tracking through a built-in accelerometer, and an Xbox 360 controller for user input.  Preliminary studies show that immersive VR can effectively alter visual feedback; we will next assess its effectiveness for rehabilitation.
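Control-display gain manipulation of this kind can be illustrated in a few lines: the rotation shown in the HMD is the physical neck rotation scaled by a gain, so a patient sees a larger apparent range of motion than they actually performed. The gain and limit values below are hypothetical examples, not the study's settings.

```python
def displayed_rotation(actual_deg, gain=1.5, limit=90.0):
    """Apply a control-display gain to head rotation (hypothetical
    values): the scene in the HMD rotates by gain times the user's
    physical neck rotation, clamped to a plausible display limit.
    With gain > 1, a small physical movement yields a larger apparent
    movement, which may help a fearful patient build range of motion."""
    scaled = gain * actual_deg
    return max(-limit, min(limit, scaled))
```

For instance, a 40-degree physical turn is displayed as 60 degrees at a gain of 1.5, while turns beyond the limit are clamped so the displayed scene stays within a believable range.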