Abstract:
Traditional muscle strength training instruments often rely on torque-based feedback to guide exercises, which can introduce delays in system response and cause discomfort due to hysteresis effects. Surface electromyography (sEMG) signals were used as control inputs to overcome this lag. The sEMG signal is generated 20-80 ms before movement, a period known as the muscle electromechanical delay. If the torque can be effectively predicted within this window, the lag can be significantly reduced, improving both the effectiveness and comfort of training. We therefore propose a multistep-ahead (MSA) model based on the nonlinear autoregressive network with exogenous inputs (NARX), a dynamic recurrent neural network, which predicts torque from sEMG and allows natural control of the instrument. The results showed that the normalized root-mean-square error (NRMSE) remained below 0.1167 and the Pearson correlation coefficient (rho) exceeded 0.9444, even when the number of ahead steps reached 35. Intrasubject and intersubject validation demonstrated significantly lower NRMSE (p < 0.05) and higher rho (p < 0.05) for the MSA model compared with several state-of-the-art recursive models and typical models without autoregressive terms, showing that the MSA model can accurately predict motion. Meanwhile, introducing the sEMG signal as the control source significantly reduced the root-mean-square jerk (RMSJ) of the torque, indicating smoother motion. The experimental results revealed that the one-step-ahead model achieved an average response time of 3.73 ms, markedly lower than the muscle electromechanical delay, and that the response time increased by approximately 0.068 ms per additional ahead step. In conclusion, the proposed sEMG-driven muscle strength training instrument enables natural muscle strength training.
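The core idea of an MSA NARX predictor is that each predicted torque value is fed back as an autoregressive input for the next step, so the network can roll forward many steps ahead of the measured signal. The following is a minimal sketch of that closed-loop rollout, not the paper's trained model: the one-hidden-layer tanh network, the weight shapes, and the random demo weights are all illustrative assumptions.

```python
import numpy as np

def narx_predict_msa(emg, torque_hist, W1, b1, W2, b2, n_ahead):
    """Multistep-ahead (MSA) NARX rollout sketch.

    emg         : exogenous input window, shape (n_ahead, n_ex) -- sEMG features
    torque_hist : past torque values (autoregressive taps), most recent last
    W1, b1, W2, b2 : weights of an illustrative one-hidden-layer tanh network
    Returns the predicted torque trajectory of length n_ahead.
    """
    n_ar = len(torque_hist)
    hist = list(torque_hist)
    preds = []
    for k in range(n_ahead):
        # Input = current sEMG features + the last n_ar torque values.
        x = np.concatenate([emg[k], hist[-n_ar:]])
        h = np.tanh(W1 @ x + b1)          # hidden layer
        y = float(W2 @ h + b2)            # scalar torque estimate
        preds.append(y)
        hist.append(y)                    # closed loop: prediction becomes an AR input
    return np.array(preds)

# Demo with illustrative random weights (NOT trained weights from the paper):
# 4 sEMG features, 3 autoregressive taps, 8 hidden units, 35 ahead steps.
rng = np.random.default_rng(0)
n_ex, n_ar, n_hidden, n_ahead = 4, 3, 8, 35
emg = rng.standard_normal((n_ahead, n_ex))
W1 = 0.1 * rng.standard_normal((n_hidden, n_ex + n_ar))
b1 = np.zeros(n_hidden)
W2 = 0.1 * rng.standard_normal(n_hidden)
b2 = 0.0
pred = narx_predict_msa(emg, [0.0] * n_ar, W1, b1, W2, b2, n_ahead)
```

In open-loop (one-step-ahead) operation the AR taps would instead hold measured torque; the closed-loop rollout above is what lets the predictor run up to 35 steps ahead of the measurement.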
Source:
IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT
ISSN: 0018-9456
Year: 2025
Volume: 74
Impact Factor (JCR@2023): 5.600
ESI Highly Cited Papers on the List: 0