International Journal of
eISSN: 2573-2838

Biosensors & Bioelectronics

Research Article Volume 7 Issue 4

Musculoskeletal disorders analysis with artificial intelligence for Mexican Nom–036 normativity

José de Jesús Sandoval-Palomares, Gerardo Pérez-Duarte-Marcoux

Department of research and technological solutions, CIATEC AC, México

Correspondence: José de Jesús Sandoval-Palomares, Department of research and technological solutions, CIATEC AC, Omega 201, Industrial Delta 37545, León Gto, México

Received: June 11, 2021 | Published: July 6, 2021

Citation: Sandoval-Palomares JJ, Pérez-Duarte-Marcoux G. Musculoskeletal disorders analysis with artificial intelligence for Mexican Nom–036 normativity. Int J Biosen Bioelectron. 2021;7(3):96-100. DOI: 10.15406/ijbsbe.2021.07.00220


Abstract

According to statistics from the Mexican Institute of Social Security, from 2010 to 2019 the number of work accidents related to musculoskeletal disorders has been increasing, rising from 573 in 2010 to 6,297 in 2019. Mexico has an official standard, NOM-036-1-STPS-2017: Ergonomic risk factors at work-Identification, analysis, prevention and control. This work seeks to establish an analytical measurement method that obtains human pose estimation and joint angles under working conditions, providing evidence for actions to comply with the mentioned standard. For posture analysis, a real-time human pose estimation neural network from the TensorFlow platform is applied. A computer user interface was implemented in which the data obtained are stored and presented in tables and graphs; it interactively allows the user to change the reference angle and perform the analysis according to the standard.

Keywords: musculoskeletal disorders, human pose estimation, ergonomic evaluation, official standard, NOM-036

Introduction

In 2015, the World Health Organization (WHO)1 and the International Labour Office (ILO) established the need to develop a culture of prevention at work throughout the world. That same year, Jukka Takala, director of the ILO SafeWork programme, commented that "the most common occupational diseases are cancers attributable to exposure to dangerous substances, musculoskeletal diseases, and respiratory diseases".2-4 According to statistics from the Mexican Institute of Social Security (IMSS), from 2010 to 2019,5 the number of work accidents related to musculoskeletal disorders has been increasing, rising from 573 in 2010 to 6,297 in 2019. Mexico has an official standard, NOM-036-1-STPS-2017: Ergonomic risk factors at work-Identification, analysis, prevention and control. Figure 1 shows the practically exponential growth of work-associated musculoskeletal disorders from 2010 to 2019, according to IMSS data.

Figure 1 Practically exponential growth of work-associated musculoskeletal disorders from 2010 to 2019, according to Mexican Institute of Social Security data.

The analysis of musculoskeletal disorders among workers is a problem studied from various perspectives, for example by analyzing the different pain scales presented during repetitive activities.6 Studies establish that inadequate postures adopted by an operator at work are among the most important risk factors for work-related musculoskeletal disorders.7 To facilitate a prolonged working life, a balance between work activities and human capabilities must be found, especially for manufacturing workers.8 Various organizations around the world establish guidelines to regulate and follow up on musculoskeletal disorders, among them world bodies such as the WHO.9 In the European Union, the European Agency for Safety and Health at Work provides a set of practical tools and guidance on musculoskeletal disorders;10 in the USA, the Occupational Safety and Health Administration, in its ergonomics section, details this situation and how companies and workers must deal with it.11 In Mexico, the bodies that establish guidelines on occupational health are the "Secretaría del Trabajo y Previsión Social" (Ministry of Labor and Social Welfare)12 and the IMSS, through the standard PROY-NOM-036-1-STPS-2017, Ergonomic risk factors at work: identification, analysis, prevention and control of manual handling of loads, which establishes the provisions to be adopted in work centers in order to prevent health risks to workers. In Mexico in particular, studies on musculoskeletal disorders have addressed the risk of back injury in the workplace due to manual handling of loads.13 Regarding muscular overload in work activities due to posture,14 it was found that effort and repetitive movements of defined intensity, frequency, and duration explain the discomfort in the dynamic work of the upper limbs.
An extensive study on diseases and injuries of the musculoskeletal system presents a classification and prevention scheme for musculoskeletal disorders.15 To obtain video of people in work activities, as a useful source for ergonomic studies and human pose detection, various devices can be used, such as the Kinect,16-18 conventional digital cameras, and even web cameras.19-22 Artificial intelligence (AI) techniques and video processing prove useful for obtaining human postures in work activities.23-25 For the estimation of human posture, machine learning is applied with a neural network built on TensorFlow technology, which takes an image or a video and estimates the spatial locations of key body joints: nose, left eye, right eye, left ear, right ear, left shoulder, right shoulder, left elbow, right elbow, left wrist, right wrist, left hip, right hip, left knee, right knee, left ankle, right ankle.26 This allows obtaining useful angles and references that can be interpreted for various uses. Applications of pose estimation and data collection during human-robot interaction, sports, and rehabilitation have been reported.27,28

Materials and methods

Video processing

Videos for data processing are recorded in the work environment while the worker carries out their productive activity. Visual aids on how to take the videos were developed for users, considering the documentation of the standard. This guide defines that the video can be taken with conventional digital cameras or even mobile-phone or webcam cameras; the shot must be framed so that the movement to be analyzed is observed; a tripod should be used to avoid unwanted movements in the video; lighting should correspond to normal operating conditions, daylight or artificial; and the video resolution should be 640×480 pixels. Where necessary, a test video was requested in order to guide the user toward the most appropriate setup. The most suitable environment for taking the video and measuring work activities is the real workplace, that is, an industrial environment. It is important to consider that this scenario has a high probability of presenting elements or objects that constitute noise or distractors with respect to the object of interest, in this case the human body and its movements; there may be other people doing other activities or even moving machinery. The pose estimation model automatically discriminates the noise elements, leaving only the human figures; the neural network model is already trained for this and is highly efficient at identifying the human form. So that processing is carried out only on the region of interest, making it more efficient and reducing processing time, regions within the video are defined by specifying their coordinates from position [0,0], the upper left corner; for example, a region from [300,200] to [370,250], which would lie near the center of the video.
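The region-of-interest step described above can be sketched as a simple crop. This is an illustrative snippet, not the authors' implementation; the function name `crop_region` is our own, and it assumes frames are NumPy arrays indexed as [row, column], so the [x, y] coordinates from the text are swapped when slicing.

```python
import numpy as np

def crop_region(frame: np.ndarray, top_left, bottom_right) -> np.ndarray:
    """Crop a rectangular region of interest from a video frame.

    Coordinates follow the convention in the text: [0, 0] is the upper-left
    corner and points are given as [x, y] pairs.
    """
    x1, y1 = top_left
    x2, y2 = bottom_right
    # NumPy frames are indexed [row (y), column (x)]
    return frame[y1:y2, x1:x2]

# Example: a 640x480 frame, cropping the region [300,200]-[370,250]
frame = np.zeros((480, 640, 3), dtype=np.uint8)
roi = crop_region(frame, (300, 200), (370, 250))
print(roi.shape)  # (50, 70, 3)
```

Only the cropped region would then be passed to the pose estimation model, which reduces per-frame processing time.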

Data acquisition

The video processing yields a set of data, in particular the x, y positions of the shoulders, hips, and knees within each video frame. The "PoseNet" algorithm generates a data matrix whose columns are the x, y positions and whose rows are the identified elements: nose, left eye, right eye, left ear, right ear, left shoulder, right shoulder, left elbow, right elbow, left wrist, right wrist, left hip, right hip, left knee, right knee, left ankle, right ankle. From the generated matrix, the rows corresponding to left shoulder, right shoulder, left hip, right hip, left knee, and right knee are extracted, and with these a secondary table is generated, which is used to calculate the angle of interest. The angles are estimated at the joints of interest, knee, hip, and shoulder, which form a vertex that allows calculating the inclination of the person; Figure 2 shows the angles, considering a person standing at 180° and with an inclination of 150°. To calculate the angle of interest, the angle theorem between two lines is applied, where the tuple a (x1, y1) is one end of the first line, the tuple b (x2, y2) is the point where the two lines meet and form the angle, and the tuple c (x3, y3) is the end of the second line. Given these points, formula (1) is applied to obtain the angle. When the standing person is completely straight, a 180° angle is detected; as the person leans, the angle decreases.
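The extraction of the secondary table can be illustrated as follows. This is a hedged sketch, not the authors' code: it assumes the 17 keypoints appear in the matrix in the same order they are listed in the text, and the names `KEYPOINTS`, `JOINTS_OF_INTEREST`, and `select_joints` are our own.

```python
import numpy as np

# Keypoint order as listed in the text (17 rows; columns are x, y)
KEYPOINTS = [
    "nose", "leftEye", "rightEye", "leftEar", "rightEar",
    "leftShoulder", "rightShoulder", "leftElbow", "rightElbow",
    "leftWrist", "rightWrist", "leftHip", "rightHip",
    "leftKnee", "rightKnee", "leftAnkle", "rightAnkle",
]
JOINTS_OF_INTEREST = ["leftShoulder", "rightShoulder",
                      "leftHip", "rightHip", "leftKnee", "rightKnee"]

def select_joints(pose_matrix: np.ndarray) -> dict:
    """Build the secondary table: (x, y) for shoulders, hips, and knees."""
    return {name: tuple(pose_matrix[KEYPOINTS.index(name)])
            for name in JOINTS_OF_INTEREST}

# Example with a dummy 17x2 matrix (row i holds values 2i, 2i+1)
pose = np.arange(34, dtype=float).reshape(17, 2)
table = select_joints(pose)
print(table["leftShoulder"])  # (10.0, 11.0)
```

The resulting dictionary is the per-frame input for the angle calculation of formula (1).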

Figure 2 Shows the angles, considering a person standing at 180° and with an inclination of 150°.

angle = degrees[ atan2(c[y3] - b[y2], c[x3] - b[x2]) - atan2(a[y1] - b[y2], a[x1] - b[x2]) ]             (1)

where atan2 is the two-argument arc tangent, which returns the angle of the point (x, y) in radians; the x's and y's are the coordinates of the points a, b, and c; degrees converts an angle from radians to degrees; and angle is the calculated angle. Once the angle is obtained, a secondary matrix is generated to store the data, grouped in the tuple (frame number, degrees, threshold).
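Formula (1) can be implemented directly with the standard library. This is a minimal sketch under one assumption: the raw difference of the two arc tangents can be negative or exceed 180°, so a normalization to the range [0°, 180°] is added here, which the formula as printed leaves implicit.

```python
from math import atan2, degrees

def joint_angle(a, b, c) -> float:
    """Angle at vertex b formed by segments b->a and b->c, per formula (1).

    a, b, c are (x, y) tuples, e.g. shoulder, hip, and knee positions.
    """
    raw = degrees(atan2(c[1] - b[1], c[0] - b[0])
                  - atan2(a[1] - b[1], a[0] - b[0]))
    # Normalize to [0, 180] (assumption; see lead-in)
    raw = abs(raw)
    return raw if raw <= 180 else 360 - raw

# A fully upright person: shoulder, hip, and knee on one vertical line -> 180°
print(joint_angle((0, 0), (0, 1), (0, 2)))  # 180.0
```

With shoulder, hip, and knee collinear the function returns 180°, matching the text's description of a completely straight standing posture; a leaning posture yields a smaller angle.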

Data visualization

An interface was designed to show the numerical results obtained from the processing and video analysis. The interface presents a results table listing the frames in which the action occurred and the activity being analyzed, for example torso torsion and flexion. This table allows filters to be applied, which generate representative graphs of the analyzed action. According to the changes the user applies to the filter, the interface marks the positions with a traffic light: green means an activity posture performed correctly (no restriction), yellow means an activity posture to review (restricted), and red means a prohibited activity posture (severely restricted). Finally, the interface also allows viewing the original video, the video processed with the general frame, and the musculoskeletal frame alone.
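The traffic-light mapping can be sketched as a simple thresholding function. The text only names the three levels, so the cut-off values below (150° and 120°) are illustrative assumptions, not values taken from the standard or the paper; the function name `traffic_light` is likewise our own.

```python
def traffic_light(angle_deg: float,
                  yellow_threshold: float = 150.0,
                  red_threshold: float = 120.0) -> str:
    """Map a back-inclination angle to the interface's traffic-light levels.

    green  = posture performed correctly (no restriction)
    yellow = posture to review (restricted)
    red    = prohibited posture (severely restricted)
    Threshold values are illustrative assumptions.
    """
    if angle_deg >= yellow_threshold:
        return "green"
    if angle_deg >= red_threshold:
        return "yellow"
    return "red"

print(traffic_light(178))  # green
print(traffic_light(135))  # yellow
print(traffic_light(95))   # red
```

Keeping the thresholds as parameters mirrors the interface's design choice of letting the user adjust the reference angle.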

Results

Video acquisition

An example of the video capture guide is shown in Figure 3, which defines the type of shot for the element to be analyzed. The image shows the sections Torsion and lateral flexion of the torso ("Torsión y flexión lateral del torso") and Vertical lifting region ("Región de levantamiento vertical"); each stage is presented with images represented by mannequins and a brief explanation of the section. A "YES" specifies the correct way of taking the video and a "NO" the incorrect way, depending on the position of the worker, corresponding to a side or front view respectively in these guides (Figure 4).

Figure 3 Example of the video capture guide, A) "Torsion and lateral flexion of the torso", B) "Vertical lifting region".

Figure 4 Shows an example of a frame for three videos A) original video, B) original video with musculoskeletal frame, C) only musculoskeletal frame.

Data acquisition

Once the videos are obtained, they are processed and two additional ones are generated: the first adds the musculoskeletal frame to the original video, and the second contains only the musculoskeletal frame; both are in mp4 format. With these three videos stored, their references are placed so they can be shown online for consultation by users. Figure 4 shows an example of a frame for these three videos: a) original video, b) original video with musculoskeletal frame, and c) only the musculoskeletal frame. In Table 1, the first column shows the number of each processed video frame out of the total that make up the recording. The second column indicates the degrees of inclination of the back, where 180° means the standing person is completely straight; as the person bends, this value decreases. The greater the inclination, the lower the value in degrees in the second column and therefore the greater the risk to back health. To determine the moments of greatest risk, a threshold was established to classify low- and high-risk positions. Since the NOM-036 standard does not indicate a specific risk value, a threshold of 150° was established for this study. In the third column, low occupational risk positions, corresponding to values that exceed the threshold, are indicated with "0", while those below the threshold are indicated with "1" and are therefore of greater risk, referenced to the 150° angle mentioned, which can be modified by the user. It is important to note that, since the degrees of each video frame are kept in the database, the third column can be recalculated based on whatever inclination criterion is established to consider a posture risky.

Video frame    Degrees    Less than threshold
1              178        0
2              165        0
3              121        1
4              137        1
5              135        1
6              121        1
7              110        1
8              106        1
9              95         1
10             93         1
11             80         1
12             81         1

Table 1 The first column gives the processed video frame number, the second column the degrees of back inclination, and the third column the risk flag: "0" indicates low occupational risk and "1" indicates greater risk, referenced to the 150° angle
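The per-frame classification behind Table 1's third column can be reproduced with a few lines. This is an illustrative sketch; the function name `classify_frames` is our own, and the 150° threshold is kept as a parameter because, as noted above, NOM-036 does not fix a value and the interface lets the user change it.

```python
def classify_frames(frames_degrees, threshold: float = 150.0):
    """Build (frame number, degrees, risk flag) tuples as in Table 1.

    Risk flag: 0 = low risk (angle above the threshold),
               1 = greater risk (angle at or below it).
    """
    return [(i + 1, deg, 0 if deg > threshold else 1)
            for i, deg in enumerate(frames_degrees)]

# Degrees taken from the first rows of Table 1
rows = classify_frames([178, 165, 121, 137])
print(rows)  # [(1, 178, 0), (2, 165, 0), (3, 121, 1), (4, 137, 1)]
```

Because the degrees are stored per frame in the database, re-running this function with a different threshold recomputes the whole third column, as the text describes.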

Data visualization

The interface, developed on a web platform, consists of four sections (Figure 5). In the first section, "Apartado", a diagram of the particular section of the standard being analyzed is presented, together with written guidance. In the second section, "Videos", a group of buttons allows viewing three videos: the original video, the video processed with the musculoskeletal frame, and the musculoskeletal frame alone. The data section, "Datos", shows a table with the different angles obtained from the video processing; the user can sort by record, video frame, or event according to the standard, with "0" for a low-risk situation and "1" for a risky situation. Finally, the graphics section, "Grafica", shows two graphs: the first, inclination degrees ("Grados de inclinación"), shows the degrees over the duration of the video, in this example with pronounced drops below 100 degrees; the second, time exceeding maximum inclination ("Tiempo que supera inclinación máxima"), shows the intervals in which the maximum inclination was exceeded, the blank spaces of the graph corresponding to postures within the allowed range.

Figure 5 The interface, developed on a web platform, consists of four sections: "Apartado", a diagram of the section of the standard being analyzed; "Videos", buttons allowing the three videos to be viewed; "Datos", a table with the different angles obtained; and "Grafica", two graphs showing inclination degrees and time exceeding maximum inclination.

Conclusion

The NOM-036-1-STPS-2018 standard is quite extensive, and compliance has mostly been assessed in a traditional way: taking a video, taking pictures, measuring angles on them, and recording the data on sheets of paper or, in the best cases, in Excel templates. Our proposal implements emerging technologies to achieve a more precise quantitative analysis, store the data, and interact with it so that users can review the parameters and gradually adjust the critical angle data through an interface prepared for this purpose. In an innovative way, AI techniques are applied to posture and movement identification, with artificial generation of the musculoskeletal frame, to detect inappropriate, at-risk, and adequate postures in work activity situations under the regulations requested by countries, such as the Mexican standard NOM-036-1-STPS-2018. Work continues: as related future work, templates are being implemented for the automatic generation of the reports that the standard requires, to present useful documentary evidence for companies and auditors, in addition to exploring the possibility of analyzing more than one person per video.

Acknowledgments

None.

Conflicts of interest

Authors declare that there is no conflict of interest.

References

  1. https://www.who.int/news-room/fact-sheets/detail/musculoskeletal-conditions
  2. https://www.who.int/news-room/fact-sheets/detail/musculoskeletal-conditions
  3. https://safework.es/en/
  4. https://www.ilo.org/global/lang--en/index.htm
  5. http://www.imss.gob.mx/conoce-al-imss/informes-estadisticas
  6. Walsh IA, Oishi J, Coury HJ. Clinical and functional aspects of work-related musculoskeletal disorders among active workers. Rev Saude Publica. 2008;42(1):108–116.
  7. Barkallah E, Freulard J, Otis MJ, et al. Wearable Devices for Classification of Inadequate Posture at Work Using Neural Networks. Sensors (Basel, Switzerland). 2017;17(9):2003.
  8. Kadefors R, Nilsson K, Östergren PO, et al. Social inequality in working life expectancy in Sweden. Z Gerontol Geriatr. 2019;52:52–61.
  9. https://www.who.int/news-room/fact-sheets/detail/musculoskeletal-conditions
  10. https://osha.europa.eu/en/themes/musculoskeletal-disorders
  11. https://www.osha.gov/ergonomics
  12. https://www.gob.mx/stps.
  13. Maribel Balderas López, Mireya Zamora Macorra, Susana Martínez Alcántara. Trastornos musculo esqueléticos en trabajadores de la manufactura de neumáticos, análisis del proceso de trabajo y riesgo de la actividad. 2019.
  14. Arenas-Ortiz L, Cantú-Gómez Ó. Factores de riesgo de trastornos músculo-esqueléticos crónicos laborales. Med Int Mex. 2013;29(4):370–379.
  15. http://www.inr.gob.mx/descargas/ops-oms/lasenfermedadestraumatismossistemamusculoesqueletico.pdf
  16. Hernández ÓG, Morell V, Ramon JL, et al. Human Pose Detection for Robotic-Assisted and Rehabilitation Environments. Applied Sciences. 2021;11(9):4183.
  17. Plantard P, Auvinet E, Pierres AS, et al. Pose estimation with a Kinect for ergonomic studies: evaluation of the accuracy using a virtual mannequin. Sensors (Basel). 2015;15(1):1785–1803.
  18. Diego-Mas JA, Alcaide-Marzal J. Using KinectTM sensor in observational methods for assessing postures at work. Applied Ergonomics. 2014;45(4):976–985.
  19. Olibario J Machado Neto, Amanda Polin Pereira, Valeria Meirelles C Elui, et al. Posture Monitoring via Mobile Devices: SmartVest Case Study. Pimentel Proceedings of the 22nd Brazilian Symposium on Multimedia and the Web. 2016:55–61.
  20. Pedro Vinicius A de Freitas, Paulo Renato C Mendes, Antonio José G Busson, et al. An ergonomic evaluation method using a mobile depth sensor and pose estimation. Proceedings of the 25th Brazilian Symposium on Multimedia and the Web. 2019:445–452.
  21. Proceedings of the 25th Brazilian Symposium on Multimedia and the Web. October 2019:445–452.
  22. https://deepblue.lib.umich.edu/bitstream/handle/2027.42/151464/meiyin_1.pdf?sequence=1
  23. Li L, Xu X. A deep learning-based RULA method for working posture assessment. Proceedings of the Human Factors and Ergonomics Society Annual Meeting. 2019;63(1):1090-1094.
  24. Toshev A, Szegedy C. DeepPose: Human Pose Estimation via Deep Neural Networks. 2014 IEEE Conference on Computer Vision and Pattern Recognition. 2014:1653–1660.
  25. Kothari Shruti. Yoga Pose Classification Using Deep Learning. 2020.
  26. https://www.tensorflow.org/lite/examples/pose_estimation/overview
  27. Andriluka M, Pishchulin L, Gehler P, et al. 2D Human Pose Estimation: New Benchmark and State of the Art Analysis. 2014 IEEE Conference on Computer Vision and Pattern Recognition. 2014: 3686-3693.
  28. Martinez J, Hossain R, Romero J, et al. A Simple Yet Effective Baseline for 3d Human Pose Estimation. 2017 IEEE International Conference on Computer Vision (ICCV). 2017:659–668.
Creative Commons Attribution License

©2021 Sandoval-Palomares, et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and building upon your work non-commercially.