Horizon CDT Research Highlights

Automatic pain assessment in neonates

  Joy Egede (2013 cohort)   www.cvl.cs.nott.ac.uk/people/joy-egede.html

Background and motivation

As part of medical treatment, neonates in Intensive Care Units (ICUs) undergo numerous painful medical procedures. Contrary to the earlier belief that neonates are less sensitive to pain or do not recall it [1], clinical studies have shown that neonates are in fact more sensitive to pain than older children. Prolonged exposure to pain and to analgesics may have adverse effects on the child's future well-being and could alter their eventual sensitivity to pain [2].

Currently, pain assessment in newborns relies on observation of physiological, contextual and behavioural indicators, including changes in facial expression, heart rate, palm sweating, limb movement and crying. Infants are usually monitored immediately after painful medical procedures and then every two to four hours, depending on the severity of illness. However, this approach does not allow continuous monitoring of the response to pain medication, because constant observation would place enormous demands on nursing staff. In addition, the method is highly subjective and potentially biased, which puts the infant at risk.

Research aim

  1. To develop an objective pain assessment tool that will support nurses' clinical decisions on infant pain treatment and also provide real-time pain monitoring.
  2. To investigate the perceptions of neonatal ICU nurses towards using an automatic pain assessment tool.

Research methodology

This work proposes to apply automatic human behaviour recognition techniques to infant pain assessment. These techniques have already been applied in areas such as security and surveillance, health systems, drowsy/drunk driver detection, games and depression recognition [3]. The first part of this study will involve video recording of newborns undergoing painful medical procedures, as well as infants in pain due to long-term medical conditions. The videos will be rated for pain intensity using standard infant pain assessment tools. Physiological data, e.g. heart rate and oxygen saturation, will also be recorded.
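As a rough illustration of how such multimodal recordings and ratings might be organised for analysis, the sketch below (in Python) defines a simple per-observation-window record. The field names, the rating value and the example numbers are assumptions made for illustration; they do not describe the study's actual data schema or pain scale.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class PainAssessmentRecord:
        """One annotated observation window for a single infant (illustrative only)."""
        infant_id: str
        video_path: str       # recorded video segment for this window
        start_time_s: float   # window start, seconds from recording onset
        end_time_s: float     # window end
        pain_rating: float    # intensity assigned with a standard infant pain tool
        heart_rate_bpm: List[float] = field(default_factory=list)  # sampled heart rate
        spo2_percent: List[float] = field(default_factory=list)    # oxygen saturation samples

    # Hypothetical record for a ten-second window rated by a nurse.
    record = PainAssessmentRecord(
        infant_id="infant_001",
        video_path="videos/infant_001_procedure.mp4",
        start_time_s=12.0,
        end_time_s=22.0,
        pain_rating=4.0,
        heart_rate_bpm=[152.0, 158.0, 161.0],
        spo2_percent=[96.0, 95.0, 94.5],
    )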

Machine learning and computer vision techniques will be used to develop an automatic infant pain assessment tool from analysis of audio-visual data and physiological signals.
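By way of illustration, the sketch below shows one common way such a tool could be built: per-window features from the video and the physiological signals are concatenated (feature-level fusion) and passed to a regressor that predicts a continuous pain intensity. The feature dimensions, the support vector regressor and the synthetic data are assumptions chosen for the example, not the actual model used in this project.

    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVR

    rng = np.random.default_rng(0)
    n_windows = 200

    # Placeholder per-window features: e.g. facial appearance/shape descriptors
    # from video, and summary statistics of heart rate and oxygen saturation.
    video_features = rng.normal(size=(n_windows, 64))
    physio_features = rng.normal(size=(n_windows, 8))
    pain_intensity = rng.uniform(0, 10, size=n_windows)  # continuous ratings (assumed 0-10 scale)

    # Feature-level (early) fusion: concatenate modalities, then regress intensity.
    X = np.hstack([video_features, physio_features])
    model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=1.0))
    model.fit(X[:150], pain_intensity[:150])

    predictions = model.predict(X[150:])
    print(predictions[:5])

Deep-learned and hand-crafted features can be combined in the same fused representation, as explored in the adult-data pilot study [4].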

Furthermore, a focus group study will be conducted with neonatal ICU nurses and doctors. Its purpose will be to (i) gain a better understanding of current clinical pain assessment practice and identify areas for possible improvement, and (ii) investigate the disposition of these stakeholders towards using an automatic pain assessment tool.

Research impact

A pilot study on an adult pain database has been completed, achieving state-of-the-art results on a continuous pain scale [4]. The output of this research could lead to improved pain treatment for infants. Beyond hospital use, the tool could be used at home by parents caring for outpatient infants. It also creates opportunities for new research, e.g. measuring the impact of different medications on infants. Finally, findings from the clinical field study will provide valuable knowledge to inform the future design of automatic infant pain assessment tools.

References

  1. N. Wellington and M. J. Rieder, "Attitudes and practices regarding analgesia for newborn circumcision," Pediatrics, vol. 92, pp. 541-543, 1993.
  2. C. C. Johnston and B. J. Stevens, "Experience in a neonatal intensive care unit affects pain response," Pediatrics, vol. 98, pp. 925-930, Nov 1996.
  3. M. Valstar, B. Schuller, K. Smith, F. Eyben, B. Jiang, S. Bilakhia, et al., "AVEC 2013: the continuous audio/visual emotion and depression recognition challenge," in Proceedings of the 3rd ACM International Workshop on Audio/Visual Emotion Challenge, pp. 3-10, 2013.
  4. J. Egede, M. Valstar, and B. Martinez, "Fusing Deep Learned and Hand-Crafted Features of Appearance, Shape, and Dynamics for Automatic Pain Estimation," in Proceedings of the 12th IEEE Conference on Automatic Face and Gesture Recognition, Washington DC, USA, 2017.

Publications

  1. Egede, J., Valstar, M., Martinez, B. (2017) "Fusing Deep Learned and Hand-Crafted Features of Appearance, Shape, and Dynamics for Automatic Pain Estimation". 12th IEEE Conference on Automatic Face and Gesture Recognition, Washington DC, USA.

This work was carried out at the International Doctoral Innovation Centre (IDIC). The authors acknowledge the financial support from Ningbo Education Bureau, Ningbo Science and Technology Bureau, China's MOST, and the University of Nottingham. The work is also partially supported by EPSRC grant no EP/G037574/1.