ERK'2022, Portorož, 83-86

Concept Development for a Sensor-Based Virtual Intensive Care Unit

Jugoslav Achkoski 1, Boban Temelkovski 1, Andrej Košir 2, Marko Meža 2
1 Military Academy "General Mihailo Apostolski" - Skopje, associate member of "Goce Delchev" University - Shtip, Republic of Macedonia
2 Fakulteta za elektrotehniko, Univerza v Ljubljani
E-mail: jugoslav.ackoski@ugd.edu.mk

Abstract. The study describes the concept development of a sensor-based virtual intensive care unit (ICU) that collects a variety of psychophysiological signals from different sensors. Using the implemented architecture, an experiment was conducted with three participants. The experiment consisted of three phases and one task: subtracting numbers starting from 100 in steps of 7. The heart rate (HR), respiratory rate (RR), and arousal data from the experiment were processed, and the resulting dataset was validated.

1 Introduction

Nowadays, the use of sensors has become indispensable in the daily practice of clinicians. Sensors are integrated into systems (hardware and software) to support appropriate decisions regarding patients [1,2]. Intensive care units (ICUs) in hospitals are the base for monitoring the condition of patients. The use of modern technology in the ICU helps to reduce the mortality rate of monitored patients and the duration of hospital treatment. In addition, we have found that the development of technology leads to a reduction in ICU costs in terms of transportation and personnel. In our review of sources available online, we found that 300 hospitals across more than 34 US states have established virtual ICUs. This number is expected to grow in rural areas in the future, as it is currently limited by the availability of high-speed Internet connections. It has also been found that half a million patients admitted to the ICU die each year in the United States.
Further analysis found that this is potentially preventable: one in 10 deaths could be avoided if intensivists (doctors and nurses) provided adequate medical care. This concept refers to a more detailed determination of patient status combined with better bedside monitoring. The technological approach of this concept helps to strengthen medical capacity in such situations, as critical care nurses and virtual intensivists can care for more patients simultaneously [3-5]. In emergency medicine, on the other hand, virtual ICUs are equipped with cameras, sensors collecting physiological data, etc.; the data are visualised by computer and alarms are triggered automatically. Virtual ICUs offer a higher level of patient care than traditional ICUs. The benefits of virtual ICUs (eICUs) are well known in the nursing world, but current technological developments allow inpatient ICUs to operate at a higher level [10,16,17,18,19]. Therefore, the implementation of a social signal system in the ICU, together with a new multisensory platform capturing physiological parameters, is a possible approach to achieving this goal. Despite the importance of social cues in everyday situations and recent advances in machine analysis of relevant behavioural cues such as blinking, smiling, crossed arms, laughing, etc., the design and development of automated social signal processing (SSP) systems is not straightforward. In addition to monitoring patient status in the eICU, it is also possible to monitor the ICU staff themselves (in terms of cognitive load), in contrast to the traditional ICU. More precisely, sensor technology for detecting physiological and psychophysiological parameters is becoming increasingly sophisticated. At the same time, the ICU as a system can be improved with various technologies, such as a social signal detection system [11,12]. Considering the above advantages of this technology, it is possible to develop a concept for sensor-based eICUs.
It is important to point out that the development of an advanced sensor-based virtual intensive care unit (eICU), using advanced multisensory platforms to measure physiological and psychophysiological data, will help to monitor subtle changes that may escape the attention of the intensivist. Another important outcome of this research is the implementation of a social signal system in a quasi-ICU based on several different sensors. The system assesses the patient's status through different measurements: internal (measurement of physiological parameters) and external (measurement of psychophysiological signals) [13]. The research covers virtual intensive care units, software development for physiological data storage, development of algorithms for appropriate decision-making regarding patients, and implementation of a system for social signal acquisition in quasi-medical virtual intensive care units. In addition, although many articles describe virtual ICUs, the concept of integrating various multiplatform sensors, especially for social signal and biosignal sensing, has not been sufficiently explored [14,15]. We believe that this is a new dimension for intensivists, where the integration of different systems should be the next step in the ICU.

2 System Architecture for Collecting Psychophysiological Data in the ICU

This section describes data acquisition from various multisensory platforms that capture biosignals in the form of ECG, heart rate (HR), and respiratory rate (RR). It also describes the new approach of implementing a multisensory platform to capture behavioural signals in the ICU. The multisensory platform consists of sensors capturing behavioural data on gaze, eye position, pupil dilation, and facial expression. In addition, a galvanic skin response (GSR) sensor was added as a separate sensor.
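The streams of the multisensory platform can be summarized in a small configuration sketch. This is illustrative only: the stream and signal names are invented here, and the rates follow the approximate figures given for each sensor below (the Zephyr rate is an assumption not stated in the paper).

```python
# Illustrative registry of the platform's sensor streams; names are assumed,
# rates are the approximate values reported for each device in this section.
SENSOR_STREAMS = {
    "tobii_x2_30": {
        "signals": ["eye_position", "gaze_position", "pupil_diameter"],
        "rate_hz": 30.0,   # ~30 Hz during the experiment
    },
    "noldus_facereader": {
        "signals": ["valence", "arousal"],
        "rate_hz": 6.0,    # ~6 Hz during the experiment
    },
    "gsr": {
        "signals": ["skin_conductance"],
        "rate_hz": 2.0,    # "a few samples per second"
    },
    "zephyr_bioharness3": {
        "signals": ["heart_rate", "respiratory_rate", "peak_acceleration"],
        "rate_hz": 1.0,    # assumed; not stated in the paper
    },
}

def all_signals():
    """Flat list of every signal the platform records."""
    return [s for cfg in SENSOR_STREAMS.values() for s in cfg["signals"]]
```

A registry like this makes it easy to iterate over all streams when merging the time-stamped logs later.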
Connecting the sensors and video cameras to a single computer simplified time synchronization, because all recorded time stamps from the different sensors used the same internal clock.

1) Eye tracking: the Tobii X2-30 eye tracker is used to record gaze position, eye position, and pupil dilation. It is attached to the bottom of the monitor. The Tobii eye tracker uses an infrared light source to create reflection patterns on the corneas of the eyes. The reflection patterns are detected and processed by the eye tracker's image sensors to determine the 3D position of each eyeball, the direction of gaze, and the pupil dilation. The signals are tracked separately for each eye. The sampling frequency during the experiment was approximately 30 Hz. Each data sample consisted of a time stamp (later used for synchronization), a 3D vector of eye position, a 3D vector of gaze position, and a scalar pupil diameter. Some additional signals were computed from the measurements acquired by the Tobii eye tracker.

2) Capturing facial emotion expressions: Noldus FaceReader 6.1 is used for facial emotion detection and behavioural signal detection (the affective state based on the valence and arousal dimensions). FaceReader detects facial emotional expressions by analyzing 20 commonly used action units (AUs). FaceReader's camera is located on top of the monitor. The sampling frequency of FaceReader was about 6 Hz during the experiment. Each data sample is time-stamped and contains the values for valence and arousal.

3) Galvanic skin response: the GSR was measured using a specially designed device that records the electrodermal activity of the skin at a rate of a few samples per second. Skin conductance depends on sweat-gland activity, which is not under conscious control but is modulated by sympathetic activity, which correlates closely with cognitive and emotional states [3].
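The eye-tracker sample described in 1) above (time stamp, 3D eye position, 3D gaze position, scalar pupil diameter) can be sketched as a simple record type. The field names are illustrative assumptions, not the actual Tobii SDK data model:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class GazeSample:
    """One eye-tracker sample (~30 Hz), per eye; field names are illustrative."""
    timestamp_ms: int                           # later used for synchronization
    eye_position: Tuple[float, float, float]    # 3D position of the eyeball
    gaze_position: Tuple[float, float, float]   # 3D gaze point
    pupil_diameter_mm: float                    # scalar pupil size

def pupil_delta(prev: GazeSample, curr: GazeSample) -> float:
    """Example of an additional signal derived from the raw measurements:
    change in pupil diameter between two consecutive samples."""
    return curr.pupil_diameter_mm - prev.pupil_diameter_mm

# Two hypothetical consecutive samples, ~33 ms apart at 30 Hz
s1 = GazeSample(1000, (0.1, 0.2, 0.6), (0.5, 0.5, 0.0), 3.1)
s2 = GazeSample(1033, (0.1, 0.2, 0.6), (0.5, 0.6, 0.0), 3.4)
```

The shared `timestamp_ms` field is what allows samples from this stream to be aligned with the FaceReader and GSR logs.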
The Zephyr BioHarness™ 3 can be used in a variety of activities (sports, military, everyday activities). The BioHarness™ 3 collects and transmits comprehensive physiological data of the wearer via mobile and fixed data networks, enabling true remote monitoring of human performance and condition in the real world. The following physiological data can be measured: heart rate, R-R interval, respiratory rate, ECG, posture, activity level, and peak acceleration. A Zephyr API for the Android operating system is provided by the manufacturer for physiological data acquisition. The communication protocol for physiological data transmission is Bluetooth 2.0, which supports 3 Mbit/s data transmission. For the described laboratory setup, the Zephyr demo app was extended with an additional function for saving the data in a CSV file. Physiological data were collected in the following order: heart rate (HR), respiratory rate (RR), and peak acceleration.

3 Experimental Study

In this section, the experimental design, performed in a simulated intensive care unit, and the procedures for analysing the collected data are explained. The concept of the experiment was to create a suitable environment for continuous data acquisition by the previously mentioned sensors, integrated in the same way as the sensors in an ICU. Therefore, a simulated ICU was created (Figure 3). In the simulated ICU, data were collected from the following sensors: Zephyr BioHarness 3, Noldus FaceReader 6.1, the galvanic skin response sensor, and the Tobii eye tracker. Three participants equipped with sensors took part in the experiment. The experiment was conducted in three phases. In phase I, the participant sits for 2 minutes and does nothing. In phase II, the participant sits and is given a mental task: subtracting numbers starting at 100 and decreasing by 7.

Figure 2. Graphical user interface of the multisensory platform for measuring psychophysiological signals.

Figure 1.
System architecture for collecting bio- and behavioural signals from different multisensory platforms.

During phase II of the experiment, the participant is not timed. In phase III, the participant does nothing; the activity is identical to that of phase I. The start and end of each phase are determined via the Noldus camera using a white sheet of paper: one side, shown when phase II begins, reads "COST ENJECT START", and the other side, shown when phase II ends, reads "COST ENJECT STOP". The experiment is recorded; a video file is created for each participant, and a log file is saved in Notepad with the timestamp in milliseconds, the captured image, and the values of the psychophysiological parameters. The recorded video is used to determine the beginning and end of phase II of the experiment, and the saved log determines the exact timestamp. The motivation for conducting the experiment in this way is the possibility of studying psychophysiological signals in terms of changes under stimuli. The experimental design includes the stimuli within the subtraction activity. In addition, the signals obtained show the changes in activity between the phases. Moreover, the experiment was performed twice per participant to investigate possible changes and to make comparisons. The collected data were stored in different formats (JSON, CSV and TXT). Log files generated by the Tobii eye tracking camera and the galvanic skin response sensor were saved in JSON format. Data from the Zephyr BioHarness 3 sensor were stored in CSV format for each participant. The application sent the data to the data collection center via e-mail. The clocks of the Android tablet and the Windows laptop were synchronized, which is reflected in the CSV file. A delay of 4 s, caused by the clock of the Android tablet, was detected.
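Compensating for such a clock offset can be sketched as a timestamp shift applied to the Zephyr CSV rows before merging them with the other logs. The column layout below (timestamp in milliseconds, then HR, RR, and peak acceleration) is an assumption based on the collection order described earlier, not the demo app's actual file format:

```python
import csv
import io

ANDROID_CLOCK_LAG_MS = 4000  # 4 s lag detected on the Android tablet clock

def load_and_synchronize(csv_text, lag_ms=ANDROID_CLOCK_LAG_MS):
    """Read Zephyr rows and shift their timestamps forward by the detected
    lag so they align with the Windows laptop clock of the other sensors."""
    rows = []
    for rec in csv.DictReader(io.StringIO(csv_text)):
        rows.append({
            "timestamp_ms": int(rec["timestamp_ms"]) + lag_ms,
            "hr": int(rec["hr"]),
            "rr": int(rec["rr"]),
            "peak_accel": float(rec["peak_accel"]),
        })
    return rows

# Hypothetical two-sample file as the extended demo app might have written it
sample_csv = "timestamp_ms,hr,rr,peak_accel\n0,72,16,0.12\n1000,74,17,0.15\n"
aligned = load_and_synchronize(sample_csv)
```

After this shift, a Zephyr sample and a FaceReader sample with equal timestamps refer to the same moment, so the streams can be compared directly.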
The delay of 4 s was added to the dataset received from the Zephyr BioHarness 3 to achieve synchronization of the data.

4 Data Analysis

The collected psychophysiological data were analysed, focusing on the following parameters: arousal, heart rate, and respiratory rate. We calculated the correlation coefficient (R). The correlation result was a critical point in the research, as it indicates the direction of its further development. More precisely, the obtained correlations show the degree of the relationship between the variables (HR, RR, and arousal), as well as positive and negative changes, in the form of increasing and decreasing values, in the psychophysiological variables. The value of R between HR and arousal is -0.2. This is a negative correlation, and the relationship between these variables is weak: changes in heart rate are not linearly followed by arousal (Figure 5). A negative correlation means that one variable increases while the other decreases. The value of R between RR and arousal is also -0.2, again a weak negative correlation: changes in RR are not linearly followed by arousal (Figure 6). The value of R between HR and RR is 0.7, a strong positive correlation: changes in respiratory rate are linearly followed by heart rate (Figure 7).

5 Conclusion

In conclusion, the obtained results on the correlation of psychophysiological parameters are not sufficient to be considered meaningful evidence for establishing the concept of a sensor-based virtual ICU. More specifically, the results show that the relationship between psychological and physiological parameters for bedside monitoring of patients is weakly correlated, and changes in one parameter are not linearly associated with changes in another.

Figure 5.
Scatter plot of HR vs arousal.

Figure 4. Graphical presentation of the dataset obtained in the experiment.

Figure 3. Experimental measurement setup, showing two study participants.

In our opinion, one of the reasons for the weak correlation between the parameters may be the stimuli specified in the experiment. Another reason could be the environment, because the experiment was conducted in a simulated intensive care unit. Moreover, the participants in the experiment were not real patients with health problems.

Acknowledgements

This work was sponsored in part by ARRS under the scientific program P2-0246 and COST action TD1405 - European Network for the Joint Evaluation of Connected Health Technologies (ENJECT).

References

[1] Baggs, J. G., Ryan, S. A., Phelps, C. E., Richeson, J. F., & Johnson, J. E. (1992). The association between interdisciplinary collaboration and patient outcomes in a medical intensive care unit. Heart & Lung: The Journal of Critical Care, 21(1), 18-24.
[2] Myers, M. A., & Reed, K. D. (2008). The virtual ICU (vICU): a new dimension for critical care nursing practice. Critical Care Nursing Clinics of North America, 20(4), 435-439.
[3] Critchley, H. D. (2002). Electrodermal responses: What happens in the brain. Neuroscientist, 8, 132-142.
[3] Kowitlawakul, Y. (2011). The technology acceptance model: predicting nurses' intention to use telemedicine technology (eICU). CIN: Computers, Informatics, Nursing, 29(7), 411-418.
[4] Lilly, C. M., Zubrow, M. T., Kempner, K. M., Reynolds, H. N., Subramanian, S., Eriksson, E. A., ... & Cowboy, E. R. (2014). Critical care telemedicine: evolution and state of the art. Critical Care Medicine, 42(11), 2429-2436.
[5] Williams, L. M., Hubbard, K. E., Daye, O., & Barden, C. (2012). Telenursing in the intensive care unit: transforming nursing practice. Critical Care Nurse, 32(6), 62-69.
[6] Abkai, C., & Hesser, J. (2009).
Virtual intensive care unit (ICU): real-time simulation environment applying hybrid approach using dynamic Bayesian networks and ODEs. Stud. Health Technol. Inform., 142, 1-6.
[7] Landrigan, C. P., Rothschild, J. M., Cronin, J. W., Kaushal, R., Burdick, E., Katz, J. T., ... & Czeisler, C. A. (2004). Effect of reducing interns' work hours on serious medical errors in intensive care units. New England Journal of Medicine, 351(18), 1838-1848.
[8] Stone, P. W., Larson, E. L., Mooney-Kane, C., Smolowitz, J., Lin, S. X., & Dick, A. W. (2006). Organizational climate and intensive care unit nurses' intention to leave. Critical Care Medicine, 34(7), 1907-1912.
[9] Bratt, M. M., Broome, M., Kelber, S. T., & Lostocco, L. (2000). Influence of stress and nursing leadership on job satisfaction of pediatric intensive care unit nurses. American Journal of Critical Care.
[10] Meža, M., Košir, J., Strle, G., & Košir, A. (2017). Towards automatic real-time estimation of observed learner's attention using psychophysiological and affective signals: the touch-typing study case. IEEE Access.
[11] Siebig, S., Kuhls, S., Imhoff, M., Gather, U., Schölmerich, J., & Wrede, C. E. (2010). Intensive care unit alarms: How many do we need? Critical Care Medicine, 38(2), 451-456.
[12] Burykin, A., Peck, T., Krejci, V., Vannucci, A., Kangrga, I., & Buchman, T. G. (2011). Toward optimal display of physiologic status in critical care: I. Recreating bedside displays from archived physiologic data. Journal of Critical Care, 26(1), 105-e1.
[13] Burykin, A., Peck, T., & Buchman, T. G. (2011). Using "off-the-shelf" tools for terabyte-scale waveform recording in intensive care: Computer system design, database description and lessons learned. Computer Methods and Programs in Biomedicine, 103(3), 151-160.
[14] Siachalou, E. J., Kitsas, I. K., Panoulas, K. J., Zadelis, E. T., Saragiotis, C. D., Tolias, Y. A., ... & Panas, S. M. (2005).
ICASP: an intensive-care acquisition and signal processing integrated framework. Journal of Medical Systems, 29(6), 633-646.
[15] Sharma, R., Goel, D., Srivastav, M., & Dhasmana, R. (2017). Psycho-physiological parameters of nurses in critical and non-critical units. International Journal of Nursing Science, 7(5), 107-110.
[16] C. Conati and C. Merten, "Eye-tracking for user modeling in exploratory learning environments: An empirical evaluation," Knowledge-Based Systems, vol. 20, no. 6, pp. 557-574, 2007.
[17] J. Allanson and S. H. Fairclough, "A research agenda for physiological computing," Interacting with Computers, vol. 16, no. 5, pp. 857-878, 2004.
[18] S. H. Fairclough, "Fundamentals of physiological computing," Interacting with Computers, vol. 21, no. 1-2, pp. 133-145, 2009.
[19] A. Vinciarelli and F. Valente, "Social Signal Processing: Understanding Nonverbal Communication in Social Interactions," in Proceedings of Measuring Behavior, 2010, pp. 118-121.

Figure 7. Scatter plot of RR vs HR.

Figure 6. Scatter plot of RR vs arousal.