1.2.1 Considering human-machine interaction
Practical guidance - healthcare
Author: SAM demonstrator project
Artificial intelligence (AI) and machine learning (ML) applications in healthcare are often evaluated on narrowly defined tasks. However, the real challenges for the adoption of AI and ML will arise when algorithms are integrated into clinical systems to deliver a service in collaboration with clinicians and other technologies. It is at this clinical system level, where teams of healthcare professionals and AI systems cooperate and collaborate to provide a service, that human factors challenges will come to the fore.
When automation started to be deployed at scale in industrial systems, human factors research on “automation surprises” and the “ironies of automation” explained some of the problems that appeared with its introduction. The fundamental fallacy is the assumption that automation simply replaces people; in reality, the use of automation changes and transforms what people do. Clinical systems are not necessarily comparable to commercial aircraft or autonomous vehicles. However, looking across these different industries can be useful to highlight potential human factors challenges that are likely to require consideration when adopting AI and ML in patient care. Such human factors challenges relate to cognitive aspects (automation bias and human performance), handover and communication between clinicians and AI systems, situation awareness, and the impact on the interaction with patients.
The table below illustrates human factors issues that might require consideration, using the example of an autonomous infusion pump designed for deployment in the intensive care setting.
Human factors challenge | Description | Example |
---|---|---|
Handover | The autonomous system needs to be able to recognise its own performance boundaries, project future clinical scenarios that will lie beyond those boundaries, and identify suitable ways to hand over control to the clinician. Handover includes consideration of: (a) when to hand over; (b) whom to hand over to; (c) what to hand over; and (d) how to hand over. | The patient’s blood sugar levels do not respond sufficiently to the insulin given by the autonomous infusion pump. The pump predicts and recognises that it will not be able to control the patient’s blood sugar. The pump triggers an alert on the electronic health record, raises an audible alarm, and requests that the nurse take over. The nurse can review the reason for the alert, the history of the pump’s insulin management, and its projection into the future, and act accordingly. |
Performance variability | Clinicians need to manage competing organisational priorities and operational demands. They use their experience and judgement to make trade-offs based on the requirements of a specific situation. The autonomous system needs to support rather than constrain this performance variability and adaptive capacity. | The nurse realises that insulin has not yet been prescribed for the patient even though they will likely need it. The nurse finds the doctor, explains the situation, and the doctor issues a verbal medication order and will follow this up with the written prescription later (performance variability). The autonomous system requires an electronic medication order, but allows for a manual override. The autonomous system sends reminders to the doctor with a request for completing the electronic medication order. |
Automation bias | When a system works well most of the time, clinicians start to rely on it. In some situations, this can lead to overreliance, for example when the system takes an inappropriate action but the clinician does not recognise this because they trust the system. | Due to sepsis, the patient requires tighter control of blood sugar levels than usual. The autonomous system has successfully managed septic patients before but, in this instance, fails to recognise the need for tighter glycaemic control. The autonomous system provides the clinician with an interpretable justification and explanation of its decisions, and the clinician, who has received training on potentially inappropriate behaviours of the autonomous system, is able to spot the discrepancy and act accordingly. |
Supervision | Clinicians are both users and supervisors of the autonomous system. They need to understand not only how to operate the autonomous system (e.g. loading a syringe), but also how to recognise potential failure modes, deviations from appropriate behaviour, or changes in the environment that might move the autonomous system outside of its design envelope. | The autonomous infusion pump is operating on a sliding scale algorithm for administering insulin. It classifies the patient’s response to the current insulin infusion as requiring transition to another scale with 70% probability, as opposed to 30% for staying on the current scale. The autonomous system initiates the transition and activates an “uncertainty marker” to alert the clinician (a minimal sketch of this kind of logic follows the table). |
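To make the supervision and handover behaviours above more concrete, the following sketch shows, in Python, one way such decision logic might be structured. It is a minimal, hypothetical illustration only: the scale names, probability threshold, and confidence values are assumptions made for the purpose of the example and do not describe any particular device or the SAM demonstrator itself.

```python
# Hypothetical sketch of the supervision/handover logic described in the table.
# Thresholds, scale names, and alert channels are illustrative assumptions only.

from dataclasses import dataclass, field
from typing import List


@dataclass
class PumpDecision:
    scale: str                     # sliding scale the pump intends to use
    confidence: float              # pump's estimated probability that this choice is appropriate
    uncertainty_marker: bool       # flag shown to the clinician when confidence is limited
    handover_requested: bool       # True when control should pass to the clinician
    rationale: List[str] = field(default_factory=list)  # interpretable justification


def decide(current_scale: str,
           glucose_history_mmol_l: List[float],
           p_transition: float,
           predicted_in_range: bool) -> PumpDecision:
    """Choose the next action for the autonomous infusion pump.

    p_transition: pump's estimated probability that a different sliding scale
        is needed (e.g. 0.7 in the supervision example above).
    predicted_in_range: pump's projection of whether it can keep blood glucose
        within the target range on any available scale.
    """
    rationale = [f"Recent glucose readings (mmol/L): {glucose_history_mmol_l[-3:]}"]

    # Handover: the pump projects that the scenario lies beyond its performance
    # boundary, so it alerts the clinician and asks them to take over.
    if not predicted_in_range:
        rationale.append("Projected glucose remains out of range on all scales; requesting clinician takeover.")
        return PumpDecision(scale=current_scale, confidence=0.0,
                            uncertainty_marker=True, handover_requested=True,
                            rationale=rationale)

    # Supervision: transition between scales when the estimated probability favours
    # a change, and raise an uncertainty marker so the clinician can review.
    if p_transition > 0.5:
        rationale.append(f"Transition favoured with probability {p_transition:.0%}; moving to the next scale.")
        return PumpDecision(scale="next_scale",  # placeholder name for the target scale
                            confidence=p_transition,
                            uncertainty_marker=True, handover_requested=False,
                            rationale=rationale)

    rationale.append(f"Staying on {current_scale} (transition probability {p_transition:.0%}).")
    return PumpDecision(scale=current_scale, confidence=1.0 - p_transition,
                        uncertainty_marker=False, handover_requested=False,
                        rationale=rationale)


# Example corresponding to the supervision row: 70% in favour of a transition.
decision = decide("scale_A", [9.8, 10.4, 11.1], p_transition=0.7, predicted_in_range=True)
print(decision.scale, decision.uncertainty_marker, decision.rationale)
```

In practice, the transition probability and in-range projection would come from the pump’s underlying model, and the uncertainty marker, rationale, and handover request would be surfaced to the clinician through the electronic health record and audible alarms, as in the handover example above.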