Consent and AI Medical Scribes - Is It Required?

I’d like to start a discussion around whether explicit patient consent is necessary before using an AI medical scribe in a clinical setting. I have heard a variety of different answers to this question, often reflecting confusion between consent to receive healthcare and consent to the processing of personal data, so I wanted to share my thoughts.

Consent to processing of personal data - GDPR Considerations

Firstly, considering personal data processing: does a patient need to consent for their data to be processed? Under the General Data Protection Regulation (GDPR), explicit patient consent is not usually required for processing health data if:

  • there is a valid lawful basis (Article 6); and
  • a special category condition is also met (Article 9) [Ref 1].

For healthcare organisations processing patient data for the purpose of delivering healthcare, these are typically:

  • Article 6(1)(e) – processing is necessary for the performance of a task carried out in the public interest; together with
  • Article 9(2)(h) – processing is necessary for the provision of health or social care, provided appropriate safeguards are in place.

In a hospital or GP practice setting, the healthcare organisation acts as the Data Controller. If the organisation deploys an AI medical scribe (a Data Processor) under appropriate contracts and Data Processing Agreements, it can rely on these lawful bases rather than seeking explicit consent from each patient.

However, in addition to lawfulness, other GDPR principles still apply, including transparency. Patients should be informed:

  • how their data will be processed;
  • by whom;
  • for what purpose; and
  • how it will be protected.

This can be communicated in a number of ways, for example verbally during consultations, or via privacy notices, leaflets or posters. Additionally, because special category data is involved, a Data Protection Impact Assessment (DPIA) is required. This should be completed by the healthcare organisation (Data Controller) with the assistance of the AI scribe manufacturer (Data Processor).

Consent to receive healthcare

When patients present voluntarily for clinical care and interact with a clinician, there is implied consent to engage with the healthcare process. In most cases, this extends to the use of standard tools and technologies, such as an EHR or AI scribe, that help deliver care efficiently.
GMC Professional Standards do mention the need for consent when storing recordings [Ref 2]. However, if an AI scribe merely transcribes audio in near real time, and does not store or retain the audio recordings, this requirement may not apply directly. Nonetheless, informing the patient remains best practice.

Opt-Outs

While explicit consent might not be a legal requirement, some clinicians or organisations choose to offer an opt-out as a courtesy. This respects patient autonomy: if someone objects to the use of an AI scribe, the clinician can address their concerns and either reassure them about the privacy and security measures in place, or avoid using the tool for that particular consultation.

Patient trust in AI

Given some portrayals of AI in the media, it may take time for patients to trust the use of AI in clinical settings. This should be taken into account when informing patients about the use of AI in clinical practice. Building that trust is, in part, the responsibility of government, healthcare organisations and clinicians, who should inform and educate patients on the safe use of AI, reassure them, and continue to maintain a trusted clinician-patient relationship.

Other requirements

Beyond compliance with data protection and security requirements, manufacturers of AI ambient scribes will of course need to meet additional requirements, including an appropriate clinical risk assessment in line with DCB0129. I will cover Safety and AI scribes in a future article. In addition, given their clinical use and the nature of summarising, AI scribes might in future be regulated as medical devices, although this is a topic of ongoing debate. I will cover Regulation and AI scribes in another future article.

Conclusion

In summary, explicit consent is not strictly required to use an AI medical scribe in most clinical contexts, provided the proper lawful basis, transparency, and data protection measures are in place. Patients should be clearly informed about how and why their data is processed. If a patient objects, the clinician can decide on the best way forward to maintain a positive clinician–patient relationship.

What do you think? Have I missed anything important? I’d be interested in hearing any additional considerations or experiences with AI scribes in clinical practice.


As always, we can help with any issues both digital health suppliers or healthcare organisations might face around the security, safety and regulation of digital health products, including AI. Reach out to set up a call or to learn more!

References
Ref 1: ICO website - Special category data
Ref 2: GMC Professional Standards

Disclaimer: This post provides a general overview and should not be taken as legal advice. Always consult your local guidelines, regulatory bodies, and legal counsel for specific requirements.