The development and use of artificial intelligence in health, aged care and biotechnology is creating opportunities and benefits for health care providers and consumers. AI is already being used in medical fields such as diagnostics, e-health and evidence-based medicine, although there appears to be some way to go with the reliability of AI interrogation of free-text data fields in medical records; deciphering doctors' hand-written notes remains an issue.
Hall & Wilcox has been working with international and Australian clients at the cutting edge of innovation who are using AI to detect falls in hospitals and residential aged care facilities, to determine pain using facial recognition software, and in predictive medicine.
A number of legal, regulatory, ethical and social issues have arisen with the use of AI in the health care sector. The question is: can the law keep up with the pace?
A number of working groups have been established to discuss ethical issues concerning the use of AI in healthcare.
In 2017, the World Health Organisation and its Collaborating Centre at the University of Miami organised an international consultation on the subject. A theme issue of the WHO Bulletin devoted to big data, machine learning and AI will be published in 2020.
The European Group on Ethics in Science and New Technologies published its ‘Statement on Artificial Intelligence, Robotics and Autonomous Systems’ in March 2018.
Whilst Australia is not a member of the EU, its therapeutic goods regulation is heavily aligned with that of the EU, and less so with that of the USA.
The Statement proposed a set of basic principles and democratic prerequisites, based on the fundamental values laid down in the EU Treaties and the EU Charter of Fundamental Rights. These principles, together with our commentary, are set out below.
Commentary: Should we be transparent in telling people that they are interfacing with AI?
Commentary: What should we delegate to machines? Surely, the best care is the human touch and people should come first?
Commentary: This is consistent with the principle that we should do no harm.
Commentary: It is important to ensure equity of access, so that the benefits of AI are not provided only to those countries or people who can pay for the technology.
Commentary: The use of AI should be done in accordance with community expectations and standards.
Commentary: There should be adequate compensation for negligence.
Commentary: The use of AI in health care should be appropriately regulated to ensure that it is safe.
Commentary: The protection of privacy and personal data is important.
In Australia, the Therapeutic Goods Act 1989 (Cth) defines ‘therapeutic goods’ and ‘medical devices’ very broadly, particularly if therapeutic claims are made.
Section 41BD of the Act defines ‘medical device’ as:
‘(a) any instrument, apparatus, appliance, material or other article (whether used alone or in combination, and including the software necessary for its proper application) intended, by the person under whose name it is or is to be supplied, to be used for human beings for the purpose of one or more of the following:

…

and that does not achieve its principal intended action in or on the human body by pharmacological, immunological or metabolic means, but that may be assisted in its function by such means.’
This includes software and mobile apps that meet the definition of ‘medical devices’.
Mobile apps which are simply sources of information or tools to manage a healthy lifestyle are not medical devices.
Software as a Medical Device (SaMD) is regulated on the basis of risk. SaMD must be included on the Australian Register of Therapeutic Goods before it is supplied in Australia, unless an exemption applies (such as for a clinical trial).
One of the main regulatory hurdles with registration of AI is that it is fluid and constantly changing, whereas the TGA’s review of medical devices is currently based on a pre-market product at a fixed point in time. The traditional framework of medical device regulation is not designed for adaptive artificial intelligence and machine learning techniques.
On 2 April 2019, the US FDA published a discussion paper, ‘Proposed Regulatory Framework for Modifications to Artificial Intelligence/Machine Learning (AI/ML)-Based Software as a Medical Device (SaMD) – Discussion Paper and Request for Feedback’. In this framework, the FDA introduces a ‘predetermined change control plan’ in pre-market submissions. The FDA expects from manufacturers a commitment to transparency and real-world performance monitoring for artificial intelligence and machine learning-based software as a medical device, as well as periodic updates on what changes were implemented as part of the approved pre-specifications and the algorithm change protocol.
It will be interesting to see how the law of negligence and duty of care will adapt to this new technology. If a patient suffers an injury arising out of the use of AI, who will be liable? The treating clinician who relied upon the SaMD? The developer of the algorithm? The programmer of the software? Proving causation may be difficult when there is machine learning in a multi-layered, fluid environment in which the machine itself influences the output.
Only time will tell, and we are in for interesting times.
With over 25 years of corporate, commercial and regulatory experience, Alison Choy Flannigan has specialised in advising clients in the health, aged care, disability, life sciences and community sectors. Alison leads the firm’s Health & Community industry group. She also provides ongoing support for various industry associations and holds a number of positions within the industry. She is the Company Secretary for the National Foundation for Medical Research and Innovation and the Asia Pacific Regional Forum Liaison Officer of the Healthcare and Life Sciences Law Committee of the International Bar Association. Alison is also on the Australia China Business Council (NSW) Health & Ageing Subcommittee. Alison was previously General Counsel for Ramsay Health Care Limited and was awarded the ACHSM President’s Award for her contribution to and support of the Australian College of Health Service Management. She was formerly Company Secretary of Research Australia and served on the risk committee of St Vincent’s Hospital Sydney, as well as on the Institutional Ethics Committees of Northern Sydney Local Health District and South Eastern Sydney Local Health District. Alison is a market leader, having been listed in The Best Lawyers in Australia (and the Australian Financial Review) for Health & Aged Care and Biotechnology since 2008. She has been recognised in the Doyle’s Guide to the Australian Legal Profession as a Leading Health and Aged Care Lawyer in 2017, 2018 and 2019. Alison has been a finalist for the Lawyers Weekly Partner of the Year in Health every year since 2016 and won this prestigious award in 2019. Connect with Alison via email