Doc bots: AI tools have entered GP clinics

The growing use of new AI tools within the medical field has some doctors embracing the new technology and others treating it with caution.  

Doctors use personal devices to access AI tools. Photo: Nicolas Thurn-Valsassina

Artificial intelligence tools are now available for general practitioners during medical consultations. 

One of them, which was developed by two Melbourne GPs, has more than 1000 active users. 

The Australian-made AI software can suggest treatments and write prescriptions, potentially giving doctors more time to concentrate on and listen to their patients, rather than typing.

Billed as an “AI intern”, the software produces a medical transcript of a recorded patient consultation within 30 seconds. 

However, the software’s capability to provide health advice based on a patient’s condition has led to new concerns within the medical community. The software’s co-founder, Dr Chris Irwin, said its ability to recommend treatment was due to advanced search engines with predictive text. 

“We can bid for them to only listen to verified medical texts as opposed to the whole of the internet,” Dr Irwin said. 

Despite these assurances, critics believe the lack of testing could put patients at risk. 

Melbourne University’s General Practice and Primary Care expert, Dr Javiera Martinez, advocates for additional testing before AI is permitted to provide medical advice. 

“That information has not been proven correct or effective in a protected environment, such as in a randomised study, or with people who know the methodology to effectively prove that it is safe for the population,” Dr Martinez said. 

“There’s a significant risk of this information being used and the clinical recommendations being inappropriate.” 

Dr Martinez also warns of risks to patients’ privacy and data confidentiality, as the software keeps a record of all transcripts from consultations. 

“We have conducted surveys and systematic reviews with patients regarding their thoughts on these types of clinical support tools, and privacy is always the main concern,” she said. 

“The information that identifies patients is confidential and cannot be shared beyond a clinical purpose.” 

Although Australia has no current AI-specific legislation, the government is aiming for new measures in the future. The Australian Medical Association is calling for more regulation of AI in the healthcare sector, which could create further barriers for software like Dr Irwin’s. 

Dr Irwin argues the Department of Health has been caught off guard by AI’s rapid development, and that fear and dogma are the prime reasons for the pushback within the public health sector. 

“The potential of AI far exceeds the risks,” he said. 

“Ironically, they’re going to kill patients that otherwise could have been saved without more AI integration into healthcare.”  

About Dscribe

Dscribe showcases the work of Deakin University’s journalism students. The opinions contained in Dscribe stories are those of the individual, and not Deakin University. If you believe that any of the material on this website infringes on your rights, click here: COPYRIGHT