Using ChatGPT in Healthcare Settings
ChatGPT has been on the tip of everyone’s tongues. It has featured in the media, written our TV shows and even booked our holidays. But what actually is ChatGPT, and does it have a place in healthcare settings?
This blog will explore ChatGPT and its possible role in healthcare settings.
What is ChatGPT?
ChatGPT stands for Chat Generative Pre-trained Transformer. It is a type of artificial intelligence (AI) language model specifically trained to understand natural language and respond in kind.
ChatGPT is based on a type of AI model called a large language model (LLM). Without getting too technical, LLMs are based on a type of statistical modelling that is particularly powerful in the realm of processing natural language.
For example, I asked ChatGPT a question, or a ‘prompt’ as it is known:
‘What is the treatment for hypertension?’
It will generate an answer similar to:
Medications are often prescribed when lifestyle modifications alone are insufficient to control blood pressure, or if hypertension is severe. The choice of medication depends on individual factors and includes options like diuretics, ACE inhibitors, angiotensin II receptor blockers (ARBs), beta-blockers, calcium channel blockers, and others.
How does ChatGPT work?
In its simplest form, an AI model is trained on a dataset and learns patterns through computational analysis. The answers it provides are based on the data it was trained on. The model used in ChatGPT converts the input text (i.e. the question) into numbers. Calculations are then performed on these numbers using an algorithm shaped by the training data. An algorithm is a series of rules followed by a computer to solve a problem; the algorithms commonly used here are called neural networks. Once the original set of numbers has passed through the algorithm, a final set of numbers is produced. These are converted back into words, i.e. the answer to our original prompt/question. There are many different models and methods for converting words into numbers, but for the purposes of this blog I won’t go into depth.
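To make the text-to-numbers-to-text pipeline above a little more concrete, here is a toy sketch in Python. Everything in it – the six-word vocabulary, the scoring function standing in for the neural network – is invented purely for illustration; a real model uses a learned tokeniser and billions of trained parameters.

```python
import math

# Step 1: a tiny hand-made vocabulary for converting words to numbers.
# (Real models learn a tokeniser covering tens of thousands of word pieces.)
vocab = {"what": 0, "is": 1, "the": 2, "treatment": 3, "for": 4, "hypertension": 5}
inverse_vocab = {i: w for w, i in vocab.items()}

def encode(text):
    """Map each word in the prompt to its numeric ID."""
    return [vocab[w] for w in text.lower().split()]

def softmax(scores):
    """Turn raw scores into probabilities that sum to 1."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def next_word_scores(token_ids):
    """Stand-in for the neural network: one score per vocabulary word.
    A real model computes these scores with millions of learned weights;
    this made-up formula just produces some numbers to feed to softmax."""
    return [float((len(token_ids) + i) % 3) for i in range(len(vocab))]

# Step 2: the prompt becomes numbers...
prompt_ids = encode("What is the treatment for hypertension")
print(prompt_ids)  # [0, 1, 2, 3, 4, 5]

# Step 3: ...the "algorithm" turns them into probabilities over the vocabulary...
probs = softmax(next_word_scores(prompt_ids))

# Step 4: ...and the most likely number is converted back into a word.
predicted_id = max(range(len(probs)), key=probs.__getitem__)
print(inverse_vocab[predicted_id])
```

The real system repeats steps 3 and 4 over and over, feeding each predicted word back in, which is how a full answer like the hypertension response above is built up one word at a time.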
The above is an extremely simplified overview of the inner workings of ChatGPT. The model does not learn from individual conversations in real time; it was trained on a large body of text drawn from the internet, books, articles and so on. As newer versions are trained on a larger and more current body of knowledge, their ability to provide accurate answers should improve.
Is there a use for it in healthcare settings?
As with most open-ended questions, I believe there isn’t a straight answer to this. The day-to-day work of healthcare teams is as diverse as the sector itself. One key factor we have to keep at the forefront of our minds is the database which underpins the answers delivered by ChatGPT.
Revisiting the initial question/prompt I demonstrated above about hypertension, we can see that the answer is very generalised. If we delve a bit deeper, the information is actually drawn from US and WHO guidelines, which are not our main sources in the UK. In addition, our treatment pathways take ethnicity into account, amongst other parameters, when choosing a treatment. I feel this demonstrates the limitations of such a tool in its current form. That is not to say that a more focussed question/prompt might not yield a more focussed answer.
Some questions which come to mind when thinking about the validity of ChatGPT are:
Who has contributed to this data? Does the data have a hierarchy, where some views are represented more than others in the answers it provides?
Without getting too philosophical: as healthcare professionals, it is our duty to listen to and understand our patients from every background – ethnicity, race, gender and so on.
But are all these voices represented when asking ChatGPT more complex questions?
To conclude, when it comes to more complex areas such as consultation practice, I would hesitate to use such a tool in its current form. However, other forms of AI can be used to support our consultations, increasing patient safety and optimising patient outcomes and workflows. In addition, businesses small and large face a large amount of paperwork and administrative tasks. These tend to have clear rules – tick a box here, fill out a template there – and this is where I feel a tool such as ChatGPT could have a place, optimising our time and workflow within the sector.
Dr Yasmin Karsan, MRPharmS, PhD, PGDip, MSc.
Founder of the Digital Clinical Safety agency.
Contact: yasmin@digitalclinicalsafety.com