
AI: the future of physio?

Could ChatGPT replace a physiotherapist? CSP health informatics lead Euan McComiskie explains


AI: friend or foe?

With the increasing prominence of AI in everyday life, questions are being asked as to whether this technology will give rise to a modern ‘industrial revolution’ impacting jobs and the way we work in the future. It clearly has the potential to transform work and society, but to what extent and to what benefit?  Could ChatGPT eventually replace a physiotherapist? Well, we asked it …

‘In summary, AI can augment and enhance the role of physiotherapists, but it cannot replace them entirely. The human touch, critical thinking, and communication skills of physiotherapists are essential components of the care they provide, and these qualities cannot be replicated by AI’.  An insightful reply from something generated by a computer.

What is ChatGPT?

Since its public launch in November 2022, a lot of conversation and column inches have been devoted to ChatGPT (and the wider use of artificial intelligence). ChatGPT is an AI chatbot developed by OpenAI, a commercial San Francisco-based organisation backed by Elon Musk and Microsoft (amongst others). ChatGPT is similar to other AI chatbots (for example Google Bard), but it has seen a rapid swell in users since its launch, gaining four million users in five days and rising to over 100 million after two months of intense media coverage.

How does it work?

ChatGPT and its competitors work by drawing on an enormous dataset to predict the next character or word in a sequence – similar to the predictive text on your mobile phone, but on a vastly larger scale. In previous iterations, this generated clunky, non-human and largely unintelligible responses to even relatively simple commands. Recent technological developments have resulted in responses that are often so ‘real’ that users might genuinely believe they have come from another human. This may lead users to disclose personal information that they should not be sharing with an online data platform.
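The idea of next-word prediction can be illustrated with a toy example. The sketch below is purely illustrative – a simple bigram model built in Python from a handful of made-up sentences – and bears no resemblance to the scale or architecture of ChatGPT itself, which uses a large neural network trained on vast amounts of text.

```python
from collections import Counter, defaultdict

# A tiny corpus standing in for the enormous dataset a real model is trained on
corpus = (
    "ai can support physios . "
    "ai can inform physios . "
    "ai cannot replace physios ."
).split()

# Count which word follows which ("bigram" counts) -- a vastly
# simplified stand-in for what a large language model learns
following = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    following[current_word][next_word] += 1

def predict_next(word: str) -> str:
    """Return the word most often seen after `word` in the corpus."""
    return following[word].most_common(1)[0][0]

print(predict_next("ai"))       # "can" follows "ai" twice, "cannot" once
print(predict_next("physios"))  # always followed by "."
```

A real large language model replaces these raw counts with probabilities computed by a neural network over many preceding words, but the underlying task – guess what comes next – is the same.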

What are the concerns? 

There have been significant concerns about data protection, leading to a temporary ban of ChatGPT in Italy, a libel case in Australia, requests for an AI regulator in America, and a lack of clarity over what data AI chatbots store (or indeed what can be deleted). Caution should be used when sharing information with AI chatbots: expect your ChatGPT posts to be roughly as visible as a public Facebook post. Because of the rapid advancements in AI technology, coupled with little public understanding of the risks of sharing content and limited clarity and governance from developers, many tech leaders have asked for a halt on further iterations of existing products.

Could it be used in physiotherapy?

CSP head of research and development Matt Liston highlights the impact of ChatGPT in research.

‘We would rate the current strength of evidence from AI chatbots to be of the lowest possible quality and recommend that more robust sources of evidence/guidance are used (guidelines, Cochrane reviews, peer-reviewed systematic reviews).

‘There are many examples of AI systems being developed that have the potential to help shape the future of physiotherapy. These may have uses in supporting decision making, interactions with service users, prioritisation of workloads and even personalising interventions. However, although the future is exciting, there are significant technological, ethical, cultural and governance problems that need to be addressed before AI-powered systems can be routinely used in services, and significant robust research is needed to understand their acceptability, utility, and performance in clinical environments.

‘Currently, AI chatbots may have a function in supporting people to understand a new topic, as you can ask questions such as “provide me with a summary of the recent research evidence for the physiotherapy management of low back pain”. The chatbot will provide a seemingly coherent summary of articles, but it will not provide any critical appraisal, nor will it compare and contrast findings to highlight inconsistencies. It will not tell you how it found the articles (search terms), where it looked (scientific databases versus the open web) or whether it weighted all sources equally (so that Lancet articles and Daily Mail articles carry the same importance). More frighteningly, some articles it quotes may not exist: there are several examples of ChatGPT creating fictional articles and references. It is for these reasons that we urge members to use extreme caution when using AI chatbots; any information you draw from them should be scrutinised and subject to your own critical appraisal and clinical reasoning, and all references should be checked.’

CSP data protection co-ordinator Laura Searle says that while ChatGPT has the potential to support practice, this should not include anything involving patient data, such as generating clinic letters or treatment plans.

‘From a data protection point of view ChatGPT, Google Bard, and other AI chatbots, should be approached with caution. When dealing with personal data, and this would include any identifiable patient details, you must comply with data protection legislation, which requires you to ensure appropriate security of the data you are using. The threshold for what is appropriate for medical information will be higher than for some other types of information, and it is unlikely that most chatbots would be deemed sufficiently secure for any task involving patient data (or even for less sensitive personal data). Chatbots can be vulnerable to hacking and data breaches, and ChatGPT itself warns users “Please don’t share any sensitive information in your conversations”. If the issues around security are resolved in the future, there are still other legal requirements, including around transparency information for patients and jurisdiction issues, that would need to be addressed before patient details should be uploaded.’

CSP education adviser Sundeep Watkins believes there is a cautious place for AI in physiotherapy education.

‘The arrival of AI has certainly been a recent cause for concern for those involved in physiotherapy education, as its evolving capabilities have raised fears around plagiarism and possibly enabling cheating in online examinations. In fact, a 2023 study found that ChatGPT technology could attain the exam scores required of a third-year medical student. However, whilst AI is excellent for locating factual information, it is less capable of demonstrating a deeper understanding of topics or providing critical analysis, and the references it supplies can be fabricated by the AI itself. Providers of the plagiarism detection software used by higher education institutions have begun to update their products to be more sensitive to AI-generated content in essays, an important step in ensuring academic integrity.

‘There is currently very little guidance for students as to whether the use of AI is permissible during their courses, with differing opinions on its safety and efficacy. New advice from the Quality Assurance Agency (QAA), which reviews standards at UK universities, urges higher education institutions to equip students with AI skills they can take into the world of work. It suggests that guidance should be made available to students for the coming academic year on how and when AI should be used, and that institutions should adapt courses where appropriate, possibly exploring alternative methods of assessment.

‘When studying a vocational degree course such as physiotherapy, there are some ways in which AI can support student learning and development. ChatGPT can be used to quickly summarise topics or answer questions on factual content such as anatomy, pathophysiology, or exercise prescription. The technology can be used as an interactive interface to simulate conversations with virtual clients or create virtual cases for students to practise communication and assessment skills. AI can easily devise a revision timetable or provide a starting point for written coursework. It can also help to summarise your essay if you are struggling to reduce your word count. The future will see AI technology develop further, and by understanding its capability, ChatGPT could be used as a tool in a purposefully designed approach to teaching, learning and assessment, providing support for learners in their education.’

CSP professional adviser Jackie Lidgard says that ChatGPT itself tells us that it won’t replace a physiotherapist.

But could it help our practice? And is there anything we need to worry about? ‘Services are already using chatbots to provide real-time triage to patients and, using evidence-based recommendations, to advise them on their next steps. With ever-increasing waiting lists and pressures on services, using the technology in this way has the potential to free up clinician time to use elsewhere in the pathway. But what happens if a red flag symptom is missed, or incorrect advice is given? There is currently no specific legislation regarding the use of AI in healthcare. Instead, other laws relating to data protection and medical devices have been adapted. Therefore, services will need to ensure governance and pathway structures are in place and regularly reviewed. 

‘ChatGPT can be used as a ‘virtual assistant’ to support patients to manage their appointments or to send data from telemedicine devices that monitor patients’ health at home. It can use this information to alert patients or clinicians to make timely changes to management, helping to prevent deterioration. It has the capability to take information from medical records and use it to write treatment summaries or discharge letters. To do this, it requires patients’ data, which needs the same protection as any other form of confidential patient information.

‘The UK government believes that, in the future, it is almost certain that ChatGPT or similar technology will take over some of the work along the physiotherapy pathway. However, it is unlikely to be a substitute for your own physical assessment, clinical expertise, and experience, and it therefore needs to be cautiously welcomed into practice rather than embraced with open arms.’

Does ChatGPT have a place in physiotherapy?

As the experts state above, there is a role for AI to supplement the skillset, learning and delivery of the physiotherapy profession, but only if used appropriately. This is a position supported by research published in 2022. But does ChatGPT agree? … ‘In summary, while I do not have a direct role in physiotherapy, I can provide information and support that can be valuable to physiotherapists and patients’.

If you’d like to know more about the potential role of ChatGPT, or any other AI platforms, as well as other technology then visit the digital physiotherapy pages of the CSP website.

You can also explore the Physiotherapy Health Informatics Strategy and find more information on how to join the Digital and Informatics Physiotherapy Group. 

Professional Advice team

The CSP’s Professional Advice Service gives advice and support to members on complex and specialist enquiries about physiotherapy practice, including professional practice issues, standards, values and behaviours, international working, service design and commissioning, and policy in practice.
