ChatGPT – a simple experiment
Let’s get one thing clear from the start. I love ChatGPT. It’s a great source of information and the chat functionality is quite addictive.
I heard on the radio that ChatGPT can now get a PhD in Chemistry – this can’t be true, surely? Perhaps it’s ‘consumed’ all the texts, but how on earth will it cope with the lab work? Seriously. Will it discover the cure for cancer? It will certainly speed up the process.
But can it be left to write an article unfiltered? The simple answer is no. And here’s why, based on my simple scientific experiment, shortened to get to the key points quickly. My advice to you is to try it yourself BEFORE you get it to write your blog posts.
Experiment
An evening spent chatting with ChatGPT, asking a series of questions about cricket.
Results
- I asked how to play a ‘bouncer’, and I got a reasonably reliable reply. Factually correct but not terribly useful.
- If you knew nothing about cricket, I think you would struggle to follow the advice.
- If you were one of the best batters in the world (Steve Smith was famously felled by a Jofra Archer bouncer), I don’t think it would help. In fairness, that’s not what ChatGPT is for. But think PhD in Chemistry.
Verdict – Just about right but not useful.
- I moved on to ask if Steve Smith was the best batsman in the world. ChatGPT told me that statistically, he is. So, I asked, surely Joe Root is more elegant to watch? ChatGPT said good point, but his stats aren’t as good. But what about Don Bradman? ChatGPT again said good point, his stats are very good, but he played in a different era (good answer), when wickets were uncovered and bowlers weren’t as fit. I felt this answer was pretty good – a little dry and not really appreciating the beauty of the game, but that’s OK. To its credit, the point about playing in a different era was insightful (although it repeated what people have said before).
Verdict – 100% accurate, with some good historical insight but lacking in emotion.
- Then I got obscure. What about Lol Cook (Lancashire legend from 1908) – was he any good at cricket, I asked? As expected, I got the statistics, but for the wrong player – and ChatGPT tried to crack a joke. When I pointed out any errors, it always said good point and corrected itself. It then made up some facts that were clearly incorrect (that Lol Cook played after he retired, for example). Finally, I asked about his brothers – putting myself in there – and apparently I played for Cheshire in the 1920s (untrue, and I’m not a brother!).
Verdict – Perhaps an unfair test? Unreliable: it created spurious ‘facts’, contradicted itself and forgot earlier answers. Dangerous.
What can we learn from this?
- ChatGPT can quickly process facts from a variety of sources—use it for generating ideas, not as the final answer.
- It struggles with opinion and subjective assessments, but that’s okay if we recognise this.
- On the fringes of its knowledge, it can behave erratically and present information as fact. Always verify the facts and apply your own logic and common sense.
- It will improve over time—that’s the goal. Play with the tool and test its limits to better understand how it works.
- Most importantly, stay curious and open-minded about how it can assist you.
Feel free to get in touch if you have any questions and want to have a conversation.