ChatGPT has undeniably established itself as a reliable source of technical information, but can it also excel in offering social advice? A recent study published in the journal Frontiers in Psychology explores this question, revealing that later versions of ChatGPT surpass professional columnists in dispensing personal advice.

Since its public release in November 2022, ChatGPT has amassed an estimated 100 million monthly active users in just two months. The paid version, built on GPT-4, one of the largest language models ever created, reportedly has approximately 1.76 trillion parameters, and it has sparked a revolution in the AI industry. Trained on extensive text datasets, ChatGPT can provide advice on diverse topics, from law and medicine to history, geography, and more.

Users and AI experts have marveled at ChatGPT’s conversational style and adaptability. Many have turned to the chatbot for personal advice, a realm that necessitates empathy—a quality not explicitly programmed into ChatGPT.

Earlier iterations of ChatGPT struggled with social advice, lacking emotional sensitivity. The latest version, built on GPT-4, lets users request multiple responses to the same question and indicate which one they prefer. This preference feedback improves the model’s ability to produce socially appropriate and empathetic responses.
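The feedback loop described above can be sketched as a simple data-collection step: a user's pick among several candidate replies is converted into (chosen, rejected) pairs of the kind used in preference-based fine-tuning. This is a hypothetical illustration of the idea, not OpenAI's actual training pipeline; all names here are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class PreferencePair:
    """One human judgment: the chosen response beats the rejected one."""
    prompt: str
    chosen: str
    rejected: str

def collect_preferences(prompt: str, candidates: list[str],
                        preferred_index: int) -> list[PreferencePair]:
    """Turn a user's pick among candidate replies into preference pairs.

    The preferred reply is paired against every other candidate, yielding
    the kind of comparison data preference-tuning methods train on.
    """
    chosen = candidates[preferred_index]
    return [
        PreferencePair(prompt, chosen, other)
        for i, other in enumerate(candidates)
        if i != preferred_index
    ]

# Example: a user asks for advice and prefers the second of three drafts.
pairs = collect_preferences(
    "My partner got a job overseas. What should we do?",
    [
        "Break up.",
        "Discuss both career paths and look for a compromise.",
        "Move immediately.",
    ],
    preferred_index=1,
)
```

Each run of the loop contributes a handful of such pairs; aggregated over many users, they give the model a signal about which phrasings people find more helpful and empathetic.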

A groundbreaking study compared ChatGPT’s responses to those of human advice columnists in addressing social dilemmas. Participants overwhelmingly perceived ChatGPT’s advice as more balanced, complete, empathetic, helpful, and superior overall compared to professional advice.

One dilemma, involving a marine biologist facing a long-distance relationship, demonstrated ChatGPT’s nuanced response. Participants favored ChatGPT’s advice, emphasizing the importance of considering career paths and suggesting compromises.

Yet even though participants rated ChatGPT’s responses more favorably, most said they would still rather receive advice from a human for their own dilemmas. This bias suggests that, while ChatGPT excels in certain respects, people still value the emotional understanding they believe machines lack.

The study’s results highlight the potential for appropriately designed chatbots, like ChatGPT, to augment therapy in the future. However, caution is warranted, and the study acknowledges that AI chatbots should not replace professional advisers or therapists entirely.

In conclusion, the success of ChatGPT in providing superior social advice opens new possibilities for AI applications in counseling, while also encouraging human advisers to enhance their approaches by learning from AI.

By Impact Lab