Let's chat (about) GPT

Large language model AI services may seem intelligent, but they are still just computer programs.

Marty Traynor is an Omaha-based consultant in the benefits field.

Everyone loves a hot new idea, and unless you’ve been on another planet, you are aware that everyone is chatting about large language model AI programs like ChatGPT.

This has been a hot topic in virtually every aspect of business, and the marketing and administration of benefits is no exception. It’s easy to find articles that contemplate the ways in which communication, enrollment, administration, and ongoing customer service can be made more efficient through use of these tools.

A few months ago, I was preparing a presentation on the past, present, and future of ben admin systems. After creating my outline, I asked ChatGPT to do the same. Its outline was nearly identical to mine, but very generic; it certainly didn’t include personal examples from experience. I used it during the presentation to demonstrate how AI can create first drafts of meeting agendas, sales pitches, etc., with the proviso that an expert needs to add experiences, analogies, and calls to action. The output from ChatGPT was competent, but not inspired. Nor was it inspiring.

There have been some highly publicized, embarrassing examples of outrageous or inappropriate output. Before your organization uses ChatGPT or similar programs, invest the resources to learn their proper use.

Large language model AI services may seem intelligent, but they are still just computer programs, so the famous dictum “garbage in, garbage out” applies. Great care should be taken to ask the right questions and to properly frame requests. Generate output through a series of iterative requests, then review the results to determine which produced an answer consistent with your thinking.

Naturally, you should always add your personal touch to a document before it is used with a customer. The most effective communications between product providers and customers cover both intellectual and emotional content. Facts and statistics make up the intellectual content; the emotional content includes analogies, examples, and personal experiences. Customers need to be able to relate to our products and services. An AI program, no matter how sophisticated, will be unable to provide personal experiences or create meaningful analogies, and is likely to be clinical in providing information. Never forget that a personal touch is the secret sauce behind customer connections.

Using unedited material prepared by an AI program can also lead to embarrassing ethical missteps. One issue being discussed in many forums is how these systems “learn” by absorbing millions of pages of content from any available machine-readable source. You do not want to discover that you are publishing material created and copyrighted by a competitor.

The use of AI-powered natural language processing systems, combined with machine learning, can result in streamlining administration, simplifying the enrollment experience for employees, and enhancing employee engagement. But it’s vitally important for benefit professionals who are embracing this digital transformation to provide appropriate human oversight.

We want to serve our customers in a way that inspires!