These days, Artificial Intelligence (AI) is being used to save time and streamline research across many fields, so it’s understandable that specifiers and distributors would assume AI’s vast knowledge base could make the task of discovering rebates easier.
The folks at BriteSwitch, which offers a comprehensive RebatePro database and proprietary Project Management System, beg to differ. They’ve just published an article, “Why Relying on ChatGPT AI for Identifying Rebates is Risky for Your Business,” that highlights some of the potential pitfalls.
#1 Concern: Outdated Data. As the BriteSwitch article points out, “ChatGPT and similar AI systems may fall short in providing up-to-the-minute information. The freely available ChatGPT by OpenAI is limited to a dataset dated September 2021. The paid version of ChatGPT and tools that incorporate it, such as Bing’s Chat tool, have access to more recent data, but without explicit commands and plugins, you can’t be sure about the timeliness of the data.”
#2 Concern: False Facts. You may have read reports of AI fabricating information (for example, an invented sexual harassment complaint) that is not only misleading but harmful.
“While ChatGPT demonstrates remarkable text generation capabilities, it’s not immune to producing inaccurate or misleading information. As more people have used ChatGPT and similar tools, it has become evident that trusting its answers can be dangerous,” the BriteSwitch article states, adding, “Responses from ChatGPT can sound incredibly convincing, even when the information is incorrect or a complete fabrication, leading many to refer to them as ‘hallucinations.’ These hallucinations are the most dangerous aspect of relying on an AI system because the answer received from ChatGPT is stated very convincingly as a fact when, in reality, it’s wrong.”
The BriteSwitch article makes the case for relying on algorithms instead of AI. To read the BriteSwitch article in its entirety, click here.