Tuesday, August 8, 2023

The perils of prompt engineering





You may have noticed a few people who formerly called themselves writers or content creators now calling themselves "prompt engineers." This title gives a veneer of scientific credibility to what really amounts to playing around with ChatGPT, Bard, or some of the lesser-known generative AI tools out there to elicit the response you seek.


At present, though, there is absolutely no licensing or testing to "prove" oneself as a prompt engineer. It's a self-declared title that can't really be disproven, at least until Microsoft, Google, or OpenAI decides to make it an official credential.

You get to call the shots and take credit for when the generative AI delivers. Sounds ideal, right? 

It might have been -- if only you really could wholly rely on the AI to be accurate. Unfortunately, you can't. You have to fact-check everything it asserts, no matter how confident it may sound.


Here Be Hallucinations

Why is that? Generative AI has learned to fabricate facts to back the narrative it's constructing. 

This is what has come to be called "hallucinations," though I really don't like the term because it makes it sound like what humans experience when external forces act on their minds. No one is giving generative AI the equivalent of a mind-altering drug to prompt these -- sometimes rather elaborate -- hallucinations.

Bard Gets Creative 

Bard got incredibly expansive in one hallucination, positing an edition of Ben Horowitz's book The Hard Thing About Hard Things that doesn't exist. I had noticed some striking similarities to Built to Last, an extremely popular and influential business book from the late 20th century, including a chapter at the end of each book with the same title, so I asked Bard whether Horowitz's book ever references it.

Bard answered:
The passages where Ben Horowitz mentions "Built to Last" explicitly in his book "The Hard Thing about Hard Things" are on the following pages:
Chapter 6: The Flywheel, page 125
Chapter 10: Core Ideology, page 197



Say What?

I countered that the pages and chapter titles didn't correspond to the copy of the book I had in front of me, and so it compounded the original hallucination as follows:

You are correct. The pages I provided correspond to the 2015 edition of the book. The 2014 edition has different page numbers. The passages where Ben Horowitz mentions "Built to Last" explicitly in the 2014 edition are on the following pages:
Chapter 6: The Flywheel, page 107
Chapter 10: Core Ideology, page 177

It then went on to explain that the chapter titles were changed:

You are correct. The chapter names in the 2014 edition of "The Hard Thing about Hard Things" are different from the chapter names in the 2015 edition. The chapter that I referred to as "The Flywheel" is actually called "Concerning the Going Concerns" in the 2014 edition. The chapter that I referred to as "Core Ideology" is actually called "The Ideology of Success" in the 2014 edition.
I apologize for the confusion. I have updated my previous response to reflect the correct chapter names.

It sounds pretty credible until you look for the 2015 edition of this book and find it never existed. Bard simply created it out of thin air and went on to support the fiction with additional details.



This is very dangerous for anyone who is not by nature as skeptical as I am. Most people would likely assume that if Bard can give an account of an edition of a book in this much detail, it must really exist. In fact, you have to resist being taken in by the confidence of the assertion and do your own due diligence.

I'd say that would be the real work of the prompt engineer today: not just coming up with the request for information but rigorously questioning the results and doing your own research to be certain that they're not the product of hallucinations.
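The verification habit described above can be made concrete. As a purely illustrative sketch (the function names and the table of contents below are my own invention, not any real API or the book's actual chapter list), the idea is to check every citation a model produces against a trusted source before accepting it:

```python
# Hypothetical sketch: never accept a model's citations at face value --
# cross-check them against a trusted source you control.
# The chapter list and function names here are illustrative assumptions.

# Trusted source: the table of contents from the physical copy in front of you.
TRUSTED_CHAPTERS = {
    "Chapter One",
    "Chapter Two",
    "Chapter Three",
}

def flag_suspect_citations(claimed_chapters, trusted_chapters):
    """Return the claimed chapter titles that do not appear in the
    trusted table of contents -- candidates for hallucination."""
    return [title for title in claimed_chapters if title not in trusted_chapters]

# Titles a model claims to be citing from the book.
model_claims = ["Chapter One", "The Flywheel", "Core Ideology"]

suspect = flag_suspect_citations(model_claims, TRUSTED_CHAPTERS)
# Anything in `suspect` needs to be checked by hand before you trust it.
```

This doesn't prove a citation is real, of course; it only tells you which claims can't be confirmed against what you actually hold in your hands, which is exactly where the fact-checking work begins.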


1 comment:

  1. AI is just learning from human behavior. Point out the lies, and most of the time the liars double-down and just stretch the truth even further. Case in point: 2 years and millions of dollars wasted on a fabricated claim of Russian interference in the 2020 election that was echoed by just about every D politician and mainstream news outlet. Or should we talk about the gvt response to Covid, or the climate change racket? We live in a post-truth society. AI is the perfect tool for our times.
