Link blog

Can you prompt LLMs to admit when they don't know the answer?

on 06-01-2025 09:11

In this piece, Anthony Shaw describes his attempts to make LLMs recognise when they cannot answer a question and how, by default, LLMs will output a response that looks helpful even when it is complete garbage.

The main example is a game of Set where Anthony and his kid wanted help determining whether there was a match, given a photo of the cards in play. The LLMs he prompted were all too keen to produce garbage output, with most (if not all) of them mentioning cards that weren't even in the photo.

In the end, Anthony did manage to get better outputs by instructing the LLMs to reply with a shrugging emoji whenever they could not compute the answer reliably. (There is also a bonus snippet of Python code at the end of the post that looks for matches programmatically. Maybe prompting the LLMs to write such a piece of code would have been easier and more reliable.)
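For reference, the core check is easy to express in Python: in Set, three cards form a valid match exactly when, for every attribute, the three values are either all the same or all different. Below is a minimal sketch of that idea; the card encoding and function names are my own for illustration, not necessarily what Anthony's snippet uses:

```python
from itertools import combinations

# Each card is a tuple of (number, colour, shading, shape).
Card = tuple[str, str, str, str]

def is_set(a: Card, b: Card, c: Card) -> bool:
    """Check the "all same or all different" rule for every attribute.

    A set of three values has length 1 (all same) or 3 (all different);
    length 2 means exactly two cards agree, which breaks the rule.
    """
    return all(len({x, y, z}) != 2 for x, y, z in zip(a, b, c))

def find_sets(cards: list[Card]) -> list[tuple[Card, Card, Card]]:
    """Return every valid set among the cards currently in play."""
    return [trio for trio in combinations(cards, 3) if is_set(*trio)]

# Hypothetical cards in play, standing in for the ones in the photo.
cards_in_play = [
    ("one", "red", "solid", "diamond"),
    ("two", "red", "striped", "diamond"),
    ("three", "red", "open", "diamond"),
    ("one", "green", "solid", "squiggle"),
]
for trio in find_sets(cards_in_play):
    print(trio)
```

The brute force over all triples is perfectly fine here, since a Set board rarely has more than 12 to 15 cards in play.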


