Discussion
ksaj: I saw a YouTube Short of a teacher demonstrating this to her young students. Of course the kids laughed a lot at the results of her literally enacting their instructions and exaggerating the missing necessary information. But I bet they came away with a far more technical thought process. This should be part of the curriculum.
GianFabien: What's the point? No matter how detailed and comprehensive the instructions and steps by the AI, you still don't get a PBJ sandwich to eat. You have to go to the kitchen and do it yourself.
t-writescode: It’s a reference to a famous YouTube video [0] about how to write instructions that can be followed. One of the most important things a programmer needs to learn is how to tell a computer to do something. It’s a surprisingly hard skill, because each step is far more complicated and has far more variables than you'd expect. [0] https://youtu.be/FN2RM-CHkuI
jgable: It’s funny: when I’ve seen this demonstrated, it’s essentially impossible to get the right result, because the test maker doesn’t define an instruction set that you can rely on. They will deliberately misinterpret whatever instructions you give, no matter how detailed. A computer has a defined ISA that is specified in terms of behavior, and a compiler transforms a language with higher-level abstractions into this low-level language. I’ve never seen this “test” done with any similar affordance, so it doesn’t really teach anything.
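The contrast being drawn here is that a real machine exposes a fixed instruction set whose behavior is specified in advance, so precise instructions can actually succeed. A minimal sketch of that idea as a toy "kitchen ISA" interpreter (the instruction names, preconditions, and kitchen state are invented for illustration, not taken from the thread):

```python
# A toy "kitchen ISA": a small, fixed set of instructions with defined
# behavior. Unlike the classroom exercise, the executor cannot improvise
# a new misinterpretation -- it either follows the spec or raises an error.

def execute(program):
    """Run a list of (opcode, arg) instructions against a kitchen state."""
    state = {"held": None, "bread": "bagged", "jar": "closed", "slices": []}
    for op, arg in program:
        if op == "PICK_UP":
            state["held"] = arg
        elif op == "OPEN":
            state[arg] = "open"
        elif op == "SPREAD":
            # Behavior is specified via explicit preconditions.
            if state["jar"] != "open" or state["held"] != "knife":
                raise RuntimeError(f"cannot SPREAD {arg}: precondition failed")
            state["slices"].append(arg)
        elif op == "ASSEMBLE":
            if len(state["slices"]) < 2:
                raise RuntimeError("ASSEMBLE needs two prepared slices")
            return "sandwich"
        else:
            raise ValueError(f"unknown instruction: {op}")
    return None

program = [
    ("OPEN", "bread"),
    ("OPEN", "jar"),
    ("PICK_UP", "knife"),
    ("SPREAD", "peanut butter"),
    ("SPREAD", "jelly"),
    ("ASSEMBLE", None),
]
print(execute(program))  # a well-formed program yields "sandwich"
```

With an instruction set like this, the exercise becomes winnable: any ambiguity shows up as a hard error rather than as an adversarial reinterpretation.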
Benjamin_Dobell: Although this is a facetious take, instructing a robot to follow recipes is a fantastic introduction to coding. I added a visual scripting layer to Overcooked so kids can program robots to make all sorts of dishes (Sushi, Pasta, Cakes, etc.): https://youtu.be/ITWSL5lTLig This is part of a club to teach kids coding, creativity and digital literacy.
parpfish: I once had this "make a PB&J" exercise as part of a written take-home interview. I knew the schtick -- no matter how precise and complete you are, there is always the possibility of another little gotcha. That makes it absolute rubbish for a take-home, because how much detail do I need to go into to satisfy the manager reviewing it? I think I wrote a couple of paragraphs and ended with a little rant about how I knew how this problem works and that it would work better in person. I don't know how much they expected somebody to write.
void-star: It’s almost like we need some deterministic set of instructions that can be fed to a machine and followed reliably? Like… I don’t know… a “programming language”?
totallymike: Oh, I think this lesson teaches quite a lot. Maybe your instructor is deliberately screwing up, but other end users may simply not be paying attention, or be missing assumed knowledge, or be feeling particularly adversarial on the day they need to follow your instructions. One of the many lessons that can be taken from this exercise is to understand your audience and challenge the assumptions you make about their prior knowledge, culture, kind of peanut butter, et cetera.
fghorow: As always, there's an XKCD for this! [1] https://xkcd.com/149/
LeoPanthera: Demonstrations like this are a regular feature of the Japanese educational TV show "Texico", which teaches logical thinking with the specific goal of preparing young children for programming. I highly recommend it. It's extremely well made, and quite entertaining even for adults. It's available in English, 10 minutes per episode, no subscription required: https://www3.nhk.or.jp/nhkworld/en/shows/texico/
userbinator: Texaco + Mexico = Texico? The Japanese never fail to amuse foreigners with their naming.
ljlolel: Texas?
notsylver: This feels like a BuzzFeed quiz for developers. If you think about each step long enough, you can't really give a wrong answer.
dang: My "related" past threads fu is failing me right now but I know there have been several threads with this theme in the past, including the video with the dad carrying out his kids' literal instructions in a cute but also borderline uncomfortable way.
jbritton: It’s kind of interesting relating this to LLMs. With a chef in a kitchen, you can just say you want a PB&J. With a robot: does it know where things are? Once it knows that, does it know how to retrieve the items, open them, and close them? It’s always a mystery what you get back from an LLM.
jbritton: Also true of specifications. Anything not explicitly stated will be decided by the implementer, maybe to your liking or maybe not.
gormen: Of course, we need to give the robot a cognitive architecture so that it understands the task and the context and corrects its own actions; then it will autonomously make such sandwiches every morning for breakfast.
gnabgib: PB&J AI (3 points, 1 year ago, 2 comments) https://news.ycombinator.com/item?id=42222009
Dad Annoys the Heck Out of His Kids by Making PB&Js Based on Their Instructions (2017) https://news.ycombinator.com/item?id=13688715 https://news.ycombinator.com/item?id=41599917
And the infamous: sudo make me a sandwich (2009) https://news.ycombinator.com/item?id=530000