They train on sneer-problems now:
Here’s the “ferry‑shuttle” strategy, exactly analogous to the classic two‑ferryman/many‑boats puzzle, but with planes and pilots
And lo and behold, singularity - it can solve variants that no human can solve:
https://chatgpt.com/share/68813f81-1e6c-8004-ab95-5bafc531a969
Two ferrymen and three boats are on the left bank of a river. Each boat holds exactly one man. How can they get both men and all three boats to the right bank?
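For the record, the variant really is impossible: every crossing moves exactly one man together with exactly one boat, so the quantity (men on the left) minus (boats on the left) never changes from its starting value of -1, while the goal needs it to reach 0. Here's a minimal brute-force sketch (my assumptions, not spelled out in the prompt: a boat moves only when a man rows it, one man per boat, and boats can't be towed) that enumerates the whole state space and finds no solution:

```python
# Brute-force check of the "two ferrymen, three boats" variant.
# Assumption (mine): a boat only moves when one man rows it; boats can't be towed.
from collections import deque

MEN, BOATS = 2, 3
start = (MEN, BOATS)   # (men on left bank, boats on left bank); right bank is implied
goal = (0, 0)          # all men and all boats on the right bank

def moves(state):
    men_l, boats_l = state
    men_r, boats_r = MEN - men_l, BOATS - boats_l
    # one man rows one boat left -> right
    if men_l >= 1 and boats_l >= 1:
        yield (men_l - 1, boats_l - 1)
    # one man rows one boat right -> left
    if men_r >= 1 and boats_r >= 1:
        yield (men_l + 1, boats_l + 1)

seen, queue = {start}, deque([start])
while queue:
    state = queue.popleft()
    if state == goal:
        print("solution exists")
        break
    for nxt in moves(state):
        if nxt not in seen:
            seen.add(nxt)
            queue.append(nxt)
else:
    # only (2,3), (1,2), (0,1) are ever reachable
    print("no solution:", len(seen), "reachable states")
```

Running it prints "no solution: 3 reachable states", which is the whole point of feeding the bot this prompt.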
Hmm, maybe that was premature - ChatGPT has chat history on by default now, so maybe that's where it got the idea that this was a classic puzzle?
With history off, it still sounds like it has the problem in its training data, but the output is much more bizarre:
https://markdownpastebin.com/?id=68b58bd1c4154789a493df964b3618f1
Could also be randomness.
Select snippet:
I have to say, with history off it sounds like an even more ambitious moron. I think their history feature may be sort of freezing the bot's behavior in time, because the bot sees a lot of its own past outputs, and in the past it was a lot less into shitting LaTeX all over the place when doing a puzzle.