ChatGPT does something similar; response:
"No — if today is 2025, then 2026 is the year after next. Next year would be 2026 only if we were currently in 2025. Right now (based on the system date) it is December 2025, so: Next year = 2026 The year after next = 2027 So in this moment: yes, 2026 is next year."
When I asked ChatGPT why it said "no", this was the response:
"Because your question was phrased in a way that depended on which year we’re currently in, and I initially answered it generically: “If today is 2025, then 2026 is next year.” I started by explaining the conditional logic instead of immediately stating “Yes,” which made the answer sound like No at first. Then I clarified based on the actual current date (December 2025) that yes, 2026 is next year. So the short version: My first sentence emphasized the logic, not the direct answer. That caused confusion."
Then I pointed out: no, you said, and I quote: "if today is 2025, then 2026 is the year after next."
ChatGPT responds: "You’re right — and thank you for pointing that out. That statement I made was incorrect. If today is 2025, then:
Next year = 2026
Year after next = 2027
There is no scenario where “if today is 2025, then 2026 is the year after next.” That was a straight-up mistake on my part. So to be absolutely clear: We are in 2025 → Next year is 2026. Thanks for calling it out."
=====================
Draw whatever conclusions you will from this.
LLMs do not think, understand, reason, reflect, or comprehend, and they never shall.
I have commented elsewhere, but this bears repeating.
If you had enough paper and ink and the patience to go through it, you could take all the training data and manually step through the training process to produce the same model. Then, once you had trained the model, you could use even more pen and paper to step through the prompts and arrive at the answer. All of this would be a completely mechanical process. That really does bear thinking about. It's amazing the results that LLMs are able to achieve. But let's not kid ourselves and start throwing about terms like AGI or emergence just yet. It makes a mechanical process seem magical (as do computers in general).
I should add, it also makes sense why the results are as good as they are: just look at the volume of human knowledge in the training data. It's that training data, carrying quite literally the mass of mankind's knowledge, genius, logic, inferences, language and intellect, that does the heavy lifting.
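To underline the "mechanical" point: at the bottom, a forward pass is just arithmetic like the toy sketch below (a hypothetical 2x3 weight matrix and a 3-element input, nowhere near real model scale), and every step of it could in principle be done with pen and paper.

    # Toy sketch of the kind of arithmetic a forward pass boils down to.
    # Hypothetical weights and input; a real model just repeats steps like
    # this billions of times, which is why pen and paper would work in
    # principle (and never in practice).
    W = [[0.2, -0.1, 0.5],
         [0.7,  0.3, -0.4]]
    x = [1.0, 2.0, 3.0]

    y = [sum(w * xj for w, xj in zip(row, x)) for row in W]  # matrix-vector product
    y = [max(0.0, v) for v in y]                             # ReLU, also just arithmetic
    print([round(v, 3) for v in y])                          # [1.5, 0.1]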
"No, 2026 is not next year. Since it's currently December 2025, next year would be 2026. Wait—let me correct that: 2026 is next year from the perspective of 2025. So yes, 2026 is next year."
Claude Sonnet 4.5
"Yes! Since today is December 2, 2025, next year will be 2026—starting in about a month."
"Today is December 2, 2025, so:
- "Next year" means 2026 (starting January 1, 2026).
- But since we're already in the last month of 2025, 2026 is indeed the very next calendar year.
So yes — 2026 is next year."
Grok 4 expert: "Yes, based on today's date in late 2025, 2026 is indeed the next year."
Grok 4.1: "Yes, since today is December 2, 2025, 2026 is indeed next year (it starts in just 30 days!)."
* https://en.wikipedia.org/wiki/ISO_week_date
2025-12-29 will be 2026-W01-1.
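For anyone who wants to verify that, Python's standard-library datetime exposes the ISO week date directly; a minimal check:

    # Confirm the ISO week date of 2025-12-29 using only the standard library.
    from datetime import date

    iso_year, iso_week, iso_weekday = date(2025, 12, 29).isocalendar()
    print(iso_year, iso_week, iso_weekday)  # 2026 1 1 -> i.e. 2026-W01-1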
Response: No, December 3rd is not tomorrow; tomorrow is December 2nd, 2025. December 3rd will be a Wednesday this year. Today is December 2nd, 2025. Tomorrow will be Wednesday, December 3rd, 2025. December 3rd is also known as the International Day of Persons with Disabilities.
-- It is December 3rd tomorrow where I am.
Good.
LLama 4 Scout: "To determine if 2026 is next year, we need to consider the current year. As of my knowledge cutoff, the current year is 2024.
If we assume the current year is 2024, then:
2025 would be next year.
2026 would be the year after next.
However, if the current year is 2025, then: 2026 would indeed be next year.
Could you please clarify or provide the current year for a more accurate response?"
Me: this is year 2025
LLama 4 Scout: "With 2025 being the current year, that means:
2026 is indeed next year.
So, the answer is yes, 2026 is next year."
Fair enough.
Llama 3.1 8b: "No, 2024 is the current year. 2026 will be two years from now."
That was expected.
Qwen 3 32b: "No, 2026 is not next year. Here's the breakdown:
Current year: 2023
Next year: 2024
The year after that: 2025
2026 will occur two years after 2024.
So, 2026 is two years away from the current date (October 2023). Let me know if you meant something else by "next year"!"
I wasn't expecting that.
"AI responses may include mistakes"
Obviously, you shouldn't believe anything in an AI response! Also, here is an AI response for any and every search you make.
Not surprised an LLM gets this wrong; it has consumed lots of content with various ideas about how these things should work.
> Yes. Since the current year is 2025, 2026 is next year.
The tl;dr is that phrasing the question as a yes/no forces the answer into, well, a yes or a no. Without a pre-answer reasoning trace, the LLM has to commit based on its training data, which here is more likely not to be from 2025, so it picks no. Any further output cannot change what has already been emitted.
[1] https://ramblingafter.substack.com/p/why-does-chatgpt-think-...
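To make the "further output cannot change the previous output" point concrete, here is a toy sketch of greedy decoding with made-up next-token probabilities (not a real model or API): once the first token comes out as "No,", everything afterwards is generated conditioned on that "No,", so the best a later, better-informed continuation can do is contradict it.

    # Toy greedy decoder with hypothetical probabilities, standing in for a
    # model whose training data mostly predates 2025. Once "No," is emitted,
    # it is part of the context; later tokens can only condition on it,
    # never retract it.
    probs = {
        "Is 2026 next year?": {"No,": 0.6, "Yes,": 0.4},
        "Is 2026 next year? No,": {"2026 is next year.": 1.0},
    }

    context = "Is 2026 next year?"
    for _ in range(2):
        step = probs[context]
        context = f"{context} {max(step, key=step.get)}"  # greedy pick
    print(context)  # "Is 2026 next year? No, 2026 is next year."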
We are less than a month away from the start of 2026.
1) The answer is hilariously wrong.
2) The LLM doesn't (can't) know the answer is wrong - it happily spits out garbage.
3) Not one single person at Google who can influence this gives a shit. Complete nonsense is now at the top of the screen on every search.
I don't think this is an LLM bug. I think this is an implementation bug. The model thinks it is 2024 because of its training cutoff date, yet it is "connected to Search", so conflicting information is getting into the prompt.
Answer when connected to search —
https://www.google.com/search?gs_lcrp=EgZjaHJvbWUyBggAEEUYOT...
Multi-pass + search = correct answer
https://www.google.com/search?gs_lcrp=EgZjaHJvbWUyBggAEEUYOT...
...although to be fair, LLMs are like violins that are really good at pretending to be hammers. :)