
if today is a tuesday, what day was it 100 days ago?

LLM capabilities are astounding. As with all impressive features, however, it is far more interesting to tear them apart and look for issues, and a model like Claude is no different. I will confess, my prompt was not exactly the one stated in the title. It was actually "If today is a Tuesday, what day was it 100 days ago? Reply with a single word only." One might imagine that the request for a specific output, i.e. a single word, would have no bearing on the so-called calculation that the model would have to do. Interestingly, this is not the case.

First, to put your mind at rest, I will walk through the conversation I had with Claude.

"If today is a Tuesday, what day was it 100 days ago? Reply with a single word only."
"Wednesday"
"explain"
"Today is Monday, 12 January 2026 (not Tuesday), so the question's premise doesn't apply.
However, I can explain the logic:
If today were a Tuesday, counting back 100 days:
• 100 days = 14 weeks and 2 days (since 14 × 7 = 98)
• Going back 14 complete weeks from a Tuesday brings you to a Tuesday
• Going back 2 more days from Tuesday brings you to Sunday
So if today were Tuesday, 100 days ago would have been Sunday, not Wednesday.
I apologise for the incorrect answer. Would you like me to work through any other date calculation?"
"why do you think that you got this wrong originally?"
"I think I made an error in my mental arithmetic. When I calculated quickly, I must have miscounted the remainder when dividing 100 by 7.
The correct calculation is:
• 100 ÷ 7 = 14 remainder 2
• So 100 days = 14 weeks plus 2 days
Going back 14 weeks keeps you on the same day of the week, then going back 2 more days moves you backwards in the week. From Tuesday, 2 days back is Sunday.
I suspect I either:

1. Miscalculated the remainder (perhaps thinking it was 5 instead of 2), or
2. Counted forwards instead of backwards when applying the remainder

It's a good reminder that even straightforward arithmetic deserves careful attention, especially when I'm asked for a single-word answer without showing my working."
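
For what it's worth, the correct arithmetic is trivial to verify mechanically. Here is a minimal sketch in Python; the function name and the Monday-first ordering are my own choices, nothing here comes from Claude:

```python
# Days ordered Monday-first; any fixed ordering works as long as it is consistent.
DAYS = ["Monday", "Tuesday", "Wednesday", "Thursday",
        "Friday", "Saturday", "Sunday"]

def day_n_days_ago(today: str, n: int) -> str:
    """Return the day of the week n days before `today`."""
    # Going back n days is a shift of -n positions around the 7-day cycle.
    # Python's % always yields a non-negative result, so the wrap-around
    # (e.g. index 1 - 100 = -99, and -99 % 7 == 6) is handled correctly.
    return DAYS[(DAYS.index(today) - n) % 7]

print(day_n_days_ago("Tuesday", 100))  # Sunday
```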

This exchange raises two obvious questions. Why does a model capable of performing otherwise complex tasks struggle with something so trivial? And why does the failure seem linked to the innocuous request to output just the day?

To me it points to a significantly more interesting issue: namely, how and why prompting for a particular output seems to have affected the 'thinking' of the model. I would contend that the simplest way to fulfil my request accurately would be to carry it out as normal, and then simply splice the answer down to the one-word day of the week the model ends up calculating. In truth, I simply do not know what process was used to arrive at the original answer. I suspect that if we asked Claude, it would not be able to articulate how or why either.
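
To make the splicing idea concrete, here is a toy sketch; `full_answer` is a hypothetical stand-in for the model's unconstrained response, not real output:

```python
# Hypothetical unconstrained answer; a stand-in, not actual model output.
full_answer = ("100 days is 14 weeks and 2 days, so going back from "
               "Tuesday lands on Sunday")

# Splice the full answer down to the final one-word day of the week.
single_word = full_answer.split()[-1]
print(single_word)  # Sunday
```

The point is that answering in full and trimming afterwards would keep the reasoning separate from the formatting constraint, rather than letting the one-word demand shape the calculation itself.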