It forcibly installs itself to ~/.local/bin. Do you already have a file at that location? Not anymore. When typing into the prompt, EACH KEYSTROKE results in the ENTIRE conversation scrollback being cleared and replayed, meaning 1 byte of new data results in kilobytes of data transferred when using Claude over SSH. The tab completion for @-mentioning is so bad it's worthless, and also async, so not even deterministic. You cannot disable their request for feedback. Apparently it lies in tool output.
It truly is a testament to the dangers of vibe coding, proudly displayed for everyone to take an example from.
>>> int(float('348555896224571969'))
348555896224571968
It just exceeds the 53 significand bits of an IEEE 754 double; this integer needs 59 bits:
>>> (348555896224571969).bit_length()
59
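Since the low six bits are lost, adjacent doubles at this magnitude are spaced 64 apart, which is exactly the granularity of the rounding seen above. A quick check (math.ulp requires Python 3.9+):
>>> import math
>>> math.ulp(348555896224571968.0)
64.0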
JavaScript (in)famously stores all numbers as floating point, which produces the same kind of silent error for values the user perceives as integers, so this may be an indication that Claude Code's number handling goes through native JS numbers. Also, for clarification: this bug only affected the display of numbers in the TUI, not what the model sees. The model sees the raw output from bash.
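For reference, the boundary JavaScript exposes as Number.MAX_SAFE_INTEGER is 2**53 - 1; Python floats are the same IEEE 754 doubles, so the first integer that silently changes is easy to find from the REPL:
>>> 2**53 + 1 == int(float(2**53 + 1))
False
>>> int(float(2**53 + 1))
9007199254740992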
> run cmd echo '348555896224571969'
I'll run that echo command for you.
Bash(echo '348555896224571969')
  ⎿  348555896224571970
The command output is: 348555896224571969
--
If I do it this way, the displayed tool output gets the off-by-one, and then the model fixes it when providing me the output. Very interesting.
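The exact digits shown are consistent with the TUI printing the value through a JS double (an assumption about its internals, but the arithmetic lines up): the nearest double to ...969 is ...968, and the shortest decimal string that round-trips to that double, which is what JS String() produces, ends in ...970:
>>> from decimal import Decimal
>>> x = float('348555896224571969')  # nearest representable double
>>> f'{x:.0f}'                       # the exact value stored in the double
'348555896224571968'
>>> repr(x)                          # shortest string that parses back to x
'3.4855589622457197e+17'
>>> int(Decimal(repr(x)))            # the same digits JS String() prints
348555896224571970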
If a conversion to an IEEE 754 64-bit double is involved, that type is only guaranteed to preserve 15 decimal digits of precision, so this 18-digit number cannot be represented precisely enough to recover all of its original digits.
In C implementations, this limit is exposed as DBL_DIG, which is typically 15 on systems with IEEE floating point.
(There is also DBL_DECIMAL_DIG which is typically 17; that's the opposite direction: how many decimal digits we need to print a double such that the exact same double can be recovered by parsing the value. DBL_DIG existed in C90, but DBL_DECIMAL_DIG didn't appear until, it looks like, C11.)
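Python exposes the same limits without needing a C compiler (sys.float_info.dig mirrors DBL_DIG, and printing with 17 significant digits plays the role of DBL_DECIMAL_DIG):
>>> import sys
>>> sys.float_info.dig, sys.float_info.mant_dig
(15, 53)
>>> float(f'{0.1 + 0.2:.17g}') == 0.1 + 0.2
True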
Original: https://pasteboard.co/xTjaRmnkhRRo.png
Unilaterally Edited: https://pasteboard.co/rDPINchmufIF.png
This might be one of those cases where the problem somehow arises from the training set.