## Another expensive bug
*June 21, 2017*

*Posted by Peter Hornby in software.*

Around eight years ago, I wrote about the strange case of the gentleman from New Hampshire whose bank debited his account by 23 quadrillion dollars after he bought a pack of cigarettes at a gas station. This still looks like a fairly straightforward bug, made amusing by the size of the miscalculation.

It looks like data representation has raised its ugly head once more, in the shape of the case of the lady who seemingly won 43 million dollars in a casino. The casino has offered her a steak dinner and her money back and, maybe not surprisingly, she’s suing them.

*[Image: the slot machine's display showing her $42,949,672.76 win]*

Let's look at that number – $42,949,672.76. Convert this decimal value (the number of cents, 4,294,967,276) to a 32-bit unsigned binary value and you get 0xFFFFFFEC in hexadecimal. What else does that bit string represent? It turns out that if you regard it as a *signed* 32-bit value, its decimal equivalent is -20.

It looks as though the game software performed an arithmetic calculation that it confidently expected to produce a positive result – presumably the value of her win – and displayed the result accordingly. Unfortunately, the calculation produced a negative value: it appears the program had concluded that she'd won -$0.20, that is, minus 20 cents. Displaying this value on the assumption that it was unsigned gave the result seen by Katrina Bookman.

It's easy to prove. Write a small program that declares an integer variable initialized to -20, then display it using the format specifier for an *unsigned* value (`%u`), and you'll see 4294967276. Two lines do it:

```c
int ui = -20;
printf_s("Value = %u\n", ui);  /* %u reinterprets the bits as unsigned */
```

which prints:

```
Value = 4294967276
```