As of this writing, the test case specification for Stage #LV1 (Evaluating Expressions / Literals: Number) states:
For the number literals, the tester will check that the program prints the number with the minimum number of decimal places without losing precision. (For example, 10.40 should be printed as 10.4).
This is inconsistent with the Lox specification, which displays number literals with the minimum number of decimal places or one decimal place, whichever is larger (so 10.40 prints as 10.4, but 72 prints as 72.0 rather than 72).
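To make the contrast concrete, here is a minimal sketch in Java (the language of the book's jlox); the class and method names are illustrative assumptions, not the tester's or the book's actual code:

// Sketch only: contrasts the two formatting rules described above.
class NumberFormatSketch {
    // Evaluation-stage style: trim a trailing ".0", so 10.40 -> "10.4" and 72.0 -> "72".
    static String stringify(double value) {
        String text = Double.toString(value);
        if (text.endsWith(".0")) text = text.substring(0, text.length() - 2);
        return text;
    }

    // Scanner/parser style: keep at least one decimal place, so 72.0 stays "72.0".
    static String literal(double value) {
        return Double.toString(value);
    }

    public static void main(String[] args) {
        System.out.println(stringify(10.40)); // 10.4
        System.out.println(stringify(72.0));  // 72
        System.out.println(literal(72.0));    // 72.0
    }
}

Whole numbers are exactly where the two rules diverge: under the first rule 72.0 loses its decimal point entirely, while under the second it keeps one decimal place.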
Other stages such as Stage #KJ0 (Scanning: Number literals) and Stage #RA8 (Parsing Expressions / Number literals) follow the Lox specification, as linked in their descriptions:
// Note: This is just for the expression parsing chapter which prints the AST.
(5 - (3 - 1)) + -1
// expect: (+ (group (- 5.0 (group (- 3.0 1.0)))) (- 1.0))
Why is this the case? Wouldn't it make more sense to require number literals to be displayed the same way across stages?
Yeah, that is unfortunate; I can't see it as anything but an oversight, to be honest. Once I clear this challenge and figure out how to compile and run the program locally, that would be the first thing to go, lol.
I saw a similar question from a few months ago, in which it was suggested that the tester accept either form of numeric output. Is that idea still being considered?
All that said, this challenge is still great (speaking as a first-time user), and it draws from high-quality source material too. Looking forward to the other challenges!