
One of the many reasons I think we all would've been better off had Brendan Eich simply been able to use Scheme within the crazy time constraint he was given, rather than create JavaScript, :) is that Scheme comes with a distinction between exact and inexact numbers, in its numerical tower:

https://en.wikipedia.org/wiki/Numerical_tower

One change I'd consider making to Scheme, and to most high-level general-purpose languages (that aren't specialized for number-crunching or systems programming), is to have the reader default to reading numeric literals as exact.

For example, the current behavior in Racket and Guile:

    Welcome to Racket v7.3.
    > (+ 0.1 0.2)
    0.30000000000000004
    > (+ #e0.1 #e0.2)
    3/10
    > (exact->inexact (+ #e0.1 #e0.2))
    0.3
So, I'd lean towards getting the `#e` behavior without needing the `#e` in the source.

By default, that would give the programmer in this high-level language the expected behavior.

And systems programmers, people writing number-crunching code, would be able to add annotations when they want an imprecise float or an overflowable int.

(I'd also default to displaying exact fractional rational numbers using familiar decimal point conventions, not the fractional form in the example above.)
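That display convention is easy to sketch. Here's one way to do it in Racket (the helper names are mine, not standard library functions): a rational p/q in lowest terms has a terminating decimal expansion exactly when q = 2^a * 5^b, so print those with a decimal point and fall back to the p/q form for everything else.

```scheme
#lang racket

;; How many times p divides n.
(define (factor-out p n)
  (if (zero? (remainder n p))
      (add1 (factor-out p (quotient n p)))
      0))

;; Display an exact rational in decimal notation when its expansion
;; terminates (denominator of the form 2^a * 5^b); otherwise keep p/q.
(define (exact->decimal-string q)
  (define d (denominator q))
  (define a (factor-out 2 d))
  (define b (factor-out 5 d))
  (define k (max a b))
  (cond
    [(or (not (= d (* (expt 2 a) (expt 5 b)))) (zero? k))
     (number->string q)]  ; non-terminating expansion, or an integer
    [else
     (let* ([s (number->string (* (abs (numerator q))
                                  (quotient (expt 10 k) d)))]
            ;; pad so there is at least one digit before the point
            [s (string-append
                (make-string (max 0 (- (add1 k) (string-length s))) #\0) s)]
            [point (- (string-length s) k)])
       (string-append (if (negative? q) "-" "")
                      (substring s 0 point) "." (substring s point)))]))

(exact->decimal-string (+ #e0.1 #e0.2))  ; "0.3"
(exact->decimal-string 1/3)              ; "1/3"
```

So the programmer sees the familiar `0.3`, while the value underneath stays the exact `3/10`.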




Many languages make a distinction between binary floating-point numbers and decimal/fixed-point numbers. Decimal types (e.g. `BigDecimal` in Java, `Decimal` in Python) do not suffer from this problem.

This would make particular sense in a language like Python, which no one (should?) be using for systems programming.

Agreed. Though, for reasons, I had to write essentially a userland device driver in Python (complete with buffer management and keyboard decoder). It was rock-solid in production, in remote appliances, and I was very glad Python was up to that. :)

A Scheme system that implemented exact reals as unnormalized floating decimal (IEEE 754-2008), coupled with a directive that said "numbers with a decimal point should/should not be read as exact", would be wonderful, not just for financial things, but also for teaching students.

It's actually easy to implement that slight variation in Racket, as a `#lang` or reader extension.

As an example of a Scheme-ish `#lang`, here's a Racket `#lang sicp` that I made to mimic MIT Scheme, as well as add a few things needed for SICP: https://github.com/sicp-lang/sicp/blob/master/sicp/main.rkt

It would be even easier to make a `#lang better-scheme`, by defining just a few changes relative to `racket/base`, such as how numbers are read.


The reader even has a parameter `read-decimal-as-inexact` that controls how to read decimal numbers.
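Flipping that parameter makes the stock reader do most of the work. A minimal demonstration at a plain Racket REPL (the `read-exactly` helper name is mine; `read-decimal-as-inexact` is the real reader parameter):

```scheme
#lang racket

;; With read-decimal-as-inexact set to #f, literals like 0.1 are read
;; as exact rationals instead of IEEE 754 doubles.
(define (read-exactly str)
  (parameterize ([read-decimal-as-inexact #f])
    (read (open-input-string str))))

(read-exactly "0.1")                            ; 1/10
(+ (read-exactly "0.1") (read-exactly "0.2"))   ; 3/10, not 0.30000000000000004
```

A `#lang` would set the same parameter in its reader so that ordinary source literals get this behavior without any wrapping.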

Good point. For exacts in a `#lang`, I realized afterwards that it's not just literals/self-evals: there are some standard-library functions you'd probably want to wrap, to force them to produce exact numbers. And maybe check that there's nothing in the standard library that does inexact arithmetic internally in a way that would be unpredictable to a programmer who expects exact-number behavior.
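One possible discipline for that wrapping (my own sketch, and the names are mine): rather than silently calling `inexact->exact` on results, which would just freeze a float's binary rounding error into an exact rational, wrap library functions so that any inexact result is flagged loudly.

```scheme
#lang racket

;; Wrap a library function so an inexact result raises an error
;; instead of quietly contaminating exact arithmetic downstream.
(define ((guard-exact name f) . args)
  (define r (apply f args))
  (when (inexact? r)
    (error name "produced an inexact result: ~a" r))
  r)

(define sqrt/e (guard-exact 'sqrt/e sqrt))

(sqrt/e 9)     ; 3 -- exact in, exact out
;; (sqrt/e 2)  ; would raise: sqrt/e: produced an inexact result: 1.41...
```

Racket's `sqrt` already returns exact results for exact perfect squares, so the guard only fires in the cases a `better-scheme` programmer would want to hear about.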

Scheme did not "look like Java" so was ruled out on that basis (also on others which I have discussed at length in several interviews, most recently with Lex Fridman).

Thanks for that interview; very interesting, and I also appreciate your words for Scheme.

For HN, I'd like to point out that it was a historical accident that Java looked like it did, as far as the Web was concerned.

IIRC, Java looked like it did to appeal to technical and shrinkwrap developers, who were using C++ or C. (When I was lucky enough to first see Java, then called Oak, they said it was for embedded systems development for TV set-top boxes. I didn't see Java applets until a little later.)

But the Web at the time was intended to be democratizing/inclusive (like BASIC, HyperCard, and Python). And the majority of the professional side was closer to what used to be called "MIS" development (such as 4GLs, but not C/C++). And in practice, HTML-generating application backends at the time were mostly written in languages other than C/C++.

I'm sympathetic to the rebranding of the glue language for Java applets (and for small bits of dynamic behavior), to be named like, and look like, Java. That made sense at the time, when we thought Java was going to be big for Web frontend (and I liked the HotJava story for a thin-client browser extended on-demand with multimedia content handlers). And before the browser changed from hypertext navigator to GUI toolkit.

But it's funny that we're all using C-descendant syntax only through a series of historical accidents, when that wasn't even what the programmers at punctuated points in its adoption actually used (we only thought it would be, at the time the decisions were made).

