
C compiler performance in the '80s for home micros was so bad that functions were basically naked entry points to full inline Assembly.
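The pattern looked something like the sketch below. Period compilers (Aztec C, BDS C, and the like) each had their own incompatible inline-asm dialect, so this is a modern stand-in using GCC's extended asm on x86-64, not period code; the shape is the point: the C function exists only as a named, callable entry, and the body is hand-written assembly.

    #include <stddef.h>

    /* The C function just supplies a name and a calling convention;
       all the real work is hand-written assembly.
       rep stosb stores AL into 'count' bytes starting at 'dst'. */
    void fill(unsigned char *dst, unsigned char value, size_t count)
    {
        __asm__ volatile (
            "rep stosb"
            : "+D" (dst), "+c" (count)  /* dst in RDI, count in RCX, both updated */
            : "a" (value)               /* the fill byte in AL */
            : "memory"
        );
    }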

And here we are now discussing how good modern C compilers are.

If there isn't the willingness to spend the money to optimize performance, of course it won't improve.




The initial port of Genera to an Alpha matched the performance of Symbolics' fastest workstations:

http://pt.withy.org/publications/VLM.html "Version 1.0 of the emulator exceeded our initial performance goals, achieving nearly the performance of our high-end custom workstation."

An initial port to a foreign architecture (in-order execution, a 64-bit memory space that is going to overload caches, inefficient 8-bit transactions) matched 15+ years of hardware development on the native architecture.

That wasn't lack of money or development. That was a stunning demonstration that Lisp Machines had an inferior architecture.


> That wasn't lack of money or development. That was a stunning demonstration that Lisp Machines had an inferior architecture.

No, it was nothing more than lack of willingness and money to do it; instead they complained about the state of the art and went home.


> C compiler performance in the 80's for home micros was so bad, that functions were basically naked entry points to full inline Assembly.

For good reason: memory and disk were ferociously expensive until about 1996-1997.

Things like unchecked bounds make sense when you can barely get your computer to do what you want. Designing a VLSI chip through most of the 1990s was HARD. Everything was an exercise in how to fit things on disk, fit things in RAM, run enough polygons, or run enough instructions on the emulator to actually boot.

Now: we can emulate a 6502 in JavaScript while animating it. We can emulate entire PowerPC chips at real-time speed. So, yeah, NOW allocating a couple of cycles to bounds checking and garbage collection makes perfect sense.
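To make the cost concrete, here's a minimal sketch of a bounds-checked accessor in C (the function name and the abort-on-failure policy are my own illustration, not any particular language's semantics): the "couple of cycles" is one compare and one almost-always-well-predicted branch per access.

    #include <stdio.h>
    #include <stdlib.h>

    /* Hypothetical checked accessor: one compare and one branch per
       access, versus silent memory corruption on the unchecked path. */
    int checked_get(const int *buf, size_t len, size_t i)
    {
        if (i >= len) {
            fprintf(stderr, "index %zu out of bounds (len %zu)\n", i, len);
            abort();
        }
        return buf[i];
    }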

We should have started the conversion away from C in 2000. The problem is that 1) sunk cost keeps design decisions made in vastly different environments around for far longer than is reasonable, and 2) we didn't really have good replacement options for C until about 2005.

And then the cellular phone revolution hit and put us back to programming like it's 1995. Finally, now that even cell phones have an overabundance of CPU, storage, and memory, we can start to care about security.


> For good reason: memory and disk were ferociously expensive until about 1996-1997.

Yet Burroughs was able to use Extended Algol in 1961 in an architecture much more constrained than a PDP-11.

> we didn't really have good replacement options for C until about 2005.

Ada, Turbo Pascal, Extended Pascal, Modula-2, Modula-2+, Modula-3, Delphi, Oberon, Oberon-2, Component Pascal, Active Oberon, Cedar, Mesa, Algol 68, Extended Algol, CPL,...

