Space Jam's 1996 website is still alive (www.spacejam.com)
707 points by olingern | 2020-07-01 | 273 comments




And still browser compatible :)

loads very fast too!


Hacker News wasn't around that whole time. But the fact that spacejam.com was still around was a big deal a few years ago (don't remember when I first heard that.)

From some googling, it looks like 2010 is when this made news (via reddit).


It’s been discussed on HN multiple other times too in addition to the ones mentioned here.

For example, here is a discussion about it on HN from 2010: https://news.ycombinator.com/item?id=2050807

I remember because it was around that time I first started browsing HN.


What's truly impressive is if you decide to creep a bit, you'll see that loads of users from that thread still post on HN pretty actively.

HN has an insanely high retention rate for being just a little news sharing site. Most people get bored, pissed off, or uninterested at some point and leave. Looking at 10 year old tweet threads or reddit comment sections is basically a graveyard in comparison. Not sure what keeps people sticking around here.

In a couple days, it'll have been 10 years for me. Crazy.


You mentioned Reddit being a graveyard, but is it true?

Check this thread someone linked above from 9 years ago:

https://www.reddit.com/r/todayilearned/comments/esxwd/til_th...

I clicked about 20 profiles, at least 70% are still active.

Actually, I think this phenomenon is probably pretty common for websites that are still rising, or at least holding steady (like both HN and Reddit). Their initial users don't just leave for no reason. Less active, perhaps.


It's been regularly mentioned on Twitter since pretty much forever, so I guess that was just when somebody saw a tweet and decided to post it here.

It has been a commonly-shared fun fact in Internet culture for as long as I can remember.

Reddit from 2010:

https://www.reddit.com/r/todayilearned/comments/esxwd/til_th...


Titles! I think 1996 maybe helped this.

Terms and privacy policy were recently updated.

Hmm, needs more <marquee>

As is Heaven's Gate's website - http://heavensgate.com/

I remember seeing that in the news when I was 16 years old. At that time the Internet was just starting in my country, so the news came via a local news broadcast. It was crazy.

This is actually wild to me. How is it being powered? Did they just pay a hosting provider decades in advance?

Two members stayed behind. I heard recently they still respond to email.

>two group members were briefed about a side mission. They would remain on Earth – the last surviving members – and their job was to maintain the Heaven’s Gate website exactly as it was on the day the last suicides took place.

>And for two decades, the lone Gaters have diligently continued their mission, answered queries, paid bills and dealt with problems.

https://www.mirror.co.uk/news/weird-news/two-decades-after-h...



They didn't drink the Kool-aid. lol.

> They didn't drink the Kool-aid. lol.

Fun fact: the folks at Jonestown didn't drink Kool-Aid either, it was grape Flavor Aid.


Alan Moore wrote a comedy comic where a traumatized Kool-Aid Man is publicly accused of involvement at Jonestown and insists, on the record in a tell-all interview, that it was the Flavor Aid Man who committed the crime.

Imagine missing the rapture to do sysadmin and PR. Maybe they are in hell.

hot tip!

Yikes.

I know it's partly nostalgia, but something about both of these sites feels more fun to interact with and browse through than almost any modern website I visit. The web used to be so much fun.

Indeed.

It's not just nostalgia, the design invites exploration. The web was truly built for surfing back then. Now only The Feed exists.

That's it, exactly.

Navigating these sites feels like exploring a labyrinth. I feel like I can spend an hour on those pages, clicking all the hyperlinks, trying to consume all the content they have to offer.

Weird thing from the bottom of that page's source code (apart from the black-on-black keyword stuffing that was done a lot at the time):

<div id="linkbyme" style="display: none;"><li><a href="http://www.heavensgate.com/img/index.asp?index=bogner-ski-we... ski wear</a></li></div><script>document.getElementById('linkbyme').style.display='none';</script>

Weird.


Wondered how well it would rank on those PageSpeed Insights scores:

https://developers.google.com/speed/pagespeed/insights/?url=...

Looks like 80 KB, and they still find things to flag.


98 on first try, 97 on refresh. 99 on 2nd refresh. Are they just making stuff up?

If they were making stuff up, you'd think at least their own Google.com would score higher than a 76.

That’s an extremely small variation in score. Why would that make you think they’re making stuff up? Networks and servers don’t always respond with identical timings.

Yes, 98, 97, 99 are small.

But when I punched in Google.com, I got 84, and then 78, then 90. That's a pretty wide range.



Reminds me of that quote from Ryan from the Office:

I'm such a perfectionist that I'd kinda rather not do it at all, than do a crappy version.

Seems that Google's software shares that mentality.


It's pretty darn hard to hit their page delivery targets unless you're serving from something close to their testing site. https doesn't help, because it adds round trips (www.spacejam.com doesn't support tls 1.3, and I wouldn't want pagespeed to be using a resume handshake for initial load anyway).

An ultra small, static site can sometimes get 100 though: https://developers.google.com/speed/pagespeed/insights/?url=...


> It's pretty darn hard to hit their page delivery targets unless you're serving from something close to their testing site.

Are you sure? Have you tried using a CDN?

> An ultra small, static site can sometimes get 100 though

Here's an example site I run that gets a close to perfect score that has a fairly complex landing page:

https://developers.google.com/speed/pagespeed/insights/?url=...

(it should load close to instant once it connects in your browser: https://www.checkbot.io/)

The main tips I can give for a high page speed, which most websites don't follow: avoid large header images; make sure text is visible before custom fonts load; use minimal CSS (and/or inline the CSS for the header at the top of the HTML); don't use blocking JavaScript; and especially avoid huge JavaScript-triggered cookie popups (the blocking JavaScript plus a big delay to the Largest Contentful Paint will kill your score).
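To make those tips concrete, here's a minimal sketch of a page skeleton that applies them; the file names and font name are invented:

  <!doctype html>
  <html>
  <head>
    <style>
      /* Critical CSS inlined at the top of the HTML: no blocking stylesheet fetch. */
      header { font-family: Jam, sans-serif; }
      @font-face {
        font-family: Jam;
        src: url(/fonts/jam.woff2) format("woff2");
        font-display: swap; /* fallback text stays visible while the font loads */
      }
    </style>
    <!-- Script deferred so it never blocks rendering or the LCP. -->
    <script src="/js/app.js" defer></script>
  </head>
  <body>
    <header><h1>Fast page</h1></header>
    <!-- Small, explicitly sized image instead of a huge header graphic. -->
    <img src="/img/logo.png" width="120" height="60" alt="logo">
  </body>
  </html>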


Google can't even meet their own metrics. The Gmail landing page gets a PageSpeed score of 24.

https://developers.google.com/speed/pagespeed/insights/?url=...


You can't finish the quiz though - you get stuck on this page: https://www.spacejam.com/cmp/lineup/quiz6.html

Should I contact Warner?


webmaster@spacejam.com would be a starting place

I'm curious how this happens.

Obviously the site owner is intentionally keeping the site up and dealing with outages.

But I wonder why?


> But I wonder why?

Publicity for Space Jam 2?


It's been up for years, well before Space Jam 2 was even in the works.

It's quite possibly literally just a bunch of static HTML files. There's not much maintenance cost there. They probably run all their static sites from the same webservers. It may very well be the same effort to keep it as to delete it.

There's no way it's being maintained on the same software.

so how would you make it bullet proof, just s3 and cloudfront?

so what would you say it costs a year to run?


It's just some basic HTML and CSS. You can basically serve it from a Raspberry Pi for free.
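To make that concrete, here's a minimal sketch of such a static file server in Node, using only built-in modules (the web root directory and port are invented):

  const http = require('http');
  const fs = require('fs');
  const path = require('path');

  const ROOT = path.join(__dirname, 'site'); // hypothetical folder of static files
  const TYPES = { '.html': 'text/html', '.css': 'text/css', '.gif': 'image/gif', '.jpg': 'image/jpeg' };

  http.createServer((req, res) => {
    let urlPath = decodeURIComponent(req.url.split('?')[0]);
    if (urlPath.endsWith('/')) urlPath += 'index.html';   // directory -> index page
    const file = path.normalize(path.join(ROOT, urlPath));
    if (!file.startsWith(ROOT)) {                         // refuse ../ path traversal
      res.writeHead(403);
      return res.end();
    }
    fs.readFile(file, (err, data) => {
      if (err) { res.writeHead(404); return res.end('Not found'); }
      res.writeHead(200, { 'Content-Type': TYPES[path.extname(file)] || 'application/octet-stream' });
      res.end(data);
    });
  }).listen(8080);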

Warner is hosting it; it probably costs them almost nothing extra.


It's running on a fairly current version of Apache, but aside from keeping the server up to date, it conceivably could be running the same setup for years.

For an organization the size of Warner Bros, it's essentially free, as they are literally doing nothing to the server for that site specifically.

However, it does look like it's running on AWS using their global accelerator (globally optimized traffic) so I assume it's sufficiently robust.


I got a cookie notice this time which I think is new in the past couple of years...

I would guess because of the trademark: keeping a bunch of static HTML files around is not much cost otherwise, and certainly gathers some attention like this post on HN indicates.

“The site owner” is Warner Brothers; they likely have a department responsible for keeping movie-related websites up - and might well do it in-house, after the AOL acquisition. A site like this is basically free to host: a domain registration for 20 or 30 years will attract a massive discount, space on disk is probably less than 50MB, used bandwidth is minuscule... after you set up log rotation by file size and automated domain renewal, you can basically forget it exists.

Each page has a funny <!-- comment -->

Unrelated but, one old page I miss is www.whatisthematrix.com

It just redirects to www.warnerbros.com/movies/matrix/ now :(


Don't they have a new Matrix movie coming out next year? It would be a good occasion to bring back the old one.

I made a few close friends there in the chat, where other hacker wannabes and philosophy neophytes would gather. The chat forum had fun weird bugs that my friends and I would play with in order to edit past posts, or obliterate each other's posts. It was wonderful little corner of the web for a short while.

That was just one part of that great site. In 1999, the Internet still felt new and full of potential. I loved all the concept art posted there, the trailers, and finding easter eggs.

Years later I recreated the full chat for my friends, including the bugs.


The website moved a lot. See the Web Archive: https://web.archive.org/web/20041015000000*/spacejam.com/

Some years it redirects to WB's website, sometimes to an archive website, etc.

It seems that the original was not accessible between 2000 and 2018.


It is hardly surprising that they'd reactivate it. A new Space Jam movie[1] is coming out next year. I know this somehow and am not proud of it.

https://www.imdb.com/title/tt3554046/?ref_=nv_sr_srsg_0


> I know this somehow and am not proud of it.

Anyone who has even casually followed the NBA over the last two years probably knows this too, so you've got plenty of company.


Interesting. I would have sworn that I remember “the Space Jam website is still up” being a common fun fact on the internet when I was in college 10 years ago.

In fact, here’s a reddit post about this fun fact from 2010:

https://www.reddit.com/r/todayilearned/comments/esxwd/til_th...


The original URL, the highly memorable http://www2.warnerbros.com/spacejam/movie/jam.htm shows as pretty consistently available on the Internet Archive since 2003.

Needs moar stars!

It’s such a nostalgic feeling of the earlier web, back when interest groups, universities, fan pages, and web-rings ruled the web - before it became commercialized by greedy folks who threw ads all over the place, tracked everything you did, and spammed the hell out of your inbox.

I miss the good ol’ days, when the web was used for what it was intended for.

One of my first projects was maintaining the site for "Looney Tunes Teaches the Internet."

If you look hard enough it’s still out there.


The best thing about the early web was that nobody knew what it was for. So people just did things, without considering if it was "right."

Nowadays, you'd never get Bob's Labrador Page. Because "Hi! I'm Bob. I live in Lebanon, Kansas. I like Labrador dogs. Here are some pictures of my favorite Labradors!"


Those pages still exist, except instead of being hosted on geocities.com/area51/bobslabs it's on instagram.com/bobslabs

Not that much of a difference really IMO


The difference is that the only way you're allowed to be creative is in the ways defined by the corporation that hosts your content.

Some would call that emancipation: you’re freed to create even if you were never going to learn how to publish your own web page.

I don't really find this argument convincing because it was plenty easy to publish your own webpage. I did it at 9/10 years old and I don't want to believe that your average adult has less capability than a child.

I think we're doing a disservice by infantilizing people.


> disservice by infantilizing people.

People around the world are more educated than ever in human history [0][1], including the ability to code.

Possibilities are still out there; it's not like people are forbidden from building their own web stack from scratch - you can still buy a VPS, bare metal, or a Raspberry Pi with a static IP or DynDNS - and just code whatever you want.

Of course, the Internet is not what it was in 1996; doing trivial things like publishing cat/dog videos and photos is easy - as it should be. The amount of content is enormous, and one can find amazing, incredible, brilliant things - maybe not necessarily at the top of the FB/IG feed, but it is still out there.

[0] https://ourworldindata.org/literacy#historical-change-in-lit... [1] https://ourworldindata.org/global-education


I am no more infantilizing people than pointing out that most people can’t change the oil in their car. It’s a speciality where most pay a service fee to get it done for them. And they go about their lives just fine.

Publishing a webpage with 1990s tooling is roughly in the same category of complexity.

It is far easier to pay Wix, or even better, to use one of the myriad photo sharing sites like Instagram, Flickr, or Facebook. Which is why they’re so successful, and why the internet is a far more widely used and useful platform today than it was 20 years ago. It’s a disservice and carries no virtue to insist on unnecessary complexity for those that really couldn’t care less about computers or how networks work.


To my knowledge, the typical ISP of the 1990s provided free web hosting. (At least that was the case in my neck of the woods.) It may not have been much, but it was enough to put up a personal website that was not plastered with advertising.

The necessity to write your own HTML (or use tools like Dreamweaver and Frontpage) and the anything goes design mentality may have resulted in some atrocious sites, but it also made the web feel more personal. While there may be some ability to tweak the design while using a CMS, it is much more constrained and sites feel much less personal.


Yep. I took a high school computer science class back in 1995. My end-of-year project was a website about my MUDding adventures, and I put it up on the free web hosting from our ISP. I wish I still had it. I remember thinking it was pretty terrible even back then.

Quite different. Old homepages were more "building", less "sharing". Beyond the coding, there was planning and categorising. You put thought into the interface and structure.

Seems like a small thing, but it's the difference between being a hobby mechanic and just owning a car. Or buying a desktop vs building one. You end up with the same thing, but it "feels" like a very different endeavour.


This is rose-colored glasses IMO. The vast majority of pages had no standardization, just ... <img> <br> <br> <img> ... placed images.

If you want all that hobby mechanic stuff you can do all the same now with firebase or pages or whatever just like you were with frontpage or dreamweaver back then.


That's true, but that small amount of effort is still about 1000x more than is required to use Instagram.

And it was really the discovery of such web pages back then that was the thrill. It really did feel like exploring an alien planet or following a treasure map of link exchanges. Each click was an investment of a couple minutes at the rate pages loaded, so you really couldn't explore every link. And browsers didn't have tabs -- you were looking at one page at a time and maybe bookmarking it for later.


The editorial and stylistic independence is what I miss.

Absolutely: there is more stuff on the internet than there was then.

But! How much of that stuff is creatively controlled by actual end users? I'd say < 10%.

The large platforms are right out - restyling Facebook?! The build-a-site platforms all look somewhat similar because form follows tooling defaults. And because of the professionalization of web technologies, laypeople are locked out from just making their own page (or at least don't believe they can).


I think that’s a beautiful way of describing the differences.

These days the web is all Ikea flat pack. It does its job, and in many cases it works really well for the price. But the individuality has gone, since people aren’t just hacking together something based on their own tastes and limited carpentry/web development skills.


While true, it also puts up a hurdle - most people wouldn't bother learning how to build a website because of how difficult it looks.

But now putting pictures of your labrador on the internet is accessible to everyone. In practice, there's thousands of times more labrador pictures on the internet now. However, it's lowered the value and uniqueness of said labrador pictures.


Nah, it feels very different because Instagram et al. have gamified the whole thing. Bob's labrador page is its own space, separate in a way from the rest of the internet. Bobslabs on Instagram is implicitly competing with celebrities and 'influencers' whether Bob likes it or not, and that changes the feel.

It feels different because of the nostalgia filter [1] and how narrow we were as kids/teens building that stuff.

I totally get where you're coming from - and I agree that it is different in many ways, but in the important ways it was the same, IMO

[1] https://tvtropes.org/pmwiki/pmwiki.php/Main/NostalgiaFilter


Nostalgia factor is very real, but I don't think it captures just how novel the internet was. Communicating en masse across the world had never happened in human history. And it made you feel like an explorer of an alien planet, at least until one too many "under construction" pages of the night.

So yes, the Space Jam site itself was less about wowing, but gave a feeling of interacting with its creators on a more intimate level than other movie marketing. They were using the same tools that any one of us could use ourselves, unlike the millions spent on the movie. The Space Jam site looked much like dozens or hundreds of others from hobby coders or engineers in their free time.

And for me it's more melancholy than fun now, because it reminds me of that feeling of unbounded optimism that the early internet had.


It was very different. It was different in construction, discoverability, intent, and consumption. This isn’t nostalgia. I’m not particularly nostalgic about that time for other reasons and I was neither a kid nor a teen.

Nothing I do on Instagram can possibly make it as personal as my personal sites were. That’s not how I interact with Instagram at all and it couldn’t be even if I tried really hard. And even if I managed it, it’s not how it is offered by Instagram and not how it would be consumed.

That said, I don’t think that web is dead. It’s just a lot less discoverable and there’s a lot more noise. One of my favorite “old web” sites: https://www.fieggen.com/shoelace/ I’m not even sure it’s actually old. It just is more like the old web.

Notice the first comment (towards the bottom of the page): “Low on modern-web-BS...” There’s a qualitative difference.


The shoelace site is wild, but it got me thinking. Detractors might say that Wikipedia would fill the void for this type of information.

...but I don't think that's true. Ian's shoelace site information would instead be edited ad nauseam by a consortium of shoelace enthusiasts. It doesn't allow for personal opinion, or in some cases specific things that aren't well known and can't have their history sourced properly (citation needed?).


I mean, I get it. I still call it nostalgia because, as you point out, it's still completely possible; it's just a smaller overall percentage of what is on the web. I run a niche site that gets ~20k MAU:

http://airforcefitnesscalculator.com/

I consider it the ultimate in no-modern-web-BS. The only "modern" thing I use is GA, and even then, honestly, I'm looking at replacing it with one of those 90s counters.


I was surprised to find a great example of a site like this recently, "How to Care for Jumping Spiders": https://kozmicdreams.com/spidercare.htm , which is part of someone's broader personal home page with random bits of art, photography, and a guestbook(!). The geocities-esque design bowled me over with nostalgia... the header is an image map!!

Though it is somewhat ironic that this post is prompted by an advertisement for a Warner Bros movie.

Funny you mention tracking: one of the few modifications made to the Space Jam website at some point in the past 24 years was the addition of both Adobe and Google analytics.

I wonder how it will compare to: Space Jam: A New Legacy (2021)

Website and Movie :)


Oh no! The mime types for the desktop “backboards” aren’t set right! My phone can’t render the Windows-compatible files or the Mac files!

https://www.spacejam.com/cmp/souvenirs/patternsframes.html


The website may be the same, but at least they've "updated their privacy policy" in the lower left!

The omniture javascript code is copyrighted 2008 as well.

<!-- Badda Bing, Badda Boom --> from page source

I try to leave little notes and jokes in HTML source because, from when I was growing up playing with computers until now, I've always looked at the source just to see if someone was expecting me. It's not very common now, unfortunately.

Neat! OTOH, Peter Suber's Nomic page must win the contest for "biggest dead-to-live link ratio":

https://legacy.earlham.edu/~peters/nomic.htm


Aleksandar Totic from the original Mosaic team still has his website up.

http://totic.org/nscp/index.html

Personally I enjoyed this bit:

http://totic.org/nscp/swirl/swirl.html

If Aleksandar reads hacker news I hope he never takes that down.


"So far, I haven't received any swirl pictures from the outside world. I find this hard to believe that we are the only ones enjoying this activity."

Hard to believe, isn't it...


We called them "swirlies" in middle school/high school. But I've never actually seen someone get one, and it could well be mostly apocryphal. And it wasn't something you sought out, it was like, you were getting bullied.

The Oasis closed a couple of years back :(

I liked this quote from the page: "Tim Berners-Lee on home page: Q. The idea of the "home page" evolved in a different direction.

A. Yes. With all respect, the personal home page is not a private expression; it's a public billboard that people work on to say what they're interested in. That's not as interesting to me as people using it in their private lives. It's exhibitionism, if you like. Or self-expression. It's openness, and it's great in a way, it's people letting the community into their homes. But it's not really their home. They may call it a home page, but it's more like the gnome in somebody's front yard than the home itself. People don't have the tools for using the Web for their homes, or for organizing their private lives; they don't really put their scrapbooks on the Web. They don't have family Webs. There are many distributed families nowadays, especially in the high-tech fields, so it would be quite reasonable to do that, yet I don't know of any. One reason is that most people don't have the ability to publish with restricted access."

Basically, he was describing the concept of social networks before they existed on the web.


Except it's not really private; some random big company sees everything you do inside your house.

Oh wait TVs do that now. I guess the real world evolved to be more like social networks...


> Unfortunately, this only works on a Macintosh running Netscape [1]

This is the most specific "best viewed with..." message I have seen.

[1] https://www.spacejam.com/cmp/souvenirs/iconsframes.html


It's even crazier than that. That page suggests using ResEdit to modify the Netscape application to feature their spinning basketball icon rather than the standard "N".

(ResEdit was Apple's editor for data in the resource fork of HFS files, which classic Mac apps used to store their assets. Mac OS X abandoned this interesting but unusual approach in favor of the NeXT way, ".app" directory hierarchies.)


I was surprised by that too; it would be kind of like the new Trolls movie advising kids to make tweaks in the Windows registry. Hope they don't make any mistakes :D

Well that really takes me back! It was pretty cool how classic Mac apps had most images, text strings, that sort of thing in the standardized resource fork structure. Which meant that you usually could alter a bunch of things about an app's appearance and sometimes behavior by editing those resources with a standard editor.

It’s not really a message about browser compatibility. They’re explaining how to change a specific icon on a Mac. Unsurprisingly, a tutorial written for a specific system will only work on that system.

I love it every time I see it linked

Almost all the links here, and all the interesting ones, are gone: https://www.spacejam.com/cmp/bball/nbaframes.html

The only remaining ones are ones I already knew about - nba.com and Yahoo! Sports.


1994 checking in!

http://www.lysator.liu.se/pinball/expo/

Is anyone from Linköping University reading this? I need to thank them for 26 years of free hosting. :-)


You should really get around to finishing those pages that link to http://www.lysator.liu.se/pinball/expo/unfinished.html any decade now.

Sounds like side projects for forgotten

We're still uploading photos from that QuickTake 100...

When I was in college I remember discovering the "Master Zap" website on this domain. It was a musician/software developer who made a few software odds and ends, one of those being Stomper, an analog-style drum synthesis program. I have great memories of spending hours trying to re-create each sound from a TR-808. Taught me a lot about synthesis. Also really got me writing code and learning C...

EDIT- ITS STILL THERE :D http://www.lysator.liu.se/~zap/


It's got a status page, too: https://twitter.com/spacejamcheck?lang=en

Great load speed!

Looking through the source, there are a number of commented-out links. Here is one:

https://www.spacejam.com/video/


My fan website for the Australian band The Baby Animals, from 1994, is still online.

http://southcom.com.au/~tim/


I love that 404 page

My mechanic’s website is a work of old-school art.

http://www.waspauto.com/


That meet the staff part was delightful.

I think funnily enough, one thing that made these old websites more interesting is how slow the web was back then.

In a way it was "animation"— I'd look at images more closely as they "scanned" into the page and notice details I don't think I would now. In a way the fact that all these pages load instantly now is a bit of a downer. Maybe because there's no anticipation any more, or maybe just because the page seems more static and unchanging.


I was webmaster for this site (and thousands of others at WB) back in 2001! I believe this was when we did the great www -> www2 migration, which was of course supposed to be temporary. In fact I think that was when we migrated from our own datacentre to AOL's but I could be getting the timing wrong.

Back then it was served from a Sun E4500 running Solaris (7?) and Netscape Enterprise Server. Netscape had been acquired by AOL which had also just bought Time Warner (that's why we moved to their datacentre) but somehow we couldn't make the internal accounting work and still had to buy server licenses.

Fun fact, unlike Apache, NES enabled the HTTP DELETE method out of the box and it had to be disabled in your config. We found that out the hard way when one of the sysadmins ran a vulnerability scanner which deleted all the websites. We were forbidden from running scans again by management.
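(A default-deny method guard in a modern Node handler might look like the sketch below; NES's actual config syntax was different and isn't reproduced here.)

  const http = require('http');

  // Default-deny: only safe methods are allowed; DELETE, PUT, etc. get a 405.
  const ALLOWED = new Set(['GET', 'HEAD']);

  http.createServer((req, res) => {
    if (!ALLOWED.has(req.method)) {
      res.writeHead(405, { Allow: [...ALLOWED].join(', ') });
      return res.end('Method Not Allowed');
    }
    res.writeHead(200, { 'Content-Type': 'text/plain' });
    res.end('ok');
  }).listen(8080);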

Another fun fact about NES - they were really pushing server-side JavaScript as the development language for the web (and mostly losing to mod_perl). Also back in 2001, but at a different place, I worked with the person who had just written a book on server-side JS for O'Reilly - he got his advance, but they didn't publish it because by the time he had finished it they considered it a "dead technology".

Our job was basically to maintain an enormous config file for the webserver which was 99% redirects because they would buy every conceivable domain name for a movie which would all redirect to the canonical one. Famously they couldn't get a hold of matrix.com and had to use whatisthematrix.com. Us sysadmins ran our own IRC server and "302" was shorthand for "let's go" - "302 to a meeting". "302" on its own was "lunchtime".

I still mention maintaining this site on my CV and LinkedIn - disappointingly I've never been asked about it in an interview. I suspect most of the people doing the interviewing these days are too young to remember it.


Wow, every paragraph has a great story!

This is a wonderful post. Thanks for your insight.

> the person who had just written a book on server side js for O'Reilly

I'd love to see this unpublished book, if possible!


It will be great!

Unfortunately I can't even remember the author's last name and LinkedIn isn't helping. Let me ask around.

It would sell like hot cakes now. An interesting artefact.


:) This made my day. Were you based in LA at the time? I’m out in LA now and love hearing stories like this from colleagues who’ve done engineering work in the entertainment industry.

Yes, Burbank.

Here's another one - Solaris had a 2GB file size limit (this was before ZFS). Which isn't as crazy as it sounds now - hard drives were 9GB at the time. So ordinarily this wasn't a problem, but when the first Harry Potter movie came out, harrypotter.com (which was being served off the same server as spacejam.com) was the most popular website on the internet, and the web server log would hit the 2GB limit every couple of hours, and we would have to frantically move it somewhere and restart the process.


And somewhere was /dev/null ?

Most likely the main Oracle server which had a disk array attached. These days it's easy to forget how tricky it could be to get basic things like this working. rsync was in its infancy and I doubt we were using it. As I recall most servers didn't have ssh installed; telnet was standard. Lots of tar piped over rsh to get files from A to B.

7-8 years ago I encountered a full-on rsh internal network in a fairly big datacenter. No rsync, just rcp. It turned out that the sysadmins were too lazy to set up ssh keys on all the servers, and their manager was skilled at deflecting issues.

Hello, I run a monthly magazine about Harry Potter, could I interview you about this? Any email address where I could write to you? Thanks!

Sure, username at termcap.net

Made my day too! Wow.

I like how it loads instantly! :-) I remember staring at this thing for about 2 mins watching it load over dial-up in a third world country, on a laptop dad somehow smuggled into the country from Dubai without paying import duties. Yeah.


97 on mobile and 99 on desktop. [0]

For a site that never had pagespeed as a tool, that is super impressive and just goes to show how bloated today's websites really are.

[0] https://developers.google.com/speed/pagespeed/insights/?url=...


I like to think you can still make a fast site by sticking to the basics; write plain static HTML and CSS, avoid JS, craft your images specifically for the site, and put it behind a decent webserver with compression enabled. I'm sure Pagespeed rewards HTTPS and HTTP/2 support as well but that's webserver config.

Or make it even faster by not using CSS at all! (spacejam.com predates CSS1)

Even gzipped jquery is already larger than the entire payload of the homepage, and jquery is already considered an old technology.

Server side JavaScript? That’ll never catch.

I love these examples of historical calls made too early...


But node js is alive now.

He was being sarcastic :)

Whooooosh

Read past sentence 1...

Having done it NES-style, I'm kind of glad that server-side JS didn't catch.

Node is an unlikely event. It hit the sweet spot precisely when async started being needed, there was a runtime environment that made it viable (V8, thanks to massive investment by Google in Chrome), and a language that made it kind of natural.


is there consensus on whether Node did or did not "catch"? like, sure, it was a launch language of AWS Lambda and Netflix uses it heavily. but every backend person I know uses Go or Python or Ruby, anything other than JS. pretty much only fullstack JS people consistently pick nodejs as first choice - but that's just my POV. I'd love some more definitive numbers.

Its components are part of every developer's toolbox (a lot of really essential command line tools used to be installed with npm). I don't see many backends powered by it, but I'm mostly a Python developer and that certainly biases my samples.

Node as a tool definitely did catch on: automated testing, package management, REPL integration, bundling etc.

Node as a server/backend language: I would say the ecosystem and usage are at least as large as Go/Ruby. This is hard to gauge, but I assume/expect a factor of 2 or more. If you look for web-specific libraries, it is unlikely you'd find something for either Go/Ruby but not for Node. Python is harder to compare because it is used much more broadly.

For me the biggest use-cases for Node are: server-side/static rendering (get all the power/expression of frontend libraries on your http/build server), web-socket ease of use and the fact that there are tons of programmers who are at least familiar with the language.

And even though it is steadily declining, the most popular web-backend language is still PHP by a long shot. And this won't change until the other languages get managed hosting that is as cheap and simple and similarly ubiquitous.


It's absolutely catching on with younger developers who don't want to bother using multiple languages. Anecdotally, like 100% of developers in their 20s at my company push hard for JS.

Paypal, Linkedin, Netflix, Uber

Haha

But I used a few servers that allowed for JS scripting way before Node.

I guess, most of them were proprietary, so it never caught on until Node.


At my first job out of college in 2008 we wrote JS for a desktop app on top of Mozilla's xulrunner platform. At the time I hadn't heard of using JS outside the browser and Node was still a few years off. It was a great experience but Xulrunner got killed by Mozilla and the company had to rewrite in C#.

Just an example of (sort of) non-proprietary, non-browser JS.


It used to be super slow. Like, you cannot imagine how slow it was.

The usual example I trot out is from when I was writing a client-side pivot table creator in the mid-2000s: as far as I can remember, with just 100 items the JS version took 20-30 seconds. I then tried it using XML/XSLT instead[1] and it was instant.

I haven't checked recently, but even a few years ago JavaScript was extremely slow with large datasets. I was mucking around with Mandelbrot generators, and JS compared to C# was like a bicycle vs a jet engine; they weren't even vaguely in the same league performance-wise. I just had a quick look at some JS-based versions; looks like it's got a bit faster, but still slow.

[1] Awesome performance, super hard for others to understand the code. XSLT was great in some ways, but the learning curve was high.
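For anyone who wants to reproduce that kind of comparison, the naive escape-time inner loop involved looks roughly like this (a sketch; the image size and iteration count are arbitrary):

  // Naive escape-time Mandelbrot: the sort of numeric inner loop being compared.
  function mandelbrot(width, height, maxIter) {
    let inside = 0;
    for (let py = 0; py < height; py++) {
      for (let px = 0; px < width; px++) {
        const cx = (px / width) * 3.5 - 2.5;  // real axis: -2.5 .. 1
        const cy = (py / height) * 2 - 1;     // imaginary axis: -1 .. 1
        let x = 0, y = 0, i = 0;
        while (x * x + y * y <= 4 && i < maxIter) {
          const xt = x * x - y * y + cx;
          y = 2 * x * y + cy;
          x = xt;
          i++;
        }
        if (i === maxIter) inside++;          // point never escaped: in the set
      }
    }
    return inside;
  }

  const t0 = Date.now();
  mandelbrot(800, 600, 1000);
  console.log(`took ${Date.now() - t0} ms`);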


In principle, Javascript on a modern JIT can approach roughly 40-80% of the performance of C++, more or less across the board, not just in some toy benchmark. This benchmark is from 2014:

https://blogs.unity3d.com/2014/10/07/benchmarking-unity-perf...

Granted, this benchmark is for generated Javascript, not "idiomatic Javascript" (if there is such a thing). Of course you can write arbitrarily slow Javascript as well. There's a lot of stuff that can throw a wrench into the JIT.


That seems an optimistic reading of those graphs. As far as I can tell, a lot of them are 20-30% the speed of native, not 40-60%.

For the Firefox graph, none of them are below 35%, only three are below 40%. The average is well above 60%.

That said, there's a lot going on there. Javascript cannot use SIMD or multiple threads. Some tests are heavier on C#, which is converted to C++ in the Javascript version and thus becomes "faster than native C#".

The point is not to give an exact number on how fast Javascript is compared to any other language; there's wide variance across usage and implementations. The point is to show that Javascript rivaling C# performance is feasible.


>> Server side JavaScript? That’ll never catch.

Server side javascript didn't catch on the first time around because you couldn't create threads. Because in the early 2000s, all technical interviews asked about Java-style multi-threading. At some companies, the first round of technical interviews was a very, very detailed discussion about the Java threading model. If you didn't pass, you didn't get to round two. So everybody wanted to use threads.


I did my own "JavaScript Server Pages" using Rhino (JS for Java), which compiles JS to Java bytecode. It worked great for me. I did two sites using it, then changed jobs.

Thanks for sharing this -- I love stories like these

thanks for sharing mate, great to know what was going on behind the scenes during this time.

Thanks for sharing. Is this your comment in html code?

<!-- Badda Bing, Badda Boom -->


<!-- Google Tag Manager -->

1996 site.


> We were forbidden from running scans again by management.

A scan detects a severe vulnerability and their reaction is to never run scans again...


Well, the issue seems to be that the scan not only detected the vulnerability but realized its risk, which is exactly what scans are supposed to help you avoid.

It's hard to detect what HTTP DELETE does without, well...

Did they turn the scan loose on PROD without scanning a dev first? Granted, I wouldn't blame anyone for assuming a scan is read-only.

A what first?

What dev?

Real software engineers ssh directly into PROD to write code.


It was a different era. A lot of best practices we now take for granted as common sense were things we (and I say this as an old-time sysadmin myself) had to learn first... and often the hard way.

Plus the web wasn't as important to business strategy back then as it obviously is now. I doubt Warner Brothers would have been willing to invest in replica dev infrastructure when "developers can write code on their desktops". I know dev infrastructure is for more than just developing code, but common concepts we take for granted like IaC, CI/CD, config management etc. weren't formalised or widely used back then, and servers were pets that were held together with duct tape and sacred rituals.

In many ways, that's what made being a sysadmin in that era fun. There wasn't any shame in hacking together a solution.


"In many ways, that's what made being a sysadmin in that era fun. There wasn't any shame in hacking together a solution."

Hah, true, and management was absolutely amazed! In the late 90s/early 2000s I worked for an independent pharmacy chain. I wrote what was basically just a proxy sitting between our dispensing systems and the central clearing networks for Rx drug pricing. All it did was double-check the price on the prescription (our stores weren't applying price updates, which was a manual process at the time) and reject prescriptions that were priced wrong, with a message telling the pharmacist to apply their price update. The CEO invented an annual award to give to me for the "work" hah


Back in 2000, it was the wild west. You actually did run tests on production, and editing production code was not as insane an idea as it is today.

I don't think anyone (at least no one I worked for at the time) ran staging or dev servers.

It was always stupid, yes. But back then we didn't have the tools and testing suites that we have today. CI/CD setups didn't exist. Git wasn't even built until 2005. The only version control solution at the time was SVN, which was released in 2001. But it was clunky and immature.

Back in 2000, launching a site update meant someone would log into the server via FTP, drag the files over, and try to "be careful" while they did it. Using passwords that were written on a sticky note, stuck to the CRT monitor's screen (password managers weren't a thing).

It is easy to forget how immature and primitive the world of web development was at this time.

Also, the internet was still so new that every C-level executive had built their career running businesses in the 70s, 80s, and 90s - back before you built websites or relied on them for any significant impact on your bottom line. So if you told an executive from that era that your website broke when you poked at it, their solution would be to stop poking at it. Security wasn't really a major concern like today, and having a website was still mostly a novelty in the eyes of most executives.


> Back in 2000, launching a site update meant someone would log into the server via FTP, drag the files over, and try to "be careful" while they did it.

Heck, I remember just compiling Java classes on the server machine itself, copying them to a production directory, and restarting the app server (tomcat IIRC). Source control was the sysadmin running a nightly backup of source directories


> The only version control solution was SVN at the time

Pre SVN there was CVS, and before that was RCS. Now these didn't work the way we think about git today, but they did allow you to roll back to a known good state with some futzing about.


Website is still alive and kicking more than 20 years later; guess it worked.

/s


Bit like testing for COVID-19: if you don't test for it, how can you have it?

no politics, but this is like the "if you stop testing numbers go down" lol

They might give you a call now, as the site just died from traffic.

Which is weird - this site comes up on reddit's front page often and it has never gone down. Is HN higher traffic??

Obviously not.

Posts like this make HN amazing. Thanks.

> they were really pushing server side Javascript as the development language for the web (and mostly losing to mod_perl).

Enterprise server-side JavaScript was the first stage of dynamic web servers that connected to relational databases. Netscape LiveWire, Sybase PowerDynamo, and Microsoft Active Server Pages (with interpreted LiveScript or VBScript) were early. The enterprise software industry switched to three tiered app server architectures with JVM/.net bytecode runtimes. Single-process, multi-threaded app/web servers were a novelty and none of the client drivers for the popular relational databases were thread safe initially.

It took some time for RESTful architectures to shake-out.


Apache (and mod_perl) was thread safe by being multi-process, single threaded. You were always limited by how many perl interpreters could fit in RAM simultaneously. Then came the era of the Java app server (Weblogic and WebSphere).

Everyone has mostly forgotten AOLserver (and Tcl!). Almost everyone of that generation was influenced to use an RDBMS as a backend by Philip and Alex's Guide to Web Publishing, which came out in 1998.[0] Although I never actually met anyone who used AOLserver itself! Everyone took the idea and implemented it in ASP or perl, until the world succumbed to EJBs.

[0] https://philip.greenspun.com/panda/


AOLserver was a joy. I was able to speak about it at the O'Reilly OSCON in 2000 or so.

I used AOLserver with OpenACS, https://openacs.org/ . AOLserver was apparently more optimized than Apache (at the time, at least).

My manager even had us (early 2000s) take a one-week(?) bootcamp at ArsDigita HQ in Boston. Though about the only thing I remember from it was the fancy Aeron chairs.


That was also the era when DBMS locking was iffy, so competing inserts/updates would happen once sites got too busy, and everyone was encouraged to turn off transactions (or, on MySQL, just use MyISAM instead of InnoDB) and put it on XFS or even tmpfs for speed.

So many websites I remember came back up with "the last two weeks of posts are gone, sorry" or just shut down for good because it was all too corrupted to fix, and so were the backups, if they had any.


Funny you mentioned matrix.com; back between Matrix 1 and Matrix Reloaded I knew someone who knew the guy who owned thematrix.com and apparently was not offered enough to sell it. I guess they must have agreed at some point because I also recall that the matrix.com at around the time of Reloaded then started to redirect to the canonical Matrix site (which it sort of does now). Wonder what the price turned out to be.

> matrix.com at around the time of Reloaded then started to redirect to the canonical Matrix site (which it sort of does now)

Actually matrix.com now appears to be a site relating to hairstyling and haircare products.


I think they meant thematrix.com which forwards to a WB site; maybe autocorrect added the space.

Yes, that is the case. Autocorrect strikes again.

"We were forbidden from running scans again by management"

Seems like management believes it is better to wait for a real bad actor to purposefully destroy your site than to have it done by your honest employees by accident.

I know security in 2001 was much more lax (I started in 2000), but this still shows ignorance on management's part.

The right way to handle this would be to ask your staff to ensure it is possible to restore services and to ensure you know what the tests are doing before you run them.


>Seems like management believes it is better to wait for real bad actor to purposefully destroy your site than have it done by your honest employees by accident.

From a politics standpoint that is completely true. Which would you rather tell your boss:

Q: Why is the website offline?

A: One of our sys admins accidentally deleted it.

OR

Q: Why is the website offline?

A: Some nation-state/teenager launched a sophisticated cyber attack, we need to increase the cybersecurity budget. It's the wildwest out there!


There is a saying that only a person who does absolutely nothing never makes any mistakes.

Mistakes are a normal part of life at a corporation. Sane managers understand that it is not possible for people to make no mistakes. Mistakes are part of the learning process.

Now, when somebody makes a mistake, what I am looking for is:

Does this person show good judgment? Were precautions taken by the person reasonable?

Does the mistake show a pattern of abnormality? Some people seem to attract failure; maybe there is some underlying cause?

Is the person learning from mistakes? Learning is expensive, if somebody made an expensive mistake I want as much learning as possible for the expense.

Is there some kind of external factor that made the mistake possible or more likely? Usually it is possible to improve the environment to reduce the number of mistakes.

As to preventing these guys from scanning ever again, that is a bad decision, because it is likely they would never make the same mistake again. What's done is done. The scan showed there are problems with the app; now we should want to know if there are more problems, but without risking the application's stability (too much).

---

-- Do you know what the Big Co. pays when they pay high salary for an experienced engineer?

-- They pay for all the mistakes he/she made at her previous place.


We have stopped all payroll payouts because one accountant mistyped a digit last month

Was the solution to bar accountants from typing?

I don't really agree with that decision, but maybe that phrasing is not 100% accurate. Maybe they just meant that they should run the scan in a local test environment and not in the production deployment. Obviously there's value in running it against live, but at least for this particular issue, they could probably have caught it in a testing environment.

You are probably thinking in today's terms. Back then it was unlikely there was a separate test environment. For the application... probably, but not for what is a static website and an HTTP server.

They may not have had a fully fledged staging environment that was an exact copy of production, but I'm sure they ran the HTTP server locally on their desktops to test new configuration. I was a high school hobbyist "developer" (maybe "power user" would be more appropriate) at the time, and I certainly tested my pages locally before I'd upload them to Geocities. 2000 is not the Bronze Age :P. People were already talking about TDD then, and the concept of testing things before shipping existed for much longer.

I think you're looking back at this from the modern perspective of cheap hardware and open source software. We didn't have Sun workstations as desktops - we were using FreeBSD on cheap Dells. Our production servers cost something like $200k each. And the http server cost money - we didn't have licenses to run copies locally. These days you can just run things in Docker and have a reproducible environment but it wasn't that easy back then.

I was a sysadmin at that time.

Usually, HTTP servers would be huge bloats of configuration running on huge machines. As a sysadmin, it was not typically feasible to replicate the configuration on the local machine.

With a static website, nobody would figure you could accidentally do something harmful to the website. The worst thing that could happen was that you made a configuration error, in which case you just rolled back the config.

Of course we were supposed to have backups in case the server failed. While the servers would typically use RAID arrays we knew it would not save against rm -fr so there was a backup but backups rarely could be restored instantly.


Interesting. I never thought it was so complicated to run the HTTP server. What was the bottleneck that prevented them from running it locally? The memory used by all the config?

The amount of hardcoded stuff that was specific to the machine. All the domain addresses, firewall rules, etc. There was no git or CI/CD to keep everything in sync (easily).

> I still mention maintaining this site on my CV and LinkedIn - disappointingly I've never been asked about it in an interview. I suspect most of the people doing the interviewing these days are too young to remember it.

This is astonishing to me. I check back to see if this site is still up once every year or two just to have a smile. If you were sitting across from me in an interview I am quite sure I'd lose all pretense of professionalism and ask you about nothing else for the hour.


>This is astonishing to me. I check back to see if this site is still up once every year or two just to have a smile.

The only reason I am sad about the death of flash is that it has all but killed my version of this: zombo.com.


wow, what a way to learn that zombo is dead. RIP zombocom

It isn't quite dead yet. It is just entirely Flash based so it likely won't make it to 2021. Safari has already blocked Flash completely while the other major browsers require you to manually enable it. Odds are those too will begin blocking Flash completely sometime in the next few months with Adobe set to end official support at the end of the year. So maybe add zombo.com to your Chrome whitelist and visit it one more time, because soon the infinite won't be possible.

https://html5zombo.com/ might assuage your grief a bit.

Thanks. This is like that old trope of a child's pet fish dying and the parents buying a nearly identical fish to swap in and pretend nothing happened.

> Another fun fact about NES - they were really pushing server side Javascript as the development language for the web

I started “web stuff” with Apache and Perl CGI, and I knew NES existed but never used or saw it myself. I had no idea “server side JavaScript” was a thing back then. That’s hilarious.


I entered around the time LAMP took over and the old-timers would always trash on server-side javascript.

> the old-timers would always trash on server-side javascript.

Plus ça change, plus c'est la même chose.


>the person who had just written a book on server side js for O'Reilly - he got his advance but they didn't publish it because by the time he had finished it they considered it a "dead technology".

Makes sense. This server-side Javascript thing will never take off. The Internet in general is really just a passing fad!


What machine is this being hosted on now? Is it the same Solaris machine?

Your comment made me remember netcraft.com which was the tool we always used to see which web server and OS was running a site. There was a running joke back in the day on slashdot about "BSD is dying" as Linux started to take over, based around Netcraft which used to publish a report showing which servers were most popular. I'm glad to see they haven't changed their logo in 20 years!

Their site report used to show an entire history so you could see every time they changed servers but it doesn't look like it does anymore. Now it's running in AWS so certainly not the same Solaris server. Although those E4500s were built like tanks so it could be plugged in somewhere...

https://sitereport.netcraft.com/?url=https://www.spacejam.co...


> There was a running joke back in the day on slashdot about "BSD is dying"

Lol. I used to read Slashdot when I was in college and that time the running joke was "Had Netcraft confirmed it?", right?

> Now it's running in AWS so certainly not the same Solaris server.

Hmm. Would they have maintained the site to run on modern software stack or would they have just used legacy emulator like this: https://aws.amazon.com/blogs/apn/re-hosting-sparc-alpha-or-o...


I used this site as teaching material for young adults, to illustrate the concept of a webpage that is not an app.

Also, Space Jam has "style", compared to Wikipedia, which is "pure HTML text".


I submitted this and thought, "this is neat." Didn't expect to return to it being on the front-page nor the former webmaster to show up. The internet is a fun place.

That background story is fascinating. I wonder how many full-circles server side JavaScript has made up until now.


> Fun fact, unlike Apache, NES enabled the HTTP DELETE method out of the box and it had to be disabled in your config. We found that out the hard way when one of the sysadmins ran a vulnerability scanner which deleted all the websites. We were forbidden from running scans again by management.

Oh man, the early days were so exciting. Like that time I told my boss not to use Alexa on our admin page out of general paranoia... and a few days later a bunch of content got deleted from our main page because it spidered a bunch of [delete] links. I learned my lesson: secured the admin site a lil better and upgraded to delete buttons. Boss kept on using Alexa on the admin page tho.


Thanks for the story! Also, I wish 'webmaster' was still a title used in the industry.

Oh, that turkey. The movie, not the web site.

I once went to an industry presentation where someone on that project described the workflow. The project got into a cycle where the animators would animate on first shift, rendering was on second shift, and printing to film was done on third shift. The next morning, the director, producer, and too many studio execs would look at the rushes from the overnight rendering. Changes would be ordered, and the cycle repeated.

The scene where the "talent" is being sucked out of players had problems with the "slime" effect. Production was stuck there for weeks as a new thing was tried each day. All the versions of this, of which there were far too many, were shown to us.

Way over budget. Cost about $80 million to make, which was huge in 1996. For comparison, Goldeneye (1995) cost $60 million.


Awful film. I was listening to the radio the other day and they had a 10-year-old on saying Toy Story was cringe and he preferred Space Jam. I've never been so appalled just listening to the radio.

What are you talking about? Space Jam is great fun!

If nothing else, the soundtrack is a really fun throwback. The Monstars track with Coolio, Busta Rhymes, Method Man et al. is super catchy.

Why are you bothered by a child’s opinion on children’s movies? You’ll likely never meet the kid, and one’s sense of taste at that age is hardly definitive.

For someone that young, visuals tend to take precedence over story. I, for one, am glad to see a younger generation appreciate 2D animation (which still looks acceptable in Space Jam) over 3D (which looks dated in Toy Story).


No hit counter :(

god damn monstars

I love that 1996 HTML solved deep linking. It works flawlessly: https://www.spacejam.com/cmp/lineup/quiz1a.html

Look, I just linked to a wrong answer in the middle of a quiz. It perfectly preserved all the state. (Fun quiz, too.)

This is mostly a tongue in cheek argument, but it has the benefit of being true.

Sadly the quiz seems broken at question 6. But you can even un-break the quiz by manually editing the URL to question 7: https://www.spacejam.com/cmp/lineup/quiz7.html

Imagine trying to do that with a React app. (And I say that as a fan of react apps.)

The ending of the quiz is hilarious, by the way.


Imagine trying to do that with a React app.

It'd work fine if the developer used the URL to maintain state with React Router's BrowserRouter or HashRouter.
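A rough sketch of that approach, assuming React Router v6; the route, component, and quiz data are all invented:

  // Quiz whose only state is the URL, so every question is deep-linkable,
  // like the 1996 original.
  import { BrowserRouter, Routes, Route, Link, useParams } from 'react-router-dom';

  const QUESTIONS = [
    'Which team do the Monstars play for?',
    'Who gets his talent stolen first?',
  ];

  function Question() {
    const { n } = useParams();      // question number comes straight from the path
    const i = Number(n) - 1;
    if (!QUESTIONS[i]) return <p>No such question - edit the URL to un-break it.</p>;
    return (
      <div>
        <p>{QUESTIONS[i]}</p>
        {/* "Next" is just a link; back/forward/bookmark/deep-link all work for free */}
        {i + 1 < QUESTIONS.length && <Link to={`/quiz/${i + 2}`}>Next</Link>}
      </div>
    );
  }

  export default function App() {
    return (
      <BrowserRouter>
        <Routes>
          <Route path="/quiz/:n" element={<Question />} />
        </Routes>
      </BrowserRouter>
    );
  }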


In other words, not by default.

No, but even back in 1996 you didn't have to use the URL to store the state of a user's game. They could have used a cookie (supported in browsers since around 1994), or a server-side session (I think CGI::Session was a thing back then). That was a technical decision someone made. Those sorts of choices when you're speccing a web app haven't changed.

These days it would be a bit unusual to use routing to store the game state admittedly but that's why it's good to have older devs like 43-year-old me on the team. I actually think about these sorts of details rather than using the defaults.

It's also worth noting that storing the state in the URL shouldn't be the default. Being able to see the state by reading the address bar would be a security issue for most apps.


The elegance of using a static URL is that you are not really keeping the state - you instantiate all possible states and let the user traverse them according to whatever state they happen to be in (and able to see).

There is no spoon.


All that's wrong with the modern web is in this answer.

Probably 3/4 of the people who post on HN can attribute their entire careers to "the modern web", so maybe there are some good things as well as bad things about it.

I believe some of the conceptual simplicity of the "olden web" is missing in the age of leftpad.

"My career exists because of it" isn't a great argument for its actual merit.

Nextjs wants to talk with you xd

I like how you can tell that the HTML was done by hand. There's a parallel here, something like: handmade furniture is to hand-written HTML as mass-produced furniture is to generated HTML. There's a quality to it.

But it’s on HTTPS now, so you can’t actually load it on an old computer

Via a proxy.

https://www.nytimes.com/2020/07/01/world/hong-kong-security-... Reading that in conjunction with this made me wonder: is a website safer, or at least easier to preserve, than social media? I don't doubt that a program can deep-delete social media messages; at least you can ship your website as an archive and it is still readable. Something Pirate Bay-like, but for individuals... it is so easy to be silenced on social media, compared with, well, Space Jam.

As a website ought to be: online forever!

https://www.spacejam.com/cmp/lineup/quiz2.html

Amusingly, exactly none of these answers are correct anymore (at least not until 2028, then again in 2031).


You know, back in high school when I was learning HTML + JavaScript, I was really looking forward to creating websites that took "longer" to load[1], because I had associated that with complexity (understandably), and I associated complexity with coding professionally.

Now that I _am_ coding professionally, I just wish websites would load simple as this, with interfaces as simple as this. None of that fancy image preloading, or disappearing/reappearing navbars, or those sidebars that scrolled independently from the main page content.

Then again, what memories are those which time will not sweeten, right?

[1] Caveat: with the dial-up connections then, all it took were enough images for a site to load slow. So I wanted mine to take "longer"!


There was an obsession for a while with advanced Flash websites, designing interfaces with extensive and complicated transitions and loading screens. The best example being "2advanced".

Artificial waits are still a thing all over the place, because people trust results more when they seem to take some work to produce:

https://www.fastcompany.com/3061519/the-ux-secret-that-will-...


Oh, sitemaps. Maybe something like this could be a way to give a summary of a website again. https://www.spacejam.com/cmp/sitemap.html

Fascinating. The web has changed so much and so quickly over 25 years.

Looking at https://www.spacejam.com/cmp/jump/linksframes.html

Self-evident how bad link rot can be! I think only one of the links still works. A few in there point to the old Yahoo Directory.


Of course it is still alive. It's one of those fixed points in space-time that all the bloody turtles and elephants are balanced on. We take that down, who knows where and when we will end up!?

What would the "PageRank" value be if this site linked to you? Such an "old, esteemed" site should have some "high-XP/Google-juice" value?

Seriously, modern websites should load as fast as this one

I love/hate the fact it's had a cookie banner added

https://www.spacejam.com/cmp/pressbox/credits.html

Love the shout outs to the people who made the site


It is down now.

Looks like it's received the HN hug of death

  The connection has timed out

  The server at www.spacejam.com is taking too long to respond.

Same here...

*was

National treasure

Would be a shame if someone added a..... JavaScript framework to it.

Did all the traffic from being on the front page of HN take it down?

I use this site to calibrate my team's automated visual diff regression testing stack.
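A minimal sketch of what such a calibration check could look like, assuming puppeteer, pngjs, and pixelmatch are installed; the baseline file name is invented:

  const puppeteer = require('puppeteer');
  const fs = require('fs');
  const { PNG } = require('pngjs');
  const pixelmatch = require('pixelmatch');

  (async () => {
    // Screenshot a page that (almost) never changes.
    const browser = await puppeteer.launch();
    const page = await browser.newPage();
    await page.setViewport({ width: 1024, height: 768 });
    await page.goto('https://www.spacejam.com/', { waitUntil: 'networkidle0' });
    await page.screenshot({ path: 'current.png' });
    await browser.close();

    // Compare against a stored baseline; near-zero diff pixels expected.
    const baseline = PNG.sync.read(fs.readFileSync('baseline.png'));
    const current = PNG.sync.read(fs.readFileSync('current.png'));
    const diff = new PNG({ width: baseline.width, height: baseline.height });
    const mismatched = pixelmatch(baseline.data, current.data, diff.data,
                                  baseline.width, baseline.height, { threshold: 0.1 });
    console.log(`${mismatched} pixels differ`);
  })();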

dark mode before it was cool.

Even the website cert has just been renewed (on Jun 12, 2020)

HTTPS is gaining traction :-)

I wonder if they will upgrade to HTTP/2


I miss the days when the web used to be this snappy and fast

I love this! "The jamminest two minutes of trailer time that ever hit a theater. It's 7.5 megs, it's Quicktime, and it's worth it. Click the graphic to download..."

Some day it'll get taken down and we'll all be sad.

Just curious: why is this particular website still being kept alive?

Nothing against it, though :D


The source code of each page contains a small little easter egg comment at the top, btw.
