
Maybe the answer is a rolling window of stability for OS APIs--something like 10 years (Windows 10 having Windows 95 compatibility mode is a bit absurd). On the other hand, if you have a large library of test software, maintaining API bridges might be doable, and for software more than 5 years old, performance on modern hardware shouldn't be a major concern.



I'm reading your thesis as "stable APIs are bad because it's hard to get rid of cruft, and if we just test enough, stability won't matter". As a net-consumer of APIs, I'm not sure I can get behind that.

Of course, an API can be expected to change rapidly early in its life. Such APIs are usually not considered ready for serious production use and breaking changes aren't really a big deal. Later on, people tend to expect more stability out of APIs and tend to stop using (or if that's not possible, post strongly-worded rants to HN about) APIs that remove features unexpectedly.

There are extreme cases, to be sure. Microsoft is famous for keeping new versions of Windows bug-for-bug compatible with the past. Apple, on the other hand, regularly deprecates and removes features that see little use. In light of Steve Yegge's recent post about liberal and conservative attitudes in software, I would call Microsoft's position on this issue very conservative and Apple's centrist.

I think what the world you're envisioning would actually look like is massive fragmentation, with people maintaining many more forks of libraries that have since removed some important functionality.

That's not to say the problem you describe doesn't exist or that there aren't solutions. An appropriate deprecation policy is certainly one part. Another component could be wider use of dynamic binding to help keep certain code out of APIs and in clients. A third could be more effort to keep APIs simple in the sense that Rich Hickey uses the word.
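The "dynamic binding" idea can be sketched as the library accepting behavior from the client instead of baking it into the API surface. A minimal illustration in Python, with hypothetical names:

```python
from typing import Callable, Iterable


def export_records(records: Iterable[dict],
                   formatter: Callable[[dict], str] = str) -> list[str]:
    """Hypothetical library function: rather than shipping a zoo of
    format_csv/format_json/... entry points that must be supported
    forever, the library takes a formatter callable from the client."""
    return [formatter(r) for r in records]


# The client owns the formatting policy, so the library's API surface
# stays small and has less to deprecate later.
rows = export_records([{"id": 1}, {"id": 2}],
                      formatter=lambda r: f"id={r['id']}")
```

The design trade-off is that the client takes on more responsibility, but the API has fewer entry points that can accumulate cruft.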


How stable are these APIs for even the next five years?

That wasn't the point I was trying to make. I was trying to point out that ideally you could just leave the old API there for old software to use and build a new API for new software to use. If it weren't for security bugs, you could leave the old API untouched. Sadly, security issues prevent this, and after a while it becomes uneconomical to supply even security fixes. This is why hardware that relies on software often has an operational life shorter than its physical one.
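A minimal sketch of that "leave the old API in place" approach, with hypothetical names: the frozen old entry point becomes a thin adapter over the new one, so only the new surface needs active maintenance (and the old one still picks up security fixes made underneath):

```python
import warnings


def fetch_user_v2(user_id: int, *, include_profile: bool = False) -> dict:
    """The new API, where active development happens."""
    user = {"id": user_id}
    if include_profile:
        user["profile"] = {}
    return user


def fetch_user(user_id: int) -> dict:
    """The old API, kept only so existing callers keep working.
    It is frozen: no new features, just a delegation to the new code."""
    warnings.warn("fetch_user is frozen; use fetch_user_v2",
                  DeprecationWarning, stacklevel=2)
    return fetch_user_v2(user_id)
```

Old callers keep working unchanged; new callers get the richer interface, and there is only one implementation to patch.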

Exactly. I wonder if in the future we will demand more API stability from our host platforms so we aren't rewriting the same framework on every API change. I feel like I'm always building on sinking sand, that my programs have half-lives shorter than radioactive elements. Say what we want about Microsoft, but they have always valued API stability.

Same, but it's still a one-year lag before you can rely on a new API. Probably two or three in practice if you support the old OSes for a bit. It could be zero lag.

There are benefits and drawbacks to maintaining an API vs. updating it. Switching every few months will alienate your developers, but making sure everything is 20 years backwards compatible will lead to strange errors (particularly because programmers will expect their idiosyncratic workarounds to continue to work).

I think the more important question is how the groups responsible will handle the transition. A well planned and executed transition will give many developers a chance to switch gradually and by the time the old API isn't supported the older versions of the program will likely only be run on the older versions of the API anyway.


Windows and OS X are well known for constantly deprecating APIs and replacing them with new ones. So clearly the best way to compete with them is to insist on dogmatic adherence to 30-to-40-year-old APIs that are known to be horrible matches for how modern software stacks (including competitor OSes) actually work.

This is reasonable advice early on, when your API has only a modest number of warts and you can still move rapidly. As time goes on, though, maintaining every API in a fully-backwards-compatible-forever environment accumulates enough cruft that you, as the API developer, can no longer build new things quickly (or, eventually, at all). At that point you can't keep your technology and feature set growing fast enough to stay relevant in the marketplace, because an ever-larger share of your investment goes into your historical support story.

At which point you eventually start losing your current customers too because they can do enough things with the new, fast-evolving API of your competitor that it's worth the cost of rebuilding.


Should mean a more long-term stable API.

Old stable APIs tend to get deprecated unless you're on Windows. Best to go where developer support and energy is -- with the new.

Unmaintained APIs are generally not a problem as long as the underlying platform doesn't change. Let's talk again in 18 months.

How does one build software that won't deteriorate over time while using an unstable API? I don't know. Just don't use unstable APIs if you don't have to.

The issue is that this is only of concern to application developers who don't want to use old APIs. Systems owners, managers, and users won't care if it gets the job done predictably and cost-effectively. And if the vendor is willing to keep that stack around and support it...

This is a false dilemma. Setting up backwards compatibility and an unsuccessful API as the only two possible outcomes doesn't make the opposition real. It's just an assertion.

Having a reasonable end of life on support never hurt Apple. There are lots of things that I can no longer run on the latest version of OS X, and this happens every time the OS upgrades. Except of course they don't use versioned APIs, so it is much less planned for and not really as easy for devs to deal with.

I'm not saying make a shitty API. I'm saying that if the people who depend on your product know that changes happen, they can prepare for them. If you release a new version of the API every 6 months, and only support each version for 18 months, you can make the API migrate slowly enough that no single version is a serious change.
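That 6-month release / 18-month support scheme can be sketched as a simple policy check. The names and dates here are hypothetical, and the month arithmetic ignores month-length edge cases (e.g., a release on the 31st):

```python
from dataclasses import dataclass
from datetime import date

SUPPORT_MONTHS = 18  # each API version is supported for 18 months


@dataclass
class ApiVersion:
    name: str
    released: date

    def end_of_support(self) -> date:
        # Add SUPPORT_MONTHS to the release month; a sketch that
        # assumes the release day exists in the target month.
        years, months = divmod(self.released.month - 1 + SUPPORT_MONTHS, 12)
        return date(self.released.year + years, months + 1,
                    self.released.day)

    def is_supported(self, today: date) -> bool:
        return today <= self.end_of_support()


v1 = ApiVersion("v1", date(2024, 1, 1))
print(v1.end_of_support())  # 2025-07-01
```

With releases every 6 months, three versions are in support at any time, so clients always have a year or more of overlap to migrate.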

In native code: anything compiled against old APIs can link to old versions of your library and just work the same way. In web APIs: the world changes faster than the 18-month window anyway, so whatever.

If people want to depend on your bugs, you told them not to, they can deal with the changes.


This is normal for Microsoft product-specific APIs. In my experience, they are about as stable as a drunk unicyclist.

Well, to be truthful, they last forever, but you never know which one you should be using, because there are three or four concurrent APIs that do the same thing, any of which may be deprecated at zero days' notice.


That's Stasis, not Stability. Microsoft hasn't introduced a popular new native API in a decade!

As long as it doesn't break API compatibility, why wait?

Web APIs go through a standardization process and have multiple independent implementations. Can't say that about any operating system API except POSIX. Web pages from 30 years ago still load just fine - that's plenty stable. Third-party dependencies are an issue for all apps, not just the web.

Mobile gate-keepers historically do a pretty poor job and take a 30% cut for the privilege of denying your bugfix update. Can't recall the last time I had a misbehaving browser tab kill my phone's battery or harvest my contacts without consent; can't say the same about mobile apps.


My issue with this is that it needs to be retrained on a regular basis to make sure the latest APIs are included. There needs to be a long-term assessment to understand its viability in a commercial setting. Otherwise we'll jump in, and after 6 months it will begin producing out-of-date suggestions for some edge cases. And then again, if you need to support an old API, how can you be sure it will produce correctly scoped results?
