
This sounds like the stuff I worked on 15 years ago in the oil and gas sector (in fairness, IE + applets _were_ the best choice at the time).



I started writing backend software for banks, oil and gas distribution and utilities around 20 years ago. I wouldn't be surprised if most of it was still running in some form.

I worked on something very similar for my company about 10 years ago. The lesson I learned is that it was much more robust to have a flow of data than a flow of code. I still think it's really cool, though.

i remember making this kind of tool for a major telecommunications firm. it was a good project for me as a junior.

Unrelated, but interesting nonetheless: the current software industry has a strikingly similar feel to the early Texas oil prospecting days.

Independents are suddenly able to take a little bit of risk, take on some money (then: banks, now: angels and VCs), and work by themselves or in small teams to produce something that lots of people need (then: oil, now: good software).

Some of the biggest fortunes in history were made back then (1900 - 1930s), just as some of the biggest fortunes are now being made in this industry.

The deeper I dig (no pun intended), the more alike these two stages of history look. I'm sure there are other eras that share these common threads.


Definitely. In spite of the insane hours I sometimes worked to keep that ship afloat, it was a really amazing and unique experience. I feel really lucky to have started out there. I left the company about 13 years ago and the product is still running and profitable. I did one rewrite from raw servlets to Struts while I was there. They've tried two rewrites since I left to get it onto more modern tech, but both failed, so a LOT of my code lives on. That's a really great feeling: knowing that something I created is still valuable and useful over a decade later.

My second job was working for a small WebObjects consultancy in 2008. They developed bespoke line-of-business applications for small companies. Having the complete stack - from the HTML templating through to the ORM - provided by one vendor and designed to work together made development faster than anything I've worked with since.

It's a shame the cost was too high for too long, and even when Apple made it free, they never open sourced it.


Oh wow, my first job out of school (~2000) was writing FoxPro to extend Pegasus Opera accounting software - certainly brings back some memories!

My last company hired a guy who claimed to be one of the top engineers at EE. "Crystallized ancient knowledge" is appropriate. Then again, the company I was at is still using classic ASP in production, so even that ancient knowledge can at times be a bit useful. I agree with icco; I've heard bad things from friends in town, but never checked it out myself.

I built software that did this back in 2001 for a scammy company called CyberRebate, during a low point of my consulting career. They imploded before they could take it to production.

Incidentally, one of my first jobs was debugging automation failures around PowerBuilder apps still running in the early 2000s.

It's amazing how long enterprise software lives.


Oh wow. Yeah, that sort of thing seems to have been so common in ERP software.

My first professional programming job was working on a bespoke ERP and industrial process control suite (written in PowerBuilder). The program had a huge number of sub programs (dynamically loaded modules, each an MDI window accessed using a “program name”, something like a SAP transaction code).

We had a number of background services that had to run; however, writing Windows services in PowerBuilder was anything but easy. And we were reluctant to use anything else - the whole benefit of using a 4GL was its well-integrated ORM and report-generation functionality.

So we’d implement our background services as regular modules (each with its little MDI window) within the main thick-client app. Clients would have a number of workstations dedicated to running a single one of these processes. Nothing was headless; each output its status and logs to the connected display. If the power, network or database ever dropped, each of these machines would have to be restarted and its allocated sub-program reopened.

For example, the despatch label printing program would monitor a database queue table for new rows, bring up the report associated with the specified despatch note, print it to the label printer, then delete the row.
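The pattern is simple enough to sketch. Here's a hypothetical Python rendition of that label-printing loop (the original was a PowerBuilder module; the table and column names here are invented):

    # Hypothetical sketch of the despatch-label module: poll a queue table,
    # print a label for each row, and delete the row only after printing.
    # Table/column names (print_queue, despatch_note_id) are invented.
    import sqlite3
    import time

    def print_label(despatch_note_id):
        # Stand-in for "bring up the report and send it to the label printer".
        print(f"printing label for despatch note {despatch_note_id}")

    def run_print_queue(db_path="erp.db", poll_seconds=5):
        conn = sqlite3.connect(db_path)
        while True:
            row = conn.execute(
                "SELECT id, despatch_note_id FROM print_queue ORDER BY id LIMIT 1"
            ).fetchone()
            if row is None:
                time.sleep(poll_seconds)  # queue empty; wait and re-poll
                continue
            row_id, note_id = row
            print_label(note_id)
            # Deleting after the print means a crash re-prints a label
            # rather than silently dropping one - the safer failure mode.
            conn.execute("DELETE FROM print_queue WHERE id = ?", (row_id,))
            conn.commit()

    if __name__ == "__main__":
        run_print_queue()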

It seems so hackish, but it worked incredibly well. Our clients were all food or paper manufacturers, running 24/7. Operations were rarely disrupted. Having a single screen per function to monitor for status changes was something operators were accustomed to.

This was over a decade ago, but I’ve never worked with a more productive team since. The constraints of the system let us focus on solving business problems. I can’t imagine writing anything of this scale in a modern environment. I’d love to see 4GLs like this make a comeback. The first-class GUI, ORM, and report generation were a huge productivity boost. And the simple programming language (with a very simple object model) put the focus on problem solving, not API acrobatics.

Simpler times.


i worked on a product that did something similar for telecoms. it had closed-loop automation and a graphical designer for the object model. that was 10 years ago.

looking today at all the manual work with playbooks/etc, it's astonishing. feels like things didn't move forward at all in the past decade


Places like Andersen Consulting and Perot Systems used the same approach back in the 80s and 90s.

Back when they were state-of-the-art, I used all the tools and techniques described in that paper.

I'm still a software engineer. Take my word for it: it's good.


I worked for 2 years on a project in the 90s where we wanted to adapt the Apple Newton for a particular market. We were closely connected to the devs at Apple, and it was a lot of fun to see how a technology develops and to shape something out of a wide-open space. I probably learned more during that time than in my other 20 years of work.

Sounds like modern day software development.

You have a lot of survivorship bias in your post here. Having lived, albeit briefly, in the software industry of the 90s, I saw a world full of massively over-budget, multi-million-dollar messes.

22 years ago, I was working on software to run a single state's fish and game licensing. We had 4 product managers (backend, admin, retail-front-end, consumer-web), 6-8 architects, at least 10 project managers, and probably 70+ engineers and 30+ QA engineers. Its specs were written by a separate group over 6 months, then thrown away, and the system was rebuilt using the Rational Unified Process. It was 10x over budget money-wise and 3x time-wise. It took around 3 years to release to the public.

Those four products:

The back-end team wrote pure Oracle stored procedures.

The retail-front-end team wrote 95% JSPs, with a small library of Java functions. When logic needed to split amongst the 4 types of licenses, they copy-pasted an entire 2000-line JSP, changed minor code, then just forwarded to the right JSP depending on the type.

The consumer-web code was written in Perl, which needed to call functions in the retail front-end, so the solution was to wrap the JSP code and call it with what we would now call mocked request/response objects. The Perl software called this mess using SOAP, so care had to be taken to strip down every interface to primitive types so Perl and Java could talk.
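For illustration, here's roughly what the calling side of a primitives-only SOAP interface like that looks like - a hypothetical sketch in Python with the zeep library (the real client was Perl, and the WSDL URL and operation name are invented):

    # Hypothetical sketch: calling a SOAP interface stripped down to
    # primitives, because complex Java objects wouldn't round-trip.
    # The WSDL URL and the IssueLicense operation are invented.
    from zeep import Client

    client = Client("http://licensing.example.com/retail?wsdl")

    # Everything is flattened to strings and ints - no shared object model.
    license_number = client.service.IssueLicense(
        license_type="FISHING_RESIDENT",
        customer_id=12345,
        season_year=2003,
    )
    print(license_number)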

The admin code, my team's, was written in our own Java MVC framework - Struts had only just been created. What started as a web project turned into 'recreate the mainframe interface using HTML', though.

So yeah, 30 years ago, we did things right, or perhaps you meant 50 years ago?


If it was 35 years ago, these were probably logistics or finance systems? Those were the first areas to adopt computers.

Nowadays I'd say it's 50/50 whether a new project there will last. More than half the work is pet projects that will never see real use, while the smaller remainder can live for many years. With experience, it's possible to get a pretty good idea from the start of which way a project is heading.


So, a concrete example from my recent past: a system designed to allow agents in the field to submit contracts, and have them crop up in an ancient legacy system.

This had a single purpose, but it was structured as several discrete apps. I'm going off memory here as I didn't work across all the apps, but it looked like this:

1. a single-page Javascript app (Angular, CoffeeScript, Nginx, static site)

2. a web service to accept the contracts (Ruby, Sinatra)

3. a web service to pre-process the contracts and queue them for insertion into the legacy system (Ruby, Grape, Amazon RDS)

4. a service to de-queue the contracts and insert them into the Windows-based legacy system (.NET, SQL Server, IIS)
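To give a sense of how small each piece was, here's a hypothetical Python/Flask sketch of the hand-off between components 2 and 3 (the originals were Ruby apps; the URL and field names are invented):

    # Hypothetical sketch of component 2: accept a contract over HTTPS and
    # hand it off to the pre-processing service. URL and fields invented;
    # the original was a Sinatra app, not Flask.
    import requests
    from flask import Flask, jsonify, request

    app = Flask(__name__)
    PREPROCESS_URL = "https://preprocess.internal.example.com/contracts"

    @app.route("/contracts", methods=["POST"])
    def accept_contract():
        contract = request.get_json(silent=True)
        if not contract or "agent_id" not in contract:
            return jsonify(error="agent_id is required"), 422
        try:
            # Hand off and stay tiny: HTTPS + UTF-8 JSON is the only
            # contract between services.
            resp = requests.post(PREPROCESS_URL, json=contract, timeout=10)
            resp.raise_for_status()
        except requests.RequestException:
            return jsonify(error="pre-processing service unavailable"), 502
        return jsonify(status="queued"), 202

    @app.route("/status")
    def status():
        # Each component exposed its status over plain HTTP, as noted below.
        return jsonify(service="contract-intake", ok=True)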

There were many, many advantages to this structure:

* each individual component[1] was trivial to understand - on the order of hundreds of lines of functional code

* we could choose the technology appropriate to the job

* we only had to use Windows for the bit that interfaced with the legacy system[2]

* the only technology choices that spanned the entire stack were HTTPS and UTF-8

* status of each individual component was available through HTTP requests; most made RSS feeds of their activity available for monitoring

* we could easily experiment with new technologies as they emerged, without changing or breaking All The Things

Some caveats:

* we had a highly skilled dev team, with experience across a wide range of technologies and platforms

* 'the business' understood what we were doing, and why - in fact, the whole purpose of the project was to transform our previously monolithic architecture in this way

* log collection (using Splunk) allowed us to track the progress of individual contracts through the system, and alerted us when one stalled (e.g. was being rejected due to formatting issues)

[1] Except for the last one, because of the ludicrous complexity introduced by interfacing with the legacy system. But all of it except for the legacy interop was easily grasped.

[2] Not knocking .NET or C# here; both are pretty good these days. But the Windows ecosystem is just not as developer-friendly as *NIX.

