As I've gotten older (about a decade's worth of professional experience at this point), I've started feeling the same way. It's so tempting to build up these abstractions and pipelines and introduce new tools, but at the end of the day, what is it we're trying to achieve? It's often the case that the abstractions chosen were the wrong ones for the job, and staying at a lower "layer" of abstraction pays off more in the long term (lower layers move at a slower pace). If a project's lifetime is longer than two hype cycles, it may be worth considering digging down a layer and doing things there.
Whenever a subject like this comes up, I'm always reminded of Stewart Brand's "pace layering" [0].
OK, so in the beginning there were no layers. People occasionally wrote a layer, but it was common to just say "fuck it" and throw it away. As late as the '90s, you'd read about C programmers writing hash tables by hand all the time, wtf.
Then, somewhere along the way (I don't know exactly when), we hit an inflection point: some layers didn't work quite right but were hard to do without, so we'd try to shim them.
Really good long-term engineering means also ripping up the under-performing layers, attacking the unneeded complexity. This does not mean giving up on abstractions altogether.
Ah, the grand cycle of layering and integration. You build a machine that does a thing, modify it to do another, and another, and another, until eventually someone says:
We could have a layer that did all of this!
The layer is introduced, applications are rewritten to target the layer, and people slowly lose touch with what the world looks like beneath The Interface.
And some things that should not have been forgotten were lost. History became legend. Legend became myth.
As people target The Interface, it grows into a more and more general purpose machine. Layers of indirection build up. Once simple tasks must propagate up and down a big stack.
Until one day, someone stumbles across the layer beneath. Gosh, the underlying system does 99% of what we need, out of the box. Why do we need this layer at all? We can do most of what we need with a couple single purpose tools. And the rest of the complexity can be taken up by the application layer. I don't mind doing a little more configuration there if I can get a huge performance and complexity win.
And the developers, frustrated with how big and bloated their layers have been feeling, flock to this new simple tool, and they port their applications, with a little extra boilerplate and big complexity wins. And then they port another, and another. Until someone realizes:
We could have a layer that did all of this!
And it might seem pointless, when I write it up in this snarky way, but it absolutely is not. This is the process by which we discover the fundamental building blocks of software. Each time we add a layer, and each time we take one away, we learn something new about what information is. I love it.
The mental model of "pace layers"[1] is a really insightful one. I've found it extremely useful to think about problems from a systems perspective: e.g. what layer am I operating in, and what layers may I encounter in the future? What might I expect from this layer (e.g. dynamic and spontaneous vs. cautious and calculating), and how does that inform the decisions I should make?
I'd venture it's not so much the technique behind the individual layers but the understanding of the need for all the layers and their interactions and the best practices in given situations.
We're prone to tediously repeat the same conversations over and over and take the cosmetic approach rather than the fundamentals-first way of doing things.
What's happened is that the philosophy has been that more layers allow less experienced developers to write business systems. This situation, with layers upon layers of cruft, is the result of that.
Nah. It's always about 4 layers: hardware/services/data stores/etc., wrappers/models/components, business logic/application, and ops/deploy.
If you nest your models/components deep enough that they're forming new abstraction layers, you're nesting them too deep. Back up and use mixin-style stuff instead.
If your business logic or application layers are nesting much at all, then you haven't succeeded at making good choices in your models/components.
Just hide the nasty layers behind more facades or layers of abstraction. Please never reinvent the wheel, we need more layers, it's better software design to reuse the old stuff.
I just don’t buy it. We already work at a layer dozens if not more abstractions away from what’s really happening underneath the surface. This is essentially an argument that every layer below is good and justified, but that any additional layer is just too much.
The software that we build tends to gain layers. In theory, each layer adds new functionality that couldn't be done on a lower layer. The problem comes when someone unnecessarily implements lower functionality in an upper layer (like reading local files through a web server). It happens because systems are too complex now for any one dev to know every layer, and new hardware is fast enough that it doesn't matter that you're using an order of magnitude more resources than the task actually needs.
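To make that "reading local files through a web server" anti-pattern concrete, here's a hypothetical Python sketch (not from the original comment): the same bytes are fetched once at the file-system layer and once through a full local HTTP round trip, with sockets, request parsing, and a server thread doing work a single `open()` already does.

```python
# Hypothetical sketch of implementing lower-layer functionality at an
# upper layer: reading a local file directly vs. through a web server.
import http.server
import os
import tempfile
import threading
import urllib.request

# Create a sample local file.
tmpdir = tempfile.mkdtemp()
with open(os.path.join(tmpdir, "config.txt"), "w") as f:
    f.write("hello")

# Lower layer: a plain file read.
with open(os.path.join(tmpdir, "config.txt")) as f:
    direct = f.read()

# Upper layer: sockets, HTTP parsing, and a thread, to read the same file.
os.chdir(tmpdir)  # SimpleHTTPRequestHandler serves the current directory
server = http.server.ThreadingHTTPServer(
    ("127.0.0.1", 0), http.server.SimpleHTTPRequestHandler
)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]
with urllib.request.urlopen(f"http://127.0.0.1:{port}/config.txt") as resp:
    via_http = resp.read().decode()
server.shutdown()

assert direct == via_http == "hello"  # identical result, far more machinery
```

Both paths return the same string; the web-server path just threads it through several extra layers first.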
Yeah, I wonder if there's an "optimal target" for the number of layers up and down you'd ideally be aware of. It has to be at least your own, plus the ones immediately above and below, but I see innovation coming from people with unusually keen insight into layers further away: e.g. people making brilliant architectural decisions because they really, really know what the consumers of an API need and how those people think about the domain. Or vice versa, someone making something radically better or faster in a web app because they really get how the Linux kernel is implemented.
It seems like cases where that deep knowledge is an advantage are rare but also very high value. I wonder how the EV pans out, both for individuals and orgs.
One thing I often tell people is that if a particular technology makes it easier to work with code that's 10 layers deep, it will also make it more likely that people will write code 10 layers deep where they would previously do 5 layers.
[0] https://imgur.com/V5oL5WZ