I'm not so confident that's true. There's seemingly this prevailing attitude in IT that new is always better. The latest gadgets, the latest complicated tech stacks, and the latest languages. Rewrite everything again so it's all fresh and free of 'legacy cruft'. Don't fix bugs, prioritize slapping a new "modern" GUI on it so people don't think it's old. Why wouldn't that same mentality apply to people?
I think a lot of people regard "old technology" as inferior and ineffective, and assume that flashy GUI stuff is the only effective way of doing things. The truth is that a lot of older technology is brutally simple and efficient, and hits the 90% cutoff in usefulness versus newer tech that's complex for complexity's sake.
This is something I eternally argue with so many people. I can't understand this blind assumption that something made more recently is better just because.
Better is better. Something new may or may not be better; you need to evaluate it. Way too often, new means a regression in functionality, which is worse.
I take this stance. Almost always, the new software is worse in some way than the old: either it takes away some capability to make room for a new product you'd have to buy, or it breaks support for hardware you already own. It's my position that software was almost universally better 20 years ago.
Old and working well. Newer is not always better; it seems the software industry is increasingly having problems with stability precisely because it keeps introducing change that a lot of people don't actually want.
I found the repetitive use of "outdated" to be off-putting. It implies that much of the value of a new interface (including this one) lies in its novelty. Except for a minority, the precise opposite is true - most people aren't interested in learning how to use their computer all over again.
Not at all. You have to be very careful about the tools you use and how to use them. Picking the latest thing because it has new tech is a bad idea, just like believing unmaintained software will keep working is a bad idea.
Not accusing you of this, but the primary reason people run outdated software is a really problematic insistence, mostly by front-end people, on using the new-and-shiny instead of the tried-and-true. Breaking changes galore - so people stick with the past as much as possible.
Example: Office 365 OWA doesn't work well on modern browsers other than the latest version of Edge on Windows. But it does work fine on browsers that are older or pretend to be older! I'm technical enough to spoof my user agent, but Mom & Pop are just going to say "I don't like the new one, it broke stuff" and that will be that.
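(For the curious: in Firefox the spoof is just the general.useragent.override pref in about:config. At the HTTP level it's a single header. Here's a minimal Python sketch of the idea - the URL and UA string below are illustrative placeholders, not the exact ones I use:)

    import requests

    # Pretend to be an older browser by sending an old User-Agent string.
    # Any string the server sniffs for as "older" will do; this one is a
    # made-up Firefox 60 on Windows 7 example.
    headers = {
        "User-Agent": "Mozilla/5.0 (Windows NT 6.1; rv:60.0) "
                      "Gecko/20100101 Firefox/60.0"
    }

    resp = requests.get("https://outlook.office365.com/owa/", headers=headers)
    print(resp.status_code)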
> When a project proudly declares itself as "modern," it implies that it's up-to-date and built with the latest technologies.
> Would anyone suggest replacing TeX with a "modern" alternative simply because it's newer?
No, “modern” implies that the existing solutions have some shortcomings which are perceived to be due to old age or legacy/tech debt.
That’s why no one is proposing an alternative to TeX.
Whether that perception of “old, therefore bad” is valid or not is a different question. Chances are that the “modern” solution will end up reinventing the wheel and rediscovering why the old tools did things a particular way.
Other times, “modern” means the new tool cherry-picks the best parts of its predecessors and omits the bad parts.
It's not just you. Every new thing that's any good is usually a retake on something already thought about in the early days of computing. But some of the implementations back then were crude. With the reinvention there is usually a rethinking and a polishing.
The opposing opinion would be that changes to vital OS features have happened in all the operating systems. Users expect a new version of software to run on newer operating systems. The web has never deprecated anything close to vital. We're stuck, seemingly, with everything ever conceived.
My problem is that it's not really a newer and easier approach, it's a newer and slightly different approach.
Very much this: to those who would say, "oh, you're just invested in the old technology" I would say yes, I am; but I also posit that you are overblowing minor differences in a newer approach merely because it is newer. Even if it is significantly different, that doesn't make it better; not all change is progress. When I see something that gives me a significant advantage, whether it's old or new, I'll take it. But I'm also very discerning in what I invest my limited time and brainpower in, and throwing away previous experience, sunk though that cost may be, is foolish without good reason.
Yeah. The weird thing is that in other industries, people have no trouble admitting that the old stuff is often problematic and needs to be replaced. In the supposedly forward-looking tech industry, though, we stick with our tools from 1978 and stubbornly resist admitting that we have learned anything since then. It's strange.
Old is subjective (whether by creation date or last update), and even when old in all respects, it's not universally bad. Many would consider Linux/Unix, command-line interfaces, vim, emacs, Common Lisp, Haskell, lisp machines, etc. to be old/useless, because they lack pretty graphics or support for the newest hardware, or are simply not Windows or Mac.
But these old technologies have positive traits that are lost, and sometimes recovered, in the newer technologies. One example of older technology being recovered is the introduction of lambdas and higher-order functions into mainstream languages like C++ and Python; that's something "older" languages like Lisp and Haskell have had since their inception. I'm also given to understand that Windows servers needed a graphical interface because not everything was doable from the command-line interface, so setup had to be done manually instead of just writing a script containing what you'd otherwise have typed into a CLI. Now they've improved the situation by making a CLI in the form of PowerShell and making more of the system available through it.
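To make the lambda point concrete, here's roughly what that recovered feature looks like in Python (just an illustrative sketch; Lisp's equivalent, e.g. (mapcar (lambda (x) (* x x)) '(1 2 3)), dates back decades):

    # Higher-order functions: functions that take or return other functions.
    squares = list(map(lambda x: x * x, [1, 2, 3, 4]))  # [1, 4, 9, 16]

    def make_adder(n):
        # Returns a new function that adds n -- a closure.
        return lambda x: x + n

    add5 = make_adder(5)
    print(squares, add5(10))  # [1, 4, 9, 16] 15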
As for other advantages of older tech that are not mainstream in "newer" tools, there is for example the fact that lisp machines are completely coded in Lisp. Mind you, I haven't used a lisp machine myself, but I'm given to understand that they let you modify any part of anything running on the computer at any time, even while the program is running, with minimal "build" time. I've experienced this in emacs, but on a lisp machine it's everything. Right now, I'm experiencing a bug in Firefox where some checkboxes and radio buttons are not rendered; this happens even with all add-ons uninstalled. If I wanted to investigate this further, I'd have to download the Firefox source and read it in whatever multiple languages it's written in (as opposed to just Lisp). Let's imagine I find the bug and fix it. I'd have to build the whole of Firefox, not just the pieces I've changed. mozilla.org tells me I'd need "2G RAM with lots of available swap space" and, I imagine, lots of time. On a lisp machine, I'd just need to rebuild the functions I've changed.
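You can get a small taste of that workflow from any language with a REPL. A toy Python session, just to illustrate the idea (a lisp machine extends this to the entire system, not just your own code):

    # In a live session, redefining a function takes effect immediately --
    # no rebuild, no restart.
    def greet(name):
        return "Hello, " + name

    print(greet("world"))           # Hello, world

    # "Fix the bug" while the program is alive:
    def greet(name):
        return "Hi there, " + name  # the new definition replaces the old

    print(greet("world"))           # Hi there, world

For whole modules, importlib.reload() does something similar, which is about as close as stock Python gets to rebuilding only the pieces you've changed.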
Emacs also has support for live coding in multiple languages through add-ons like SLIME for Common Lisp, CIDER for Clojure, skewer for JavaScript, etc. I'm not sure whether things like that are available in "newer" IDEs.
Sorry, I ended up ranting. I just love to research "old" technologies, as I find that they have much to offer over their modern "equivalents".
EDIT: This post strays a little from the context of this thread, but consider, for example, live coding. I'm not familiar with the workflow of the typical modern game developer, but if modern tools don't offer something similar to live coding, they might benefit from it. If a particular piece of code is only reached under very specific conditions in the game, conditions that take time to set up, developers might benefit from being able to change or add code and load it while they're playing in that "deep context". I remember once seeing a video of someone developing an FPS in Common Lisp. They were shooting at a wall and checking how the shots impacted it. They would switch windows to emacs, edit a bit inside a function, hit a keybinding, and have the shots altered immediately in the running game, without having to restart anything. They would develop the game while playing it!
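If you've never seen that workflow, here's a stripped-down sketch of one way it can be wired up; gamelogic here is a hypothetical module containing, say, an on_shot(world) function that you edit in your editor while the loop runs:

    import importlib
    import os
    import time

    import gamelogic  # hypothetical module holding the game's behavior

    last_mtime = os.path.getmtime(gamelogic.__file__)
    world = {"hits": []}

    while True:                      # the "running game" loop
        # If gamelogic.py changed on disk, reload it: the next frame
        # runs the new code, no restart needed.
        mtime = os.path.getmtime(gamelogic.__file__)
        if mtime != last_mtime:
            importlib.reload(gamelogic)
            last_mtime = mtime

        gamelogic.on_shot(world)     # the behavior you can rewrite live
        time.sleep(1 / 60)           # ~60 ticks per second

The Lisp/SLIME version is nicer - you recompile a single function from the editor instead of watching files - but the effect is the same: you edit in the "deep context" instead of restarting into it.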
I have a similar but slightly different take on that.
Instead of "As I get older, I just don't care about new technology," it's more "As I get older, I'm more skeptical of new technology."
When I was a young programmer, I thought every new framework, language, and innovation was amazing. These days I look at things with a much more critical eye and apply a higher bar when asking "why does this exist?"
My general stance is that most things we are using are not ideal but are sufficient. However, most new things are not enough better to justify replacing what is technically sufficient.
Now I'm confused. I've heard that we should learn and appreciate that nothing is new under the sun, and that most interesting concepts and technologies were already invented in the past rather than by newer technologies. Now you're saying that newer technology consistently improves things? I'm getting confused as to how the older hive mind wants me to feel.
I suspect that some people complain about dated technology and want something fresh not because the dated tech is obsolete or crippled, but because they want to be an expert on the new tech, starting from a level playing field rather than having to catch up with people who have a 20+ year head start.