The most active user and development community for A2 seems to be in Russia. There's a Telegram group too but I don't speak the language so I can't follow very much of it at all.
I remember trying out an Oberon-07 compiler some years back. The speed at which it worked was impressive: it was able to build itself and the standard library modules in a small fraction of a second.
Probably a controversial opinion, but I absolutely wish this were more common. It's a thing in F# as well. With JS/TS projects I always make sure the ESLint rule that only lets you use what's been declared before (no-use-before-define, if I recall the name) is enabled.
I don't want to constantly scroll up and down a file chasing references: something at the top references something at the bottom, which references something near the top, which references something in the middle, which references something at the bottom again... I despise working in that manner.
That's not why Pascal compiles quickly. Resolving undefined symbols isn't difficult or slow; it just means that you have to keep track of what's resolved and what's unresolved, and that takes up memory, which was very precious back in the old days of Pascal. Pascal is designed for very fast single-pass compilation, but symbol resolution is only a small part of it.
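To make the bookkeeping difference concrete, here's a toy sketch in Go (purely illustrative, not how any real Pascal compiler is written): with declare-before-use, every lookup succeeds on the spot and nothing needs to be remembered; once forward references are allowed, unresolved uses accumulate in a fixup list that occupies memory until the declaration finally shows up.

    package main

    import "fmt"

    // Resolver is a toy single-pass symbol resolver. Declare-before-use
    // means Use never has to remember anything; forward references mean
    // pending uses pile up in a fixup list until the symbol appears.
    type Resolver struct {
        declared map[string]bool
        fixups   map[string][]int // symbol -> source lines still waiting on it
    }

    func NewResolver() *Resolver {
        return &Resolver{declared: map[string]bool{}, fixups: map[string][]int{}}
    }

    func (r *Resolver) Declare(name string) {
        r.declared[name] = true
        // Patch all pending uses and free the bookkeeping memory.
        delete(r.fixups, name)
    }

    func (r *Resolver) Use(name string, line int) {
        if !r.declared[name] {
            // A strict declare-before-use language would reject this here;
            // a forward-referencing one must remember it instead.
            r.fixups[name] = append(r.fixups[name], line)
        }
    }

    func main() {
        r := NewResolver()
        r.Use("Foo", 10) // forward reference: has to be tracked
        r.Declare("Foo") // now it can be patched and forgotten
        fmt.Println("pending fixups:", len(r.fixups)) // 0
    }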
More like: "Tiling window managers only work well with resizable TUI apps" - there are still far too many text-mode programs that assume fixed-size console buffer dimensions (think: software still using ancient versions of ncurses) - also assuming that your chosen terminal emulator is capable of dynamically resizing (and re-flowing) normal stdout text.
I don't know what definition of TUI we're using, but if we're talking about ncurses-style whole-terminal-window GUIs, there's no inherent advantage. Responsiveness being limited to character-cell granularity actually sets harder limits for "TUI"s.
It does favor the kind of text-centered UI done by Oberon and imitated by Plan 9's Acme, where there's little to no use of x-y addressing and text just flows.
I think that it could be because text stream based apps (such as a TTY and the utilities that make no assumptions about screen geometry) are friendlier than, say, a web browser rendering a page that assumes its canvas is 1920 pixels wide.
OTOH, you can make a TUI that (rightfully) thinks the terminal window is always 80x25 (because God made the VT-100 that way for a reason) and it'll be very hostile to any tiling WM.
> Oberon is very much in active use at ETH Zürich. We used it as one of our main operating systems.
The article being from 2009, this must've been an interesting period for the students, as back then the introductory programming course was using Eiffel, taught by Bertrand Meyer himself.
My 0.02CHF: I studied at ETHZ from 2008 to 2015, and I've heard about Oberon once or twice back then, and read about it at least twice that much on HN.
Never used it, never knew anyone who used it, never knew anyone who worked on it (though I believe the chair had slots for theses available at the time).
When I started studying, some student PCs were even running Oberon (no login was required). Only some IT students used them, as it was too complicated for students from other departments. They were later replaced with Linux/Windows machines.
If my memory serves me right, when you bought a laptop through the university's Neptun program, Oberon came preinstalled too (or at least a DVD to install it was provided).
As for Bertrand Meyer, I also attended his programming class. I still remember that he really focused on invariants, and you had to write each line of code basically twice.
Oh, the horror those classes were. Meyer is a great guy, and in hindsight I appreciate his ideas quite a bit. But the idea of picking Eiffel for an introductory course, the questionable exercises (being encouraged/forced to write the code twice, once as invariants and once for the actual execution), plus the terrible Eiffel IDE made those classes very easy to loathe.
I like to look back on my college years and ask myself "well, what did I actually learn".
And looking back, "I learned to deal with being completely overwhelmed with shitty tools, questioning if the opposite party has any clue what they're talking about, then had to work out how to pass that class" seems like one of the most useful career skills to pick up in your first year.
I was there from 2006 to 2012. The OS course (I think that's third or fourth semester) was being taught with Active Oberon.
Must have been in 2007 or 2008, so might have changed right after.
Eiffel, Oberon, Event-B are just the most notable examples of things that we were being taught that have never become mainstream.
Yet those are the kinds of things universities are great for: one gets to learn about technologies that change one's point of view (ideally), something that self-teaching or professional schools hardly achieve.
I think that my ideal CS education would include 3 things:
1. "We're going to ask you to build real software, with real tools, in a group. Even if it's only for one course."
2. "We're going to ask you to do some theory and write some proofs."
3. "We're going to show you some wild stuff that looks like it belongs on an alternate timeline."
(3) might be something like Oberon, or Racket, or some eccentric faculty project. Or even just teaching all your CS majors Haskell, if you don't have any genuinely eccentric faculty projects to inflict on students.
Innovation is often driven by people who are aware of what might have been.
> Or even just teaching all your CS majors Haskell, if you don't have any genuinely eccentric faculty projects to inflict on students.
We got taught Eiffel Freshman year, C and Haskell Sophomore year, and after that, whenever a course used another language (Java, C#, C++), the prof just said "you'll pick it up".
(That's leaving out the usual-for-academia-but-still-weird stuff like MATLAB and Verilog)
I think that had I been a student with those options, I wouldn't have ended up a lawyer by profession. American university CS, in general, is not that appealing (with a few exceptions, obviously).
I think the choice of first language really says a lot about the whole program. More so today than yesteryear, when there was a larger contingent of people with prior exposure to programming.
Do you start more abstract, and if so with functional, imperative or OO underpinnings? Do you intend to switch languages rather soon or later?
The good thing about Oberon, language set aside, is that you could get started without a lot of administrative debris. No 80% of the screen covered in an IDE, not even a big ol' "public static void main" that you're told to gloss over.
I can see good arguments for sticking to one language throughout many courses, too. Sadly that often means C++.
I started there in 1999. We had some Oberon exposure, but not that much. There were some workstations upstairs in the computer room that ran the OS. Those were almost never occupied... so we simply used them to connect to some Solaris workstations using X forwarding :-)
I remember a cooperation with some ETH people (Sjur Vestli and his mastermind Roberto?) trying to use realtime Oberon to enter the mid-size league of RoboCup at the end of the 90s. We never got to participate, which was a pity.
FTA: “When ETH Zürich started its own computer science program in the 60s, buying computers from the US turned out to be a bit of an issue. They were expensive and often unsuitable for European use (what with our strange umlauts and stubborn insistence on speaking languages different from English)”
I guess they started buying computers from the US :-)
I think it's a double-encoding error. Text was encoded into utf-8, whose bytes got misinterpreted as codepoints, and encoded again. ü in utf-8 is (c3 bc), and U+00c3, U+00bc are Ã, ¼ respectively.
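If you want to reproduce the effect, here's a small sketch in Go (my own illustration, not from the article): take the UTF-8 bytes, misread each byte as a Latin-1 codepoint, and encode the result as UTF-8 again.

    package main

    import (
        "fmt"
        "strings"
    )

    func main() {
        s := "ü" // UTF-8 bytes: 0xC3 0xBC
        var mojibake strings.Builder
        for _, b := range []byte(s) {
            // Misinterpret each UTF-8 byte as a Latin-1 codepoint
            // (Latin-1 maps bytes 0x00-0xFF directly to U+0000-U+00FF),
            // then WriteRune re-encodes it as UTF-8: the double encoding.
            mojibake.WriteRune(rune(b))
        }
        fmt.Println(mojibake.String()) // prints "ü"
    }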
> Oberon is very much in active use at ETH Zürich. We used it as one of our main operating systems.
I wonder if any blind students, or other disabled people who need accessibility tools, had to confront the general lack of accessibility in simple, niche GUI systems such as this one. I suppose someone could have been given the assignment to extend Oberon with a screen reader or other accessibility tool, ideally without having to implement their own speech synthesizer first.
Very cool, and forgive me for not doing more research myself. Also, I should have thought of using a Braille display instead of a speech synthesizer, since for that, all they would have had to do is send commands to a device connected to a serial port. Of course, that would be more complicated now with USB, never mind Bluetooth.
Not only have I heard of Oberon, I've written code for it!
The Oberon programming language is a nicer-than-average descendant of Pascal and Modula-2. The compiler was reasonably fast even in the 90s.
The UI was strange enough to feel really interesting. As the article mentions, you could type a command anywhere and click it. And the older version I used had tiling windows, although this seems to have changed.
But one especially neat feature is that the system was based around a rich-text format that you could extend! I wrote a little clock widget that I could instantiate anywhere, inside any rich text, and it would give me a live time readout.
In practice, Oberon already felt like a toy system when I used it, because it was missing tons of useful software. But as toy systems went, it was rich and original and well worth a couple of weekends of exploring and coding.
If you enjoy the idea of operating systems that explored a different path, this is one of the more interesting ones. If you're fascinated by systems like the Lisp Machine, or Smalltalk, or OLPC, where the entire system is meant to be easily understandable and hackable, take a look.
We now have the whole Project Oberon OS running on an Artix-7 100T FPGA without any external RAM. It just uses the 512KB of Block RAM (BRAM) on the chip itself:
> ...all memory accesses are indirect ... module-data is position independent and may be moved in case memory should fail due to radiation damage.
I once used a debugger that shipped with a rather snarky easter egg command: "find my bug". One of its more useful suggestions was "Maybe a cosmic ray error? Use smaller chips!".
I think it jocularly meant "use something with fewer gates and a smaller cross section", not "use a smaller feature size". Or I could easily have misremembered: this was all last century.
That's an interesting paper, thanks for the link. Though the resulting language is barely Oberon anymore, since even type extension is removed, not only dynamic memory management (and with it the GC). It seems to have fewer features than the Oberon-0 described in Wirth's compiler book.
But I often cite the Ignore the Code post when someone doesn't know what Oberon is. I used it in a comment elsewhere on HN, and thought "why not submit it?" and here we are. The word "language" wouldn't fit in the title, AFAICR.
It does mean high-level language, and in this context it's trying to illustrate that Oberon is used to write the OS and drivers despite having GC, a REPL, objects, and other high-level constructs (compare with C, for example).
The general assumption is that if you don't have exact control over memory and threading, you can't write an OS.
I personally tend to agree with you, and yet, as a researcher and writer about OSes and so on, I never even heard of it until a few years ago. It is, sadly, quite obscure.
I made the same complaint when Apple discontinued the Newton. It was their most interesting computer. In the long run, an evolutionary dead-end (Unix won), but still interesting.
The zooming reminds me of Prezi, except that Prezi is in 3D... (maybe Oberon is where they got their original idea from?).
As interesting as it sounds and looks, I cannot imagine constantly zooming in and out (or panning) just to switch windows. Maybe I am so indoctrinated with the standard Windows task switcher concept that I don't want it to work in any other way. I was also completely lost when I tried to create a Prezi presentation instead of Powerpoint.
On the other hand it is great to see something different in the UI world. I remember only one such other concept that didn't make it into mainstream: the UI where you just move the mouse, but don't need to click...
Instead of zooming I think a better approach is Windows-within-Windows. Or is that the same thing? You don't need to "zoom" so much as "enter" into a sub-desktop. Which is different than multiple desktops side-by-side.
Smalltalk had/has this feature. You create a new "project-window". You can then "enter" it and within it you can open any number of windows including project-windows.
This works really well with programming where you are performing a task which requires a set of windows to be open, a class-browser, method-browser, change-list-window, debugger etc. As you are programming you soon discover you will need to do something else "first", perhaps fix a bug. You open a new project-window and "enter into it" to do that. Your current context of whatever you were doing remains as is in the parent window. Perhaps you leave an open debugger there halted at a given stack-frame, so you can continue that debugging later. Maybe even tomorrow.
When your bug is fixed you close its sub-project-window and come back up to what you were doing before.
If you can't fix the bug immediately you can leave its project window in place but exit into the parent-project and do something else there while keeping the bug-fix project open, and visible as a window-icon in its parent project-window.
The same approach could be adopted by the whole OS, and in a sense Smalltalk is an OS. So I'm waiting for MS or Linux or Mac come up with something similar. Not "multiple desktops" but "nested desktops". Or is there something like that already (outside of Smalltalk)?
Up until the mid-2000s, a lot of complex GUI applications (such as Visual Studio 6) in the Windows world used a paradigm somewhat ambiguously named MDI (cf. https://documentation.help/Win32/MDI%20Frames.htm), which sounds like what you describe. Unfortunately, UI design/HCI fashions turned against it hard and it went all but extinct; last I checked, few GUI toolkits even support it. Gtk was on the vanguard of the anti-MDI iconoclasm since its inception, and Qt's implementation (https://doc.qt.io/qt-6/qmdiarea.html) already suffered from bit rot when I checked it out some 10 years ago.
I think the MDI was too complicated for most people. Simplicity wins. And wasn't it like an application had to implement it to use it, not part of the OS-GUI by itself?
But Smalltalk "project windows" was and is truly simple. I think you can still check them out in Pharo and/or Squeak.
They are not of much use to casual computer user I think. Their benefit comes when the computer is used to perform complex multi-level tasks like producing software.
> I think the MDI was too complicated for most people. Simplicity wins. And wasn't it like an application had to implement it to use it, not part of the OS-GUI by itself?
It's true that MDI generally existed at a different level of abstraction: the "toolkit" that draws widgets, rather than the "windowing system" that assigns screen regions to applications and lets them draw to those regions while giving the user control over which application gets what region. But in contexts where the OS also provided a canonical platform GUI toolkit (such as Windows; see the documentation I linked above), the MDI implementation would naturally also come from the same vendor.
I'm aware that a common argument of the anti-MDI push was in fact that the OS window manager should be able to manage (sub)windows better and more natively than an application vendor's own low-resource proprietary implementation. But this superiority of platform window management never actually materialised, and in 2022 I'm still occasionally finding myself chasing down all the different subwindows of multi-window applications that wound up on separate workspaces. Pre-single-window GIMP was a particularly egregious offender in this regard.
Just to think about the difference a bit more: in MDI you could open new windows within the application window, say multiple text-editor windows within the MS Word application window, if I recall.
What you could not do is open a new application-window from within the application window. And that would seem rather useless. But it would not be useless if the whole desktop worked that way, open new child-desktops from current one, recursively.
Acme was inspired by Oberon, and Oberon was inspired by a system called Cedar[1] which was created at Xerox PARC.
From "Acme, A User Interface for Programmers"[2] by Rob Pike:
> Cedar was, however, the major inspiration for Oberon [Wirt89], a system of similar scope but much smaller scale. Through careful selection of Cedar’s ideas, Oberon shows that its lessons can be applied to a small, coherent system that can run efficiently on modest hardware. In fact, Oberon probably errs too far towards simplicity: a single-process system with weak networking, it seems an architectural throwback.
> Acme is a new program, a combined window system, editor, and shell, that applies some of the ideas distilled by Oberon. Where Oberon uses objects and modules within a programming language (also called Oberon), Acme uses files and commands within an existing operating system (Plan 9). Unlike Oberon, Acme does not yet have support for graphical output, just text. At least for now, the work on Acme has concentrated on producing the smoothest user interface possible for a programmer at work.
Niklaus Wirth spent a sabbatical at PARC[0] in 1976, so the inspiration was direct (or at least via Mesa).
[0] in fact, he's said he built the FPGA Oberon[1] because his favourite mouse[2] was a departure gift from PARC, but of course, he can't buy any system today that would allow him to keep using it, so he built one. Now that's a yak shave.
Thanks to Oberon I learned what is possible to achieve with GC-enabled systems programming languages; it was this experience that led me to play archaeology in our library, Usenet, gopher, FTP, ... and discover what the world of programming languages outside Bell Labs was actually like.
A young pupil fell astray from the UNIX church, never to look at it again with the same wonder.
A fun point about the 3rd part. If you've ever been flummoxed by the basics of floating point math, part 3 discusses the fundamentals in 2 pages with a big font. One of the most succinct discussions of FP I've seen.
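Not from the book, but the standard two-line demonstration of what those pages explain: 0.1 and 0.2 have no exact binary representation, so their float64 sum is not the float64 nearest to 0.3. A quick sketch in Go:

    package main

    import (
        "fmt"
        "math"
    )

    func main() {
        // Use variables: Go evaluates untyped constant expressions
        // with exact arithmetic, which would hide the rounding.
        a, b := 0.1, 0.2
        fmt.Println(a+b == 0.3)    // false
        fmt.Printf("%.17f\n", a+b) // 0.30000000000000004
        // A float64 is 1 sign bit, 11 exponent bits, 52 mantissa bits.
        bits := math.Float64bits(a)
        fmt.Printf("sign=%d exponent=%d mantissa=%#x\n",
            bits>>63, (bits>>52)&0x7FF, bits&((1<<52)-1))
    }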
I can't seem to find the original 1992 edition in PDF. It feels like a lot of information (not just source code) is culled from the later versions (2005, 2013) of this book.
I have the hardcover version of the 1992 edition. AFAICS, the 2005 version that you can download from Prof Wirth's personal site is identical to the 1992 edition except that it has an additional appendix titled "Ten Years After: From Objects to Components"
Yes, but I think it's an updated 2013 version and it also includes the specification of his "RISC5" (not RISC-V, btw) processor he designed just for the purpose of running Oberon on an FPGA
This was going to be the next Pascal... back when Pascal was a thing. I do remember chatting with some Borland folks about how they would be supporting it (Turbo Oberon?), but they weren't really very enthusiastic about it. Of course, that was early 90s and they were still shipping floppies, so...
(edit: didn't remember the year, so left it as a range)
You're missing one step here, as the "next Pascal" was Oberon's predecessor Modula-2. For which there were some 8/16-bit compilers around, including a Turbo Modula-2.
Didn't seem to work out back then for a few reasons, one probably simply being that "Pascal" had a certain name recognition. Also, OO was coming around, which Modula-2 didn't have, and by the time Oberon was invented, Object Pascal was already a thing.
Ah, Modula-3, one of my perennial favorites. Definitely a different continuation of Modula-2 than Wirth's own Oberon. Agree on the tooling issues. If it would've "sold" better, so many languages of the last two decades could've been avoided.
This is one of those languages I wish D would take more inspiration from; instead, Nim, Swift, and C# (with post-v7 improvements) seem to be the ones closer to it in language design for systems programming with GC.
The timeline does skip a bit too fast over Medos-2[1], the Lilith's operating system written in Modula-2 and the predecessor of the Oberon system. That plugs the gap in the Alto-to-Oberon timeline, with Oberon basically being both a continuation of that and a simplification of Smalltalk and the whole software environment in total (surprise: Wirth is not a fan of it [2][3]).
I installed the Oberon OS (A2) emulator and showed it to my friend. We burst out laughing at that program that makes a skeleton run across the screen. If you keep executing it, it just spawns more skeletons and we had a skeleton parade.
Interesting to learn about this! It's not as drastically different as expected from the initial write-up.
The zoom-to-switch-context reminds me of Figma (a design tool) -- personally I like the metaphor of managing hierarchy with scale while mapping spatial relationships to functional ones.
The idea of mixing "data" and "code" is intriguing. I might have to play with an install locally!
The idea is that you describe how the GUI should appear, and the GUI changes in real time as the description changes.
And I also designed a GUI primitive I called GUI thunking, inspired by Haskell. The idea is you can chain together behaviour on a GUI by interacting with the future directly.
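If I'm reading "interacting with the future directly" right, it's something like chaining transformations onto a value that doesn't exist yet. A hypothetical sketch in Go - Thunk, Then and Force are my names for the idea, not the parent's actual design:

    package main

    import "fmt"

    // Thunk is a lazily computed GUI value: nothing runs until Force.
    type Thunk[T any] struct {
        compute func() T
    }

    func NewThunk[T any](f func() T) Thunk[T] {
        return Thunk[T]{compute: f}
    }

    // Then chains behaviour onto the eventual value -- interacting
    // with the "future" before it has been computed.
    func Then[A, B any](t Thunk[A], f func(A) B) Thunk[B] {
        return Thunk[B]{compute: func() B { return f(t.compute()) }}
    }

    // Force evaluates the whole chain, e.g. when the widget redraws.
    func (t Thunk[T]) Force() T {
        return t.compute()
    }

    func main() {
        label := NewThunk(func() string { return "clicked" })
        styled := Then(label, func(s string) string { return "[" + s + "]" })
        fmt.Println(styled.Force()) // prints "[clicked]"
    }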
You might be interested in Acme and Rio from the Plan 9 system! They had some interesting UI paradigms, like three-button clicking and highlight-to-instantiate.
Additive GUIs seem like a really cool approach that I wish we had more of these days, given that everything seems to be web-based anyway. Most interfaces sadly want to be a mishmash of macOS and a catch-all mobile operating system.
If you want to try out only the Oberon language, you might be interested in oberonc [0], a self-hosting Oberon-07 compiler for the JVM. There are several other Oberon implementations for different platforms listed here[1].
When I started to study computer science in Germany in 1991, we used Oberon on X11 terminals during the initial "Praktische Informatik I" lecture as our first programming language. Oberon being new, the professor said he wanted to make sure that everyone started under the same conditions and with no prior knowledge. So we wrote our linked lists, hash tables, trees, sorting algorithms, etc. in Oberon.
At that time almost all students had a track record of coding on Apple, Commodore, or Atari home computers and/or had worked with Turbo Pascal and Turbo C. Because the Oberon system was neither very stable nor user friendly, the Oberon environment met with little support from the students. For example, I remember that the compiler crashed reproducibly on certain inputs. There also was no debugger, and the editor did not meet expectations.
After finishing the lecture, most students (including me) never touched Oberon again. But hey, let's revisit it after 30 years and see how it feels today.
As I described it to some folks recently, Go can be simply characterised as a slightly stripped version of the Oberon-2 (or Component Pascal) language, recast in a C-like syntax, and with a number of small additions.
Removed from Oberon-2: Sets, Type Extension (classful OO scheme)
Added by Go: Maps, Slices, Strings, Interfaces, Channels, Coroutines.
Such that I suggested it is well worth reading the Oberon-2 language report.
As to the coroutines, possibly Oberon-2 had them in its standard module library (as Modula-2 did)?
> Go can be simply characterised as a slightly stripped version of the Oberon-2
Well, not really; there are not many similarities between Oberon and Go besides the receiver syntax of Oberon-2 bound procedures (which was invented by Mössenböck, btw) and the fact that both are garbage collected. In your list "removed from Oberon" you should add type inclusion (Go doesn't even have implicit coercion); there is an intersection of keywords, and := appears in Go as well, but the semantics are rather different; coroutines were defined as an option in the Oakwood guidelines, but by different people than the language authors; I never met an Oberon compiler which implements them.
Yeah - I skipped the type inclusion, as I just viewed it as a misfeature, assuming we're talking about the implicit casts between different number types.
As to ':=' in Go, yeah - a new thing over Oberon-2.
Since assignment in Go uses C style '=' whereas Oberon uses Pascal style ':=', I certainly was not confusing them.
The characterisation came about because of an implicit complaint (from C programmers) about the choice of Go for a project. Never having used Oberon-2, but having read the report, I used the comparison to a stripped version of it as a way of showing how simple the language actually is. Something like 25 pages being sufficient to describe it.
The things which struck me were (rough Go sketch below):
O2 MODULE becomes Go package (and similar syntax use)
O2 NEW retained as Go new, but &Foo{} generally preferred
O2 export of symbol via '*' tag becomes Go export via capital letter.
O2 Open arrays replaced by Go slices or strings.
O2 WITH becomes Go 'type switch'
O2 'type guard' becomes Go 'type assertion'
O2 VAR parameters to PROCEDURES become Go pointer parameters to funcs
But in the end it is very much a subjective thing, so unless using the non-classful parts of Oberon-2 reveals significant differences, I'd have to stick with my evaluation.
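To make the mapping concrete, here's a rough sketch in Go with the approximate Oberon-2 counterparts in comments. The Oberon-2 fragments are paraphrased from memory of the language report, so treat them as illustration rather than gospel:

    // O2: MODULE Geometry
    package geometry

    import "fmt"

    // Export via capital letter instead of the '*' mark.
    // O2: Point* = RECORD x, y: REAL END
    type Point struct {
        X, Y float64
    }

    // Receiver syntax, much like an O2 type-bound procedure.
    // O2: PROCEDURE (p: Point) Describe*
    func (p Point) Describe() string {
        return fmt.Sprintf("(%g, %g)", p.X, p.Y)
    }

    // VAR parameter becomes a pointer parameter.
    // O2: PROCEDURE Move* (VAR p: Point; dx, dy: REAL)
    func Move(p *Point, dx, dy float64) {
        p.X += dx
        p.Y += dy
    }

    // Open array becomes a slice.
    // O2: PROCEDURE Sum* (xs: ARRAY OF REAL): REAL
    func Sum(xs []float64) float64 {
        total := 0.0
        for _, x := range xs {
            total += x
        }
        return total
    }

    // WITH becomes a type switch; a type guard becomes a type assertion.
    // O2: WITH v: Point DO ... END
    func DescribeAny(v interface{}) string {
        switch x := v.(type) {
        case Point:
            return x.Describe()
        default:
            return "something else"
        }
    }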
> Something like 25 pages being sufficient to describe it.
Wirth attached importance to the fact that there are only 16 pages; on closer inspection, however, one realizes that not everything has been specified and the omission of redundant descriptions easily leads to ambiguity in the given writing style.
> in the end it is very much a subjective thing
The differences and little similarities, as far as specified, are objectively ascertainable. But of course there are far more important things. From my point of view Oberon (including Oberon-2 and especially Oberon-07) is too minimalistic for non-academic projects anyway. That's why I threw my hat into the ring with Oberon+ (http://www.oberon-lang.ch); its specification is still small, at about 50 pages.
Thanks. I don't know the Ulm compiler, and I unfortunately don't have System 3 (can you please provide a link with the source code and/or a working Linux binary?), but I have the source code of V2 and V4 (both ETH and Linz versions, from 1992 to Linz 1.7) where there are no coroutines.
In case you're interested: I realized that I indeed already have System 3 binaries and source code in my collection; unfortunately the original links at ETH don't work anymore; but I have a physical copy of the book "The Oberon Companion" which includes a CD; I now have uploaded the CD contents to Github; here it is: https://github.com/OberonSystem3/TheOberonCompanionCD.
Note that there is no Coroutines.Mod; nor did I find one in the A2 sources.
Because of pjmlp's hint I just looked into the repositories https://github.com/btreut/a2 and https://github.com/metacore/A2OS, but didn't find a coroutines module. I'm aware that Active Oberon (Patrik Reali, 2004) includes concurrency, but this is a different language than the one used for System 3.
The characterisation was against Go 1.15 as that is what is current in the version of Debian we're using (11?).
So other than noting that Go 1.18 adds generics (which we can ignore since we're targeting 1.15), they weren't mentioned.
As I recall, another thing we get in 1.18 (as a side effect of the generics constraints mechanism) is a slight improvement in static checking for certain constructions of type switch.
Dangit. This is the sort of thing that's equal parts frustrating and exciting, in that I've had a lot of these ideas independently and this is the first time I'm discovering that someone has implemented them. Especially this infinite canvas idea, which I'm very confident would work wonders for me (and many other people) mostly owing to the power of "spatial memory," which I feel like is a wildly underused human feature. (e.g. memory palaces)
An important thing to be aware of is that spatial memory is not universal, and there is quite a bit of variation among those who do have the ability to use it successfully.
I don’t say this to discourage. Quite the opposite: I say this to say if someone doesn’t understand the benefits of your ideas don’t take it as a general “this will not work” but instead go share them with someone else
Oberon was a mostly successful experiment in minimalism. Once when I was surprised by a compile error that I didn't think was correct, it didn't take me too long to pinpoint the line in the Oberon compiler source (written by Wirth) which was causing it and understand why it was the way it was. You cannot do this with any 'modern' system today.