Semantic versioning is just something everyone does in software development, but is it really that necessary?
Semantic versioning is for APIs, not for functionality. So it's for developers consuming that API (whether a library, or a service).
For releases in production, use a calendar version. v2025-11-02 is a clear release tag. Add preciseness as required. There should be a SBOM/Manifest (Bill Of Materials) of the versioned major components and configuration for that overall release.
For users, it depends on the type of user and what they expect. Their focus is on functionality. So when there's a new feature, bump the number.
It's a bit like the car model. It can be random extension letters like "-X", or "6Si".
So, amongst others, they had Oracle 8i at the height of the dot com boom (i for "Internet"), then a few years later when clustering became big news there was Oracle 10g (the g standing for "grid", I think?), and so on.
Actually, it looks like they might still be doing it - I just checked, and their current version is 23ai...
Developers are "users" (of a library, API, tool...), and "API functionality" is a subset of "functionality": what purpose would such distinction serve?
For example, in end user desktop software (say a text editor), how would you indicate a security bug fix for an old version v2023-11-02 without forcing users to pay for a new version of v2025-09-15?
Again, versioning is a tool, and depending on the release structure of a project, SemVer might work well or it might not (including for APIs/libraries).
Versioning is a tool to communicate changes and backwards compatibility to the users. SemVer makes sense in a lot of cases, but it neither covers everything (eg. compare with Debian/Ubuntu packaging versions), nor is it always needed (think of REST API versions which usually only go with major versions, and commonly multiple major versions from the same codebase).
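The SemVer-vs-CalVer distinction discussed above can be sketched in a few lines of Python (illustrative only; the helper name and tag format are my own):

```python
from datetime import date

def parse_semver(version):
    """Split a 'MAJOR.MINOR.PATCH' string into an orderable tuple of ints."""
    return tuple(int(part) for part in version.split("."))

# SemVer: the major number signals an API break to developers.
assert parse_semver("2.0.0") > parse_semver("1.9.9")

# CalVer: a production release tag that is readable at a glance.
release_tag = date(2025, 11, 2).strftime("v%Y-%m-%d")
assert release_tag == "v2025-11-02"
```

Note that comparing tuples of ints gives the ordering SemVer intends, whereas a plain string comparison would wrongly sort "10.0.0" before "9.0.0".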
Speaking of these, does anyone recall the AtheneOS distribution/OS. There’s an archive.org copy of the desktop environment version of it, but I recall there was a really fast version with only 2D graphics and it was a full distribution.
Can anyone validate whether this is real? I tried contacting the guy who wrote it but the Companies House address for his company (Rocklyte) bounced the letter.
Huh, SyllableOS might well be it. I thought for sure it was a different version of Athene that I was running but now looking at the screenshots they have a weird familiarity to them so maybe it was just around the same time that I tried both and the memory of two decades has blurred. Thank you!
I think it's worth mentioning on a hobby OS, just because it's a decent bit more work to do preemptive multitasking. It's a badge of honor to have successfully implemented it.
There is Wirth and Gutknecht's Oberon System. It's still available, but it's older than Visopsys -- it was created around 1990, then updated in 2013. I think it's now considered a historical artifact.
Around 1997 I learned the concept of RTFM! Obviously my father had already taught me to look in the DOS and WordPerfect manuals to learn about features and commands one might use. Great lessons.
I took an OS course in college in 2006, and the big project my prof required us to do was to modify Visopsys. The software was primitive at the time but still had a UI.
I emailed the author to ask some questions about my project. The author had a connection with my prof and informed him about this. My prof told me that I was not allowed to ask the author about the project, so I had to figure it out on my own.
It was fun to play around with, and I learned how things work at a deep OS level. It was a good memory for me :)
And do you guys notice anything about my username? :)
I can't tell if OP just really enjoyed the experience or if he is the actual author of the OS.
No, I am not the author. I just liked the project and the name, so I chose it for my Hacker News account 14 years ago. No intention to misrepresent the project or the author.
> liked the project and the name that I chose it for my hackernews account 14 years ago
Some people name themselves `__xXx_ultimatEWeapon420_xXx__` and some people name themselves after a random toy operating system.
Classic nerd habit.
He just named himself in honor of something he remembered fondly.
Reading your post until the end was like watching a movie...
Sounds like an insane ride xD
Interesting. Never heard of this system before. It's apparently a monolithic kernel, developed almost exclusively by Canadian-born programmer Andy McLaughlin since 1997. The system has a graphical user interface, preemptive multitasking, and virtual memory. It is implemented in C and IA-32 assembly language. Here is a 2012 interview with the author: https://www.pingdom.com/blog/visopsys-operating-system/.
Surprisingly only one small previous thread:
Visopsys - https://news.ycombinator.com/item?id=18147201 - Oct 2018 (6 comments)
Even dang was like "dang, this topic has only been submitted once before now?"
This is very very cool, and unlike a lot of other "hobby" OSes actually looks usable as a daily driver if your needs are basic (kids, elderly, older/cheaper hardware, etc).
While for nerds computers have become these monstrously powerful things that can do everything under the sun, there's definitely still plenty of people who just want a computer to write down notes, keep a calendar, use the calculator... eg the things home computers were originally made to do.
What you're describing is called iOS on a large iPad. Everyone from 4-year-olds to my 77-year-old computer-illiterate Dad can figure it out.
This doesn't look very usable at all by someone who isn't basically a computer nerd.
> What youre describing is called iOS on a large iPad.
Is iOS able to work with files? Asking for a friend. /s
True in theory, but in practice due to our economy being based on growth at all costs, iOS doesn’t really fit the bill anymore.
Nowadays even iOS will randomly change its UI and send you “notifications” or “suggestions” (modern euphemism for “ads”) to subscribe to Apple TV* or iCloud.
I was forced to buy a new iPhone recently (my 16 was stolen), and had iOS 26 foisted on me.
My god, is it bad (for me, I'm sure some like it). The ugly glass UX, the weird floating controls, the always on display, blah blah. It's not innovative at all, it's like they just had to redo everything simply to make it seem "new".
Always-on display can be disabled, but for the rest I agree. It doesn’t really do anything more than my 3rd gen SE but is way more annoying to use (bigger size, no fingerprint reader or home button).
So what is better? I think you're wrong and a tablet with iOS or android is the best form factor for computer illiterate people to get something done. Despite whatever bullshit they added, everything else is worse. But maybe you know of something better?
All you need to understand this is to watch a chimp browsing Chimstagram on an iPhone: https://www.youtube.com/watch?v=XTiZqCQsfa8
Forget computer illiterate, not even human let alone literate!
> Nowadays even iOS will randomly change its UI
You and I have very different ideas of “random” I think.
> “You and I have very different ideas of “random” I think.”
Indeed, not ‘random’. With respect to iOS 26, what word should one use? Premeditated? Deliberate? Malicious?
Maybe a better definition is “for seemingly no reason”?
“Arbitrary” is the word people often should reach for instead of “random”.
Why with Tahoe did they get rid of the volume indicator that popped up middle of screen that they’ve had for 20+ years - a critical indicator that the volume controls are even working in the first place - in favor of a tiny set of bars at the top right of my screen in the menu bar where I can barely make them out? It’s also less precise about my volume level now. Why?
That sure seemed random. It sure isn’t functional.
Because before you, many users complained "IT TAKES UP THE WHOLE SCREEN!!!!" and, to be honest, it was a bit annoying when it obscured a video or something else you were trying to view.
But it doesn’t take up the whole screen, it flashes for like 2 seconds, and it has been around so long that it has created a very ingrained user behavior.
I’d be very curious to see how many complaints they’ve actually gotten about it. This definitely struck me as “random”
what kind of video are you watching where you need to change the volume so often and missing two seconds of part of the video would be such an issue?
Look. I wanted to change the volume. My hand went to the keyboard. I felt the key. I felt the key press down. The volume changed.
That's all the feedback I need! I don't need my vision stuffed with that information.
But yeah, it did look cute and should be an option between "Expressive" or "Minimal" UI.
The issue arises when my output is not what I think it is, or the audio is otherwise not being adjusted (happens a lot when you’re pushing audio through an HDMI output). It could be going out my speakers, my headphones, or something else entirely. So when I’m pressing the volume up and down trying to see what is going on, I don’t want to have to squint at a very tiny set of bars in the top menu bar. That is annoying and far more distracting than an opaque layer that gives me clear information for 2 seconds.
At the end of the day I want Apple to adhere to the “it just works” philosophy. That little pop up served as a critical source of information I needed daily that tells me more than just the volume level. It’s easy to understand, it’s been consistent for I believe two decades, and it provides information to multiple questions instantly. It did not need to be changed and what they changed it to is worse.
* what they changed it to is worse _for you_
Some of us hated that floating overlay with a passion and wish it only the best riddance on its way out the door.
it’s ok to be wrong! (Purely a joke I hope that’s clear)
what are you trying to say here?
You mean the OS that "upgraded" to transparent backgrounds with sometimes hard-to-read text by default?
I can't recommend those in good conscience to elders anymore.
Kids always figure it out tho.
> What youre describing is called iOS on a large iPad.
iPad was my gateway drug into Apple when I got it as a gift for my aunt and saw how easy and intuitive it was to use, and also to develop for.
Then after Jobs' whip fell from his cold hands, they went into the realm of "mystery meat" menus and arcane gestures where swiping from seemingly every different angle of the screen edge does something different. Swipe from the top-right corner to get the Control Center, but swipe from the center-top to see the Notifications?? Yeah, not gonna bother training an elder on that. I can't dare get my mom a modern iPhone now where she has to swipe up to unlock: it has to be an iPhone SE, the last iPhone with a Home button.
I am the filthiest of nerds but I still can't get myself to remember how the heck iPad multitasking works. Apparently they can't either, they changed it again in 26 and now I can't easily get Notes etc. by swiping in from the side when watching a video etc. and I haven't bothered to look up how to do that now.
In any case all this only shows that attempting a one-size-fits-all UI can't really go all the way. iPhones/iPad have had a respectable run, they were lucky to have an OS Usability tyrant in charge, but maybe it's time to accept that UIs need an option for Simple vs Expert or something.
> the realm of "mystery meat" menus and arcane gestures where swiping from seemingly every different angle of the screen edge does something different. Swipe from the top-right corner to get the Control Center, but swipe from the center-top to see the Notifications?
Ha, I'm a heavy long term iOS and MacOS user, and I still haven't learned what all the swipes and clicks in random places actually do exactly.
I just know I sometimes click by accident at the very bottom right of my display on MacOS, and it swishes all the windows to the right (why? I have no idea!); luckily, clicking again brings them back.
On iOS I resonate with your comments about the swiping from different places to get different things. The only gesture I can ever remember is swiping from top right to get the quick system menu to turn wifi on/off etc. I can never figure out how to clear my notifications or why they're sometimes displayed and sometimes aren't. And the other swipes and menus are completely beyond me.
I'm a 40 year old life long software developer.
"iOS on a large iPad" has some good affordances but is definitely NOT some kind of panacea for elderly or computer illiterate users!
They removed the side thing in 26 and are bringing it back in 26.1.
There’s a complete lack of project leadership and it’s strangely worrying.
> There’s a complete lack of project leadership
I mean, that's fine, if there is no overarching vision. Just let users CUSTOMIZE the UI the way we want. That's it.
That would actually be easier on the UI designers too. Perhaps just a trifle bit complicated for the coders, but they have *AI* now, right??
I fully believe that those inside Apple fighting for customized UI are relegated to hiding them as accessibility options. Apple has never been very fond of customization (one way, Apple's way, or the highway).
I agree with you. I see this as a passion project, and I think it's really cool.
From the Visopsys "About" page:
> [...] realistically the target audience remains limited to operating system enthusiasts, students, and assorted other sensation seekers
I couldn't tell you how many operating systems fit those requirements, hobby or not.
>if your needs are basic (kids, elderly, . . .)
Most kids and most elderly need to run a mainstream browser from time to time, and this Visopsys will almost certainly never be able to run a mainstream browser.
Then we need to change what is meant by "mainstream browser".
I would love to see that happen, but it's not going to.
It is the people with basic needs who need to stick to the mainstream stuff because they can get support and it does what they expect. People need bank and other complex websites to work. They want to watch online video. Kids will need educational apps.
Also, do not make assumptions about elderly people. Not long ago I met a woman (I'd guess in her 70s?) who used to write embedded software for nuclear reactors. I have known many people of similar or greater age who need quite complex stuff.
It's the geeks who can manage with the non-mainstream stuff.
Web browsers have three purposes: document viewer, remote paperwork machine, and cross-platform application framework. I could throw together a browser fully capable of the first two in a month. (Much less time, if you're okay shipping a prototype, which personally I'm not.) Bank websites are not complex, unless you count the business logic: there's no reason they shouldn't work in Dillo.
> and unlike a lot of other "hobby" OSes actually looks usable as a daily driver if your needs are basic (kids, elderly, older/cheaper hardware, etc).
While building a non-Linux OS is very impressive, this is not useful as a daily driver at all.
If the OS doesn't even have a mainstream browser such as Chrome or Firefox, it can't remotely be used as a daily driver by anyone who isn't a computer enthusiast.
I wonder if Visopsys, Windows 3.11 and others could work as a daily driver running in qemu, started from a Linux initrd that has just a browser and qemu. "Opening" the browser in Visopsys actually switches to the browser running on the host, and Alt-Tab switches back to Visopsys.
Ahh this OS is small enough that a university professor used it as the basis for his class assignments: write a device driver for it, or a pipe implementation, if I recall correctly. I thought it was pretty genius at the time, and it was certainly quite a challenge for the students too.
it took me a while to find. here is the source code: https://sourceforge.net/projects/visopsys/files/visopsys-0.9...
Thanks for digging it out. It is still quite a large code base: 274,052 lines.
That's 4000 pages, small enough that you could easily lift it. You could read it all in a week, though that wouldn't be enough time to understand much of it. It's half the size of glibc and a hundredth the size of Firefox or the Linux kernel.
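The "4,000 pages" figure is easy to sanity-check from the line count quoted above (the lines-per-page figure is my assumption):

```python
lines = 274_052        # line count reported upthread
lines_per_page = 66    # a commonly used estimate for a printed page of code (assumption)

pages = lines // lines_per_page
print(pages)           # 4152 -- in the ballpark of the "4,000 pages" claim
```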
Michael MJD did a video on this recently :)
https://youtu.be/5MZljgXW2WA
Amazing! I find it extremely fascinating that somebody is able to create an entire operating system. Not an easy task!
It’s not easy, but it’s more approachable than many realize.
Much of a modern operating system is the hordes and hordes of drivers; the fundamentals aren’t terribly complicated, just lots of detail.
It would also depend upon what you are trying to accomplish. You have simple filesystems and complex filesystems. You have simple video drivers, and you have complex video drivers. Simplicity gets the job done, but complexity may offer better reliability or performance or features.
Then there is the question of what one means by an operating system. While I'm sure that most people would agree that much of the software shipped with Windows, Mac OS, or the typical Linux distribution isn't part of an operating system proper, few would agree upon where the boundary lies.
as others have pointed out "just" is doin big overtime here. but also x86_64 saps the fun out by forcing you into archaic irrelevant details IMMEDIATELY. but really, it's a good filter
making toy os for a nice small board on a nice architecture like riscv is night and day more enjoyable. not that modern boards that have more device tree overlays than senses are a good starting point either.
a more modern mmix that builds further up, or nand2tetris, xv6 or any other riscv project going all the way to a user mode ui would be really cool
> just lots of detail.
"Just" is understating it.
It's the kind of project that takes 20 years to accomplish on your own, and everything seems doable from moment to moment because you have to work very slowly, and the stepwise changes aren't hard.
Just get the thing to boot. Just switch into protected mode. Just get graphics running. Just get a userspace. Just implement cooperative multitasking. Every step is "just", but when you take a step back the complexity is enormous, and it becomes hard to explain to anyone how it works in its entirety.
Although it seems easy to the author because that's just how his brain works now -- by the end of it, you and the OS are one and the same: your brain is essentially a map of the codebase and nothing more, because nothing else can fit.
CP/M was also created by one person.
CP/M was far simpler.
In all probability, yes. I'm not sure how much easier it would be to develop, though. Back then, most (if not all) of the operating system was written in assembly language, and there was far more to consider when it came to performance and memory usage (which are often in conflict with each other). CP/M was also notorious for running on mutually incompatible hardware, relying upon the BIOS to smooth out those irregularities. While that may simplify development in some respects, such as the hardware vendor providing the drivers, it complicates it in others, since CP/M development could not make assumptions about the underlying hardware.
OP didn't mention complexity, nor make any kind of comparison.
Nice then. OTOH, CP/M 2.2 has been open sourced, but I think there are no libre assemblers for it.
Cross-assemblers, there's one: http://john.ccac.rwth-aachen.de:8000/as/ but it's tedious to build under OpenBSD.
As for software, the ZMachine and V3 games don't count as 'libre examples'.
Libre was also not part of the comment I replied to.
Take a look at AtheOS and its successor SyllableOS. AtheOS was created by a single developer; another single developer took it over (as Syllable), and it briefly became an open source project before going defunct again. But it made impressive gains in its 3 years of initial development.
i miss those days of everyone and their mom creating an OS for giggles
Don't forget SkyOS. And there's plenty more, with SerenityOS being one of the latest notable examples. Those days never ended. Also, ahem, TempleOS: as single-developer as you can get.
SkyOS? I actually paid to be on the beta program, and then suddenly, out of the blue, the developer pulled the plug on the project completely. Not sure what happened, but there were rumours that the code may have been "borrowed" from other operating systems; I am not sure.
You will be blown away by Serenity OS then.
Or Linux
I believe you are referring to GNU/Linux, or as I've recently taken to calling it, GNU plus Linux.
Ever heard of TempleOS?
It’s the only OS endorsed by God.
Made by the greatest programmer that ever lived.
Was looking for this
The most impressive thing is being on 0.9 after nearly 30 years
It's so old that the 3D icons and window borders will be new again when 1.0 is released. Talk about some long-term vision.
But jokes aside, I always enjoy reading about custom OSes.
You joke but the first thing I thought when I saw the icons was that they were nice. Flat everything has run its course.
https://0ver.org/
I always found semantic versioning a little too verbose, particularly when deciding when to release major versions. OS X was on version 10 for many years, but of course released a new "major" version every year.
Semantic versioning is just something everyone does in software development, but is it really that necessary?
Semantic versioning is for APIs, not for functionality. So it's for developers consuming that API (whether a library, or a service).
For releases in production, use a calendar version. v2025-11-02 is a clear release tag. Add precision as required. There should be an SBOM/manifest (Software Bill of Materials) of the versioned major components and configuration for that overall release.
For users, it depends on the type of user and what they expect. Their focus is on functionality. So when there's a new feature, bump the number.
It's a bit like car model names: it can be random extension letters like "-X" or "6Si".
Oracle used to do that, didn't they?
So, amongst others, they had Oracle 8i at the height of the dot com boom (i for "Internet"), then a few years later when clustering became big news there was Oracle 10g (the g standing for "grid", I think?), and so on.
Actually, it looks like they might still be doing it - I just checked, and their current version is 23ai...
Developers are "users" (of a library, API, tool...), and "API functionality" is a subset of "functionality": what purpose would such distinction serve?
For example, in end user desktop software (say a text editor), how would you indicate a security bug fix for an old version v2023-11-02 without forcing users to pay for a new version of v2025-09-15?
Again, versioning is a tool, and depending on the release structure of a project, SemVer might work well or it might not (including for APIs/libraries).
Versioning is a tool to communicate changes and backwards compatibility to the users. SemVer makes sense in a lot of cases, but it neither covers everything (eg. compare with Debian/Ubuntu packaging versions), nor is it always needed (think of REST API versions which usually only go with major versions, and commonly multiple major versions from the same codebase).
> ZeroVer
Jesus Christ.
Very impressed by the screenshots in the website. This is no small feat.
Speaking of these, does anyone recall the AtheneOS distribution/OS? There’s an archive.org copy of the desktop environment version of it, but I recall there was a really fast version with only 2D graphics, and it was a full distribution.
Can anyone validate whether this is real? I tried contacting the guy who wrote it but the Companies House address for his company (Rocklyte) bounced the letter.
Syllabe OS?
Huh, SyllableOS might well be it. I thought for sure it was a different version of Athene that I was running but now looking at the screenshots they have a weird familiarity to them so maybe it was just around the same time that I tried both and the memory of two decades has blurred. Thank you!
It’s amazing how one person kept this project alive since 1997, that’s real passion and love for coding!
It mentions preemptive multitasking as one of its features. Are there any operating systems that still use cooperative multitasking?
I think it's worth mentioning on a hobby OS, just because it's a decent bit more work to do preemptive multitasking. It's a badge of honor to have successfully implemented it.
> Are there any operating systems that still use cooperative multitasking?
RISC OS uses cooperative multitasking: http://www.riscos.info/index.php/Preemptive_multitasking
There is Wirth and Gutknecht's Oberon System. It's still available but is older than Visopsys -- it was created around 1990, then updated in 2013. I think it's now considered an historical artifact.
https://www.projectoberon.net/
Many RTOSes support it, eg FreeRTOS’s co-routines: https://www.freertos.org/Documentation/02-Kernel/02-Kernel-f...
Took me a while to realize it's not a linux distro. Incredible!
Hmmm nice to see the OS is still under development.
First time I saw it was during undergraduate days.... 2006 or 2007?
It’s short for “visual operating system” but there are no screenshots anywhere. That would have felt off even in 1997.
Maybe they mean something else by visual.
Around 1997 I learned the concept of RTFM! Obviously my father already taught me to look in the DOS and WordPerfect manuals to learn about features and commands one might use. Great learnings.
Oh and:
https://visopsys.org/about/screenshots/
On the homepage there's literally a link in the top bar called "Screenshots"
https://visopsys.org/about/screenshots/
> PC compatible computers
That takes me back.
Naive question: would using such an OS bring some security through obscurity?
still getting 403 after a few hours
oh i get it, geo blocked. had to fire up tor out of spite just to have a look
I can't seem to find whether it has a web browser, or whether you can install one
Just use TempleOS
TempleOS with a BeOS GUI - that's the vibe
By now, especially in the Linux world, shouldn't there be an OS that is purely scripts that generate the OS? Or does one already exist?
NixOS is what you are describing: https://nixos.org
Depending on how you mean it, that already exists variously in at least Yocto, Gentoo, or ALFS. Although I should point out that this (Visopsys) isn't a Linux distro.
[dead]