Steve Jobs was very open about taking things from elsewhere and refining them for consumption.
Lisa and Mac were products of his seeing the Smalltalk GUI at his visit to PARC. There was nothing off-the-shelf, so they had to be built from scratch.
Of NeXT he said that he had been so bamboozled by the GUI at his PARC visit that he missed the other two, arguably more important concepts: OO and networking.
NeXT used as many off-the-shelf components as possible: Ethernet + TCP/IP for the network, Unix for the OS, Adobe's Display Postscript for graphics, Stepstone's Objective-C for the OO parts (which in turn mashed together C and Smalltalk). It bundled TeX, Sybase SQL Server, a bunch of scripting languages, Webster's dictionary, etc.
They built themselves only what they absolutely had to in order to get the machine and user experience they wanted.
> Steve Jobs was very open about taking things from elsewhere and refining them for consumption.
See also: forking KHTML into WebKit to build Safari when MS cancelled Internet Explorer for macOS and the platform was left without a robust browser choice. Notable for two reasons: they were somewhat comfortable letting MSIE reign for so long rather than making an in-house option, and they didn't start over when they finally did.
Well, no. They evaluated the existing choices and decided that KDE's code was a better fit.
> Melton explained in an e-mail to KDE developers that KHTML and KJS allowed easier development than other available technologies by virtue of being small (fewer than 140,000 lines of code), cleanly designed and standards-compliant.
According to Ken Kocienda's book (he was one of the original developers of Safari), that email is a post-hoc rationalization. The "evaluation" was literally him and another guy trying to build Mozilla for weeks and failing, and then someone else joining the team and quickly building Konqueror instead.
I’m pretty sure your history is off here. There was a 5 year agreement around 1998 to keep Office available for the Mac, and to make IE the default (but not only bundled) browser available.
Safari was shipped almost exactly at the end of that agreement, and the announcement as to IE for Mac being discontinued was 6 months later.
He wasn't; his position regarding UNIX beards was well known.
Supporting UNIX was a business opportunity to go against Sun and other graphical workstations.
There are recordings of NeXT meetings, and his famous appearance at USENIX, regarding this.
Note that everything that matters on NeXTSTEP is based on Objective-C and the Framework Kits, with zero POSIX beyond what was needed for those government and graphics workstation contracts.
Steve Jobs left Apple and founded NeXT in late 1985 with the intent of developing a 3M computer: 1 MB of memory, 1 million pixels and 1 million instructions per second; or powerful enough to run wet lab simulations.
Jobs bought Pixar in 1986, when they were developing their own computer systems. Luxo Jr. was shown at SIGGRAPH that same year, one part advertisement for their computer and one part fun hobby project, because some of the Pixar guys aspired to one day do a fully computer-animated feature film of their own. This worked out very, very well for them eventually, but they also stopped developing the Pixar Image Computer in 1990, in part because Jobs was losing a lot of money propping up both NeXT and Pixar.
Development of NeXTSTEP began in 1986 under Avie Tevanian, based upon the Mach kernel he had co-developed at Carnegie Mellon with the intention of replacing the kernel in BSD (which at this point, I believe, was still just BSD and years away from fragmentation). NeXTSTEP 0.8 was previewed in October 1988 and all the core pieces were there: the Mach kernel, BSD, DriverKit, AppKit, FoundationKit, the Objective-C runtime, and the NeXTSTEP GUI. 1.0 came in 1989.
IRIX 3.0 was released in 1987, debuting the 4Sight window manager, which isn't too similar to what was released in NeXTSTEP but does use NeWS and IRIS GL; it was, however, based on System V UNIX. It wasn't until Pixar started making movies, I think actually starting with Toy Story, that they bought Silicon Graphics workstations. For Toy Story, the render farm also started off using SGI but eventually moved to Sun computers.
So if anything, IRIX and NeXTSTEP are probably a decent example of convergent evolution given they were both (at least initially) in the business of making high end graphical workstations and neither needed to reinvent the wheel for their target market.
Sure, but given the timeline, it’s unlikely the decision came about simply because he was influenced by “the Pixar guys”. I pointed out that the goal for the first NeXT computers was to be able to do wet lab simulations, and this was due to a conversation Jobs had with Paul Berg while Jobs was still at Apple. They met again after Jobs founded NeXT before drawing up the initial spec in September 1985.
More likely the decision to use Mach/BSD was because Avie Tevanian was the project lead for the operating system.
4Sight also didn’t debut until IRIX 3.0 (1987, also when it picked up the IRIX name), prior to that they used mex which I traced back as far as 1985 and prior to that I’m not sure, but I don’t think they had a window manager and it seems unlikely they would prior to 1985.
It's a far-fetched idea anyway. There's a five-month difference: NeXT in Sep '85, Pixar in Feb '86.
The more likely scenario is that they wanted to come to market as fast as possible with limited resources, so porting the Mach kernel and BSD (both proven, robust things) to their platform was probably the fastest route. It'd also give them an existing base of developers to attract, and it carried some weight if they targeted the workstation market.
edit:
This is what made me think he might have been influenced: Steve Jobs did actually launch another "cube" two years before the NeXTcube, developed in the time before he bought Pixar. It required an attached SGI or Sun workstation: https://en.wikipedia.org/wiki/Pixar_Image_Computer
But Unix workstations were a thing even before then: 68k-based systems were already around in the 1980s, with Sun (taking just one example) releasing their first product in 1982.
I mean, Mach 2 was cutting-edge and freely available from CMU. Probably less a love of UNIX and more the necessity of having a practical base for a new workstation OS.
As a longtime NeXTSTEP user, I still remember the first time I perused the filesystem of Mac OS X 10.0 (Cheetah) in 2001. And I distinctly remember thinking to myself that Apple basically took NeXTSTEP and slapped their own window manager and app framework on it only. The filesystem and toolset were nearly identical, and I even found several config files (I don't remember which ones) that still had NeXTSTEP listed in the comments.
As I moved on to developing iOS apps in the 2010s, NSObject (NS = NeXTSTEP) continued to be a reminder of this same lineage.
> Apple basically took NeXTSTEP and slapped their own window manager and app framework on it only
Yup, that’s precisely it, and Apple made no secret of this. As early as the mid-90s, Apple knew their OS was a dead end and they needed to start over with a new OS as the base. Their in-house effort (Copland) was becoming an obvious failure and it was apparent they needed outside help. At the time a lot of people thought BeOS was going to be it, but the deal fell through when it became apparent that having Steve come back was a priority, and buying NeXT came with him. But it was always gonna be the case that the OS was going to be totally replaced and only surface-level UI conventions would be maintained from classic Mac OS.
I think Steve ended up being a better salesperson than Jean-Louis Gassée. And to be fair, for all its incredible effort, I don't think that BeOS was as mature as NextStep.
As I recall, there were some early UI efforts that essentially copied the Classic MacOS feel (the MacOS 8/Copland look) onto NextStep, but they were dropped in favor of OS X's Aqua design (which took inspiration from the iMac design)
The BeOS guy asked too much from what I recall. I think the BeOS thing fell through before Mr Jobs quietly returned in 1997.
As I recall, BeOS was asking on the order of $80 million, NeXT was acquired for $400 million.
I found this reference, so 80 valuation, Be wanted upwards of 200, “In 1996, Apple Computer decided to abandon Copland, the project to rewrite and modernize the Macintosh operating system. BeOS had many of the features Apple sought, and around Christmas time they offered to buy Be for $120 million, later raising their bid to $200 million. However, despite estimates of Be's total worth at approximately $80 million,[citation needed] Gassée held out for $275 million, and Apple balked. In a surprise move, Apple went on to purchase NeXT, the company their former co-founder Steve Jobs had earlier left Apple to found, for $429 million, with the high price justified by Apple getting Jobs and his NeXT engineers in tow. NeXTSTEP was used as the basis for their new operating system, Mac OS X.”
https://en.m.wikipedia.org/wiki/Jean-Louis_Gass%C3%A9e
“BeOS did not have printing” was the insult thrown around at the time.
I don’t remember the exact number, but BeOS was too incomplete at the time to spend what they were asking, and maybe to purchase at all. There was no way to print documents, which still mattered a lot for a desktop OS in 1996. It needed a lot of work.
Now, in retrospect, Apple had time; Mac OS X wasn’t ready for the mainstream until 2003-2004.
To be fair, printing in 1995-6 was a huge can of worms and hell on earth.
Send PostScript, done. Today it's figuring out which driver will properly rasterize exotic things like ligatures, because we decided that throwing a real CPU in the printer was a mistake.
ISTR those of us using HP or Apple printers were generally in pretty good shape at the time. Can’t vouch for other brands.
I wonder why exactly Copland went off the rails, do we have anyone from the Copland team here on HN who can share their view?
From the Wikipedia page, it was that they just kept adding feature after feature. As an aside, it's very interesting that replacing Classic Mac OS with a microkernel was attempted twice (A/UX: https://en.wikipedia.org/wiki/A/UX, Copland: https://en.wikipedia.org/wiki/Copland_(operating_system)) but only succeeded once they figured out how to ship NeXT with fresh paint and a slightly improved window manager that really was just NeXT, if you compare the two.
I think “slightly improved” and “fresh paint” undersell the work Apple did. Replacing the NeXTstep Display PostScript server wasn’t a small thing!
I’d love to know more about its design too
Apple published a book about it: https://dl.acm.org/doi/10.5555/524838
Yes. Focus is about saying “no.” None of the Copland management or marketing people ever said “no”. So instead of just porting the Mac toolbox to C and a microkernel and shipping that as a first step, it had to have a new filesystem API (so abstract that it will support non-hierarchical filesystems!), and a new OO system for handling text strings, and a new UI toolkit, a new help system, oh, and an OpenDoc-based Finder… A lot of potentially good ideas, but you simply cannot do that all at once.
It wasn’t actually a completely lost cause. They could have shipped it in six months if they’d just simplified the damn thing to a native UI toolbox on top of a microkernel, API additions cut to the bone. (One of the problems with that plan, though, is that product marketing wants all the features you can cram in, so you have to be willing to tell them “no” too.)
Anyway, Gil Amelio and Ellen Hancock also didn’t know how to manage software projects, so instead of fixing what they had, they decided to buy outside tech. Which means we got the iPod, the iPhone, etc. But, in context, they were total dumbfucks; they could have gotten there with what they had.
NS stood for NeXT and Sun. NX was the original prefix before OpenSTEP.
EDIT: I might be wrong about the first part, see my comment below.
This is what I’ve heard. If true it’s funny as it effectively eliminates Sun’s inclusion in the acronym because everyone assumes it’s NextStep.
According to Wikipedia, NS stood for NeXTSTEP: https://en.wikipedia.org/wiki/Foundation_Kit#:~:text=This%20....
I do remember people in my circles thinking it stood for NeXT Software (which was likely a more suitable name for its use at the time).
Uh, that's interesting! Do you have a source for this (or is it firsthand knowledge)?
I first read that in a Fabien Sanglard book, probably Game Engine Black Book: DOOM?
I have an original "NeXTstep Concepts" manual with the NX prefix. I can't remember where I heard the "NeXT and Sun" explanation for the first time, but a Google search shows that the prefix change could've actually occurred before the collaboration with Sun. It could stand for NeXT Software, as it's closer to the transition to a software company.
The "NS" prefix was introduced with OpenStep in 1994[1]. OpenStep was a collaboration between NeXT and Sun[2]. So the S in "NS" referring to Sun is certainly plausible!
[1] https://developer.apple.com/library/archive/documentation/Co...
[2] https://en.wikipedia.org/wiki/OpenStep
A tangent I know, but looking at those old screenshots really made me miss that era of OS X. The first versions of Aqua with pinstripes were a bit busy for my liking, but by the Mountain Lion time frame it was just lovely. Actual buttons! Soft gradients! Icons that had colour!
I am still very sad that the point we started getting high-DPI displays everywhere was about the same time we decided to throw away rich icons and detail in UIs in favour of abstract line art and white-on-white windows.
Maybe it was on purpose? Those fancy textures and icons are probably a lot more expensive to produce when they have to look good with 4x the pixels.
iOS 4 on an iPhone 4 and OS X whatever-it-was that was on the initial retina MacBook Pros are still very clear in my memory. Everything looked so good it made you want to use the device just for the hell of it.
It’s because the higher the resolution, the worse those kinds of design effects look. It’s why they’re not much used in print design and look quite tacky when they are.
At low resolutions you need quite heavy-handed effects to provide enough contrast between elements, but on better displays you can be much more subtle.
It’s also why fonts like Verdana, which were designed to be legible on low resolution displays, don’t look great in print and aren’t used much on retina interfaces.
> the point we started getting high-DPI displays everywhere was about the same time we decided to throw away rich icons and detail in UIs in favour of abstract line art and white-on-white windows.
I might have an alternative explanation.
I often think about something I saw, a long time ago, in one of those print magazines about house decoration, which also featured sample house blueprints. That particular issue had a blueprint for a house to be built on a plot which already had a large boulder. Instead of removing the boulder, the house was built around it; it became part of the house and guided its layout.
In the same way, the restrictions we had back then (lower display resolutions, reduced color palette, pointing device optional) helped guide the UI design. Once these restrictions were lifted, we lost that guidance.
> Maybe it was on purpose? Those fancy textures and icons are probably a lot more expensive to produce when they have to look good with 4x the pixels.
That's an interesting observation. If it was indeed on purpose, I wonder whether they were weighing the effort for Apple's designers/developers (and battery usage) against the effort it would have demanded from 3rd-party developers.
Long live Snow Leopard! It made my mac fly. A whole release dedicated to making Leopard better. It was amazing, peak macOS.
100% agree; if I could revive it to run it on modern arm hardware I would in a heartbeat.
I run an iMac G4 with 10.5 as a home music player. The strange thing is that it feels so easy to use. All the ingredients are the same in modern macOS but the feel is very different.
It’s hard to say why. Clarity in the UI is a big one (placement and interaction, not the theme, ie what we’d call UX today). But the look of the UI (colour, depth) really adds something too. Seeing a blue gel button sparks a sense of joy.
> by the Mountain Lion time frame it was just lovely. Actual buttons! Soft gradients! Icons that had colour!
You may be thinking of Tiger, because Apple already started removing color from Finder icons and such in Leopard.
Leopard also introduced a transparent menu bar and 3D Dock.
flat UI was the triumph of mediocre repeatability over humane user interaction
If only theming were available to recreate those old looks and styles.
Copland, the failed OS that NeXT was acquired to replace, had themes.
https://lowendmac.com/2005/apples-copland-project
You also had Kaleidoscope[0]. That had some crazy themes[1].
[0] https://www.macintoshrepository.org/1706-kaleidoscope
[1] https://web.archive.org/web/20191021204432/https://twitter.c...
Mac OS 8.5 and above technically had theming support as well (presumably salvaged from Copland), but Apple removed the themes from the final version of 8.5 and never released any of them. I'm not sure if many 3rd-party ones were made either; as another commenter notes, Kaleidoscope was already fairly established as the theming tool for classic Mac OS, and worked with older stuff like System 7.
For me, seeing old OSes always brings back the bad stuff: slow CPUs, slow networking, slow disks, limited functionality.
Maybe I'm a bit too negative but for example when people romanticise stuff from the middle ages I can't help but think of how it must have smelled.
Those who romanticize the past tend to highlight the best points and gloss over the low points, which is likely better than dismissing it altogether.
It's also worth noting that some of the points mentioned either didn't matter as much, or aren't true in an absolute sense. Slow networking wasn't as much of an issue since computers as a whole didn't have the capacity to handle huge amounts of data, while limited functionality depends upon the software being used. On the last point, I find a lot of modern consumer applications far more limiting than older ones.
To me, Finder often seems slower now, with SSDs and Apple silicon, than it was with spinning drives and PPC. And the Mac boots slower!!
Apple's software today is poorly optimized. They're depending on hardware to do all the work.
It's amazing that still today you find NSStrings and NS-prefixed stuff all over working code.
It's actually hard not to know anything about the old AppKit, as much as Apple would have you believe that it's all SwiftUI now.
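You really can't avoid it; even a few lines of modern Swift surface the lineage. A trivial sketch (the values are made up, nothing authoritative):

    import Foundation

    // NS-prefixed classes from the NeXTSTEP/OpenStep Foundation Kit,
    // still reachable from modern Swift via the Objective-C bridge.
    let legacy: NSString = "Hello from the Foundation Kit"
    let attrs = NSMutableDictionary()
    attrs["origin"] = "NeXTSTEP"

    print(legacy.length)            // NSString API, not Swift's String
    print(attrs["origin"] ?? "n/a")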
Apple is pretty clear in its intention of making SwiftUI the blessed UI toolkit; however, they haven't deprecated AppKit or UIKit in any way and keep updating them, as they demonstrate at every WWDC with "what's new in AppKit" (e.g. for 2024: https://youtube.com/watch?v=McKYDogUICg).
They also provide means to mix and match AppKit and SwiftUI in both directions, as sketched below. In no way are they trying to "have you believe it's all SwiftUI now". It is simply the next generation of UI frameworks.
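The two bridges in question look roughly like this (a minimal sketch; `LegacyClockView` and the strings are made-up names):

    import SwiftUI
    import AppKit

    // SwiftUI hosted inside AppKit: wrap any SwiftUI view in an NSHostingController.
    func makeHostedController() -> NSViewController {
        NSHostingController(rootView: Text("SwiftUI in an AppKit window"))
    }

    // AppKit hosted inside SwiftUI: wrap an NSView in an NSViewRepresentable.
    struct LegacyClockView: NSViewRepresentable {
        func makeNSView(context: Context) -> NSTextField {
            NSTextField(labelWithString: "AppKit inside SwiftUI")
        }
        func updateNSView(_ nsView: NSTextField, context: Context) {}
    }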
I feel that SwiftUI still has a ways to go. I like the ideas and philosophy, but the execution is still a work in progress.
First, they really need to beef up the docs.
Next, they need to stop punishing people for "leaving the lane." Not everyone wants their app to look and behave like a bundled Apple app.
I dislike Autolayout, and UIKit definitely has a lot of "old school" flavor, but with IB and UIKit, I can make an app that can do just about anything that I want.
I love autolayout. I've never felt it's been so easy to learn how to lay stuff out. InterfaceBuilder felt so confusing and arbitrary, like having to learn an entirely new language just to get basic behavior working correctly. Plus, it didn't compile to code but to a "nib" file you had to work with in abstruse and unintuitive ways. At least I can debug code; how the hell can I debug a nib? Most of the exposed functionality was difficult to google and assumed you knew what all the obscure icons (like springs and arrows and lines) meant. Very confusing and frustrating.
Meanwhile autolayout was very intuitive. Define your variables, define your constraints, and it just magically works (see the sketch below).
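For anyone who hasn't tried it, that comes out to something like this with UIKit's anchor API (a minimal sketch, not production code):

    import UIKit

    class ExampleViewController: UIViewController {
        override func viewDidLoad() {
            super.viewDidLoad()

            let label = UILabel()
            label.text = "Hello, Auto Layout"
            label.translatesAutoresizingMaskIntoConstraints = false  // opt in to constraints
            view.addSubview(label)

            // Define the constraints; the layout engine solves for the frames.
            NSLayoutConstraint.activate([
                label.centerXAnchor.constraint(equalTo: view.centerXAnchor),
                label.topAnchor.constraint(equalTo: view.safeAreaLayoutGuide.topAnchor,
                                           constant: 16),
            ])
        }
    }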
I generally use storyboards, but I do need to occasionally do programmatic stuff. That's a pain, but I can make it work.
> First, they really need to beef up the docs.
This is a common refrain I hear w.r.t. Apple, and while I write very little native mobile code, I have to agree. It's so sparse and rarely tells me more than I could find out by hovering over a function/variable in my IDE.
Would it kill them to add a paragraph explaining why or how you use the thing? To add code samples? Must be too much for one of the most valuable companies in the world… What kills me is that this is actively hurting their own platforms. I can understand some of Apple’s moves, but this one is so incredibly short-sighted.
Those of us "of a certain age" know that Apple used to have the best documentation in the industry. Many of the folks who wrote the docs had more credentials than the ones writing the code.
> What kills me is that this is actively hurting their own platforms
I don't think they care. Any useful application will be re-created by Apple and bundled with their OS eventually anyway. Outside devs aren't really necessary for anything but games.
I’m achieving very custom looks in SwiftUI. What exactly do you find punishing?
Here's an example from a couple of weeks ago:
I wrote a small SwiftUI app, to display simple bar charts of data from our social app. Number of active users, vs. ones that haven't signed in, user acceptance/rejection rates, etc.
The SwiftUI Charts library is pretty good for this. I consume a CSV file into a dataframe, and add some extra computed properties, etc.
I wanted to add the ability to pinch to zoom, so users don't need to look at the entire dataset, and try to find individual days, etc.
I banged my head for a couple of days, getting it working the way I needed (TL;DR, I did. Just took a while). I can add a magnification gesture adornment, but the docs for it suck, the docs for the charts suck, the docs for the viewbuilder suck, the docs for the view suck, and even the docs for the dataframe suck. I basically had to find out what I needed by trial and error, and by examining the exposed protocols. Some of the stuff is crazy easy, like translating the pinch to a usable number (a rough sketch of that part follows this comment), but then I wanted to add a bit of custom handling for the edges, and to allow the user to pan the chart while also being able to individually inspect bars (I ended up giving up on that, and have an ugly scrubber scrollbar under the chart). It also doesn't work on the Mac, which sucks, because I do a lot of admin stuff on the Mac.
It's a long story, and I'm sure that you would find fault with my work (one of the things geeks love to do, is point out errors made by other geeks), but it shipped, it works, and we've been using it for days.
And, in the end, even though it exhibits nonstandard behavior, it looks exactly like a Settings App panel. That's fine. It's a backend dashboard, but I'd be upset if it was something we needed to expose to end-users of the app.
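For anyone curious, the pinch-to-zoom idea described above comes out roughly like this (a sketch only, assuming iOS 17-era Swift Charts; the data type and the 7-365 day clamp are invented for illustration):

    import SwiftUI
    import Charts

    struct DailyCount: Identifiable {
        let id = UUID()
        let day: Date
        let count: Int
    }

    struct ActiveUsersChart: View {
        let data: [DailyCount]
        @State private var visibleDays: Double = 30   // zoom level = days on screen

        var body: some View {
            Chart(data) { item in
                BarMark(x: .value("Day", item.day, unit: .day),
                        y: .value("Active users", item.count))
            }
            .chartScrollableAxes(.horizontal)                   // pan along the x axis
            .chartXVisibleDomain(length: visibleDays * 86_400)  // domain length in seconds
            .gesture(MagnificationGesture().onChanged { scale in
                // Pinching out shrinks the visible window (zooms in); clamp the range.
                visibleDays = min(365, max(7, 30 / Double(scale)))
            })
        }
    }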
> Not everyone wants their app to look and behave like a bundled Apple app.
As (now just) a user of macOS, that’s EXACTLY what I want - consistency, not the output of some branding or UX person trying to mark their territory.
As with everything, "it depends."
It's entirely possible to have unique branding, yet maintain consistency with standard Apple UX.
Just takes flexibility and compromise. Many designers aren't so good at that.
On a related note, did you ever try out Kai's Power Tools[0]? Now that was a nonstandard UX. Some folks really loved it, but it was an acquired taste.
[0] https://mprove.de/script/99/kai/2Software.html
I remember the Unix-ness was a big part of OS X’s nerd popularity. People were talking about real Unix terminals, for example.
Later, Windows also aimed for the same thing with their new console app and Linux support. Yet macOS has remained the same. The Terminal app feels essentially unchanged, and there's no good app package service (e.g. brew and the like are third-party and can mess up your system).
Even Xcode is, well… look how extensions were restricted.
Modern macOS feels boring, but also not aimed at developers.
Back in the early 2000s it was a top choice if you wanted some kind of unixy box with a polished GUI desktop that "just worked", especially if you wanted a laptop. BSD and Linux were fine, but as desktop OSes they were a very different experience from today; they took way more tinkering, even on a desktop PC, as anyone who had to write their own X11 config will tell you. Today, installing a Linux desktop distro is so easy and hardware compatibility is so good that the tables have turned. Also, if you're the type of user who wants a big DE (no judgement), the Linux DEs today are far more polished; people still complain, but if you go back in time it was a mess. These days macOS seems extremely restrictive and awkward by comparison: a huge chunk of the userland got stuck in time, while Apple has become more and more hostile to any kind of changes and customisations on the more unixy side of the system.
Sun had an agreement with Toshiba for Solaris laptops, but they were rather pricey.
UNIX is stuck in time; hardly anything has improved beyond file systems and small API additions, and that is what macOS is measured against: POSIX certification.
Note that the only standard UNIX UI is CDE, and anything 3D isn't part of POSIX.
ZFS, BcacheFS, HammerFS... I think OpenBSD will have a better FS soon.
On modern filesystems, the plan9/9front ones are pretty much ahead of almost anything; but plan9 is a Unix 2.0. It went further. On 3D, forget POSIX: OpenGL was the de facto API, now Vulkan, and the most common middleware multimedia API is SDL2.
Yeah, but none of that is UNIX(tm).
While IrisGL was born on IRIX, it was placed under ARB stewardship, which after the Longs Peak disaster became Khronos.
Vulkan only exists thanks to AMD offering Mantle to Khronos, an API designed originally for game consoles, very much not UNIX, and had it not been for AMD, Khronos would still be thinking what OpenGL vNext was supposed to look like.
SDL also has very little to do with UNIX history, as it was created originally to port games from Windows to Mac OS (not OS X) and BeOS.
> The Terminal app feels essentially unchanged
Is this supposed to be a bad thing?! It's a rock-solid workhorse. If they changed it I would stop trusting macOS to be any better than the linux flavor of the month
> I remember the Unix-ness was a big part of OS X’s nerd popularity.
* https://www.gocomics.com/foxtrot/2002/02/25
> there’s no good app package service
It's called the App Store.
Not exactly...
The App Store installs what you would otherwise install through a .dmg or .pkg. That is, if you install, for example, Android Studio, Docker, and UTM, you will have three QEMU executables, one for each app.
Homebrew does quite a good job as a package manager for the Mac; however, it's far from how package managers work in Linux distros. For example, by running ``sudo pacman -Syu`` I upgrade everything that is installed, including the kernel, standard libraries, Python packages, language packages, manpages and so on. On a Mac, I have to upgrade the system through system updates, Homebrew packages through ``brew upgrade``, Python packages through pip, the App Store stuff through the App Store, and manually installed apps through whatever way they are upgraded.
> For example, by running ``sudo pacman -Syu`` I upgrade everything that is installed, including the kernel, standard libraries, Python packages, language packages, manpages and so on.
I actually view this as a liability. System package management should be distinct from userspace package management.
Agreed. Joining system and user packages leads to cases like "how did my python upgrade break yum".
It won't happen if the packages are well maintained. Note that if you want a different version for your project, you should use pyenv or similar and have finer control over its dependencies.
You can still use pip inside pyenv or similar. Pacman would install the system-wide stuff, so you won't need to worry about the libraries that a package you have installed is using.
Mentioning this classic XKCD: https://xkcd.com/1987/. It only made sense to me after using a Mac. While using Manjaro everything was quite organized: only the pacman-installed libraries and the ones in my user projects. Now on the Mac I have the default Python, the several Python versions installed as dependencies of Homebrew packages, and so on.
Do developers use the app store? 99% of what I install on my computer isn't available through the app store. I just use it for Apple apps (Pages etc). Pretty much everything else is freely available and more fully featured outside the app store.
Plus, it's spammed with low-quality for-profit crapware—the iOSification of an otherwise fantastic platform
Yes, they publish their apps on the App Store and make money from customers.
As for Apple making life easier for developers by making their OS more like Linux, that is not good for the rest of their users, and these users are more important than developers. It's preferable that developers jump through some hoops, rather than making the OS worse for non-developers.
There's the third option of "not enshittifying the os but also not making it more like linux"
When's the last time you used the App Store to install a CLI utility or a shared library?
I don’t think you can call the Mac App Store “good”.
Sure, it's horrible. It only makes it easier than any other platform for developers to make money by selling their software to users. And it only makes it easy and secure for users to purchase and install software. Awful.
I just wish it would be more widely available.
Anyone using GNUstep successfully?
Also the influence of WebObjects has been unappreciated.
EOF was probably the first ORM and Direct To WS the first web-based no-code tool.
Absolutely. WO was a brilliantly designed framework (especially for the time) and, being somewhat disillusioned with the state of web development in the last decade, I'm still using it as the UI layer for some of my own applications. It just can't be beat when it comes to throwing together a quick app, essentially being AppKit for the web. And as you say, its influence was great, although I often wish it had a little more influence.
EOF was a great ORM framework as well and I never really understood ORM hate - until I had to use ORM frameworks other than EOF which generally feel … not that great. I ditched EOF a decade back though, due to it being, well, dead, and replaced it with Cayenne which is an excellent, actively developed ORM that feels very much inspired by EOF's design principles.
In the last few years, I've been working on a WO inspired framework (to the point of almost being a WO clone on the component/templating side) as a side project. It's still very raw when seen from the outside, no documentation and still operating under a bad codename - but hoping to make a release and port my remaining WO apps in the coming year. Hopefully it will add at least a bit to WO's influence on the web development world :).
https://github.com/ngobjects/ng-objects
https://www.youtube.com/watch?v=-obvt93wSFc
Especially hilarious when you think of the rising popularity of HTMX.
WebObjects' at-the-time revolutionary model of using the URL for state management would work really well with the new trend back towards server-side rendered components.
Totally. I've been very happy to see the world embrace htmx in the last year and it's given me confidence knowing I'm doing the right thing with ng-objects.
The methodology htmx uses is in many ways identical to what we've been doing in the WO world for almost 20 years using Ajax.framework (which I don't know if you're familiar with), a WO plugin framework that most importantly adds "partial page updates". So you can wrap a part of a page/component in a container element, and target it so only that element gets rendered/replaced on the client side when an action is invoked (link clicked, form submitted etc.).
And yes, combined with WO's stateful server side rendering and URLs, it's ridiculously powerful. I usually design my WO apps so users never actually see a stateful URL; they always land on "static URLs" while stateful intra-page work happens through page replacements. I love it.
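To give a rough idea of what that looks like, here's a minimal sketch of a stateful WO component action. The class, field, and method names are hypothetical, and the template bindings are omitted:

    import com.webobjects.appserver.WOComponent;
    import com.webobjects.appserver.WOContext;

    // Hypothetical page component; `comment` would be bound to a
    // WOTextField in the component's template.
    public class GuestbookPage extends WOComponent {
        public String comment;

        public GuestbookPage(WOContext context) {
            super(context);
        }

        // Action bound to a submit button in the template. Returning
        // null re-renders this same stateful component; with
        // Ajax.framework's partial page updates, only the wrapped
        // container element gets re-rendered and swapped in client-side.
        public WOComponent submitComment() {
            // ... persist `comment` server-side ...
            comment = null;
            return null;
        }
    }

Since the component instance lives on the server between requests, the action method just mutates its state; there's no client-side state management at all, which is exactly the property htmx-style development is rediscovering.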
It is basically a whole generation rediscovering what we were doing in the 2000s, now that the SPA craziness has gone too far.
It also influenced the design of Distributed Objects Everywhere at Sun with OpenStep, which eventually got rewritten into what became Java EE.
Anyone familiar with Java EE will find a familiar home in WebObjects, especially the Java rewrite.
Around 2010, I started learning Objective-C to be part of the whole native mobile development movement. What I didn’t know when getting into this was how much of a history lesson I would have to participate in to understand the background behind so many aspects of the language and the core frameworks.
Yes, I had a similar experience with Objective-C. While I found it generally odd, it makes complete sense as a C/C++ alternative with reflection capabilities and a lean event loop. I disliked the memory management. The language hasn’t aged well but that doesn’t mean it lacked clever ideas for its time.
I miss that era!
I remember seeing the Finder running on NeXT at a Halloween party at the Omni Group in 1999. That was a cool experience.
> Along with analysis and debugging tools, Apple still gives away everything needed to build apps for the Mac, iPhone, or iPad.
Very conveniently glossing over the fact that developers still have to pay an annual Apple Developer Program subscription fee in order to be able to distribute their apps. TANSTAAFL, as always.
Very conveniently glossing over the fact that if you are developing for the Mac, no, you don't. You can distribute it outside the store without paying anything.
iOS, yep you're right.
If you choose not to pay Apple for the privilege of macOS development, you will need to teach users increasingly arcane tricks to get the app running. As of the latest macOS release, the old trick of "right click -> open" stopped working, and the new trick is "open -> go to System Settings and click on a magic button -> open again".
You don't pay Apple for the privilege of development, you pay them for the privilege of guaranteeing your users you are a legit developer who cares about their safety by registering and letting your app be reviewed.
Considering it would take less than a day for Apple's registration scheme to be overrun with billions of fake app builders if they didn't put in a small monetary roadblock, I don't see how this situation could be improved.
This has little bearing on desktop software, which usually doesn't go through the App Store. Apple does not (yet?) require review for traditionally distributed desktop app bundles or executable binaries. The developer fee is paid in that case just to get a signing certificate. The increasing number of hoops necessary to get unsigned things to run seems to just be funneling more developers into paying up and becoming beholden to Apple so they can stop the nagging of their users.
I think GP's point still stands for signing certificates. The need to pay something increases the barrier to entry. You can't just create a million developer accounts to get a million signing certificates after Apple bans one of them.
I think this is fine. If you're a business, the developer fee is not a significant expense, and it makes the whole ecosystem work smoothly. If you're a hobbyist, student, open source developer, or otherwise in a position where you won't make that money back quickly, macOS provides a workaround for opening unsigned apps. This is so different from the terrible situation on iOS.
Until the mid-2010s, most apps were unverified and people trusted the distribution channels where they got them from.
iOS can sideload. Is that not allowed in the development license?
Not just to distribute: you have to pay even to run them locally on your own devices for longer than a few days.
It surprised me that Steve Jobs would be so open to Unix.
I thought that, with his not-invented-here syndrome, his desire to control everything, and his attraction to simplicity and graphical UIs, he would have hated Unix.
How did he come to love Unix enough to build NeXTSTEP on it?
Steve Jobs was very open about taking things from elsewhere and refining them for consumption.
Lisa and Mac were products of his seeing the Smalltalk GUI at his visit to PARC. There was nothing off-the-shelf, so they had to be built from scratch.
Of NeXT he said that he had been so bamboozled by the GUI on his PARC visit that he missed the other two, arguably more important, concepts: OO and networking.
NeXT used as many off-the-shelf components as possible: Ethernet + TCP/IP for the network, Unix for the OS, Adobe's Display PostScript for graphics, Stepstone's Objective-C for the OO parts (which in turn mashed together C and Smalltalk). It bundled TeX, Sybase SQL Server, a bunch of scripting languages, Webster's dictionary, etc.
They built only what they absolutely had to in order to get the machine and user experience they wanted.
> Steve Jobs was very open about taking things from elsewhere and refining them for consumption.
See also: forking KHTML into WebKit to build Safari when MS cancelled Internet Explorer for macOS and the platform was left without a robust browser choice. Notable for two reasons: that they were somewhat comfortable letting MSIE reign for so long rather than making an in-house option, and that they didn't start over from scratch when they finally did.
It’s funny that Apple originally wanted Mozilla (proto-Firefox) but couldn’t figure out how to build it on Mac OS X in a reasonable amount of time.
> couldn’t figure out how to build it
Well, no. They evaluated the existing choices and decided that KDE's code was a better fit.
> Melton explained in an e-mail to KDE developers that KHTML and KJS allowed easier development than other available technologies by virtue of being small (fewer than 140,000 lines of code), cleanly designed and standards-compliant.
https://www.wikipedia.org/wiki/WebKit
According to Ken Kocienda's book (he was one of the original developers of Safari), that email is a post-hoc rationalization. The "evaluation" was literally him and another guy trying to build Mozilla for weeks and failing, and then someone else joining the team and quickly building Konqueror instead.
If the people evaluating your code can't get it to build, I'd say that's a good sign that your code isn't ready to be taken up by others.
I’m pretty sure your history is off here. There was a 5 year agreement around 1998 to keep Office available for the Mac, and to make IE the default (but not only bundled) browser available.
Safari was shipped almost exactly at the end of that agreement, and the announcement as to IE for Mac being discontinued was 6 months later.
And WebKit eventually birthed Chromium. Truly the circle of life.
He wasn't; his position regarding UNIX beards was well known.
Supporting UNIX was a business opportunity to go up against Sun and the other graphical workstation vendors.
There are recordings of NeXT meetings, and his famous appearance at USENIX, regarding this.
Note that everything that matters on NeXTSTEP is based on Objective-C and the framework kits, with zero POSIX beyond what was needed for those government and graphics workstation contracts.
Maybe he got influenced by Pixar guys: https://www.youtube.com/watch?v=iQKm7ifJpVE
Even though IRIX had its quirks.
I'm not sure the timeline adds up for that - maybe NeXT came before he bought Pixar?
Steve Jobs left Apple and founded NeXT in late 1985 with the intent of developing a 3M computer: 1 MB of memory, 1 million pixels and 1 million instructions per second; or powerful enough to run wet lab simulations.
Jobs bought Pixar in 1986, back when they were still developing their own computer systems. Luxo Jr. was shown at SIGGRAPH that same year, one part advertisement for their computer and one part fun hobby project, because some of the Pixar guys aspired to one day do a fully computer-animated full-length feature film of their own. This worked out very, very well for them eventually, but they also stopped developing the Pixar Image Computer in 1990, in part because Jobs was losing a lot of money propping up both NeXT and Pixar.
Development of NeXTSTEP began in 1986 under Avie Tevanian, based upon the Mach kernel he had co-developed at Carnegie Mellon, which was built with the intention of replacing the kernel in BSD (which at this point, I believe, was still just BSD and years away from fragmentation). NeXTSTEP 0.8 was previewed in October 1988, and all the core pieces were there: the Mach kernel, BSD, DriverKit, AppKit, FoundationKit, the Objective-C runtime, and the NeXTSTEP GUI. 1.0 came in 1989.
IRIX 3.0 was released in 1987, debuting the 4Sight window manager, which isn't too similar to what was released in NeXTSTEP but does use NeWS and IRIS GL; it was, however, based on System V UNIX. It wasn't until Pixar started making movies, I think actually starting with Toy Story, that they bought Silicon Graphics workstations. For Toy Story, the render farm also started off using SGI but eventually moved to Sun computers.
So if anything, IRIX and NeXTSTEP are probably a decent example of convergent evolution given they were both (at least initially) in the business of making high end graphical workstations and neither needed to reinvent the wheel for their target market.
SGI use within Lucasfilm (and thus Pixar) goes way back to the IRIS 1000/2000 era, so definitely 1983/84 AFAIK.
Sure, but given the timeline, it’s unlikely the decision came about simply because he was influenced by “the Pixar guys”. I pointed out that the goal for the first NeXT computers was to be able to do wet lab simulations, and this was due to a conversation Jobs had with Paul Berg while Jobs was still at Apple. They met again after Jobs founded NeXT before drawing up the initial spec in September 1985.
More likely the decision to use Mach/BSD was because Avie Tevanian was the project lead for the operating system.
4Sight also didn't debut until IRIX 3.0 (1987, which is also when it picked up the IRIX name); before that they used mex, which I traced back as far as 1985. Earlier than that I'm not sure, but I don't think they had a window manager, and it seems unlikely they would have prior to 1985.
yeah, makes sense.
it's a far-fetched idea anyway. It's a five-month difference: NeXT in Sep '85, and Pixar in Feb '86.
The more likely scenario is that they wanted to come to market as fast as possible with limited resources, so porting the Mach kernel and BSD (both proven, robust things) to their platform was probably the fastest route; it'd also come with an existing base of developers to attract, and it carried some weight if they were targeting the workstation market.
edit: this is what made me think he might have been influenced: Steve Jobs did actually launch another "cube" two years before the NeXTcube, and it was developed in the time before he bought Pixar. The thing required an SGI or Sun to be attached: https://en.wikipedia.org/wiki/Pixar_Image_Computer
> I'm not sure the timeline adds up for that - maybe Next cam before he bought Pixar?
Jobs became majority stakeholder of Pixar in 1986. NeXT was incorporated in 1988.
* https://en.wikipedia.org/wiki/Pixar#Independent_company_(198...
* https://en.wikipedia.org/wiki/NeXT_Computer
But Unix workstations were a thing even before then: 68k-based systems were already around in the 1980s, with Sun (taking just one example) releasing their first product in 1982:
* https://en.wikipedia.org/wiki/Sun-1
IRIX-based systems on MIPS were big in the 1990s (post-Jurassic Park), but SGI also started with 68k-based systems.
They launched the NeXTcube in 1988, but they incorporated in September 1985.
I mean, Mach 2 was cutting-edge and freely available from CMU. Probably less a love of UNIX and more the necessity of having a practical base for a new workstation OS.