Maes

Are computing platforms all about the OS now?

A computing platform is just its OS?  

28 members have voted

  1. A computing platform is just its OS?

    • Yeah. Go back to the 80s if you don't like it, you dinosaur!
      13
    • By now it is, but it really shouldn't be.
      8
    • Hell no. Amigas didn't kick IBM PC butt by using Linux!
      6
    • The hell I care, I use a game console.
      1


Recommended Posts

In the olden days, you would have e.g. Spectrum vs C64 vs Amstrad, then Atari ST vs Amiga and eventually IBM PCs. Then there was always Mac vs IBM PC (funny thing how Mac fans generally avoided direct comparisons with the Amigas and Atari STs, which used similar hardware and were also aimed at graphics, music and TV production).

In any case, these were well defined hardware platforms, with their own hardware, CPUs, custom chipsets, and of course OS, games and applications. It was the combination of all these (with a particular weight on the hardware or lack thereof) that made each platform special. Sure, the Amiga OS was wonderful and all and kicked MacOS in the butt any day, but the Amiga would be just another 68000 platform (and there were plenty in the day) if it wasn't for its custom chips.

Plus, the OS you used was largely defined and limited by the platform. With 8-bit micros you sometimes had a choice between the built-in BASIC, CP/M or at most GEOS. Macs, Atari ST and Amiga were limited to their own OSes and maybe some version of Unix, on some later revisions. IBM PCs could pick between various "DOSes" and maaaaaaybe by stretching it a lot, UNIX (for whatever reason you'd even install that on a 386SX with ultra-expensive RAM, by Golly!).

But today... I keep reading e.g. "Windows vs Linux vs Mac" or "Debian vs Windows", or "PC or Mac? The answer: Ubuntu"... wtf?! Since when can a debate between two hardware platforms be settled by the choice of OS (which would also imply using an IBM PC compatible...)?

Granted, Macs now have the same hardware as IBM PCs and can even run Windows XP, and they even announced a new AmigaOS... for IBM PC compatibles, but still, wtf?! Can you call an IBM PC running something called "AmigaOS" an Amiga? Or run something called "MacOS" on it and call the result "a Mac"?

I mean, what happened? Doesn't the hardware matter anymore? Is a "platform" only defined by the OS you use? What happened to the "IBM PC vs Macintosh" debate of the old days? Different CPUs, different hardware, different everything (OK, how Apple drastically changed their hardware architecture over the years to make their machines more and more similar to IBM PCs is another matter). Please discuss.


Why not?

I mean, if they can make computers use similar hardware, so that the only architectural difference software developers have to take into account is the OS, isn't that just better? Also, unifying the hardware lowers its price: when sales volumes are bigger (since everyone uses the same stuff), the components can be sold for less. If we had ten different systems with incompatible hardware and evenly distributed users, each system would sell one tenth of the hardware it does today, and that would definitely show up in the prices.

I'm all for standardizing hardware, and if they could even get around to standardizing native OS APIs that would be better still, but I know that will never happen.


I think it really depends on the market segment. In the world of consumer and business desktops, yes, it's all about the OS these days. But in other areas, the hardware still makes a difference. While I would say that the OS is starting to matter more in servers, what about mainframes? You don't buy a dual quad-core Xeon Dell server to act as a mainframe, because that's just silly. You buy a mainframe with special mainframe hardware to increase throughput. Mainframes are not servers.

It's the same thing in the embedded market. I wouldn't buy a Core 2 Duo to put inside my latest cell phone. I'd look at something else, like an ARM, VIA, or Atom.


I think the unification of hardware and architecture is what is making the market more and more exciting. When Apple decided to go Intel with their new Macs, it was the best move they could have made from both a performance and a compatibility standpoint. It means less time for software developers to port software from one OS to another (assuming the architecture is unified).

Maes said:

In the olden days, you would have e.g. Spectrum vs C64 vs Amstrad, then Atari ST vs Amiga and eventually IBM PCs. Then there was always Mac vs IBM PC (funny thing how Mac fans generally avoided direct comparisons with the Amigas and Atari STs, which used similar hardware and were also aimed at graphics, music and TV production).

In the olden days, you had several manufacturers that had proprietary hardware and OS. Then IBM decided to allow people to make computers compatible with its PC standard. (For a long time, PCs were only IBM's computers, and others were called "clones" or "compatibles" instead...)

Increased diversity and competition followed. The Mac managed to survive because it was widely used in some industries (especially imaging and other graphics work). While Atari ST computers were popular in the music world, they died anyway once PCs got sound cards. Commodore Amigas and Amstrad CPCs died because they couldn't compete with PCs on the "computer stuff" front or with consoles on the "cheap game platform" front.

Really, the only surprise is that Apple managed to survive. The PC is the perfect illustration that an open standard works better than a closed one.


OK, there were "similar enough" hardware platforms in the past too, the most notable example being the Commodore Amiga's ability to perfectly emulate any 680x0 Mac (to the point that, before the switch to PowerPC, the fastest Mac in the world was one emulated on an Amiga with a 68060 CPU!).

But still, there was enough diversity in areas such as memory management, graphics, sound etc. to really set each platform apart from the others; they were not just a bunch of more or less similar computers achieving the same results with different hardware, nor "supersets" of existing architectures (with the exception of the IBM PC clones, which took the superset concept to the extreme). E.g. you wanted to do games and TV production? The Amiga was ready for you. You really couldn't live without that integrated MIDI port? Pick the Atari ST, and so on.

I am wondering because it now makes even less sense to listen to e.g. people preaching about how wonderful Macs are, how things "just work" with them, how much better application X is on the Mac than on the PC, etc. WTF, it's just an IBM PC clone running a different OS; there's nothing inherent *in the hardware* anymore that would make a Mac better or worse than a PC.


Plus, that "leveling" of hardware differences seems to be concentrated only in the consumer desktop market (and, by extension, to Servers, unless you still use Alpha or Sun...) and to the only two (vaguely) distinct hardware platforms still existing.

To find a drastically different hardware configuration, one must look at game consoles (with the exception of the XBox...which could also be dragged into the subject).

I dunno, this leveling of hardware differences makes it seem stupid to keep treating everything as a separate platform, to keep paying for separately developed software, etc.


Well, now that modern computer hardware is a "superset" of all former hardware, I don't know what sort of specialized hardware would even have a market. You could produce computers with more video, graphics or audio processing power than the typical personal computer, but in the personal computer market a person who wants more power for one thing will usually want more power for everything, so the easiest way to sell a computer to these people is to give them the same base hardware as everyone else plus an array of expensive cards to complement it. When consumers want their computers to do everything, hardware designed for a certain niche would be seen as weak in every other area instead of strong in its specialty.

Also, keep in mind that "Windows vs. Mac" is the only question the typical computer shopper can understand. You may not realize it, after hanging out for so long with the computer geeks known as Doom players, but it takes a fairly advanced grasp of computer science to understand what makes one CPU better than another. Honestly, I don't have the first clue what makes CPU X better than CPU Y, even though I know a bit more about computers than most.


My apologies; I've edited out my Palin-esque turn of phrase. I don't know what convinced me to use it, knowing full well that it wouldn't appeal to my fellow Doomworlders - I guess I'm just a maverick.


What happened? The hardware platform matters less.

Not that there aren't key advantages of one hardware platform over another (if you want performance and reliability, you'll be picking UltraSPARC over amd64 any day), but many operating systems now span multitudes of hardware. Windows may be stuck on i386/amd64, but even Mac OS X runs on PowerPC, i386/amd64, and even ARM (the iPhone). You can run Debian and OpenBSD on i386, amd64, PowerPC, SPARC, Alpha, etc. with relatively little user-interface change; all the same programs run on all the platforms, so the hardware matters less to your workstation or desktop user. Hey, both of them even still support m68k, and OpenBSD still supports VAX, though you'll probably want something more powerful than those machines if you're a desktop user (web, DNS, and mail servers will run fine on them) :)
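To make the portability point concrete, here is a minimal C sketch (my illustration, not from the post above): the same source file compiles unchanged on all of those architectures, and only the compiler's predefined macros reveal which one it was built for. The macro names below are GCC's common spellings; exactly which ones exist depends on your compiler and target.

    /* arch.c - one source file, many architectures.
       The macros are GCC's usual predefined architecture macros;
       availability varies by compiler and target (an assumption here). */
    #include <stdio.h>

    int main(void)
    {
    #if defined(__x86_64__)
        puts("built for amd64");
    #elif defined(__i386__)
        puts("built for i386");
    #elif defined(__powerpc__)
        puts("built for PowerPC");
    #elif defined(__sparc__)
        puts("built for SPARC");
    #elif defined(__arm__)
        puts("built for ARM");
    #elif defined(__alpha__)
        puts("built for Alpha");
    #elif defined(__mc68000__)
        puts("built for m68k");
    #else
        puts("built for something else entirely");
    #endif
        return 0;
    }

Compile it with "cc arch.c" on any of those boxes and the same program builds and runs; that is exactly why the hardware matters less to the desktop user.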


Now that buying a computer that doesn't implement the i386 instruction set is practically impossible, hardware isn't as big an issue as it used to be, and selecting key components is often a binary decision. CPU/motherboard - Intel or clone. Video card - Nvidia or AMD. Sound - Creative or integrated audio. Keyboard/mouse - Microsoft or Logitech. Even the choice of OS basically boils down to OEM Unix or open source Unix.

Guest DILDOMASTER666

Windows is UNIX based?


Despite its flaws, Windows (or rather, the DOS-Wintel platform) has the most far-reaching backward-compatibility record of any computer system built to date.

Plug in a 5.25" drive with DOS 1.0 on a dual core intel, and it will work. Try running a windows 1.0 application in Windows XP (the 32-bit version, at least) and it will also run, buttons menus and all. The only exception are those DOS applications that bang too much directly on the hardware (practically, most video games of the 90s and scene demos) but you can run most 80s and some 90s DOS games without sound, while a compatibility layer like VDMSound will enable you to run a lot of them natively (without using a CPU emulator like DOSbox).

Conversely, Mac OS was infamous for its incompatibility between successive versions, something aggravated by the continuous changes in the hardware. Hell, they even gave up on their own "superior" floppy disk format, leaving new Macs with a drive that couldn't read 800K disks...

Compare that to how "compatible" Linux distros and distribution packages are with each other, and how many development forks there are... a nightmare.

Maes said:

Compare that to how "compatible" Linux distros and distribution packages are with each other, and how many development forks there are... a nightmare.

This has simply been untrue for a long time now.


I think it's just a matter of timing. When a new PC is coming out and Windows 7 is coming out at about the same time, they'll get bundled together.
Basically, whatever OS is out when the PC ships is the one it will use.

MikeRS said:

This has simply been untrue for a long time now.


If any important app actually used it, that is. I'll admit I'm not the greatest Linux user, but the few times I had to use it, getting a particular application required downloading a version specifically made for whichever distro I happened to be using at the time (and sometimes finding that an app I needed only came in Red Hat's installer flavour).

Of course, the whole argument can be conveniently circumvented by treating each Linux distro as a different OS, so it would be perfectly legit to have separate versions and installation packages (after all, even now there are several flavours of Windows, not always 100% compatible). We don't expect Windows MSI packages to install on Mac OS, so why expect RPM packages to work with everything?

MikeRS said:

Windows NT is actually a VMS clone (and thus the name, V->W, M->N, S->T; WNT)

Bullshit, it was a rebranded OS/2 and Windows NT originally meant Windows New Technology to differentiate it from the older DOS-based Windows.

Maes said:

If any important app actually used it, that is. I'll admit I'm not the greatest Linux user, but the few times I had to use it, getting a particular application required downloading a version specifically made for whichever distro I happened to be using at the time (and sometimes finding that an app I needed only came in Red Hat's installer flavour).

I have no idea what distribution you were using or what the application was, but any self-respecting project that provides binaries (be it open source or proprietary) tends to follow the LSB, and quite often even uses 5-year-old distros as the lowest common denominator (I've actually seen 90s distros still in use today, though that's not a common occurrence). Smaller projects (like Doom ports) tend to just give out source code, which may or may not be a hassle to compile (e.g., Chocolate Doom and PrBoom are pretty dead simple, whereas Doomsday and ZDoom tend to require all sorts of hacks to the code to make it compile).

MikeRS said:

I have no idea what distribution you were using or what the application was, but any self-respecting project that provides binaries (be it open source or proprietary) tends to follow the LSB, and quite often even uses 5-year-old distros as the lowest common denominator


Wow, we were both a bit slow on this one... I had *distribution packages* in mind, aka RPM, DEB, or precompiled/source code distributions. What about software that's only available as RPM or DEB?

You're pretty much fucked if you use e.g. Ubuntu and what you need is only available in RPM...unless you are t3h 1337 h4xx0r and can compile everything by yourself (btw, I've never seen a makefile that works as intended on the first shot outside the machine it's been developed on, unless it's pretty trivial).

I'm not Linux-savvy enough to know how good inter-distribution compatibility is (although LSB seems to "guarantee" a 6-year grace period and a minimum common denominator...no idea what happens with GUI programs and device drivers though).


You can convert RPM to DEB and vice-versa (among other package formats), though you need to use the command line to do so. I'm quite surprised that no one has made a GUI for Alien yet.
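For reference, the command-line version is a one-liner each way; a quick sketch, assuming alien is installed and using made-up package names:

    # RPM to DEB (alien's default direction):
    alien --to-deb somepackage-1.0.rpm

    # DEB to RPM:
    alien --to-rpm somepackage_1.0.deb

The --to-deb and --to-rpm options are standard alien flags; installing the converted package is then left to dpkg or rpm as usual.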

Mindless Rambler said:

You can convert RPM to DEB and vice-versa (among other package formats), though you need to use the command line to do so. I'm quite surprised that no one has made a GUI for Alien yet.


That still counts as a direct incompatibility, though (I don't need to convert an MSI installer into an EXE one), and one extra potential frustration point for new and experienced users alike.

Maes said:

(btw, I've never seen a makefile that works as intended on the first shot outside the machine it's been developed on, unless it's pretty trivial)

I have, not counting what I've written for work. That's not including auto-generated ones with Autotools/QMake/whatever, either. You just need to know how to properly author them, and to also be sure to let end users know if they require a specific flavor of Make (GNU make, Unix's nmake, BSD make, Microsoft's nmake, etc.)

Now, using Makefiles without Autotools to build software across platforms is another story... those can fail if not properly done.
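For what it's worth, here is a minimal sketch of what "properly authored" can look like, with hypothetical file names. It sticks to POSIX make features (explicit rules, plain macros, no GNU-only constructs like := or $(wildcard)), so GNU make and BSD make both accept it; recipe lines must start with a tab.

    CC = cc
    CFLAGS = -O2

    program: main.o util.o
    	$(CC) $(CFLAGS) -o program main.o util.o

    main.o: main.c util.h
    	$(CC) $(CFLAGS) -c main.c

    util.o: util.c util.h
    	$(CC) $(CFLAGS) -c util.c

    clean:
    	rm -f program main.o util.o

Explicit dependencies and the generic "cc" driver are most of what it takes for a Makefile to survive its first run on someone else's machine.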

Mindless Rambler also said:

Bullshit, it was a rebranded OS/2 and Windows NT originally meant Windows New Technology to differentiate it from the older DOS-based Windows.

The whole VMS->WNT name thing is bullshit, but VMS developers were hired to work on the NT kernel originally. So I'm pretty sure that a few of their design patterns and philosophies are inevitably in there.

