hex11

2015: IBM PC demoscene is beginning


Move over Future Crew and other lamers, here's what real coders are capable of:



Btw, the Trixter in the credits is the same guy who founded MobyGames.


Were there original PCs with digital audio capabilities? The music throughout most of it is all PC speaker beeps, but towards the end, I hear what sounds like bit-crunched SID-chip music.

[EDIT] Looks like the uploader edited in music during the credits. I found a video of the demo running on an actual IBM 5150 and it's actually silent near the end.

https://www.youtube.com/watch?v=aibZKrXc8Nk

[EDIT EDIT] Never mind. There's music. I'm just deaf.

hex11 said:

Move over Future Crew and other lamers, here's what real coders are capable of:


I'm pretty sure that if Future Crew had had access to the tools that Hornet + CRTC + DESiRE had, they'd have just written their demos with those same tools.

Unless it was a wild compo.

Either way, anything that whoops the Amiga fanboys' asses is something I approve of.

Stygian said:

Were there original PCs with digital audio capabilities? The music throughout most of it is all PC speaker beeps, but towards the end, I hear what sounds like bit-crunched SID-chip music.


I haven't read all of Trixter's tech breakdown yet, but I'm guessing the music is just pulse-width modulation. Some old CGA-era games used it for speech or maybe title-screen music. It works, but it burns up CPU cycles and doesn't sound very good compared to an actual programmable sound chip. The music playing during the demo effects was pretty simple compared to the credits screen, probably because that's all the cycles they could spare...
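
Roughly like this, if I remember the technique right. This is just a minimal sketch of the PWM idea, nothing from the actual demo: I'm assuming a DOS real-mode compiler in the Turbo C family (outportb()/inportb() from <dos.h>), and play_pwm, crude_delay and the port constants are my own made-up names.

/* Sketch of PC speaker sample playback via pulse-width modulation,
 * assuming Turbo C style port I/O.  Not the demo's code. */
#include <dos.h>

#define PIT_CTRL 0x43
#define PIT_CH2  0x42
#define PORT_61  0x61   /* bit 0 = PIT ch.2 gate, bit 1 = speaker data */

static void crude_delay(void)
{
    volatile int i;
    for (i = 0; i < 40; i++)    /* tune by hand for the target sample rate */
        ;
}

void play_pwm(const unsigned char *samples, unsigned int count)
{
    unsigned int n;

    /* Route PIT channel 2 to the speaker and open its gate. */
    outportb(PORT_61, inportb(PORT_61) | 0x03);

    /* Channel 2, low byte only, mode 0: after each new count the output
     * sits low for 'count' ticks, so the pulse width (and the average
     * voltage on the speaker cone) follows the sample value. */
    outportb(PIT_CTRL, 0x90);

    for (n = 0; n < count; n++) {
        outportb(PIT_CH2, (samples[n] >> 2) + 1);  /* ~6-bit samples, avoid a zero count */
        crude_delay();  /* real players reprogram PIT ch.0 to interrupt at the sample rate */
    }

    outportb(PORT_61, inportb(PORT_61) & ~0x03);   /* speaker off */
}

You can see why it eats the CPU: something has to feed a new pulse width thousands of times per second, whether that's a busy loop like this or a timer interrupt.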

I think the only early "multimedia" PCs were the IBM PCjr and Tandy 1000. Those had sound chips on board and could even do 16-color graphics.

Overall though, back then you'd have been better off buying a C64 or Amstrad CPC if games were your main interest. C64 always had a good demoscene, but for some reason Amstrad CPC only had "meh" productions until just a few years ago:
https://www.youtube.com/watch?v=YJosZfm560Q
^ Btw, that's also running on a similarly slow CPU (a 4 MHz Z80 in this case)


For a MOD player that'll work, because you're dedicating the entire CPU to the music routine (and some minor screen updates). It won't work for a demo or an action game though, at least not on an IBM XT class computer. That's why in the Hornet demo they let the PIT (the programmable interval timer) babysit the PC speaker during the demo, and only switch to PWM for the end credits. But the PIT's square-wave output can't do any intricate speaker manipulation (it's 100% on or off, no in-between), so you only get those basic beep-beep tones.
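
The "let the timer babysit the speaker" part is just the classic square-wave beep setup. Something like this, with the same Turbo C style port I/O assumption as above; beep_on/beep_off are names I made up:

/* Let the 8253/8254 PIT generate a square wave on the speaker so the
 * CPU is free to run the demo effects.  Sketch only. */
#include <dos.h>

#define PIT_CLOCK 1193182UL     /* PIT input clock in Hz */

void beep_on(unsigned int freq_hz)
{
    unsigned int divisor = (unsigned int)(PIT_CLOCK / freq_hz);

    outportb(0x43, 0xB6);                  /* channel 2, lo/hi byte, mode 3 (square wave) */
    outportb(0x42, divisor & 0xFF);        /* low byte of divisor  */
    outportb(0x42, (divisor >> 8) & 0xFF); /* high byte of divisor */

    outportb(0x61, inportb(0x61) | 0x03);  /* gate channel 2 + enable speaker */
}

void beep_off(void)
{
    outportb(0x61, inportb(0x61) & ~0x03);
}

Once the tone is set, the hardware keeps toggling the speaker on its own; the music routine only has to touch the ports again when the note changes, which is why the effects can run at the same time.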

OTOH, I forgot that in Japan they actually had PC-9801 and such in the early 80's, and those *did* have sound chips on-board, and very good ones at that:
https://www.youtube.com/watch?v=SwKnV9-8i6E
If they had sold such machines in the west, I'd have been more interested in the PC hardware, instead of vastly preferring the various other 8 and 16 bit home computers (which in those early days all seemed more powerful and less expensive than IBM PCs and clones).

And here's another cool demo I just found. This one's for unexpanded VIC-20. That means a 1 MHz 6502 with just 5120 BYTES of RAM! :-)
http://www.youtube.com/watch?v=2SdGkkp1aq8

I really gotta get me an old 80's computer of some kind...


Yay, Trixter strikes again! Time to pull the old 8086 (or NEC V20, tee hee) out of the basement ;-)

For whoever missed it, here's his previous masterpiece:



Now, I wonder if he could produce a demo for 8086 + 8087 math co-pro...

Maes said:

For whoever missed it, here's his previous masterpiece:

There was a sequel to that:


^ It's pretty sad once you realize that today, with 30+ years of hardware advancements, we still have to worry about a stupid animated GIF slowing down our NEW and IMPROVED Web 2.0.

OK, nobody said that programmers HAVE to squeeze the very last drop of CPU power out of everything they do (even if people like Trixter would beg to differ), and I accept that a lot of these advances in computing power just went towards allowing higher overheads to make developers' lives easier. But that being said, choking on a single stamp-sized animated GIF is ridiculous. No amount of virtualization, VMs, APIs or crazy OOP shit can justify the abysmal performance and inefficiency.

Edit: speaking of the technical aspects of the demo... first of all, you'll only see the colors on a composite display, as they're essentially produced with a sort of 1-bit trick mode.

I guess the same trick is used for the CGA video playback: that way, when writing out a single byte to the screen buffer, you actually output several pixels at once. Neat, either way.
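
Something along these lines, if I've got the CGA layout right. A sketch only, assuming a real-mode compiler with far pointers (MK_FP from <dos.h>); plot4 is just a name I invented:

/* Why one byte = several CGA pixels: in mode 4 (320x200, 2 bits per
 * pixel) each byte of video RAM covers four pixels.  Sketch only. */
#include <dos.h>

#define CGA_SEG 0xB800

void plot4(unsigned int byte_col, unsigned int y, unsigned char four_pixels)
{
    unsigned char far *cga = (unsigned char far *)MK_FP(CGA_SEG, 0);

    /* CGA interleaves scanlines: even lines at 0x0000, odd lines at 0x2000,
     * 80 bytes per line. */
    unsigned int offset = (y >> 1) * 80 + ((y & 1) ? 0x2000 : 0) + byte_col;

    cga[offset] = four_pixels;  /* one store, four 2-bit pixels at once;
                                   on composite output the bit pattern is
                                   what turns into an artifact colour */
}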

The "second act" with the hi-res monochrome display was also impressive: I recall seeing a presentation by Trixter where he sort of anticipated, by showing an example of how a video with a low color depth (actually, 1-bit B&W) but smooth animation was more visually pleasing than using the same bandwidth for a more colorful, but less smooth one. And here's an entire feature showcasing the tradeoff O_O

What underwhelmed me, however, was the sound of the demo: even with plain PC speaker beeps, I recall it's possible to make much more impressive pseudo-multichannel music through heavy arpeggiation (very common on the Speccy). I wonder why they didn't follow that approach here *shrug* Maybe the timing precision required to pull off the composite color tricks precluded it?
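
For the curious, beeper arpeggios are basically just switching the square wave between a chord's notes so fast that the ear blends them. A rough sketch, again assuming Turbo C style outportb()/inportb()/delay() from <dos.h>; the note table and timing are invented for illustration:

/* Speccy-style arpeggiation on the PC speaker: cycle the PIT square
 * wave rapidly through a chord's notes.  Sketch only. */
#include <dos.h>

static void set_tone(unsigned long hz)
{
    unsigned int divisor = (unsigned int)(1193182UL / hz);
    outportb(0x43, 0xB6);                  /* channel 2, mode 3 */
    outportb(0x42, divisor & 0xFF);
    outportb(0x42, (divisor >> 8) & 0xFF);
    outportb(0x61, inportb(0x61) | 0x03);  /* speaker on */
}

void arpeggio_c_major(void)
{
    static const unsigned long notes[3] = { 262, 330, 392 };  /* C4 E4 G4 */
    unsigned int step;

    for (step = 0; step < 300; step++) {   /* roughly 1.5 seconds */
        set_tone(notes[step % 3]);
        delay(5);   /* Turbo C delay() in ms: ~200 note switches per second */
    }
    outportb(0x61, inportb(0x61) & ~0x03); /* silence */
}

The catch is exactly the one I suspected: those retunes have to land on a steady schedule, and the demo's raster tricks probably wanted every spare cycle at precise times.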


That's funpossible! Trixter must be using ancient secret alien techniques!

Here's something a little bit different...
https://www.youtube.com/watch?v=sNCqrylNY-0

They don't sell real home computers anymore, except on ebay and places like that. So this is probably the closest thing today (RPi, plz, you're just another Linux box). Of course it doesn't have ROM BASIC or any kind of simple OS or easy way to do stuff. You really need a special hardware programming device to use it. So it's not really equivalent, but chips like these could be used as a component for a retro home computer (along with simple video and audio chips, and a bit more RAM and small bit of flash storage). Because I don't think they manufacture the Z-80, M68000, or other old stuff like that...

hex11 said:

They don't sell real home computers anymore


Thank God. It's nice knowing that software will work without having to look at a huge table and hope my computer is listed.

Maes said:

^ It's pretty sad once you realize that today, with 30+ years of hardware advancements, we still have to worry about a stupid animated GIF slowing down our NEW and IMPROVED Web 2.0.

Well, keep in mind that 8088 Domination is basically a long series of x86 instructions (a string of mov instructions for the most part) that produces the video. You can't really talk about efficiency and point at demos as examples. :P
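
For anyone who hasn't read the write-up: a naive player would walk a per-frame delta list like the sketch below (my own illustration, not Trixter's code); Domination instead pre-compiles each frame into straight-line "mov [offset], value" instructions, so even this loop and table overhead disappears at playback time.

/* Generic delta playback, for contrast with "compiled" playback.
 * Assumes a DOS real-mode compiler with MK_FP from <dos.h>. */
#include <dos.h>

struct delta {
    unsigned int  offset;   /* position in the CGA frame buffer   */
    unsigned char value;    /* new byte (several pixels at once)  */
};

void apply_frame(const struct delta *d, unsigned int count)
{
    unsigned char far *screen = (unsigned char far *)MK_FP(0xB800, 0);
    unsigned int i;

    for (i = 0; i < count; i++)           /* loop + indexing = overhead the   */
        screen[d[i].offset] = d[i].value; /* pre-generated mov stream avoids  */
}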


@Csonicgo

But old school home computers were cooler
https://data.archive.moe/board/vr/image/1429/89/1429896512319.jpg
(dat Amstrad Mega PC)

and sexier
https://data.archive.moe/board/vr/image/1401/91/1401919356649.jpg
https://data.archive.moe/board/vr/image/1401/68/1401683852856.jpg
https://data.archive.moe/board/vr/image/1429/15/1429157562090.png

I bet you secretly have waifu posters of old computer ads around your bedroom walls! Everyone does! :-)

Point is: these old machines had some personality, because you could really get to know them at a very deep level (if you put in the time and effort). They meant something special, because a Spectrum 48K was something unique, and so was an Amiga 500. Now we just have boring machines where nobody really understands what's going on under the hood (unless they're an Intel engineer writing microcode). Now everyone just uses layers upon layers of libraries to program generic machines. And of course those have bugs and problems too. It's just not as fun as getting your hack on in 6502 asm or whatever.


A lot of talk about the audio (which is understandable, I suppose, given how atrocious the PC speaker is), but is no one else curious how they got all those colors out of a CGA display?


By finding characters in the font set that have pixels at just the right positions in the top row to produce desirable color artifacts, drawing only that top row repeatedly instead of the character proper (by reprogramming the CRTC so each text row is only one scanline tall), and then changing the FG/BG colors of the resulting pattern to get the desired color.
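
Roughly like this, as far as I understand it; definitely not the demo's actual code, just a sketch that assumes DOS-style port I/O (outportb/MK_FP from <dos.h>) and that the BIOS has already set 80x25 color text mode. The glyph and attribute values below are arbitrary examples.

/* Tell the 6845 CRTC that each character row is one scanline tall, so
 * the screen is built entirely from the top rows of glyphs; on a
 * composite monitor the glyph's top-row bits plus the FG/BG attribute
 * become an artifact colour.  Sketch only. */
#include <dos.h>

#define CRTC_INDEX 0x3D4
#define CRTC_DATA  0x3D5

void setup_artifact_text_mode(void)
{
    unsigned char far *text = (unsigned char far *)MK_FP(0xB800, 0);
    unsigned int cell;

    /* 6845 register 9 = maximum scan line address; 0 means one scanline
     * per character row.  (The real demo also has to fix up the vertical
     * timing registers to keep the frame valid - omitted here.) */
    outportb(CRTC_INDEX, 9);
    outportb(CRTC_DATA, 0);

    /* Fill the buffer: the character picks the bit pattern, the
     * attribute picks the two base colours the pattern mixes. */
    for (cell = 0; cell < 80 * 25; cell++) {
        text[cell * 2]     = 0x55;  /* example glyph, chosen for its top-row bits */
        text[cell * 2 + 1] = 0x17;  /* example FG/BG attribute pair              */
    }
}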

It's fascinating how you can abuse terrible video signals to produce imagery like that; I know that Sega Genesis developers often did something similar to fake transparencies or make gradients that'd be impossible otherwise by abusing how the system's horrible composite output handled vertical lines of alternating colors. Still, it's a shame that this particular setup is so resource-intensive that it's only really good for static images. I suppose it could have made for one helluva text adventure had these tricks been known back when the hardware was relevant...


ETTiNGRiNDER said:
is no one else curious how they got all those colors out of a CGA display?

Be curious no longer! The effect is explained in detail here and in less detail (but with pictures) here.

(This makes me want to write a CGA 1k-color image viewer, except the only CGA-compatible device I have with a composite output is my Tandy 1000, and as I understand it the video output from that is slightly out of phase with respect to a real CGA and will produce different artifact colors...)

hex11 said:

That's funpossible! Trixter must be using ancient secret alien techniques!

Here's something a little bit different...
https://www.youtube.com/watch?v=sNCqrylNY-0

They don't sell real home computers anymore, except on ebay and places like that. So this is probably the closest thing today (RPi, plz, you're just another Linux box). Of course it doesn't have ROM BASIC or any kind of simple OS or easy way to do stuff. You really need a special hardware programming device to use it. So it's not really equivalent, but chips like these could be used as a component for a retro home computer (along with simple video and audio chips, and a bit more RAM and small bit of flash storage). Because I don't think they manufacture the Z-80, M68000, or other old stuff like that...


What is a "real home computer"? Because this computer is real, it computes, it's in my home, and I built it.

hex11 said:

They don't sell real home computers anymore, except on ebay and places like that. So this is probably the closest thing today (RPi, plz, you're just another Linux box). Of course it doesn't have ROM BASIC or any kind of simple OS or easy way to do stuff.


That's not entirely accurate. Yes, those cheap-o computer type Famiclones are IMO the closest there is to a current production "classic" home computer.

8-bit? You got it. Awesome custom chips for video and sound? Sure. RAM? Well, just 2K NES RAM to play with, but all those "educational" carts implement a static 64K RAM upgrade, so hey presto, you're back in business! They also come with BASIC (TWO implementations, please! F-Basic and G-Basic!)

In fact, I'm surprised there isn't a hardware modding scene focusing on adding functionality and expansion ports/docks through modified carts. There's currently software being developed for the platform too, but IMO somebody really oughta yank that "privilege" out of the hands of underpaid Chinese sweatshop programmers and start pumping out some decent oldskool-y software for those machines!

Presumably, you could have disk controllers, USB ports, or anything else, really.

Woolie Wool said:

What is a "real home computer"? Because this computer is real, it computes, it's in my home, and I built it.


http://en.wikipedia.org/wiki/Home_computer

Basically, we're talking about simple architectures that were meant for gaming and educational purposes. These are systems that can be fully understood and programmed/hacked on by a single person, to the extent of making games and demos that exploit all or most of the computer's capabilities. Doing the same on a modern machine requires a large team of programmers, artists and game designers. That's why in the '80s it was common to hear about "bedroom programmers" releasing hit games. This seldom happens now, except for a few indie games now and then (like Minecraft, for example), but even that is only half the story.

The dude who made Minecraft doesn't really understand what's going on under the hood of his machine. He's just coding to libraries, which run on an X server (or equivalent), which was built on a C library, which makes OS system calls, which hit the CPU, which maybe does some CISC->RISC translation, and I'm probably missing some layers in there somewhere...

In contrast, on a home computer you typically just write directly to video memory and your pixels show up on the screen. It's just that simple, and there are no layers in between, because you typically write in assembly language (or at least those machines facilitate that kind of direct programming). Instead of using layers of libraries, you directly hit the custom chips of your machine. That makes the code non-portable, of course, but that's kind of the point (getting to know your machine, learning to use it fully, and optimizing your code so it runs well on fairly weak hardware).
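
To make the contrast concrete, the PC flavour of that "no layers" style looks something like this (just a sketch, assuming a DOS real-mode compiler with MK_FP from <dos.h>; on a C64 the equivalent is literally a single POKE into screen RAM at 1024):

/* Poke a character and attribute straight into text-mode video RAM and
 * it simply appears - no API, no library, no OS call in between. */
#include <dos.h>

int main(void)
{
    unsigned char far *text = (unsigned char far *)MK_FP(0xB800, 0);

    text[0] = 'A';     /* character in the top-left cell           */
    text[1] = 0x1E;    /* attribute: yellow on blue, for example   */

    return 0;          /* the hardware just shows it on the next refresh */
}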

It's not only that the hardware of today is overcomplicated and unsuited to that kind of hobbyist programming; the OSes have also followed the same path. As Terry Davis puts it, Linux is a "mainframe OS" when compared to the simplicity of the ROM BASIC environment of the C64 and other home computers. You can write buggy code and crash those old machines and just hit the reset button - no harm done. You're back up and riding again in a second or two. If you crash a modern system, you can get filesystem errors, or fuck up your system enough that you have to reinstall. You can also get malware or a virus that fucks things up. There's no easy, convenient reset button that takes you back to a clean ROM BASIC slate and encourages the crazy, no-holds-barred experimentation that was common on home computers.


I don't really see the problem here. Ancient '80s hardware with direct-to-metal programming had extremely limited capabilities. More capability usually means more complication. By modern standards, '80s computers are novelties: they are worse at everything than modern computers and barely functional (if at all) for most tasks users today would put them to. Modern PCs are designed the way they are because such designs work better for the sorts of hideously complicated things they're expected to do. A "computer for gaming and educational purposes" is a toy. This computer does my taxes, pays my credit card bills, buys things I want, plays music at CD quality, renders 3D graphics that John Carmack has compared to rocket science, and more, all at the same time.

BTW, modern operating systems won't let you fuck up your filesystem with a single error in your program. I'm not even sure if Windows 9x would let you do that. As for malware, do you know how much money, manpower, and other resources go into malware nowadays, and how much money is to be made by using malware? How much money do you think an organized crime syndicate could have made 30 years ago by trying to disable people's puny little microcomputers that had little to nothing of commercial value on them? BTW, there were actually 1980s PC and microcomputer viruses, though they were primitive (like their host systems) and spread through floppy disks. I've heard some of them could manipulate hardware in such a way as to permanently damage it.

(Actually if you want to see programmers who know absolutely every quirk and feature of modern Wintel PCs, the gangsters who write malware are probably the greatest experts at x86 PCs in the world.)


Bare metal programming and the related skills are still relevant when talking about "embedded systems" or "systems programming", but those tend to be a minority of today's IT jobs. Today many kool kidz wanna be programmers, but only as long as that means fancy "web programming", "enterprise web solutions", Twitter buttons and the such.

As for fucking up filesystems... yup, with any kind of FAT it's pretty easy to junk one (just open a file handle in R/W mode and crash/reset the machine without closing it: POOF, the file's size is zero, and its contents will likely be overwritten by the time you realize the damage).

With UNIX, I recall the warning was to NEVER just power off or reset a workstation: that was GUARANTEED to fsck up your filesystem (pun intended), at least before "journaling file system" became a word.

Viruses and malware are another special case: "oldschool" viruses were more like showcases of the programmer's ability. They had to hide themselves in a single-tasking, barebones environment (where, you'd think, it'd be easier for them to be detected), and they usually did not do any (intentional) damage, though overwriting executable files and boot sectors wasn't particularly nice...

There was not much to steal from a user's computer, though, and no permanently-open internet connection to exploit (I dunno if there were modem-dialer viruses...). Today's malware is more like a specialized class of "business" software than an oldschool hack. No doubt their creators know a lot about the underlying OS, about networks, about security flaws in web browsers, etc., but I doubt that knowing about the hardware would help much. Even the most sophisticated modern viruses use vanilla Win32 API calls, when they are not written in Visual Basic...


Those machines weren't just toys at the time. They were multipurpose microcomputers designed for all manner of things. Some of them could also boot CP/M, which was the de-facto business OS for small stuff before DOS (small stuff being not mainframes - think office workstations). Those machines ran WordStar, VisiCalc/SuperCalc, dBase, and other such typical office tools. The ones that couldn't handle CP/M had their own equivalents, and were used by small offices, mom & pop shop owners and such, or just individuals doing their home accounting, word processing, and yes... taxes as well (it's actually funny that you need a quad-core with gigs of RAM to do the same things today).

Btw, you can easily fuck up a journaling filesystem if you just erase or corrupt the disklabel or equivalent partition metadata. At least you don't have to worry much about hitting reset when your system locks up because of some driver bug. But it's not as simple as hitting reset and being back in clean ROM BASIC within two seconds. If you or some malware corrupts system files, or a hacker invades your system, you have to re-install things, maybe even wipe the system and start over from scratch, which can take many hours. There are even quite a few reports of various Linux package managers fucking things up for you. :-)

And that's not even getting into the "fun" aspects of library dependencies and other shit that modern systems are full of. Oh boy, I sure would like to play me some xgalaga like I used to in the 90's. Let me go compile it and have some fun. What's this... gcc is barfing with 1000 errors! Yeah, I'd rather just have a stable platform that I can count on working properly.


Speaking of which, do you guys think $75 is a good price for a Toshiba T1000 xe?

Blzut3 said:

There was a sequel to that:
Touhous

I don't know why that Zun guy just doesn't make a smartphone collection of all the Touhous and make turbobank already.

Also, did you know that Ecco the Dolphin was written in C?

Maes said:

That's not entirely accurate. Yes, those cheap-o computer type Famiclones are IMO the closest there is to a current production "classic" home computer.


Wait, that site says you get a computer/keyboard, gamepads, BASIC carts, and maybe some other stuff for only $10? How's that possible? Are they using "real" Famicom components (an actual MOS 6502 and such) or is it a re-implementation in an FPGA or something?


They are probably using the infamous NoaC (NES-on-a-chip) ASIC that has been used on practically all famiclones made after 1995. If you manage to find a discrete-component Famiclone from the '80s/early '90s, with actual cloned CPU, PPU etc., hold on to it: they are rare, probably more so than the real thing.

And yes, they can really be made that cheap. They're literally one cartridge port, a blob IC, and a couple of controller ports (and/or a cheapo keyboard, on "computer" models). But they work (almost) as well as the real thing ;-)



There are a few more modern consoles that have a NES emulation mode, but none of them gives 100% speed or is smooth enough to be really enjoyable, and the video/audio is even more off than in the NoaC implementations. BTW, the NoaC existed way before FPGAs were even known outside of research labs. It's an oldschool, gray-market ASIC all the way ;-)

The important thing when getting any famiclone, is that it has a cartridge port. With that in place, you could do -almost- anything ;-)

hex11 said:

Those machines weren't just toys at the time.

Yes, they were. People expected computers to multitask, and these computers did not do it. The Operating Systems they ran didn't allow it. The 6502 didn't allow it. It wasn't until the Intel 80386 that computers could do something like copy text from one program to another without having to close a program. Anything before Protected Mode was an array of kludgy hacks that would come tumbling down as soon as a program started misbehaving. And even then, it took years before it was solid enough to use (Windows NT 4, anyone?)

Home computers were seen as "wonderful devices" because of the potential they might have had if they were truly multitasking machines. They weren't. They were, at their core, relatively inexpensive, finite-state machines with so little RAM that video games were the only thing keeping them from being fancy "word processors", which Brother, Wang, and IBM were already selling in the '80s and making a killing doing it. If I wanted a large, typewriter-like machine that makes correcting mistakes easy, would I pick one with a high-res monochrome display worth a damn, or a VIC-20 plugged into a flicker-fest of a PAL TV with wide text that made my eyes hurt?

By the late '80s the home computer market collapsed because people realized they were lied to, and the IBM PC was resurrected as the platform of choice. Turns out that standards are what won the computer wars, not affordability, because standards allow streamlined production and lower prices. Meanwhile, in Home Computer Land™, this was not the case. Software had to be ported from platform to platform, sometimes from scratch, and even between computer families! And they all used the bloody 6502 and Zilog Z80! Unbelievable.

Now don't get me wrong, the MOS 6502 is an amazing piece of hardware, but it required a lot of help to do anything besides spit out values from registers. That "help" was usually not standardized (except for the C64 series, which sold like crazy BECAUSE of this) and caused total hell for programmers, who had dozens of targets to code for and no idea whether half of them would still be selling the next Christmas season. It's no secret why Commodore is out of business while IBM, Microsoft, and Apple are still on the Dow Jones. Solid products with solid standards make solid profit. Duh.

That... and you shouldn't give the reins of your entire company to your dumb-ass son.

hex11 said:

Oh boy, I sure would like to play me some xgalaga like I used to in the 90's. Let me go compile it and have some fun. What's this... gcc is barfing with 1000 errors! Yeah, I'd rather just have a stable platform that I can count on working properly.

Yeah, about that. Looks like it's still maintained.

hex11 said:

The dude who made Minecraft doesn't really understand what's going on under the hood of his machine


And why should he have to? All that would accomplish is a game that only runs on his machine. The entire point of high-level programming is to ensure that Minecraft works on everyone's computer that has Java installed, which is pretty much every modern PC on the planet. See above for why this matters.

The reason for "bare metal" programming in the '80s was that the machines were simple, slow, and memory-starved, to keep the price below $200. PCs that didn't have that problem were thousands of dollars. However, we all know people get what they pay for. At least, we do now...

I don't know what's going on under the hood of my computer because I am not in the business of writing operating system drivers and modules. Even those are written in ISO C these days, because you have no idea what endianness the CPU may have or how much cache it has -- CPU extensions being the exception, as those are still useful to know. There isn't any "level of consciousness" unlocked by knowing a modern x86 processor's entire specification back to front. The 6502 can be described on a poster. Literally.

I do Atari 2600 programming in my spare time, but if I didn't have the years of documentation, tutorials, mailing lists, specialist software, and modern tools to do the debugging, I wouldn't stand a chance in hell. That didn't exist in the '80s, which guaran-frickin-teed that anyone who could break out in the computing field then was either a great businessman with some electrical engineering chops, or a Rain Man who dreamed in code, like Rob Fulop and Steve Cartwright. Those guys were the equivalent of movie stars because they could see the forest and the trees. That isn't a skill you intuitively pick up, unless you can... intuitively pick that up. Carmacks, Cranes, and Cartwrights are rare diamonds, indeed.

(Disclaimer: My VIC experience was when I didn't have glasses, and I would get nauseated quite easily from squinting at the flickery TV screen. My mother blames that thing for my terrible eyesight.)

Csonicgo said:

Yes, they were. People expected computers to multitask, and these computers did not do it. The Operating Systems they ran didn't allow it. The 6502 didn't allow it. It wasn't until the Intel 80386 that computers could do something like copy text from one program to another without having to close a program.


Uhm... not really. Apple's Macintosh, Atari's TOS, Amiga's Workbench, etc. ring a bell? They were all multitasking, GUI-based, and allowed you to do the awesome things you mentioned way before a 386 stopped costing four digits ;-)

You could even use GEOS on the C64, which allowed something similar. Of course, it's debatable how many of those OSes (or shells?) had true pre-emptive multitasking, and not just simple task switching or more restrictive forms of multitasking (Mac OS was particularly notorious for its fragile runtime environment). I recall there were endless debates at the time about how, e.g., "Only Amiga has true multitasking, Windows 3.x suxx!" or "Windows 95 pre-emptive multitasking? Ha! Macs have had that since 1984!"

And being "single tasking" isn't necessarily a bad thing: don't forget that IBM Pee-Cees were essentially relegated to single-tasking while MS-DOS reigned supreme (that was until 1996), despite better CPUs and everything, but even with plain old MS-DOS it was possible to multitask to a limited degree, thanks to the magic of TSRs.

