K3K

The /idgames archive in the '90s


What was it like uploading WADs to the archive? Was it a pain in the a**? Or was it easy like it is nowadays? Sorry if you think this topic is stupid, but I'm only 13 and don't know too much about DOOM compared to someone in their 20s or 30s.

They probably still used FTP even back then. It wouldn't have been a pain in the ass if you had a stable connection, but it would certainly be slow.


Projects were generally smaller, so some of the speed of uploading these days would have been negated, but yes, it was just a slower, more tedious FTP process. What's a lot more interesting (and something you children wouldn't know about either) was that when Doom first came out the internet was practically non-existent - certainly when compared to how it is now. As I was 4 at the time I only have the vaguest of memories myself, but it is truly a very stark contrast.


FTP would probably have been done via DOS commands rather than with any sort of client. But to be honest, DOS commands are a rather simple and easy way to do FTP. Compare the instructions here (item 5) with a similar step-by-step list of specific instructions for uploading using an FTP client.

Many wads in the early years were uploaded to the file libraries at services such as CompuServe or AOL. The process for doing so was not too complex.

I'm talking early and mid-1990s here. By the late 1990s, things had changed a lot.

Phobus said:

when Doom first came out the internet was practically non-existent

Replace "internet" with "World Wide Web" and that's a bit closer to the truth.

Phobus said:

Projects were generally smaller, so some of the speed of uploading these days would have been negated, but yes, it was just a slower, more tedious FTP process. What's a lot more interesting (and something you children wouldn't know about either) was that when Doom first came out the internet was practically non-existent - certainly when compared to how it is now. As I was 4 at the time I only have the vaguest of memories myself, but it is truly a very stark contrast.


I don't recall a lot of the earliest stuff. I know my dad hit up the BBSes to download shareware games like Wolf and Doom (and many Apogee greats). My earliest personal experience involved an ancient web browser that I think of today as "8-Bit". I know we used it to look up hints for one of Sierra's Quest games. More than that, I fondly recall a sort of slideshow demonstration from Nova that explained the properties of black holes, going so far as to throw a robot into the phenomenon, which tore in half as it neared the singularity.


Most people probably uploaded their stuff to a local BBS, unless they were already subscribed to a service like Compuserve. Hardly anybody had real Internet service back in 1994, and most of those who did only had a shell account, rather than a SLIP or PPP account. In the case of a shell account, you had to dial up to your provider just like you would any BBS (with a program like Telix, Qmodem, etc.), then upload your new PWAD to your provider's Unix box, and finally use your shell's ftp program to upload to ftp.cdrom.com or idsoftware.com, or whatever FTP site...

And yeah, that means those with shell accounts had a 100% text-mode Internet experience. If you wanted to check out those nekkid pix from Usenet alt.binaries.* you first had to download them from your Unix shell, just like you would from any BBS. Them were the glorious days of Gopher, Lynx, MUDs, ASCII-art, and all that good stuff.


My friends and I would upload and download deathmatch wads and such from a local BBS. One dude knew the chick who ran the BBS personally, so she was pretty cool with us and our constant Doom crap.


And then there were the enterprising folks who downloaded wads from various BBS, put them on diskettes, and sold them at computer fairs. I still have my 4-diskette set that contains all the files needed to play Star Wars DooM.

Subsequently, those same folks trawled through cdrom.com, put the files onto CDs, and sold them at places like CompUSA. I am, I'm embarrassed to admit, one of the people who rushed to the store to slap down my ten bucks and grab my copy of D!Zone.

All of this commerce was possible because few people had the computational power and connectivity to download files en masse.


There's a reason there are so many shovelware CDs full of wads. In 1993 very few people had an internet connection at all and over the next several years it was all dialup and awful. I never was big on wads but I did download my fair share on local BBSes (starting at 2400 baud (that's 300 bytes/sec)).

Over FTP, I would assume many clients had a resume function for downloads, but not uploads. So if you got disconnected, you started over.


Meh. That period was really awkward in that there was a lot of new media and content which was skyrocketing in size (CD-ROM FMV games, WAV files, MP3 files, etc.) but Internet connections in general sucked, and the most affordable/popular forms of removable storage sucked even more.

Your best option for backing up your own data in a RW fashion was still diskettes until the early 2000s, as the other stuff like ZIP, LS-120, tape streamers etc. didn't really catch on and become mainstream, and I didn't see practical flash drives until maybe 2002-2003 (and then, with ridiculous sizes, like 32 and 64 MB).

CD-RW didn't exist until 2000-2001, and CD-Rs didn't become cheap enough to be disposable until the late 1990s (though even then, recorders weren't really cheap or easy to install yet).

Maes said:

Your best option for backing up your own data in a RW fashion was still diskettes until the early 2000s, as the other stuff like ZIP, LS-120, tape streamers etc. didn't really catch on and become mainstream, and I didn't see practical flash drives until maybe 2002-2003 (and then, with ridiculous sizes, like 32 and 64 MB).

I remember copying everything from the 100 MB hard drive in my 486 to my Pentium on floppies. The worst part wasn't the 1.44 MB size, it was how horribly slow floppies are.


Sheeeet mon, I didn't have no room for PWADs on floppies. Not after downloading some 50+ Slackware 2.3 disk images so I could try this nifty "Linux" thing. Heck, even had to trim down the DOS partition to a measly 50MB, but that was enough for DOOM and a whole bunch of PWADs. Didn't bother to run DOOM in Linux at the time, because the little 486DX/33 only had 4MB RAM (which was hideously expensive at the time).

Maes said:

Your best option for backing up your own data in a RW fashion was still diskettes until the early 2000s, as the other stuff like ZIP, LS-120, tape streamers etc.


I still have an Iomega Zip Drive with SCSI interface, disks too.

Use3D said:

I still have an Iomega Zip Drive with SCSI interface, disks too.


Yeah, a few of the most trendy/1337 guys used those but they suffered from a major flaw (other than the Click Of Death): you could not really consider the disks alone a portable solution unless you also carried the drive with you, or carried them between places that both had Zip Drives.

It was bad enough asking people if e.g. you could hook the PARALLEL version (and associated drivers) up to their computers so you could copy some things; imagine trying to convince them to let you install a SCSI card and troubleshoot the thing for 2-3 hours ;-)

It would have worked better if USB had been more popular when Zip Drives were in their heyday, but it didn't quite work out that way.

The only RW medium you could be 100% sure everybody would have was floppies, and maybe CD-ROM drives if you wanted to bring your own stuff over. A few trendy G3/G4 Macs came with an internal (blech) Zip Drive, but that was about it. Now, if Sony weren't such assholes about their MiniDisc format, they could have revolutionized the removable storage market several years earlier: MiniDiscs were smaller than a floppy or Zip disk, had more capacity than a first-generation Zip drive (about 170 MB), were also usable for music, and came several years earlier (1992/1993). A pity.


It's so interesting to see how far computers have come in only 2 decades. Who knows where it will end up in the next 20 years. Maybe there will be websites where you can play Doom 3 on site.

chopkinsca said:

Maybe there will be websites where you can play Doom 3 on site.


Doesn't Quake Live come close enough? To be fair, they are using browser- and platform-specific high-performance custom plugins to achieve that today.

But yeah, in 20 years there sure will be power to spare to run something like Doom 3 entirely in Javascript or in whatever interpreted language browsers will be using then (if browsers as we know them will still exist), and surely they will have solved the lack of standards for OpenGL/audio in browsers by then (unless OpenGL will be so hopelessly legacy that it will also have to be emulated, also in-browser).


Does anybody remember CompuServe's "GO ACTION"?

That was the place where everybody met and asked how the Icon of Sin could be defeated...

Maes said:

But yeah, in 20 years there sure will be power to spare to run something like Doom 3 entirely in Javascript or in whatever interpreted language browsers will be using then (if browsers as we know them will still exist)

JAVASCRIPT IS NOT AN INTERPRETED LANGUAGE ANYMORE DAMNIT

tempun said:

JAVASCRIPT IS NOT AN INTERPRETED LANGUAGE ANYMORE DAMNIT


It is totally interpreted for the purposes of runtime: in a proper compiled language you can't produce an executable out of a source file with syntax errors (exception made for FORTRAN, which you can trick into producing some very subtle clusterfucks), while an interpreted language will execute everything line-by-line until the offending line(s) of code are reached, without doing any sort of sanity or dependency check first.

Java and CLI languages, despite their use of JIT compilation and their having a compilation process which prevents making "executables" out of syntactically incorrect programs (there's an exception though: Eclipse will create .class files even out of those, but will throw an exception at error points), are still considered interpreted languages.

Until we get a Javascript engine that preemptively compiles everything before running a single line of code and thus REFUSES to run at all in the presence of a single error, then it won't be truly compiled. Of course that's totally undesirable in a web environment...

Finally, even if the last point held true, a program as complex as Doom 3 would almost certainly NOT be converted line-by-line into hand-optimized "native" Javascript: more probably, it would run as a cross-compiled affair into a VM running on top of Javascript (just the way GWT works for Java or Emscripten works for C/C++ stuff), so you would still get interpretation-like overheads, one way or the other.

Maes said:

Until we get a Javascript engine that preemptively compiles everything before running a single line of code and thus REFUSES to run at all in the presence of a single error, then it won't be truly compiled. Of course that's totally undesirable in a web environment...

Pretty sure that's exactly how modern Javascript engines work.

Here's a file with a Javascript syntax error for example. Doesn't work in modern browsers, and I'd be surprised if it's ever worked in any browser for that matter.

Might be worth reading the Wikipedia page on interpreted languages, eg.

Initially, interpreted languages were compiled line-by-line; that is, each line was compiled as it was about to be executed, and if a loop or subroutine caused certain lines to be executed multiple times, they would be recompiled every time. This has become much less common. Most so-called interpreted languages use an intermediate representation, which combines compiling and interpreting. In this case, a compiler may output some form of bytecode or threaded code, which is then executed by a bytecode interpreter.

The former type ("parse as you execute") is what (old) versions of BASIC did. FraggleScript also does this - as with most other things in its design, it's pretty much the worst possible way to do things.

That definition is usually what I think of when I hear "interpreted language", though it seems that definition also extends to languages like Python or Ruby that are compiled to an internal tree structure or a bytecode representation, but not to machine code. I'm guessing that Javascript has traditionally worked this way as well. Syntax errors are caught at compile time with an approach like this. Modern Javascript implementations like V8 or Spidermonkey compile to actual machine code, just like a C compiler.

fraggle said:

Here's a file with a Javascript syntax error for example. Doesn't work in modern browsers, and I'd be surprised if it's ever worked in any browser for that matter.


Smart example, but there's one catch: that's the kind of error that breaks the language's token-level validity, which is not really the most common cause of Javascript errors.

Perhaps "syntax errors" in that sense are not enough to trick Javascript into attempting execution (I should've thought of that; being a curly-bracket language, it must have at least some of C/C++/Java's tokenizing mechanisms and expect flawlessly formed blocks). However, if you insert a syntactically valid call to a nonexistent function name, Javascript won't complain and will dutifully execute everything until it encounters it and dies with an error. AFAIK no modern browser can avert that (yet), and I'm not sure if doing so would prevent the weak/lazy typing mechanism from working.

Try it: modify the "sneaky snippet" like so:

alert("hello world")
shit
alert("goodbye world")
This time it will execute the first hello world, and then crash silently. Only with a JS debugger will you see the "Uncaught ReferenceError" being thrown at "shit". Pre-emptive checking my ass. In a truly compiled, strongly typed language, that simply couldn't happen at all (again, I leave some reserve for Fortran, which can be tricked into producing executables with calls to undefined references, but that's because that language stinks).
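The snippet above translates directly into a runnable Node.js sketch (with `console.log` standing in for `alert`, which Node doesn't have). It confirms the described behavior: the undeclared name fails only when execution reaches it, and the failure is an ordinary, catchable ReferenceError rather than a parse-time rejection.

```javascript
// Node.js sketch of the snippet above: the bad identifier passes
// parsing fine and only fails when execution actually reaches it.
try {
  console.log("hello world");   // runs fine
  shit;                         // ReferenceError thrown HERE, at runtime
  console.log("goodbye world"); // never reached
} catch (e) {
  console.log(e.name); // "ReferenceError"
}
```

Contrast this with the syntax-error case: there, nothing ran at all; here, everything up to the offending line runs normally.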


That isn't a syntax error. The code you posted is grammatically valid JavaScript which makes use of an undeclared reference, or at least, that is the interpreter's sense of it. "shit" would have to refer to an object or a property of the object that is "this" in the current context of execution (which would itself be an object of some type).

This has nothing at all to do with weak typing and everything to do with the timing of binding. Statically compiled executables check that all identifiers used in the program make sense at link time. Dynamically compiled ones may do it at runtime, as in this case.


The end result is the same though: there was a show-stopping error somewhere down the line that WAS NOT caught before starting execution, which is typical of how purely interpreted languages work.

It was compiled alright... into code that DOESN'T WORK and CRASHES (ghosts of Fortran memories here...).

BTW, checking for properly formed tokens before executing statements is not a prerogative of compiled languages: interpreted languages can do that to some degree (checking only for proper parsing, not actual execution or name binding). Even some older dialects of BASIC with "semi-compiling" features did that, but that was not enough to prevent actual syntax errors during actual execution. Many so-called "BASIC compilers" actually just converted BASIC to an IL and used an execution environment to run it, even when they produced actual .EXEs. Nothing to do with how C or Pascal compilers worked.

A lot also depends on how much flexibility the language allows: Javascript is pretty bad with its optional end-of-statement semicolons and one-word statements, and not much can be done to ensure that your program isn't complete bogus, other than scanning for mismatched parentheses, brackets, abnormal tokenization etc.

On the other end of the spectrum you have Delphi/Pascal: damn, are those bondage & discipline... Fortran LOOKS like it's just as strict, but it's more like a mishmash of inconsistent, arbitrary rules than something really strict (which would also be consistent). Matlab's scripting language is also every bit as bad as Fortran.

A good rule for seeing whether a language lends itself to 100% clean-cut compilation with precise rules is to look at the quality of the lexical analyzer tools available for it in the various IDEs: Java is simply stellar, most IDEs can highlight and correct errors literally as you type. C/C++ is pretty good, while with Fortran/Matlab, you're lucky if you get syntax highlighting and maybe a function navigator that doesn't fuck up after 2-3 functions. Strangely, Javascript's situation is much closer to Fortran's than to Java's or C/C++'s.


Quasar is right: it's not a syntax error. What you're objecting to really is more the fact that it's a dynamic language rather than a static language.

A lot also depends on how much flexibility the language allows: Javascript is pretty bad with its optional end-of-statement semicolons and one-word statements, and not much can be done to ensure that your program isn't complete bogus, other than scanning for mismatched parentheses, brackets, abnormal tokenization etc.


It's a valid objection, but personally I see it as similar to null pointer dereferences in C/C++/Java, except instead of dereferencing a pointer that goes nowhere, you're trying to access a variable that hasn't been declared. It's the kind of error that shows up immediately as soon as you have a proper test suite with any decent level of coverage.

What I will concede is that it's certainly easier to implement a dynamically typed language as a (bytecode) interpreter. Actually, if you flip it round the other way, the reverse is true, too: it's easier to write a native compiler for a language like C because of features like type annotations for variables (that don't exist in dynamic languages). Some people get this funny idea that static typing and typed variables are there as some kind of safety features: they're really just there because it makes compilers easier to write.

In the end though, features like these certainly don't preclude the possibility of writing full, efficient compilers for languages like Javascript: things like JSLinux do a good job of showing off the kind of things that modern compilers are capable of.

fraggle said:

In the end though, features like these certainly don't preclude the possibility of writing full, efficient compilers for languages like Javascript: things like JSLinux do a good job of showing off the kind of things that modern compilers are capable of.


So, keeping with the "shit" example above, exactly what should such a compiler do? Raise a compile-time error, a warning, or just let it be, hoping that somehow, the "shit" reference will be there during runtime and it's none of its business asking how or why?

This also precludes efficiently encoding the literal reference "shit" as e.g. userspace symbol 0x00000001 once and for all, unlike what a C compiler would do before sending stuff to the linker. No matter how you look at it, JS is not a language designed to make the compiler's job easy, so any solution will have to be a compromise, and it will never have the potential to become a fully compilable language, at least not to the degree achievable e.g. with Java and GNU GCJ or the defunct J++.

Even those achieved this by making some compromises e.g. dynamic class loading was practically impossible for a Java program compiled into a static .exe (which was practically converted to an intermediate C equivalent).

Maes said:

So, keeping with the "shit" example above, exactly what should such a compiler do? Raise a compile-time error, a warning, or just let it be, hoping that somehow, the "shit" reference will be there during runtime and it's none of its business asking how or why?

This also precludes efficiently encoding the literal reference "shit" as e.g. userspace symbol 0x00000001 once and for all, unlike what a C compiler would do before sending stuff to the linker.

Depends on the language. In the case of Javascript, it's the kind of thing that's detected at runtime.

You can actually get similar things in C. For example, if you do this:

int main(int argc, char *argv[])
{
    shit();
}
Assuming 'shit' is a function that doesn't exist, the compiler can't actually detect that there's an error - only the linker can, when you come to actually link the program. Modern compilers like gcc will give you a warning but that's all. If you're compiling a library you're SOL until you try to make something actually link against it.

And "modern compilers" is the important point here: C has been around since 1972, i.e. 40 years. Until about the early 90s, C compilers didn't give anything like the helpful warnings that gcc does now. If you look at old books about C they'll recommend using a tool called "lint" that is now essentially redundant, as the compilers now do it all for you. Interestingly, in Javascript terms we're at about the same point as we were in C's history when the compilers and tools for that really matured.

No matter how you look at it, JS is not a language designed to make the compiler's job easy, so any solution will have to be a compromise, and it will never have the potential to become a fully compileable language, at least not to the degree achievable e.g. with Java and GNU GCJ or the defunct J++.

Even those achieved this by making some compromises e.g. dynamic class loading was practically impossible for a Java program compiled into a static .exe (which was practically converted to an intermediate C equivalent).

I certainly never said it was a language that was easy to compile: in fact I think in my previous comment I explicitly stated it wasn't. Will it ever be as efficient as C? I doubt it. But things like JSLinux demonstrate that the modern Javascript implementations are efficient enough that you can pull off some pretty impressive things with it - things that wouldn't have been possible 5 years ago, even with faster CPUs.

Maes said:

It was bad enough asking people if e.g. you could hook the PARALLEL version (and associated drivers) up to their computers so you could copy some things; imagine trying to convince them to let you install a SCSI card and troubleshoot the thing for 2-3 hours ;-)


I never had problems with my SCSI card (it's ISA, btw), but I'd also never consider using the Zip drive as portable media; I only used it for backup.


In any case, I maintain that JIT compiled != truly compiled. It's simply an incremental enhancement/transparent convenience that is applicable (sometimes) to interpreted languages, but that's about it. It doesn't level the semantic and - most importantly - philosophical differences between compiled and interpreted languages.

And OK, so nitpicking between the "compilation" and "linking" phases was an ingenious sophistry indeed, in the context of this conversation, but the end result is one and the same: a C program where the linker would meet unsatisfied/unresolved dependencies could not continue, period, whereas JS would go by a "best effort" strategy, just like BASIC. And I suspect that it actually RELIES on such behavior. After all, it was designed to run even from partially loaded/broken webpages, not bit-perfect binary executables ;-)

JS is much more similar to FORTRAN in that respect, which has a weird "hot & cold" attitude towards errors: a wrong keyword generates a cascade of errors throughout the whole program, a function called with the wrong arguments generates a compile-time error, but calling a nonexistent function and stomping on reserved keywords is A-OK, as long as it's done without disturbing the previous rules. Until you try running the "executable", that is.

Use3D said:

I never had problems with my SCSI card (it's ISA, btw), but I'd also never consider using the Zip drive as portable media; I only used it for backup.


I personally never had problems with my SoundBlaster cards either (a 2.0 and later an SB16 Vibra), yet back then I read tons of horror stories about how setting up drivers, IRQs, base addresses and DMAs was an insurmountable obstacle "for the rest of us", only to go away when Plug And Play (TM) came along.

But SCSI cards (and the SCSI protocol in general) were considered a PITA: way off the complexity and cost scale for common grunts, SCSI was viewed as some Black Juju that only chicken-bone-dancin' Unix Warlocks could command, and only the 1337est of the 1337 had it on a Wintel/DOS machine. And apart from getting the cards going, the real challenge was getting the peripherals themselves going. Channels, terminators... ugh.

As for the Zip drive, you just proved my point: while the medium itself was perfect as portable storage (small, robust, almost immense in capacity for the era), the drives themselves could not be relied upon to be everywhere, so it was effectively reduced to a personal backup system. You might just as well back up your data with this and a VCR, or a cheaper streamer unit.

