The /idgames archive in the '90s


Honestly I kinda miss those days of computer hardware black magic and voodoo, really separated the dedicated from the douchebags.

Use3D said:

Honestly I kinda miss those days of computer hardware black magic and voodoo, really separated the dedicated from the douchebags.


*brofist*

But unfortunately, the likes of you and me are becoming dinosaurs...


Another dinosaur reporting for duty.

Maes said:

But SCSI cards (and the SCSI protocol in general) were considered a PITA: way off the complexity and cost scale for common grunts, they were viewed as some Black Juju that only chicken-bone-dancin' Unix Warlocks could command, and that only the 1337est of the 1337 had on a Wintel/DOS machine. And apart from getting the cards going, the real challenge was getting the peripherals themselves going. Channels, terminators...ugh.

As for the ZIP drive, you just proved my point: while the medium itself was perfect as portable storage (small, robust, almost immense in capacity for the era), the drives themselves could not be relied upon to be everywhere, so it was effectively reduced to a personal backup system. You might just as well back up your data with this and a VCR, or a cheaper streamer unit.

Having used SCSI and Zip drives a lot on the Amiga (and to a lesser extent on PCs), I must say I've had little trouble with either. Maybe I'm some sort of idiot savant.

The biggest problem I've had with Zip drives (apart from too few people owning them) is SCSI controllers disagreeing on what the track/sector layout should be. I could get Commodore and GVP controllers to accept each other's disks, but the Oktagon 2008 simply refused to compromise. That piece of shit wouldn't boot under Kickstart 1.3 either.


I used cdrom.com at the time, and D!Zone2 before I had a constant connection, and I also used AOL's FTP browser to search the archive.

OP: No need to censor yourself here.

GreyGhost said:

Another dinosaur reporting for duty.
Having used SCSI and Zip drives a lot on the Amiga (and to a lesser extent on PCs), I must say I've had little trouble with either. Maybe I'm some sort of idiot savant.


http://en.wikipedia.org/wiki/Autoconfig

Amiga made everything easy. It was ahead of its time, in so many ways...

TB171 said:

What was it like uploading WADs to the archive? Was it a pain in the a**? Or was it easy like it is nowadays? Sorry if you think this topic is stupid, but I'm only 13 and don't know too much about DOOM compared to someone who's in their 20s or 30s.


In the pre-1995 era, that is, the BBS-dominated world, uploading anything, wad or not, was pretty straightforward and relatively painless, provided you had a decent network driver (the almighty BAUD BANDIT!). I did the Amiga-to-PC switch just to play Doom, but continued to use my Amiga as the main computer for transferring files over the BBS world, since I didn't have the time to learn all that DOS stuff overnight. This also meant I had to get a DOS ANSI font installed and working on Workbench if I wanted to browse most Doom BBSes properly.

The process of uploading a file was pretty much the following:

1. Call a BBS.
2. Log in.
3. Bring up the upload dialog box, select your file, press the transfer button.
4. Mission accomplished.

Everything was done from the terminal, which was like the web browser of its day.

In retrospect, though, they were a lot like web portals, only less cluttered and more exciting.

FTP was a welcome addition (in my opinion) because the servers were, for the most part, faster than BBSes, so you could upload and download more in a fraction of the time. The only downside was that you needed an actual internet connection, which wasn't much of a deal, but then you had the cost of the BBS call PLUS internet access, which I personally didn't give two shits about because I wasn't exactly living under a bridge.

Gopher, now that was quite something... There were only a few Doom-dedicated Gopher sites, and only one of them accepted WAD uploads, if I recall correctly. The protocol was actually pretty nice, kinda like an enhanced FTP, but the combination of a proper website with public FTP capabilities turned out to be a lot more interesting for the average user, though not necessarily as efficient, due to the HTML overhead, which would soon become irrelevant.

You could also upload wads to certain Usenet alt.* groups, and it was indeed done for a short while, but it never became a widespread practice due to the high risk of data corruption (this was before the yEnc overhaul).

Now, to answer your question: it wasn't nearly as much of a pain in the ass as some people think. FTP is still going strong, so there's no problem for WADs specifically, thanks to idgames, but try uploading something else to a regular file storage website without getting a paid account and see how fun that is.

Maes said:

In any case, I maintain that JIT compiled != truly compiled.


"truly" is a loaded term, you mean "statically". Java does a lot of the same stuff, the "compiler" compiles from Java (or whatever) into bytecode which is later interpreted by HotSpot, but despite this its IDEs and analysis tools are famously powerful. Like fraggle and Quasar said, your beef is dynamic binding during runtime. FWIW, you can screw yourself that way in C too (function pointers), but the difference is that in C it's a security risk, in JS it's just a little exclamation point somewhere. Actually I would say that exemplifies a big schism in modern software development: either you're developing something that's intended to be statically compiled into a native code executable, or you're developing something intended to run on a runtime like HotSpot/PyPy/V8.

Ladna said:

Like fraggle and Quasar said, your beef is dynamic binding during runtime.


Dynamic to the point that a call to a nonexistent function or access to a nonexistent object (not simply to a null or dangling pointer, but to a name that hasn't even been declared anywhere) will cause a runtime error rather than a compile-time (assuming that you're using a JIT engine) error? The only "truly compiled" language that allows you to do that to some extent, AFAIK, is Fortran, and then again even that catches more errors than JS. Not that it's something to be proud of, NB.
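Something like this, say (hypothetical names, just to show the point):

  function update() {
      splosion();           // never declared anywhere
      missingThing.x = 666; // neither is this object
  }

  console.log("loaded fine"); // prints; nothing has complained yet
  update();                   // only NOW do you get a ReferenceError

Any compiler worth the name would have rejected that long before it ran.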

OK, I realize that Javascript != Java with a different API and slightly funny syntax, but certain aspects of dynamic binding are incompatible with the traditional responsibilities of a compiler.

Maes said:

...certain aspects of dynamic binding are incompatible with the traditional responsibilities of a compiler.


Just because some compilers (actually linkers mostly) perform some (inadequate) static analysis doesn't mean they're the same thing.

Anyway, JIT compilation IS compilation. You can't call it anything else. You're just conflating compilation with linking and linking with static analysis, all of which can be done (and almost always are) by separate tools at different phases of development and deployment. I'll agree with you that JavaScript allows some crazy stuff, but late binding has been around since LISP and really isn't that wild anymore.

I do agree with the "I wish there was something that caught dumb errors during development" sentiment though. Whenever I develop in Python/PHP/Javascript/etc. I commit hilariously stupid mistakes like duplicating function/variable names or whatever. The crap static analysis you get with C/C++/Java actually ends up saving you time because that's faster than hunting down and fixing the nasty bugs that occur otherwise. I know there are some tools but blah blah blah.
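For instance, this kind of thing (a contrived sketch, obviously), which javac or gcc would yell about immediately, just silently "works" in JS:

  function spawnImp(x, y) {
      return { type: "imp", x: x, y: y, hp: 60 };
  }

  // ...three hundred lines later, a careless copy/paste:
  function spawnImp(x, y) {
      return { type: "imp", x: x, y: y }; // whoops, no hp this time
  }

  // No error, no warning: the second declaration silently wins, and you find out
  // by debugging weird behaviour rather than by reading an error message.
  console.log(spawnImp(0, 0).hp); // undefined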


Still, I don't get how both C-like strong typing and weak BASIC-like typing can coexist within the same language.

To successfully assign a value to a variable, I must have declared it somewhere; I can't just write "A=10" like in BASIC out of the blue and expect it to work. This won't be caught as an error during "compilation", but it will bomb out nonetheless when it's executed.
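(Assuming "use strict" is in effect, that is; without it, JS happily does the BASIC thing and conjures up a global for you.) Roughly:

  "use strict";

  A = 10;         // parses "fine", no compile-time complaint...
  console.log(A); // ...but the assignment itself throws a ReferenceError at runtime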

Similarly, as long as I don't violate parentheses/token parsing rules, I can write pretty much entire statements chock-full of references to nonexistent functions and variables that might never be magically satisfied at runtime. How that can be the mark of a "compiled" language is beyond me.

For me, compiled means that a very strict entity (the compiler, AND the linker if you want to be pedantic) carefully checks everything, and if it makes sense, then it produces a static executable. This has ALWAYS been one of the main differences between compiled and interpreted, and a historical reason why interpreted languages were considered more beginner-friendly: with BASIC you essentially had line-by-line debugging and exception checking by default, and could always "RUN" something if at least the first line was correct...

Sure, there were BASIC compilers (e.g. Turbo Basic) that let you get away with the interpreter's shenanigans and apparent laxness, but those used a system not entirely different from modern virtual machines: when your BASIC program was turned into an .EXE, it was basically a Turbo Basic runtime + a compact representation of your program (the "bytecode"). There was no static analysis involved, and it was basically just an "accelerated interpreter". Needless to say, it was not a particularly efficient system.

JIT the way JS does it has more in common with Turbo Basic's approach than with Java's or C#'s, which can do many more optimizations because they try to pin at least some things down to spec before starting execution.


I guess you'll just have to come to terms with the fact that compilation isn't static analysis. Hell, you can compile *to* JavaScript. Freeing yourself from this restriction is extremely powerful, enabling higher-level languages to compete speed-wise with lower-level ones (JIT), providing infinitely better security, and probably catalyzing what seem to be the new waves of computing (mobile and the cloud).

Besides, most languages have tools that fill this role. Python has PyFlakes, PyLint and PyChecker, for example, and applications that are super critical either use languages that are easier to verify (Ada, Haskell) or adopt highly defensive coding procedures: don't dynamically reassign functions at runtime, for one.

And I can certainly sympathize with the feeling of horror you get when you fill line after line with what is essentially garbage, and the parser happily parses it, and the bytecode interpreter eats and eats until something doesn't resolve and then boom. If you don't like that (and there are lots of people who don't), use GWT, use Dart, use Emscripten, use JavaScript code analysis tools.

Finally, I don't understand your comment about typing. There's effectively no such thing as a type system in JavaScript; certainly what's there bears almost no resemblance to C's (weirdo) system.
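Values have types, variables don't; a quick throwaway example:

  var thing = 666;
  console.log(typeof thing); // "number"

  thing = "icon of sin";     // same variable, now a string, nobody objects
  console.log(typeof thing); // "string"

  console.log("5" * 2);      // 10, and coercion papers over a lot of the rest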


Nah, I had BASIC's type system in mind (for those dialects that had it, anyway). Most had a ham-fisted system for numerical types, where no qualifier or declaration worked fine, but, just to be safe, floating-point numbers were used, even on 8-bit machines. Not very unlike what JS is doing right now, to the point that custom extensions are needed to handle stuff like arrays of pure integers, but I'm digressing.
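(Think typed arrays, for instance; a quick sketch:)

  var ints = new Int32Array(4); // actual 32-bit integers, not generic JS numbers
  ints[0] = 3.7;
  console.log(ints[0]);         // 3, silently truncated to an integer

  var plain = [3.7];            // a regular Array just keeps the double
  console.log(plain[0]);        // 3.7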

You COULD specify a type (or even change the type of a variable a posteriori!) or imply it by postfixing a symbol (e.g. %, $, etc.), but as I said, it was pretty ham-fisted, and you could change it anytime, e.g. i$, i, i%, etc. ;-)

I insist on static analysis as a prerequisite to proper compilation because, in the end, an executable program boils down to a bunch of precise CPU instructions, to be sent on their merry way exactly once (per program loading, that is).

The only way to make sure that you don't stop execution every now and then to disambiguate what you REALLY mean before sending the final product to the CPU is to perform static analysis before you piece those instructions together once and for all; otherwise you're essentially doing an interpreter's job in disguise. Maybe a very good disguise, maybe a "best of both worlds" approach if you wish, but still a compromise.

Maes said:

I insist on static analysis as a prerequisite to proper compilation because, in the end, an executable program boils down to a bunch of precise CPU instructions, to be sent on their merry way exactly once (per program loading, that is).


Still with the loaded terms: "proper", haha. Everything on computers boils down to this. When you push a button in your web browser, it executes a code path that ultimately manifests as CPU instructions. Whether those CPU instructions exist on disk or are generated by a JIT is immaterial.

Maes said:

The only way to make sure that you don't stop execution every now and then to disambiguate what you REALLY mean before sending the final product to the CPU is to perform static analysis before you piece those instructions together once and for all; otherwise you're essentially doing an interpreter's job in disguise. Maybe a very good disguise, maybe a "best of both worlds" approach if you wish, but still a compromise.


I'm not entirely sure what you're getting at with this.

Ladna said:

I'm not entirely sure what you're getting at with this.


That's because you need to resolve some dynamic bindings first ;-)


I am 13 as well, and I have been playing Doom since I was 3 :P There was no ZDoom and stuff. I don't think there was much modding either.

michael9r9r said:

I am 13 as well, and I have been playing Doom since I was 3 :P There was no ZDoom and stuff. I don't think there was much modding either.


10 years ago? Of course there was. Hell, Community Chest 1 is almost 10 years old itself. Also, if I'm not mistaken, ZDoom began development in 1998.

michael9r9r said:

There was no ZDoom and stuff. I don't think there was much modding either.

There are around 8,000 WADs in the archive that were released before 1998, so in terms of sheer volume (if nothing else) the mid-'90s could be considered the golden age of Doom mapping.


Now there are people releasing "vanilla" maps on /idgames that don't run unless you have a limit-removing port, and sometimes they even need ZDoom (like, for example, pretty much all the recent uploads by "DoomWar"). I don't personally care what engine they map for (that's their business), but it's getting old to download all these files that aren't what they're advertised as.


I still remember accessing ftp.cdrom.com to get Doom goodies back when I used AOL dialup.
