bgraybr

SDL Hardware Accelerated?


On the roadmap for SDL 1.3 this line caught my eye:
Done: Create 3D accelerated texture based rendering API

This is a miracle for most programmers (SDL's ancient rendering methods should have been replaced years ago, IMO), but for Doom source ports it means that SDL will no longer be able to easily replicate software rendering behavior.

Are there any source ports that actually use SDL for rendering? I don't think there are, but if so, am I right in assuming that this might cause some problems later?

http://wiki.libsdl.org/moin.cgi/Roadmap

bgraybr said:

On the roadmap for SDL 1.3 this line caught my eye:
Done: Create 3D accelerated texture based rendering API

This is a miracle for most programmers (SDL's ancient rendering methods should have been replaced years ago, IMO), but for Doom source ports it means that SDL will no longer be able to easily replicate software rendering behavior.

Are there any source ports that actually use SDL for rendering? I don't think there are, but if so, am I right in assuming that this might cause some problems later?

http://wiki.libsdl.org/moin.cgi/Roadmap


Many source ports use SDL, but all of those that use it for non-OpenGL graphics simply use it as a dumb framebuffer, with all actual drawing (both the 3D renderer and 2D elements like menus and status bars) being done in user-space software. So the 1.3 stuff doesn't really change anything.
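For anyone unfamiliar with the pattern, "dumb framebuffer" rendering looks roughly like this. A hedged sketch: the `draw_column` helper and the 320x200 size are illustrative, not lifted from any particular port.

```c
#include <stdint.h>

#define SCREENWIDTH  320
#define SCREENHEIGHT 200

/* The port's entire frame lives in an ordinary byte buffer of palette
 * indices; SDL never draws anything itself, it only displays this
 * buffer once per frame. */
static uint8_t framebuffer[SCREENWIDTH * SCREENHEIGHT];

/* Doom-style column draw: paint one vertical run of a single palette
 * index.  All "rendering" is just writes into the byte array. */
void draw_column(int x, int y_top, int y_bottom, uint8_t color)
{
    for (int y = y_top; y <= y_bottom; y++)
        framebuffer[y * SCREENWIDTH + x] = color;
}
```

In a real port, once the frame is finished, the whole buffer is copied into the SDL surface's pixels in one go and the surface is flipped; SDL is only the final display step, which is why the 1.3 rendering API changes don't touch any of the actual drawing code.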

You could in theory rewrite the engine to use a hybrid system where all of the "3D" rendering is done by the software engine, and the status bars/menus are overlaid via the accelerated system or something similar; but practically speaking it's pointless, as status bar drawing doesn't consume a significant portion of frame time.

As far as SDLGL ports go, they already do hardware-accelerated 2D rendering; you just call glOrtho() and plop crap on the screen.
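For reference, glOrtho() itself just loads a fixed matrix. Here's a sketch of what the 2D pixel-coordinate setup described above actually computes; the `ortho_matrix` name and the 320x200 example are mine, not from any port.

```c
/* Build the same projection matrix that glOrtho(l, r, b, t, n, f)
 * loads, in OpenGL's column-major layout. */
void ortho_matrix(double m[16], double l, double r,
                  double b, double t, double n, double f)
{
    for (int i = 0; i < 16; i++) m[i] = 0.0;
    m[0]  =  2.0 / (r - l);            /* x scale                  */
    m[5]  =  2.0 / (t - b);            /* y scale                  */
    m[10] = -2.0 / (f - n);            /* z scale                  */
    m[12] = -(r + l) / (r - l);        /* x translation            */
    m[13] = -(t + b) / (t - b);        /* y translation            */
    m[14] = -(f + n) / (f - n);        /* z translation            */
    m[15] =  1.0;
}
```

Calling it as `ortho_matrix(m, 0, 320, 200, 0, -1, 1)` maps pixel (0,0) to the top-left corner in normalized device coordinates, so menus and status bars can be drawn as textured quads using plain pixel coordinates.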

bgraybr said:

but for Doom source ports it means that SDL will no longer be able to easily replicate software rendering behavior

PrBoom-Plus compiles and works fine with SDL 1.3. It always uses a 32-bit surface even if PrBoom creates an 8-bit one. It also has some small glitches with the SDL_compat.c functionality, but they are easily fixable on the client side. This one, for example:
http://bugzilla.libsdl.org/show_bug.cgi?id=1212

bgraybr said:

Are there any source ports that actually use SDL for rendering? I don't think there are, but if so, am I right in assuming that it might cause some problems later?

Lots of source ports use SDL: Chocolate Doom, PrBoom/+, Eternity; I think there are others as well. If this is implemented properly on the SDL side, it should be transparent to the source ports: the screen will just render faster. This is a very neat feature (and one that had occurred to me before).

I did some work a while back to get Chocolate Doom to compile with SDL 1.3, but I haven't kept it updated, so it might not compile any more. Once the new SDL gets properly released I'll have more incentive to keep it working. It's actually quite frustrating - SDL 1.3 looks like it has some really useful features, but it's been in development for years and still hasn't been finished. That roadmap page isn't encouraging either - it basically implies that there are a ton of new features that they still want to add. At this point they ought to just concentrate on getting what they have stabilised and finished.

Porsche Monty said:

So 8-bit modes are effectively gone in SDL 1.3?

They are supported via a wrapper (SDL_compat.c), but internally it does not use 8-bit as far as I know. I could be wrong, though.


In any case, there is a "Use GL surface for software mode" option in PrBoom-Plus, so you are no longer affected by palette and vsync issues.

Porsche Monty said:

So 8-bit modes are effectively gone in SDL 1.3?



8-bit modes are effectively gone, period. Consider yourself lucky if you've still got a graphics card that can handle them without any hiccups. There's a reason so much software that depends on them is migrating to a hardware-accelerated option that renders the paletted surface onto a 32-bit screen.


My biggest concern now is the blue disk icon that normally flashes during disk access. In Chocolate Doom you need GDI and 8-bit modes in order to make it work.


Reminds me of how some newer devices (e.g. Android) don't have any native support for indexed images (drawing grayscale or 8-bit graphics must be done programmatically in user-space software), or use some workaround like creating a new GL surface for each picture and rendering there. What's crazier is that there's not even a wrapper for that in the API: when you select image formats, you can only use 32-bit, 24-bit or 16-bit RGB, and that's it.


Welcome to the present. 8-bit has no use anymore outside of legacy gaming, and modern hardware isn't developed for it. And why should it be?

Graf Zahl said:

Welcome to the present. 8-bit has no use anymore outside of legacy gaming and modern hardware is not developed for that. And why should it?


Well, there is still the issue of supporting 8-bit "web images" and greyscale images in a compact format (8-bit grayscale is just as good as 24-bit plain RGB, and most hardware can't display any better anyway). So that plainly and simply translates into forcibly blowing everything up to 24 (or rather 32) bits, and having to carefully unmarshal older data byte by byte (well, that was already the case with sub-8-bit images). 8-bit is pretty much the common lower end for data and for images. Well, it USED to be, at least. Let alone that indexed graphics allow some effects to be pulled off easily and effortlessly.


Huh? What?

We are talking about 8-bit hardware support, not 8-bit image formats. The latter will continue to exist for a long time, but hardware support for 8-bit screen formats will be gone for good in a few years.


As of a few weeks ago, Eternity also has a GL ortho-projection backend, courtesy of the new hardware abstraction layer for video. EE is going to continue adding HALs until we are capable of porting easily between platforms and libraries, eliminating what little dependency we really have on SDL at all (it should be an option amongst the backends, not the only one available).

I have been looking into the possibility of supporting SFML as one of these, and supporting GL rendering was a prerequisite, since that library doesn't allow for any software drawing whatsoever, nor does it try to emulate it by, say, turning blits into quads for you.

EE even shares support for the GL_ARB_pixel_buffer_object extension with PrBoom-Plus, which makes the framebuffer upload 1.5-2x faster according to entryway.


DoomLegacy has used SDL for OpenGL rendering for years. With the old nv driver, SDL does the rendering in software, but with the nvidia driver, SDL uses hardware rendering.
Isn't that texture-based hardware-accelerated rendering?
As long as the old API is not removed, there should be no problems.
Maybe this is just an additional API with some new features.

(It may be that for any description requiring more than two adjectives applied to a noun, we cannot be confident the reader takes the same meaning we intended.)

Graf Zahl said:

We are talking about 8 bit hardware support, not 8 bit image formats.


I was oriented somewhere in the middle of the two ;-)
Yeah, I don't realistically expect hardware to make 8-bit and 32-bit video depths coexist seamlessly with no overhead at all. Even the indexed 8-bit and 16-bit modes that e.g. Java allows to be used, apparently seamlessly, on 32-bit desktops are actually handled by an adaptation layer that takes care of the "palette", so that the programmer doesn't have to write his own byte8-to-uint32 framebuffer converter; that is instead (hopefully) implemented through a more efficient lower-level native API, or even hardware accelerated.
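That byte8-to-uint32 conversion is nothing exotic, by the way. A minimal user-space sketch (the function name is mine): one 256-entry table lookup per pixel, doing in software what a hardware palette used to do for free.

```c
#include <stdint.h>
#include <stddef.h>

/* Expand an 8-bit indexed framebuffer to 32-bit 0xAARRGGBB pixels
 * through a 256-entry lookup table.  This is the whole "palette
 * adapter" that platforms without indexed-color support force you
 * to write yourself. */
void expand_palette(const uint8_t *src, uint32_t *dst, size_t count,
                    const uint32_t palette[256])
{
    for (size_t i = 0; i < count; i++)
        dst[i] = palette[src[i]];
}
```

Palette effects (flashes, fades, tinting) then cost only a rewrite of the 256-entry table, not of every pixel, which is exactly why indexed graphics made those effects so cheap.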

OTOH, the Android platform has no provision AT ALL for indexed 8-bit or 16-bit graphics. Nada. Zilch. Null. Not even as a built-in-but-shoddily-written API. If you want to process images at less than 32-bit RGB full color for whatever reason, you have to write your own palette adapter, as there is NOTHING in the API that will do that efficiently (or even conveniently) for you, at least not for non-GL graphics. Compare with what I had discovered here (BTW, Froyo 2.2 ROMs seem able to run that same benchmark at 4x the speed with no other changes to the code, thanks to the JIT, but it's still a major PITA that there's no API support at all for doing That One Thing). What makes it a WTF for me is that we're talking about a portable device with supposedly more limited resources, so the ability to handle 8-bit graphics at some level would seem obvious. Or not?


Then again, on our dear IBM PC compatibles (yes, because that's what we've been using for the last 30 years or so) there's that nifty little standard called VGA... which no sane video card manufacturer would dare stop supporting. Video cards and graphics chips designed for use with Macs or other platforms are another matter, though.

Maes said:

Then again, on our dear IBM PC compatibles (yes, because that's what we've been using for the last 30 years or so) there's that nifty little standard called VGA... which no sane video card manufacturer would dare stop supporting. Video cards and graphics chips designed for use with Macs or other platforms are another matter, though.



Are you sure? Nobody needs that anymore so it might as well be gone some day. Dead cruft is dead cruft, standard or not.

Graf Zahl said:

Are you sure? Nobody needs that anymore so it might as well be gone some day. Dead cruft is dead cruft, standard or not.


Geez, Graf, do we have to state the obvious here?

If this were Apple we were talking about, you'd have a point very well taken. Because there, one day Steve Jobs says "old cruft is old cruft" and hop-la! Gone are, e.g., ADB keyboards and mice, NuBus expansion slots, 800K floppies, Mac OS Classic, Motorola and PowerPC CPUs, and whoever doesn't like it can suck it down.

In the IBM PC compatible world, however, no one has ever dared take such a drastic step in 30 years (only a handful of legacy peripherals really, truly died, e.g. the tape port that some versions of BASICA and GW-BASIC supported).

But for the rest: if I connect a 5.25" floppy drive to a modern quad-core mobo, it will still boot from a DOS 1.0 floppy disk. If there were a way to physically connect an old Hercules card to the ISA bus (which still exists somewhere in every modern chipset), it would work just like it always did. And VGA and its modes are still quite high up the ranking of "legacy" devices. Even Windows 7 can still boot in plain VGA/SVGA modes, and a lot of utilities still depend on them (and on text-mode tweaks).

Just so I understand whether you mean something else... is some new, shiny standard that I'm not aware of being pushed as the replacement for the old BIOS, its text mode, and the basic VGA graphics modes? Will PCs boot directly into graphics like e.g. Macs in a couple of years? By all means, tell me if you know something :-p

The closest the PC world will ever come to a critical break with the past will be when a mainstream Intel-compatible CPU no longer supports 16-bit x86 code, but I haven't heard of anything like that coming (and given how the Intel x86 works, I don't think it's even possible to have e.g. an x86-32 CPU without an underlying x86-16, or an x86-64 without the underlying x86-32 and x86-16). Can those CPUs even skip the x86-16 mode entirely when booting?

Now, MAYBE if that whole "Windows 8 will run on ARM" story comes true and Wintel becomes a thing of the past, then maybe VGA and all the other legacy stuff will also die. Or maybe we'll all be using Macintel and EFI one day. Who knows. But the day the "old world" dies is still far, far away. I'd expect a split in technologies rather than a succession: "old style" IBM PC compatibles may be sold, at least for a period, alongside more modern EFI-based designs that are totally incompatible with MS-DOS and real mode, even if Intel-based... but that day is still far, far away.

Graf Zahl said:

Are you sure? Nobody needs that anymore so it might as well be gone some day. Dead cruft is dead cruft, standard or not.


Everything will be gone someday, but to say that VGA's not needed anymore, now that's misinformation talking.

Maes said:

Just to understand whether you mean something else...is some new, shiny standard that I'm not aware of being pushed as the new standard for PCs to replace the old BIOS, its textmode, and the basic VGA graphics mode? Will PCs boot directly to graphics like e.g. Macs in a couple of years? By all means, tell if you know something :-p

The closest the PC world will ever come to a critical break with the past will be when a mainstream Intel-compatible CPU no longer supports 16-bit x86 code, but I haven't heard of anything like that coming (and given how the Intel x86 works, I don't think it's even possible to have e.g. an x86-32 CPU without an underlying x86-16, or an x86-64 without the underlying x86-32 and x86-16). Can those CPUs even skip the x86-16 mode entirely when booting?

Now, MAYBE if that whole "Windows 8 will run on ARM" story comes true and Wintel becomes a thing of the past, then maybe VGA and all the other legacy stuff will also die. Or maybe we'll all be using Macintel and EFI one day. Who knows. But the day where the "old world" will die is still far, far away. I'd expect there to be a split in technologies, rather than a succession: "old style" IBM PC compatibles may be sold at least for a period along with more modern EFI-based designs that will be totally incompatible with MS-DOS and real mode, even if using Intel...but that day is still far, far away.



Nobody knows what's coming. But let's face it: for all intents and purposes, hardly anybody needs that legacy stuff anymore, so who's to say hardware manufacturers will keep it forever? Granted, as long as 32-bit OSes exist, the likelihood is very small that anything a 16-bit program may need will be discarded.
However, when I look at the systems being sold right now, I can't see anything with a 32-bit OS anymore. New systems are already 64-bit exclusive where I live. So in 5 years 16-bit support will be obsolete, and maybe, just maybe, hardware developers will start removing the dead cruft: systems that don't need text mode to boot, or BIOSes that only have the functions needed to start a modern operating system, or who knows what. None of that stuff is used after booting anyway.

Maes said:

Let alone that indexed graphics allow for some effects to be pulled easily and effortlessly.


The same thing could be said for music. "MIDI soundtracks allow for some effects to be pulled easily and effortlessly" but this didn't stop everyone from moving onwards to MP3/WMA/whatever PCM recordings instead of synthesis; relegating MIDI stuff to musicians exclusively.

Gez said:

The same thing could be said for music. "MIDI soundtracks allow for some effects to be pulled easily and effortlessly" but this didn't stop everyone from moving onwards to MP3/WMA/whatever PCM recordings instead of synthesis; relegating MIDI stuff to musicians exclusively.


You're confusing the MIDI interface with MID files, and you even throw in "effects", which neither was designed to handle. In any case, MIDI soundtracks were used out of need: they had to circumvent the limitations of consumer-level audio hardware of the era, nothing more, nothing less.

Graf Zahl said:

However, when I look at the systems that are sold right now, I can't see anything with a 32 bit OS anymore.


So they don't sell Android devices, Atom processors, or Windows Vista/7 32-bit anymore? Seems hard to believe.

OK, eventually things will move on, but the IBM PC compatibles are pretty much a technological anomaly, just like the OSes that power them: backwards compatibility is not just an afterthought, it's a design consideration.

Just think what would happen if someone made an otherwise "normal" PC but without a normal BIOS. Who can say what piece of software that somewhere, somehow requires a BIOS service will break, and when? No wonder the only company that ever did this is Apple, with their Macintel line.

Graf Zahl said:

None of that stuff is used anyway after booting.


Ahem... Hiren's Boot CD... Ultimate Boot CD for Windows... FreeDOS... try harder next time. Unless, and only unless, that "next gen" platform you're talking about can't be called IBM PC compatible anymore. But at that point maybe the ominous prophecy that "the network is the computer" will come true, and all consumers will be able to buy are generic, thin-client "cloud storage" bitty boxes, and the powerful stand-alone desktop PC will be a thing of the past anyway *shivers*.


As long as those three innocent words, "IBM PC Compatible", rule the market, no one will dare change a damn thing for fear of breaking something, somewhere. And sure as hell, I wouldn't buy a system that's "almost" or "mostly" compatible. It's one of those "all or nothing" deals. In fact, it would require conscious effort to design a "quasi-PC" that works only with modern OSes by removing just the legacy stuff, and it would be nearly impossible to predict how much hardware and software it would remain compatible with.

What you describe would require a complete redesign of the OS, hardware expansions and even applications. E.g. go buy a disk controller: chances are it has a BIOS ROM extension that only works in vanilla x86 real mode. Good luck preserving compatibility with that... unless you're a secret fanboy of Apple's marketing model, which totally breaks everything every now and then (and I still don't get how they managed not to wipe out their long-time user base by doing that). If so, do tell. It will save both of us more long paragraphs.

Porsche Monty said:

You're confusing the midi interface with mid files, and you even throw in an "effects" which neither was designed to handle.


Nope. I'm thinking about stuff like what System Shock or most LucasArts games pulled off with MIDI, based on smooth transitions between subsongs. Or the ability to have the MIDI player mute some of the channels, to give an effect similar to the Descent menu music. Or to override instruments, or tempo, or whatever. You can algorithmically create endless variations from a MIDI song, and some enhanced MIDI formats (such as the Miles Sound System's XMI format) were designed for exactly that.

Sure, you can do some of that with recorded songs too, but it's a lot more intensive. It's easier to synthesize a sound wave from a music sheet than to analyze a sound wave to turn it into a music sheet.

Porsche Monty said:

You're confusing the midi interface with mid files, and you even throw in an "effects" which neither was designed to handle.


What? The general midi spec has about eleventy controller events in it, all of which are valid in files as well as realtime streams. Not every synth may implement every one, but saying the spec is not designed to handle effects is completely wrong.

http://www.midi.org/techspecs/midimessages.php

See table 3


And that's not even getting into sysex messages.
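For illustration, a Control Change message is only three bytes. A minimal sketch (the helper name is mine), assuming the standard status/data byte layout from the MIDI spec:

```c
#include <stdint.h>

/* Pack a MIDI Control Change message: status byte 0xBn (n = channel
 * 0-15), then a 7-bit controller number and a 7-bit value. */
int midi_control_change(uint8_t out[3], uint8_t channel,
                        uint8_t controller, uint8_t value)
{
    out[0] = (uint8_t)(0xB0 | (channel & 0x0F));
    out[1] = controller & 0x7F;
    out[2] = value & 0x7F;
    return 3;  /* bytes written */
}
```

Muting a channel mid-song, as described earlier in the thread, is just a matter of sending `midi_control_change(buf, ch, 7, 0)` to the synth (CC 7 is Channel Volume in the spec's controller table).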

Gez said:

Nope. I'm thinking about stuff like what System Shock or most Lucas Arts games pulled with MIDI, based on smooth transitions between subsongs. Or the ability to have the MIDI player mute some of the channels to give an effect similar to the Descent menu music. Or override instruments, or tempo, or whatever. You can algorithmically create endless variations from a MIDI song, and some enhanced MIDI formats (such as the Miles Sound System's XMI format) were designed for that.

Sure, you can do some of that with recorded songs too, but it's a lot more intensive. It's easier to synthesize a sound wave from a music sheet than to analyze a sound wave to turn it into a music sheet.


Again, MIDI music was intended for ancient hardware, and that's why the video game industry dropped it in favor of higher-quality, more versatile alternatives as the necessary technology became affordable.

Think of both the instruments used by MIDIs and recorded music as "presets". What you do with them is your business, but which would you rather use: homemade presets made to your liking, or stiff, generic crap?

natt said:

What? The general midi spec has about eleventy controller events in it, all of which are valid in files as well as realtime streams. Not every synth may implement every one, but saying the spec is not designed to handle effects is completely wrong.

http://www.midi.org/techspecs/midimessages.php

See table 3


And that's not even getting into sysex messages.


We're addressing different contexts here.

Porsche Monty said:

Think of both the instruments used by MIDIs and recorded music as "presets". What you do with them is your business, but which would you rather use: homemade presets made to your liking, or stiff, generic crap?


Ahem... MOD music *cough cough*. Too bad the hardware to properly support it never became mainstream on PCs (the Gravis Ultrasound was the only sound card really designed with that sort of music in mind, while others, e.g. even the AWE32, were more MIDI-oriented and not nearly as flexible). Even modern X-Fis from Creative (the only company still making sound cards with an actual DSP on them) are geared more towards multi-positional audio, real-time filtering, etc. than towards module music.

Even the abandonment of MIDI first went through a painful phase where games used uncompressed CD audio (with its own advantages as well as its own set of problems), until there were enough computrons on the average desktop PC to handle compressed audio in real time along with other processing.

In any case, what Graf says won't become a reality as long as the market has to satisfy the need for 100% IBM PC compatible machines. As long as there is demand, no one would settle for an "almost" compatible, even if it were more future-proof or "modern" on paper alone.

It may already be a reality in some market niches and/or on other platforms, but good old classic Wintel still has a long career ahead of it, having outlived pretty much all competing platforms over the course of three decades.

Any proposed "NEW and ENHANCED!!!" PC architecture would have to go through a rather long transitional period of market coexistence, marked by low sales, inevitable comparisons, FUD, etc., before it could eventually overtake the (admittedly ancient) IBM PC architecture where it matters most: in The Market.

