AveryMaurice

Integrated Graphics Cards


Just a quick but important question: I have a stock ATI Radeon HD 3200, but it's integrated into my motherboard. Can I still buy a newer graphics card and install it without problems? Any help would be appreciated.

Avery

Mr. Freeze said:

Integrated = part of the motherboard. You can't remove it.

I have heard of people paying hefty amounts of money to have someone take apart the chipset and replace components in order to remove integrated graphics cards. This usually requires getting a new motherboard anyway, though.

Although I would like to know if I could install a new one and just forget about the old one, or if that will create a conflict.

Use3D said:

Get another one and disable the built-in. Whee.


How do I do this black magic of disabling the built-in one?

AveryMaurice said:

I have heard of people paying hefty amounts of money to have someone take apart the chipset and replace components in order to remove integrated graphics cards.


That would be the stupidest thing in the universe.

Use3D said:

That would be the stupidest thing in the universe.

More than likely; this is just information I'm getting from "tech experts" at my local hardware store. >.>

But how do I disable the built-in one?

AveryMaurice said:

"tech experts"

AveryMaurice said:

at my local hardware store


Hmmmm...


Never go there again; they are just stroking their e-penis by spewing rubbish like that.

As long as you have a PCI/AGP/PCI-E slot on the motherboard, you can add another gfx card and disable the onboard one as others said.

AveryMaurice said:

But how do I disable the built-in one?


There will be an option in the BIOS to either disable the onboard, or change the preference from onboard to AGP/PCIE.

AveryMaurice said:

How do I do this black magic of disabling the built-in one?

Simply through the BIOS. Also, most motherboards that I've seen that feature integrated graphics chips lack higher-end expansion slots as a compromise. So be aware of what slots are available, and purchase a suitable card.

AveryMaurice said:

But how do I disable the built in one?


In general, by plugging in a new one. You didn't specify what mobo you have, though; you may have one of those broken barebone/"media center" motherboards that DON'T have PCIe slots. In that case you're stuck with the integrated graphics, and at most you can downgrade to a plain PCI video card.

If that's not the case, plugging in a new one will automatically disable the integrated one, in sane BIOSes at least.

In most, you can just find a "Video card priority" setting somewhere, and select between integrated, PCI, PCIe, or even a combination thereof (for multiple monitors).


Well, a quick Google search for the Acer Aspire M3202 shows that it comes with integrated ATI 3200 graphics, but it also has an expansion slot for PCI-E 2.0 cards. Open up your case: if you have a PCI-E slot in it, you can just install a new card and it should override the integrated graphics; if it doesn't, disable the integrated graphics in the BIOS as explained above. If you need to know what a PCI Express slot looks like, do an image search. I've done the same in past computers with integrated graphics - just put a new one in and it'll work.
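If you want to sanity-check what the machine actually sees once the new card is in, a quick way on a Windows box (a generic check, not specific to the Aspire) is to list the display adapters from a command prompt:

    wmic path Win32_VideoController get Name, AdapterRAM, DriverVersion

If only the new card shows up, the onboard 3200 has been shut off; if both still appear, go flip the BIOS setting mentioned above.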

Maes said:

If that's not the case, plugging in a new one will automatically disable the integrated one, in sane BIOSes at least.


However, problems arise when you have insane BIOSes. In case anyone's interested, my own personal graphics card debacle has not been resolved.


Incidentally, which card manufacturer is it that allows on-board video to act as a backup renderer when you plug in a proper card? Also, I thought I heard that one of them was working on an SLI/Crossfire solution that doesn't require two of the same card.


"Hybrid Crossfire" by AMD/ATI allows you to run certain Radeon cards in Crossfire with certain Radeon integrated video chipsets.

Lucidlogix makes Lucid Hydra, which does multi-GPU graphics with any modern cards.


I figure that if you have Windows, you can probably install a new video card without even having to mess with the BIOS, what with Plug-and-Play and the like. Then again, I've never installed a video card before.

I have an Nvidia chipset, and from what I was told, it will work better if I get Nvidia video cards (I intend to double up when I do get them). So since your chipset is ATI, I'd suggest sticking with ATI for the best performance.


My old computer had a built-in one; we just plugged in the new one and then connected the monitor to it instead of the motherboard one.

Creaphis said:

However, problems arise when you have insane BIOSes. In case anyone's interested, my own personal graphics card debacle has not been resolved.


At that point I'd just spend $30 on a 7300 or 8400 from a local hardware store or somesuch, and call it a day. It will be better than what you have, by OVER 9000!!! It's what, one day's worth of salary? So you'll have to skip a couple of meals; I can live with that :-p

I don't get why people HAVE to buy stuff from online auctions with zero security/warranty, hoping to shave $1-$2 that they will lose anyway in delivery time/potential DoAs.

As for this thread, that thing seems to be using a custom ACER mobo:

http://www.bios-mods.com/forum/Thread-Acer-Aspire-M3202-R01-C0L-Acer-2-1-SLIC

Which, however, should have a PCI Express x16 2.0 slot, so an upgrade might be possible.

http://uk.answers.yahoo.com/question/index?qid=20090905101859AA6JkQK

Since the PSUs on such PCs are incredibly tightly specced (possibly even a small BTX form factor), you may not be able to practically use anything better than an nVidia 7300 or something in that class.

I don't understand why you think that disabling the onboard will be the hardest part of this process. On the contrary, it's usually the easiest part, if not automatic or outright unnecessary. There are other, more serious issues you should consider instead.

deathbringer said:

My old computer had a built-in one; we just plugged in the new one and then connected the monitor to it instead of the motherboard one.

Ditto - then went into the BIOS and reclaimed 256 MB of RAM.

GreyGhost said:

Ditto - then went into the BIOS and reclaimed 256 MB of RAM.


Heh, you're lucky if you still have a BIOS where you can do that. Most onboards today, especially Intel-based ones, have "dynamic memory management", which doesn't allow you to set a minimum guaranteed amount of video RAM (thus making some games fail to even start), nor a clear cap on how much "normal" RAM you will be short of at any given time.
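If you want to see how big that moving target is on a given machine, one rough check (assuming a Windows box; this is just the stock WMI query, nothing vendor-specific) is to compare what the OS reports against the sticks you actually installed:

    wmic ComputerSystem get TotalPhysicalMemory

Whatever is missing compared to the installed total is being held back by the BIOS/chipset (the video carve-out plus other reserved regions), so treat it as a ballpark, not an exact measure of the graphics share.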


As if there weren't enough arguments against on-board video.

For my money, it's just good sense to get the most powerful card I can afford. My affair with mid-range cards ended with the GeForce 8600GT. There was virtually no fps difference when turning graphics features on and off - in that sense, it was rather powerful. On the other hand, if there was ANY action going on in-game, the fps would drop to single digits. No amount of tweaking seemed to alleviate the huge burden on the GPU, and it pretty much proved to me that there's no substitute for pure clock speed.

Fortunately, I had an SLI-capable motherboard. The benefit was noticeable, but nowhere near giving my computer a second chance at life. That's a whole other story.


You must be doing it wrong. My 8800 GT works great, and I got it secondhand. When I bought my previous video card (X1950 Pro), this thing was about the most powerful you could buy. I could have afforded it at the time, but it would have been a fantastic waste of money. Adding what I paid for it to the cost of the previous card didn't even come close to equaling what I would have paid for it new.


How is the graphics card going to help him if the bottleneck is in the CPU? Sure, if you run a GPGPU Demo such as this one, having a CUDA-enabled GPU and an application/game that supports it is bliss, and you can do stuff well beyond what your CPU alone would allow (e.g. I can get 300+ FPS on a real-time high-detail fractal on a Pentium IV @ 3.00 GHz, which is impossible with software alone).

However, some games are just CPU-intensive on their own: even if the graphics took zero time, you'd STILL have to spend a lot of time in the game logic.

I know I've written a LOT of posts against the "Don't buy a too powerful GPU! It will get wasted!" myth, but that was against a precise fallacy: that somehow, somewhere, a "too fast" GPU would automagically slow down an older PC. It won't. In fact, it will speed up the parts where rendering is the bottleneck (if you find demos and such that are particularly GPU-intensive but NOT CPU-intensive).

But it won't give you Core 2 Duo-class performance on a Pentium III. Or even on a Pentium IV.


Huh? I wasn't talking about the CPU.

For one, the GeForce 8800GT is still a good card. My point was that, in terms of GPU, a little graphical sheen is a bad tradeoff for pure processing power. The 8600GT I had was shamefully underpowered. Even with a second card in SLI, they were still outperformed by a single 8800GT. I actually regretted my 7800GT dying out. It could at least hold its own when there was a lot of action on-screen, which is what REALLY matters.

Can you think of a game that uses cutting-edge graphics that you wouldn't mind playing at 8fps?


Well, there's your problem: falling for Nvidia's numbering scheme. "This 8300 must be faster than a 6600 because it's an 8 series and not a 6!" WRONG! The 8300 is actually comparable to the 6300. The 6600, 7600, and 8600 were all essentially the same video card just renumbered. It's really no surprise you found the 7800 faster.

POTGIESSER said:

The 8300 is actually comparable to the 6300. The 6600, 7600, and 8600 were all essentially the same video card just renumbered. It's really no surprise you found the 7800 faster.

Actually, no, you can't compare different generations to each other at all using the number scheme. What matches what between generations varies all over the place. The only thing the numbers are good for is telling what generation your card is and what its capabilities/powerfulness is versus other cards in that same generation.

Nuxius said:

Actually, no, you can't compare different generations to each other at all using the number scheme.


That is not entirely accurate ;-)

At least with nVidia, the hundreds digit represented, as you said, how a certain card stood within its generation. In general, anything below x300 = budget, x500-x600 = mid-range, and anything above that (especially x800) was considered high-end.

Usually the high-end of one series was far superior to even the medium-range of the next series and the series after that (and ATI followed a similar numbering scheme, too).

But a 6600 being the same as an 8600? The 8x00 series was already using unified shaders and had DX10 compliance (even the low-end models); the 6x00 and 7x00 series were still DX9.0 nVidia parts with the traditional pixel/vertex pipeline architecture, and were released in AGP versions too.

That being said, the LOW-end 6x00 models (especially the 6200) are still marketed alongside the various 7200, 8200 etc. cards (though the 8-series ones have unified shaders and DX10 support, quite a big departure).

Then there were oddball numbers such as 150, 125, 250, 350, 550 etc., which meant that something was "off": e.g. slower/shared memory, being integrated, a lower clock speed, fewer pipelines, and so on.

Nuxius said:

Actually, no, you can't compare different generations to each other at all using the number scheme. What matches what between generations varies all over the place. The only thing the numbers are good for is telling what generation your card is and what its capabilities/powerfulness is versus other cards in that same generation.

Not true. Compare the performance of an 8800GTX to a 9800GTX; the difference is negligible. I've owned a 6600, a 7600, and an 8600. I gained a couple of FPS each time; whoop-de-doo, I essentially bought the same card three times over with very minor changes. I didn't spend as much on the 7600, and only a fraction of that on the 8600, but still.

Share this post


Link to post
