Is 1GB of Graphics Memory Worth It? 2ms vs 8ms

March 26, 2011 at 19:07:33
Specs: Win7 x64, Athlon 6400+ / 4GB RAM
Right now I have two 22" monitors running over DVI at 1680x1050 from a 512MB dual-DVI graphics card, and I edit 1080i/p video with Vegas x64. I am upgrading to one 40" LCD TV running over HDMI at 1920x1080 (the same resolution as the footage I edit). Is it worth it to get a 1GB graphics card? Will playback look just as good with a 512MB card as it would on a Blu-ray player? Would a 1GB card make it look better than a Blu-ray player?

Also, my monitors have a 2ms response time and the new 40" is 8ms. Does that extra 6ms affect 1080 HD video playback?

March 26, 2011 at 19:15:59
The amount of memory on a graphics card is not nearly as important as the type.

March 26, 2011 at 20:06:35
Really? How so? The one I have now is DDR4.

The only thing is that my card supports up to 1920x1080 through one DVI port using a DVI-to-HDMI adapter; there is no actual female HDMI connector on the board. It appears to use an odd pin layout in the DVI port. Is this still the same true HD? I want a true HD desktop.

March 27, 2011 at 09:23:21
Your desktop can only look as good as the content being streamed to it. Below is a definition of HD. There are many parameters included.

You are comparing two different media. HD TV broadcasts are typically 720p or 1080i. Your LCD display is capable of more than that.

I am not an expert on graphics cards. That said, just as desktop memory has progressed to DDR3, which has higher potential bandwidth than its predecessor, the same holds true for the memory on graphics cards. This generally translates into the card being able to produce more detail at higher frame rates.

So, if one card has 1GB of DDR2 memory and another has 512MB of DDR4, I believe the one with DDR4 will outperform the other, all other things being equal.
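
To put rough numbers on that, here is a minimal sketch of why the memory type dominates: peak bandwidth is roughly the effective transfer rate times the bus width. The clocks and bus width below are illustrative assumptions, not specs for any particular card.

```python
# Rough peak-bandwidth estimate: effective transfer rate (MT/s) x bus width.
# The figures below are illustrative assumptions, not real card specs.
def bandwidth_gbps(mt_per_s, bus_width_bits):
    """Peak bandwidth in GB/s = MT/s * (bus width in bytes) / 1000."""
    return mt_per_s * (bus_width_bits / 8) / 1000

# Hypothetical DDR2-era card: 800 MT/s on a 128-bit bus
print(bandwidth_gbps(800, 128))   # 12.8 GB/s
# Hypothetical GDDR4-era card: 2000 MT/s on the same 128-bit bus
print(bandwidth_gbps(2000, 128))  # 32.0 GB/s
```

On those assumed numbers, the 512MB DDR4 card moves pixels more than twice as fast as the 1GB DDR2 card, and that throughput matters far more than the extra capacity for video playback.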

IMO pretty much any modern graphics card is capable of HD, even integrated lower-end chips.

Blu-ray is better than DVD, which is better than VHS.

The 2ms vs. 8ms comes into play when there is fast action on the screen. The faster 2ms is less likely to streak. Both are good compared to screens from just a few years ago when 25ms was the norm.

There are other factors to consider with an LCD display. Contrast is important.

As far as 1080i/p on a 22" display goes, I would say that is overkill. I doubt you can see the difference between 1080p and 720p at that size.

One final point is that high-end graphics cards are for the most part aimed at gamers. The demands on the card when playing a fast-moving game far exceed what you would need to play a movie on a Blu-ray player. So, I don't think you need to overthink your choices. Normal computing is like watching TV in digital as opposed to HD.

March 27, 2011 at 09:49:03
You make a valid point about the DDR3 and DDR4, but what about my interface type, with the supplied DVI-to-HDMI adapter? Obviously it probably will not carry sound, but I just want to make sure there is no bottleneck there as opposed to having a female HDMI port on the board.

As far as the true HD standards go, I know all about that; I have been editing for a long time. My cameras are all 1080 progressive and interlaced, so it would be nice to see the whole image at full size, frame by frame, when editing. I also watch many full 1080 HD movies on my computer, so full resolution is nice there too.

Thanks for the response time information; I doubt I will notice a difference. Since I'm using it as a monitor, what is the correct viewing distance? Haha, I will be viewing it from 2 feet away with my desktop resolution set to 1920x1080, so things will probably be small. The other thing is that I had the choice between 60Hz and 120Hz; would the 120 have been worth it?

March 27, 2011 at 11:41:37
HDMI is just a convenient method to carry both video and sound in one cable, meant for TVs. The video signal is no better than DVI's.

TVs are sometimes part of an entertainment system. The HDMI cable would be coming from an AV receiver.

In the case of your computer you probably are using a separate speaker system.

March 27, 2011 at 19:34:45
Oh, OK, thanks. I did not realize DVI had such resolution capabilities.

Yes, I use a fiber optic cable from my sound card to a surround receiver.

March 27, 2011 at 21:27:01
If you compare the max frames-per-second rates of two video cards of the same make that have the exact same video chipset but different amounts of the same type of memory, you will probably find no difference in performance at the middle and lower resolutions.
When the card has more memory, the max frame rates at the higher resolutions are better.
With recently released video chipsets on desktop graphics cards, you would need a huge display to actually make use of the higher resolutions without making everything on screen too tiny, so in most cases you don't benefit from more RAM on the card after a certain point.
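
To put a number on "after a certain point": here is a minimal sketch of how much card memory the desktop framebuffer itself needs at common resolutions, assuming 32-bit color and triple buffering (drivers allocate more on top of this for textures and video surfaces, so treat these as lower bounds).

```python
# Approximate framebuffer memory: width x height x 4 bytes (32-bit color),
# times the number of buffers. Assumes triple buffering; real drivers
# allocate extra for textures, video surfaces, etc.
BYTES_PER_PIXEL = 4
BUFFERS = 3

for width, height in [(1680, 1050), (1920, 1080), (2560, 1600)]:
    mb = width * height * BYTES_PER_PIXEL * BUFFERS / (1024 ** 2)
    print(f"{width}x{height}: ~{mb:.0f} MB")
# 1680x1050: ~20 MB
# 1920x1080: ~24 MB
# 2560x1600: ~47 MB
```

Even 1920x1080 uses only a small slice of a 512MB card for the desktop and video playback; the extra memory on a 1GB card mostly pays off in games loading large textures at high resolutions.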

March 27, 2011 at 22:01:14
What do you consider higher resolutions? Something over 1920x1080?

March 28, 2011 at 09:05:49
What the higher resolutions are for a particular video chipset varies, of course.

My point is, if you don't benefit from more memory on the card with the monitor or display you're using, at the max resolution where things are not too tiny, then there's no advantage to the card having more RAM and to you paying more for it.

I just looked on the web and found it's difficult to find the specs about
- what resolutions a particular video chipset supports
- fps ratings for the same video chipset / video card with different amounts of memory on the card

FPS ratings, of course, depend on what software you're testing the video chipset with.
E.g., in a video benchmark test I ran the other day on a friend's system, a Radeon 4850 512MB card averaged about 350 fps over 3 minutes, versus 22 fps at the same resolution from the same system's onboard Radeon 3200 video chipset.

I just looked at the boxes of a few video cards with ATI video chipsets that I have.

What they do rate is the standard resolutions the video chipset supports and the max vertical refresh rate at each resolution.

A relatively old ATI-made All-In-Wonder (AIW) card with a Radeon X600 Pro video chipset and 256MB of DDR memory:

640x480, 800x600, 1024x768, 1152x864 - 200 Hz max vertical refresh rate
1280x1024 - 160 Hz
1600x1200, 1920x1080 - 120 Hz
1920x1200 - 100 Hz
1920x1440 - 90 Hz
2048x1536 - 85 Hz

As I recall from research I did years ago, the more memory on the card, the higher the max vertical refresh rate at the higher resolutions; the maximums at the lower resolutions were identical to cards with the same video chipset but less memory.

The max vertical refresh rate at a particular resolution is an output limit of the card, separate from the frame rates it can render in benchmarks; the falling maximums in the table above point to a fixed pixel-clock ceiling, as sketched below.
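
A minimal sketch of that ceiling, counting visible pixels only (real mode timings add blanking intervals of roughly 20-35%, so the exact ceiling here is an assumption):

```python
# Rough pixel-clock requirement per mode from the X600 Pro table above:
# visible pixels x refresh rate. Blanking intervals are ignored, so real
# pixel clocks run somewhat higher than these figures.
modes = [(640, 480, 200), (1280, 1024, 160), (1600, 1200, 120),
         (1920, 1440, 90), (2048, 1536, 85)]

for w, h, hz in modes:
    mhz = w * h * hz / 1e6
    print(f"{w}x{h} @ {hz} Hz needs ~{mhz:.0f} MHz of pixel clock")
# Output: ~61, ~210, ~230, ~249, ~267 MHz. The higher-resolution modes all
# crowd the same ~210-270 MHz band (the fixed output ceiling); at 640x480
# the listed 200 Hz cap kicks in long before that ceiling does.
```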

I have only one LCD monitor, a 5-year-old Samsung with a 60 Hz max vertical refresh rate at its "Native" or "Optimal" resolution.

I've never had a CRT monitor that was capable of more than, say, 150 Hz vertically.

For LCD displays, the cheapest have a 60 Hz vertical refresh rate at their "Native" or "Optimal" resolution, which is usually also their max resolution.
More expensive models have a 120 or 240 Hz vertical refresh rate at their "Native" or "Optimal" resolution.

The only modern displays I know of that advertise higher figures are plasma displays, with their 600 Hz subfield drive.

The higher the max vertical refresh rate, the better video motion looks (it's less blurred).

March 28, 2011 at 09:32:15
Wow, what a great post packed with information. I will definitely stick with the card I am using now.

I will only be using the 1920x1080 desktop resolution (the display's maximum); that way, if I open a video file from one of my cameras and watch it full screen, I will see it at its full resolution, just as it would look from a Blu-ray player in someone's living room.

I want to say I had an old 21" "big back" CRT monitor in the early 2000s that ran at something around 85-90 Hz. Some sports videos I shoot at 60 fps would probably look better on a 120 Hz display, then. And blurriness also relates to the refresh rate, correct?

March 28, 2011 at 09:48:52
LCD displays work differently than the old CRT type you described. The refresh rate on a 21.5" CRT would need to be at least 75 Hz to avoid visible flicker. LCD displays are rock solid at 60 Hz.

Response time affects fast movement on the screen. Lower is better.
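
On the 60 fps footage question from the previous post: what matters is how evenly the source frame rate divides into the display's refresh rate. A minimal sketch, purely illustrative:

```python
# How many refresh cycles each source frame occupies on a given display.
# An even division gives a smooth cadence; an uneven one causes judder
# (24 fps film on a 60 Hz panel alternates 3 and 2 refreshes: "3:2 pulldown").
def cadence(source_fps, refresh_hz):
    ratio = refresh_hz / source_fps
    if ratio.is_integer():
        return f"smooth: each frame shown {int(ratio)} time(s)"
    return f"uneven: {ratio:.2f} refreshes per frame -> judder"

print(cadence(60, 60))    # smooth: each frame shown 1 time(s)
print(cadence(60, 120))   # smooth: each frame shown 2 time(s)
print(cadence(24, 60))    # uneven: 2.50 refreshes per frame -> judder
print(cadence(24, 120))   # smooth: each frame shown 5 time(s)
```

So 60 fps sports footage maps cleanly onto both 60 Hz and 120 Hz panels; a 120 Hz set mainly helps with 24 fps film content and with motion interpolation.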

Go to the link below and scroll to the bottom. There is some useful info there about LCD displays.

Below is another useful link.

March 28, 2011 at 20:39:50
Great article!!!

March 31, 2011 at 20:41:25
So I got my 40" hooked up on my desk here, it's just as wide as my two 22" monitors together were. But it's twice as tall which gives you a stiff neck, I kind of hate it. Text is not as clear as my 22" monitors were, it looks best in "game mode" which still looks horrible. I tried all different resolutions, 1920x1080 is best and recommended. It also has a 1/2" letterbox on all 4 sides, I tried all size settings on the TV menu. I am going to try my hardest to return it without being charged a $75 restocking fee, my only other option besides the HDMI is the USB or "PC" input which I have not tried yet because I don't think that will look good at all. I bet it's just a cheap USB video card in the TV. Very disappointing

April 1, 2011 at 07:29:07
LCD and Plasma displays have the disadvantage that the display, especially the text, looks best at the "Native" or "Optimal" resolution - at all other resolutions, the display does not look as good. (CRT displays do not have that problem).

Choose a resolution for the TV display that is the same as the display's "Native" or "Optimal" resolution - see the TV's manual.

Turn on ClearType in Windows XP, Vista, or Windows 7 - it makes type and fonts on LCD and plasma screens look clearer.

There is no video card in the TV.
When a TV can display in computer-monitor mode, its pixels are small enough to work acceptably as a computer monitor, but they're often not as small as they would be on an actual computer monitor.
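
You can quantify that pixel-size difference with the numbers from this thread: pixel density is just the diagonal pixel count divided by the diagonal size in inches.

```python
import math

# Pixels per inch = diagonal resolution / diagonal size in inches.
def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(f'22" at 1680x1050: {ppi(1680, 1050, 22):.0f} PPI')  # ~90 PPI
print(f'40" at 1920x1080: {ppi(1920, 1080, 40):.0f} PPI')  # ~55 PPI
```

At two feet away, the pixels on the 40" TV are nearly twice the size of those on the 22" monitors, which is why text looks so much coarser.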

April 1, 2011 at 07:41:36
I have the option of "natural" or "native," but natural leaves a 1/2" letterbox on all four sides, and native is even smaller, with a full 1" letterbox on all four sides. I am already using ClearType. I might try the USB interface, but I am determined to return it for two 24" monitors instead that are 1920x1080, LED-backlit, and 2ms.

April 1, 2011 at 08:00:09
Another point to consider when looking at video card memory is the speed it effectively runs at. For example, if you were considering an NVidia GTX460 (a great-value card), you might think the 1GB card is the same as the 768MB one but with a little more RAM. In fact, the 1GB version's memory sits on a wider bus, so its effective memory throughput is higher, with an appreciable performance improvement.
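
Plugging the two GTX 460 variants into the same bandwidth arithmetic as earlier in the thread (the bus widths and memory clock below are the published specs as I recall them, so treat the exact figures as a best-effort assumption):

```python
# Peak bandwidth: effective transfer rate (MT/s) x bus width in bytes.
def bandwidth_gbps(mt_per_s, bus_width_bits):
    return mt_per_s * (bus_width_bits / 8) / 1000

# GTX 460 768MB: ~3600 MT/s effective GDDR5 on a 192-bit bus
print(bandwidth_gbps(3600, 192))  # ~86.4 GB/s
# GTX 460 1GB: same memory clock, but a 256-bit bus
print(bandwidth_gbps(3600, 256))  # ~115.2 GB/s
```

Same memory type and clock, yet roughly a third more bandwidth from the wider bus, which is where the 1GB card's extra performance comes from.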

April 1, 2011 at 10:19:08
ijack, I pointed that out in #3 above.

April 1, 2011 at 10:41:46
Sorry if you think I'm repeating something you have already said, but I can't see anything in that post that makes it clear that the same memory on the same card, just different amounts, may be run at different speeds.

Believe me, I was just trying to emphasize that particular point, not repeat what has already been said.

April 1, 2011 at 12:17:02
ijack, no problem. I was just emphasizing to pink that the type of memory on the card is probably more important than the amount. They were asking about 1GB of card memory.

April 2, 2011 at 13:22:31
I can definitely say I run my monitors a solid 8 hours a day, 7 days a week. I have had really good luck with my Samsung 2253BW 22" monitors (3-year money-back warranty). Plus, Samsung is a highly established name and their products look better in terms of appearance (bezel, etc.), so I went back to them and ordered two of these last night:

The only difference is that it's 50 cd/m² lower in brightness; however, it is LED-backlit, unlike the ASUS. The only bad reviews are about the stand, which I put up with on the 2253BWs, so it should work fine for me, and they look nice.

They are ONLY HDMI & VGA, so they come with DVI-to-HDMI cables. Would thicker aftermarket cables make a difference, as this person found:
