View Full Version : GeForce 4 info...........


CaptainRAVE
12-26-2001, 05:30 PM
GeForce4 Ti 1000. This is the fastest graphics card built on the GeForce4 chip, with the core working at about 300MHz. The card will have an AGP 8x interface and 128MB of DDR SDRAM working at 700MHz.

GeForce4 Ti 500. This is a slightly slower solution, with a chip frequency of around 275MHz and a memory frequency of 600MHz. Although it will have an AGP 4x interface, the card will still come with 128MB of graphics memory.

GeForce4 MX 460. This is the top model of the GeForce4 MX family. It will probably have 4 rendering pipelines and a DirectX 8-compliant T&L unit. The amount of DDR graphics memory (on a 128-bit bus) will be cut down to 64MB, and its frequency reduced to 550MHz. The core will work at 300MHz.

GeForce4 MX 440. These cards will come with 64MB of DDR SDRAM on a 128-bit bus. The memory will work at 400MHz and the core at 275MHz.

GeForce4 MX 420. According to the available data, this GeForce4 MX version will be targeted at the low-end market, which is why cards built on it will have 64MB of SDR SDRAM working at 166MHz. The core will work at 250MHz. It also looks as if there are only two rendering pipelines in this modification.
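
For scale, here's a quick back-of-the-envelope bandwidth sketch in Python. It assumes the quoted memory frequencies are effective (post-DDR) rates and that all five cards use a 128-bit bus, which the rumor only states explicitly for the MX parts, so take the numbers as rough:

# Theoretical peak memory bandwidth for the rumored cards.
# Assumption: quoted MHz figures are effective rates; 128-bit bus throughout.
cards = {
    "GF4 Ti 1000": 700,
    "GF4 Ti 500": 600,
    "GF4 MX 460": 550,
    "GF4 MX 440": 400,
    "GF4 MX 420": 166,  # SDR, so the quoted clock is already the effective rate
}
BUS_BITS = 128
for name, mhz in cards.items():
    gb_per_s = mhz * 1e6 * (BUS_BITS / 8) / 1e9  # transfers/sec * bytes/transfer
    print(f"{name}: {gb_per_s:.1f} GB/s theoretical peak")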

digl
12-26-2001, 05:48 PM
I had seen that but hadn't posted anything because it said they were rumors. I saw exactly the same thing about two or three days ago.

Is it official now?

Nob Akimoto
12-26-2001, 06:00 PM
Personally, while I have a lot of respect for Xbit Labs, I think this would be a step backwards for nVidia, ESPECIALLY considering what they've announced about their product line with the advent of the Titanium series.

A more likely product line in mid-late 2002 would be:

GF4 Ti-Insert Random Number Here (possibly 1000). Super high end. Based on an NV25 core at .13 micron, with something like 375MHz DDR SDRAM.

GF4 Ti-500 or lower. A mid-range card filling the niche the GF3 Ti200 currently occupies. Probably an NV25 core at .13 micron, with a clock speed of about 250-300MHz (the lower clock is more likely).

NV17 derivative. The NV17 is apparently the new mobile chipset nVidia has planned. Since mobile chipsets are by nature smaller, less heat-intensive, and more economical, expect a chip based on it at .15 micron to be the low-end card. Specs are probably roughly GF3 level, with 2 rendering pipelines and a single vertex shader.

matt--
12-26-2001, 09:57 PM
AGP 8x? That's not even out yet (please correct me if I'm wrong).

Does this mean I need a new mobo for a Ti1000?

digl
12-26-2001, 09:59 PM
I think you're right about it not being out yet.
But maybe you can install it in an AGP 4x mobo, and you just don't benefit from the 8x.

Toa Tahu
12-26-2001, 10:36 PM
As far as I know, GeForce 3 was just released this year... and I purchased a new computer this year with only a GeForce2. Sheesh! Why must those people come out with these things so fast?!

OnlyOneCanoli
12-26-2001, 10:46 PM
To crush the competition. They know ATI won't be able to keep up with their six-month cycle. As games get better graphics, they need better graphics cards.

But that 8x thing is nasty. Would I have to upgrade my mobo in order to use the Ti 1000? I just did that less than a month ago. :( Or should I just stick with the Ti 500?

Frosty_V2.1
12-26-2001, 11:50 PM
Hmmm, I seem to remember reading somewhere that a VIA-based mobo has been released with 8x support. Is that right?

Anyway, it seems to me that if all the other GeForce 4s support AGP 4x, why not the Ti1000? Although it wouldn't be worth buying if you didn't have a rig to match, IMHO.

I've been waiting a long time for this...

Millions o' Monkeys
12-27-2001, 12:02 AM
This is hell. I thought it'd be cool if I went out and bought myself a Ti 500, but now I'm not too sure... I might just stick with my GeForce2 MX.

Darth Bjorn
12-27-2001, 01:13 AM
Well, I was gonna go out and get me a Ti500. But now I think I'll just hold off for a bit. Though this sounds a little fishy to me. I'll really believe it when I see it on the nVIDIA site.

StephenG
12-27-2001, 02:42 AM
Originally posted by Darth Bjorn
Well, I was gonna go out and get me a Ti500. But now I think I'll just hold off for a bit. Though this sounds a little fishy to me. I'll really believe it when I see it on the nVIDIA site.

I've been saving for a big upgrade for almost 2 years. I was gonna buy all the upgrade stuff Jan 1st, but now I don't know either.

digl
12-27-2001, 01:02 PM
The longer you wait, the better the stuff you get; that always happens with PCs.
You can be waiting forever without upgrading, because there is always something better coming out in a few months.
What I want to upgrade now is my RAM.
Maybe when the GeForce 4 is released I'll get a GF3 Ti500 cheap.

acdcfanbill
12-27-2001, 05:40 PM
If I buy a GF3, that's when I would get it too... I'm looking at the Radeon 8500 really closely though... ;)

Nob Akimoto
12-27-2001, 05:43 PM
With regards to AGP form factors...

AGP 8x is (to my knowledge) slot-compatible with AGP 4x; the only thing is you won't get the extra benefit.

I wish they had actually raised the AGP bus speed rather than just doing this all over again...

It's been shown that faster-clocked AGP buses do more than the switch from the 2x standard to 4x did...
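
If you want to see what rates your own slot/card pair advertises, something like the following works on Linux. It's only a sketch: the AGP capability lines in lspci output vary with the pciutils version, and you may need root to see them at all.

# Dump the AGP capability lines from lspci, e.g. "Status: ... Rate=x1,x2,x4".
# Sketch only: output format differs across pciutils versions.
import subprocess

out = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout
for line in out.splitlines():
    if "AGP version" in line or "Rate=" in line:
        print(line.strip())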

digl
12-27-2001, 05:45 PM
Me too.
But in the latest reviews the 8500 is still no match for the GF3 Ti.
Although the Radeon wins in DirectX 8 tests, and maybe that could make it better for future games.
And the Radeon is much cheaper...
By the time the GF4 is released I'll check the latest 8500 reviews and decide what to do :)

acdcfanbill
12-27-2001, 06:12 PM
Eh? What reviews are you reading? Most everything I see puts the 8500 slightly above the GF3 Ti... even my buddy's benchmark agrees...

StormHammer
12-27-2001, 07:04 PM
Originally posted by digl
The longer you wait, the better the stuff you get; that always happens with PCs.
You can be waiting forever without upgrading, because there is always something better coming out in a few months.


I couldn't agree with you more. I've often felt the same... upgrade now, or wait for that rumoured new piece of kit? You could end up getting yourself into a negative-feedback loop and never upgrading. The only problem is, you realise by the time you do upgrade that your old piece of kit isn't worth anything, because the current low-end PC spec at a budget price blows your kit completely out of the water.

My current advice would be to wait for the new piece of kit to arrive and get its predecessor at a lower price... then ditch that in 6 months to a year for the next step up the ladder, so you can at least get some money back for your redundant components before support for them is dropped by the manufacturer.

Of course, the only problem with that scenario is that you constantly have to find the money to upgrade...but at least you should make some money back if you upgrade at the right time. Unless you're a charitable idiot like me and just give the redundant stuff away to your relatives and friends. :)

Agen
12-27-2001, 07:04 PM
Yep, the 8500 seems to beat the GF3 in most reviews, not by much though.

xwing guy
12-27-2001, 07:34 PM
Yeah it does, but if it came down to choosing between a GF3 or a Radeon 8500, I would choose whichever one was cheapest, because they're so close.

digl
12-27-2001, 07:40 PM
Looks like I haven't read any reviews since the last driver release by ATI.
I'll search for some new reviews.

acdcfanbill
12-27-2001, 09:02 PM
That's probably it, digl. If I remember correctly, when ATi released its new drivers to utilize all of the Radeon's features, it beat out the GF3 Ti...

Tap[RR]
12-28-2001, 07:41 AM
Read this Radeon 8500 review with the new drivers (http://www.hardocp.com/reviews/vidcards/ati/8500_revisited/); they compare the Radeon drivers as well as the GeForce3... =) I'm thinking about upgrading again (just upgraded RAM to 512MB a day ago for JK2 >_<). Heh... now I need/want a Radeon or a GeForce3, can't make up my mind :confused:.

digl
12-28-2001, 05:42 PM
I read a newer review yesterday and the 8500 won almost everything.
I'll check that review.
Thanks for the link, Tap.

Darth_Lando
12-28-2001, 06:17 PM
I'm glad ATi is stepping up to the plate. That benefits all hardcore gamers in the end (whether you buy their cards or not).

Kyro III should be coming out as early as February (though I bet there will be a delay). This card is supposed to be "it" for PowerVR. Supposedly it has TRUFORM, is a DirectX 8 card, and is as fast as the Radeon 8500 for half the price; that is, about $150 US. But knowing their track record, I am not holding my breath on this card.

GF3 and R8500 are both great cards. Personally I am leaning more towards the Radeon. But if I somehow ended up with a GF3 I wouldn't complain one bit.

It's just too bad about Matrox.

Sherack Nhar
12-28-2001, 07:15 PM
Bah, Matrox aren't competitors. They just produce awesome business cards, that's all. Their target audience is much different.

I think those rumored specs are right on the mark. I'm also suspecting that 64-bit color depth support is just around the corner... It'll be a great new way to hammer down your framerate with no visible enhancement to the graphics!

EDIT: If you guys want to get a GF3 Ti500 or a RADEON 8500, I sure hope you don't want to play DooM3 or Quake 4... those cards are hardly able to run the new DooM engine.

Lord_FinnSon
12-28-2001, 07:46 PM
You mean with MAX quality settings? I guess you're right about that, but these games have so many options you can turn down if you don't get enough frames with your graphics card; you might also want to use (if you haven't already) applications like NVmax and/or PowerStrip to do some additional tweaking of your card's settings. Of course, I have to admit that you should always be able to play every game at its highest quality settings (without too many frame drops), the way it was meant to be played, but that would simply mean buying a new Nvidia/ATi graphics card every year. As a side note, I'm also going to upgrade my PC quite soon, because after almost three years my good old Riva TNT just can't do the job; I was only able to extend its lifespan by playing with max performance settings, while dreaming about a newer, bigger card that could some day show me every possible little detail. :D

Sherack Nhar
12-28-2001, 08:55 PM
Originally posted by Lord_FinnSon
Of course, I have to admit that you should always be able to play every game at its highest quality settings (without too many frame drops), the way it was meant to be played, but that would simply mean buying a new Nvidia/ATi graphics card every year.

That's not entirely correct. John Carmack's engines have always been known to be incredibly demanding. If you have a graphics card that runs Carmack's engines at their highest settings (with a 60+ framerate), then you can run about anything on the market right now.

BTW, nice avatar ;)

Nob Akimoto
12-28-2001, 11:50 PM
Originally posted by Sherack Nhar
EDIT: If you guys want to get a GF3 Ti500 or a RADEON 8500, I sure hope you don't want to play DooM3 or Quake 4... those cards are hardly able to run the new DooM engine.

You base your claim about D3/Q4 on what?

The tech demos for Doom3 were run on a GeForce3; I would hardly say that's "hardly" running a game's engine, especially considering the unoptimised state such a demo would be in.

While I don't question id Software's ability to create new game engines that continuously push the envelope in terms of performance stress, it should be pointed out that the main selling point of these games is exactly that: being a game.

No developer, not even one with the clout of id and Carmack, would release something that would obsolete the mid-range (as of H2 2002), never mind the generations behind it.

Considering D3 itself is coded to take advantage of the NV20 architecture, I really can't see how it wouldn't run on said arch at a playable framerate at medium or higher resolutions.

The only real bottleneck is texture memory, and unless DooM3 uses ungodly-sized textures per scene and pushes more than 50,000 polys, I really can't see how it would slow to a crawl.

For all intents and purposes, while id engines do show off a great deal of flash, they're hardly the most demanding things on the market.

If you really want demanding, check out professional 3D apps. Now THOSE are demanding.
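
To put a number on that poly figure, here's a quick Python sketch. The 50,000 polys per scene is the figure from this post; the GF3-class triangle setup rate is an assumption taken from period marketing claims, not a measurement:

# How much of an assumed GF3-class setup rate would DooM3's scenes eat?
polys_per_frame = 50_000      # figure quoted above
setup_rate = 30_000_000       # triangles/sec -- assumed marketing number
for fps in (30, 60, 100):
    needed = polys_per_frame * fps
    print(f"{fps} fps -> {needed / 1e6:.1f}M polys/sec "
          f"({100 * needed / setup_rate:.0f}% of the assumed setup rate)")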

Sherack Nhar
12-29-2001, 12:33 AM
Sorry for being so imprecise, here is the full story:
Take this link (http://www.nvnews.net/cgi-bin/search_news.cgi?keyword=john+carmack) and go see "The Carmack- Part 2"

Here are various quotes by The Man himself:
The low end of our supported platforms will be a GF1 / 64 bit GF2Go / Radeon, and it is expected to chug a bit there, even with everything cut down.
This one shows that Carmack is clearly not aiming to introduce the new DooM to the mainstream crowd. However, keep in mind that DooM3 is still a long way from being released. Even so, it shows that mass-market accessibility is just not a concern for good ol' John :)


We are aiming to have a GF3 run Doom with all features enabled at 30 fps. We expect the high end cards at the time of release to run it at 60+ fps with improved quality.

The "high end" cards he's referring to are probably going to be released in spring.

The reason the new DooM engine is so demanding is that the lighting model Carmack has created is just so darned complex that it brings "older" video cards like the GF2 to their knees. You should check out that Carmack interview on Gamespot under their DooM3 coverage.

I hope I've been clear enough this time ;)
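
For perspective, those targets translate directly into per-frame time budgets that the lighting, geometry, and game logic all have to share. This is pure arithmetic on the quoted figures, nothing assumed beyond them:

# Frame-time budgets implied by Carmack's quoted targets.
for fps in (30, 60):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")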

ed_silvergun
12-30-2001, 01:30 PM
Bear in mind that the DooM 3 demos that were exhibited were probably running on high-end PCs which might well not be available to the mainstream market yet.

GeForce3 will not run the new engine from id very well at all. Yes, you'll get a playable framerate with some graphics options cut down, but if you want to play it as it was intended, you will need a GeForce4 or equivalent. Remember that the new DooM game is not going to be out for about another year, maybe more, and that GeForce3 will be nearly two-year-old technology by then. That's pretty old by computer standards!

Yes, AGP 8x will run on a 4x bus, you just won't get the extra speed benefits.

Darth_Lando
12-31-2001, 09:45 AM
LOL. Even though DOOM 3 will only play at 60fps on the fastest cards on the market when it does come out, everyone will be saying to wait another 6 months, because the next generation of 3D cards will be out and able to play the game at 100 fps. Then when those come out, people will say "wait 6 more months and you can play it at 130 fps on the next-generation cards..." etc. :D :D :D

Vagabond
12-31-2001, 04:36 PM
Well, regardless of what card will run Doom 3 acceptably, I doubt I'll be buying any "high-end" video cards now or in the future. Spending over $200 on a video card, while easily within my budget, just doesn't offer a significant ROI from my point of view. For example, I recently purchased a VisionTek Xtasy GeForce 3 Ti-200 for the computer I'm building, for only $199. That card offers the functionality of the higher-end GeForce 3 cards and excellent speed for a relatively reasonable price. Sure, I could have spent an extra $150 on the Ti-500, and for what? An extra 20 frames per second? That's just not worth it to me.
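
Making that cost/benefit arithmetic explicit (Python; the $199 price, $150 premium, and ~20 fps delta are the figures from this post):

# Cost per extra frame/sec of the Ti-500 over the Ti-200, per the post above.
ti200_price = 199  # what the Ti-200 cost
ti500_extra = 150  # quoted premium for the Ti-500
fps_gain = 20      # rough frame-rate delta cited
print(f"${ti500_extra} on top of ${ti200_price} buys ~{fps_gain} fps "
      f"= ${ti500_extra / fps_gain:.2f} per extra frame/sec")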

If Doom 3 can't run on anything but a GeForce 3 and above, then it is likely that Mr. Carmack will be somewhat disappointed in his sales figures.

psycoglass
12-31-2001, 05:00 PM
What's the difference between an AGP 4x and an AGP Pro slot? I won't believe any of the GeForce 4 news until I see it on a news site or in Nvidia's press release.

The_Phantasm
01-01-2002, 10:01 PM
"GeForce 4 info...........
GeForce4 Ti 1000. This is the fastest graphics cards built on GeForce4 chip working at about 300MHz frequency. The card will have AGP 8x interface and 128MB DDR SDRAM memory working at 700MHz.

GeForce4 Ti 500. This is a bit slower solution with around 275MHz chip frequency and 600MHz memory frequency. Although it will have AGP 4x interface, the card will still come with 128MB graphics memory.

GeForce4 MX 460. This is the eldest representative of the GeForce4 MX family. It will probably have 4 rendering pipelines and DirectX 8-compliant T&L unit. The amount of DDR graphics memory used (with 128bit bus) will be cut down to 64MB, and its working frequency will be reduced down to 550MHz. The core will work at 300MHz.

GeForce4 MX 440. These cards will go with 64MB DDR SDRAM with 128bit access bus. The memory working frequency will be 400MHz and the core frequency – 275MHz.

GeForce4 MX 420. According to the available data, this GeForce4 MX version will be targeted for the Low-End market that is why the cards built on it will have 64MB SDR SDRAM memory working at 166MHz. The core will work at 250MHz. Besides, It looks as if there were only two rendering pipelines in this modification."

Again, this is speculation; I've also seen it on the IGN forums.

Nob Akimoto
01-01-2002, 10:36 PM
The originator of this speculation is Xbit hardware (at least to my knowledge).

It'd be nice if people would actually post the SOURCE of news clips from here on out.

It's the very least one can do. It's just a matter of decency.

digl
01-01-2002, 11:16 PM
I also read it at Xbit hardware long ago; they probably originated it.
It's a rumor, that's why there is no source. If the source were someone working at Nvidia, he would be fired at once if his name were published all around the web along with the specs of unannounced cards :D

StephenG
01-01-2002, 11:28 PM
Originally posted by Vagabond
If Doom 3 can't run on anything but a GeForce 3 and above, then it is likely that Mr. Carmack will be somewhat disappointed in his sales figures.

When is Doom3 coming out? By the time D3 comes out, the GF4 will be out, or maybe the GF5????? Who knows. The point I'm trying to make is that when Doom3 comes out, just about every PC gamer will have a GF3. Who still uses a TNT1 or the cards before it?

digl
01-01-2002, 11:35 PM
The point I'm trying to make is that when Doom3 comes out, just about every PC gamer will have a GF3.
I'm not sure of that, but things move so fast that it's possible.

Nob Akimoto
01-02-2002, 12:40 AM
Originally posted by StephenG


When is Doom3 coming out? By the time D3 comes out, the GF4 will be out, or maybe the GF5????? Who knows. The point I'm trying to make is that when Doom3 comes out, just about every PC gamer will have a GF3. Who still uses a TNT1 or the cards before it?

I see you're not very well versed in the PC industry then.

While the product cycle of a particular graphics manufacturer is 6-10 months (horrible, horrible cycle btw...), the turnaround time for the AVERAGE PC user is on the order of 3-5 years.

Quake3, though it sold well for all intents and purposes (considering it had much more reasonable minimum specs than its eventual successor, D3), is a case in point.

While I don't know anyone still using the Riva128 chipset (it was a crappy piece of **** card to begin with), I know quite a few casual gamers who still use TNTs.

Right now I'd say a GF256 (yes, that old card) as a minimum would be the most reasonable for ANY game that wants mass-market appeal.

The truth of the matter is that Q3 and its brethren probably sold very FEW copies compared to titles such as The Sims or the Tycoon series, which do not require ridiculously overpriced, overhyped $400 pieces of silicon to function.

Games sell very few copies when they set outrageous minimum requirements, at least during their most profitable window.

Which is why publishers will jack up the price to compensate.

Sorry, I'm not paying $400, then an additional $20 over the standard $40, to play some overhyped no-game-depth flash fest...

StephenG
01-02-2002, 01:39 AM
Originally posted by Nob Akimoto
Sorry, I'm not paying $400, then an additional $20 over the standard $40, to play some overhyped no-game-depth flash fest...

I'm not saying you have to; it's just that when new stuff comes out, prices of the old stuff go down in time. Not even I'm gonna put $900 Australian dollars into a vid card. I got my GF2 Ultra cheap, very cheap, about 3 months after the GF3 release. D3 is very far from release; I still reckon when it comes out most PC gamers will have something like a GF3. The 3D graphics card industry is moving very, very fast.

digl
01-03-2002, 09:11 PM
http://www.msnbc.com/news/681639.asp

Those guys were the source of the GF4 info.