PDA

View Full Version : Tweak JK II to its MAX PERFORMANCE! L@@K This!


hvydarktrooper
05-13-2002, 10:43 PM
I have discovered a way to make your JK II game run better. If you go to http://jkiiarena.tk and go to the cheats page you'll find out! This really works and I guarantee your game will run way better and at its max on your PC! :) :fett:

hvydarktrooper
05-13-2002, 11:06 PM
Let me know how this works for you. I hope this makes you guys' JK II experience better, since that's what JK II Arena promises. :fett:

Bambers
05-13-2002, 11:19 PM
hmm. I automatically put com_maxfps to 1000 (or some other huge number) in all q3 games. :)

Doesn't make much difference in JO, mind, as my 1GHz Athlon can't quite keep up in heavy fights. :(

hvydarktrooper
05-13-2002, 11:23 PM
It made mine run a lot better. Even though I have a great system (Alienware), JK II still didn't run at its full potential. I did this and it ran awesome!

Mobius47
05-14-2002, 01:35 AM
800x600
16 bit color (human eyes can't distinguish much more than 16,500 colors)
(16 bit is about 16,000 and 32 bit overkill:rolleyes: ,go figure)
dynamic lights
trilinear blah blah blah... you get the picture
i get ~40 fps
(my game boy advance can do that w/ Doom:D :rolleyes: )

comp =
Dell PIII L series (L550r @ 550 megahertz):cool:
256 megabytes ram:cool:
geforce 2 mx200 "Xtasy" 32 megabyte (PCI):cool:
Altec Lansing speakers with a big box (i don't really care 'bout sound:rolleyes: )
Windows 98:D

Can i get higher FPS?

Chastan
05-14-2002, 02:30 AM
Be sure not to set com_maxFPS too high for your system in MP mode or it can directly affect your ping / network performance.

Kaotic
05-14-2002, 03:26 AM
Yeah, it's better to set maxfps to a value slightly higher than your average fps. Set it to a large number and it'll take harder hits when there's a lot of action; set it to something more conservative and it'll be a lot more consistent. In some situations it can even help program stability, but not all.

-Kaotic

CreeP_303
05-14-2002, 03:42 AM
is there any way to tweak a really HORRIBLE system?

specs.


Celeron 375A
320MEGS of RAM
ASUS Riva TNT2 M64

ninelives
05-14-2002, 12:45 PM
well, setting fps too high can be a disadvantage for those who are using an optical mouse.

Ever hear of mouse lag? It means your fps is too high and your mouse can't keep up.

SaberPro
05-14-2002, 12:52 PM
I got a slight increase I guess... it was a constant 90, now I see a bit of the 100s... thx for the info

ArtifeX
05-14-2002, 12:59 PM
com_maxfps is used to cap your framerate so that your cpu is freed up to do more important work. Setting this to anything higher than 80 is totally pointless as the human eye perceives true motion at only 60 fps. People setting their com_maxfps to 1000 are just going to force their processor to do a lot of extra number crunching that doesn't have any perceptible difference on-screen. If you're getting over 80fps consistently, then up your detail levels until your framerate just barely dips below 80 during heavy fight sequences. That way, you're getting the best of both worlds.

I think people would benefit more from setting their displayRefresh property to the maximum value for their monitor in that resolution and color depth.
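
For anyone unsure where these settings live: they are console variables you can set in the in-game console or in a config file loaded at startup. A minimal sketch of such a file follows. com_maxfps and the displayRefresh setting are named in this thread; the exact spellings r_displayrefresh and cg_drawfps are standard Quake 3 engine cvar names and are assumed to carry over to JK II, so verify them in your own console first. Values are examples, not recommendations:

```
// autoexec.cfg -- example values only; tune for your own machine
seta cg_drawfps "1"          // show a framerate counter so you can measure first (assumed cvar name)
seta com_maxfps "85"         // cap the framerate, slightly above your typical average fps
seta r_displayrefresh "85"   // match your monitor's refresh rate at this resolution (assumed cvar name)
```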

AnabolicJedi
05-14-2002, 01:20 PM
ArtifeX is exactly right here.
There is no point in setting the Maxfps value to over 100.
The human eye cannot keep up with framerates higher than 27 frames per second. So it's useless whether it is 35 fps or 150 fps. The cpu will just have to jump from the 120 fps range down to maybe 30 fps in heavy fighting, and that will make your performance worse.

This is just something people do because it maybe feels good to get out an extra 5-10 fps, but your eye won't catch it anyway.

Mobius47
05-14-2002, 01:33 PM
Originally posted by CreeP_303
is there any way to tweak a really HORRIBLE system?

specs.


Celeron 375A
320MEGS of RAM
ASUS Riva TNT2 M64


OH MY GOD THAT IS THE WORST COMPUTER IN THE WORLD!!! HOW THE HELL DID YOU EVEN INSTALL JK2???

Dvlos
05-14-2002, 01:46 PM
About the color thing... there is a difference on computers when you see lights and effects or textures with "banding" in them. There is a difference; if you really take the time to examine your game you will see it.

When I had a p3 450 mhz and was playing Max Payne I THOUGHT there was no difference between 32bit and 16bit until I noticed heavy banding in the subway lights and fog areas. When I got a faster machine Max Payne ran/looked so much better.

I think people should put more effort into playing the game either at high resolutions with 32 bit color, or at 800x600 or 1024x768 with anti-aliasing... it looks sharp!

Wicket the Ewok
05-14-2002, 01:52 PM
The eye can't really see more than about 16,000 colours, but the distribution isn't uniform. It can tell between quite subtle differences in colour in the mid-range while being quite terrible at very bright and very dark colours. That's why 16-bit to 32-bit is quite noticeable.

noaxark
05-14-2002, 02:06 PM
AnabolicJedi:

I'm not too sure if the human eye can catch anything more than 27 frames per second regularly, but in computer games you sure as hell notice a HUGE (and I mean HUGE) difference between 27 fps and 85.

Mobius47:

I don't know where you got that info, but it's all wrong.
The human eye can see something over 5 million (!) colors. 16 bit depth is a bit over 65,000 colors and 32 bit is 16.7 million* colors, so you're a lot better off using 32 bit. It's not such a huge performance hit, though of course it depends on your hardware.

*The reason this number is so much higher than what the eye can see is that it's easier for the computer to calculate it (or something).
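
The arithmetic behind those counts is easy to check; a quick sketch (16-bit color is typically 5-6-5 RGB, and "32-bit" modes carry 24 bits of color plus 8 bits of alpha or padding, which is where the extra bits go):

```python
# Color counts implied by common framebuffer depths.
colors_16bit = 2 ** 16   # 65,536 distinct colors (5-6-5 RGB)
colors_24bit = 2 ** 24   # 16,777,216 -- the "16.7 million" of 32-bit mode
print(colors_16bit, colors_24bit)
```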

noaxark
05-14-2002, 02:10 PM
darn... you guys beat me to the answer... guess I just left the reply page open too long =)

Abefrulman
05-14-2002, 02:58 PM
Ok, I've been trying to practice for that Gateway tournament next month. Now when I load a game with their predetermined settings my FPS drops from 85fps to less than 30fps. I assume it's because there are 12 other bots in the game, but damn, that much? I have my video set to 800x600 32bit, bilinear; basically all my settings are middle of the road... Anyone have any suggestions?

I'm running a Geforce2 MX 400

Homosexual Ewok
05-14-2002, 03:09 PM
http://www.lucasforums.com/showthread.php?s=&threadid=53861

Go there^

:ewok:

Brodieman
05-14-2002, 04:57 PM
Originally posted by noaxark
AnabolicJedi:

I'm not too sure if the human eye can catch anything more than 27 frames per second regularly, but in computer games you sure as hell notice a HUGE (and I mean HUGE) difference between 27 fps and 85.

Mobius47:

I don't know where you got that info, but it's all wrong.
The human eye can see something over 5 million (!) colors. 16 bit depth is a bit over 65,000 colors and 32 bit is 16.7 million* colors, so you're a lot better off using 32 bit. It's not such a huge performance hit, though of course it depends on your hardware.

*The reason this number is so much higher than what the eye can see is that it's easier for the computer to calculate it (or something).

In regard to fps: in psychological testing the human eye can detect differences in frame rate up to around 80 fps, though at a stable 27 fps a change of 5 fps either way is not very noticeable. The difference between 27 and 85 fps is noticeable, mostly because a difference in speed is noted very quickly by the human eye when it drops (not so much when it increases). As to the colour reference, yes, the human eye has a spectrum of over 5 million colours.

TexRoadkill
05-14-2002, 07:29 PM
In regards to framerate-

The reason that computer games need higher frame rates than television (30FPS) is that cameras blur the action and your brain 'fills in the blanks' more easily.

Video games create perfectly crisp images so any skipping of movement between frames is much more noticeable.

Ever notice that Saving Private Ryan effect that was popular for a couple years in action movies? They used it in action sequences in Gladiator also. They shoot that with a high speed shutter so each frame is a crisp shot with little motion blur. That is what gives those action scenes a slight strobe effect. You really notice 30FPS with a fast shutter speed.

When video cards can do motion blur reliably the frame rates can drop considerably and still look good.

I have a P3 750, 512 PC133, GF3 TI200 and the game runs great at high detail. The best of any new game I've played. It does chug on big battles but duels run flawlessly.

Cobalt60
05-15-2002, 05:16 PM
guys..
the reason you need a fast framerate
is because of something called "persistence of vision".
I'm surprised this hasn't been mentioned.

(I hear a lot of talk about the performance limitations inherent to the human eye)

((but.. consider for a moment.. the performance limitations inherent to your display hardware))

your eyes don't respond nearly as 'quickly' to stimulation
as the phosphor coating on the inside of a CRT responds

as a result:
1) your retina will "hold" an image a lot longer than the phosphor coating on the inside of a CRT
(therefore: the retina will retain 'afterimages')
((which is why you often see 'spots' before your eyes
immediately after the policeman shines that damn flashlight in your face))

2) it takes a bit longer for the eye to register what it is seeing
((which is why the eye actually needs to be exposed to the image for a longer period of time than the phosphor))
((which is also why people ~think~ that "the human eye can't see more than 27fps"))

consider this:
the CRT in your television refreshes the picture at a rate of 60hz (right?)
but the television signal is actually filmed/broadcast at a framerate of only 30fps
(hmm)

this means that every second "image" being displayed on your TV is identical.
(why?)

because an individual frame on the CRT would always fade too quickly from the phosphor,
before your eye could fully register what's being displayed.

therefore each image needs to be displayed twice

(ie: the phosphor coating on the inside of the CRT cannot "hold" the image for quite as long as the human retina requires, in order to fully register the image)
((otherwise, the individual frames would fade too quickly for you to see them;
your eye would only register a transparent 'ghost' image, with blurry motion))

ALSO consider:
the CRT in your computer monitor has a phosphor coating
which reacts MUCH MORE QUICKLY to electrical stimulation
than the phosphor coating in a common television

this means that the image burned into the phosphor will fade even MORE QUICKLY on a computer screen than it would on your TV set

(in fact, studies have shown that electrical stimulation at 60hz is NOT ENOUGH to create a crisp lifelike image in VGA)
((and that's why 60hz flickers, it looks 'transparent', and it gives you a headache))
((you need a refresh rate of at least 85hz (100 is nice)
in order to see a crisp lifelike image of your windows desktop))

conclusion: what's the perfect framerate?
(it depends on your monitor; not your "human eye")

the perfect framerate for 3D games is an integer factor of the monitor's refresh rate

1/2 of the refresh rate is adequate,
as is the case with common "television"
((but 1/1 of your refresh rate is preferred))

so then: if your monitor is set to a refresh rate of 100hz,
then your Human Eye will be "adequately" stimulated by a framerate of 50fps
((but a framerate of 100fps is preferred))

anything less is NOT ENOUGH to create the illusion of actual motion

(given the performance limitations inherent to your monitor's CRT)

(regardless of the performance limitations inherent to your "human eye")
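
Cobalt60's "integer factor" rule above can be sketched in a few lines. This is my own hypothetical helper, not something from the thread; it just lists the framerate caps that divide a given refresh rate evenly:

```python
def fps_caps(refresh_hz, minimum=24):
    """Framerates that divide the refresh rate evenly, at or above `minimum` fps."""
    return [fps for fps in range(refresh_hz, minimum - 1, -1) if refresh_hz % fps == 0]

print(fps_caps(100))  # [100, 50, 25] -- cap at 100 if your machine can hold it, else 50
print(fps_caps(75))   # [75, 25]
```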

COMEDY BoB
05-15-2002, 05:55 PM
Originally posted by Cobalt60
*snip*

Good explanation Jedi......Yoda would be proud

Cobalt60
05-15-2002, 07:10 PM
-cheers-



and btw Chastan is correct about MP play.

here's a quote from the man (in this case Kenn Hoekstra):
http://www.webdog.org/plans/173/
"Thursday, April 25th, 2002 - For those of you having
some ping/performance issues in Jedi multiplayer, I
can offer the following advice...
- If you have a really fast machine, you can cap your
frames per second (defaults to 85) using the com_maxFPS
command. I'd recommend capping it at 50 or 60."



((personally I'd recommend capping at 50fps for MP if your monitor is refreshing at 100hz ; see explanation above))

Chastan
05-15-2002, 07:32 PM
I don't know, but even above any supposed "limitations of FPS" that the eye can see, I do notice a difference in FPS. It just looks a lot smoother. I don't really think there's much need to go higher than 100FPS though; it looks pretty good there. I think it really has more to do with the timing of when the frame updates and when your brain catches it. I don't really know, I'm no expert :D


As for refresh rates and such, I always notice when people have their monitor set at 60hz... it really bugs me. It's the first thing I notice when looking at a monitor... I always go "fix your refresh rate" but it seems like other people can't even tell... weird..

A_Crying_Dragon
05-15-2002, 07:43 PM
Keep in mind also that while framerates of 27 per second are what the eye 'sees', and that's what they use at theaters... that is 27 frames per second. The frames per second used by monitors are drawn from the bottom up. That's why things sometimes seem so herky-jerky.

It boils down to: you want to get as close to your computer's vertical refresh rate as you can. So 75Hz = 75fps.
Like the other guy said.

Even some of the most top-of-the-line monitors, and a majority of 17 inch monitors, are topped out at 1024x768 @ 75Hz; mine is slated at 85Hz.
You don't want to use a setting with less than 72Hz. It causes too much eye-strain and perceivable 'breaking of the screen'. IE you see this a lot when TV broadcasts pictures of computer monitors, but it can be seen if you have a sharp eye and good reflexes... unless it's high enough.

Cobalt60
05-15-2002, 08:21 PM
a movie screen and a CRT are entirely different.

the only correlation between them is in the white canvas backdrop that the film is projected onto
(which can be seen as a CRT with an infinite refresh rate, if you like)

the 27 fps at the movies (I thought it was 24fps?) is based on the same idea as above,
but it has nothing to do with the refresh rate of the display screen,
and it has everything to do with your retina.

like I said, the eye needs to be stimulated for a certain length of time before it will retain the image (as explained above) and also, the retina will retain that image for a certain length of time afterwards.

(and the effect, as your eye travels from frame to frame,
results in something we call "motion blur")

(which kicks in at about 25fps)

but this would only apply to a reflective screen (or a wall) that's being stimulated by a strobe light (like at the movies)

this wouldn't apply to your computer screen (CRT),
because that image is not "solid" to begin with.

the CRT must be stimulated twice as often as this (or more)
in order to produce an image that even "appears" solid

(because the CRT fades too quickly, as explained above)

p.s.

an interesting experiment that arises from all this
(in an effort to eliminate the limitations of your hardware and simply test your 'human eye')
would be to cap your max fps at exactly 25 frames, with a monitor set to refresh at 75hz
((similar to the 24 frames which you see at the movies))

and then compare the experience to a maxfps of 50 frames, with a monitor set to refresh at 100hz;
and also to a maxfps of 75 frames, with a monitor set to refresh at 75hz
((all of these settings should provide a nice crisp 'solid-looking' image))

and then ask yourself:
is there "really" a difference, in the fluidity of the motion,
that your 'human eye' can distinguish?

((I bet the answer is yes))

DigitalVapor
05-15-2002, 09:42 PM
MY BIG QUESTION IS: given my system specs, how would I go about getting good performance in HIGH QUALITY mode? Because I am ultimately sick of lower quality settings, and of high quality skipping (sound and video).

My specs are:

AMD Duron 750 mhz
128 MB Ram
ATI Radeon 7000 32 MB
6X Creative PC-DVD Drive

This is the best I can afford, so if you tell me to buy new hardware, F**K YOU (I say this because some smart @$$es can't do anything but insult people's systems and tell them to get better ones)