Gentree Watchman
Joined: 01 Jul 2003 Posts: 5350 Location: France, Old Europe
Posted: Thu Jun 02, 2005 10:29 pm Post subject:

I've seen reports that the more recent (>6629) drivers cause serious heating problems.
Anyone seeing an end to, or confirmation of, that with this release?
I'm masking anything >6629 until I see a resolution of that one.
TIA. _________________ Linux, because I'd rather own a free OS than steal one that's not worth paying for.
Gentoo because I'm a masochist
AthlonXP-M on A7N8X. Portage ~x86
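For anyone following along, masking a version range in Gentoo goes through /etc/portage/package.mask; a minimal sketch, assuming the nvidia package atoms of the era (media-video/nvidia-kernel and media-video/nvidia-glx — check your own tree for the exact names):

```
# /etc/portage/package.mask -- hypothetical entries; verify the exact
# package atoms in your Portage tree before relying on them
>media-video/nvidia-kernel-1.0.6629
>media-video/nvidia-glx-1.0.6629
```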
Archangel1 Veteran
Joined: 21 Apr 2004 Posts: 1212 Location: Work
Posted: Fri Jun 03, 2005 3:26 am Post subject:

I'm most impressed - these drivers have seriously improved TwinView. In older versions, monitors using TwinView were pretty much incomprehensible to the window manager; now it behaves much like Xinerama, except it works a lot better for gaming.
Only problem is that my PCI TNT2 isn't going any more. I wasn't using it much anyway, though... _________________ What are you, stupid?
paulisdead Guru
Joined: 10 Apr 2002 Posts: 510 Location: Seattle, WA
Posted: Fri Jun 03, 2005 4:39 am Post subject:

If you want to mess with Coolbits, you need the new nvidia-settings. Until we get an ebuild for that (Portage apparently uses a repackaged version of it, so I couldn't just rename the old ebuild), you can extract the precompiled one from the nvidia installer. I copied it over the nvidia-settings files Portage had installed, tweaked my xorg.conf, and now I have the overclocking option in nvidia-settings.
*edit
btw Gentree, yes, the overheating when you restart X is still present. However, Coolbits gives us a bit of a workaround: if you overclock the card by something like 5 MHz on the RAM and core, it should cool back down. _________________ "we should make it a law that all geeks have dates" - Linus
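For reference, the xorg.conf tweak being referred to is presumably the Coolbits option in the nvidia Device section; a minimal sketch (the identifier name is made up — match whatever your existing config uses):

```
Section "Device"
    Identifier "nvidia-card"        # hypothetical name; use your existing one
    Driver     "nvidia"
    Option     "Coolbits" "1"       # exposes the clock controls in nvidia-settings
EndSection
```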
kmare l33t
Joined: 20 Nov 2004 Posts: 619 Location: Thessaloniki, Greece
Posted: Fri Jun 03, 2005 1:18 pm Post subject:

it seems to be in portage now... emerging and hoping for no more Xids...
edit: aaaaaaaaaargh... the Xids are still there... please fix it, nvidia!
Last edited by kmare on Fri Jun 03, 2005 3:22 pm; edited 2 times in total
l_bratch Guru
Joined: 08 Feb 2005 Posts: 494 Location: Jersey
Posted: Fri Jun 03, 2005 2:06 pm Post subject:

Just got the new driver; no new nvidia-settings in portage yet, though...
The glxgears score remains the same, but I'll test out the new clocking with some in-game tests later.
mijenix Guru
Joined: 22 Apr 2003 Posts: 393 Location: Switzerland
Posted: Fri Jun 03, 2005 4:48 pm Post subject:

Hi
Yes, and my Nvidia GeForce 2 Go works again with the new driver; the old one was crap!
So now I'm happy with nvidia
--Mathias
th3w4y n00b
Joined: 22 Aug 2004 Posts: 15 Location: Zug/Switzerland
Posted: Fri Jun 03, 2005 5:24 pm Post subject: overclocking is really something

After overclocking using "Auto Detect" on my FX 5200 card,
I changed the GPU frequency from 250 to 301 MHz
and the memory frequency from 280 to 388 MHz,
and glxgears really gave me nice results: from 946.760 FPS to 1320.792 FPS. Super
avalanche n00b
Joined: 01 Oct 2004 Posts: 31 Location: Bremen, Germany
Posted: Fri Jun 03, 2005 5:52 pm Post subject:

Hmm... when I try to use TwinView with the new driver, X says that my hardware does not support it. I'm trying to use it on a GeForce 6800 GT. Anyone with a similar problem?
clajoie n00b
Joined: 26 Mar 2005 Posts: 16 Location: Washington, DC
Posted: Fri Jun 03, 2005 6:06 pm Post subject:

I had been running my monitor at 1600x1200 with the previous version of the nvidia driver; with this version I can't get anything higher than 1024x768. The Xorg log says "bad mode clock/interlace/doublescan" for 1400x1050 and 1600x1200. _________________ - Chad
tam Guru
Joined: 04 Mar 2003 Posts: 569
Posted: Fri Jun 03, 2005 6:16 pm Post subject:
At least my glxgears values increased.
Old driver
Code: |
tam@amd64 zx6r $ glxgears
38276 frames in 5.0 seconds = 7655.200 FPS
38601 frames in 5.0 seconds = 7720.200 FPS
38115 frames in 5.0 seconds = 7623.000 FPS
38568 frames in 5.0 seconds = 7713.600 FPS
|
New driver
Code: |
tam@amd64 zx6r $ glxgears
35973 frames in 5.0 seconds = 7194.600 FPS
40835 frames in 5.0 seconds = 8167.000 FPS
40824 frames in 5.0 seconds = 8164.800 FPS
40834 frames in 5.0 seconds = 8166.800 FPS
|
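Averaged over the four runs above, the new driver still comes out ahead despite its slow first pass; a trivial sketch:

```python
# FPS values copied from the glxgears runs above
old = [7655.2, 7720.2, 7623.0, 7713.6]
new = [7194.6, 8167.0, 8164.8, 8166.8]

def avg(xs):
    return sum(xs) / len(xs)

print(round(avg(old)), round(avg(new)))  # 7678 7923
```

So roughly a 3% average gain, even with the outlier first run included.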
anarchist Apprentice
Joined: 12 Jul 2002 Posts: 264
Posted: Fri Jun 03, 2005 9:26 pm Post subject:

World of Warcraft is running much smoother now, but I am missing the new nvidia-settings in portage; hopefully it will be released in the near future.
Cintra Advocate
Joined: 03 Apr 2004 Posts: 2111 Location: Norway
Posted: Sat Jun 04, 2005 6:24 am Post subject:

The 71.67 settings still work OK for me.
regards _________________ "I am not bound to please thee with my answers" W.S.
stahlsau Guru
Joined: 09 Jan 2004 Posts: 584 Location: WildWestwoods
Posted: Sat Jun 04, 2005 6:37 am Post subject:

hey, and it seems to work with the GF6200 AGP. Nice one, nvidia!
(the older drivers didn't work in X with that card)
*EDIT: I was too quick with my judgement. They don't work. When I wrote this, I forgot that although I had indeed rebooted, I hadn't changed my graphics connector from DVI to normal VGA. Bad news for all 6200 AGP users.
Anyway, I wonder if there's any disadvantage to connecting the card's DVI output through a DVI/VGA converter to a VGA screen. Does anyone know?
**EDIT2: damn, my English is bad today; I hope you understand what I'm writing. Too much fuel yesterday
ferrarif5 Apprentice
Joined: 06 Sep 2003 Posts: 211 Location: Manchester, UK
Posted: Sat Jun 04, 2005 7:29 am Post subject:

My glxgears scores have improved and Doom III seems snappier; just waiting for nvidia-settings to pop into portage, then it's break-my-graphics-card time!
**Only got a GeForce 6200, getting dual 6800 Ultras soon, muhahaha** _________________ Asus P6X58D-E Mobo
Intel Core i7 920
18GB Corsair DDR3
User:335876 | Screenshot
VValdo Guru
Joined: 08 Jan 2005 Posts: 395
Posted: Sat Jun 04, 2005 9:25 am Post subject:

I will be curious, once nvidia-settings is out, to hear what people are doing OC-wise, what kind of performance increases they are getting, and what counts as "safe" overclocking vs. pushing it...
W
lasj n00b
Joined: 14 Jun 2003 Posts: 7 Location: Valby, Denmark
Posted: Sat Jun 04, 2005 10:56 am Post subject:

clajoie wrote: | I had been running the previous version of the nvidia driver and was running my monitor at 1600x1200, with this version I can't get it any higher than 1024x768. XOrg log says "bad mode clock/interlace/doublescan" for 1400x1050 and 1600x1200. |
See http://www.nvnews.net/vbulletin/showthread.php?t=51431
You need to add a modeline for your display in the "Monitor" section of xorg.conf.
I use this for my Samsung 213T:
Code: | Modeline "1600x1200" 140.00 1600 1632 2024 2052 1200 1200 1208 1216 |
The first value is the dot clock: 140.00.
That was the problem; /var/log/Xorg.0.log says:
Code: |
(II) NVIDIA(0): Using ConnectedMonitor string "DFP-0"
(--) NVIDIA(0): DFP-0: maximum pixel clock: 140 MHz
(--) NVIDIA(0): DFP-0: Internal Single Link TMDS
...
(II) NVIDIA(0): SamsungMonitor: Using default hsync range of 30.00-81.00 kHz
(II) NVIDIA(0): SamsungMonitor: Using default vrefresh range of 56.00-75.00 Hz
(II) NVIDIA(0): Clock range: 12.00 to 140.40 MHz
|
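As a sanity check, the vertical refresh implied by a modeline is just the dot clock divided by the total pixels per frame; a small sketch using the 213T modeline quoted above:

```python
def modeline_refresh(dotclock_mhz, htotal, vtotal):
    """Vertical refresh (Hz) implied by a modeline: pixel clock over
    total pixels per frame (horizontal total x vertical total)."""
    return dotclock_mhz * 1e6 / (htotal * vtotal)

# The totals are the last numbers of each timing group in the modeline:
# "1600x1200" 140.00  1600 1632 2024 2052  1200 1200 1208 1216
print(round(modeline_refresh(140.00, 2052, 1216), 1))  # 56.1
```

That lands just inside the 56.00-75.00 Hz vrefresh range the log reports, and the 140 MHz dot clock sits right at the DFP's maximum pixel clock, which is why faster modes get rejected.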
Sm1 Apprentice
Joined: 02 Dec 2003 Posts: 251 Location: Ames, IA
Posted: Sat Jun 04, 2005 12:19 pm Post subject:

stahlsau wrote: | (...) Anyway, i wonder if there's any disadvantage if one connects his card from the DVI-out over a DVI/VGA Converter to a vga screen. Anyone knows? |
Actually, there should be no difference. From my understanding, a specific set of pins is assigned to analogue signals per the DVI spec, and the rest carry either single-link or dual-link DVI. In fact, if you take a look at some GF6600GTs (like mine), all you get are two DVI ports and a little breakout box for component and s-video - so you'd hope there is no degradation from going analogue.
stahlsau Guru
Joined: 09 Jan 2004 Posts: 584 Location: WildWestwoods
Posted: Sat Jun 04, 2005 1:11 pm Post subject:

Quote: | Actually there should be no difference. (...) |
Thanks a lot. In that case I'll keep the connection as it is.
Paulten Apprentice
Joined: 28 Mar 2003 Posts: 257 Location: Sykkylven, Norway
Posted: Sat Jun 04, 2005 3:10 pm Post subject:

What is the status of nvidia + APM/ACPI? Do the new drivers support suspend?
The last time I remember it working was on 31xx-something.
Paul _________________ Homepage: http://paul.kde.no Jabber ID: tenfjord@jabber.org
"They lived like animals. They neither smoked nor drank" - Unknown
firephoto Veteran
Joined: 29 Oct 2003 Posts: 1612 Location: +48° 5' 23.40", -119° 48' 30.00"
Posted: Sat Jun 04, 2005 4:22 pm Post subject: Power Save modes....

Standby, suspend, and power off now work on the second display when you're running dual head off a single card. Not sure if the "randomly wakes itself back up" issue is still present (it might be an xorg issue), since I just have a keyboard button mapped to "xset dpms force off", but I think I'll try the timed settings again. _________________ #gentoo-kde on freenode
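For the timed settings, the DPMS timeouts can be set from an X startup script; a sketch (the timeout values here are arbitrary examples, not recommendations):

```
# e.g. in ~/.xprofile or another X session startup script
# standby after 10 min, suspend after 15, power off after 20 (values in seconds)
xset dpms 600 900 1200
# and to inspect what is currently set:
xset q
```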
tam Guru
Joined: 04 Mar 2003 Posts: 569
Posted: Sat Jun 04, 2005 4:55 pm Post subject:

I have just compared the Doom3 timedemo demo1 FPS with Win XP:
gentoo, driver 1.0.7664: 48 FPS
WinXP, driver 71.89: 62 FPS
These are the values from the third run of the timedemo. The second and third runs were almost the same, just one or two tenths of difference.
The graphics settings are identical on both OSes, of course.
Aynjell Veteran
Joined: 28 Jun 2004 Posts: 1117
Posted: Sat Jun 04, 2005 5:19 pm Post subject:

What video card? My video card beats Windows. _________________ CPU: 3800+ X2 (2.5GHz)
GPU: eVGA 7600GT (640/1700)
MOBO: DFI SLI-DR (Surprisingly good!)
RAM: 2 x OCZ Gold 1024 DDR500 3-4-3-7 (2048)
HDD: Western Digital Raptor
weingbz n00b
Joined: 29 Jan 2005 Posts: 46
Posted: Sat Jun 04, 2005 5:42 pm Post subject: Re: new nvidia driver 7664!!!

kmare wrote: | Release Highlights:
(...)
* Added support for the AGP variant of GeForce 6200.
(...)
* Removed support for legacy GPUs; please see (sec-01) CHOOSING THE
|
This is just great: I have one of these legacy cards, an old PCI TNT2, and I wanted to additionally buy a 6200. It seems I'd need 2 drivers to run that setup (and I don't think that's possible). I think I'll cancel my order; the thing I learn from this is that shit happens if you use any closed source drivers.
tam Guru
Joined: 04 Mar 2003 Posts: 569
Posted: Sat Jun 04, 2005 5:42 pm Post subject:

Aynjell wrote: | What video card? My video card beats windows. |
6600GT PCI-E
|
Back to top |
|
|
|