The Hyperlogos
Read Everything

Compaq nw9440 xorg.conf: Support for all resolutions


I have a Compaq nw9440 "Mobile Workstation" class system, P/N (or type, or something) EZ901#AA. There are a number of slightly different systems called the nw9440 which come in the same case; I won't go into that here, as this isn't an ad for HP. In fact, this laptop has presented (in my opinion) more than its fair share of obstacles in my pursuit of Linux. ACPI is slightly wonky, the modem is from Conexant, and video, of all things, has been a major PITA. This is highly unfortunate, because this laptop has quite excellent graphics, in fact carrying the best of its breed for its date of release: the NVIDIA Quadro FX1500. In particular, I am able to use the display panel only at its maximum resolution.

As time has gone on, I've finally managed to assemble a functional xorg.conf by playing the 'nv' (Free, Open Source) and 'nvidia' (official NVIDIA corporate) drivers against one another. For some reason, the Free driver is capable of detecting all of the resolutions of which the panel is capable. Unfortunately, it's not capable of gracefully handling high temperatures: at least at this time, when the GPU temperature hits the thermal threshold, the X server locks up. Sometimes you can kill it remotely and get video back; sometimes you can kill it remotely and never get video back. So it was in my best interest to get the closed-source driver working if I wanted a stable system.

The problem in detail

For some reason, the nvidia driver is unable to determine what resolutions the internal digital flat panel display (DFP) can handle. The EDID that it retrieves (attached below) provides the following information (via parse-edid):

$ parse-edid edid.bin
parse-edid: parse-edid version 1.4.1
parse-edid: EDID checksum passed.

        # EDID version 1 revision 3
Section "Monitor"
        # Block type: 2:0 3:f
        # Block type: 2:0 3:fe
        # Block type: 2:0 3:fe
        Identifier "SEC:4743"
        VendorName "SEC"
        ModelName "SEC:4743"
        # Block type: 2:0 3:f
        # Block type: 2:0 3:fe
        # Block type: 2:0 3:fe
        # DPMS capabilities: Active off:no  Suspend:no  Standby:no

        Mode    "1680x1050"     # vfreq 58.794Hz, hfreq 64.674kHz
                DotClock        119.000000
                HTimings        1680 1728 1760 1840
                VTimings        1050 1052 1058 1100
                Flags   "-HSync" "-VSync"
        EndMode
        # Block type: 2:0 3:f
        # Block type: 2:0 3:fe
        # Block type: 2:0 3:fe
EndSection

This is lovely, and that is a working resolution. It enables the use of the panel at its native resolution and at the highest supported frequency, which is all anyone who doesn't play games will ever need. Unfortunately, I am not one of those people; in fact, I play Windows games in both Wine and VMware Server, both of which require resolution switching to provide full-screen play, as neither provides any video scaling.
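As an aside, the raw EDID blob (the edid.bin attached below) is easy to sanity-check before you bother decoding it: every valid EDID block begins with the fixed eight-byte header 00 ff ff ff ff ff ff 00. A minimal sketch, using a stand-in file (the filename edid-demo.bin is made up for the demonstration) so it can be run anywhere; on the real system you'd point od at the dumped edid.bin instead:

```shell
# Every valid EDID starts with the magic header 00 ff ff ff ff ff ff 00.
# Write a stand-in header so this check is self-contained; on a real
# system, run od against the actual dumped edid.bin.
printf '\000\377\377\377\377\377\377\000' > edid-demo.bin
od -An -tx1 -N8 edid-demo.bin | tr -s ' '
```

If the first eight bytes don't match, the dump (not the panel) is the problem.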

Since this system began its life running Ubuntu Dapper Drake, I originally used the 'nv' driver long enough to install and run the 'nvidia' version. I then created my new xorg.conf by modifying the original. I flipped back and forth now and then during the process, and noticed that I couldn't have support for all resolutions and 3D acceleration at the same time. At the time I was playing only Alpha Centauri and Civilization 2 (in VMware) and neither game required any scaling, being turn-based strategy games designed for large displays.

The Solution

Reading through various Linux manual pages, I came across the following page in particular:

xvidtune(1x)                                                      xvidtune(1x)

NAME
       xvidtune - video mode tuner for Xorg

SYNOPSIS
       xvidtune [ -show | -prev | -next | -unlock ] [ -toolkitoption ... ]

DESCRIPTION
       Xvidtune  is  a  client  interface to the X server video mode extension
       (XFree86-VidModeExtension).

xvidtune is one of those programs that's amazingly useful for people who have a multisync CRT monitor, and pretty much useless for everyone else - or so I thought. Reading down a bit, we come to the following:

OPTIONS
       xvidtune accepts the standard X Toolkit command line options as well as
       the following:

       -show     Print the current settings to stdout in xorg.conf  "Modeline"
                 format and exit.

Well, how about that? I don't even have to copy this stuff out? For example, if I run xvidtune -show right now, I get the following:

$ xvidtune -show 
"1680x1050" 147.10 1680 1784 1968 2256 1050 1051 1054 1087 +hsync +vsync 

Prefix this with the word Modeline and bingo! You've got a Modeline entry for your xorg.conf. Most people don't have these in there any more. Sadly, way back when I got started with this whole Linux thing (get off my lawn!) there were no configuration-authoring programs; you had to write your XFree86 conf file yourself, complete with modelines. If you had an at all wonky monitor (which I did), you had fun ahead of you. I have a funky monitor today, too, but now I have the "nvidia-settings" program to work with, and Xorg to make it all irrelevant at any time other than, apparently, when I'm setting up this particular system.
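To make that concrete, here's the exact transformation on the line above. There's nothing to it beyond pasting the word in front; the variable is only there so the sketch is self-contained (in practice you'd pipe xvidtune -show through something like sed 's/^/Modeline /'):

```shell
# The line printed by `xvidtune -show` above, captured in a variable:
mode='"1680x1050" 147.10 1680 1784 1968 2256 1050 1051 1054 1087 +hsync +vsync'
# Prefix it with "Modeline" and it's ready to paste into xorg.conf:
echo "Modeline $mode"
```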

Starting a new xorg.conf

Anyway, all you have to do to get an xorg.conf file these days (at least, if you have the 'nvidia' driver installed) is run nvidia-settings, and a nice little window pops up that lets you configure your graphics system. All you have to do to generate a generic file is click "X Server Display Configuration" and then click the "Save to X Configuration File" button. If you uncheck the "Merge with existing file" checkbox and save to a new file, it will start you out fresh.

I won't go into the boring details of where I found all of this information, but I will explain a bit more about the fixes. I will provide excerpts from the xorg.conf here, and a full, working file is attached to this page.

Using Modelines

It's quite simple to use a Modeline. We saw what they looked like above, but here's a more expanded example of a "Modes" section of an xorg.conf.

Section "Modes"
    Identifier         "Modes_0"
    ModeLine     "1680x1050" 119.0 1680 1728 1760 1840 1050 1052 1058 1100 -hsync -vsync
    ModeLine     "1024x768" 65.0 1024 1048 1184 1344 768 771 777 806 -hsync -vsync
    ModeLine     "800x600" 40.0 800 840 968 1056 600 601 605 628 +hsync +vsync
    ModeLine     "640x480" 25.2 640 656 752 800 480 490 492 525 -hsync -vsync
EndSection

These are the most important modes to me, although I have more modes in my actual xorg.conf. Let's see how these are used in the file:

Section "Monitor"
    # HorizSync source: edid, VertRefresh source: edid
    Identifier     "Monitor0"
    VendorName     "Unknown"
    ModelName      "Seiko"
    UseModes       "Modes_0"
    HorizSync       30.0 - 75.0
    VertRefresh     59.0 - 60.0
    Option         "DPMS"
EndSection

That's the complete, actual "Monitor" section, with (as far as I know) more or less accurate values for the panel's horizontal and vertical sync ranges. It will do some modes with a very slightly higher or lower vertical refresh.
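You can check where any given Modeline lands in those ranges yourself: the horizontal frequency is the dot clock divided by the horizontal total (the last HTimings number), and the vertical refresh is the horizontal frequency divided by the vertical total (the last VTimings number). A quick sketch, run against the native mode from the Modes section above:

```shell
# hfreq (kHz) = dot clock (MHz) * 1000 / horizontal total ($6)
# vfreq (Hz)  = hfreq (kHz) * 1000 / vertical total ($10)
awk '{ hfreq = $2 * 1000 / $6; vfreq = hfreq * 1000 / $10
       printf "%s: hfreq %.3f kHz, vfreq %.3f Hz\n", $1, hfreq, vfreq }' <<'EOF'
"1680x1050" 119.0 1680 1728 1760 1840 1050 1052 1058 1100
EOF
```

The result agrees with what parse-edid reported above (hfreq 64.674 kHz, vfreq 58.794 Hz). Note that 58.794 Hz sits just below the 59.0 floor in the Monitor section, which is exactly the "very slightly higher or lower" caveat.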

Making Modes work with 'nvidia'

The nvidia driver wants to get my monitor information from the EDID, and it seems to want to reject any mode it doesn't find there. It was also rejecting modes because they supposedly had a too-high "pixel clock", even though those modes' clocks were lower than the rate reported in the EDID itself! So I took a more "brute force" approach to specifying the modes I wanted. Prior to this, I had the Modelines and such as above, but the modes were all rejected!

Section "Screen"
    Identifier     "Screen0"
    Device         "Videocard0"
    Monitor        "Monitor0"
    DefaultDepth    24

This sets up the X "Screen" (to which graphics are drawn), including the bit depth. Various other options which aren't really relevant also live here, but these are the interesting ones:

    Option         "UseEdidFreqs" "False"
    Option         "AllowDDCCI" "True"
    Option         "IncludeImplicitMetaModes" "True"
    Option         "ModeValidation" "NoEdidMaxPClkCheck"

The only lines I am sure are necessary are the latter two. The first tells the driver to take the sync ranges from the Monitor section of this file rather than from the EDID. The second permits the use of DDC/CI to get information about the display. The third includes video modes which are normally discarded on a DFP with an EDID (even a wrong one). The last disables that pesky pixel clock check, and without it some of these modes do not work: the pixel clock of the panel is 330 MHz, but the nvidia driver eliminates anything above the clock given for the mode shown in the EDID, which is 119 MHz.
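If the driver is still rejecting modes, it can be made to tell you exactly why. NVIDIA's README documents a ModeDebug option for drivers of roughly this era (an assumption on my part that your driver version has it); added to the same section, it logs every mode considered, and the reason for each rejection, to /var/log/Xorg.0.log:

```
    Option         "ModeDebug" "True"
```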

The following also exists in this section, and could close it out like this:

    SubSection     "Display"
        Depth       24
        Modes      "1680x1050" "1024x768" "800x600" "640x480" "1024x768" "800x600" "640x480" "1680x1050_2" "1024x768" "800x600" "640x480" "1400x1050" "1280x1024" "1440x900" "1280x960" "1280x800" "1280x768" "1152x768"
    EndSubSection
EndSection

The default depth is set (as you saw a bit above) to 24. This is actually 32 bits, but it's specified as 24 for compatibility purposes. The Quadro FX actually processes color with ten bits per channel, or 30-bit color, but applications don't know this; it's only used for internal processing and eventually for display. I'm not sure how this is reflected at the panel level, and for all I know the panel is dithered like a MacBook Pro, but I do believe there is a 10-bit-per-channel RAMDAC for analog output. Quadro display adapters are meant for professional use (and this one does not disappoint!) The other eight bits are either discarded or used for alpha (transparency), but as our display devices do not have variable opacity (yet?) they are significant only for textures and not for display.

All of the options in my xorg.conf are in the one attached to this page, and they set up OpenGL options and other niceties (for instance, you can use Beryl/Compiz.)

Using the xorg.conf

For those who somehow :) don't already know, you slip this file into a particular directory and it configures X. In Ubuntu's case (and so, presumably, in distributions both upstream and downstream of it) this directory is /etc/X11. Make a backup copy of your xorg.conf; you'll probably want to copy lines out of it. I probably want to do that too, but I haven't gotten around to it yet. I do still have the one made by debconf, though, and that one has all the proper font paths and whatnot.

Just download the file someplace, then copy /etc/X11/xorg.conf to /etc/X11/xorg.conf.bak (as root), then gunzip the xorg.conf that you downloaded from me (xorg.conf.gz) and copy it to /etc/X11 (again, as root).
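Those steps, as a shell sketch. To keep it safe to run as-is, this version operates on stand-in files in a throwaway directory; on the real system, X11DIR is /etc/X11, the stand-in lines go away, and the two copies are done as root:

```shell
# Stand-ins so the sketch is self-contained; replace with the real thing.
X11DIR=$(mktemp -d)                                  # really /etc/X11
echo 'old config' > "$X11DIR/xorg.conf"              # your current xorg.conf
echo 'new config' > xorg.conf && gzip -f xorg.conf   # the downloaded xorg.conf.gz

cp "$X11DIR/xorg.conf" "$X11DIR/xorg.conf.bak"       # 1. back up the old file
gunzip -f xorg.conf.gz                               # 2. unpack the download
cp xorg.conf "$X11DIR/xorg.conf"                     # 3. drop it in place
```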

You should copy any missing lines from the "Files" section in particular from your xorg.conf to mine. If your keyboard or mouse doesn't work, copy the appropriate section and replace mine.

Attachments:
  • edid.bin.gz (121 bytes)
  • xorg.conf.gz (1.95 KB)

Tags: howto, nVidia, Xorg, Compaq, laptop
Copyright © 2025 Martin Espinoza - All rights reserved