VGA

The video gate array on this thing is decidedly strange. It is not a single VLSI chip. Rather, it is done the old-fashioned way with a frequency generator, a sync generator, a RAMDAC, and some jellybeans. There is no text mode; everything has to be plotted directly into the frame buffer. There is 1MB of video RAM soldered to the board, plus pads for another 1MB.
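Just to make the point concrete: with no text mode, even putting a character on the screen means rendering its bitmap by hand. Here is a minimal sketch of the idea; the base address, the 8bpp depth, and the glyph are placeholders of mine, and the 1024-byte line stride is the one described under The Jellybeans below.

    #include <stdint.h>

    #define FB_BASE   ((volatile uint8_t *)0x00200000)  /* placeholder address, not the real one */
    #define FB_STRIDE 1024                              /* every line starts on a 1024-byte boundary */

    /* crude 8x8 glyph for 'H'; one byte per row, MSB is the leftmost pixel */
    static const uint8_t glyph_H[8] = {
        0xc3, 0xc3, 0xc3, 0xff, 0xff, 0xc3, 0xc3, 0xc3
    };

    /* plot the glyph at (x, y), assuming 8 bits per pixel */
    static void draw_glyph(int x, int y, const uint8_t rows[8], uint8_t color)
    {
        for (int r = 0; r < 8; r++)
            for (int c = 0; c < 8; c++)
                if (rows[r] & (0x80u >> c))
                    FB_BASE[(y + r) * FB_STRIDE + (x + c)] = color;
    }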

The Frequency Generator

The dot clock is generated by an Avasem 9116-14CN20. I can't find any docs for it, but the ICS 2494 seems close. The ViewStation can come with either of two varieties of the chip: the stock chip has clocks from 55MHz to 130MHz in 5MHz increments, and a rework was available that replaced it with a chip offering standard VGA rates and added a jumper.
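For what it's worth, here is how I would pick a select code if the stock part's table really does run from 55MHz to 130MHz in ascending 5MHz steps. That ordering is pure guesswork on my part; the actual mask-programmed table (like the ICS 2494's) need not be monotonic, so treat the table below as a stand-in.

    /* guessed table: 16 entries, 55MHz..130MHz in 5MHz steps (ordering unverified) */
    static const unsigned stock_clock_khz[16] = {
         55000,  60000,  65000,  70000,  75000,  80000,  85000,  90000,
         95000, 100000, 105000, 110000, 115000, 120000, 125000, 130000
    };

    /* return the select code (presumably four bits for sixteen rates)
     * whose rate is closest to the requested one */
    static int pick_clock(unsigned want_khz)
    {
        int best = 0;
        for (int i = 1; i < 16; i++) {
            unsigned di = want_khz > stock_clock_khz[i]
                        ? want_khz - stock_clock_khz[i] : stock_clock_khz[i] - want_khz;
            unsigned db = want_khz > stock_clock_khz[best]
                        ? want_khz - stock_clock_khz[best] : stock_clock_khz[best] - want_khz;
            if (di < db)
                best = i;
        }
        return best;
    }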

The Sync Generator

Sync is generated by an LM1882 (aka 54ACT715). If you're used to XFree86 Modelines, the parameters for this chip will make sense, except that the display area comes first, rather than last.
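To make the correspondence concrete, here is how the four horizontal and four vertical intervals fall out of a Modeline. The struct and function are just my own bookkeeping, not the LM1882's actual programming format, which I haven't worked out.

    struct sync_params {
        int h_active, h_front, h_sync, h_back;   /* in dot clocks */
        int v_active, v_front, v_sync, v_back;   /* in scan lines */
    };

    /* Modeline "640x480" 25.175  640 664 760 800  480 491 493 525 */
    static struct sync_params from_modeline(int hd, int hss, int hse, int ht,
                                            int vd, int vss, int vse, int vt)
    {
        struct sync_params p;
        p.h_active = hd;            /* visible width            */
        p.h_front  = hss - hd;      /* horizontal front porch   */
        p.h_sync   = hse - hss;     /* horizontal sync width    */
        p.h_back   = ht  - hse;     /* horizontal back porch    */
        p.v_active = vd;
        p.v_front  = vss - vd;
        p.v_sync   = vse - vss;
        p.v_back   = vt  - vse;
        return p;
    }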

The RAMDAC

The RAMDAC is a TLC34075-110AFN palette. Not much to say about that, except that it interfaces to the jellybeans and video RAM.

The Jellybeans

I don't know much about this, except that there appears to be a 9-bit counter for horizontal and a separate 9-bit counter for vertical. This is a bit unusual: most VGAs don't separate it like this, the address simply carrying on from the end of the previous line. Not so here. Every line begins on a 1024-byte boundary, and the vertical counter is incremented by some horizontal phenomenon, such as, oh, say, horizontal sync.
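In other words, the vertical counter supplies the upper address bits and the horizontal counter the lower ones. Assuming 8 bits per pixel on a 1MB board, the byte offset of a pixel works out to something like this:

    #include <stdint.h>

    /* byte offset into video RAM of pixel (x, y), assuming 8bpp on a 1MB board:
     * every line starts on a 1024-byte boundary whether or not all 1024 bytes
     * are displayed */
    static inline uint32_t fb_offset(uint32_t x, uint32_t y)
    {
        return (y << 10) | x;   /* y picks the line, x the byte within it */
    }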

Even weirder: if you try to make a screen wider than 1024 bytes, the addresses for the next 1024 bytes are not simply incremented. Bit 8 (bit 10 if you are thinking in bytes rather than 32-bit words) is complemented. And going further still, you get the same original 1024 bytes again. My guess is that for 2MB systems there must be a jumper or an output bit somewhere that makes the vertical counter increment by 2 instead of 1.

The Standard VGA Problem

I've been wearing out the connectors on my monitor, the VGA card in my development computer, and the ViewStation by constantly swapping the monitor between the two boxes. I do have another VGA monitor, but it only does 640x480, and my ViewStation has the default frequency generator, whose slowest clock is 55MHz, whereas standard VGA needs something closer to 25MHz. I have devised a kluge: I set the resolution to 2560x480 at 2bpp with a 100MHz dot clock. Playing with my colors, I can use the same code for this mode as I do with 8bpp. For instance, color 0x2 becomes 0xaa, which is actually 4 pixels of color 0x2 packed together. Since the monitor sees four identical pixels in a row as one fat pixel, the net effect is 640x480 at an effective 25MHz dot clock, which is just what the old monitor wants. I may be able to do this at 4bpp, but I'd have to play with the modeline to make it work with the resultant virtual dot clock of 27.5MHz (the 55MHz minimum divided by the two pixels per byte). Key to this hack is the reduction in bits per pixel, which keeps me below 1024 bytes per line.
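The color trick, spelled out: at 2bpp one byte holds four pixels, so replicating the 2-bit color across the byte lets the ordinary 8bpp drawing code run unchanged; it just lays down four identical skinny pixels at a time.

    #include <stdint.h>

    /* replicate a 2-bit color into all four pixel positions of a byte:
     * 0x1 -> 0x55, 0x2 -> 0xaa, 0x3 -> 0xff */
    static inline uint8_t pack_2bpp(uint8_t color)
    {
        return (uint8_t)((color & 0x3) * 0x55);
    }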

The boot screen still comes up at 1024x768 because I haven't gotten control of the box at that point. While the result is quite unintelligible, I don't think this monitor is smart enough to be damaged by it.