The grabber in Windows 3.1 was improved to save and restore the index register as well, but it did not attempt to restore the flip-flop state, which turns out to be significant. The problem with the VGA emulation was that it erroneously applied the flip-flop state to reads from port 3C0h, so Windows 3.1 would save the wrong index register value… but only the second time through, because the flip-flop state was different at that point. That is to say, the Windows 3.1 standard mode grabber read from port 3C0h to query the attribute controller index register, but the emulation returned the contents of the currently selected data register instead.
And then, when the attribute controller index register was restored the next time around, it was written back with the wrong value, one without bit 5 set, causing the screen to go blank.
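For concreteness, here is a minimal sketch of how a grabber might correctly save and restore the attribute controller index on real VGA hardware. The `inp()`/`outp()` names are the DOS-era compiler convention (Borland spells them `inportb()`/`outportb()`); the function names themselves are illustrative.

```c
#include <conio.h>  /* inp()/outp(); Borland calls these inportb()/outportb() */

/* Reading Input Status #1 (port 3DAh in color modes) resets the
   3C0h index/data flip-flop to the "index" state. On real VGA a
   subsequent read of 3C0h returns the index register itself,
   including bit 5, the palette address source bit that must be
   set for the display to be enabled. */

unsigned char save_attr_index(void)
{
    (void)inp(0x3DA);                   /* reset flip-flop to index state */
    return (unsigned char)inp(0x3C0);   /* index register, bit 5 included */
}

void restore_attr_index(unsigned char index)
{
    (void)inp(0x3DA);       /* reset flip-flop again before writing */
    outp(0x3C0, index);     /* restore index; bit 5 set keeps screen on */
}
```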
Michal Necasek
It’s not every day that you learn how one aspect of the inner workings of VGA causes a blank screen under very specific circumstances when running Windows 3.1 in Standard mode under emulation, and that this very aspect exists to maintain backwards compatibility with the EGA.
Absolutely bonkers.
Back in the day, DOS programmers rarely needed to know how to do any of this, since the video BIOS routines acted as drivers for everything. And since mode setting only needed to happen at program startup and termination, there wasn’t much benefit in trying to optimize it in your own code. There weren’t many reasons to bypass the BIOS, though you could technically program new modes that weren’t available through it, such as the venerable “Mode X”.
https://gamedev.net/reference/articles/article373.asp
Although the benefits of this were short-lived, as new SVGA and VESA modes offered more resolution and colors than hacked VGA modes.
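As a rough illustration (not the full Mode X setup from the article linked above), the core of the trick is “unchaining” the chain-4 addressing of mode 13h with a handful of register writes. The values below are the commonly published ones; the function name is mine.

```c
#include <conio.h>  /* inp()/outp(); names vary by DOS compiler */

/* After setting mode 13h via the BIOS, rewire it into planar
   ("unchained") addressing, the basis of Mode X. */
void unchain_mode13(void)
{
    /* Sequencer Memory Mode (index 04h): chain-4 off, odd/even off */
    outp(0x3C4, 0x04);
    outp(0x3C5, 0x06);

    /* CRTC Underline Location (index 14h): doubleword addressing off */
    outp(0x3D4, 0x14);
    outp(0x3D5, 0x00);

    /* CRTC Mode Control (index 17h): byte addressing mode on */
    outp(0x3D4, 0x17);
    outp(0x3D5, 0xE3);
}
```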
The main exception, which I suspect nearly all VGA programmers will be familiar with, is using port I/O to program the palette. While there were video BIOS routines for this, palette changes were often part of real-time effects, and calling the BIOS inside the main application/game loop carried overhead, so it was standard practice to bypass the BIOS here. Think of the screen turning red when the DOOM guy is shot.
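For example, a single DAC palette entry can be reprogrammed with four port writes, far cheaper than an INT 10h call in an inner loop. A sketch (the red-flash effect would sweep the whole palette toward red using something like this):

```c
#include <conio.h>  /* inp()/outp(); names vary by DOS compiler */

/* Program one VGA DAC palette entry directly via ports 3C8h/3C9h.
   Components are 6-bit values (0..63) on standard VGA. */
void set_dac_color(unsigned char index,
                   unsigned char r, unsigned char g, unsigned char b)
{
    outp(0x3C8, index);  /* DAC write index */
    outp(0x3C9, r);      /* three data writes: red, green, blue */
    outp(0x3C9, g);
    outp(0x3C9, b);
}
```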
No way man! Nobody used the BIOS. It was slow junk. I worked on data analytics apps back then and we banged the hell out of the EGA/VGA registers and specs to bend them to our will. IIRC our app used the 4 bit planes and a custom palette to get 2 * 2-bit layers (sure, one couldn’t scroll them, but we didn’t need that). Copy operations and the like barely exist in the BIOS. It also has no idea about scrolling, and even the text modes are super slow. *Everyone* just wrote to VRAM, even word processors. I don’t think the BIOS even knew about vertical refresh or page flipping. The biggest problem with VESA is that it took time to be standardized, and even then the BIOS omitted large amounts of functionality possible on the cards.
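To make the “everyone just wrote to VRAM” point concrete, this is roughly what a direct text-mode write looks like in a 16-bit DOS compiler (a sketch; `MK_FP` is the Borland-style far-pointer macro, and the color text buffer sits at segment B800h):

```c
#include <dos.h>  /* MK_FP() far-pointer macro (Borland-style) */

/* Store one character + attribute byte straight into color text
   VRAM at B800:0000, bypassing the BIOS teletype routines. */
void poke_char(int row, int col, char ch, unsigned char attr)
{
    unsigned char far *vram = (unsigned char far *)MK_FP(0xB800, 0);
    unsigned int off = (row * 80 + col) * 2;  /* 80-column text mode */
    vram[off]     = (unsigned char)ch;
    vram[off + 1] = attr;   /* foreground/background colors */
}
```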
ppp,
The “slowness” is virtually irrelevant when it happens only at application startup and shutdown, which is when most applications used the BIOS; the only slowness of real consequence is in the main loop. The main reason to bang VGA registers was to get the hardware into non-standard modes. If you are using standard modes, there’s no real advantage in replacing the BIOS calls that enter them.
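For reference, entering a standard mode through the BIOS is a single INT 10h call, here via the `int86()` helper that DOS-era compilers provide:

```c
#include <dos.h>  /* union REGS, int86() in DOS-era compilers */

/* Set a standard video mode via BIOS INT 10h, AH=00h.
   E.g., set_mode(0x13) for 320x200 with 256 colors. */
void set_mode(unsigned char mode)
{
    union REGS regs;
    regs.h.ah = 0x00;   /* function: set video mode */
    regs.h.al = mode;
    int86(0x10, &regs, &regs);
}
```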
I found VESA very helpful in providing standard programming interfaces, though I agree with you that early hardware didn’t have a VESA BIOS. Also, I’m not suggesting the BIOS replace interrupt handlers and whatnot.
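The VBE interface extends the same INT 10h entry point; setting a VESA mode is function 4F02h. A sketch, assuming a VBE-capable BIOS (mode 101h is 640x480 with 256 colors):

```c
#include <dos.h>  /* union REGS, int86() in DOS-era compilers */

/* Set a VESA VBE mode via INT 10h, AX=4F02h. Returns nonzero on
   success; VBE reports AX=004Fh when the call is supported and
   succeeded. */
int vbe_set_mode(unsigned int mode)
{
    union REGS regs;
    regs.x.ax = 0x4F02;  /* VBE function 02h: set mode */
    regs.x.bx = mode;    /* e.g., 0x101 = 640x480x256 */
    int86(0x10, &regs, &regs);
    return regs.x.ax == 0x004F;
}
```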