For reasons that I will make clear in a future story, I decided to set up a second monitor on my Windows 98 machine at work. Thanks to Dell's engineers, putting the video card in the computer was a relatively painless procedure, almost as good as the G3/4 towers. You remove a side panel, pull a lever, give it a good tug, and out pop the card slots. Stick in the card, close it up, and you're all set.
I dragged an old 15" monitor out of the storage room and plugged it into the new video card. So far, so good. In fact, so far, the process had been relatively Mac-like. Of course, I hadn't turned the computer on yet. So I did that.
And then it began.
Inexplicably, the computer decided that the new video card and monitor were far better than the ones I had been using all this time. Never mind that the new video card was actually three years older and I hadn't had a chance to install the drivers yet: screw the status quo, let's go with the new guy. Everything was showing up on the new monitor, which I had placed on the file cabinet near my desk. Nicely out of the way, but too far away to read comfortably.
No problem, I thought. I'll just wait until Windows boots up and then I'll set the main monitor to be the primary one. Sounds easy, right?
Ha ha! Windows booted up and decided that it needed to install video card drivers. For the video card I had already been using. Huh. So it did that (never asked for drivers for the new card), and I opened the Display control panel so I could switch the primary monitor.
I got the big monitor activated, but I had a hard time finding the command for setting it as primary. On the Mac, you just drag the little menu bar to the monitor you wish to set as primary. Windows doesn't have a menu bar (it's far better to eat up valuable screen space by letting each and every window have its own separate menu bar, after all), so there was nothing to drag. Just the two pictures of monitors with big numbers in them. The big "2" on my main monitor sat there, mocking me.
So I did the unthinkable: I consulted the help files. What I found amazed me. Here's a direct quote:
If your desktop items do not appear on the monitor that you want to use as primary, shut down Windows and turn off your computer and monitors. Plug the monitor you want as primary into the primary video adapter and plug the other monitor into the secondary video adapter.
Apparently there's NO WAY to set the primary monitor via software. You've got to switch the cables. Naturally, I didn't want my main monitor running off the old crappy video card, so I had to re-open the machine and swap the actual video cards between slots.
So now it works. I've got two monitors. But Bill has one more surprise for us. New windows always open on the primary monitor. Always, always, always. It'd be nice to put Netscape on the secondary monitor, hit the New Window command, and keep browsing away. But no, the new window opens on the primary monitor, so you've gotta drag it over to the secondary. Every time you hit a web page with an annoying pop-up (like ign.com), it pops up on the primary monitor. Heck, even some (but not all, lest they be accused of being consistent) dialog boxes pop up over there. Thanks, Bill!
What really gets me is that people accept this, like a lot of other Windows crappiness (hold down Alt and type 0-1-5-1 on the keypad for an em dash), as an okay way of doing things. The Mac had multiple monitors and easy ways to work with them years before Windows managed to hack something together. So it's not like they didn't have a good example to steal, er, innovate from.
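For the curious, that keypad incantation isn't arbitrary: typing Alt plus 0-1-5-1 on Windows enters byte 151 in the system's ANSI code page, which on US-English systems is Windows-1252, where 151 maps to the Unicode em dash (U+2014). A quick sketch in Python confirms the mapping (Python itself is just being used here as a lookup tool; it's not how Windows does it internally):

```python
# Alt+0151 on a US-English Windows box types the byte 151,
# interpreted in the Windows-1252 ("cp1252") code page.
em_dash = bytes([151]).decode("cp1252")

print(repr(em_dash))      # the em dash character
print(hex(ord(em_dash)))  # 0x2014, i.e. U+2014 EM DASH
```

Alt codes without the leading zero, by contrast, use the old DOS OEM code page, which is why Alt+151 and Alt+0151 produce different characters.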