Windows vs. Full-Screen: User Experience Evolving - Finally!

by Mike Levin SEO & Datamaster, 03/21/2012

Windows suck. Screens rock. Windowing operating systems are a throwback to the days when memory was scarce and expensive. Moving and arranging your program windows seemed clever and natural at the time, but it ushered in an era of insanity that lasted until the new realities of mobile showed everyone there's a better way: full-screen apps you can flip between like flashcards.

With windows, the same screen real estate (which is to say, memory) had to be recycled for every program, instead of each program getting its own full-screen "channel". With screens, there's more real estate, in the form of memory. To put rough numbers on it: a 1024 x 768 screen at 32-bit color is only about 3 MB of framebuffer, so giving every app its own screen is trivial for modern hardware, but it was unthinkable when windowing systems were designed. The need to make even entry-level PCs game-capable drove that change, and it's dovetailing with the higher usability expectations set by mobile. People don't generally "get it" yet, but what they're hungering for on the desktop is full-screen apps, and for windows to go away.

There is also a certain fixed-versus-floating argument here. Productivity doesn't necessarily go up with computers, because the muscle-memory mastery that lets you work ever faster and more efficiently (the kind that happens with musical instruments, sports, and weapons) just doesn't happen with a mouse and windows. Users are forever putzing around, figuring things out, figuring out "where am I?". You don't have the fixed-location advantage, and your flow is constantly broken by having to reorient yourself. The fixed location of things in full-screen apps can change that.

I see various platforms undergoing a shift in how super-useful but under-utilized virtual screens are packaged and presented to users. OS X Lion got it soooo right, disguising a program's full-screen mode as an on-the-fly virtual screen. Users barely realize they're using virtual screens, and the Dock is still available on every screen, so navigating between them is easy: switch apps, use Mission Control (the new Exposé), or use the Mac's version of Alt-Tab (Cmd-Tab).

How far back does the virtual screen go? Decades ago on Unix, text-only terminals had virtual consoles, because they didn't have a windowing option. But as far as I can tell, it wasn't until the Amiga in 1985 that the virtual-screen idea was applied to full-screen programs. On the Amiga, you could flip between an arcade-quality full-screen running game and the desktop with Amiga+N. This worked universally on the Amiga, and it was a blissfully quick way of switching between programs. I have sorely missed it on every platform I've worked on since, until recently.

OS X properly connected the dots between "maximize app" and virtual screens. Ubuntu and GNOME 3 are heading in a similar direction with on-the-fly virtual screens, but neither has taken the radical step of tying them to the "maximize window" gadget as Apple has (though, in all fairness, Apple introduced a new button for this, one that developers must specifically support).
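
For the developers in the audience, that button isn't free. Here's a minimal sketch of what opting a window into Lion's full-screen mode looks like (written in Swift for brevity, though in 2012 you'd write the equivalent Objective-C; it assumes a normal AppKit app whose event loop is already running):

    import Cocoa

    // A minimal sketch, not Apple sample code: opting an AppKit window
    // into Lion's full-screen mode.
    let window = NSWindow(
        contentRect: NSRect(x: 0, y: 0, width: 800, height: 600),
        styleMask: [.titled, .closable, .resizable],
        backing: .buffered,
        defer: false
    )

    // The one line that matters: this flag puts the full-screen widget
    // in the title bar and gives the window its own on-the-fly Space
    // (virtual screen) when it enters full screen.
    window.collectionBehavior.insert(.fullScreenPrimary)
    window.makeKeyAndOrderFront(nil)

    // The app can also drive the transition itself:
    // window.toggleFullScreen(nil)

One flag, and the system handles the rest: the button shows up, and entering full screen spins up the new virtual screen on the fly. That's also why apps have to opt in: it's the window, not the system, that declares it knows how to fill a screen.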

Either way, the writing is on the wall. Windows has tried to compensate with features like Aero Snap, which starts taking the insanity out of the messy, unproductive windowing paradigm by snapping windows to logical positions. What's more, Windows 8's Metro ironically takes the old windowing paradigm out of Windows entirely, unless you flip back to the classic desktop to run your old software. Expect it to gradually become just "Microsoft Metro", because keeping the word "Windows" in the name is starting to feel like naming your product after the horse and buggy.