Vista and the Software Gap
by
Ross M. Miller
Miller Risk Advisors
www.millerrisk.com
October 13, 2008
My introduction to Windows Vista was sudden. This past
May, the morning after I had graded my last spring semester exam, I sat
down in front of my Sony desktop computer with my morning coffee. At
first, I thought that I was still dreaming. Just like in the movies,
streams of multi-colored letters and numbers were cascading down my main
monitor. I spent the next day-and-a-half rationally going through all the
possibilities, replacing things one component at a time, until I decided
that it was time to give up and buy a new computer. I had been thinking
that the Sony, a mere three years old, was about due for replacement
anyway because it was just too slow.
After the most minimal deliberation, I settled on a
Gateway desktop from the dreaded Best Buy. For the price of the most
expensive Mac Mini (with neither keyboard nor mouse), I got a machine with
a quad-core Intel processor that could handle two monitors out of the box.
It came with 4GB of memory and 64-bit Vista that could use it all. It also
had 32-bit Vista in the box in case the 64-bit version did not work out
and I could live without the 0.5GB of memory that the 32-bit version cannot access. Being
intrepid, I spent the next three-and-a-half days getting the machine as
close to my old configuration as humanly possible. I had some
compatibility issues: a few minor programs, and a printer that was due to
be trashed anyway, simply would not run under 64-bit Vista without major
effort.
I was initially quite happy with my new machine and
wondered why Vista had gotten so much bad press. Yes, it was not perfectly
backward compatible and the control settings were randomly scattered
about, but it zipped right along on a new, loaded machine. After the
initial setup, I rarely saw the dreaded UAC
screens, and when I did there was usually a good reason. (The machine
did include Service Pack 1, which supposedly reduced the number of UAC
screens.) The system was vastly more secure than XP and many things worked
better. The system still had sleep-related issues: put any Windows
machine to sleep and it is never quite the same when it wakes up.
It took a month before I understood why Vista is so
unloved among the computer cognoscenti. It was then that I first
experienced the dreaded "Display
driver atikmdag stopped responding and has successfully recovered"
message. On good days, this is all that happens and (after saving
everything) I can continue computing as if nothing had happened. On bad
days, the message repeats five times and the machine stops working. As the
message indicates, my machine has an ATI graphics card, but a similar
message appears on machines with Nvidia cards.
There is a vast literature of web postings on this
problem with ATI and Nvidia drivers under Vista. Microsoft, ATI, and
Nvidia are all very much aware of this problem. As I write this, no one
knows definitively what causes it, and while there is a slew of home
remedies that work for some people and not for others, there is as yet
no cure. The common element seems to be that pushing a Vista-running
machine past its comfort zone causes this problem. One common solution is
to disable the Aero interface, a major selling point of Vista. I did not
go that far; I merely disabled the taskbar preview feature, and the
problem went away, only to return recently when I was playing with the
audio-processing features of Winamp in a way that pegged all four cores of
my CPU. (Using a program called SpeedFan,
which monitors the temperature of each core and hard drive separately, I
had been able to eliminate overheating as a cause of the problem.)
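One remedy in that slew, for readers inclined to tinker, is to lengthen the timeout that triggers the message in the first place. The sketch below does this with Python's standard winreg module against Microsoft's documented GraphicsDrivers registry key; treat the eight-second value as an assumption, and, like the other home remedies, it may do nothing at all for your particular machine.

    # Sketch: extend Vista's GPU Timeout Detection and Recovery (TDR) delay,
    # the mechanism behind the "display driver ... has successfully recovered"
    # message. Requires administrator rights; reboot afterward.
    import winreg

    KEY_PATH = r"SYSTEM\CurrentControlSet\Control\GraphicsDrivers"
    NEW_DELAY_SECONDS = 8  # assumed value; the default delay is 2 seconds

    with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                            winreg.KEY_SET_VALUE) as key:
        # TdrDelay: how long the GPU may stall before Windows resets the driver
        winreg.SetValueEx(key, "TdrDelay", 0, winreg.REG_DWORD,
                          NEW_DELAY_SECONDS)

    print("TdrDelay set to", NEW_DELAY_SECONDS, "seconds; reboot to apply.")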
It is clear that some fundamental flaw in
Vista, possibly involving the timing of the communication between the
CPU and the graphics board(s), lurks behind this problem. Vista has been
available for nearly two years and the problem remains. An operating
system that has been out this long should not continue to have such a
major unresolved bug.
Microsoft has a real problem here with Vista, and no
amount of commercials
with Jerry Seinfeld and Bill Gates moving in with normal families is
going to solve it. I would switch to Linux and/or Mac OS whatever in a minute,
but I am to some degree locked into Windows and am not a big fan of either
operating system. Indeed, as I write this, iTunes (an Apple product) is
running some background processes (Apple Mobile Device Process and Apple
Mobile Device Server) on my computer that are undoubtedly part of Apple's
plot for world domination.
The real problem is the "software gap."
Hardware is literally tens of thousands of times more powerful than when I
first started messing around with an early AT&T release of Unix back
in the mid 1970s. Linux and the Mac OS are expanded versions of that early
Unix and Windows is essentially a direct knock-off of Unix. Giving all
these operating systems a very generous benefit of the doubt, they are at
most one hundred times more capable than AT&T Unix. At a purely
conceptual level, there is nothing new at all—the ghost of the original
Unix looms large within all three major operating systems. So hardware is
zooming along, getting better by the day, while the software that sits
directly on top of the hardware is standing relatively still. In turn,
this affects the applications that run on these operating systems.
The problem is that developing software is much more
difficult than developing hardware. At the heart of this difficulty is
that pushing around electrons is a lot easier than pushing around
programmers. Introduce corporate bureaucracy into the process and you are
virtually guaranteed to get software that sucks. Exhibit 1 is Vista
itself.
I might feel more positively about software if I played
computer games. These games do seem to be where the bulk of the creative
software development is going on. From what little I have seen of games on
my rare visits to Best Buy, the typical contemporary video game looks
really hokey. Better than Ms. Pac-Man perhaps, but still hokey.
I do, however, own a piece of software that by the
standards of the 1970s is truly wondrous. That program is Reason
4.0. This program is put out by a Swedish company fancifully named
Propellerhead. Reason is a virtual electronic music studio with an amazing
array of synthesizers and related electronic instrumentation, mixers,
effects, etc. The combination of Reason with a USB controller keyboard, a
good 24-bit/96kHz sound card, and even a modest computer creates a music
development environment that is simply amazing. Doing in specialized
hardware what Reason does in software would cost hundreds of thousands of
dollars and would suck down amp after amp of electricity. A barebones
system (computer included) that runs Reason adequately under Windows can
be had for under $1,000. A deluxe system under either Windows or Mac OS is
available for about twice that.
Reason and other similar music creation programs have
partially avoided the software gap because the underlying technology,
digital signal processing (DSP), is one where hardware and
software have advanced nicely in tandem. Also, DSP can use so much CPU
power (as I demonstrated with my Winamp experiment) that hardware remains
the constraint in a lot of professional work.
Nonetheless, a software gap remains for most music
creation software. Reason's user interface is hamstrung by having to run
on top of either Windows or Mac OS. Also, despite the growing availability
of 5.1 and 7.1 surround sound as a standard feature in PCs and Macs,
Reason and its ilk exclusively generate plain old stereo sound. While
music-generating devices with no analog in the physical world have started
to show up in music creation programs (Reason 4.0 has a super-synth called
Thor), every one that I've seen is merely a direct extension or hybrid of
physical world instruments. One problem is that this market is driven by
professionals and until recently these professionals were working in the
physical world before they moved over to PCs and Macs. Once everyone
forgets why synthesizer configurations are called "patches",
this is likely to change.
The software gap is most noticeable in Microsoft's
Office applications. Other than a string of cosmetic differences, little
has changed in Word, Excel, and PowerPoint from Office 97 through Office
2007. While Microsoft has spent billions of dollars in artificial
intelligence research over those ten years, the grammar checking in Word
is just as dumb as ever. PowerPoint has at least evolved from truly
pitiful to somewhat on a par with the other Office applications. Excel
still refuses to do symbolic algebra, something that Mathematica, MathCad,
and similar programs have been doing for over a decade and even some TI
calculators were offering as early as 1995. OneNote is a nice addition
to Office, but it is still rather pitiful, as is the general support of
tablet computing by Microsoft. (It is, however, much better than what
Apple offers in tablet computing, which is absolutely nothing.)
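To make concrete what Excel is missing, here is a minimal sketch of symbolic algebra using SymPy, a free computer-algebra library for Python (one of many such tools; the programs and calculators named above do the same and much more):

    # Symbolic algebra: manipulate x as a symbol, not as a column of numbers.
    from sympy import symbols, expand, factor, solve, diff

    x = symbols("x")
    print(expand((x + 1)**3))        # x**3 + 3*x**2 + 3*x + 1
    print(factor(x**2 - 5*x + 6))    # (x - 2)*(x - 3)
    print(solve(x**2 - 5*x + 6, x))  # [2, 3]
    print(diff(x**3 + 3*x, x))       # 3*x**2 + 3

Excel, by contrast, will only evaluate such expressions numerically, one cell at a time.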
Next month is my last column of the year before my usual
two-month sabbatical from this commentary. Given the rather unsettled state
of the world at the moment, I will leave the title of that commentary as
"Whatever."
Copyright 2008 by Miller Risk Advisors. Permission granted to
forward by electronic means and to excerpt or broadcast 250 words or less
provided a citation is made to www.millerrisk.com.