Warning: Computers can be hazardous to your health

Rather than simply joining in the cycle of computer upgrades, the machine-vision industry would be better served by true technological innovations.

Put a group of aged computer journalists in a room and the conversation will usually revolve around the good old days of computing. This often starts with a description of programming experiences with machines such as DEC's PDP-8, a machine that, believe it or not, will be 47 years old next month. The discussion will then progress through a series of computers including such classic offerings as DEC's VAX, Apple's Lisa, and, my favorite, the Osborne.

Reminiscing about blazing 4-MHz processors, huge 5-in. monitors, and massive 64-kbyte RAM memories (remember, that's all you'll ever need, folks!), one wonders why these were called the good old days. But were those machines really so bad? Certainly, it took a lot more expertise to code them than it does today. Because of this, such tasks were better left to developers who at least understood the intricacies of assembly-language programming and code optimization. In essence, the limitations of such computers bred a better class of programmers.

Today, things are very different. For machine-vision applications, for example, a number of software packages are now available that allow the developer to build an application in just a few days by linking together easily understood icons or scripts. To help these developers, companies such as Microsoft keep upgrading their operating systems. Because each new operating system requires much more memory than its predecessor, Intel and other companies are only too happy to support them with faster, multicore processors. All of this, of course, costs more money, and every two years or less developers are forced to purchase new equipment.

While some may imagine that a conspiracy is at work, the effect of such upgrades on the application may be minimal. Benchmarks recently published by Intel, for example, show that, even after recoding the JPEG compression algorithm, only an approximate 1.4X gain in speed is achieved on a dual-core processor. And who, one wonders, apart from engineers at Intel, will have time to recode all these algorithms for dual- and multicore processors? In short, if it isn't broken, why try to fix it?
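One way to see why such modest gains should come as no surprise is Amdahl's law, which limits overall speedup to the fraction of a program that actually parallelizes. The sketch below is illustrative only: the 57% parallel fraction is my back-of-the-envelope assumption chosen so that two cores yield roughly the 1.4X figure quoted above, not a number from Intel's benchmark.

```python
# Amdahl's law: overall speedup when only part of a program parallelizes.
# Assumption (not from Intel's published benchmark): a parallel fraction
# of about 0.57 is consistent with a ~1.4X gain on two cores.

def amdahl_speedup(parallel_fraction, cores):
    """Speedup of the whole program, given the fraction that parallelizes."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# With ~57% of the work parallelized, two cores give only about 1.4X:
print(round(amdahl_speedup(0.57, 2), 2))   # ≈ 1.4

# Even a perfectly parallel program tops out at the core count:
print(amdahl_speedup(1.0, 2))              # 2.0
```

The point of the arithmetic is simply that doubling the cores never doubles the speed unless nearly all of the code has been rewritten to run in parallel, which is exactly the recoding effort most developers cannot afford.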

All this sounds rather negative, and, in fact, for this reason I have yet to replace Windows 2000 running on my Compaq nc6000 with Windows XP. Moreover, I dread the thought of Microsoft’s forthcoming Vista OS since the machine is incapable of running it!

Unfortunately, this same dichotomy also faces the machine-vision industry, and it is one that developers must consider very carefully. Why, for example, would a developer want to upgrade a PC/104-based vision system to one based on the newer PC/104-Express standard, choose a Gigabit Ethernet camera over one based on FireWire, or consider a CMOS-based camera in place of one based on the more established CCD technology?

To be sure, there are a number of price/performance trade-offs and technical benefits associated with each of these decisions. But one is left to wonder whether technology for technology's sake is driving the development of such products and whether the system integrator really benefits from what DEC once called "better/cheaper/simpler/faster/digital." Certainly, the more months of code development that have been poured into an existing system, the less inclined its developers will be to migrate to "newer" technologies.

Luckily, in the machine-vision industry, at least at present, there are still a number of applications that have yet to be addressed by off-the-shelf products based on existing technologies. These include multispectral imaging systems, very smart “smart cameras,” image-understanding algorithms, and compact, ultrabright LED lighting. When these products are introduced they will be readily adopted by developers because they will offer novel features not based solely on CPU speed or networking standards. Indeed, perhaps 47 years hence, the computer as we now know it will not exist, leaving another group of aged journalists to wax poetic on genetics, nanotechnology, and robotics.

Andy Wilson
