If you follow today's trends in the home and SOHO PC market, you will notice that number-crunching, word-processing, office-type applications are becoming less and less significant, while graphics-heavy 3D applications are all the rage. The way the performance of a processor is evaluated has changed a great deal as well. For the past several years, system benchmarking focused primarily on business-related applications. Nowadays it seems that people are more interested in how many frames per second their systems can push out of Quake II or Forsaken.
Don't get me wrong, it's not that people don't use business applications anymore, but the hardware sites I've explored lately chart 3D graphics and games performance over something like Business Winstone 98 by about 4 to 1. The reason for this is simple enough. System performance, and processors in particular, have met and even exceeded the challenge these applications used to impose. It would seem that we no longer wait for our computers to perform; they wait for us...
Now our attention is focused on the monitor screen more than ever. Consequently, enhanced graphics capabilities have moved to the forefront when measuring the raw calculating power of the processor. The promised multimedia monster of a few years ago has arrived... Emerging technologies in the fields of 3D graphics, video, and computer gaming have pushed the processors we know to be office-application capable over the edge, creating internal bottlenecks (both real and imagined) between the processor and the monitor.
AMD, aware of this trend a year ago, decided to improve its K6 processor by specifically increasing 3D performance. While Intel's Pentium II processors get their great 3D performance from their ability to handle loads of raw floating-point calculations, AMD decided to use a bit of finesse in approaching improved 3D performance. The floating point unit of a processor is capable of performing volumes of complicated floating-point calculations, but 3D games only need some of these. AMD's approach was to separate out these specific calculations and enable the processor to perform them on multiple numbers at the same time. Grabbing and then processing several data packets at the same time is called SIMD, or single instruction, multiple data processing. Where 3D graphics are concerned, huge amounts of data have to be processed in the same way, usually sequentially, one item after the other. SIMD improves this significantly, because grabbing, for example, four data packets and processing them at the same time is obviously faster than grabbing a single data packet four times.
SIMD made its first appearance as Intel's MMX technology, but the MMX instruction set only provided for multiple integer calculations, which, while great for image processing and 2D graphics, provides less than stellar performance in the 3D arena, where floating-point calculations are gobbled up like krill by the 3D rendering whale.
This is AMD's "3DNow!" strong point!
There are a few potholes, however, in the road that AMD has taken. Pothole number one: like MMX, this new instruction set requires new, specific software to run at its greatest capabilities. If you'll remember, it took quite a while after MMX's advent before MMX-specific software was even available in many of the software products released. But AMD, I think, is betting that the current trend toward high-end, 3D-graphics-intensive gaming and video manipulation is going to continue. And I believe they're probably right. Rather than dig a really deep hole (as Intel did with MMX), AMD was smart enough to include enhancements for taking advantage of Microsoft's DirectX 6 geometry rendering engine (a strong platform in the 3D gaming development industry), so that even games that aren't engineered with 3DNow! code will still run faster on the K6-2 if they run on the DirectX 6 platform (due for release any day now...). AMD has also provided the technology for 3D graphics chip manufacturers to develop 3DNow!-specific drivers for their graphics cards. (Currently only NVIDIA has implemented these specific drivers, but AMD hopes to see other chip manufacturers come on board in the near future...)
Pothole number two is the future of the Socket 7 architecture. If Intel has its way, Socket 7 will soon fall by the wayside. With its rather heavy-handed implementation of the Slot 1 architecture, beginning with the Pentium II and propped up at the lower end by its release of the Celeron processor, Intel seems to be forcing the other chip makers to fall into line behind it. Way behind it... AMD may have a tough row to hoe convincing the buying public to stay with the Socket 7 platform...