“I think 64 bit computing is overkill,” said an unknown man to his friend on a shuttle. What? Are you kidding me? That was my shocked reaction when I heard the statement. Then, after a little more thought, I realized he might be making some sense, because after all, it depends on what you are using a 64 bit system for. So the question is: if 64 bit is overkill, what is the acceptable range? Perhaps 32 bit? That is the question we will deal with today.

WHAT IS 64 BIT?

First of all, 64 bit is a unit of measurement in the same way we say 64 kg, 64 watts, or 64 km. It is the next step in end-user computing, which sat at 32 bit for a long time. Yet 64 bit computing itself has been around for a long time. It is on record as far back as the early 1970s, when development began on a system known as the CRAY-1, which was installed at Los Alamos National Laboratory in 1976. Sixty-four bit computing remained largely the preserve of supercomputers until the 1990s, when that class of machine fell into decline. Though 64 bit did not die off, it was uncommon in personal computers, whereas its 32 bit counterpart was far more common because most software supported 32 bit.

In computing, measurement starts at the bit and grows from there to the byte, kilobyte, megabyte, gigabyte, terabyte, and upward. When processors were built for computers, they needed a certain capacity in order to deliver the results that era's applications demanded. The benchmarks were 8 bit, 16 bit, 32 bit and 64 bit, and these gained more ground in consumer electronics than in the world of personal computing. In that era we saw plenty of processors, circuitry, chipsets and applications that ran at 8 bit and 16 bit, but 32 bit was the most popular of its day, while 64 bit seemed overkill. Then things moved up a notch: 8 bit and 16 bit processing faded away, leaving 32 bit as the accepted standard in both the growing gaming world and the PC world, while 64 bit began gaining acceptance in the Nintendo 64, the PlayStation 2, and the server world (RISC-based servers such as the DEC Alpha, Sun UltraSPARC, Fujitsu SPARC64, and IBM RS64) rather than end-user workstations. By 2003, 64 bit computing had gained prominence on desktop computers.

The term 64 bit applies not just to the processor but to the surrounding circuitry as well. In essence, a 64 bit processor (CPU) has to work with a 64 bit architecture, which comprises the data paths on the motherboard called buses, the memory (onboard, external, cache, video memory), and the software that runs on this architecture, from the operating system down to end-user applications. The capacity of the processor has to match the capacity of the architecture all the way through: a 64 bit processor, 64 bit ALUs (arithmetic logic units), 64 bit registers, 64 bit buses, and 64 bit memory. In the diagram below, I show a typical processor's internal workings.

In the diagram above, each of the components works at a certain capacity called “width,” which is the amount of data it can process in parallel. It therefore holds true that a 32 bit processor will not match the benchmark performance of a 64 bit processor with its greater data width, as shown below:

The widespread cry for this level of computing comes from the limitation of its predecessor: the 32 bit processor with its 32 bit ALUs, 32 bit registers, 32 bit buses, 32 bit memory, 32 bit operating system and 32 bit application software. What was this limitation that came with 32 bit systems? It is known as total addressable memory.
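
Incidentally, the width a program is built for is easy to check for yourself. The following is a minimal C sketch (assuming any hosted C99 compiler): the size of a pointer reflects the address width of the build, which is exactly what sets the limit we discuss next.

    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        /* sizeof(void *) reflects the width of a memory address,
           i.e. the architecture this binary was compiled for. */
        printf("Pointer width: %zu bits\n", sizeof(void *) * 8);

        /* uintptr_t is an unsigned integer wide enough to hold a
           pointer; its maximum hints at the addressable range. */
        printf("Max address:   %ju\n", (uintmax_t)UINTPTR_MAX);
        return 0;
    }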

TOTAL ADDRESSABLE MEMORY

To understand this, let's define what memory is. System memory is, first of all, finite and fixed, depending on the installed memory chips. The result of every process is stored in memory, and memory does not just mean random access memory (RAM); it also includes video memory.

Somewhere in the world of data and bits, certain calculations come into play, much like the limits found in other fields of physics. A 32 bit system can only accommodate so much memory, in the same way every person has the capacity to eat only so much. For a 32 bit system, that limit was 4GB. Back in the early age of system and application evolution, 4GB was a big deal. And on the few occasions that limit was hit, or more performance was required even from startup, the solution was to deploy another system and load balance; that is where the whole idea of scalability came from. Here is how the 4GB limit of a 32 bit system is calculated:

2^32 = 4,294,967,296 bytes
4,294,967,296 / (1,024 x 1,024) = 4,096 MB = 4GB
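
The same arithmetic can be run as a quick sanity check; a minimal C sketch:

    #include <stdio.h>

    int main(void)
    {
        /* A 32-bit address reaches 2^32 distinct byte locations. */
        unsigned long long limit = 1ULL << 32;

        printf("2^32 = %llu bytes\n", limit);
        printf("     = %llu MB\n", limit / (1024 * 1024));
        printf("     = %llu GB\n", limit / (1024ULL * 1024 * 1024));
        return 0;
    }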

This would not have been a problem if progress had stopped in the world of information systems and technological advancement. But it did not. Technology advanced, applications became more complex, the user base hit new highs, adoption skyrocketed, and the law of economics came in. Man is insatiable – we only know how to progress, not regress – and 4GB became a problem. The solution was to raise that limit in a 64 bit system, and this is what became obtainable:

2^64 = 18,446,744,073,709,551,616 bytes
18,446,744,073,709,551,616 / 1,024^6 = 16 EB (exabytes)
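
Note that the divisor here is 1,024 applied six times (bytes to KB, MB, GB, TB, PB, then EB). A C sketch confirming the figure, with the caveat that 2^64 itself overflows a 64 bit integer, so we work with the exponents instead:

    #include <stdio.h>

    int main(void)
    {
        /* 2^64 bytes / 2^60 bytes-per-EB = 2^4 = 16 EB.
           (2^64 itself does not fit in a 64-bit integer.) */
        printf("2^64 bytes = %d EB\n", 1 << (64 - 60));
        return 0;
    }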

What does this mean? It means the problem is solved. The problem could not be solved by increasing the memory of a 32 bit system: no matter how much anyone added, the law of diminishing returns would set in. In fact, it was observed on many systems fitted with 4GB of memory in the 32 bit architecture that they could not use the entire 4GB. Many ended up using only 3GB of the 4GB. This was caused by another kind of memory.

Generally, when people hear the word memory, they think of random access memory (RAM). However, memory is more than that: it means the total addressable memory. The word addressable denotes the upper limit, the point beyond which no new memory location can be given attention, regardless of whether that location is in slower memory such as RAM or faster memory such as cache. It also does not matter where that cache is located, be it in the CPU or on the video graphics card. We know RAM is the general system memory that the system, applications and operating system draw from to execute their tasks. Video memory, on the other hand, is soldered directly onto the video graphics card, whether onboard or external, and the graphics card's own processor uses it to perform graphics tasks. In some systems this video memory is shared straight out of the RAM, and in others it is dedicated. Addressable then means that, whether shared or dedicated, once everything is added up, anything exceeding 4GB simply cannot be used – which is why a machine whose devices claim a chunk of the 4GB address range is left with only about 3GB of its RAM.
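
To make that concrete, here is a hedged sketch of how a 32 bit physical address map might be carved up. The regions and sizes below are purely illustrative – real layouts vary by chipset and firmware – but they show how device apertures squeeze usable RAM under the 4GB ceiling:

    #include <stdio.h>

    /* A hypothetical 32-bit physical address map;
       real layouts vary by chipset and firmware. */
    struct region {
        const char *use;
        unsigned long long start, end;
    };

    int main(void)
    {
        struct region map[] = {
            { "usable RAM",          0x00000000ULL, 0xBFFFFFFFULL },
            { "PCI/video apertures", 0xC0000000ULL, 0xFFBFFFFFULL },
            { "firmware/APIC",       0xFFC00000ULL, 0xFFFFFFFFULL },
        };

        for (int i = 0; i < 3; i++)
            printf("%-20s 0x%08llx-0x%08llx\n",
                   map[i].use, map[i].start, map[i].end);

        /* RAM reachable below the 4GB ceiling: only ~3GB here. */
        unsigned long long ram = map[0].end - map[0].start + 1;
        printf("Usable RAM: %llu MB of 4096 MB\n", ram / (1024 * 1024));
        return 0;
    }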

THE HAPPY ENDING?

There are many benefits to the 64 bit architecture and its raised memory ceiling. The first and most easily apparent is that 64 bit can address far more than 4GB: it can address 16EB (exabytes), and there is no known application in the world at this time that can consume that much – not NASA, the U.S. military, the NSA, the biggest social network, the largest database in the world, or 5D video editing or animation. The process address space is also far more generous: the portion the operating system reserves for itself eats into what is left for user programs, and typical 32 bit systems reserve around 1GB to 2GB, leaving only about 2GB to 3GB for user mode programs, whereas 64 bit systems leave vastly more. As for memory mapping, what used to be a constant dance of swapping between the memory chips and the hard disk's swap file becomes far less critical on a 64 bit system. Other 64 bit programs, such as encryption software, 3D programs, encoders and decoders, enjoy the luxury of CPU registers that can hold their 64 bit operands and complete the work in a single operation, without repeatedly fetching data from cache or main memory. This leads to very noticeable performance increases.
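
That last point is easy to picture in code. In the C sketch below (any conforming compiler), a 32 bit target typically has to split the 64 bit addition into an add plus an add-with-carry instruction pair, while a 64 bit target retires it as one register-width add:

    #include <stdint.h>

    /* On a 32-bit CPU this typically compiles to two instructions
       (add + add-with-carry); on a 64-bit CPU it is a single
       register-width add. */
    uint64_t add64(uint64_t a, uint64_t b)
    {
        return a + b;
    }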

It is true that the same data occupies more memory in a 64 bit architecture than in a 32 bit one, due primarily to the fact that pointers and alignment padding are longer. By implication, the same application that used less memory in a 32 bit environment will use more. This holds across the board for many application vendors: the programs simply use more memory. This raises concern in some quarters, such as the rising cost of virtual labs for professionals who have to learn, practice or demo a product on their own budget. In many instances this cost increase is very much frowned upon. Nevertheless, we find that progress is being made in adjusting to this welcome malady. Another problem area in 64 bit computing was software availability. Among the most severe cases were system device drivers: while 32 bit application software could run in compatibility mode on a 64 bit architecture, device drivers had to be native 64 bit, and their scarcity was a major challenge before and during early 2007 for Windows-based users.
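
The pointer-and-padding overhead is easy to observe for yourself. A minimal C sketch (the sizes in the comments are typical for common ABIs and may differ on yours):

    #include <stdio.h>

    /* A linked-list node: the pointer doubles from 4 to 8 bytes
       on a 64-bit build, and alignment pads the struct further. */
    struct node {
        int value;          /* 4 bytes on both builds       */
        struct node *next;  /* 4 bytes (32-bit), 8 (64-bit) */
    };

    int main(void)
    {
        /* Typically 8 bytes on a 32-bit build, 16 on 64-bit. */
        printf("sizeof(struct node) = %zu bytes\n", sizeof(struct node));
        return 0;
    }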

THE REALITY

The reality remains that 64 bit computing, and the memory consumption that attends it, is vital to the survival of this age of information and devices. Though 64 bit systems are expensive, they are powerful enough to meet the emerging needs of the time, and we need to remember that this information age depends largely on servers dishing out content – content accessed by trigger-happy fingers. Access to the content on these servers is done via client machines, which do not necessarily need to be on par with the server architectures. Sixty-four bit does not necessarily benefit the average user whose system usage does not exceed email, browsing the web, movies, editing documents and a few minor games. But it does benefit users who contend with the complex features of applications used in industry, telecommunications, health, entertainment, scientific research and critical services, where databases have become massive and response time, quite apart from bandwidth limitations, is critical to overall productivity. And whereas medium to large organizations may face no cost barrier in procuring the amount of memory required to support 64 bit computing power, the vanity users of 64 bit need not tremble either, as the cost of memory has dropped continually over the years while 64 bit gains mainstream acceptance.