Computer storage: why is 1 GB = 1024 MB and not 1000 MB?

Computer storage and memory are often measured in megabytes (MB) and gigabytes (GB). A medium-sized novel contains about 1 MB of information. 1 MB is 1,024 kilobytes, or 1,048,576 (1024 × 1024) bytes, not one million bytes.

Computers understand only one language: binary (0 and 1), so they store information in that form as well. Each 0 or 1 is a bit of information.

Let us suppose that a computer has a container with a capacity of 8 bits to store one character. Let that character be ‘A’.

According to the ASCII table, the character A is represented by the decimal value 65. This mapping exists so that the information can be converted into the binary form the computer understands. Converted to binary, 65 is 01000001 in 8 bits, and that is what gets stored in the container in the computer's memory.
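Here is a minimal Python sketch of that mapping (the variable names are just illustrative):

```python
ch = "A"
code = ord(ch)              # ASCII value of 'A': 65
bits = format(code, "08b")  # 65 in binary, padded to 8 bits
print(ch, code, bits)       # A 65 01000001
```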

So, now we have seen how a single character is stored in the computer memory.

But over time we needed to store far more than a single character, so a system of named milestones grew up to mark the enormous increase in storage space.

Thus, the container came to be known as a byte (2³ bits). Hence 1 byte (B) = 8 bits (b).

To aid hexadecimal representation, half a byte (2² bits, i.e. 4 bits) came to be known as a nibble.
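You can see that pairing directly in code: each hexadecimal digit covers exactly one nibble. A small Python sketch, reusing the byte for 'A' from above:

```python
byte = 0b01000001          # the 8-bit container holding 'A'
high = (byte >> 4) & 0xF   # upper nibble: 0b0100 = 4
low  = byte & 0xF          # lower nibble: 0b0001 = 1
print(format(byte, "02X")) # 41 -- one hex digit per nibble
print(high, low)           # 4 1
```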

But the computer scientists made a simple oversight (one that causes confusion everywhere, and is why the OP asked this question): they borrowed naming conventions from the metric system. A kilobyte should actually be 1000 bytes, since the metric prefix kilo means 10³. However, 1000 in binary is 1111101000, which is not very convenient, so they settled on the power of 2 closest to 1000, namely 1024 (2¹⁰), as the error was small. Thus 1 kilobyte (1 KB) meant 1024 bytes (1024 B). By extension:

1 megabyte (MB) = 1024 kilobytes (KB), instead of 1000 kilobytes.

1 gigabyte (GB) = 1024 megabytes (MB), instead of 1000 megabytes.

and so on.
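The pattern is just successive powers of 1024 instead of 1000. A quick Python sketch to print both ladders side by side:

```python
for n, prefix in enumerate(["KB", "MB", "GB", "TB"], start=1):
    binary  = 1024 ** n   # the computing convention
    decimal = 1000 ** n   # the metric convention
    print(f"1 {prefix}: {binary:>16,} vs {decimal:>16,} bytes")
```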

Hardware manufacturers, however, stuck to the standard decimal meanings of the prefixes, so the discrepancy between the two conventions grew with every new prefix.

For instance, according to hardware manufacturers:

1 GB = 1000 MB = (1000 × 1000) KB = (1000 × 1000 × 1000) B = 1,000,000,000 bytes

But in reality:

1 GB = 1024 MB = (1024 × 1024) KB = (1024 × 1024 × 1024) B = 1,073,741,824 bytes

See the difference?
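You can verify the gap, and see how large it has become at the gigabyte level, in a couple of lines of Python:

```python
decimal_gb = 1000 ** 3          # 1,000,000,000 bytes
binary_gb  = 1024 ** 3          # 1,073,741,824 bytes
diff = binary_gb - decimal_gb
print(f"{diff:,} bytes, i.e. {100 * diff / decimal_gb:.2f}% more")
# 73,741,824 bytes, i.e. 7.37% more
```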

This even resulted in litigation against hard drive manufacturers, who reported drive capacities in standard decimal multiples of bytes, while some operating systems reported sizes using the larger binary interpretation of the traditional prefixes.
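That mismatch is exactly why a drive sold as, say, "500 GB" (a hypothetical but typical figure) shows up as roughly 465 GB in such an operating system:

```python
advertised = 500 * 1000 ** 3         # "500 GB" in decimal bytes
reported   = advertised / 1024 ** 3  # the same bytes in binary "GB" (GiB)
print(f"{reported:.2f}")             # 465.66
```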

Moreover, there was confusion over whether the 'b' in Mb or Kb stood for bits or bytes.

Hence, in 2002 the IEEE came up with IEEE 1541-2002, a standard concerning the use of prefixes for binary multiples of units of measurement in digital electronics and computing. The standard has not fully caught on yet, even though it was reaffirmed in 2008.
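Under that standard, the binary multiples get their own prefixes: KiB (2¹⁰ bytes), MiB (2²⁰), GiB (2³⁰), and so on, while KB/MB/GB keep their decimal meanings. A hypothetical helper sketching the idea:

```python
def human_binary(n):
    """Format a byte count with the IEEE 1541 binary prefixes."""
    for prefix in ("B", "KiB", "MiB", "GiB", "TiB"):
        if n < 1024 or prefix == "TiB":
            return f"{n:.2f} {prefix}"
        n /= 1024

print(human_binary(1_073_741_824))  # 1.00 GiB
print(human_binary(1_000_000_000))  # 953.67 MiB -- a decimal "gigabyte"
```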

To answer your question,

1 GB = 1000 MB (as still used by hardware manufacturers).

However, as per the (relatively :D) new standard,

1 GiB = 1024 MiB

Hope this helps 🙂

P.S.: To my dear fellow Indians looking for a high-speed internet connection, in case you don't know already: 10 Mbps is 10 megabits per second, not 10 megabytes per second (MBps). Thus 10 Mbps = 1.25 MBps.
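The conversion is just a division by 8, since 1 byte = 8 bits:

```python
mbps = 10        # megabits per second, as ISPs advertise
MBps = mbps / 8  # megabytes per second, as download meters show
print(MBps)      # 1.25
```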