# What are gigabytes?

Wikipedia gives multiple definitions that are not entirely related to each other. Gigabyte is a company in Taiwan and the nickname of a virus writer. And, in addition, it is a unit of measure for an amount of bytes of memory. Jaap Koole gives a good answer, but it is not entirely complete.

It starts with a bit, and a bit is a 0 or a 1. Or, to be more precise, an electric signal of 0 volts or 3 volts (or 5 volts, or whatever voltage a processor happens to use). This signal travels through pins and wiring from A to B, the electronics react, and therefore “something” happens. And with enough bits you can do a lot.

Later the bits were grouped into bytes, but in the 1960s there was no fixed definition of the number of bits in a byte. That could be 6, 12, 24, 36, 48 or even 60 bits. Mostly multiples of 6 bits, actually, because 6 bits are enough to hold the entire alphabet, all the digits and the space (64 values). In the end the 8-bit byte became the default, and thus the standard.
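To see why 6 bits were once enough, a quick Python check of how many distinct values fit in a given number of bits (with n bits you can represent 2^n values):

```python
# With n bits you can represent 2**n distinct values.
for bits in (6, 8):
    print(f"{bits} bits -> {2 ** bits} values")
# 6 bits -> 64 values (letters, digits and space fit)
# 8 bits -> 256 values (the modern byte)
```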

And the giga part is a standard prefix in the SI system. This system defines the names of various units of measure, such as the meter for distance, the gram for mass, the ampere for current, and so on. There are also prefixes that are used as multiplication factors: kilo means multiplying by a thousand, mega by a million and giga by a billion. They also work in the other direction: milli divides by a thousand and micro by a million. But for bytes this shrinking direction is useless, so only the magnifying prefixes are used.
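The prefixes really are nothing more than multiplication factors, which a few lines of Python make concrete:

```python
# SI prefixes as plain multiplication factors (decimal).
kilo = 10 ** 3    # thousand
mega = 10 ** 6    # million
giga = 10 ** 9    # billion
# The shrinking prefixes exist too, but are not used with bytes:
milli = 10 ** -3
micro = 10 ** -6

print(5 * giga)   # 5 gigabytes, taken decimally: 5000000000 bytes
```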

This means that a gigabyte actually means 1 billion bytes and not 1,073,741,824 bytes. That’s quite a big difference. Originally, however, ICT did not use powers of 1000 but powers of 1024, because multiples of 8 bits fit nicely into those. This has to do with the fact that 2^10 is exactly 1024^1, just as 2^20 is 1024^2 and 2^30 is 1024^3. So on computers, people quickly started working with powers of 1024.
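You can verify that the powers of two line up exactly with powers of 1024, and see how big the resulting gap is, with a short Python sketch:

```python
# Powers of two line up exactly with powers of 1024:
assert 2 ** 10 == 1024 ** 1
assert 2 ** 20 == 1024 ** 2
assert 2 ** 30 == 1024 ** 3

# Which is why "gigabyte" drifted from 10**9 to 2**30 bytes:
print(10 ** 9)            # 1000000000
print(2 ** 30)            # 1073741824
print(2 ** 30 - 10 ** 9)  # the gap: 73741824 bytes
```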

And so two standards were effectively created, where the gigabyte is 1000^3 bytes but the gibibyte is 1024^3 bytes. The latter is still often called a gigabyte when it comes to RAM. But powers of a thousand are used to indicate the sizes of hard drives, which means that on a hard drive of 100 gigabytes you actually have almost 7 gibibytes less disk space than you might think. Simply because the gibibyte is not a round number in the decimal system, while the gigabyte is not a round number in the binary system.
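A quick calculation shows what a “100 GB” drive looks like to an operating system that counts in binary units:

```python
# A "100 GB" drive, as marketed, holds 100 * 1000**3 bytes.
marketed_bytes = 100 * 1000 ** 3

# An OS that counts in 1024**3-byte units reports fewer "GB":
reported_gib = marketed_bytes / 1024 ** 3
print(round(reported_gib, 2))  # about 93.13, so almost 7 "GB" missing
```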

But for memory modules there is the JEDEC standard, which uses the binary denomination and not the decimal one. For all other applications you should assume the decimal meaning, or else use kibi, mebi or gibi. That is the ISO/IEC standard.

Well, this has been causing confusion for several decades. But the rule is simple:

If the memory is RAM, then the gigabyte or GB is 1024^3 bytes. In other situations the gigabyte or GB is 1000^3 bytes, while the gibibyte or GiB is again 1024^3 bytes.
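That rule can be captured in a small helper function; a minimal sketch, with the function name `gigabytes_to_bytes` being my own invention for illustration:

```python
def gigabytes_to_bytes(n, ram=False):
    """Convert n 'gigabytes' to bytes following the rule above:
    RAM uses the binary factor (1024**3), everything else the
    decimal factor (1000**3)."""
    factor = 1024 ** 3 if ram else 1000 ** 3
    return n * factor

print(gigabytes_to_bytes(1, ram=True))  # 1073741824
print(gigabytes_to_bytes(1))            # 1000000000
```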

A **bit** is a 1 or 0. The basic unit of information.

A **byte** is a series of 8 bits.

A KB is a **kilobyte**, which is 1,024 bytes.

A MB is a **megabyte**, which is 1,024 KB.

A GB is a **gigabyte**, which is 1,024 MB.

These are all *sizes* for the amount of information space.