
What is the difference between bits, bytes and nibbles?

Here, we are going to have a quick look at the naming system that is commonly used when talking about binary numbers and data.

Binary numbers and data

When counting in binary, you use only the digits 1 and 0. Each of these is called a binary digit, which is normally abbreviated to bit.

Often, I talk about bits of data, flipping bits, shifting bits, etc. In each case, I am talking about manipulating a single binary digit.
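These manipulations are easy to try out. The sketch below (my own illustration, not from the article) shows flipping a single bit with XOR and shifting a bit pattern left and right in Python:

```python
value = 0b01100001  # the eight-bit pattern for the letter 'a'

# Flipping a bit: XOR with a mask toggles just that bit.
flipped = value ^ 0b00000001  # flip the lowest bit

# Shifting bits: << and >> slide every bit left or right.
shifted_left = value << 1
shifted_right = value >> 1

print(format(flipped, "08b"))        # 01100000
print(format(shifted_left, "08b"))   # 11000010
print(format(shifted_right, "08b"))  # 00110000
```

Each operation here touches individual binary digits, which is exactly what "manipulating a bit" means.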

Historically, when computers were first being designed, single characters of text were represented using eight bits.

For instance, the letter a was represented with the binary number 01100001, whereas the letter Q was represented by the number 01010001.
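You can confirm these two patterns yourself, since Python's built-in `ord()` returns a character's code:

```python
# The binary patterns from the article match the ASCII codes
# for these characters.
assert ord("a") == 0b01100001  # decimal 97
assert ord("Q") == 0b01010001  # decimal 81

print(format(ord("a"), "08b"))  # 01100001
print(format(ord("Q"), "08b"))  # 01010001
```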

A collection of eight bits therefore came to be the smallest unit for storing data, and so deserved a name of its own. A deliberate misspelling of bite was chosen, giving the name byte, so that it would not be confused with bit.

Keeping with the theme, a half byte (4 bits) was given the name nibble. This number of bits was fairly important in early microprocessors, the single-chip processors at the heart of small computers.
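A byte splits neatly into its two nibbles with one shift and one mask. This is a minimal sketch of my own, not code from the article:

```python
byte = 0b10110110  # an arbitrary example byte

high_nibble = byte >> 4      # keep the top four bits:    0b1011
low_nibble = byte & 0b1111   # keep the bottom four bits: 0b0110

print(format(high_nibble, "04b"), format(low_nibble, "04b"))  # 1011 0110
```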

For larger numbers of bits, you use the standard SI prefixes for large multiples: kilo-, mega-, giga-, etc.

name      symbol   number of bits
bit       b        1
nibble    -        4
byte      B        8
kilobit   kb       1,000
kilobyte  kB       8,000
megabit   Mb       1,000,000
megabyte  MB       8,000,000


As a quick note, it is often useful to talk about a number of bits or bytes as being equal to a power of 2. In such cases, the kibibit (Kibit) and kibibyte (KiB) can be used to denote multiples of 1024 (a power of 2) instead of 1000. So a kibibyte (KiB) is 1024 bytes, a mebibyte (MiB) is 1024 kibibytes, a gibibyte (GiB) is 1024 mebibytes, and a tebibyte (TiB) is 1024 gibibytes.
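The gap between the decimal and binary multiples grows at each step, which is worth seeing with a quick calculation (my own illustration):

```python
kibibyte = 1024            # KiB, versus 1000 bytes for a kilobyte (kB)
mebibyte = 1024 * kibibyte
gibibyte = 1024 * mebibyte
tebibyte = 1024 * gibibyte

# A drive advertised as "1 TB" holds 10**12 bytes, which is
# noticeably less than one tebibyte.
print(10**12 / tebibyte)  # about 0.909 TiB
```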

The International Electrotechnical Commission has made these terms an international standard for the computer industry to use.


This article is from the free online

How Computers Work: Demystifying Computation

Created by
FutureLearn - Learning For Life
