GCSE Computing A451 – www.mrfraser.org

Unit 4.1 – Units of Data

Candidates should be able to:
a) Define the terms bit, nibble, byte, kilobyte, megabyte, gigabyte, terabyte.
b) Understand that data needs to be converted into a binary format to be processed by a computer.

Units of Data in Computer Systems

Unit            Size
Bit             1 bit
Nibble          4 bits
Byte            8 bits
Kilobyte (KB)   1024 bytes
Megabyte (MB)   1024 KB
Gigabyte (GB)   1024 MB
Terabyte (TB)   1024 GB
Petabyte (PB)   1024 TB

• A bit is the smallest unit of data that a computer uses. It holds a binary value of either 0 or 1.

• A nibble consists of 4 bits. This means it can store 16 possible binary values, 0000 to 1111.
  Example of use: the Binary Coded Decimal (BCD) system, in which one nibble is used to encode each decimal digit, e.g.
  78  -> 0111 1000
  251 -> 0010 0101 0001

• A byte consists of 8 bits. It can store 256 possible binary values, 00000000 (0) to 11111111 (255), e.g.
  29  -> 0001 1101
  78  -> 0100 1110
  251 -> 1111 1011

• A kilobyte (KB) consists of 1024 bytes (sometimes rounded to 1000 bytes, or 10³ bytes). 1 KB of memory could store roughly one full A4 page of text.

• A megabyte (MB) consists of 1024 kilobytes (sometimes rounded to 1000 KB, or 10⁶ bytes).

• A gigabyte (GB) consists of 1024 megabytes (sometimes rounded to 1000 MB, or 10⁹ bytes).

• A terabyte (TB) consists of 1024 gigabytes (sometimes rounded to 1000 GB, or 10¹² bytes).

Why does data need to be converted into a binary format to be processed by a computer?

Computers can only understand 1s and 0s: they process and store binary numbers. Any other type of data is useless to a computer unless it is first translated into binary form (digitised). This is true for all text, images, music and other data that originate from an analogue (non-digital) source.
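The conversions above can be reproduced with a short Python sketch. This is illustrative only: Python is not part of the original notes, and the helper names (to_byte, to_bcd) are assumed purely for demonstration.

def to_byte(n):
    # Return the 8-bit binary string for a value in the range 0-255
    return format(n, '08b')

def to_bcd(n):
    # Encode each decimal digit of n as a 4-bit nibble (Binary Coded Decimal)
    return ' '.join(format(int(digit), '04b') for digit in str(n))

# Byte examples from the notes
for value in (29, 78, 251):
    print(value, '->', to_byte(value))   # 29 -> 00011101, 78 -> 01001110, 251 -> 11111011

# BCD examples from the notes (one nibble per decimal digit)
print(78, '->', to_bcd(78))              # 0111 1000
print(251, '->', to_bcd(251))            # 0010 0101 0001

# Units of data: each unit is 1024 times the previous one
for power, unit in enumerate(['KB', 'MB', 'GB', 'TB', 'PB'], start=1):
    print('1', unit, '=', 1024 ** power, 'bytes')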