Trying not to get too technical, a simple way to think of what a byte is:
"Bytes" is basically a measurement term used to describe the size of a file. A file's size can be referred to in terms of "bytes". For files larger than 1024 bytes you will sometimes hear a file's size noted in these additional measurements: "k" (kilobytes), "megs" (megabytes), "gigs" (gigabytes), etc.
Usually 8 binary bits = 1 byte
Usually 1 byte = 1 character of data
1024 bytes = 1k
and so on...
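To see how those conversions work in practice, here is a minimal sketch in Python (the constant names are made up for illustration; it uses the traditional binary convention where 1 kilobyte = 1024 bytes):

```python
# Quick sanity check of the conversions above.
BITS_PER_BYTE = 8
BYTES_PER_KILOBYTE = 1024

def bytes_to_kilobytes(size_in_bytes):
    """Convert a file size in bytes to kilobytes (binary convention)."""
    return size_in_bytes / BYTES_PER_KILOBYTE

# One character of text is usually one byte, so a plain-text file
# holding 2048 characters is about 2k:
print(bytes_to_kilobytes(2048))  # 2.0
```

(Note that many operating systems and drive makers now use 1000 bytes = 1 kilobyte instead, which is why reported sizes sometimes disagree slightly.)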
When you scan different objects, the binary information is saved into a file on your computer. The resulting file size (or number of bytes) may be different each time, depending on various factors: the size of the object being scanned, the number of colors, the dpi of the scan, etc.
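Those factors can be turned into a rough, back-of-the-envelope estimate. This is only a sketch (the function and its parameters are hypothetical, and it ignores compression, which real scanner software usually applies):

```python
# Rough estimate of an *uncompressed* scan's size from the factors above:
# object dimensions, scan resolution (dpi), and color depth.

def estimated_scan_bytes(width_in, height_in, dpi, bytes_per_pixel):
    """Pixels per side = inches * dpi; each pixel costs bytes_per_pixel
    (e.g. 3 bytes for 24-bit color, 1 byte for 256-level grayscale)."""
    pixels = (width_in * dpi) * (height_in * dpi)
    return pixels * bytes_per_pixel

# A 4x6 inch photo scanned at 300 dpi in 24-bit color:
size = estimated_scan_bytes(4, 6, 300, 3)
print(size)                # 6480000 bytes, i.e. roughly 6 megs
```

Raising the dpi or the color depth grows the file quickly, which is why the same object can produce very different file sizes from one scan to the next.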
I don't know if this helps you understand what a byte is, but hopefully it does.
The following is the definition of the word "byte" (taken from the www.askjeeves.com website):
Byte
/bi:t/ An amount of memory or data smaller than a word; usually eight bits; enough to represent one character; the smallest addressable unit of storage.
On modern architectures a byte is nearly always 8 bits and characters are usually represented in ASCII in the least significant seven bits.
Historical note: The term was coined by Werner Buchholz in 1956 during the early design phase for the IBM Stretch computer. Originally it was described as 1 to 6 bits (typical I/O equipment of the period used 6-bit chunks of information). The move to an 8-bit byte happened in late 1956, and this size was later adopted and promulgated as a standard by the System/360 computer. The word was coined by mutating the word "bite" so it would not be accidentally misspelled as bit.
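The definition's point that one byte holds one character, with ASCII using only the least significant seven bits, can be checked directly. A small sketch in Python:

```python
# One character = one byte, and ASCII fits in the low 7 bits (values 0-127).

code = ord('A')                    # numeric ASCII value of the character 'A'
print(code)                        # 65
print(code < 2**7)                 # True: 65 fits in 7 bits
print(len('A'.encode('ascii')))    # 1: the character occupies one byte
print(format(code, '08b'))         # 01000001: the top (8th) bit is 0
```

Characters outside ASCII (accented letters, non-Latin scripts) need encodings like UTF-8 and may take more than one byte per character, which is why "1 byte = 1 character" holds only *usually*.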