Digital information is transferred as electric currents through wires, which may contain impurities that alter those currents, or as electromagnetic waves through space, where ambient radiation can alter the waves. Yet digital information depends entirely on precision. Unlike analog information, digital information is ruined by even a little noise. For example, say I shouted “Hello!” to you from far away on a windy day. The sound waves in the air constitute analog information, and the wind along the way can alter the waves and thereby perturb the information. You might hear something distorted, but it would probably still resemble the original “Hello!” message.

Now suppose I sent the text message “Hello!” to you via instant messaging. In decimal, the ASCII string “Hello!” (excluding the quotation marks) is 072 101 108 108 111 033, which is actually sent as bytes (blocks of 8 bits) in binary: 01001000 01100101 01101100 01101100 01101111 00100001. Now imagine some noise (perhaps radiation) altered even one of those bits. Suppose in the byte encoding the letter “o”, the bits changed from 01101111 to 00101111 (the first 1 flipped to a 0). The message received would be “Hell/!”, a completely different message, and that’s from a single altered bit! If billions of messages are transferred every day, shouldn’t there be an overwhelming number of garbled messages all the time? And that’s just the tip of the iceberg: the transmitted messages could be machine instructions telling some server to do a specific thing, and an altered bit could change its actions entirely.

So how do our computer-controlled devices work properly most of the time? Maybe the radiation in the air and the imperfections in the wires we use to transmit digital information simply don’t corrupt the messages we send? That is not at all true. Instead, we use error correcting codes to ensure that our messages are received as intended.
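Here is a minimal sketch of that bit-flip example in Python, followed by the simplest possible error-detecting scheme, a single parity bit per byte. (A parity bit only detects a one-bit error; real error *correcting* codes, like Hamming codes, add enough redundancy to also repair it. The function names here are my own, just for illustration.)

```python
def flip_bit(byte, position):
    """Flip the bit at the given position (0 = most significant) of an 8-bit byte."""
    return byte ^ (1 << (7 - position))

message = "Hello!"
data = [ord(c) for c in message]        # [72, 101, 108, 108, 111, 33]

# Corrupt the byte for "o" (index 4): flip its first 1, so 01101111 -> 00101111.
corrupted = data.copy()
corrupted[4] = flip_bit(corrupted[4], 1)
print("".join(chr(b) for b in corrupted))   # Hell/!

def parity(byte):
    """Return 0 if the byte has an even number of 1 bits, else 1."""
    return bin(byte).count("1") % 2

# The sender transmits one parity bit alongside each byte; the receiver
# recomputes the parities and flags any byte whose parity no longer matches.
sent_parities = [parity(b) for b in data]
errors = [i for i, b in enumerate(corrupted) if parity(b) != sent_parities[i]]
print(errors)   # [4] -- the receiver knows byte 4 was corrupted
```

Note that the receiver learns *which byte* went bad but not which bit, so with a bare parity check it can only request a retransmission rather than fix the error itself.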
Algorithms lie at the heart of computing. When you surf the web, you open a web browser. When you view a website, your browser takes the HTML file, which contains code, and uses an algorithm to convert that code into a page that actually makes sense to humans. When you listen to music on iTunes, the music player takes the MP3 file, which encodes the sound as 0’s and 1’s, and uses an algorithm to read the 0’s and 1’s and tell your computer speakers what sound to output. When you’re looking for a particular song in your music library, you might want to search alphabetically, so you click on the Name column to sort your songs alphabetically. Your computer doesn’t magically know how to sort your library. It follows a precise set of instructions every time it’s told to sort something. That set of instructions is an algorithm!
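To make that concrete, here is one such “precise set of instructions”: insertion sort, one of the simplest sorting algorithms. (A real music player almost certainly uses a faster algorithm, but the point stands: sorting is just a fixed sequence of steps, carried out the same way every time.)

```python
def insertion_sort(items):
    """Sort a list of strings alphabetically, in place, and return it."""
    for i in range(1, len(items)):
        current = items[i]
        j = i - 1
        # Shift every larger item one slot to the right...
        while j >= 0 and items[j] > current:
            items[j + 1] = items[j]
            j -= 1
        # ...then drop the current item into the gap that opened up.
        items[j + 1] = current
    return items

library = ["Yesterday", "Hey Jude", "Let It Be", "Come Together"]
print(insertion_sort(library))
# ['Come Together', 'Hey Jude', 'Let It Be', 'Yesterday']
```

Clicking the Name column simply tells the program to run steps like these over your song titles; nothing magical happens in between.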