# Understanding the Mechanics of Computers and Their Operations


## Chapter 1: The Marvel of Electricity

Electricity is a fascinating phenomenon, distinct from anything else in our universe. It can be visualized as the invisible energy that powers nearly everything we encounter today. So, what makes electricity truly remarkable?

The answer lies in electrons, the minuscule particles that can be dislodged from atoms, initiating the flow we recognize as electricity. Essentially, electricity arises from an imbalance of electric charge: a surplus of electrons in one place and a shortage in another, prompting nature to restore equilibrium.

Consider a thunderstorm. During such events, the lower regions of clouds accumulate electrons while the upper sections lose them. This disparity cannot persist indefinitely, leading to nature's intervention through a dramatic bolt of lightning. This flash is merely a rapid movement of electrons transitioning from one location to another.

In everyday electrical circuits, the process is quite similar. When one atom relinquishes an electron to a neighboring atom, it triggers a chain reaction. This atom then acquires an electron from another nearby atom, continuing the sequence. This transfer of electrons from atom to atom constitutes what we term electric current.

Batteries function like miniature power stations, generating electricity through chemical reactions. The chemicals inside a battery are meticulously selected to produce an abundance of electrons on one terminal (the negative terminal or anode), while the opposite terminal (the positive terminal or cathode) seeks to draw in those surplus electrons.

Thus, the energy stored within the chemicals is converted into electrical energy that energizes our devices.

Interestingly, all electrons are identical, regardless of their location. But how do we quantify the work accomplished by these electrons? This is where voltage comes into play. Voltage represents the potential to perform work, whereas current denotes the actual flow of electrons within the circuit.

Current is measured in amps, akin to the volume of water flowing through a pipe. Conversely, voltage can be compared to the pressure within that pipe. Additionally, resistance measures how much a material hinders the flow of electrons, quantified in ohms. Extending the analogy, you might think of the Earth as a vast reservoir of electrons, much like an ocean of water.
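The three quantities in the water analogy are tied together by Ohm's law (voltage = current × resistance), which the chapter does not state explicitly but which follows directly from these definitions. A minimal sketch in Python, with illustrative values chosen here rather than taken from the text:

```python
# Ohm's law: V = I * R
# voltage in volts, current in amps, resistance in ohms.

def voltage(current_amps: float, resistance_ohms: float) -> float:
    """Pressure needed to push a given current through a resistance."""
    return current_amps * resistance_ohms

def current(voltage_volts: float, resistance_ohms: float) -> float:
    """Flow that results when a voltage is applied across a resistance."""
    return voltage_volts / resistance_ohms

# Example: a 9-volt battery across a 450-ohm resistor.
print(current(9, 450))  # 0.02 amps, i.e. 20 milliamps
```

Doubling the resistance halves the current for the same voltage, just as narrowing a pipe reduces water flow at the same pressure.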

## Chapter 2: Information Processing in Computers

When it comes to computer information processing, Boolean algebra plays a crucial role. This mathematical framework helps to assess whether certain conditions are met. In its notation, the plus sign (+) denotes OR, multiplication (×) signifies AND, and an overbar or minus sign (−) indicates NOT.
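The three Boolean operations map directly onto the logical operators built into most programming languages. A small sketch in Python:

```python
# Boolean algebra's three basic operations, expressed with
# Python's built-in logical operators.

def boolean_or(a: bool, b: bool) -> bool:
    """Written a + b in Boolean algebra: true if either input is true."""
    return a or b

def boolean_and(a: bool, b: bool) -> bool:
    """Written a x b in Boolean algebra: true only if both inputs are true."""
    return a and b

def boolean_not(a: bool) -> bool:
    """Written with an overbar (or minus sign): flips true and false."""
    return not a

print(boolean_or(True, False))   # True
print(boolean_and(True, False))  # False
print(boolean_not(True))         # False
```

Every decision a computer makes ultimately reduces to combinations of these three operations.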

Now, let's delve into one of the most essential terms in computer science: "bit." A bit, short for "binary digit," is a fundamental unit of information in the digital realm. Our conventional number system is based on ten, primarily due to our ten fingers, but there's nothing inherently significant about this decimal system.

The binary system, in contrast, is unique because it is the simplest possible number system, containing only two digits: 0 and 1. Attempting to simplify it further would leave us with just zero, which is not very useful.
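Any decimal number can be rewritten using only those two digits by repeatedly dividing by two and collecting the remainders. A short sketch of that conversion:

```python
# Convert a non-negative decimal integer to its binary representation
# by repeated division by 2, reading the remainders in reverse.

def to_binary(n: int) -> str:
    if n == 0:
        return "0"
    digits = []
    while n > 0:
        digits.append(str(n % 2))  # remainder is the next (low-order) bit
        n //= 2
    return "".join(reversed(digits))

print(to_binary(13))  # 1101
print(bin(13))        # 0b1101, Python's built-in agrees
```

So 13 becomes 1101: one eight, one four, no twos, and one one.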

In the digital domain, a byte comprises 8 bits. This structure allows for values ranging from 00000000 to 11111111, corresponding to decimal numbers from 0 to 255. A byte is particularly advantageous for text storage, as most written languages can be represented with fewer than 256 characters. Additionally, a byte is perfect for capturing various shades of grey in black-and-white images, given that the human eye can discern around 256 shades.
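The 0-to-255 range of a byte is easy to verify directly; the character example below uses ASCII, a common (assumed) encoding that fits each English character into a single byte:

```python
# A byte is 8 bits, so it can hold 2**8 = 256 distinct values: 0 through 255.
print(2 ** 8)              # 256
print(int("00000000", 2))  # 0    (the smallest byte value)
print(int("11111111", 2))  # 255  (the largest byte value)

# One byte per character easily covers English text; ASCII, for
# example, uses only the values 0-127.
print(ord("A"))                # 65
print("Hi".encode("ascii"))    # b'Hi', two bytes
```

The same 0-255 range is what lets a single byte encode one of 256 shades of grey in an image.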

For color representation on screens, three bytes are utilized to depict the primary colors: red, green, and blue.
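With one byte per channel, each of red, green, and blue gets 256 levels, for 256³ possible colors. A sketch of packing three byte values into the familiar #RRGGBB hex notation used on the web:

```python
# One byte each for red, green, and blue: 256**3 = 16,777,216 colors.

def to_hex(red: int, green: int, blue: int) -> str:
    """Pack three byte values (0-255) into #RRGGBB notation."""
    return f"#{red:02x}{green:02x}{blue:02x}"

print(to_hex(255, 0, 0))      # #ff0000, pure red
print(to_hex(255, 255, 255))  # #ffffff, white
print(256 ** 3)               # 16777216 possible colors
```

Each pair of hex digits in the result is exactly one byte, which is why web colors are written with six hex digits.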

In the video "How Computers Work: What Makes a Computer, a Computer?", the intricacies of computer components and their functionalities are explored in detail.

The second video, "How Computer Works (Complete Course)," provides a comprehensive overview of computer operations and processing.