Method 'Practical Logic'



Principles of Information



How Computers Rely on Elementary Logic: A Simple Guide



C.P. van der Velde.

[First website version 29-03-2025]


1. Introduction



Digital computers - from phones to internet AI systems - run on a foundation of a different order than their circuits: elementary logic.
This isn't a shallow trick; it's the system of truth values and patterns that turns chaos into order. Logic crafts the bits, bytes, words, addresses, and values driving every decision, from a single flip to vast programs.
This guide unpacks it: how logic forms their core, wires into hardware, splits into arithmetic and Boolean operations, maps memory, scales to software, and grows through history, all resting on the unshakable, eternal laws of logic.

2. Data Structure and Processing



Computers process information by performing operations on truth-value patterns.

2.1. Logic's Core: Bits, Bytes, and Words



Elementary logic begins with truth-statements pinned as true or false. "The switch is on" (true) or "the switch is off" (false) sets the stage. Computers encode these basic truth-values in chunks of information that can store patterns of increasing complexity.
• The most elementary code for a truth-value is the bit: 1 for true, 0 for false - a single logic atom. Each added bit doubles the range of values that can be represented: 1 bit = 2 values, 2 bits = 4, 3 bits = 8, etc.
• Group eight bits and you get a byte - 256 patterns, like "01000001" for "A" in the ASCII alphabet - forming a 'molecule' of data.
• But computers think bigger, stacking bytes into words - ICT's term for a processor's native chunk of data, say 16, 32, or 64 bits - like "01000001 01000010" (two bytes, one 16-bit word) or "01000001 01000010 00000000 00000000" (four bytes, one 32-bit word, over 4 billion values).
Thus, bits are the atoms, bytes the molecules, words the strings - each builds truth-value patterns.
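
For the hands-on reader, a minimal Python sketch (standard library only) makes these layers concrete; the variable names are illustrative, not part of any real system:

    # A bit is one truth value: 1 for true, 0 for false.
    bit_true, bit_false = 1, 0

    # Eight bits form a byte: 2**8 = 256 distinct patterns.
    byte_for_A = 0b01000001            # the ASCII pattern for "A"
    print(chr(byte_for_A))             # -> "A"

    # Each added bit doubles the number of representable values.
    for n_bits in (1, 2, 3, 8, 32):
        print(n_bits, "bits ->", 2 ** n_bits, "values")

    # A word bundles bytes into the processor's native chunk,
    # e.g. two bytes (16 bits) holding "A" and "B" side by side.
    word = (0b01000001 << 8) | 0b01000010
    print(format(word, "016b"))        # -> "0100000101000010"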

2.2. Word Sizes



Word sizes matter because they determine the scope of operations. They're the processor's handshake - how much logic it grabs at once.
• A 4-bit word (half a byte) tops out at 2^4 = 16 values, limiting math to 0-15 (e.g., "1111" = 15).
• An 8-bit word (1 byte) tops out at 2^8 = 256 values, stretching the range to 0-255 (e.g., "11111111" = 255), enough for early games like Pong.
• A 32-bit word (4 bytes) holds 2^32: over 4 billion values, tackling big numbers - up to 4,294,967,295 (all 1s, unsigned) or 2,147,483,647 as the signed maximum - powering 1990s PCs.
• A 64-bit word (8 bytes) leaps to 2^64, i.e. ~18 quintillion possibilities - think "3.14159..." for pi in science apps.
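
A short Python sketch reproduces these ranges directly, assuming unsigned values:

    # An n-bit word holds 2**n patterns, so the largest
    # unsigned integer it can store is 2**n - 1.
    for bits in (4, 8, 32, 64):
        print(f"{bits:>2}-bit word: {2**bits:,} values, max {2**bits - 1:,}")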

2.3. Gates: Wiring Truth into Metal



These patterns are grounded in hardware via logic gates. An AND gate takes two inputs: if both are 1, it outputs 1, else 0. OR gates pass 1 if either input is 1; NOT gates flip 1 to 0 and 0 to 1. Think of a lock: "if key AND turn, it opens" - two truths click. Stack gates - like a 4-bit circuit - and truth flows, each input pair mapping to an output.
Chips pack billions of gates, turning "if this, then that" into a browser's hum. Logic's patterns spark to life in silicon.
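
A minimal sketch of these gates in Python - as truth-value functions, not real silicon:

    # Logic gates modeled on 0/1 inputs.
    def AND(a, b): return a & b    # 1 only if both inputs are 1
    def OR(a, b):  return a | b    # 1 if either input is 1
    def NOT(a):    return 1 - a    # flips 1 to 0 and 0 to 1

    # The lock example: it opens only when key AND turn are both true.
    key, turn = 1, 1
    print("opens:", AND(key, turn))        # -> opens: 1

    # Full truth table for AND: each input pair maps to an output.
    for a in (0, 1):
        for b in (0, 1):
            print(a, "AND", b, "=", AND(a, b))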

2.4. Two Paths: Arithmetic vs. Boolean Logic



Computers split logic two ways - arithmetic and Boolean - both key.
Arithmetic adds: "0011 + 0101 = 1000" (3 + 5 = 8 in 4-bit binary), summing bits with carryovers.
Boolean operations apply logical relations - AND (both true), OR (one true), NOT (flip it) - and derive their conclusions, like "if power AND bulb, then light"; in 4-bit words: "0011 AND 0101 = 0001" (A AND B: true only where both are 1 - the rightmost bit here). This Boolean logic locks combinations into predictable values, computation's root.
Picture a shop: arithmetic tallies "3 apples + 5 oranges = 8 fruit"; Boolean checks "if apples AND oranges in stock, sell" (true just for what's there). CPUs juggle both - arithmetic for math, Boolean for decisions - two faces of logic's truth patterns.
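
Both paths fit in a few lines of Python; the shop variables are illustrative stand-ins:

    # Arithmetic path: add two 4-bit patterns, carries and all.
    a, b = 0b0011, 0b0101          # 3 and 5
    print(format(a + b, "04b"))    # -> "1000" (3 + 5 = 8)

    # Boolean path: AND each bit pair; 1 only where both bits are 1.
    print(format(a & b, "04b"))    # -> "0001"

    # The shop: arithmetic tallies, Boolean decides.
    apples, oranges = 3, 5
    print("fruit:", apples + oranges)           # -> 8
    print("sell:", apples > 0 and oranges > 0)  # -> True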

2.5. Addresses and Values: The Memory Map



Logic scales via addresses and values.
• An address is a location - 3 bits (000 to 111) give 2^3 = 8 spots, like "101" pointing to a memory slot.
• The value is what's there - say, "1" (true) or "01000001" (A).

Words set both address range and value scope.
Memory ties values to addresses. A 64-bit word can encode 2^64 addresses, or slots in memory, each of which contains a value - like a number, letter, or command.
Operations - like "if 001 AND 010, then 000" (if the values at addresses 001 and 010 are both true, write to 000) - chain values via their addresses. Every task, mathematical - 5 + 7 - or Boolean - "if-then" - rides this duo: addresses find, values fill.
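
A toy Python model of this memory map - a plain dictionary standing in for real hardware, with 3-bit addresses as in the example above:

    # 3-bit addresses give 2**3 = 8 slots, "000" through "111".
    memory = {format(addr, "03b"): 0 for addr in range(8)}

    memory["101"] = 0b01000001            # the pattern for "A" at address 101
    memory["001"], memory["010"] = 1, 1

    # An operation chains values via their addresses:
    # "if 001 AND 010, then 000".
    if memory["001"] and memory["010"]:
        memory["000"] = 1

    print(memory["101"], chr(memory["101"]))   # -> 65 A
    print(memory["000"])                       # -> 1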

2.6. Scheme: Bits to Operations



Addresses locate memory spots; values fill them according to their type - numbers (e.g., 5), Booleans (e.g., true), characters (e.g., "A"), or strings (e.g., "hello").
Operations bind them: numerical (add: 3 + 5 = 8), Boolean (AND: 1 AND 1 = 1), character (concatenate: "A" + "B" = "AB"), string (join: "hi" + "there" = "hithere").
Thus, logic flows from bits through words to typed actions.
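
In Python, these typed operations look like this (a sketch mirroring the examples above):

    print(3 + 5)            # numerical add             -> 8
    print(1 & 1)            # Boolean AND               -> 1
    print("A" + "B")        # character concatenation   -> "AB"
    print("hi" + "there")   # string join               -> "hithere"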

2.7. Software: Logic Stacked High



Software lifts it further. Code turns gate-level logic into commands. "If user clicks AND file's ready, open" mirrors an AND gate - two truths trigger it.
Loops - "check until true" - and branches - "if 1, this; if 0, that" - stack patterns into choices.
AI stretches it: "if this X post's tone AND context match, predict this reply."
Coders use languages like Python, but it's still bits, bytes, and words - addresses flipping values per logic's rules. From a game's jump to a car's brake, it's elementary logic grown vast.
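
A minimal Python sketch of this stacking - the names user_clicked, file_ready, and open_file are hypothetical stand-ins, not a real API:

    def open_file():                        # hypothetical command
        print("file opened")

    user_clicked, file_ready = True, True
    if user_clicked and file_ready:         # mirrors an AND gate
        open_file()

    # A branch: "if 1, this; if 0, that".
    bit = 1
    print("this" if bit == 1 else "that")

    # A loop: "check until true".
    checks, done = 0, False
    while not done:
        checks += 1
        done = checks >= 3                  # becomes true on the third pass
    print("checked", checks, "times")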

2.8. A Byte's Growth: Logic Through Time



Logic's reach in computers grew with byte size.
• Early 1950s computers used 4-bit chunks - 16 values, enough for basic math.
• By the 1960s, mainframes adopted larger words - Sperry Rand's Univac 1107 of 1962 used 36-bit words. The 8-bit byte (256 values), however, became the standard, powering pioneers like the IBM System/360 of 1964 - letters and numbers in one go.
• The 1980s brought 16-bit words to PCs built on processors like the Intel 8086.
• The 1990s pushed to 32-bit (4 billion values).
• Then the 2000s hit 64-bit, which rules today's machines - phones to servers.
Each jump widened logic's canvas, from crude sums to global nets - all on the same principles of logic.

3. Why It Holds: Truth's Precision



Why does it work? The laws of logic simply rest on combinatory principles. Wherever and however distinct values (like features of objects) combine, the result is predicted by the laws of elementary logic.
Logic's patterns are ironclad - "1 AND 0" is 0, no wobble. Computers bank on this: every address ties to a value, sure as stone. This drives billions of calls - streaming picks, traffic lights - because logic's frame holds tight. It's not just speed; it's certainty, from one gate to a global web.

4. The Catch: Simplicity's Edge and Limit



There's a trade-off. Logic's binary - true or false - assumes crisp inputs. Life blurs: "is it cold?" might be "sort of." Computers stretch this with more bits - 16-bit color for "almost red" - but at root, it's 1 or 0. Quantum systems fuzz it with qubits, yet classical logic rules most tech. Its edge is precision; its limit is stiffness - truth patterns don't sway to gray.

5. The Takeaway



Computers rely on elementary logic as their pulse - bits, bytes, and words weaving addresses and values into action.
Arithmetic adds, Boolean decides - both logic's kin. This guide shows it: from a gate's buzz to an AI's guess, logic's the thread.