Monday, September 22, 2025

Numbers - the Language of Technology

Before we can understand modern computer and Internet technologies, we must know the language. Numbers are the language of modern computers and the Internet. The words you are reading on your computer were recorded in a code of 1s and 0s as I typed them.

Modern computers and Internet protocols encode two-value (0 and 1) Boolean logic in their designs. The physical computer circuits (both internal and external to silicon chips) are a physical manifestation of two-value Boolean logic. They achieve this in various ways: as voltages on wires, as electro-mechanical relays and capacitive storage devices, as orientations of a magnetic domain in ferromagnetic storage devices, as holes in punched cards or paper tape, and so on.

While it is possible to encode more than two symbols in any given medium, in practice the tight constraints of high speed, small size, and low power combine to make (electronic) noise a major factor. This makes it hard to distinguish between symbols when there are many of them at a single site (electronic noise is an important design consideration, which I'll discuss later). Rather than attempting to distinguish between four voltages on one wire, early computer designers settled on two voltages per wire, high and low.

Another reason for the present-day two-value logic system is that the first electronic computer circuits were made from relays, which have only two operational states, on and off. Because of the electro-mechanical limitations of relay circuits, early designers had no choice but to choose a two-value logic system.

In the 1930s, while studying relay switching circuits, Claude Shannon observed that one could apply the rules of Boole's algebra to arrays of relays to add, subtract, multiply and divide decimal numbers. He introduced switching algebra as a way to analyze and design circuits by algebraic means in terms of logic gates[1], casting his switching algebra as the two-element Boolean algebra. In circuit engineering settings today there is little need to consider other Boolean algebras, so "switching algebra" and "Boolean algebra" are often used interchangeably.
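Shannon's insight was that two-value switching logic is enough to carry out arithmetic. As a rough sketch of that idea (my own illustration, not taken from Shannon's paper), here is a half adder and a ripple-carry adder built purely from the logical operations AND, OR, and XOR, with each "relay" modeled as a single bit:

```python
def half_adder(a: int, b: int) -> tuple[int, int]:
    """Add two bits; return (sum, carry) using only logic operations."""
    return a ^ b, a & b          # XOR gives the sum bit, AND gives the carry

def full_adder(a: int, b: int, carry_in: int) -> tuple[int, int]:
    """Add two bits plus an incoming carry."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2           # OR combines the two possible carries

def add_words(x: int, y: int, width: int = 8) -> int:
    """Ripple-carry addition of two integers, one bit (one 'relay stage') at a time."""
    carry, result = 0, 0
    for i in range(width):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result

print(add_words(3, 4))   # 7 -- ordinary addition, built from pure logic gates
```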


Computers use two-value Boolean circuits for the above reasons. The most common computer architectures use ordered sequences of Boolean values, called words. Words consist of 8, 16, 32 or 64 discrete values, e.g. 01101000110101100101010101001011; each discrete value is called a bit (binary digit). When programming in machine code, assembly language, and certain other programming languages, programmers work with the low-level digital structure of the data registers. Such languages support both numeric operations and logical operations. These computers can add, subtract, multiply, and divide two sequences of bits that are interpreted as integers. They can also perform the Boolean logical operations of disjunction, conjunction, and negation on sequences of bits. Programmers therefore have the option of working in, and applying the laws of, either numeric algebra or Boolean algebra as needed. A core feature distinguishing the two is the carry: numeric operations carry between bit positions, while logical operations treat each bit independently.
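As a concrete illustration of that last point (a sketch of my own, not from the text), compare arithmetic addition with the bitwise logical operations on the same two 4-bit values:

```python
a = 0b0110  # 6
b = 0b0011  # 3

print(format(a + b, "04b"))        # 1001 -- addition: the carry ripples left
print(format(a & b, "04b"))        # 0010 -- conjunction (AND), bit by bit, no carry
print(format(a | b, "04b"))        # 0111 -- disjunction (OR), bit by bit
print(format(a ^ b, "04b"))        # 0101 -- exclusive OR, bit by bit
print(format(~a & 0b1111, "04b"))  # 1001 -- negation of each bit (masked to 4 bits)
```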


The underlying technology that makes possible all present-day computer, Internet, cell phone, encryption, radio, television and related technologies is binary digital (two-state, discrete) technology.


Many historians say that human technology began with the discovery of how to give stone a cutting edge; but I believe technology got its start when Man developed a sense of numbers, such that through the use of symbol and language a number could serve as a noun or an adjective. Number sense, like common sense, is difficult to define or express simply. It refers to an intuitive feel for numbers, their various uses, relationships, and interpretations, and how they are affected by operations. That intuitive feel comes from learning through life experiences.


We don’t know how or when Man developed a sense of numbers; we speculate that it began with pattern recognition. We begin as children by associating a symbol and a sound with a pattern: the sound a mother makes when she says ‘2’ in relation to two objects, the sound she makes when she says ‘three’ with three objects. We are not born with a sense of numbers; it has to be learned. We begin counting at an early age by associating a number with a pattern of fingers. It probably wasn’t until Man reached the number 10 (two hands) that he found he could use 10 pebbles (or any object) plus one more of that object for the next number. It took many generations for Man to develop a rudimentary sequence of counting from 1 through a hand (5) and from six through two hands (10).


As Man evolved and his mind expanded with ideas of “how many” and “how big,” he slowly developed a way to convey that type of information to others through the use of language and gestures. Counting in prehistory was first assisted by using body parts, primarily the fingers. In his book ‘A History of Computing Technology’, Michael R. Williams describes three ways fingers can be used to aid humans in calculating. The first is simple counting, the way we all did it in first grade. The second used fingers and finger positions as a ‘number language’ similar to today’s sign language. The third involved both physical counting using fingers and mental addition and/or subtraction, with the answer displayed on the fingers.


Another fundamental step in developing a sense of numbers is a concept of relation between numbers. Four (4) is less than five (5) and greater than three (3); 4 is the same as two 2s, or four 1s, or a 2 and two 1s. This is the meaning of ‘4’ and its function when referenced to some object(s). The first numerical functions were probably addition and ‘take away’ (remember that from first grade?), later known as subtraction; then came multiplication and division, then algebra and finally calculus.


Today, almost everything in our world and within the realm of our mental capacity has at least one property which is defined or identified by a number. But a modern computer knows only two numbers, 0 and 1, and every arithmetic problem it solves is ultimately reduced to simple addition. In fact a computer does not know what ‘1’ or '0' is unless we humans tell it what it is and what to do with it.
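To make that reduction concrete, here is a small sketch of my own (the helper names are hypothetical, not from the text) showing how an 8-bit machine can subtract by adding: the subtrahend is replaced by its two's complement, so no separate subtraction circuit is needed.

```python
WIDTH = 8
MASK = (1 << WIDTH) - 1      # 0b11111111

def twos_complement(x: int) -> int:
    """Negate x by inverting every bit and adding 1 (all within 8 bits)."""
    return (~x + 1) & MASK

def subtract(a: int, b: int) -> int:
    """Compute a - b using only addition."""
    return (a + twos_complement(b)) & MASK

print(subtract(9, 4))   # 5
```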


Man did not build machines to calculate because he wanted to; he built calculating machines because he needed them to survive. We started with pattern recognition, pebbles in a rawhide bag, tally sticks and knotted rope. Then we made simple hand instruments: the abacus, the quadrant, Napier’s bones and slide rules. Then we got mechanical with Pascal and Leibniz and their hand-crank calculators, and Babbage’s Analytical Engine. With the coming of World War II, the need for computing power surpassed what all of Man’s machines could produce. When Man turned to electricity to power his new machines, he ushered in our present era of digital binary computing.


Today our machines get more powerful because our numbers get both bigger and smaller while becoming more accurate. Numbers that grow bigger and smaller, faster each day, lie at the root of all knowledge, and they grow bigger and smaller faster because our machines allow them to. It has, in essence, become a circle: if we want to progress and make new discoveries, we need more powerful machines.

Mathematics and arithmetic are what bring science to the real world; they are how we describe and define our world and how we measure progress, and together with philosophical and abstract thought they form the basis of discovery.


We don't know when or how Man developed a concept of numbers. Scholars speculate that it began simultaneously with language; the reasoning behind this assumption is that Man needed to convey information. The number concept probably started with our ancestors assigning a symbol to a pattern. For example, a herd of 5 sheep would be assigned the symbol "5", and all sheep herds matching the pattern of that first reference herd would receive the symbol "5". Information could be conveyed without rounding up all the sheep and presenting them; "5" saved time and energy. This assigning of symbols to real-life objects became what we now call a "cardinal" number.


At about the same time Man developed techniques for counting. Counting allowed him to continuously redefine his meaning of how much and how many. Through the study of language, we find that early Man's concept of counting was based on his observations of nature: five was one hand, ten was two hands, twenty was one man, and the cycles of the tides (12 and 24) and the moon (30) supplied other groupings. Through the years Man came to accept the concept of a number line, based on the universally accepted idea that we can always pass from any number to its successor or predecessor. We commonly call this system of numeration an ordinal system.


It is language that gives us the symbols 0, 1, 2, 3, ... Our theory of computer science is based on the set of numbers {0, 1}. The symbols 0 and 1 are what we define them to be. A “1” can be a positive voltage or a negative voltage, a current or a state of no current; it can be a frequency or the phase of an analog signal; it can be a closed circuit or an open circuit, or a note from a musical instrument. A “1” is anything that we define it to be. In the first mechanical computers a “1” was a specific position on a circle: the circle was a gear wheel and the position was marked by a gear tooth. Later, after we harnessed electricity, a “1” came to be defined as a closed circuit. In modern computers, because of the broad range of uses and functions, a “1” has a very broad range of definitions, and the definition of a “1” can even vary within a specific piece of equipment. All of the above is also true of a “0”.


A number is a mathematical object used to count, measure and label. During the 19th century, mathematicians began to develop many different abstractions which share certain properties of numbers and may be seen as extending the concept. Over time these abstractions were formalized into rules.


Mathematical rules depend on the limits we place on the particular numerical quantities being dealt with and on how we interpret the operations performed on them. When we say that 1 + 1 = 2 or 3 + 4 = 7, we are implying the use of integer[2] quantities, the same kinds of numbers we all used when we learned to count[3]. But when we use numbers with electricity, we need to be careful, because what most of us assume to be self-evident rules of arithmetic, valid at all times and for all purposes, actually depend on what we define a number to be.


For example, when considering numerical quantities in alternating current[4] (AC) circuits, we find that real-number[5] quantities are inadequate for the task of representing AC quantities. We know that voltages[6] add when sources are connected in series, but we also know that it is possible to connect a 3-volt AC source in series with a 4-volt AC source and end up with 5 volts total voltage (3 + 4 = 5)! Does this mean the inviolable and self-evident rules of arithmetic have been violated? No, it just means that the rules of real numbers do not apply to the kinds of quantities encountered in AC circuits (and many other areas of electronics), where every variable has both a magnitude[7] and a phase[8]. We need a different definition of numerical quantity for AC circuits (complex numbers, rather than real numbers), and along with this different definition of numbers comes a different set of rules telling us how they relate to one another.


An expression such as “3 + 4 = 5” is nonsense within the scope and definition of real numbers, but it does fit within the scope and definition of complex numbers (think of a right triangle with opposite and adjacent sides of 3 and 4, with a hypotenuse of 5). Because complex numbers are two-dimensional, they are able to “add” with one another trigonometrically as single-dimension “real” numbers cannot. 
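Here is the same 3 + 4 = 5 example worked out numerically (a sketch of my own, not from the text), treating the two AC sources as complex numbers (phasors) that happen to be 90 degrees out of phase:

```python
import cmath

v1 = cmath.rect(3, 0)                        # 3 V at 0 degrees
v2 = cmath.rect(4, cmath.pi / 2)             # 4 V at 90 degrees

total = v1 + v2
print(abs(total))                            # 5.0 -- the magnitude of the sum
print(cmath.phase(total) * 180 / cmath.pi)   # ~53.13 degrees -- its phase angle
```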


Computer logic is like mathematics in that its rules depend on how we define what a proposition[9] is. The Greek philosopher Aristotle founded a system of logic based on only two types of propositions: true and false. His bivalent (two-mode) definition of truth led to the four foundational laws of logic: the Law of Identity[10] (A is A); the Law of Non-contradiction[11] (A is not non-A); the Law of the Excluded Middle[12] (either A or non-A); and the Law of Rational Inference[13]. These laws function within the scope of logic where a proposition is limited to one of two (binary) possible values, but they may not apply in cases where propositions can hold values other than “true” or “false.” In fact, much work is currently being done on “multivalued,” or fuzzy, logic[14], where propositions may be true or false to a limited degree, such as in quantum computing[15]. In such a system of logic, “laws” such as the Law of the Excluded Middle simply do not apply, because they are founded on the assumption of bivalence. Likewise, many premises[16] which would violate the Law of Non-contradiction in Aristotelian logic have validity in “fuzzy” logic. Again, the defining limits of propositional values determine the laws describing their functions and relations.
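A tiny sketch of my own (using the common min/max/complement fuzzy operators; not from the text) shows why the bivalent laws break down once propositions take degrees of truth between 0.0 and 1.0:

```python
def fuzzy_not(a: float) -> float:
    return 1.0 - a

def fuzzy_and(a: float, b: float) -> float:
    return min(a, b)

def fuzzy_or(a: float, b: float) -> float:
    return max(a, b)

a = 0.7                               # a proposition that is "mostly true"
print(fuzzy_or(a, fuzzy_not(a)))      # 0.7, not 1.0 -- the excluded middle fails
print(fuzzy_and(a, fuzzy_not(a)))     # 0.3, not 0.0 -- non-contradiction fails too
```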


English mathematician George Boole[17] sought to give symbolic form to Aristotle’s system of logic. Boole wrote a treatise on the subject in 1854, titled An Investigation of the Laws of Thought, on Which Are Founded the Mathematical Theories of Logic and Probabilities[18], which codified several rules of relationship between mathematical quantities limited to one of two possible values: true or false, 1 or 0. His mathematical system became known as Boolean algebra.


All Boolean operations have only one of two possible outcomes: either 1 or 0. It is a world in which all other possibilities are invalid. This is not the kind of math you want to use when balancing a checkbook or calculating current through a resistor. However, Claude Shannon[19], while at the Massachusetts Institute of Technology (MIT), recognized how Boolean algebra could be applied to on-and-off electrical circuits, where all signals are characterized as either “high” (1) or “low” (0). His 1938 thesis, titled A Symbolic Analysis of Relay and Switching Circuits[20], put Boole’s theoretical work to use in a way Boole never could have imagined, giving us a powerful mathematical tool for designing and analyzing digital circuits.
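A minimal sketch of Shannon's correspondence (my own illustration, not drawn from his thesis): contacts wired in series behave like Boolean AND, contacts wired in parallel behave like Boolean OR, with True standing for a closed switch or a "high" signal.

```python
def series(*switches: bool) -> bool:
    """Current flows through a series chain only if every contact is closed."""
    return all(switches)

def parallel(*switches: bool) -> bool:
    """Current flows through a parallel group if any contact is closed."""
    return any(switches)

# A lamp controlled by (A AND B) OR C -- say, two interlock switches plus an override.
for A in (False, True):
    for B in (False, True):
        for C in (False, True):
            lamp = parallel(series(A, B), C)
            print(A, B, C, "->", lamp)
```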


George Boole gave us binary (two-mode) mathematics. Claude Shannon applied that binary mathematics to simple on-off electrical circuits. This marriage of binary mathematics and simple electric circuits formed the foundation of all our modern-day computer theory.


1 Wikipedia contributors, "Claude Shannon," Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/w/index.php?title=Claude_Shannon&oldid=698772147 (accessed January 19, 2016).

2 Wikipedia contributors, "Integer," Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/w/index.php?title=Integer&oldid=698453838 (accessed January 6, 2016).

3 Wikipedia contributors, "Counting," Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/w/index.php?title=Counting&oldid=697913915 (accessed January 6, 2016).

4 Wikipedia contributors, "Alternating current," Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/w/index.php?title=Alternating_current&oldid=695480927 (accessed January 6, 2016).

5 Wikipedia contributors, "Real number," Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/w/index.php?title=Real_number&oldid=697498365 (accessed January 6, 2016).

6 Wikipedia contributors, "Voltage," Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/w/index.php?title=Voltage&oldid=692987865 (accessed January 6, 2016).

7 Wikipedia contributors, "Magnitude (mathematics)," Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/w/index.php?title=Magnitude_(mathematics)&oldid=693909999 (accessed January 6, 2016).

8 Wikipedia contributors, "Phase (waves)," Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/w/index.php?title=Phase_(waves)&oldid=694474924 (accessed January 6, 2016).

9 Wikipedia contributors, "Proposition," Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/w/index.php?title=Proposition&oldid=689631127 (accessed January 6, 2016).

10 Wikipedia contributors, "Law of identity," Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/w/index.php?title=Law_of_identity&oldid=690498138 (accessed January 6, 2016).

11 Wikipedia contributors, "Law of noncontradiction," Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/w/index.php?title=Law_of_noncontradiction&oldid=697624609 (accessed January 6, 2016).

12 Wikipedia contributors, "Law of excluded middle," Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/w/index.php?title=Law_of_excluded_middle&oldid=690696224 (accessed January 6, 2016).

13 Law of Rational Inference: A=B & B=C, Therefore A=C. These laws of logic or thought were not invented by man. These laws are universal, absolute, and invariant because these have their primary existence in the mind of God. In order to understand what you’re reading at this moment you are utilizing these laws. In understanding the words you see you are using the law of identity. But in using that law you are also using both the law of excluded middle and noncontradiction for an understanding of the meaning of the words individually and in context. And in understanding the overall meaning of what I have written you are utilizing the law of rational inference.

14 Wikipedia contributors, "Fuzzy logic," Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/w/index.php?title=Fuzzy_logic&oldid=696573395 (accessed January 6, 2016).

15 Wikipedia contributors, "Quantum computing," Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/w/index.php?title=Quantum_computing&oldid=698269363 (accessed January 12, 2016).

16 Wikipedia contributors, "Premise," Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/w/index.php?title=Premise&oldid=686324742 (accessed January 6, 2016).

17 Wikipedia contributors, "George Boole," Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/w/index.php?title=George_Boole&oldid=698167395 (accessed January 6, 2016).

18 Boole, George. An Investigation of the Laws of Thought on Which are Founded the Mathematical Theories of Logic and Probabilities. Project Gutenberg Release Date: February 16, 2005 [EBook #15114] PDF-http://www.gutenberg.org/files/15114/15114-pdf.pdf?session_id=64b4d891f957bacc02a023cd705f2da7cbb6d271 (accessed January 6, 2016).

19 Wikipedia contributors, "Claude Shannon," Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/w/index.php?title=Claude_Shannon&oldid=698192619 (accessed January 7, 2016).

20 Shannon, Claude 1940. A Symbolic Analysis of Relay and Switching Circuits http://dspace.mit.edu/bitstream/handle/1721.1/11173/34541425-MIT.pdf?sequence=2 (accessed January 7, 2016).


 
