Monday, September 22, 2025

Numbers - the Language of Technology

Before we can understand modern computer and Internet technologies, we must know their language. Numbers are the language of modern computers and the Internet. The words you are reading on your computer were recorded in a code of 1s and 0s as I typed them. 

Modern computers and Internet protocols encode two-value (0 and 1) Boolean logic in their designs. Physical computer circuits (both internal and external to silicon chips) are a physical manifestation of two-value Boolean logic. They achieve this in various ways: as voltages on wires, as electromechanical relays and capacitive storage devices, as orientations of a magnetic domain in ferromagnetic storage devices, as holes in punched cards or paper tape, and so on. 

While it is possible to encode more than two symbols in any given medium, in practice the tight constraints of high speed, small size, and low power combine to make electronic noise a major factor. Noise makes it hard to distinguish between symbols when there are many of them at a single site (electronic noise is an important design factor, which I'll speak of later). Rather than attempting to distinguish between four voltages on one wire, early computer designers settled on two voltages per wire: high and low.

Another reason for the present-day two-value logic system is that the first computer circuits were made from relays, which have only two operational states: on and off. Because of the electromechanical limitations of relay circuits, early designers had no choice but to choose a two-value logic system. 

In the 1930s, while studying relay switching circuits, Claude Shannon observed that one could apply the rules of Boole's algebra to arrays of relays to add, subtract, multiply and divide decimal numbers. He introduced switching algebra as a way to analyze and design circuits by algebraic means in terms of logic gates1, casting his switching algebra as the two-element Boolean algebra. In circuit engineering settings today there is little need to consider other Boolean algebras, so "switching algebra" and "Boolean algebra" are often used interchangeably. 
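
To make Shannon's observation concrete, here is a minimal Python sketch (my illustration, not Shannon's relay notation) that models each relay as a Boolean value and wires AND, OR, and XOR gates into a one-bit adder:

```python
# A sketch of Shannon's observation: networks of two-state relays
# (modeled here as Python booleans) can perform arithmetic.
# The gate and function names are illustrative, not Shannon's.

def AND(a, b): return a and b
def OR(a, b):  return a or b
def XOR(a, b): return a != b

def full_adder(a, b, carry_in):
    """Add three one-bit inputs; return (sum_bit, carry_out)."""
    partial = XOR(a, b)
    sum_bit = XOR(partial, carry_in)
    carry_out = OR(AND(a, b), AND(partial, carry_in))
    return sum_bit, carry_out

# 1 + 1 with no incoming carry: sum bit 0, carry bit 1 (binary 10 = 2)
print(full_adder(True, True, False))  # (False, True)
```

Chaining such adders, one per bit position, is essentially how relay arrays (and later transistor circuits) add multi-digit binary numbers.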


Computers use two-value Boolean circuits for the reasons above. The most common computer architectures operate on ordered sequences of Boolean values called words. A word consists of 8, 16, 32 or 64 discrete values, e.g. 01101000110101100101010101001011; each discrete value is called a bit (binary digit). When programming in machine code, assembly language, and certain other programming languages, programmers work directly with the low-level digital structure of the data registers. Such languages support both numeric and logical operations: a computer can add, subtract, multiply and divide two sequences of bits interpreted as integers, and it can apply the Boolean logical operations of disjunction, conjunction, and negation to those same sequences. Programmers therefore have the option of working in, and applying the laws of, either numeric algebra or Boolean algebra as needed. A core difference between the two is the carry: arithmetic operations propagate a carry between bit positions, while logical operations act on each bit independently. 
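
As a small illustration of that last point, the following Python snippet (Python used for clarity; assembly code would manipulate registers directly) contrasts arithmetic addition, where a carry ripples between bit positions, with the bitwise logical operations, which treat each bit independently:

```python
a = 0b0110  # decimal 6
b = 0b0011  # decimal 3

# Arithmetic: the carry propagates from one bit position to the next.
print(format(a + b, '04b'))        # 1001 (6 + 3 = 9)

# Logic: each bit position is computed independently; there is no carry.
print(format(a | b, '04b'))        # 0111 disjunction (OR)
print(format(a & b, '04b'))        # 0010 conjunction (AND)
print(format(~a & 0b1111, '04b'))  # 1001 negation (NOT), masked to 4 bits
```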


The underlying technology that makes possible all present-day computer, Internet, cell phone, encryption, radio, television and related technologies is binary digital (two-state) technology. 


Many historians say that human technology began with the discovery of how to give stone a cutting edge; but I believe technology got its start when Man developed a sense of numbers, such that through the use of symbol and language a number could serve as a noun or an adjective. Number sense, like common sense, is difficult to define or express simply. It refers to an intuitive feel for numbers: their various uses, relationships, and interpretations, and how they are affected by operations. That intuitive feel comes from learning through life experiences. 


We don't know how or when Man developed a sense of numbers; we speculate that it began with pattern recognition. We begin as children by associating a symbol and a sound with a pattern: the sound a mother makes when she says 'two' in relation to two objects, the sound she makes when she says 'three' with three objects. We are not born with a sense of numbers; it has to be learned. We begin counting at an early age by associating a number with a pattern of fingers. It probably wasn't until Man reached the number 10 (two hands) that he found he could use 10 pebbles (or any object) plus 1 more of that object for the next number. It took many generations for Man to develop a rudimentary sequence of counting 1 through hand (5) and six through 2-hands (10). 


As Man evolved and his mind expanded with ideas of "how many" and "how big," he slowly developed a way to convey that type of information to others through the use of language and gestures. Counting in prehistory was first assisted by using body parts, primarily the fingers. In his book A History of Computing Technology, Michael R. Williams describes three ways fingers can be used to aid humans in calculating. The first is simple counting, the way we all did it in first grade. The second uses fingers and finger positions as a 'number language' similar to today's sign language. The third involves both physical counting using fingers and mental addition and/or subtraction, with the answer displayed on the fingers. 


Another fundamental step in developing a sense of numbers is the concept of relations between numbers. Four (4) is less than five (5) and greater than three (3); 4 is the same as two 2s, four 1s, or a 2 and two 1s. These relations give '4' its meaning and its function when referenced to some object(s). The first numerical operations were probably addition and 'take away' (remember that from 1st grade?), later known as subtraction; then came multiplication and division, then algebra and finally calculus. 


Today, almost everything in our world and within the realm of our mental capacity has at least one property which is defined or identified by a number. But a modern computer knows only two numbers, 0 and 1, and every numeric problem it solves ultimately reduces to simple additions on those two values. In fact a computer does not know what '1' or '0' is unless we humans tell it what it is and what to do with it. 


Man did not build machines to calculate because he wanted to; he built calculating machines because he needed them to survive. We started with pattern recognition, pebbles in a rawhide bag, tally sticks and knotted rope. Then we made simple hand instruments: the abacus, the quadrant, Napier's bones and slide rules. Then we got mechanical with Pascal and Leibniz and their hand-crank calculators, and Babbage's Analytical Engine. With the coming of World War 2, the need for computing power surpassed what all of Man's machines could produce. When Man turned to electricity to power his new machines, he ushered in our present era of digital binary computing. 


Today our machines grow more powerful because our numbers are both bigger and smaller, while being more accurate. Numbers that get bigger and smaller, faster each day, lie at the root of all knowledge. They get bigger and smaller faster because our machines allow them to. It has, in essence, become a circle: if we want to progress and make new discoveries, we need more powerful machines.

Mathematics and arithmetic are what bring science to the real world: how we describe and define our world, how we measure progress, and, together with philosophical and abstract thought, the basis of discovery. 


We don't know when or how man developed a concept of numbers. Scholars speculate that it began simultaneously with language, the reasoning being that man needed to convey information. The number concept probably started with our ancestors assigning a symbol to a pattern. For example, a herd of 5 sheep would be assigned the symbol "5", and all sheep herds matching the pattern of that first reference herd would receive the symbol "5". Information could be conveyed without rounding up all the sheep and presenting them; "5" saved time and energy. This assigning of symbols to real-life objects became what we now call a "cardinal" number. 


At about the same time, man developed techniques for counting. Counting allowed man to continuously redefine his meaning of how much and how many. Through the study of language, we find early man's concept of counting was based on his observations of nature: five was one hand, ten was two hands, 20 was one man, and the cycles of the tides (12 and 24) and moon (30) supplied larger groupings. Through the years man came to accept a universal concept of a number line, based on the universally accepted idea that we can always pass from any number to its successor or predecessor. We commonly call this system of numeration an ordinal system. 


It is language that gives us the symbols 0, 1, 2, 3, ... Our theory of computer science is based on the set of numbers {0, 1}. The symbols 0 and 1 are what we define them to be. A "1" can be a positive voltage or a negative voltage, a current or a state of no current; it can be a frequency, the phase of an analog signal, a closed circuit or an open circuit, or a note from a musical instrument. A "1" is anything that we define it to be. In the first mechanical computers a "1" was a specific position on a circle: the circle was a gear wheel, and the position was a gear tooth. Later, after we developed electricity, a "1" came to be defined as a closed circuit. In modern computers, because of the broad range of uses and functions, a "1" has a very broad range of definitions, and the definition can vary even within a specific piece of equipment. All of the above is also true of a "0". 


A number is a mathematical object used to count, measure and label. During the 19th century, mathematicians began to develop many different abstractions which share certain properties of numbers and may be seen as extending the concept. Each of these abstractions brought its own rules. 


Mathematical rules are based on the limits we place on the particular numerical quantities being dealt with and on how we interpret the operations performed on them. When we say that 1 + 1 = 2 or 3 + 4 = 7, we are implying the use of integer2 quantities, the same types of numbers we all used when we learned to count3. But when we use numbers with electricity, we need to be careful, because what most of us assume to be self-evident rules of arithmetic, valid at all times and for all purposes, actually depend on what we define a number to be. 


For example, when considering numerical quantities in Alternating Current4 (AC) circuits, we find that Real number5 quantities are inadequate for the task of representing AC quantities. We know that voltages6 add when sources are connected in series in AC circuits, but we also know that it is possible to connect a 3-volt AC source in series with a 4-volt AC source and end up with 5 volts total voltage (3 + 4 = 5)! Does this mean the inviolable and self-evident rules of arithmetic have been violated? No, it just means that the rules of Real numbers do not apply to the kinds of quantities encountered in AC circuits (and many other uses of electronics), where every variable has both a magnitude7 and a phase8. We need a different definition of numerical quantity for AC circuits (complex numbers, rather than real numbers), and along with this different definition of numbers comes a different set of rules telling us how they relate to one another. 


An expression such as “3 + 4 = 5” is nonsense within the scope and definition of real numbers, but it does fit within the scope and definition of complex numbers (think of a right triangle with opposite and adjacent sides of 3 and 4, with a hypotenuse of 5). Because complex numbers are two-dimensional, they are able to “add” with one another trigonometrically as single-dimension “real” numbers cannot. 
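
This is easy to verify numerically if we treat each AC voltage as a complex phasor. The short sketch below is a worked example rather than a circuit simulation, and it assumes a 90-degree phase difference between the two sources:

```python
import cmath

# Two series AC sources, assumed 90 degrees out of phase for this example.
v1 = cmath.rect(3, 0)             # 3 volts at 0 degrees
v2 = cmath.rect(4, cmath.pi / 2)  # 4 volts at 90 degrees

total = v1 + v2                   # complex (phasor) addition
print(abs(total))                 # 5.0 (within floating-point rounding)
```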


Computer logic is like mathematics in that its rules depend on how we define what a proposition9 is. The Greek philosopher Aristotle founded a system of logic based on only two types of propositions: true and false. His bivalent (two-mode) definition of truth led to the four foundational laws of logic: the Law of Identity10 (A is A); the Law of Non-contradiction11 (A is not non-A); the Law of the Excluded Middle12 (either A or non-A); and the Law of Rational Inference13. These laws function within the scope of logic where a proposition is limited to one of two (binary) possible values, but they may not apply where propositions can hold values other than "true" or "false." In fact, much work is currently being done on "multivalued," or fuzzy logic14, in which propositions may be true or false to a limited degree; related non-binary ideas also arise in quantum computing15. In such a system of logic, "laws" such as the Law of the Excluded Middle simply do not apply, because they are founded on the assumption of bivalence. Likewise, many premises16 which would violate the Law of Non-contradiction in Aristotelian logic have validity in "fuzzy" logic. Again, the defining limits of propositional values determine the laws describing their functions and relations. 
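
As a small demonstration (my own sketch, using the common min/max "Zadeh" operators for fuzzy logic), the Law of the Excluded Middle holds when truth values are limited to 0 and 1 but fails as soon as truth comes in degrees:

```python
# Fuzzy truth values range over [0, 1]. The complement and OR below
# are the common Zadeh operators, assumed here for illustration.
def fuzzy_not(a): return 1 - a
def fuzzy_or(a, b): return max(a, b)

# Bivalent logic: "A or not-A" is always fully true.
for a in (0, 1):
    assert fuzzy_or(a, fuzzy_not(a)) == 1

# Fuzzy logic: a half-true proposition breaks the excluded middle.
a = 0.5
print(fuzzy_or(a, fuzzy_not(a)))  # 0.5, not 1
```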


English mathematician George Boole17 sought to give symbolic form to Aristotle’s system of logic. Boole wrote a treatise on the subject in 1854, titled An Investigation of the Laws of Thought, on Which Are Founded the Mathematical Theories of Logic and Probabilities18, which codified several rules of relationship between mathematical quantities limited to one of two possible values: true or false, 1 or 0. His mathematical system became known as Boolean algebra. 


All Boolean operations have only one of two possible outcomes: either 1 or 0. It is a world in which all other possibilities are invalid. This is not the kind of math you want to use when balancing a checkbook or calculating current through a resistor. However, Claude Shannon19, while at the Massachusetts Institute of Technology (MIT), recognized how Boolean algebra could be applied to on-and-off electrical circuits, where all signals are characterized as either "high" (1) or "low" (0). His 1938 thesis, titled A Symbolic Analysis of Relay and Switching Circuits20, put Boole's theoretical work to use in a way Boole never could have imagined, giving us a powerful mathematical tool for designing and analyzing digital circuits. 
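
Shannon's insight is easy to restate in code. In this hedged sketch (switch states modeled as booleans; the function names are mine), two switches in series conduct only when both are closed, which is Boolean AND, while two switches in parallel conduct when either is closed, which is Boolean OR:

```python
# Model a switch as True (closed, conducting) or False (open).

def series(s1, s2):
    return s1 and s2   # current flows only through both switches: AND

def parallel(s1, s2):
    return s1 or s2    # current flows through either branch: OR

# Print the truth table the relay circuits physically compute.
for s1 in (False, True):
    for s2 in (False, True):
        print(s1, s2, "series:", series(s1, s2), "parallel:", parallel(s1, s2))
```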


George Boole gave us binary (two-mode) mathematics. Claude Shannon applied binary mathematics to simple on-off electrical circuits. This marriage of binary mathematics and simple electric circuits formed the foundation of all our modern-day computer theory.


1 Wikipedia contributors, "Claude Shannon," Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/w/index.php?title=Claude_Shannon&oldid=698772147 (accessed January 19, 2016).

2 Wikipedia contributors, "Integer," Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/w/index.php?title=Integer&oldid=698453838 (accessed January 6, 2016).

3 Wikipedia contributors, "Counting," Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/w/index.php?title=Counting&oldid=697913915 (accessed January 6, 2016).

4 Wikipedia contributors, "Alternating current," Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/w/index.php?title=Alternating_current&oldid=695480927 (accessed January 6, 2016).

5 Wikipedia contributors, "Real number," Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/w/index.php?title=Real_number&oldid=697498365 (accessed January 6, 2016).

6 Wikipedia contributors, "Voltage," Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/w/index.php?title=Voltage&oldid=692987865 (accessed January 6, 2016).

7 Wikipedia contributors, "Magnitude (mathematics)," Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/w/index.php?title=Magnitude_(mathematics)&oldid=693909999 (accessed January 6, 2016).

8 Wikipedia contributors, "Phase (waves)," Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/w/index.php?title=Phase_(waves)&oldid=694474924 (accessed January 6, 2016).

9 Wikipedia contributors, "Proposition," Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/w/index.php?title=Proposition&oldid=689631127 (accessed January 6, 2016).

10 Wikipedia contributors, "Law of identity," Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/w/index.php?title=Law_of_identity&oldid=690498138 (accessed January 6, 2016).

11 Wikipedia contributors, "Law of noncontradiction," Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/w/index.php?title=Law_of_noncontradiction&oldid=697624609 (accessed January 6, 2016).

12 Wikipedia contributors, "Law of excluded middle," Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/w/index.php?title=Law_of_excluded_middle&oldid=690696224 (accessed January 6, 2016).

13 Law of Rational Inference: A=B & B=C, Therefore A=C. These laws of logic or thought were not invented by man. These laws are universal, absolute, and invariant because these have their primary existence in the mind of God. In order to understand what you’re reading at this moment you are utilizing these laws. In understanding the words you see you are using the law of identity. But in using that law you are also using both the law of excluded middle and noncontradiction for an understanding of the meaning of the words individually and in context. And in understanding the overall meaning of what I have written you are utilizing the law of rational inference.

14 Wikipedia contributors, "Fuzzy logic," Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/w/index.php?title=Fuzzy_logic&oldid=696573395 (accessed January 6, 2016).

15 Wikipedia contributors, "Quantum computing," Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/w/index.php?title=Quantum_computing&oldid=698269363 (accessed January 12, 2016).

16 Wikipedia contributors, "Premise," Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/w/index.php?title=Premise&oldid=686324742 (accessed January 6, 2016).

17 Wikipedia contributors, "George Boole," Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/w/index.php?title=George_Boole&oldid=698167395 (accessed January 6, 2016).

18 Boole, George. An Investigation of the Laws of Thought on Which are Founded the Mathematical Theories of Logic and Probabilities. Project Gutenberg, Release Date: February 16, 2005 [EBook #15114]. PDF: http://www.gutenberg.org/files/15114/15114-pdf.pdf (accessed January 6, 2016).

19 Wikipedia contributors, "Claude Shannon," Wikipedia, The Free Encyclopedia, https://en.wikipedia.org/w/index.php?title=Claude_Shannon&oldid=698192619 (accessed January 7, 2016).

20 Shannon, Claude 1940. A Symbolic Analysis of Relay and Switching Circuits http://dspace.mit.edu/bitstream/handle/1721.1/11173/34541425-MIT.pdf?sequence=2 (accessed January 7, 2016).


 

What is Technology?

We throw the word technology around a lot today. What does it mean, and why do we use it so much? Those of you who are old enough to know what life was like before computers might understand what I'm getting at. In the 1960s, when I was growing up, an automobile had an electrical system for starting the engine, supplying a spark for combustion, and powering the lights, radio and heater. Today that electrical system has become automotive technology. It seems that everything today has become some kind of technology. We had technologies in the past (mechanical, electrical, plumbing, carpentry, etc.), but it has only been recently that we have started using the word technology so pervasively.

The word technology comes from the Greek root techno-, which in ancient Greece meant art, skill, craft, method, or system, and the Greek -ology, indicating science or study1. Before the 20th century, the term technology usually referred to the description or study of the Useful Arts2, which were concerned with the skills and methods of practical subjects such as manufacturing and craftsmanship. Useful art is an antonym of the performing arts and the fine arts.

I think I could say that everyone uses some kind of art, skill or craft every day. Just picking out the clothes you will wear is part art and part skill; dressing correctly is a useful art. Driving a car to work involves skill, another useful art. As you can see, a dictionary definition of technology doesn't really tell us much about what it is today.

Through the ages our technology has been based on many different things. First there were our human senses and muscles, then came primitive hand tools, then horse or ox power, running water, steam, internal combustion, and now, electricity. Also through the ages, our knowledge base of technology has been increasing through the application of ideas, life experience and new information to what we already knew. Today we gain knowledge much faster than we can fully understand it or even catalog it for future learning.

Archeology gives us a glimpse of early technologies -- arrowheads, knife blades and hide scrapers made from flint rock. If you just randomly banged flint against another rock, you might eventually make a usable tool, and that may very well be how the development of stone tools started. But to make a tool efficiently, early man learned techniques for the angle and strength of his blows, which allowed him to make his tools faster and better. And once a tool is made, man still has to learn how to use it efficiently and effectively.

I think a better definition of modern technology is this: technology is human knowledge used to create tools, develop processes, and produce materials and systems that solve problems and benefit society.

Technology is about taking action to meet a human need, rather than merely understanding the workings of the natural world, which is the goal of science. Technology is much more than just scientific knowledge. It includes values (numerical, theoretical and practical) as much as facts, and practical craft knowledge and art as well as theoretical knowledge. Technology also involves organized ways of doing things. It covers the intended and unintended interactions between products (machines, devices, artifacts) and the people and systems that make them, use them, or are affected by them through various processes.

There is no one master discipline called Technology. Today, technical matters are threaded through almost every discipline, but the diversity of science, mathematics and art does not allow for a single discipline of technology. The term technology is just a wide "catch-all" phrase, and everyone has their own way of understanding its meaning.


 

One With Technology

What does “one with technology” mean? Being "one with technology" generally refers to a deep integration or harmonious relationship between humans and technology. This concept can encompass several aspects:

  1. Seamless Interaction: It implies that technology is so well integrated into daily life that it feels natural and intuitive. Users can interact with devices and software effortlessly, often without conscious thought.

  2. Enhanced Capabilities: Individuals may feel empowered by technology, using it to enhance their abilities, productivity, and creativity. This can include using tools for communication, learning, and problem-solving.

  3. Digital Immersion: It can also refer to a state of being immersed in digital environments, such as virtual reality or augmented reality, where the boundaries between the physical and digital worlds blur.

  4. Philosophical Perspective: On a more philosophical level, being "one with technology" can suggest a belief in the potential of technology to augment human experience and existence, leading to a symbiotic relationship where both humans and machines evolve together.

  5. Dependence and Adaptation: It may also reflect a societal trend where individuals become increasingly dependent on technology for various aspects of life, adapting their behaviors and lifestyles around technological advancements.

Overall, this phrase captures the evolving relationship between humans and technology. It suggests that individuals or societies have adapted to and embraced technological advancements to the extent that technology has become an essential part of their lives, rather than a separate or intrusive element.

How can we achieve a harmonious relationship between humans and technology? The answer lies in our ability to create and use technology ethically and collaboratively. While the potential of technology is vast, its impact on us depends on how we use it. The tools of its creation are in our hands. Once created, if we use technology wisely, it can be more than just an engine for progress; it can be the engine that drives a new era in human evolution, one marked by compassion, creativity, and connection.

Tuesday, August 5, 2025

Where to Begin?

Imagine, if you will, a world without computers. What picture would your imagination paint? From first grade through high school my tools were a notebook full of lined paper, a pencil, a pen and an eraser. I learned from a human teacher writing on a blackboard with chalk or lecturing, and I studied by reading books. Twenty years later, when I finally entered college, I had to have a computer, an internet connection and an email address, and much of my instruction came across the internet. Twenty years after I graduated from college, it is hard to purchase anything, pay a bill, take an airline flight, or even order dinner without a cellphone (a handheld computer with internet and international telephone connections).


In the introduction to his book The Universal Machine, Ian Watson said, "In less than one lifespan the computer has transformed almost everything in our society." In less than one lifespan, our culture and society have changed so quickly that our comprehension of what is 'normal' cannot keep pace with the ever-evolving technology driving those changes. Our culture has become whatever is 'trending' on social media and Netflix. Our society has gone from the interaction of humans to the interfacing of social media on the internet. Our technology, which started out as a tool for mankind, has become its ruler.

 

Technology has become so pervasive in our culture and society that we cannot live without it. If you are saying, "Yes, I can live without technology," try this: turn off your computers, cellphones and all your 'smart' devices for one week. Could you do it? Would you know how to open the garage door so you could go to work? Would you be able to start your car? How would you communicate? Pay bills? Would you be able to cook your own meals? If you are a student, would you be able to participate in classes? Would you be able to call the hospital in case of an emergency?

 

Human evolution is driven by the recombination of genes. Human creativity is driven by a recombination of curiosity, imagination, ideas, knowledge, research, science and mathematics.

 

Technology evolves through the application of our ever-evolving human mind and the recombination of the products of human creativity, making us, in a sense, one with technology.

 

As long as man has been alive, he has dreamed of making a better world for himself and future generations. It is man's dreams, needs and wants that spark his imagination and the thought processes that eventually produce technology. Has our human-created technology made our world better, or have we become, as Ray Kurzweil predicted in his book The Singularity Is Near, "one with our technology"?

 

I believe that today, technology and humans influence each other to the point that a change in one causes a change in the other: a harmonious man-machine evolution, if you will.

Before going any further, please take a moment and ask yourself these 4 questions:

  1. Can humanity survive without technology?

  2. Can technology survive without humanity?

  3. Do my answers to the previous 2 questions make humanity one with technology?

  4. What does the answer to question 3 hold for the future of humanity?

Technology has empowered us to solve complex challenges, but it is also challenging us to rethink the very essence of humanity. Today we say that our machines can think, learn, and create. Can they? Are they in fact intelligent? To truly enjoy the benefits of technology while retaining our humanity, we must be deliberate and mindful in our approach to creating and using technology. As we develop systems, we must answer deep philosophical questions with ethical answers: What does it mean to think? What is intelligence? How does one become intelligent? What does it mean to be human? These debates will help us evolve not only technologically but morally, causing us to prioritize human values and connections in our lives rather than isolating individuals and devaluing their importance to society.

 

In order to answer the philosophical questions that will influence the technological discoveries of our future, we need to understand how the technological discoveries of the past have influenced society, ethics and reason. Knowing the how and why of past discoveries opens new roads to new discoveries, both now and in the future. Inherent in each discovery, past and present, are the science, philosophy, scientific method and social attitudes of previous discoveries. Our Heritage of Technology will go back to the beginnings of philosophy and science. But first, we must take a look at the language of both science and technology – Numbers.