Across the History Curriculum. Part 10
“[Computer science] is not really about computers—and it’s not about computers in the same sense that physics is not really about particle accelerators, and biology is not about microscopes and Petri dishes … and geometry isn’t really about using surveying instruments. Now the reason that we think computer science is about computers is pretty much the same reason that the Egyptians thought geometry was about surveying instruments: when some field is just getting started and you don’t really understand it very well, it’s very easy to confuse the essence of what you’re doing with the tools that you use.” (Hal Abelson; American professor of electrical engineering and computer science at the Massachusetts Institute of Technology; 1947-.)
“In their capacity as a tool, computers will be but a ripple on the surface of our culture. In their capacity as intellectual challenge, they are without precedent in the cultural history of mankind.” (Edsger Dijkstra; Dutch computer scientist; 1930-2002.)
Electronic digital computers are likely an important and routine part of your everyday life. You can explore your own history and find a number of examples of how computer technology has changed and is changing your world. Your PK-12 students do not have such a perspective. Powerful and routinely used computers and computerized devices have been part of their entire lives.
What is an electronic digital computer? Perhaps I should have answered that question in the first of the newsletters in this series. But, you and all of my readers have used computers (such as a Smartphone) for years. So, you have first-hand experience with computers and can provide an answer that is satisfactory to you.
This newsletter poses an analogy between capabilities of computers and capabilities of human beings. This analogy can help as you and your students begin to explore some of the history of electronic digital computers.
Here is an interesting aside. Historically, a computer was a person who was skilled at doing calculations, often with the aid of a mechanical or electric calculator. Perhaps you have seen Hidden Figures, the 2016 movie about three African-American women who worked as human computers at NASA (Wikipedia, 2020b, link). Learn more about them in the article, Women Made the Apollo Moon Landing Possible — Here Are the Crucial Technologies and Calculations They Contributed (Brueck, 7/20/2019, link).
You probably know that women are underrepresented in employment in the computer fields. As a history teacher, you are in a position to raise this underrepresentation (sex discrimination) issue from an historical point of view. You also can address this issue in terms of racial and other forms of discrimination. You can stress the importance of the issue and broaden your students’ insights into this problem.
Once electronic digital computers came into widespread use, the word computer took on a new meaning.
Here is a definition that I memorized many years ago:
A computer is a machine for the input, storage, processing, and output of information. (Of course, nowadays a computer is much more than this.)
Now, here is a parallel statement about human beings that I have written:
A human being is a flesh and blood creature designed for the input, storage, processing, and output of information. (Of course, a human being is much more than this.)
So, the two statements provide an analogy between computers and humans. I really enjoy analyzing this analogy, and I find it a useful aid in exploring the history of computers.
First, a little bit more about today’s computers. A computer program is a step-by-step set of instructions that can be stored in a computer to direct the computer as it solves a problem or accomplishes a task. Once a computer program has been developed, copies can be made and loaded into other computers. I like to think of this as the fact that, once we teach one computer how to solve a particular problem, we can quickly and cheaply teach thousands or millions of other computers to solve that same type of problem. We are, in some sense, educating the computer by teaching it how to solve a problem.
Contrast that with educating a human being. A newborn child has considerable knowledge and skills that are genetically passed from parents to child. But, after that, learning is an individual task. Even if a child’s parents are quite skilled at reading and writing, the child has to start from scratch in learning to read and write.
Here is an example from elementary school mathematics. Suppose that the information to be processed by computers and humans consists of numerical data that needs to be processed by a sequence of arithmetic (add, subtract, multiply, divide) operations on multi-digit decimal numbers. The math taught in a typical elementary school includes learning to do such arithmetic operations by hand. The speed and accuracy achieved varies greatly among students. How long does it take you to do a paper-and-pencil long division of one 6-digit number divided by a second 6-digit number? How long does it take to teach a student to do this type of arithmetic operation, and at what grade level should this be taught? You probably believe it should be taught. Do you believe that this will or should be in the curriculum a hundred years from now? Think about arguments to support your position.
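The by-hand procedure discussed above is itself an algorithm, so a computer can be taught it directly. Here is a minimal Python sketch of the elementary-school long-division procedure, working through the dividend one digit at a time; the specific numbers are merely illustrations:

```python
def long_division(dividend: int, divisor: int):
    """Divide digit by digit, the way long division is taught by hand,
    returning (quotient, remainder)."""
    quotient = 0
    remainder = 0
    for digit in str(dividend):               # "bring down" one digit at a time
        remainder = remainder * 10 + int(digit)
        quotient = quotient * 10 + remainder // divisor
        remainder = remainder % divisor
    return quotient, remainder

# A computer finishes a 6-digit division in microseconds; by hand it
# can take a person several minutes.
print(long_division(987654, 123456))   # (8, 6), since 987654 = 8 * 123456 + 6
```

Once this short set of instructions exists, it can be copied to millions of other computers at essentially no cost, which is exactly the contrast with human learning described above.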
The first computers were designed specifically to be able to do such arithmetic calculations. In the early 1950s, the first commercially available electronic digital computers could do about a thousand arithmetic calculations per second. People thought of these computers as being lightning fast. In 2020, less than 70 years later, the world’s fastest computer was 200 trillion (that is, 200,000 billion) times as fast (CNN Wire, 3/19/2020, link):
The novel coronavirus presents an unprecedented challenge for scientists: The speed at which the virus spreads means they must accelerate their research.
But this is what the world’s fastest supercomputer was built for.
Summit, IBM’s supercomputer equipped with the “brain of AI,” ran thousands of simulations to analyze which drug compounds might effectively stop the virus from infecting host cells.
The supercomputer identified 77 of them. It’s a promising step toward creating the most effective vaccine.
This provides an example of why we need superfast computers and is an important piece of the history of computers. I find it unbelievable that in 70 years we went from a very useful and fast computing device to one that is 200,000 billion times as fast! I enjoy taking such numbers and applying them to other tools that humans have developed. For example, consider the Model T car that began production in 1908. It had a top speed of about 40 to 45 miles per hour. Mass production of millions of the Model T and other early cars certainly changed the world. Now, consider a vehicle that is 200,000 billion times as fast as the Model T. Its speed would be millions of times the speed of light! Einstein’s theory of relativity tells us that we cannot travel faster than the speed of light. A spaceship moving just a little less than the speed of light could take off from the earth, circle it a couple of times, fly to the moon, circle it a couple of times, and return to earth—all in less than five seconds!
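For readers who want to check these figures, here is a rough back-of-the-envelope calculation in Python. All of the distances and speeds are round approximations, not precise values:

```python
# Rough check of the speed comparisons in the text.
SPEED_OF_LIGHT_MPH = 670_616_629           # about 6.7e8 miles per hour
model_t_mph = 45                           # approximate Model T top speed
scaled = model_t_mph * 200_000e9           # 200,000 billion times as fast
print(scaled / SPEED_OF_LIGHT_MPH)         # roughly 13 million times light speed

# The five-second trip: twice around the earth, out to the moon and
# twice around it, and back, at just under light speed (~300,000 km/s).
trip_km = 2 * 40_075 + 2 * 384_400 + 2 * 10_921
print(trip_km / 300_000)                   # about 2.9 seconds
```

The distances used are the earth’s equatorial circumference (about 40,075 km), the average earth-moon distance (about 384,400 km, traveled out and back), and the moon’s equatorial circumference (about 10,921 km).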
Of course, being able to do by-hand arithmetic calculations is only one, rather modest capability of a human being. However, there is an important message in this story. Throughout history, humans have developed a wide variety of tools to supplement and/or enhance their physical and cognitive capabilities.
Progress in astronomy provides a good example. Stonehenge allowed a person who did not know how to read or write to determine when the Summer Solstice was occurring (Stonehenge, n.d., link).
The purpose of Stonehenge is astronomical. It is carefully aligned so that, if one sits at the center, one has a clear view of the summer-solstice sun rising over the heel stone. Such monuments are fairly common, such as Nabta or Karnak in Egypt, Teotihuacan in Mexico, Moose Mountain in Saskatchewan, Medicine Wheel in Wyoming, or scores of stone rings found in Britain and western Europe.
In terms of information processing, the development of reading and writing was a very major breakthrough—a very good aid to the human brain. The value of reading and writing is so great that it is a major focus in our schools. Spend some time thinking about reading and writing as an aid to a human’s innate abilities to input, store, process, and output information. This will help you to understand why reading and writing remains such an important part of the school curriculum.
The development of reading and writing was a major world-changing human invention. Our education systems have had more than 5,000 years of experience in teaching reading and writing, and they have certainly improved in this task. Still, each individual child faces this long and challenging learning task. Our DNA does not pass this knowledge and skill from one generation to the next.
In addition to reading, writing, and paper-and-pencil arithmetic, there are many other tasks that humans can learn to do by dint of a number of years of schooling and practice. Today, computers can do an increasing number of these tasks both faster and more accurately than humans. As the capabilities and availability of computers continue to increase, this leaves educators with the problem of examining the information processing capabilities of computers and deciding to what extent children should learn by-hand methods of accomplishing the same tasks.
An alternative to some of this schooling is for students to learn to make effective use of computers in those parts of the curriculum where today’s computers are routinely being used by practitioners in the discipline. This will free up more student time to be spent on posing and understanding problems, and learning to make effective use of what they are learning.
In summary, prehumans and humans have developed a great many tools that aid their physical and cognitive capabilities. Some of these tools allow humans to do things (such as travel more quickly than by walking or running, and make arithmetic calculations more rapidly) that they could not do before. Historically, informal and formal education prepared humans to work with the tools they had developed, rather than to compete with them. Today’s schools are making progress in preparing students to work with computers, rather than compete with them. But we are making slow progress, and this is at a time when the capabilities of computers are continuing to increase quite rapidly.
What should history teachers at the precollege level know about computer science and the history of computers? This same type of question can be asked about each discipline of study, as each discipline has its own history of the developing uses of computers within their disciplines.
I like to consider this question from two points of view that have been discussed in previous newsletters in this series:
- Computers are a major change agent in our world, perhaps as important as development of reading and writing.
- Computers provide a variety of aids to learning, analyzing, and using historical data and information.
And, I like to think more broadly than just computer technology. How about all technology? How has the development of technology changed the world? No matter which specific aspects of history you teach, you can raise the question of what technologies existed or were developed during that historical time period and in that place. A related topic is to investigate ways that the new technology affected the people living there, as well as people in the rest of the world at that time.
This reminds me of a quotation from Thomas Huxley, the well-known English writer, spoken more than a hundred years ago, “Try to learn something about everything and everything about something.”
The totality of human knowledge is overwhelmingly large, as is the totality of knowledge in each well-established discipline of study. So, nowadays Huxley’s suggestion is absolutely impossible to follow. But, the Web stores and provides access to a significant portion of this accumulated human knowledge. This reminds me of a poignant quotation from Frederick Douglass, the freed American slave who became an ardent abolitionist, orator, and writer, “Once you learn to read, you will be forever free.”
Nowadays, to read includes reading the Web, and making effective use of its multimedia content. I believe Douglass’s statement would have been even more powerful if he had said reading and writing. Nowadays, both reading and writing include creating and using multimedia content.
To help you ponder my question on what history teachers should know about computer science, I suggest you reread the first of the two quotations that begin this newsletter. As a well-educated and mature adult, you have insights into the meaning of that quotation that are well beyond those of most of your students.
This is an important observation. Every student you work with has some specific knowledge and experiences that you don’t have. For example, it is quite likely that you teach students who know more than you about use of computers for social networking, online dating, game playing, creating and using a music play list, and so on.
But, you have greater knowledge and experience in the disciplines you teach, as well as in teaching and learning these disciplines. You have considerably more life experience. When you encounter a student whose content knowledge and skills exceed yours in a topic relevant to what you are teaching, take advantage of this. Learn from your student and have your student help the whole class.
Now, reread the second quotation given at the beginning of this newsletter. What does the phrase, “the cultural history of mankind” mean to you? I was reminded of a 1987 book by E.D. Hirsch, et al., Cultural Literacy: What Every American Needs to Know (Hirsch, et al., 1988, link). An appendix of the book lists 5,000 pieces of information that “every American needs to know.”
As I started to write this IAE Newsletter, I spent quite a bit of time thinking about what every precollege history teacher needs to know about computers as a part of our culture, and what we need to teach our students about this history. Here are some questions for you to think about:
- What do you know about computers as part of our current culture and as a current intellectual challenge?
- What are some aspects of computers that you believe relate to being an up-to-date history teacher?
- What do your students know about computers as part of our current culture and as a current intellectual challenge?
- What are some things that you believe your students should know about computers that relate to having a good, modern history education?
It is easy to extend such a list. For example, what do your students’ parents know, what do your fellow teachers know, what knowledge does your school or school district expect you to have, and what are the professional societies for teachers telling you?
In the remainder of this newsletter and in the next newsletter I share some tidbits of information that you may want to add to your own knowledge base. The following is one example I picked up from browsing a variety of Web articles.
A 1985 Control Data Corporation supercomputer, the fastest computer in the world at its time, cost (adjusted for inflation) about $30 million in year 2020 dollars, and had 1/10 of the computing power of today’s (year 2020) $1,000 Smartphone. Hmm. What does it feel like to carry around a device with greater capabilities than a $30 million room-sized computer from the past? I explored a somewhat similar topic in a 2017 IAE Blog entry, In Terms of Vacuum Tube Dollars, Likely You Are a Billionaire (Moursund, 5/8/2017).
I believe it is important to understand how this great decrease in cost paired with the considerable increase in performance has affected people. It is easy to say or write that during your lifetime, the price-to-performance ratio of computers has decreased exponentially. Hmm. Do you and your students know what it means for something to increase or decrease exponentially? This is a topic from mathematics that is now being frequently included in discussions about the coronavirus and other major changes that are affecting our world.
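Here is a rough Python sketch of what that exponential decrease looks like, using the supercomputer-versus-Smartphone figures above. The dates and dollar amounts are the approximations given in the text, so the result is only illustrative:

```python
import math

# $30 million bought 1/10 the computing power of a $1,000 smartphone,
# so price per unit of computing fell roughly 300,000-fold, 1985-2020.
improvement = (30_000_000 / 1_000) * 10    # 300,000-fold improvement
years = 2020 - 1985                        # 35 years
doublings = math.log2(improvement)
print(doublings)                  # about 18.2 halvings of cost
print(12 * years / doublings)     # cost halved roughly every 23 months
```

A quantity that halves at a fixed interval, as sketched here, is exactly what "decreasing exponentially" means.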
When we discuss or refer to an historical event, it needs to have a name or an identifier. This section contains three important, named parts of the history of computers.
Gordon Moore and Moore’s Law
Perhaps you have heard of Gordon Moore, the co-founder of Intel Corporation who developed Moore’s Law. For many years, Moore’s Law has served as a useful aid in forecasting exponential increases in computer processing speed and exponential decreases in the cost of processing a given amount of computer data (Tardi, 9/5/2010, link).
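As a rough illustration, here is a quick Python calculation relating the figures used earlier in this newsletter (about 1,000 calculations per second in the early 1950s, and a 200-trillion-fold speedup roughly 70 years later) to the doubling pattern that Moore's Law describes. The figures are approximations, so treat the result as a sketch:

```python
import math

# How many doublings does a 200-trillion-fold speedup represent,
# and how often would speed have had to double over ~70 years?
speedup = 200e12                  # 200 trillion times as fast
years = 70
doublings = math.log2(speedup)
print(doublings)                  # about 47.5 doublings
print(12 * years / doublings)     # a doubling roughly every 18 months
```

A doubling roughly every 18 months is consistent with the common statements of Moore's Law.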
This statement about Gordon Moore and Moore’s Law illustrates an important issue in the history of computers. Is it important to know about Intel Corporation? Is it important to know about Gordon Moore? Is it important to know and understand Moore’s Law?
Contrast these specific areas of possible knowledge with the following more general statements:
- Over the years, the capabilities of computers have increased rapidly and their price decreased substantially. These changes supported a very rapid growth in the use of computers.
- The development and production of computers and their components became a very large industry, both in the United States and in many other countries throughout the world.
- Many individual people and research groups made discoveries that fostered both of the trends listed above.
I find such general ideas much easier to remember and use than the specific details of a particular person or event.
John McCarthy and Artificial Intelligence
Let me give another example—one from the history of Artificial Intelligence (AI) (Wikipedia, 2020c, link):
John McCarthy is one of the “founding fathers” of artificial intelligence, together with Alan Turing, Marvin Minsky, Allen Newell, and Herbert A. Simon. McCarthy coined the term “artificial intelligence” in 1955, and organized the famous Dartmouth conference in Summer 1956. This conference started AI as a field. [Bold added for emphasis.]
AI is one of the most important aspects of computer science. Most people use the identifier AI or Artificial Intelligence rather than the identifier John McCarthy’s theory of computer intelligence.
My 5/13/2020 Google search of the expression Moore’s Law produced about 20 million results. This suggests that the name Moore’s Law is a widely used identifier. My 5/13/2020 Google search of the expression Artificial Intelligence produced about 809 million results, and my search of the term Machine Intelligence, which is used more widely in Europe, produced about 697 million results. My search of the name John McCarthy produced a still quite respectable 96 million results. The identifiers Artificial Intelligence and Machine Intelligence have clearly won this naming contest.
As another naming example, my 5/13/2020 Google search of the expression Computer Assisted Learning produced about 133 million results. Contrast this with my search of the expression HIICAL (Highly Interactive Intelligent Computer Assisted Learning), an identifier that I created nearly 20 years ago and mentioned in the previous IAE Newsletter. HIICAL produced only 2,780 results, and this identifier has never been widely adopted.
Alan Turing and the Turing Test
In this third naming example, an important concept was named for its developer. What is the Turing Test? Researchers in the field of artificial intelligence are making progress in having a computer carry on a protracted, intelligent conversation with a human.
So, here is an important tidbit of the history of computers (Wikipedia, 2020a, link):
Alan Mathison Turing (23 June 1912–7 June 1954) was an English mathematician, computer scientist, logician, cryptanalyst, philosopher, and theoretical biologist. Turing was highly influential in the development of theoretical computer science, providing a formalization of the concepts of algorithm and computation with the Turing machine, which can be considered a model of a general-purpose computer. Turing is widely considered to be the father of theoretical computer science and artificial intelligence. [Bold added for emphasis.]
For years there was an annual Turing Test contest (Wikipedia, 2020d, link):
The Loebner Prize is an annual competition in artificial intelligence that awards prizes to the computer programs considered by the judges to be the most human-like. The format of the competition is that of a standard Turing test. In each round, a human judge simultaneously holds textual conversations with a computer program and a human being via computer. Based upon the responses, the judge must decide which is which. [Bold added for emphasis.]
In 1950, Alan Turing posed the task of creating a conversationalist computer so skilled that human judges could not tell whether they were conversing with a computer or with another human. The test, now known as the Turing Test, was originally conducted through written communication at a computer keyboard.
A more recent idea is to make use of oral communication rather than a computer keyboard. The term chatbot has come into widespread use to denote a computer system that can carry on an oral or written conversation with a person (Chatbot, 3/22/2020, link). The current annual Turing Test contest is a chatbot contest (Wikipedia, 2020d, link). In the chatbot contest, the winner is the computer system that is best at carrying on a full-blown conversation with a human. Thus, the focus has changed from having a computer try to imitate a human being to selecting the best imitator from among a large number of computer entries.
This change in emphasis goes along with the current major efforts to develop chatbots that can converse knowledgeably and effectively with humans about the products and services that a company offers. Think of contacting the Web-based “help” feature of an online store. You would like to talk to a very knowledgeable human being. But, it is quite expensive for the store to provide this level of personal service. As an alternative, there has been a strong movement toward having you first interact with a chatbot (Hao, 5/14/2020, link):
While call centers have long been a frontier of workplace automation, the pandemic has accelerated the process. Organizations under pressure are more willing to try new tools. AI firms keen to take advantage are sweetening the incentives. Over the last few years, advances in natural-language processing have also dramatically improved on the clunky automated call systems of the past. The newest generation of chatbots and voice-based agents are easier to build, faster to deploy, and more responsive to user inquiries. Once adopted, in other words, these systems will likely be here to stay, proving their value through their ease of use and affordability. [Bold added for emphasis.]
In summary, this long section on providing names or identifiers for historical events has provided brief introductions to Gordon Moore and Moore’s Law, John McCarthy and Artificial Intelligence, Alan Turing and the Turing Test, and the chatbot.
These identification examples suggest a fun game to use with your students. Have the students name some of the most important topics they have learned about in previous history courses and/or in their current course. Determine the number of results you obtain by Web searches of these topics. For younger students, this can be a whole class activity, with you doing the Web searches. Older students can divide into small groups, each group with a Web-connected computer to do the activity independently.
As a class, discuss and analyze the results. For example, does having a higher number of search engine results provide good evidence that a topic is more important to be studying than is a topic with a lower number of search engine results? Have students try to find topics that will produce a very large number of search engine results, and others that will produce a very small number. This can make an interesting contest among students or groups of students.
This newsletter provides you with a way to think about the history of computer science through an analogy between human capabilities and computer capabilities. Both humans and computers have considerable capabilities and limitations. We humans improve our knowledge and skills through life experiences, informal education, and formal education. This is a lifelong process, and each person travels their own individual path. As the totality of accumulated human knowledge continues its rapid growth, the percentage of it that any person can master becomes smaller and smaller. However, by dint of specialization and long, hard work, a person can carve out a niche of high-level knowledge and skill.
Computers gain increased knowledge and skill through the combined efforts of thousands of electrical engineers, programmers, computer scientists, mathematicians, and experts in the various areas in which computers are found to be useful. Some of this progress can be incorporated into existing computers, and some leads to the development of more capable computers.
People are particularly good at understanding the human condition—what it means to be a human with our problem-posing capabilities, creativity, emotions, and thoughts. We far exceed computers in this regard. However, there is a steadily growing increase in the number of things that computers can do better than humans. Thus, we humans now face a lifelong challenge of learning to work effectively with the steadily increasing capabilities of computers, rather than to compete with them in areas that are well suited to computer capabilities.
The next newsletter continues our exploration of the many ways that computers are changing our world.
Abelson, H. (2006). What is computer science? YouTube. Retrieved 5/16/2020 from https://www.youtube.com/watch?v=zQLUPjefuWA.
Brueck, H. (7/20/2019). Women made the Apollo moon landing possible — Here are the crucial technologies and calculations they contributed. Business Insider. Retrieved 5/10/2020 from https://www.businessinsider.com/apollo-11-women-made-moon-landing-possible-2019-7.
Chatbot (3/22/2020). What is a chatbot? Why are chatbots important? ExpertSystem. Retrieved 5/13/2020 from https://expertsystem.com/chatbot/.
CNN Wire (3/19/2020). The world’s fastest supercomputer identified chemicals that could stop coronavirus from spreading, a crucial step toward a vaccine. Cable News Network, Inc., a Time Warner Company. Retrieved 5/27/2020 from https://q13fox.com/2020/03/19/the-worlds-fastest-supercomputer-identified-chemicals-that-could-stop-coronavirus-from-spreading-a-crucial-step-toward-a-vaccine/.
Hao, K. (5/14/2020). The pandemic is emptying call centers. AI chatbots are swooping in. MIT Technology Review. Retrieved 5/15/2020 from https://www.technologyreview.com/2020/05/14/1001716/ai-chatbots-take-call-center-jobs-during-coronavirus-pandemic/.
Hirsch, E.D., Kett, J.F., & Trefil, J.S. (1988). Cultural literacy: What every American needs to know. NY: Vintage Books.
Stonehenge (n.d.). Introductory astronomy: Stonehenge. wsu.edu. Retrieved 5/16/2020 from http://astro.wsu.edu/worthey/astro/html/im-lab/stonehenge/stonehenge.html.
Tardi, C. (9/5/2010). Moore’s Law. Investopedia. Retrieved 5/29/2020 from https://www.investopedia.com/terms/m/mooreslaw.asp.
Wikipedia (2020a). Alan Turing. Retrieved 5/26/2020 from https://en.wikipedia.org/wiki/Alan_Turing.
Wikipedia (2020b). Hidden figures. Retrieved 5/8/2020 from https://en.wikipedia.org/wiki/Hidden_Figures.
Wikipedia (2020c). John McCarthy. Retrieved 5/16/2020 from https://en.wikipedia.org/wiki/John_McCarthy_(computer_scientist).
Wikipedia (2020d). Loebner Prize 2019: Results. Retrieved 5/13/2020 from https://en.wikipedia.org/wiki/Loebner_Prize.
David Moursund is an Emeritus Professor of Education at the University of Oregon, and editor of the IAE Newsletter. His professional career includes founding the International Society for Technology in Education (ISTE) in 1979, serving as ISTE’s executive officer for 19 years, and establishing ISTE’s flagship publication, Learning and Leading with Technology (now published by ISTE as Empowered Learner). He was the major professor or co-major professor for 82 doctoral students. He has presented hundreds of professional talks and workshops. He has authored or coauthored more than 60 academic books and hundreds of articles.