Introduction to ICTing and Mathing Across the History Curriculum. Computer Cultural Literacy: Part B

David Moursund
Professor Emeritus, College of Education
University of Oregon
This free Information Age Education Newsletter is edited by Dave Moursund and Ann Lathrop, and produced by Ken Loge. The newsletter is one component of the Information Age Education (IAE) and Advancement of Globally Appropriate Technology and Education (AGATE) publications.
All back issues of the newsletter and subscription information are available online. A number of the newsletters are available in Spanish on the AGATE website mentioned above.
My most recent free book, The Future of AI in Our Schools, was published last month. This short 75-page book is available free online at https://moursundagatefoundation.org/2018/07/the-fourth-r-second-edition/. In my personal opinion, this is the most important and best book I have ever written. Please share the book with others you think might enjoy it. It currently is available only in English, but soon will be available in Spanish as well.
Another of my recent free books is titled Computer Cultural Literacy for Educators. It is available in both an English version and a Spanish version (Versión en español).

An earlier free book, The Fourth R (Second Edition), is available free in both English and Spanish (Moursund, link). The unifying theme of the book is that the 4th R of Reasoning/Computational Thinking is fundamental to empowering today’s students and their teachers throughout the K-12 curriculum.
These recent books have now had a combined total of about 150,000 page-views. My various free publications, available on the Web with no paid ads, have had a total of approximately 20 million page-views over the past dozen years.
Introduction to ICTing and Mathing
Across the History Curriculum.
Computer Cultural Literacy: Part B
“We have ignored cultural literacy in thinking about education. We ignore the air we breathe until it is thin or foul. Cultural literacy is the oxygen of social intercourse.” (E.D. Hirsch, Jr.; American educator and academic literary critic; 1928-.)

“The achievement of high universal literacy is the key to all other fundamental improvements in American education.” (E.D. Hirsch, Jr.; American educator and academic literary critic; 1928-.)

Introduction

This is a continuation of IAE Newsletter #286, which introduced the topic of computer cultural literacy. I have renamed that newsletter Computer Cultural Literacy: Part A. The planned second part of that two-part newsletter has grown far too long for a single issue, so I have divided it into three parts: this newsletter and the next two will be Computer Cultural Literacy: Parts B, C, and D.

The two quotes given to start this newsletter were used in Computer Cultural Literacy: Part A (Newsletter # 286). The first two paragraphs below also are copied from that newsletter.

In 1988, University of Virginia Professor E.D. Hirsch published Cultural Literacy: What Every American Needs to Know (Whitman College Library, 1988, link to PDF file). In this best-selling book, he argued that progressivist education had let down America’s students by neglecting knowledge in the form of a shared body of information. The book included a list of 5,000 facts, dates, famous people, works of literature, and concepts that he believed every American should know. His book and the list have proven to be quite popular. (Core Knowledge Foundation, 2020, link.)

Information and Communication Technology (ICT) is now a well-established part of our culture. This ICT-based culture includes a large number of computer-related facts, dates, famous people, software (computer programs), hardware (physical machines and devices), and concepts that have become integral to our overall culture. Today’s teachers and their students need to become familiar with many of them.

Newsletter #286 presented the names of some of the people who are important parts of the history of computer development and the field of computers in education. The focus of this and the next two newsletters is on computer-related technology, including hardware, software, publications, and a number of other important computer-related ideas. For each term on the list, there is a link to related information; this replaces the usual References and Resources section in the IAE Newsletters.

I suspect that a number of my readers will want to suggest additional terms they believe should be added to the lists. Please make use of the Comments feature at the end of this newsletter to present and briefly justify your suggestions. I’d also appreciate a link to more information about each term you suggest.

Terms Important in Computer Cultural Literacy

Abacus. Arithmetic calculating device invented more than 4,000 years ago. (Wikipedia, 2020, link.) The history of the abacus is a story of people developing devices to aid in the addition, subtraction, multiplication, and division of integers. Contrast the abacus with paper-and-pencil algorithms for doing such calculations. (Of course, paper and pencil did not exist when the abacus was first developed.)

Air traffic control system. Highly computerized system to coordinate and control air traffic. (Sheffield School of Aeronautics, 11/27/2019, link.)

Analogue computer. “A type of computer that uses the continuously changeable aspects of physical phenomena such as electrical, mechanical, or hydraulic quantities to model the problem being solved.” (Wikipedia, 2020, link.)

Apple, Inc. Multinational technology company founded by Steve Jobs and Steve Wozniak in 1976. The company got its start with Apple desktop computers. Its current product line includes the iPhone smart phone, the iPad tablet computer, the Mac personal computer, the iPod portable media player, the Apple Watch smartwatch, the Apple TV digital media player, the AirPods wireless earbuds, and the HomePod smart speaker. (Wikipedia, 2020, link.)

Artificial Intelligence (AI). A branch of computer science concerned with building machines capable of performing tasks that typically require human intelligence. (BuiltIn, n.d., link.) The capabilities of AI have increased rapidly in recent years. (Lauret, 7/22/2020, link.) A 2018 interview with futurist Ray Kurzweil provides an overview of a number of the current AI capabilities, issues, and possible futures. (YouTube, 3/20/2018, link to 61 minute video.)

Today’s smart phone has considerable artificial intelligence. It represents a huge step up on the intelligence scale when compared to the first handheld pocket calculator. Researchers and product developers are continuing to make considerable progress on AI-based machines that are further up on that scale.

Association for Computing Machinery (ACM). The ACM is a U.S.-based international non-profit professional society founded in 1947. It is the world’s largest scientific and educational computing society, and more than half of its current 100,000 members live outside the U.S. (Association for Computing Machinery, 2020, link.)

Automated Teller Machine (ATM). A self-service banking machine for deposits, withdrawals, and other banking activities. (McRobbie, L.R., 1/8/2015, link.) A variety of ideas for such a service were explored in the early 1960s, and the first commercial use began in London, England, in 1967. For many years, these “online tellers” did not lead to a decrease in the number of bank teller jobs. Rather, ATMs made it economically profitable for banks to open many small branch offices, and this led to a considerable net increase in the number of in-bank tellers needed. (Wikipedia, 2020, link.)

Autonomous vehicle. Mobile devices such as cars, trucks, airplanes, drones, and mobile robots that are controlled by on-board and/or remote computers. (Wired, 2020, link.) There is a huge potential market for autonomous cars, delivery vans, and larger trucks. Tesla is a world leader in this endeavor. (Trefis Team, 7/3/2020, link.)

Back-up. A duplicate copy of one’s on-computer work, or the process of creating such a copy. (Its Learning, 4/4/2018, link.)

Barcode or bar code. A method for representing data in a visual, machine-readable form, now very widely used to identify products sold in retail stores and for other item-identification purposes. Bernard Silver and Norman Joseph Woodland developed the barcode; their patent was issued on October 7, 1952. (Wikipedia, 2020, link.)

BASIC (Beginners’ All-purpose Symbolic Instruction Code). A programming language designed in 1964 by John Kemeny and Tom Kurtz at Dartmouth College to be used on a time-shared computer system. Kemeny and Kurtz wanted to enable students in fields other than science and mathematics to use computers. Fortran was the dominant programming language for use in science and mathematics at that time. (Wikipedia, 2020, link.)

Big data. A large collection of structured, semi-structured, and unstructured data that can be mined for its information content and also used in machine learning projects, predictive modeling, and other advanced analytics applications. (Rouse, October, 2019, link.) The ability and facilities to gather and effectively process very large databases is a world changer. The Web and the data collected by the CERN Large Hadron Collider are examples of VERY BIG DATA. (Gaillard, 7/6/2017, link.) By 2017, the CERN database was roughly equivalent in size to 50 full-length novels for every person on earth!
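
That comparison is easy to sanity-check. Here is a back-of-envelope calculation in Python (my own arithmetic; the 200-petabyte figure is from CERN’s 2017 announcement cited above, and the figure of about half a megabyte per plain-text novel is my assumption):

    cern_bytes = 200 * 10**15        # 200 petabytes, CERN's 2017 milestone
    people = 7.5 * 10**9             # rough 2017 world population
    novel_bytes = 0.5 * 10**6        # assumed size of one plain-text novel
    print(cern_bytes / people / novel_bytes)   # about 53 novels per person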

Binary number system (bits and bytes). The base 10 number system uses the ten digits 0, 1, … 9. The binary number system uses the two binary digits (bits) 0 and 1. A group of eight binary bits is called a byte, and is a commonly used unit of computer storage. One byte can represent any one of 256 different characters, such as lower and upper case letters, punctuation marks, and so on. (Rouse, 9/15/2006, link.)
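
Here is a minimal sketch in Python (my choice of language for illustration) showing how one byte encodes a single character:

    print(ord('A'))                  # 65, the integer stored for the letter 'A'
    print(format(ord('A'), '08b'))   # '01000001', the eight bits of that byte
    print(2 ** 8)                    # 256, the number of values one byte can hold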

Bitcoin. A digital currency created in January 2009. Functionally, it is a collection of computers, or nodes, that all run Bitcoin’s code and store its blockchain. (Frankenfield, 5/11/2020, link.)

Blog (blogger). A blog is a type of publication (typically a continuing sequence of documents) published on the Web. A blogger is a person who creates and publishes (posts) such documents. (Wikipedia, 2020, link.) The term blog dates back to the 1990s. Starting in about 2010, groups of individuals and also organizations began to write and publish blog entries.

Broadband connectivity. Connectivity to the Web ranges from very slow to very fast. The U.S. Federal Communications Commission currently defines broadband to mean at least 25 million bits per second of download speed, and 3 million bits per second of upload speed. Such specifications vary from country to country. (Wikipedia, 2020, link.)
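
To make those numbers concrete, here is a quick calculation (my own example, with an assumed file size): how long does a 5-gigabyte movie take to download at exactly the FCC’s broadband threshold?

    file_bits = 5 * 10**9 * 8        # a 5-gigabyte file, expressed in bits
    speed_bps = 25 * 10**6           # 25 million bits per second download speed
    print(file_bits / speed_bps)     # 1600.0 seconds, roughly 27 minutes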

Browser (Web browser). Computer software designed for accessing and viewing pages on the World Wide Web. Google Chrome is a prominent example. (Wikipedia, 2020, link.)

Bug (software bug). An error in a computer program. Use of the word bug to designate an error in a constructed device dates back to at least 1878, when Thomas Edison wrote in a letter, “You were partly correct, I did find a ‘bug’ in my apparatus, but it was not in the telephone proper.” On September 9, 1947, Grace Hopper found a moth that had caused a malfunction in the Harvard Mark II electromechanical computer. Over the years since then, she often has been credited with finding the first bug in a computer. (McFadden, 6/12/2020, link.)

Cell phone. A portable telephone that can send and receive calls over a radio frequency link, also called a mobile phone. It lacks a number of the features of a smart phone. (Wikipedia, 2020, link.)

Chatbot. Computer system that can carry on a conversation (or a chat) with a user in natural language by use of artificial intelligence. (Expert System, 3/17/2020, link.)

Chromebook. A laptop or tablet computer, manufactured and sold by a number of different companies, that uses the Linux-based Chrome Operating System. Chromebooks use the Google Chrome browser, with most applications and data residing in the cloud rather than on the machine itself. Chromebooks were first introduced in 2011, and by March 2018 they made up 60% of all computers purchased by schools for student use in the United States. (Wikipedia, 2020, link.)

Circuit board (printed circuit board, PCB). Used to mechanically support and electrically connect electronic components in a circuit. Nowadays, these components are generally soldered onto the board by highly automated devices. (Wikipedia, 2020, link.)

Cloud storage. A widely used term for a data storage system in which data is stored on remote servers (physically located in ground-based computer centers) and accessed using the Internet or other connectivity to users. (Techopedia, 7/18/2017, link.)

Communication satellite. “An artificial satellite that relays and amplifies radio telecommunications signals via a transponder; it creates a communication channel between a source transmitter and a receiver at different locations on Earth.” (Wikipedia, 2020, link.)

Computational thinking. Thought processes involved in analyzing a problem and expressing its solution(s) as a procedure that a computer, or a human and computer working together, can carry out. (Moursund, 2018, link.) Knowledge and skill in computational thinking is a major component of computer literacy.
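
As a small illustration (my own example, not taken from the cited book), here is the kind of step-by-step procedure that computational thinking produces, expressed in Python. It solves the everyday problem of making change with the fewest coins:

    def make_change(cents):
        """Return a list of U.S. coins totaling the given number of cents."""
        result = []
        for coin in [25, 10, 5, 1]:   # quarters, dimes, nickels, pennies
            while cents >= coin:
                result.append(coin)
                cents -= coin
        return result

    print(make_change(68))            # [25, 25, 10, 5, 1, 1, 1]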

Computer (job description). A somewhat archaic term for a person who is skilled at a professional level in using aids such as mechanical and electric calculators to rapidly and accurately carry out long sequences of arithmetic calculations. (Wikipedia, 2020, link.)

Computer (machine). “A machine that automatically carries out processes, calculations, and other operations specified by instructions in a computer program.” (Techopedia, 2020, link.)

Computer-aided learning (CAL). Originally called computer-assisted instruction (CAI) when this type of instruction was being developed by Patrick Suppes and others in the 1960s. Eventually, it became clear that the key goal is learning, rather than instruction. (Archived Information, 1993, link.) Substantial research over many years has demonstrated the effectiveness of the use of CAL in a number of different settings. An 8/9/2020 Google search of the expression research on the effectiveness of computer-assisted learning produced more than 200 million results. CAL materials vary widely in quality and effectiveness, and students vary widely in how well such materials fit their personal learning characteristics. This is an important ongoing area of research and development, and it is being strongly influenced by progress in artificial intelligence.

Computer-aided medicine. A broad term covering all aspects of using computers, robots, and artificial intelligence in medical diagnosis, gathering and processing medical data, carrying out medical procedures, dispensing medicines, and so on. (BCS Health and Care, 2020, link.)

Computer Algebra System (CAS). Any computer software with the ability to manipulate mathematical expressions in a way similar to the traditional manual algebraic computations of mathematicians and scientists. The first such systems were developed in the early 1960s. (Wikipedia, 2020, link.) A number of such systems are in current use. For example, Wolfram Alpha LLC offers both a widely used free version and commercially available versions. (WolframAlpha, 2020, link.) These make extensive use of artificial intelligence.
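
As a small taste of what a computer algebra system does, here is a sketch using SymPy, a free Python-based CAS (my choice for illustration; it is not one of the systems named above):

    from sympy import symbols, expand, factor, solve

    x = symbols('x')
    print(expand((x + 1)**2))        # x**2 + 2*x + 1
    print(factor(x**2 - 1))          # (x - 1)*(x + 1)
    print(solve(x**2 - 5*x + 6, x))  # [2, 3]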

Computer game (video game). A game played by a person interacting with a computer, whether through a handheld device, personal computer, or online connectivity. While often thought of only in terms of their entertainment uses, many computer games are designed partly or mainly for educational purposes. One of the early and widely used educational computer games is The Oregon Trail, an historical simulation developed by Don Rawitsch in 1971 that still is available in 2020. (94.3 The Point, 2020, link; Science Daily, n.d., link.)

There are a great many educational computer games. (Hopkins, 12/17/2018, link.) Some are considered to be dual purpose—being both quite educational and quite entertaining. Minecraft is a good example. (King, 1/19/2016, link; Open Education Database, n.d., link.)

Computer graphics. A branch of computer science dealing with using computers to generate still and moving images. Two- and three-dimensional computer animation, including virtual reality, have greatly changed animation processes. Virtual reality is an emerging power in both entertainment and education. (Wikipedia, 2020, link.)

Computer literacy. Nontechnical and limited-technology knowledge about computers, their applications, and how to use them. The term was widely used as computers were first becoming available to many students in K-12 schools. (Moursund, 1981, link to PDF file.)

Computer mouse (cordless mouse). A pointing device used to interact with a computer. Credited to Douglas Engelbart for his work in the mid-1960s, although key ideas were developed by others about 20 years earlier. (Wikipedia, 2020, link.)

The mouse with a cord coming out of its tail end (hence the name mouse) is still in use. However, in 1984, a wireless mouse using infrared connectivity to a computer first became commercially available. When the infrared connectivity was replaced by radio frequency, the wireless mouse quickly came into very wide use. (History-Computing, n.d., link.)

Computer music. This includes both music produced (that is, performed) by a computerized music synthesizer and original music composed by a computer. (Roads, et al., 1996, link.)

Computer network. Five types of computer networks, based on their size, are: LAN (Local Area Network); PAN (Personal Area Network); MAN (Metropolitan Area Network); WAN (Wide Area Network); and the Internet (a network of networks). (Javatpoint, n.d., link.) The idea of networking computers goes back to about 1961, but it took until 1971 before the first email message was sent and received. BITNET was created in 1981 as a network between IBM mainframe systems in the U.S.; in the same year, CSNET (Computer Science Network) was developed by the U.S. National Science Foundation. (Computer Hope, 04/02/2019, link.)

Computer operating system. Computer software that manages computer hardware and software resources, and provides common services used by a variety of computer programs. Widely used examples include Chrome OS, Linux, macOS, and Windows. (Wikipedia, 2020, link.)

Computer program (high-level programming language). A step-by-step set of instructions that can be carried out (executed) by a computer, designed to solve a type of problem or accomplish a type of task. There are many different high-level programming languages used to write computer programs, and new programming languages are developed from time to time. The first commercially available high-level programming language was Fortran (1956), and it is still widely used. LISP (1958) and COBOL (1960) are also still in use. The programming languages Logo (1968) and Scratch (2002) were both developed mainly for use by children, and both still are widely used. (Wikipedia, 2020, link.) An extensive list of other programming languages and people involved in developing programming languages is available at the same site.

Computer simulation. The process of developing a mathematical model, run on a computer, that is designed to predict the behavior and/or outcome of a real-world or physical system. For example, weather and climate-change forecasters make extensive use of computer simulations. Other examples include car driving and airplane piloting simulators. (Wikipedia, 2020, link.) Courses in modeling and simulation are taught in some high schools, as well as in many post-high-school institutions. (Modeling & Simulation 101, 8/12/2009, link to 6:17 YouTube video.)
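
Here is a toy computer simulation in Python (my own example, far simpler than a weather or flight simulator): estimate pi by “throwing darts” at random into a unit square and counting how many land inside the quarter circle.

    import random

    def estimate_pi(trials=100_000):
        hits = 0
        for _ in range(trials):
            x, y = random.random(), random.random()
            if x * x + y * y <= 1.0:   # the dart landed inside the quarter circle
                hits += 1
        return 4 * hits / trials

    print(estimate_pi())               # roughly 3.14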

Computer stylus (computer pen). A small pen-shaped instrument with a tip whose location (when positioned on a touchscreen) can be detected by the screen. Used both as a pointing device and as a drawing device. (Wikipedia, 2020, link.)

Core memory (magnetic core memory). The use of small rings of a hard magnetic material such as ferrite, with three or four wires passing through each core, to store one binary bit. The development of core memory was a huge step forward in computer technology, and core memory was widely used from about 1955 to 1975. (Wikipedia, 2020, link.)

Data (raw data). A collection of facts and/or figures. Data that has been processed, organized, structured, or presented in other manners to make it more useful is called information. (Thakur, n.d., link.)

Data processing. Data in its “raw” form is converted into more useful/organized formats as an aid to solving problems and accomplishing tasks. In many cases, this begins with getting the data into a computer-readable form. Once data is stored in a computer, it can be manipulated (processed) using a wide variety of both general purpose and special purpose software. For example, a variety of readily available programs that can be used to convert data into graphs, charts, spreadsheets, and databases are in common use. Many businesses make use of specialized programs written to fit their specific business needs. (Pearlman, 5/27/2020, link.)
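
A tiny illustration (my own example, not from the cited article): a few lines of Python turn raw sales records into the kind of summary a chart or report would use.

    raw_sales = [("apples", 3), ("pears", 2), ("apples", 5), ("pears", 1)]

    totals = {}
    for item, quantity in raw_sales:   # accumulate a total for each product
        totals[item] = totals.get(item, 0) + quantity

    print(totals)                      # {'apples': 8, 'pears': 3}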

Database. A collection of data—which may or may not be on a computer—organized so that it can easily be accessed, managed, and updated. The Web provides an excellent and humongous example. (Guru99, 2020, link.) In 2020, the Web contains about 4.2 billion Web pages that can be accessed by the widely used browsers. (Wikipedia, 2020, link.) See also Deep Web.

Debug computer software. Locate and correct errors in a computer program. (Techopedia, 2/2/2017, link.) Debugging is an important component of computer programming.
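
Here is a tiny debugging illustration (my own example). The function below contains a classic off-by-one bug; the comments mark the error and its fix.

    def total(numbers):
        result = 0
        for i in range(len(numbers) - 1):   # BUG: stops one element early
            result += numbers[i]
        return result

    print(total([1, 2, 3]))   # prints 3, not the correct 6
    # The fix: change the loop to range(len(numbers)), or simply use sum(numbers).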

Deep Web. The parts of the Web whose contents are not indexed by standard Web search engines. It is estimated to be more than 300 times as large as the part of the Web accessed by Google Search and other commonly used search engines. (Wikipedia, 2020, link.)

DIALOG. An online information storage and retrieval system developed at Lockheed in 1966, and owned today by ProQuest. It was widely used long before the development of the Web. In the 1980s, a low-priced dial-up version of a subset of DIALOG named Knowledge Index became available. Remnants of the original DIALOG system are still in use. (Wikipedia, 2020, link.)

Digital camera. A still or motion image capturing device that stores the captured images digitally. Such images can then be viewed immediately, and can be edited using computer technology. Digital cameras were developed in the mid 1970s. (Wikipedia, 2020, link.)

Digital data storage device. Used for the temporary or permanent storage of digital data. Punch cards were the first such storage medium. Other widely used storage devices include magnetic tape, hard disk drive, floppy disk, CD (compact disc), DVD (digital video disc), Blu-ray disc, flash drive, secure digital card (SD card), and solid state drive (SSD). (Goodman, 7/12/2020, link.)

Digital filing cabinet (for teachers). An electronic digital dataset specifically designed to meet the digital storage and retrieval needs of preservice and inservice teachers. (Moursund, 2016, link.)

Disk drive (hard drive, magnetic disk drive). A magnetic disk is a flat, circular platter coated with magnetizable material and used as a computer digital data storage device. A disk drive holds, spins, reads, and writes magnetic disks. First made available by IBM in 1956. (Wikipedia, 2020, link.)

Final Remarks

This list of terms is about one-third of the total list that has been created in my current writing project. The remaining two-thirds will be presented in the next two newsletters. These Final Remarks have been written to apply to the entire list.

A new word or short phrase (or a new meaning added to an existing word or short phrase) coming into a language can be thought of as an historical event, representing a tidbit of history. It illustrates the fact that any language in wide use today is a living, growing, and changing entity.

I’ll use the word singularity as an example. My doctorate is in mathematics, and singularity is an important math concept. In mathematics, a singularity is a point at which a given mathematical object is not defined, or a point where the mathematical object ceases to be well-behaved in some particular way. An example is provided by the function f(x) = 1/x. This is a well-behaved function except at the point x = 0, where it is undefined. For positive values of x as x approaches 0, the value of the function grows without bound, approaching infinity. For negative values of x approaching 0, the function values approach minus infinity.
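
For readers comfortable with calculus notation, the behavior just described is written:

    \[
    \lim_{x \to 0^{+}} \frac{1}{x} = +\infty,
    \qquad
    \lim_{x \to 0^{-}} \frac{1}{x} = -\infty .
    \]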

The previous newsletter in this series included both John von Neumann and Ray Kurzweil as important people in the history of computers. Both used the word singularity to refer to a time when computers would become very much more intelligent than humans. In yet another use of this word, the last chapter of Stephen Hawking’s doctoral dissertation, written in the 1960s, was titled Singularities and discussed the idea of the Big Bang creation of our universe. (Geach, 3/14/2018, link.) Thus, as both a math person and a computer person, I live with three vaguely related but quite different definitions of the term singularity.

The key idea is that the word singularity and its three definitions given above are all important parts of my life—of what I know, think about, and use in my communication with other people. This illustrates a very important concept in human communication. In order to communicate effectively with a person about a specific topic, I need to know quite a lot about that topic, but I also need to have a rough idea of what that person knows about the topic. This is the idea that underlies computer cultural literacy, and also applies to communication in any field of study.

My presenting you with a long list of computer-related terms, and you doing the same thing for your students, does not accomplish the goal of improving computer-related cultural literacy. Only when these terms become part of the working and thinking vocabulary used by you and your students will my goal of helping to improve computer cultural literacy have been accomplished.

While my lists in this as well as in the previous and upcoming newsletters are certainly long enough to start the reader thinking about the overall task, I am quite sure the lists are missing many important people and terms. I suggest that you talk with your students and other young people about the lists. Ask them for the vocabulary they use in talking with their friends about social networking, computer games, computer uses in schools and at-home schooling, and so on.

I have no expectation that teachers and others making use of my lists will agree with all of my choices. They may or may not decide to add parts of this historical information to their curriculum. They may or may not be motivated to learn more about specific terms I have included in my lists. They may or may not add additional terms to my lists. Moreover, I have not attempted to make a guess at the age level or grade level when it would be appropriate to introduce your students to the various terms.

Readers are strongly urged to make use of the Comments feature at the end of this newsletter to add their suggestions to both lists. Please include brief information about each person or term you want to add, with a link to help me locate more information.

Additional Resources

Rinconada, J. (9/6/2019). Most influential people in computer science. Retrieved 7/25/2020 from https://medium.com/@jrinconada/most-influential-people-in-computer-science-59fe9461c51b.

Wikipedia (2020). List of pioneers in computer science. Retrieved 7/25/2020 from https://en.wikipedia.org/wiki/List_of_pioneers_in_computer_science.

Author

David Moursund is an Emeritus Professor of Education at the University of Oregon, and editor of the IAE Newsletter. His professional career includes founding the International Society for Technology in Education (ISTE) in 1979, serving as ISTE’s executive officer for 19 years, and establishing ISTE’s flagship publication, Learning and Leading with Technology (now published by ISTE as Empowered Learner). He was the major professor or co-major professor for 82 doctoral students. He has presented hundreds of professional talks and workshops. He has authored or coauthored more than 60 academic books and hundreds of articles.