A common misconception about computers is that they are smarter than humans. Actually, the degree of a computer's intelligence depends on the speed of its ignorance. Today's complex computers are not really intelligent at all. The intelligence is in the people who design them. Therefore, in order to understand the intelligence of computers, one must first look at the history of computers, the way computers handle information, and, finally, the methods of programming the machines.
The predecessor to today's computers was nothing like the machines we use today. The first known computer was Charles Babbage's Analytical Engine, designed in 1834. (Constable 9) It was a remarkable device for its time. In fact, the Analytical Engine required so much power, and was so far beyond the manufacturing methods of the time, that it could never be built.
No more than twenty years after Babbage's death, Herman Hollerith designed an electromechanical machine that used punched cards to tabulate the 1890 U.S. Census. His tabulating machine was so successful that he formed a company to supply it, the firm that would later grow into IBM. (Constable 11) The computers of those times worked with gears and mechanical computation.
Unlike today's chip computers, the first computers were room-filling electromechanical and early electronic machines that could not be programmed in any modern sense. No one would ever confuse the limited power of those early machines with the wonder of the human brain. An example was the ENIAC, or Electronic Numerical Integrator and Computer. It was a huge, room-sized machine, designed to calculate artillery firing tables for the military. (Constable 9) ENIAC was built with more than 19,000 vacuum tubes, nine times the number ever used in a single device before. The internal memory of ENIAC was a paltry twenty decimal numbers of ten digits each. (Constable 12) (Today's average home computer can hold roughly 20,480 times this amount.)
Today, the chip-based computer easily packs the power of more than 10,000 ENIACs into a silicon chip the size of an infant's fingertip. (Reid 64) The chip itself was invented, independently, by Jack Kilby and Robert Noyce in 1958, but their crude devices looked nothing like the sleek, paper-thin devices common now. (Reid 66) The first integrated circuit had but four transistors and was half an inch long and narrower than a toothpick. Chips found in today's PCs, such as the Motorola 68040, cram more than 1.2 million transistors onto a chip half an inch square. (Poole 136)
The ENIAC was an extremely expensive, huge and complex machine, while PCs now are shoebox-sized gadgets costing but a few thousand dollars. Because of the incredible miniaturization that has taken place, and because of the seemingly "magical" speed at which a computer accomplishes its tasks, many people look at the computer as a replacement for the human brain. Once again, though, the computer can only accomplish its amazing feats by breaking down every task into its simplest possible choices.
Of course, the computer must receive, process and store data in order to be a useful tool. Data can be text, programs, sounds, video, graphics, etc. Some devices for entering data are keyboards, mice, scanners, pressure-sensitive tablets, or any instrument that tells the computer something. The keyboard is the most popular input device for entering text, commands, programs, and the like. (Tessler 157) Newer computers which use a GUI (pronounced "gooey"), or Graphical User Interface, utilize a mouse as the main device for entering commands. A mouse is a small tool with at least one button on it and a small tracking ball at the bottom. When the mouse is slid across a surface, the ball tracks the movement and sends the information to the computer, which moves the pointer on the screen accordingly. (Tessler 155) A pressure-sensitive tablet is mainly used by graphic artists to easily draw with the computer. The artist uses a special pen to draw on the large tablet, and the tablet sends the data to the computer.
Once the data is entered into the computer, it does no good until the computer can process it. This is accomplished by the millions of transistors compressed into the thumbnail-sized chip in the computer. These transistors are not at all randomly placed; they form a sequence, and together they make a circuit. A transistor alone can only turn on and off. In the "on" state, it will permit electricity to flow; in the "off" state, it will keep electricity from flowing. (Poole 136) However, when all the microscopic transistors are interconnected, they have the ability to control, manipulate, and move data according to the condition of other data. A computer's chip is so ignorant, it must use a series of sixteen transistors and two resistors just to add two and two. (Poole 141) Nevertheless, this calculation can be made in just a microsecond, an example of the incredible speed of the PC.

The type of chip mainly used now is known as a CISC, or Complex Instruction Set Chip. (Constable 98) Newer workstation-variety computers use the RISC type of chip, which stands for Reduced Instruction Set Chip. While the "complex" type might sound better, the architecture of the RISC chip permits it to work faster. The first generation of chip was called SSI, or Small Scale Integration. SSI chips have fewer than one hundred components. (Reid 124) The period of the late 1960s is known as the era of MSI, or Medium Scale Integration. MSI chips range from one hundred to one thousand components each. (Reid 124) LSI, or Large Scale Integration, was used primarily in the 1970s, each chip containing up to ten thousand components. Chips used in the 1990s are known as VLSI, or Very Large Scale Integration, with up to a million or more components per chip. In the not-so-distant future, ULSI, or Ultra Large Scale Integration, will be the final limit of the miniaturization of the chip. The transistors will then be on the atomic level and the interconnections will be one atom apart. (Reid 124) Because further miniaturization is not practical, "parallel" systems that split jobs among hundreds of processors will become common in the future.
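The claim that a chip needs a whole web of transistors just to add two and two can be illustrated in software. The sketch below is a hypothetical model, not a circuit taken from the essay's sources: it builds AND, OR, and XOR gates out of simple on/off values and chains them into a standard textbook ripple-carry adder, the same break-everything-into-simplest-choices strategy described above.

```python
# A minimal sketch of how bare on/off switches combine into addition.
# The gate functions and ripple-carry structure are standard textbook
# constructions, used here purely for illustration.

def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def full_adder(a, b, carry_in):
    """Add three single bits; return (sum_bit, carry_out)."""
    partial = XOR(a, b)
    sum_bit = XOR(partial, carry_in)
    carry_out = OR(AND(a, b), AND(partial, carry_in))
    return sum_bit, carry_out

def add_bits(x, y, width=4):
    """Ripple-carry addition of two small integers, one bit at a time."""
    carry, result = 0, 0
    for i in range(width):
        a = (x >> i) & 1
        b = (y >> i) & 1
        s, carry = full_adder(a, b, carry)
        result |= s << i
    return result

print(add_bits(2, 2))  # 4: "two and two" computed from nothing but gates
```

Every `+` a program ever executes bottoms out in hardware like this; the speed comes from doing millions of such gate evaluations per second, not from any cleverness in the individual switch.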
Once data is entered and processed, it will be lost forever if it is not stored. Computers can store information in a variety of ways. The computer's permanent instructions, which it uses for basic tasks such as system checks, are stored in ROM, or Read Only Memory. Programs, files, and system software are stored on either a hard disk or floppy disk in most systems.
The hard disk and floppy disk function similarly, but hard disks can hold much more information. They work by magnetizing and demagnetizing small areas on a plastic or metal platter. The "read" head then moves along the tracks to read the binary information. When the program or file being read is opened, it is loaded into RAM (Random Access Memory), where it can be quickly accessed by the processor. RAM comes in small chips called SIMMs, or Single Inline Memory Modules. RAM is much faster than a disk drive because it has no moving parts. The information is represented by either a one or a zero, and this amount of information is called a bit. (Constable 122) Four bits make a nybble, and two nybbles make a byte. One byte can hold one character, such as "A" or "?". 1024 bytes make a kilobyte, 1024 kilobytes make a megabyte, 1024 megabytes make a gigabyte, and 1024 gigabytes make a terabyte. Most personal computers have approximately eighty or so megabytes of hard drive space and either two or four megabytes of RAM on average. Most ROM on PCs is about 256 kilobytes.
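The bit-to-byte arithmetic above can be checked in a few lines of code. This is a generic illustration using the binary convention (each unit 1024 times the previous); the character encoding shown, ASCII, is an assumption, since the essay does not name one.

```python
# One character fits in one byte: eight bits, i.e. two four-bit nybbles.
char = "A"
code = ord(char)              # the ASCII value of "A" is 65
bits = format(code, "08b")    # "01000001" - eight bits, one byte
high_nybble, low_nybble = bits[:4], bits[4:]
print(bits, high_nybble, low_nybble)

# The storage units from the text, each 1024 times the previous:
KILOBYTE = 1024
MEGABYTE = 1024 * KILOBYTE
GIGABYTE = 1024 * MEGABYTE
TERABYTE = 1024 * GIGABYTE
print(MEGABYTE)               # 1048576 bytes in one megabyte
```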
Machine language is the way all computers handle instructions: the simple, one or zero, yes or no, true or false Boolean logic necessary for computers. (Reid 122) Boolean logic was invented by George Boole, a poor British mathematician born in 1815. His new type of logic was mostly ignored until makers of computers, more than a century later, realized his was the ideal system of logic for the computer's binary system. Machine code is the only programming "language" the computer understands. Unfortunately, the endless and seemingly random strings of ones and zeros are almost incomprehensible to humans.
Not long after computers such as ENIAC came along, programmers began to develop simple mnemonic "words" to stand in the place of the crude machine code. The words still had to be changed into machine code to be run, though. This simple advancement greatly helped the programmers with their tasks. Even with these improvements, the process of programming was still a mind-boggling task.
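The mnemonic "words" described above can be sketched with a toy translator. Both the three-instruction vocabulary and its opcode table below are invented for illustration; real instruction sets are far larger and machine-specific.

```python
# Toy assembler: human-readable mnemonics stand in for raw binary opcodes.
# The mnemonics and opcode bit patterns here are hypothetical.
OPCODES = {"LOAD": "0001", "ADD": "0010", "STORE": "0011"}

def assemble(lines):
    """Translate mnemonic lines like 'ADD 3' into 8-bit binary strings."""
    machine_code = []
    for line in lines:
        mnemonic, operand = line.split()
        # 4 opcode bits followed by a 4-bit operand
        machine_code.append(OPCODES[mnemonic] + format(int(operand), "04b"))
    return machine_code

program = ["LOAD 2", "ADD 3", "STORE 7"]
print(assemble(program))  # ['00010010', '00100011', '00110111']
```

The programmer writes `ADD 3` instead of memorizing `00100011`, but the machine still receives only the ones and zeros.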
The so-called high-level languages are the type used for programming in the 90s. Rarely is there ever a need today for programming in machine code. A high-level language works by having its English-based commands converted into machine code by a translator program. (Constable 122) There are two types of translator programs: compilers and interpreters. A compiler converts the entire program into machine code before it runs. An interpreter is only capable of converting one line at a time.
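The compiler/interpreter distinction can be sketched with a toy one-command language. Everything below is hypothetical, invented for illustration: the "compiler" translates the whole program before anything executes, while the "interpreter" translates and executes one line at a time.

```python
# Toy language: each line is "DOUBLE <number>", which doubles the number.
# Both functions and the language itself are invented for illustration.

def compile_program(source):
    """Translate every line up front; nothing runs until translation is done."""
    translated = [int(line.split()[1]) for line in source]  # whole program first
    def run():
        return [n * 2 for n in translated]                  # then execute
    return run

def interpret_program(source):
    """Translate and immediately execute one line at a time."""
    results = []
    for line in source:
        n = int(line.split()[1])   # translate this one line...
        results.append(n * 2)      # ...and execute it right away
    return results

source = ["DOUBLE 1", "DOUBLE 2", "DOUBLE 3"]
run = compile_program(source)
print(run())                       # [2, 4, 6]
print(interpret_program(source))   # [2, 4, 6]
```

Both routes produce the same answer; the difference is when the translation work happens, which is exactly what drives the speed and debugging trade-offs discussed below.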
The first compiled language was Fortran. Fortran became quite popular after its release in 1957 and is still used for some purposes to this day. Cobol is another high-level compiled language that has been used widely in the business world from 1960 until now. A compiler must be run before a program can be executed; it translates the program into the ones and zeros of binary machine code. There are many compiled languages used today, such as C, and Pascal, the latter named for the French genius Blaise Pascal. These two languages are the most popular high-level languages used for application development.
The interpreted languages are better suited to home computers than to business needs; they are less powerful, but much simpler to use. An interpreted language is translated into machine code and sent to the processor one line of code at a time. The first popular interpreted language was BASIC, or Beginner's All-purpose Symbolic Instruction Code, written by John Kemeny and Thomas Kurtz at Dartmouth College. BASIC is still a much-used language, and is included free with many PCs sold today. BASIC was the first programming language to use the INPUT command, which allows the user to input information into the program as it is running. (Constable 29) Another newer and less popular interpreted language is Hypertalk, a language that is very English-like and easy to understand. It is included free with every Macintosh computer.
There are advantages and disadvantages to both the compiled and the interpreted languages. The interpreted languages lack speed; however, because they are translated as they run, they are very easily "debugged," or fixed and changed. A programmer using a compiled language must wait for the compiler to translate the program into machine code before trying it out, and must recompile after every change. With an interpreted language, on the other hand, the ease of modification comes at the price of slower performance and limited capabilities.
The history of computers, the way computers handle information, and the methods of programming all confirm that computers will never be as intelligent as the people who design them.