What is a Computer?

A computer is an electronic machine that receives and processes data. The term comes from the English "computer", which in turn derives from the Latin "computare" (to calculate).

What is a PC (Personal Computer)?

Also known by the French-derived term "ordinateur", it consists of a series of integrated circuits and other related components (hardware) that allow the execution of instruction sequences or routines specified by the user or by another program.

For a computer to work, it needs programs (software) that supply the instructions for processing data. Once the desired information has been obtained, it can be used internally or transferred to other computers or electronic components.



Since the beginning of humanity, people have tried to calculate more accurately and quickly: by 3000 BC, calculations were made with fingers and movable beads. Around 2500 BC, the abacus, perhaps the first mechanical calculating device, was developed in China, and its effectiveness has stood the test of time. Around 2000 BC, the first formulation of the binary system appeared in the Chinese "I Ching", or Book of Changes.

In 600 BC, the Greek astronomer, mathematician, and philosopher Thales of Miletus described some aspects of static electricity; the word "electron", which today denotes the negative particle of the atom, derives from the Greek word for amber. Leonardo da Vinci (1452-1519) left among his notes and sketches the design of a mechanical calculator.

In the 17th century, the French mathematician Blaise Pascal built the first mechanical calculating machine, considered a direct precursor of the digital computer. The device, created in 1642, was named the Pascaline. It used a set of toothed wheels, each with ten teeth representing the digits 0 to 9. The wheels were linked so that, by advancing the correct number of teeth, they could add and subtract numbers.
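The carry principle behind Pascal's linked wheels can be illustrated with a short sketch (a simplified model for illustration, not a description of the actual mechanism):

```python
# Simplified model of Pascaline-style addition: each "wheel" holds one
# decimal digit (0-9); advancing a wheel past 9 carries into the next.
def add_on_wheels(wheels, position, amount):
    """Advance the wheel at `position` by `amount`, propagating carries.
    `wheels` lists digits from least to most significant."""
    wheels = list(wheels)
    wheels[position] += amount
    while position < len(wheels) and wheels[position] > 9:
        wheels[position] -= 10          # the wheel completes a full turn...
        position += 1
        if position < len(wheels):
            wheels[position] += 1       # ...and nudges the next wheel by one
    return wheels

# 58 + 7 = 65; wheels store digits least-significant first
print(add_on_wheels([8, 5, 0], 0, 7))  # [5, 6, 0]
```

As in the real machine, a carry past the last wheel is simply lost.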

Around 1670, the German philosopher and mathematician Gottfried Wilhelm Leibniz improved on Pascal's machine, building a calculator that could also multiply and divide.

At the beginning of the 19th century, the English mathematician and inventor Charles Babbage worked out the principles of the modern digital computer. He designed a series of machines, such as the Difference Engine, intended to solve complex mathematical problems. Babbage and his assistant, the English mathematician Augusta Ada Byron, are considered the true inventors of the modern digital computer. The technology of their time could not put their sound concepts into practice; however, one of his designs, the Analytical Engine (1833), already had many features of the modern computer: an input stream in the form of a deck of punched cards, a memory for storing data, a processor for mathematical operations, and a printer for permanent records.

In 1847, the British mathematician George Boole developed a new type of algebra (Boolean algebra) and began the study of symbolic logic. His algebra is a method of solving logic problems using the binary values 1 and 0 and three operators: AND, OR, and NOT. Thanks to this binary algebra, binary code was later developed as the language used by all computers.
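Boole's three operators can be expressed directly in a few lines of Python (a simple illustration; the function names are ours):

```python
# Boole's three operators on binary values (1 = true, 0 = false).
def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 1 - a

# Truth table for the two binary operators -- the basis of digital logic.
for a in (0, 1):
    for b in (0, 1):
        print(f"a={a} b={b}  AND={AND(a, b)}  OR={OR(a, b)}")
print("NOT 0 =", NOT(0), "  NOT 1 =", NOT(1))

# Binary-coded arithmetic reduces to these operators: e.g. a half adder.
def half_adder(a, b):
    # sum bit is XOR, built here from AND/OR/NOT; carry bit is AND
    return AND(a, b), OR(AND(a, NOT(b)), AND(NOT(a), b))  # (carry, sum)

print(half_adder(1, 1))  # (1, 0): 1 + 1 = binary 10
```

The half adder shows why this algebra became the language of computers: any arithmetic on binary numbers can be built from combinations of the three operators.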

In the 19th century, the Frenchman Joseph Marie Jacquard used thin perforated wooden plates to control the weaving of complex fabric designs in his automatic loom. The idea of using punched cards to process data was taken up in the 1880s by the American statistician Herman Hollerith, who compiled the statistical information for the 1890 United States census with a system that passed punched cards over electrical contacts. The census was completed in just 3 years, inaugurating automatic data processing with savings of around $5,000,000.

   First Computers

At the beginning of the 20th century, the first analog computers, which made calculations using rotating shafts and gears, began to be built. They were used to evaluate numerical approximations of equations too difficult to solve by other methods.

During World Wars I and II, analog computing systems were used to predict torpedo trajectories in submarines and to remotely control bombs in aviation.

The first fully electronic computer was designed by a team of scientists and mathematicians working north of London during World War II. Named Colossus, it contained 1,500 vacuum tubes, or valves, and by December 1943 was fully operational, decrypting encrypted German radio messages.

In 1944, the first American electromechanical computer (built from relays and switches), the Mark I, began operating. Howard Aiken had started its design and construction in 1939. The Mark I received and delivered information on perforated tape and took one second to perform ten operations. It was 18 meters long and 2.5 meters high. The Mark II and Mark III were built later.

ENIAC (Electronic Numerical Integrator And Computer), a thousand times faster than its electromechanical predecessors, was completed in 1945. It weighed 30 tons, covered an area of 450 square meters, filling a 6 m x 12 m room, contained 18,000 vacuum tubes, and had to be programmed manually by setting more than 6,000 switches on its panels. Unlike today's computers, which run on the binary system (0, 1), ENIAC operated on the decimal system (0, 1, 2, ..., 9).

ENIAC's program was wired into the processor and had to be changed manually; entering a new program was a tedious process that took days or even weeks. ENIAC's successor added program storage based on the concepts of the Hungarian-American mathematician John von Neumann, which gave computers enormous flexibility and reliability, making them faster and less error-prone.

Translator programs were the next big step in computer development because they allowed people to communicate with computers in something other than binary numbers. In 1952, Grace Murray Hopper, a US Navy officer, developed the first compiler, a program that translates English-like statements into machine-readable binary code; her work later formed the basis of COBOL (COmmon Business-Oriented Language).

In the late 1950s, the transistor came into widespread use in computers, providing logic elements that were smaller, faster, and more versatile than valve machines allowed. Because transistors consumed less power and lasted longer, their development led to more complex machines, known as second-generation computers. Shrinking the components and the gaps between them also reduced manufacturing costs.

The appearance of the integrated circuit in the late 1960s made it possible to fabricate several transistors on a single silicon substrate, with the interconnecting wires plated in place. This reduced the price, size, and error rates of new computers.

The microprocessor became a reality in the mid-1970s with the arrival of LSI (Large-Scale Integration) and later VLSI (Very-Large-Scale Integration) circuits.


This evolution is commonly divided into generations of computers.

   First Generation

The technology of this generation was based on large, heavy vacuum valves that overheated and needed frequent replacement. These tube-based computers were huge, produced a lot of heat, and consumed a lot of electricity.

Internal storage used a rapidly rotating drum onto which a read/write device placed magnetic marks. Data input and output used a special code on punched cards or perforated tape, so information processing was slow and sequential. In this generation, the binary system began to be used to represent data.

Among the first systems of this generation was the UNIVAC I, used in the 1950 United States census. IBM and Remington Rand established themselves as leaders in computer production; despite being expensive and of limited use, the machines were quickly accepted by private companies and government agencies.

The most successful first-generation computer was the IBM 650, of which several hundred were produced. It used a secondary-memory scheme called the magnetic drum, the ancestor of today's disks.

   Second Generation

The invention of the transistor led to computers that were faster, smaller, and more reliable than their vacuum-valve predecessors. They also needed less ventilation because they produced less heat.

Data was stored on magnetic drums and tapes, although compatibility problems persisted between different systems. Programming also improved: writing a program no longer required a full understanding of the computer's hardware, and programs written for one system could be moved to another with minimal effort. COBOL was by then commercially available, and other languages such as FORTRAN appeared.

New output peripherals, such as printers, appeared during this generation, and computers came into commercial use for airline reservation systems, air traffic control, and general-purpose simulations.

Although costs were still high, companies began using computers for record-keeping tasks such as inventory management, payroll, and accounting. The U.S. Navy used second-generation machines to build Whirlwind I, the first flight simulator.

In this generation, Honeywell positioned itself as IBM's leading competitor on the market, as part of the BUNCH group (Burroughs, Univac, NCR, CDC, Honeywell).

   Third Generation

The emergence of integrated circuits or chips enabled the transition to third-generation computers.

A chip groups thousands of electronic components on a small silicon wafer, at a far greater level of miniaturization than discrete transistors allowed. As a result, processing speed and equipment compatibility improved significantly.

The new systems shrank in size, released less heat, and saved more energy. Computer manufacturers were able to increase the flexibility of their programs and standardize their models.

The new capabilities allowed multiprogramming: for the first time, computers could carry out several tasks at once, such as data processing and mathematical analysis. Storage systems improved, and the software industry emerged, developing new programming languages such as Pascal, BASIC, and Logo.

The miniaturization of components led to the development of minicomputers, which, although they had appeared during the second generation, reached their peak in the 1960s and 1970s.

One of the first commercial computers to use integrated circuits was the IBM 360, which dominated third-generation computer sales after its release in 1965. Digital Equipment Corporation's PDP-8 was the first minicomputer.

   Fourth Generation

The emergence of the microprocessor made fourth-generation computers possible. Microprocessors are integrated circuits of high density and impressive speed. Another development of this generation was the replacement of magnetic-core memories with silicon chips.

These advances, the result of the microminiaturization of electronic circuits, made personal computers (PCs) possible; cheaper production and expansion into the consumer market extended computer use to individual users.

Computers became accessible from the home, unlike in the previous generation, and the fact that advanced programming skills were no longer required contributed to the broad demand for this generation's machines.

   Fifth Generation

Given the accelerated pace of microelectronics, industry has taken on the task of developing the software and systems with which computers now operate.

International competition for dominance of the market has emerged. Work is under way to enable communication with systems in everyday language rather than through special codes or control languages.

Microcomputers, supercomputers, communication networks, and expert systems have continued to develop.

Priority is given to artificial-intelligence research that seeks to apply the human thought processes used in problem-solving to computer systems. This is a driving force in robotics: robots designed with artificial intelligence can respond more effectively to unstructured situations.

Technology in Cuba

The first computers to arrive in Cuba, then called "punched-card accounting machines", were imported from two companies with different technologies, IBM and Remington Rand. The former became more common, leading to the creation of an IBM branch office in Cuba to serve the Caribbean and Central American countries.

Compared with the rest of the world, Cuba was an early adopter of such equipment, which was first used there around 1920.

In the final months of 1958, the first electronic system arrived in Cuba: one of the first IBM RAMACs marketed outside the United States. It was a first-generation machine, since its electronics were based on vacuum valves and tubes and its programming was quite primitive.

In 1963, a British computer was purchased: the Elliott 803-B, classified as second generation because its electronics were based on transistors. It used magnetic film as an external storage medium. In 1968, two French second-generation SEA-4000 computers (from CII) were purchased to process the information from the 1970 population and housing census.

From 1970, various variants of the low-end IRIS 10 model produced by CII were installed in universities, production support centers (CAI), and other organizations.

Systems Built in Cuba

One of the machines built in Cuba was the CID-201, a minicomputer for scientific problems with a 12-bit word, ferrite-core memory, and a capacity of 4 kilowords. The next model in the CID family was the 201-B, which reached a higher speed (50,000 additions per second) and a larger internal memory capacity (up to 32 K words of 12 bits each).


In 1972, EC-series mainframes were purchased: models 1020, 1022, 1030, 1035, 1040, 1045, 1050, 1055, and 1060 were used, first in ministries and central organizations, and later in computing centers for public use.

IBM operating systems such as DOS and OS began to be studied in Cuba, along with accompanying basic software: the COBOL, FORTRAN, and PL/1 programming languages; American database management systems such as TOTAL, IDMS, and DB2; and business management systems such as BOMP and CICS.


On December 23, 1984, a computing and electronics hall opened at the Ernesto Guevara Pioneers' Palace, equipped with CID 300/10 computers connected to televisions and cassette recorders, with built-in BASIC and smart keyboards.

The first microcomputers arrived at the end of this period through donations and individual purchases made by officials abroad. With that arrival an era closed, and Cubans went on to discover a different information technology.

The Young Computer and Electronics Club (JCCE)

The JCCE was founded on September 8, 1987, to contribute to the computerization of Cuban society.

With more than 6,000 computers, the clubs place technological resources, print media, image digitization, and the storage and reproduction of large volumes of information at everyone's disposal. These services have benefited more than a million people.

The mission of these centers is to play an active, creative, value-forming role in the computerization of Cuban society, providing the community with an informatics culture that gives priority to children and young people.

The University of Informatics Sciences (UCI)

The University of Informatics Sciences (UCI), which began its activities in the 2002-2003 academic year with an annual enrollment of 2,000 students, is a new type of university born of the Battle of Ideas waged by the Cuban people. Its innovative educational model combines study with production and research.

From 2004 onward, it established an effective presence in the software industry, developing projects for the computerization of society and for export.

Software and Hardware

PCs consist of two basic elements: hardware and software.


Hardware comprises the physical components of the machine: everything visible and tangible. Hardware performs four main activities: input, processing, output, and secondary storage.

Within the hardware, one can clearly distinguish between internal components and external input/output peripherals. Internal components include the motherboard, which interconnects the computer's parts; the processor, with all the elements that make up the CPU; and storage devices such as RAM and the hard disk.

External peripherals are divided into input and output devices. Input devices include the mouse, keyboard, webcam, and scanner; output devices include the printer, monitor, and speakers. Some devices, such as modems and touch screens, handle both input and output.


The second element that makes up the PC is the software: the set of programs and instructions that allow the hardware to be used and controlled; in other words, the programs that make a PC work.

These include the operating system, which manages the PC's functions and runs programs, and user applications, such as antivirus programs or text editors, which the user installs and which generally run on top of the operating system.


Since the first computer models appeared, the technologies used in them have changed extensively. However, the vast majority still follow the concepts published by the mathematician John von Neumann in the mid-1940s.

The architecture outlined by von Neumann defines four basic sections connected by conducting channels called buses.


   Memory

Memory is a set of numbered storage cells, each holding a bit, the basic unit of information. These cells contain the data needed to execute instructions.

   Input/Output Devices

These devices allow information to enter the computer and the results it generates to be communicated. A wide variety of input/output devices exists.

The two remaining elements of the von Neumann architecture form part of the processor, also called the Central Processing Unit (CPU).

   Arithmetic Logic Unit

This is the part of the computer responsible for performing arithmetic and logical operations, as well as relational or comparison operations between data.

   Control Unit

Also called the intelligent part of the microprocessor, it distributes each operation to the appropriate unit for execution. It tracks the memory addresses of the instructions the computer will execute, fetches each instruction, and sends the data to the arithmetic logic unit to perform the corresponding operation. It then transfers the result back to memory and moves on to the next instruction.
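The cycle described above, in which the control unit fetches instructions from numbered memory cells, dispatches work to the ALU, and stores results back in memory, can be sketched in a few lines of Python. The instruction set, encoding, and memory layout below are invented purely for illustration:

```python
# Minimal sketch of the von Neumann fetch-decode-execute cycle.
# Program and data share one memory, as in the architecture itself.
def run(memory):
    pc = 0  # program counter: address of the next instruction
    while True:
        op, dst, a, b = memory[pc]          # fetch the instruction cell
        pc += 1
        if op == "HALT":                    # decode + execute
            return memory
        elif op == "ADD":                   # dispatched to the ALU
            memory[dst] = memory[a] + memory[b]
        elif op == "SUB":
            memory[dst] = memory[a] - memory[b]

# Cells 0-2 hold code, cells 3-5 hold data.
memory = [
    ("ADD", 5, 3, 4),       # cell 5 <- cell 3 + cell 4
    ("SUB", 5, 5, 4),       # cell 5 <- cell 5 - cell 4
    ("HALT", 0, 0, 0),
    10, 4, 0,               # data cells 3, 4, 5
]
print(run(memory)[5])  # 10 + 4 - 4 = 10
```

Because programs live in the same memory as data, a new program is loaded simply by writing different cells, which is exactly the flexibility the stored-program concept introduced.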

Usage and Health

Continuous use of PCs can harm users' health. Among the problems caused by intensive computer use are eye strain, fatigue, stress, and weight gain. These can be aggravated by a poorly arranged workstation, including the placement of the monitor, keyboard, and chair.

Screens impose new demands on lighting and vision; the most common physical complaint of people who spend long hours in front of a monitor is tired eyes.

Undoubtedly, the best solution is to reduce the hours spent in front of the machine; when that is not possible, the next best thing is to take frequent breaks and learn to use the equipment correctly.

