What is Intel?

Intel Corporation is one of the world's largest technology companies, best known as the creator of the x86 processor family and as the supplier of most of the processors used in PCs worldwide.


History

Intel is the world's first microprocessor company. It was founded in 1968 by Gordon E. Moore and Robert Noyce, who wanted to call the company Moore Noyce; the name sounded bad, however, so they chose an abbreviation of Integrated Electronics instead.

The company started out making memory chips before jumping into microprocessors. Through the 1970s it became the market leader thanks to its competitive DRAM, SRAM and ROM memory products.

On November 15, 1971, Intel introduced the first microprocessor, the Intel 4004, created to simplify the design of a calculator. Instead of designing a separate integrated circuit for each part of the calculator, they designed a single chip that could perform one operation or another according to a program stored in memory: a microprocessor.

Soon after, on April 1, 1972, Intel announced an improved processor, the 8008. Its main advantage over the 4004 was that it could access more memory and process data 8 bits at a time, with clock speeds of up to 800 kHz.

In April 1974, they launched the Intel 8080, with a clock speed of 2 MHz, 16-bit addressing, an 8-bit bus, and easy access to 64 KB of memory.

Shortly afterward came the first eagerly awaited personal computer, the Altair, whose name came from a destination of the starship Enterprise in an episode of the popular television series Star Trek. This computer cost around $400 at the time, and its 8080 processor, running at 2 MHz, delivered roughly ten times the performance of its predecessor and could address 64 KB of memory.

However, the personal computer did not become truly popular until IBM entered the market. Between June 1978 and 1979, Intel released the 8086 and 8088 microprocessors, which went on to power the IBM PC and its millions of sales.

The stronger of the two processors was the 8086, with a 16-bit bus, clock speeds of 5, 8 and 10 MHz, 29,000 transistors built with 3-micron technology, and a maximum of 1 MB of addressable memory.
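The addressable-memory figures quoted for these chips follow directly from the width of the address bus: n address lines can select 2^n distinct bytes. A quick sanity check in Python (illustrative only; the function name is ours):

```python
# Maximum addressable memory is determined by address-bus width:
# n address lines -> 2**n distinct byte addresses.
def addressable_bytes(address_bits):
    return 2 ** address_bits

# 8080: 16-bit addressing -> 64 KB
assert addressable_bytes(16) == 64 * 1024
# 8086/8088: 20-bit addressing -> 1 MB (the "1 Mega" maximum)
assert addressable_bytes(20) == 1024 * 1024
```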

As for the 8088, it was essentially identical, except that it had an 8-bit bus instead of a 16-bit one; it was cheaper and found better acceptance in the market.

It should be noted that the 8086 is a very well-known and thoroughly proven microprocessor; as late as 2002, NASA was still using 8086 chips.

On February 1, 1982, the first 80286 gave the industry a new direction, with clock speeds of 6 to 25 MHz and a design much closer to current microprocessors. Its main innovation was support for virtual memory, which on the 286 could reach up to 1 GB.

The 286 has the honor of being the first microprocessor used to build clone computers in bulk; thanks to cross-licensing agreements, the first IBM-compatible clone makers appeared.

Using this microprocessor, Compaq began producing IBM-compatible desktop computers with Intel chips in 1985.

In 1986, the Intel 80386, known as the 386, appeared, with clock speeds of 16 to 40 MHz and an essentially 32-bit architecture.

Remarkably, production of the 80386 continued for two decades: in 2006, Intel announced that it would end production in September 2007. The chip remained widely used in embedded systems until then.

In 1988, Intel offered a simple, if somewhat belated, upgrade path for owners of the old 286: the 80386SX, which sacrificed the 32-bit bus for a 16-bit one, but at a lower cost. Sales of these processors were boosted by the explosion of the Windows graphical environment, which Microsoft had developed a few years earlier but which had not yet been widely accepted by users.

On April 10, 1989, Intel introduced the 80486DX, a 32-bit processor whose main innovation was a level 1 (L1) cache integrated into the chip, which greatly sped up data transfers between the cache and the processor.

Two more versions of the DX followed: the i486 DX2 at 50 and 66 MHz in 1992 and the i486 DX4 at 75–100 MHz in 1994, aimed at the high end. Overall, the 486 family, launched in 1989, reached speeds of between 16 and 100 MHz. As a curiosity, according to Wikipedia it was named the "i486" because of a court ruling that bare numbers could not be protected as trademarks.

For that reason, the next microprocessor, released in May 1993, carried a name rather than a number: "Pentium". Starting at an initial speed of 60 MHz, these processors eventually reached 200 MHz, something no one could have predicted a few years earlier. Along with a true 32-bit architecture, a 0.8-micron process was used, allowing more units to be produced in less space.

On March 27, 1995, the Pentium Pro took the Pentium line beyond the home environment, opening up a new market in network servers and workstations.

The power of this processor was unprecedented for its time, thanks to its 64-bit data bus and a then-revolutionary 0.35-micron process that allowed five and a half million transistors to be packed into the chip.

The processor package contained a second chip, dedicated to running the cache at higher speed, which resulted in a significant increase in performance.

Clock frequencies started at a minimum of 150 MHz and topped out just above 200 MHz. Intel's next step in this evolution was a new processor called the Pentium II, the fastest processor the company had released, a design that combined the Pentium Pro's technologies with MMX.

Beyond the 0.25-micron process, a 0.07-micron process was planned for 2011. With that development, the aim was to fit one billion transistors onto a processor and to reach clock speeds close to 10,000 MHz, that is, 10 GHz.

MMX Technology

Although MMX is not a processor in its own right, it would be unfair not to mention it in an article like this. It is one of the biggest steps Intel has taken in the current decade, and all the processors the company produces from the middle of next year onward will include this architecture.

To develop it, a wide variety of programs were analyzed to determine how different tasks behave: video, audio and graphics decompression algorithms, speech-recognition formats and image processing. The analysis found a large number of algorithms built on repetitive loops that account for less than 10% of a program's code but, in practice, for 90% of its execution time.

Thus MMX technology was born, consisting of 57 instructions and 4 new data types for handling these cyclical, time-consuming tasks. Previously, the same instruction had to be repeated 8 times to manipulate 8 bytes of graphics data; with the new technology, a single instruction can be applied to 8 bytes simultaneously, giving up to an 8x performance boost.
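The idea described above, one instruction operating on several packed data elements at once, is known as SIMD. A rough sketch in Python, with NumPy's vectorized operations standing in for a hardware SIMD instruction (this is illustrative only, not actual MMX code; the saturating byte add mimics what MMX's PADDUSB instruction does to 8 packed bytes):

```python
import numpy as np

# Two rows of 8-bit pixel values (e.g. graphics data).
a = np.array([200, 100, 50, 255, 0, 30, 60, 90], dtype=np.uint8)
b = np.array([100, 100, 50, 10, 5, 30, 60, 90], dtype=np.uint8)

# Scalar approach: one add per byte, 8 iterations, saturating at 255.
scalar = np.empty(8, dtype=np.uint8)
for i in range(8):
    scalar[i] = min(int(a[i]) + int(b[i]), 255)

# SIMD-style approach: a single operation applied to all 8 bytes at once,
# as an MMX instruction would do within one 64-bit register.
simd = np.minimum(a.astype(np.uint16) + b.astype(np.uint16), 255).astype(np.uint8)

assert (scalar == simd).all()  # both give [255, 200, 100, 255, 5, 60, 120, 180]
```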

Intel dominates the microprocessor market. Its main competitor today is Advanced Micro Devices (AMD), with which it maintains technology-sharing agreements under which each partner can freely use the other's patented technological innovations.

Among Intel's microprocessor technologies, we can highlight the multi-core designs implemented in the Pentium D and Core 2 Duo processors, the Centrino mobile technology developed for the laptop market, and the Hyper-Threading technology integrated into Pentium 4 processors.

On June 6, 2005, Intel reached an agreement with Apple Computer under which Intel would replace Apple's traditional IBM-supplied processors between 2006 and 2007. In January 2006, the first Apple computers with dual-core Intel Core Duo processors were launched: a laptop and a desktop.

Moore’s Law

Gordon Moore, co-founder of Intel Corporation, formulated in 1965 what became known as "Moore's Law", which states that the number of transistors on a chip doubles roughly every eighteen months.

Originally formulated for memory devices, the statement has also held for microprocessors. For the user, it means that substantially better technology becomes available every eighteen months, something that has held for the last 30 years and is expected to continue for the next fifteen or twenty.
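The doubling described above compounds quickly. A small sketch (the function name and the 30-year projection are our own illustrative choices; the 2,300-transistor starting point is the Intel 4004 of 1971):

```python
# Moore's Law as a compound-growth formula: transistor count doubles
# every `doubling_months` months (18 in the popular formulation).
def transistors(initial, years, doubling_months=18):
    doublings = years * 12 / doubling_months
    return initial * 2 ** doublings

# Starting from the 4004's 2,300 transistors, 30 years of 18-month
# doublings is 20 doublings:
print(round(transistors(2300, 30)))  # -> 2411724800, i.e. ~2.4 billion
```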
