
Microprocessor Book by B. Ram: Develop Your Skills in Microprocessor Programming and Interfacing










A microprocessor is a computer processor in which the data-processing logic and control are included on a single integrated circuit, or a small number of integrated circuits. The microprocessor contains the arithmetic, logic, and control circuitry required to perform the functions of a computer's central processing unit. The integrated circuit is capable of interpreting and executing program instructions and performing arithmetic operations.[1] The microprocessor is a multipurpose, clock-driven, register-based digital integrated circuit that accepts binary data as input, processes it according to instructions stored in its memory, and provides results (also in binary form) as output. Microprocessors contain both combinational logic and sequential digital logic, and operate on numbers and symbols represented in the binary number system.


Before microprocessors, small computers had been built using racks of circuit boards with many medium- and small-scale integrated circuits, typically of TTL type. Microprocessors combined this into one or a few large-scale ICs. While there is disagreement over who deserves credit for the invention of the microprocessor, the first commercially available microprocessor was the Intel 4004, designed by Federico Faggin and introduced in 1971.[2]


Continued increases in microprocessor capacity have since rendered other forms of computers almost completely obsolete (see history of computing hardware), with one or more microprocessors used in everything from the smallest embedded systems and handheld devices to the largest mainframes and supercomputers.


A minimal hypothetical microprocessor might include only an arithmetic logic unit (ALU) and a control logic section. The ALU performs addition, subtraction, and operations such as AND or OR. Each operation of the ALU sets one or more flags in a status register, which indicate the results of the last operation (zero value, negative number, overflow, or others). The control logic retrieves instruction codes from memory and initiates the sequence of operations required for the ALU to carry out the instruction. A single operation code might affect many individual data paths, registers, and other elements of the processor.
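To make this concrete, below is a minimal sketch in C of such a hypothetical processor: an ALU for ADD, SUB, AND, and OR, a status register with zero, negative, and carry flags, and a control loop that fetches instruction codes from memory and initiates the ALU. The opcode encoding, register set, and program are invented purely for illustration; no real device is modeled, and a real status register would typically also track overflow.

#include <stdint.h>
#include <stdio.h>

/* Hypothetical minimal processor: one accumulator, a flags register,
   and a program counter. The encoding is invented for illustration. */
enum { OP_HALT = 0, OP_ADD, OP_SUB, OP_AND, OP_OR };

typedef struct {
    uint8_t acc;    /* accumulator register */
    uint8_t flags;  /* bit 0 = zero, bit 1 = negative, bit 2 = carry */
    uint8_t pc;     /* program counter */
} Cpu;

static void set_flags(Cpu *cpu, uint16_t result)
{
    cpu->flags = 0;
    if ((result & 0xFF) == 0) cpu->flags |= 0x01;  /* zero result */
    if (result & 0x80)        cpu->flags |= 0x02;  /* negative (bit 7 set) */
    if (result & 0x100)       cpu->flags |= 0x04;  /* carry out of bit 7 */
}

/* The ALU: performs the operation and records its result in the flags. */
static uint8_t alu(Cpu *cpu, uint8_t op, uint8_t a, uint8_t b)
{
    uint16_t r = 0;
    switch (op) {
    case OP_ADD: r = (uint16_t)a + b; break;
    case OP_SUB: r = (uint16_t)a - b; break;
    case OP_AND: r = a & b;           break;
    case OP_OR:  r = a | b;           break;
    }
    set_flags(cpu, r);
    return (uint8_t)r;
}

int main(void)
{
    /* Program in memory: each instruction is an opcode and an operand. */
    uint8_t memory[] = { OP_ADD, 200, OP_ADD, 100, OP_AND, 0x0F, OP_HALT };
    Cpu cpu = {0};

    /* Control logic: fetch an instruction code, then execute it. */
    for (;;) {
        uint8_t op = memory[cpu.pc++];            /* fetch */
        if (op == OP_HALT) break;
        uint8_t operand = memory[cpu.pc++];
        cpu.acc = alu(&cpu, op, cpu.acc, operand); /* execute */
        printf("acc=%3u flags=0x%02x\n", (unsigned)cpu.acc, (unsigned)cpu.flags);
    }
    return 0;
}

Running the invented program above shows the flags tracking each result: 200 + 100 overflows 8 bits, leaving 44 in the accumulator with the carry flag set, exactly the kind of side effect the status register exists to report.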


As integrated circuit technology advanced, it became feasible to manufacture more and more complex processors on a single chip. Fitting more transistors on a chip allowed the size of data objects to grow, with word sizes increasing from 4- and 8-bit words up to today's 64-bit words. Additional features were added to the processor architecture; more on-chip registers sped up programs, and complex instructions could be used to make programs more compact. Floating-point arithmetic, for example, was often not available on 8-bit microprocessors and had to be carried out in software. Integration of the floating-point unit, first as a separate integrated circuit and then as part of the same microprocessor chip, sped up floating-point calculations.
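As an illustration of this software-emulation approach, the sketch below multiplies two 8-bit values using only addition, shifts, and bit tests: the classic shift-and-add routine that a processor without a multiply instruction (let alone floating-point hardware) would run. The C rendering is a sketch of the general technique, not code from any particular processor's library.

#include <stdint.h>
#include <stdio.h>

/* Shift-and-add multiplication: build a 16-bit product from 8-bit
   operands using only operations a simple 8-bit CPU provides. */
static uint16_t mul8(uint8_t a, uint8_t b)
{
    uint16_t product = 0;
    uint16_t addend  = a;         /* a, shifted left as we walk b's bits */
    while (b != 0) {
        if (b & 1)                /* if the current bit of b is set... */
            product += addend;    /* ...accumulate the shifted a */
        addend <<= 1;
        b >>= 1;
    }
    return product;
}

int main(void)
{
    printf("13 * 11 = %u\n", (unsigned)mul8(13, 11));  /* prints 143 */
    return 0;
}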


Thousands of items that were traditionally not computer-related include microprocessors. These include household appliances, vehicles (and their accessories), tools and test instruments, toys, light switches/dimmers and electrical circuit breakers, smoke alarms, battery packs, and hi-fi audio/visual components (from DVD players to phonograph turntables). Products such as cellular telephones, DVD video systems, and HDTV broadcast systems fundamentally require consumer devices with powerful, low-cost microprocessors. Increasingly stringent pollution control standards effectively require automobile manufacturers to use microprocessor engine management systems to allow optimal control of emissions over the widely varying operating conditions of an automobile. Non-programmable controls would require bulky or costly implementations to achieve the results possible with a microprocessor.


A microprocessor control program (embedded software) can be tailored to fit the needs of a product line, allowing upgrades in performance with minimal redesign of the product. Unique features can be implemented in a product line's various models at negligible production cost.


The advent of low-cost computers on integrated circuits has transformed modern society. General-purpose microprocessors in personal computers are used for computation, text editing, multimedia display, and communication over the Internet. Many more microprocessors are part of embedded systems, providing digital control over myriad objects from appliances to automobiles to cellular phones and industrial process control. Microprocessors perform binary operations based on Boolean logic, named after George Boole. The ability to operate computer systems using Boolean logic was first demonstrated in a 1938 master's thesis by Claude Shannon, who later went on to become a professor and is considered "the father of information theory".


Following the development of MOS integrated circuit chips in the early 1960s, MOS chips reached higher transistor density and lower manufacturing costs than bipolar integrated circuits by 1964. MOS chips further increased in complexity at a rate predicted by Moore's law, leading to large-scale integration (LSI) with hundreds of transistors on a single MOS chip by the late 1960s. The application of MOS LSI chips to computing was the basis for the first microprocessors, as engineers began recognizing that a complete computer processor could be contained on several MOS LSI chips.[6] Designers in the late 1960s were striving to integrate the central processing unit (CPU) functions of a computer onto a handful of MOS LSI chips, called microprocessor unit (MPU) chipsets.


While there is disagreement over who invented the microprocessor,[2] the first commercially produced microprocessor was the Intel 4004, released as a single MOS LSI chip in 1971.[7] The single-chip microprocessor was made possible by the development of MOS silicon-gate technology (SGT).[8] The earliest MOS transistors had aluminium metal gates, which Italian physicist Federico Faggin replaced with silicon self-aligned gates to develop the first silicon-gate MOS chip at Fairchild Semiconductor in 1968.[8] Faggin later joined Intel and used his silicon-gate MOS technology to develop the 4004, along with Marcian Hoff, Stanley Mazor, and Masatoshi Shima, in 1971.[9] The 4004 was designed for Busicom, which had earlier proposed a multi-chip design in 1969, before Faggin's team at Intel changed it into a new single-chip design. The 4-bit 4004 was soon followed by the 8-bit Intel 8008 in 1972.


Other embedded uses of 4-bit and 8-bit microprocessors, such as terminals, printers, and various kinds of automation, followed soon after. Affordable 8-bit microprocessors with 16-bit addressing also led to the first general-purpose microcomputers from the mid-1970s on.


Since the early 1970s, the increase in capacity of microprocessors has followed Moore's law, which originally suggested that the number of components that can be fitted onto a chip doubles every year. Observing that the doubling actually took about two years, Moore later changed the period to two years.[11][12]
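Expressed as a formula, the law says that a chip with N0 components at year t0 holds roughly N(t) = N0 * 2^((t - t0) / T) components at year t, where T is the doubling period. The short sketch below evaluates this with the Intel 4004's roughly 2,300 transistors in 1971 as an assumed starting point; it is a worked example of the doubling rule, not a record of actual chip sizes.

#include <stdio.h>
#include <math.h>

int main(void)
{
    /* Assumed starting point: ~2,300 transistors in 1971, doubling
       every two years per the revised form of Moore's law. */
    const double n0 = 2300.0, t0 = 1971.0, period = 2.0;
    for (int year = 1971; year <= 2001; year += 10) {
        double n = n0 * pow(2.0, (year - t0) / period);
        printf("%d: ~%.0f transistors\n", year, n);
    }
    return 0;
}

Under these assumptions the rule predicts roughly 74 thousand transistors by 1981, 2.4 million by 1991, and 75 million by 2001, which tracks the order of magnitude that shipping microprocessors actually reached.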


Three projects delivered a microprocessor at about the same time: Garrett AiResearch's Central Air Data Computer (CADC) (1970), Texas Instruments' TMS 1802NC (September 1971), and Intel's 4004 (November 1971, based on an earlier 1969 Busicom design). Arguably, the Four-Phase Systems AL1 microprocessor was also delivered in 1969.


The Four-Phase Systems AL1 was an 8-bit bit-slice chip containing eight registers and an ALU.[13] It was designed by Lee Boysel in 1969.[14][15][16] At the time, it formed part of a nine-chip, 24-bit CPU with three AL1s. It was later called a microprocessor when, in response to 1990s litigation by Texas Instruments, Boysel constructed a courtroom demonstration system in which a single AL1, together with RAM, ROM, and an input-output device, formed a working computer.[17]
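The bit-slice idea is that each chip implements a narrow ALU, and the slices are chained through their carry signals to build a wider datapath. The sketch below mimics that arrangement in C: three 8-bit "slices" chained by carry form a 24-bit adder, echoing the three-AL1, 24-bit CPU described above. It illustrates only the chaining principle; the AL1's actual internal design is not modeled.

#include <stdint.h>
#include <stdio.h>

/* One 8-bit slice: adds one byte of each operand and passes its
   carry out so the next slice can continue the sum. */
static uint8_t slice_add(uint8_t a, uint8_t b, uint8_t carry_in,
                         uint8_t *carry_out)
{
    uint16_t s = (uint16_t)a + b + carry_in;
    *carry_out = (uint8_t)(s >> 8);
    return (uint8_t)s;
}

int main(void)
{
    uint32_t a = 0xABCDEF, b = 0x123456;  /* two 24-bit operands */
    uint8_t carry = 0, out[3];

    /* Chain three slices: slice 0 handles the low byte, and so on. */
    for (int i = 0; i < 3; i++)
        out[i] = slice_add((a >> 8 * i) & 0xFF, (b >> 8 * i) & 0xFF,
                           carry, &carry);

    printf("0x%02X%02X%02X\n", (unsigned)out[2], (unsigned)out[1],
           (unsigned)out[0]);  /* prints 0xBE0245 */
    return 0;
}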


In 1968, Garrett AiResearch (which employed designers Ray Holt and Steve Geller) was invited to produce a digital computer to compete with electromechanical systems then under development for the main flight control computer in the US Navy's new F-14 Tomcat fighter. The design was complete by 1970 and used a MOS-based chipset as the core CPU. The design was significantly (approximately 20 times) smaller and much more reliable than the mechanical systems it competed against, and was used in all of the early Tomcat models. This system contained "a 20-bit, pipelined, parallel multi-microprocessor". The Navy refused to allow publication of the design until 1997, and since its release in 1998 the documentation on the CADC and its MP944 chipset has been well known. Ray Holt's autobiographical account of this design and development is presented in the book The Accidental Engineer.[18][19]


Ray Holt graduated from California Polytechnic University in 1968, and began his computer design career with the CADC. From its inception, the project was shrouded in secrecy until 1998, when at Holt's request the US Navy allowed the documents into the public domain. Holt has claimed that no one has compared this microprocessor with those that came later.[20] According to Parab et al. (2007), the MP944 digital processor used in the F-14's CADC qualifies as the first microprocessor.


In 1990, American engineer Gilbert Hyatt was awarded U.S. Patent No. 4,942,516,[23] which was based on a 16-bit serial computer he built at his Northridge, California home in 1969 from boards of bipolar chips after quitting his job at Teledyne in 1968.[2][24] Though the patent had been submitted in December 1970, prior to Texas Instruments' filings for the TMX 1795 and TMS 0100, Hyatt's invention was never manufactured.[24][25][26] This nonetheless led to claims that Hyatt was the inventor of the microprocessor, and to the payment of substantial royalties through a Philips N.V. subsidiary,[27] until Texas Instruments prevailed in a complex legal battle in 1996, when the U.S. Patent Office overturned key parts of the patent while allowing Hyatt to keep it.[2][28] Hyatt said in a 1990 Los Angeles Times article that his invention would have been created had his prospective investors backed him, and that the venture investors leaked details of his chip to the industry, though he did not provide evidence to support this claim.[24] In the same article, The Chip author T.R. Reid was quoted as saying that historians may ultimately place Hyatt as a co-inventor of the microprocessor, in the way that Intel's Noyce and TI's Kilby share credit for the invention of the chip in 1958: "Kilby got the idea first, but Noyce made it practical. The legal ruling finally favored Noyce, but they are considered co-inventors. The same could happen here."[24] Hyatt would go on to fight a decades-long legal battle with the state of California over alleged unpaid taxes on his patent's windfall after 1990, culminating in a landmark Supreme Court case addressing states' sovereign immunity, Franchise Tax Board of California v. Hyatt (2019).

