
BASICS OF COMPUTERS:

Computer Basics introduces general computer use and terminology. It describes the basic hardware components of a PC and introduces basic skills for using software programs in a Windows environment. This course will address the hardware and peripheral components of the computer and how to use them correctly.

A computer is an electronic device that can store, retrieve, and process data, and can be programmed with instructions that it remembers. The physical parts that make up a computer (the central processing unit, input, output, and memory) are called hardware. A set of instructions that tells a computer what to do or performs a particular task is called a program, software program, or simply software. Peripherals are hardware devices connected to a computer: any part of the computer outside the CPU and working memory. Some examples of peripherals are keyboards, mice, monitors, printers, scanners, disk and tape drives, microphones, speakers, joysticks, plotters, and cameras.

A. MONITOR

The computer monitor is an output device that displays the computer's output on a screen and is very similar to a television screen. When the computer needs to display something, it calculates how the color and brightness of the individual pixels must change, and writes the new values into the video memory.
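The idea of "changing the values in the video memory" can be sketched in a few lines of Python. This is a simplified model, not real driver code; the `video_memory` array and `set_pixel` helper are invented for illustration:

```python
# Simplified model of video memory: a 2D array of (red, green, blue)
# values, one triple per pixel. Real video memory is a block of bytes
# managed by the graphics hardware.
WIDTH, HEIGHT = 640, 480

# Every pixel starts black: zero brightness for red, green, and blue.
video_memory = [[(0, 0, 0) for _ in range(WIDTH)] for _ in range(HEIGHT)]

def set_pixel(x, y, color):
    """Change one pixel's color; the monitor would show the new value
    on its next screen refresh."""
    video_memory[y][x] = color

# "Display" a white dot at the center of the screen.
set_pixel(WIDTH // 2, HEIGHT // 2, (255, 255, 255))
print(video_memory[HEIGHT // 2][WIDTH // 2])  # (255, 255, 255)
```

In a real PC the graphics hardware reads this memory continuously and drives the monitor from it; the program only has to update the values.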

Controls for the monitor are located on the monitor itself. The monitor has an ON/OFF Button/Switch (which powers only the monitor) and an indicator light (green or amber).

A green indicator light denotes that the monitor is on. An amber light indicates that the monitor is in sleep mode. Software in newer computers automatically puts the monitor to sleep when the computer is shut down, turning the indicator light amber. When the light is amber and the computer is booted up, the monitor will come back on as soon as you move the mouse or press any key on the keyboard.

B. INTERNAL COMPONENTS OF A COMPUTER

Motherboard Sometimes called the system board or main board, the motherboard is the main circuit board of a PC. It is the computer's central nervous system and circulatory system, plus much more, all rolled into one. The motherboard typically contains the processor (or CPU), BIOS (basic input/output system), memory, mass storage interfaces, serial and parallel ports, expansion slots, and all the controllers required to communicate with standard peripheral devices such as the display screen, mouse, keyboard, and disk drive. Collectively, some of the chips that reside on the motherboard are known as the motherboard's chipset.

Chipset The chipset controls the system and its capabilities. All components communicate with the processor through the chipset; it is the hub of all data transfer. The chipset uses the DMA controller and the bus controller to organize the steady flow of data it controls. The chipset is a series of chips attached directly to the motherboard and is usually second in size only to the processor. Chipsets are integrated (soldered onto the motherboard) and cannot be upgraded without a new motherboard.

BIOS (Basic Input Output System) An integral part of the PC, the BIOS is the program the microprocessor uses to get the computer started after you turn it on. It also manages the data flow between the computer's operating system and attached peripheral devices.

CPU (Central Processing Unit) The CPU is the computer's control center. Think of it as the brain that does all the thinking (computation): it reads instructions from your software and tells the computer what to do. The actual CPU chip is only about 1.5 inches square, yet it is the most critical part of the computer. The speed at which the CPU processes information internally is measured in megahertz (MHz) and gigahertz (GHz); 1 GHz is equal to 1,000 MHz. Generally, processors with higher clock speeds run creative, entertainment, communication, and productivity applications faster. Megahertz: one million cycles per second, used to measure the speed of a CPU chip.

ROM (Read Only Memory) A type of memory chip that does not lose information even when the power is turned off. Once data is programmed into a ROM chip, its contents cannot be altered. For example, ROM BIOS chips are used to store the information needed to start up your computer.
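The megahertz/gigahertz arithmetic described for the CPU can be made concrete with a short Python sketch (the function names here are illustrative, not part of any standard library):

```python
# 1 GHz = 1,000 MHz = 1,000,000,000 clock cycles per second.
def mhz_to_ghz(mhz):
    """Convert a clock speed in megahertz to gigahertz."""
    return mhz / 1000.0

def cycle_time_ns(ghz):
    """Length of one clock cycle, in nanoseconds, at `ghz` gigahertz."""
    cycles_per_second = ghz * 1_000_000_000
    return 1e9 / cycles_per_second          # nanoseconds per cycle

print(mhz_to_ghz(3000))    # 3.0 -- a 3,000 MHz processor is a 3 GHz processor
print(cycle_time_ns(1.0))  # 1.0 -- at 1 GHz, each cycle lasts one nanosecond
```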

RAM (Random Access Memory) Memory available for storing data and programs currently being processed. RAM is erased automatically when the power is turned off. Any location in RAM can be accessed directly, without reading through the preceding bytes.
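"Accessed without touching preceding bytes" is what "random access" means: any address can be read or written directly, unlike a tape, which must be wound past everything stored before it. A Python `bytearray` behaves the same way (a toy model, of course, not actual RAM):

```python
# Model 1 KB of RAM as a zero-filled byte array.
memory = bytearray(1024)

# Write directly to address 512 -- no need to touch addresses 0..511 first.
memory[512] = 0xFF

# Read it straight back by address.
value = memory[512]
print(hex(value))  # 0xff
```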

DIMM (Dual Inline Memory Module) A small circuit board that adds memory to a computer.

RDRAM (Rambus Dynamic Random Access Memory) Developed by Rambus Corporation, this narrow, high-performance memory channel offers performance and capacity scalability through the use of multiple channels in parallel, and can provide up to 1.6 GB/sec of bandwidth per channel. RDRAM is able to load a new stream of data before the previous stream has completed, resulting in less waiting time and therefore faster access speeds.
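To get a feel for the 1.6 GB/sec figure quoted above, here is a rough back-of-the-envelope calculation in Python. It assumes the peak per-channel rate and ignores latency and protocol overhead, so real transfers would be slower:

```python
# Peak bandwidth of a single RDRAM channel, per the figure above.
CHANNEL_BANDWIDTH_GB_PER_S = 1.6

def transfer_time_seconds(megabytes):
    """Best-case time to move `megabytes` of data over one channel."""
    gigabytes = megabytes / 1024.0      # treating 1 GB as 1024 MB
    return gigabytes / CHANNEL_BANDWIDTH_GB_PER_S

# Moving 256 MB at the peak rate takes roughly 0.16 seconds.
print(transfer_time_seconds(256))
```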

SDRAM (Synchronous Dynamic Random Access Memory) SDRAM synchronizes itself with the processor's bus and is capable of running at 133 MHz. SDRAM enables a system to run applications and temporarily store documents being worked on.

SIMM (Single In-Line Memory Module) A narrow circuit board that holds RAM (also called DRAM) memory chips. The more RAM you add to the computer, the more programs you can run at the same time and the more smoothly it operates. Depending on the computer, SIMMs may need to be installed in multiples of two or four.

Evolution of the Computer:

The first counting device was the abacus, originally from Asia. It worked on a place-value notion, meaning that the position of a bead or rock on the apparatus determined how much it was worth.

1600s: John Napier discovers logarithms. Robert Bissaker invents the slide rule, which will remain in popular use until the advent of the electronic calculator in the 1970s.

1642: Blaise Pascal, a French mathematician and philosopher, invents the first mechanical digital calculator using gears, called the Pascaline. Although this machine could perform addition and subtraction on whole numbers, it was too expensive and only Pascal himself could repair it.

1804: Joseph Marie Jacquard used punch cards to automate a weaving loom.

1812: Charles P. Babbage, the "father of the computer", discovered that many long calculations involved many similar, repeated operations. Therefore, he designed a machine, the difference engine, which would be steam-powered, fully automatic, and commanded by a fixed instruction program. In 1833, Babbage quit working on this machine to concentrate on the analytical engine.

1840s: Augusta Ada Byron, Countess of Lovelace, "the first programmer", suggested that a binary system should be used for storage rather than a decimal system.

1850s: George Boole developed Boolean logic which would later be used in the design of computer circuitry.

1890: Dr. Herman Hollerith introduced the first electromechanical, punched-card data-processing machine which was used to compile information for the 1890 U.S. census. Hollerith's tabulator became so successful that he started his own business to market it. His company would eventually become International Business Machines (IBM).

1906: The vacuum tube is invented by American physicist Lee De Forest.

1939: Dr. John V. Atanasoff and his assistant Clifford Berry build the first electronic digital computer. Their machine, the Atanasoff-Berry Computer (ABC), provided the foundation for later advances in electronic digital computers.

1941: Konrad Zuse of Germany introduced the first programmable computer, designed to solve complex engineering equations. This machine, called the Z3, was also the first to work on the binary system instead of the decimal system.

1943: British mathematician Alan Turing helped develop Colossus, a vacuum-tube machine used by the British to break German ciphers. Turing had earlier (1936) described a hypothetical device, the Turing machine, which could perform logical operations and read and write symbols; it presaged programmable computers.

1944: Howard Aiken, in collaboration with engineers from IBM, constructed a large automatic digital sequence-controlled computer called the Harvard Mark I. This computer could handle all four arithmetic operations, and had special built-in programs for logarithms and trigonometric functions.

1945: Dr. John von Neumann presented a paper outlining the stored-program concept.

1947: The giant ENIAC (Electronic Numerical Integrator and Computer) machine was developed by John W. Mauchly and J. Presper Eckert, Jr. at the University of Pennsylvania. It used 18,000 vacuum tubes and punch-card input, weighed thirty tons, and occupied a thirty-by-fifty-foot space. It could not store programs, but it was productive from 1946 to 1955 and was used to compute artillery firing tables. That same year, the transistor was invented by William Shockley, John Bardeen, and Walter Brattain of Bell Labs; it would eventually replace the vacuum tube in computers and radios.

1949: Maurice V. Wilkes built the EDSAC (Electronic Delay Storage Automatic Computer), the first stored-program computer. The EDVAC (Electronic Discrete Variable Automatic Computer), the second stored-program computer, was built by Mauchly, Eckert, and von Neumann. An Wang developed magnetic-core memory, which Jay Forrester would later reorganize to be more efficient.

1950: The ACE (Automatic Computing Engine), designed by Turing, was completed; some consider it the first programmable digital computer.

The First Generation (1951-1959)

1951: Mauchly and Eckert built the UNIVAC I, the first computer designed and sold commercially, specifically for business data-processing applications.

1950s: Dr. Grace Murray Hopper developed the UNIVAC I compiler.

1957: The programming language FORTRAN (FORmula TRANslator) was designed by John Backus, an IBM engineer.

1959: Jack St. Clair Kilby of Texas Instruments and Robert Noyce of Fairchild Semiconductor independently produced the first integrated circuits, or chips: collections of tiny transistors on a single piece of semiconductor.

The Second Generation (1959-1965)

1960s: Gene Amdahl designed the IBM System/360 series of mainframe computers, the first general-purpose digital computers to use integrated circuits.

1961: Dr. Hopper was instrumental in developing the COBOL (Common Business Oriented Language) programming language.

1963: Ken Olsen, founder of DEC, produced the PDP-1, the first minicomputer.

1965: The BASIC (Beginners All-purpose Symbolic Instruction Code) programming language was developed by Dr. Thomas Kurtz and Dr. John Kemeny.

The Third Generation (1965-1971)


1969: The Internet (as the ARPANET) is started.

1971: Intel released the first microprocessor, the famous Intel 4004 developed by Dr. Ted Hoff: a specialized integrated circuit that was able to process four bits of data at a time and included its own arithmetic logic unit. PASCAL, a structured programming language, was developed by Niklaus Wirth.

The Fourth Generation (1971-Present)

1975: Ed Roberts, the "father of the microcomputer", designed the first microcomputer, the Altair 8800, which was produced by Micro Instrumentation and Telemetry Systems (MITS). The same year, two young hackers, William Gates and Paul Allen, approached MITS and promised to deliver a BASIC interpreter. They delivered, and from the sale Microsoft was born.

1976: Seymour Cray developed the Cray-1 supercomputer. Apple Computer, Inc. was founded by Steven Jobs and Stephen Wozniak.

1977: Jobs and Wozniak designed and built the Apple II microcomputer.

1980: IBM offered Bill Gates the opportunity to develop the operating system for its new IBM personal computer. The resulting MS-DOS drove much of the tremendous growth and success Microsoft has achieved. The Apple III was also released.

1981: The IBM PC was introduced with a 16-bit microprocessor.

1982: Time magazine chooses the computer instead of a person for its "Machine of the Year."

1984: Apple introduced the Macintosh computer, which incorporated a unique graphical interface, making it easy to use. The same year, IBM released the 286-AT.

1986: Compaq released the DeskPro 386 computer, the first to use the 80386 microprocessor.

1987: IBM announced the OS/2 operating-system technology.

1988: A nondestructive worm was introduced into the Internet, bringing thousands of computers to a halt.

1989: The Intel 486 became the world's first microprocessor with more than 1,000,000 transistors.

Recent Developments in Information Technology in India


The term "IT" was coined in the late 1970s. It incorporates the whole of computing and telecommunication technology, together with major parts of consumer electronics and broadcasting. From a nascent start, India has made rapid progress in the IT domain. The opening of the economy in the early 1990s gave it a tremendous boost. India's prowess in the software field is now recognized the world over, with its software consultancy firms having a truly global presence and nearly all the top Fortune 500 companies as clients. The existence of a separate Department of Information Technology under the Central Government is a reminder of the importance attached to IT and the impact it is having on various fronts, including governance.

Information Technology has come to be recognized as a key leveraging factor in national development. It has had a profound effect on other industries, increasing productivity and changing cost structures. The Indian IT success story has also highlighted India's attractiveness as an investment destination far beyond the IT sector. Another key impact of the global sourcing model popularised by the growth of IT and IT Enabled Services (ITES) has been the reversal of the brain drain, as people of Indian origin who went abroad to pursue careers, as well as young expatriates, are now attracted to work in India.

The rapid growth of ITES-BPO and the IT industry as a whole has made a deep impact on the socioeconomic dynamics of the country, having a significant multiplier effect on the Indian economy. Apart from the direct impact on national income, the sector has risen to become the biggest employment generator, with the number of jobs added almost doubling each year. It has spawned a number of ancillary businesses such as transportation, real estate, and catering; played a key role in the rise in direct-tax collection; and contributed to a rising class of young consumers with high disposable incomes. The industry's contribution to the national economic output is estimated at 4.1 per cent of national GDP in the year 2004-05. The IT services and software sector is expected to add 109,000 jobs in the current fiscal, and ITES-BPO another 94,500. The number of professionals employed in India by the IT and ITES sector is estimated at 1,045,000 by March 2005. Of these, 345,000 were in the IT software and services export industry; nearly 348,000 were in the ITES-BPO sector; 30,000 in the domestic software market; and over 322,000 in user organizations. Indian software and services exports were to the tune of Rs. 78,230 crore (US$ 17.2 billion) in 2004-05, as compared to Rs. 58,240 crore (US$ 12.8 billion) in 2003-04, an increase of 34 per cent in both rupee and dollar terms.
The Indian ITES-BPO (Business Process Outsourcing) industry also continues to grow from strength to strength, witnessing high levels of activity both onshore and offshore. Export revenues from India's ITES-BPO sector exceeded the US$ 5 billion mark in the year 2004-05. Unprecedented growth in telecom subscribers has occurred during the last few years: as many as 22.18 million subscribers were added during the year 2004. India's tele-density is expected to have crossed 9 per 100 persons by now, with the total number of connections currently standing at well over 100 million. An interesting observation is that wireless phones have overtaken fixed-line connections. Further, the consumer electronics sector is estimated to have achieved a production level of Rs. 16,800 crore during 2004-05.

The change in the broadcasting scenario has been nothing short of astounding, with no fewer than a hundred broadcasters competing with each other. Radio has made a big comeback with the launch of FM channels. Direct to Home (DTH) broadcast service in Ku-Band through satellite has been started by the National Broadcaster, in addition to one private DTH service provider. Good-quality digital broadcast reception is available to citizens almost everywhere in the country on their television sets, through the use of a small dish antenna and a Set Top Box (STB). Besides bridging the entertainment divide, this has opened up an opportunity for manufacturing set top boxes on a large scale.

The Government has recognized the potential of Information and Communication Technology (ICT) for rapid and all-round development in general, and for transforming governance in particular. Plans are afoot to promote e-Governance on a massive scale, and a National e-Governance Plan has been drawn up which seeks to implement 25 Mission Mode Projects for the present at the Centre, State, and integrated service levels, so as to create a citizen-centric and business-centric environment for governance, create the right governance and institutional mechanisms, set up core infrastructure, formulate key policies, and channelise private sector technical and financial resources into the national e-Governance effort.

To ensure the availability of trained manpower, the spread of IT education has been given the necessary impetus at both the government and private levels. Significant is the opening of Indian Institutes of Information Technology (IIITs) on the lines of the Indian Institutes of Technology (IITs). Besides these, various certification courses, like the highly popular DOEACC courses, have been started. The National Association of Software and Service Companies (NASSCOM) has played a key role in the popularization of IT in India.
With the rapid penetration of Internet and mobile communication technologies, a host of problems have cropped up, including cyber crime, easy access to pornography, rampant piracy, invasion of privacy, and a massive threat to intellectual property. This has also created many social and ethical problems, which society has to address in a decisive manner. To tackle the problem of cyber crime, the Information Technology Act 2000 was passed, a landmark event. The IT Act 2000 provides the legal framework for establishing trust in the electronic environment in the country. To keep pace with the times, more teeth are being added to it through major amendments.

We are now in an era of what is termed convergence. Today, technologies do not exist or work in isolation but in tandem. It no longer makes sense to treat IT as a separate entity: it is helping other branches to flourish and is in turn being helped by them to prosper. The most significant development in the coming years would be the convergence of IT, biotechnology, and nanotechnology to propel India into the big league. In conclusion, it can be said that India is now an integral part of the Global Village, thanks to the developments witnessed in Information Technology.
