
Tommy Casey
MGNT 4215
Summer 2011
28 June 2011

What is the future of computers?

For over 30 years the personal computer, or PC for short, has been an important part of our daily lives. Once just a luxury, it quickly became a common device in many homes, but has the PC been given a death sentence? If you believe Steve Jobs, the co-founder and chief executive officer of Apple, it has. During a presentation this month at the Apple Worldwide Developers Conference in San Francisco, held to showcase the company's software and services, Steve Jobs laid out a plan to demote the PC to just another device. As stated by Mr. Jobs, "We are going to demote the PC to just be a device. We are going to move the digital hub, the center of your digital life, into the cloud." Once these new services begin later this fall, people who buy a device running iOS, Apple's mobile operating system, will be able to get by entirely without a computer. They will no longer need to plug an iOS device into a PC to activate it; iCloud will automatically sync and back up people's photos, music, and documents, and all software will be updated over the Internet. Mr. Jobs has likened the PC to a pickup truck used only for work; this really could be the beginning of the end for the home computer (Bilton).

As we have transformed into an always-connected global community, computer technology has continued to evolve to meet our needs. That evolution lends proof to a prediction made over 40 years ago by Intel co-founder Gordon Moore. His prediction, popularly known as Moore's Law, states that transistor density on integrated circuits doubles about every two years, so the scale of the hardware gets smaller and smaller (Intel). What once filled an entire room and could perform only one calculation at a time can now fit on a desktop and perform many calculations at once. This was one of the important achievements that enabled computers to move from government labs and universities into the home and, in the future, if Steve Jobs is correct, into the cloud.
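
To make the arithmetic of Moore's Law concrete, the short sketch below projects the doubling forward from the roughly 2,300 transistors of Intel's first microprocessor, the 4004, released in 1971. It is a back-of-the-envelope illustration, not a precise history; real chips have tracked the trend only approximately.

    # Back-of-the-envelope illustration of Moore's Law:
    # transistor counts doubling roughly every two years,
    # starting from the ~2,300 transistors of the Intel 4004 (1971).
    transistors = 2300
    for year in range(1971, 2012, 2):
        print(f"{year}: ~{transistors:,} transistors per chip")
        transistors *= 2

Run for forty years, the doubling turns 2,300 transistors into roughly two billion, which is the right order of magnitude for the processors shipping in 2011.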

The future of computer development cannot be thoroughly understood without first understanding its history. Computer development is often characterized by generations; a generation refers to a major technological development that allowed the technology to advance, producing smaller, cheaper, and more reliable computers. In my research I found a timeline at Webopedia.com entitled "The Five Generations of Computers." I found Webopedia useful because, unlike many wiki sites, it uses multiple sources, from full-time editors to professionals working in the field, to verify the information posted. According to the timeline, the first generation (1940-1956) describes the earliest computers. The primary technological breakthrough was the use of vacuum tubes for circuitry and magnetic drums for memory. The drawback of vacuum tubes was that they required so much space that a single computer could take up an entire room. They were also expensive and consumed a great deal of electricity, generating heat that often led to malfunctions. In addition, first-generation computers relied on machine language, the lowest-level programming language, understood only by computers, to perform operations, and they could solve only one problem at a time. They used punched cards for input and printouts for output. This generation produced the ENIAC and the UNIVAC. As stated in the article, "The UNIVAC was the first commercial computer delivered to a business client, the U.S. Census Bureau in 1951" (Webopedia).


The second generation (1956-1963) of computers was the result of transistors replacing vacuum tubes. Transistors were invented in the late 1940s but were not widely used until the 1950s. Although transistors still generated a great deal of heat, just as vacuum tubes had, they were a vast improvement: they allowed computers to become smaller, faster, cheaper, and more efficient. In addition, the language moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed instructions to be specified in words instead of machine code. As with the first generation, the second generation still relied on punched cards and printouts for input and output.

The technological advancement that ushered in the third generation (1964-1971) was the integrated circuit. As noted in the article, "The development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on silicon chips, called semiconductors, which drastically increased the speed and efficiency of computers" (Webopedia). In addition, punched cards and printouts were replaced with devices such as keyboards and monitors, allowing for far easier input and output of information.

The new technology that brought in the fourth generation (1971-present) was the microprocessor, which allowed thousands of integrated circuits to be built onto a single chip and, as with its predecessors, allowed computers to become smaller, faster, cheaper, and more efficient. The fourth generation also led to the invention of graphical user interfaces (GUIs), the mouse, and other handheld devices, allowing for richer interaction with computers. Major milestones of this generation were the introduction of computers for home use by both IBM and Apple, along with the development of the Internet, which connected multiple computers together on networks.

Moving into the future, the fifth generation (present and beyond) includes technologies that are currently in development today: artificial intelligence, which includes expert systems, neural networks, genetic algorithms, and intelligent agents; quantum computation; and mobile computing.

Artificial intelligence (AI), the science of making machines imitate human behavior and thinking, is already in use today in hospitals, government agencies, universities and colleges, and many businesses. While artificial intelligence can be applied to many different problems, it offers businesses many opportunities to increase efficiency, reduce costs, and raise profits. As defined in the textbook Management Information Systems (MIS for short), AI can provide you with an expert system that can "capture expertise, thus making it available to those who are not experts so that they can use it, either to solve a problem or learn how to solve a problem. An expert system, also called a knowledge-based system, is an artificial intelligence system that applies reasoning capabilities to reach a conclusion" (Haag 168).
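
The reasoning an expert system applies can be sketched in a few lines: knowledge is captured as if-then rules, and an inference engine fires them until no new conclusions appear. The rules and facts below are hypothetical, invented only to illustrate the mechanism:

    # A minimal rule-based expert system sketch (forward chaining).
    # Each rule pairs a set of required facts with a conclusion.
    RULES = [
        ({"payment late", "balance high"}, "credit risk"),
        ({"credit risk", "new account"}, "flag for review"),
    ]

    def infer(facts):
        facts = set(facts)
        fired = True
        while fired:                       # keep passing over the rules
            fired = False
            for premises, conclusion in RULES:
                if premises <= facts and conclusion not in facts:
                    facts.add(conclusion)  # rule fires: add its conclusion
                    fired = True
        return facts

    print(infer({"payment late", "balance high", "new account"}))
    # prints a set that now includes "credit risk" and "flag for review"

The value, as the textbook definition suggests, is that the rules capture an expert's judgment once, and non-experts can then apply it simply by supplying the facts.
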
Another way of simulating human behavior is the neural network. By simulating how humans learn by example in order to classify things, neural networks have many uses: for example, detecting financial patterns, crime patterns, fraud, or any other area where a pattern can be recognized in a large volume of input data. Identifying a pattern is often the first step toward finding the best solution to a problem.
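
Learning by example can be shown with the simplest neural network unit, a single perceptron. The tiny fraud-detection data set below is invented for illustration; real networks use many units and far more data, but the weight-nudging loop is the same idea:

    # A single perceptron learning a made-up fraud pattern:
    # flag a transaction only when both warning signs are present.
    examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
    w0 = w1 = bias = 0.0
    rate = 0.1

    def predict(x0, x1):
        return 1 if w0 * x0 + w1 * x1 + bias > 0 else 0

    for _ in range(20):                    # repeated passes over the examples
        for (x0, x1), target in examples:
            error = target - predict(x0, x1)
            w0 += rate * error * x0        # nudge each weight toward
            w1 += rate * error * x1        # the correct answer
            bias += rate * error

    print(predict(1, 1), predict(1, 0))    # -> 1 0: the pattern was learned
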
Genetic algorithms are another form of AI. As the MIS textbook states, "A genetic algorithm is an artificial intelligence system that mimics the evolutionary, survival of the fittest process to generate increasingly better solutions to a problem. In other words, a genetic algorithm is an optimizing system: It finds the combination of inputs that give the best outputs" (Haag 174). As also mentioned in the MIS textbook, "An intelligent agent is software that assists you, or acts on your behalf, in performing repetitive computer-related tasks. Future intelligent agents will most likely be autonomous, acting independently, and will learn to adapt to changing circumstances" (Haag 175). An intelligent agent's tasks include searches, monitoring and surveillance, data mining, and serving as a user or personal agent.
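
The survival-of-the-fittest loop of a genetic algorithm is short enough to sketch directly. The fitness function here is a made-up stand-in; a business application would substitute its own measure of "best outputs," such as lowest cost or highest yield:

    import random

    # Evolve 8-bit inputs toward the value x that maximizes x * (255 - x).
    def fitness(bits):
        x = int("".join(map(str, bits)), 2)
        return x * (255 - x)               # best near x = 127 or 128

    def crossover(a, b):
        cut = random.randrange(1, len(a))
        return a[:cut] + b[cut:]           # splice two parent solutions

    def mutate(bits, p=0.05):
        return [b ^ 1 if random.random() < p else b for b in bits]

    population = [[random.randint(0, 1) for _ in range(8)] for _ in range(20)]
    for _ in range(50):                    # 50 generations
        population.sort(key=fitness, reverse=True)
        parents = population[:10]          # the fittest half survives
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(10)]
        population = parents + children

    print(max(population, key=fitness))    # typically decodes to 127 or 128
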
As computers become more intelligent they will require more computing ability and processing power, and the next logical step is quantum computing. Quantum computing harnesses the power of atoms and molecules, rather than silicon-based chips, to perform calculations much faster. As noted in the article "How Quantum Computers Work" by Kevin Bonsor and Jonathan Strickland, "Quantum computers aren't limited to two states; they encode information as quantum bits, or qubits, which can exist in superposition. Qubits represent atoms, ions, photons or electrons and their respective control devices that are working together to act as computer memory and a processor. Because a quantum computer can contain these multiple states simultaneously, it has the potential to be millions of times more powerful than today's most powerful supercomputers." This superposition of qubits is what gives quantum computers their inherent parallelism. According to physicist David Deutsch, this parallelism allows a quantum computer to work on a million computations at once, while your desktop PC works on one (Bonsor).
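
Superposition itself can be imitated, very slowly, on an ordinary computer by tracking a qubit's two amplitudes explicitly, as in the toy simulation below. It illustrates only the bookkeeping, not the speed; the power comes from the fact that n physical qubits carry 2^n amplitudes at once, which classical simulation cannot keep up with:

    import random
    from math import sqrt

    # Toy single-qubit simulation: state = (amplitude of 0, amplitude of 1).
    def hadamard(state):
        a, b = state                       # the Hadamard gate creates
        return ((a + b) / sqrt(2), (a - b) / sqrt(2))  # an equal superposition

    def measure(state):
        a, _ = state                       # probability of reading 0
        return 0 if random.random() < abs(a) ** 2 else 1  # is |amplitude|^2

    qubit = (1.0, 0.0)                     # starts definitely in state 0
    qubit = hadamard(qubit)                # now "0 and 1 at the same time"
    print([measure(qubit) for _ in range(10)])  # roughly half 0s, half 1s
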
The advancements and future development of artificial intelligence and quantum computing will have a great impact on mobile technology, or mobile computing. As defined in our textbook Business Driven Technology, "Mobile computing allows people to use IT without being tied to a single location" (Baltzan 180). As I discussed in the introduction, Steve Jobs, along with many other manufacturers, is betting that we will be a mobile society that wants its technology on the go: not confined to a desk, but with information on demand, anywhere and at any time, from the cloud. Cloud computing, as defined in Business Driven Technology, "refers to resources and applications hosted remotely as a shared service over the Internet" (Baltzan 459). Thanks to technological advances, cloud computing has also become cost effective for many businesses as well as for individual use, owing to advantages such as the ease of switching providers, pay-per-usage pricing, lower total cost of ownership, faster time to market, easier upgrades, and improved security.

Although no one knows for certain what the future holds, advances in artificial intelligence, in computing speed through quantum computing, and in mobile computing are all technologies that hold much promise as the future of computers.


Works Cited

Baltzan, Paige, and Amy Phillips. Business Driven Technology. 4th ed. McGraw-Hill Irwin, 2010. pp. 180, 459.

Bilton, Nick. "Apple Sounds the PC Death Knell." The New York Times, 6 June 2011. 20 June 2011 <http://bits.blogs.nytimes.com/2011/06/06/apple-really-is-slowly-killing-the-pc/>.

Bonsor, Kevin, and Jonathan Strickland. "How Quantum Computers Work." Discovery Communications, LLC. 20 June 2011 <http://computer.howstuffworks.com/quantum-computer1.htm>.

Haag, Stephen, and Maeve Cummings. Management Information Systems for the Information Age. 7th ed. McGraw-Hill Irwin, 2008. pp. 168, 174-175.

Intel. "Moore's Law." Intel Corporation. 20 June 2011 <http://download.intel.com/museum/Moores_Law/Printed_Materials/Moores_Law_2pg.pdf>.

Webopedia. "The Five Generations of Computers." 8 Oct. 2010. 20 June 2011 <http://www.webopedia.com/DidYouKnow/Hardware_Software/2002/FiveGenerations.asp>.

