
Sharat Vyas

Website: http://svyas7.wix.com/modern-computing
Website Intro Blurb
Hello, user of the World Wide Web. My name is Sharat Vyas. This is a website I created to help
you, the reader, learn about the roots of computing and where computing is headed. Take any
occupation you can think of and you will soon realize that computing is intertwined with that
job. This is why computing is so important: it has become a driver of all major industries and
has allowed those industries to increase their capabilities and efficiency.
Important Figure: Alan Turing
Alan Turing is undoubtedly one of the most important figures in computing. According to Ian
Watson, writing for Scientific American, Turing is considered the father of modern computing
because of his theoretical invention, the Turing Machine. Turing conceived of the machine to
understand the limitations of computing as well as what can be computed. His theoretical
machine paved the way for how the principles of modern computers would be shaped. It is
difficult to imagine a world without Turing's contributions.
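
To give a feel for the idea, here is a minimal sketch of a Turing-machine-style program in C.
This is not Turing's own formulation; the tape contents and the single rule below are invented
purely for illustration. The machine reads one tape cell at a time, rewrites it, and moves its
head, which is the essence of Turing's model.

#include <stdio.h>

/* Toy Turing-machine sketch: a single rule that flips 0s and 1s,
 * moving right until the blank symbol '_' is read, then halting.
 * The tape contents and the rule are illustrative only. */
int main(void) {
    char tape[] = "1011_";   /* '_' marks the blank cell */
    int head = 0;            /* position of the read/write head */

    while (tape[head] != '_') {
        tape[head] = (tape[head] == '0') ? '1' : '0';  /* write the flipped symbol */
        head++;                                        /* move the head right */
    }

    printf("Final tape: %s\n", tape);  /* prints: Final tape: 0100_ */
    return 0;
}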

Fun Fact: Turing is credited with saving the lives of approximately 14 to 21 million people by
helping to break the German Enigma code.

Important Figure: Dennis Ritchie


Dennis Ritchie had a kind of Midas touch: almost anything he worked on ended up running on UNIX
or written in C. According to the BBC, UNIX and C were two of his greatest contributions, and
both changed the landscape of computer software. UNIX is a multi-user operating system, and C is
a high-level (easier to understand) programming language. UNIX is at the heart of just about
everything, from Internet servers to every single Apple device on the face of this earth. If you
don't think Ritchie is important, just think of a world without Apple!
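
Because C still underpins so much of the software mentioned above, a tiny example may help make
it concrete. This is the traditional first program in C; the message text here is just an
example.

#include <stdio.h>

/* The classic first program in C: print one line of text to the screen. */
int main(void) {
    printf("Hello from C, the language Dennis Ritchie created.\n");
    return 0;
}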

Microprocessors Tab
A microprocessor, also known as a CPU (central processing unit), is the brain of the computer. It
allows the computer to run software. CPUs started out as towering racks of vacuum tubes and
switches. According to Tony Gaddis, author of Starting Out with C++, the CPU contains two major
parts: the control unit and the arithmetic logic unit. The control unit is tasked with
coordinating the computer's operations, while the arithmetic logic unit is tasked with doing any
mathematical calculations. The CPU carries out its work in a specific sequence of steps (a rough
sketch of this cycle follows the list):
1. Fetch- In this step, the CPU fetches the next instruction from the RAM.
2. Decode- In this step, the instruction, which is encoded as a number, is sent to the control
unit and decoded. The control unit then generates an electrical signal.
3. Execute- In this step, the electrical signal is sent to the appropriate device, whether it be
the memory, the arithmetic logic unit, or the screen.
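
To make the cycle concrete, here is a rough sketch in C. The instruction codes, the pretend
memory contents, and the single register below are invented for illustration; a real CPU is, of
course, wired in hardware rather than written as a loop.

#include <stdio.h>

/* Toy fetch-decode-execute loop. The "RAM" holds made-up instruction
 * codes: 1 = add 5, 2 = subtract 2, 0 = halt. */
int main(void) {
    int ram[] = {1, 1, 2, 0};  /* pretend program sitting in memory */
    int pc = 0;                /* program counter: which cell to fetch next */
    int result = 0;            /* stands in for a CPU register */

    for (;;) {
        int instruction = ram[pc++];      /* fetch */
        switch (instruction) {            /* decode */
            case 1: result += 5; break;   /* execute: "ALU" adds */
            case 2: result -= 2; break;   /* execute: "ALU" subtracts */
            case 0: printf("Result: %d\n", result);  /* 5 + 5 - 2 = 8 */
                    return 0;             /* halt */
        }
    }
}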
Why Is It Important?
It is due to advances in the CPU that our computers are able to run as swiftly as they do and
perform the complex operations that we ask of them. Because CPUs have gradually shrunk, we are
now able to make our laptops smaller and smaller. This is a big part of why a smartphone can do
calculations thousands of times faster than a 1960s mainframe that took up a full room and needed
days for the same task. The decrease in the size of microprocessors, along with their increase in
processing power, has paved the way for smaller laptops, multi-functional smartphones,
smartwatches, and even Apple's future smart ring.

Random Access Memory


According to Tony Gaddis, author of Starting Out with C++, there are two types of memory in a
computer. The first is random access memory (RAM), which is similar to a human's muscles:
usually, the more muscular the better. RAM is what allows your computer to do the heavy lifting
and run large applications. It is called random access because the CPU can reach any of the data
held in this memory very quickly. RAM is volatile memory, meaning that after the computer is
turned off, whatever data was in that memory is gone unless it has been saved to secondary
storage such as a disk drive. RAM is important because applications run in it; in other words,
when you run an application, you are opening it and running it in RAM.
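
As a small illustration of volatility (the variable and the file name below are made up), data
held only in a running C program's memory disappears when the program exits unless it is
explicitly written out to a file on disk:

#include <stdio.h>

/* The 'score' variable lives only in RAM and vanishes when the program
 * exits; writing it to a file puts a copy on non-volatile secondary
 * storage. "saved_score.txt" is just an example file name. */
int main(void) {
    int score = 42;  /* held in RAM only */

    FILE *out = fopen("saved_score.txt", "w");
    if (out != NULL) {
        fprintf(out, "%d\n", score);  /* now persisted on disk */
        fclose(out);
    }
    return 0;
}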

Chances are, when you go to your local Best Buy to buy a computer, the salesperson will tell you
it has 8GB or 16GB of RAM. 16GB of RAM means there is 16GB of space available in RAM for running
applications, compared to the 8GB, 6GB, or 4GB in another computer. RAM is a vital part of the
computer. Part of the reason older computers could not manage large applications or games was
that they did not have enough RAM, or the RAM was too expensive.


Fun fact: Your smartphone is thousands of times more powerful than any computer was in the
1980s!

Why Is It Important?
Advances in manufacturing technology, paired with growing big-data demands, have allowed RAM to
become significantly cheaper, smaller, and more efficient. This has given society as a whole the
ability to buy stronger computers and phones that can execute large, demanding applications.
Because of this increased efficiency, many phone and tablet manufacturers have begun to add more
and more RAM to their devices. The industry has consequently seen a shift from consumers buying
laptops to consumers buying laptop/tablet hybrids and high-end phones that can do the same tasks.

Operating Systems
Every single phone, tablet, smartwatch, and computer you own runs on a specific operating
system. According to Vangie Beal, a writer for WebOpedia, an operating system is the vital
software that supports a device's basic functions, such as running an application, getting input
from the user, and displaying output on a screen. An operating system is like a computer's
manager: its job is to make sure all the hardware and software are doing what they are supposed
to do and that tasks are happening efficiently and to the user's liking.

Operating systems began to be used in the early 1950s and were usually customized for a specific
user, typically a large company. Some of the earliest examples include the General Motors
Operating System, created in 1955, and the University of Michigan Executive System, developed in
1958.


As time wore on, the prices of computers began to fall, making them more affordable for ordinary
users. It was at this point that universally used operating systems became more and more common.
The introduction of UNIX (1969), MS-DOS (1981), and Mac System Software (1984) was the beginning
of the operating systems race. The race still continues as new operating systems are constantly
introduced, such as OS X El Capitan and Windows 10. It is hard to imagine a world without the
never-ending battle of Windows vs. Apple, which is in essence a battle of flavor and
functionality.

Why Is It Important?

Operating systems have continued to evolve as rapidly as the computer itself. As the complexity
and processing capability of the computer increase, so do the expectations placed on an operating
system. Operating systems are a crucial part of how a user interacts with a device, and their
evolution has been essential to the development of the modern computer, smartphone, and
smartwatch operating systems that we have grown to love and depend on.

Future of Computing
Bill Gates is often quoted as saying, "640K ought to be enough for anybody" (qtd. in Computer
Trade Show, 1981). The quote refers to RAM (random access memory), specifically 640 kilobytes of
it. To put that in perspective, 640 kilobytes is about 0.00064 gigabytes. Most laptops nowadays
have 8 gigabytes of RAM, which is roughly 12,500 times more than that supposed limit.
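
A quick back-of-the-envelope check of that comparison in C, using decimal units (1 GB =
1,000,000 KB; binary units give a slightly larger ratio):

#include <stdio.h>

/* Compare the oft-quoted 640 KB figure with 8 GB, using decimal units. */
int main(void) {
    double kb_then = 640.0;                  /* 640 kilobytes */
    double kb_now = 8.0 * 1000.0 * 1000.0;   /* 8 GB expressed in KB */

    printf("640 KB is %.5f GB\n", kb_then / (1000.0 * 1000.0));    /* 0.00064 GB */
    printf("8 GB is about %.0f times 640 KB\n", kb_now / kb_then); /* 12500 */
    return 0;
}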


As you can see, even industry titans like Bill Gates can make mistakes. Computing has come quite
a long way since 1980, let alone since the basic computers of the late 1940s. Not only has it
come a long way, but computing has an even brighter future. Computers continue to become more
affordable, meaning computing is becoming more and more accessible. That increase in
accessibility means it is now empowering more individuals than ever before.

Moore's Law
Gordon E. Moore is a co-founder of Intel as well as the creator of Moore's Law. In his law, Moore
observes that the number of transistors (the electrical switches that represent the 1s and 0s of
binary) in an integrated circuit tends to double approximately every two years. His law gives us
insight into the future of computing. By 2018, processors should hold about 20 billion
transistors on a chip. Alex Fitzpatrick, a writer for TIME, believes that 20 billion transistors
on a single chip is an unfathomable number. If the law holds true, our computers should be able
to handle substantially larger programs than they ever have before.
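
To see what doubling every two years looks like, here is a rough projection in C. The starting
point of about 1.25 billion transistors in 2010 is an approximate, illustrative figure for a
high-end chip of that era; if the doubling holds, the loop lands right around the 20 billion
transistors mentioned above for 2018.

#include <stdio.h>

/* Project Moore's law: the transistor count doubles every two years.
 * The 2010 starting value (~1.25 billion) is an illustrative estimate. */
int main(void) {
    double transistors = 1.25e9;
    for (int year = 2010; year <= 2018; year += 2) {
        printf("%d: ~%.2f billion transistors\n", year, transistors / 1e9);
        transistors *= 2.0;  /* double for the next two-year step */
    }
    return 0;
}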

Some scientists believe that in the coming years Moore's Law will no longer hold true, because it
will become impossible to keep doubling the number of transistors on an integrated circuit.

Perceptual Computing
Tech giant Intel has recently begun work on a new form of computing it calls perceptual
computing. Bradley Jones, a writer for WebOpedia, explains that perceptual computing simply means
your computer has the capability of reading your gestures, voice, facial expressions, and
surrounding environment. This will be made possible in part by Intel's RealSense device, which
will have finger tracking, depth perception, cameras to provide a 3D effect, and virtual reality
capabilities.
If this venture succeeds, it has the ability to completely alter the landscape of computing as we
know it. With facial recognition, the need for passwords could be removed. Users will be able to
point their device at any given object and determine its distance. The whole experience of PC
gaming will be completely altered and will become far more immersive.

Artificial Intelligence
If you asked someone what artificial intelligence is, most would likely respond "robots" or
"computers that can think." While those responses can be considered true, they do not tell the
full story. Artificial intelligence (AI) is the science of making computers do tasks that require
intelligence when done by a human. AI is versatile enough to be applied in a vast number of
fields. It is beginning to be applied in the automotive industry, where Toyota is investing
$1 billion in research toward its applications in cars.
Computing is heading down an AI route in which less and less of the workload has to be managed in
a hands-on fashion.
Artificial intelligence has continued to gain prominence since the late 1950s. Luke Muehlhauser's
research indicates that by 2040, advances in AI will be so significant that AIs will be smarter
than humans.
What Will Computers Look Like In The Future?

Scientists believe that in the future computers will continue to get smaller and that more
devices like smartwatches and smart rings will be powerful enough to perform the tasks we ask of
a computer. Paul Eng, a writer at ABC News, believes that computers will move from the keyboard
and mouse to voice. He thinks a user will be able to tell the computer what they want and the
computer will take care of it. In essence, it will be a ramped-up version of Siri with thousands
more capabilities. Eng adds that he believes these features will be available and bug-free within
20 years.

* Chang, Karl K. "John Vincent Atanasoff and the Birth of Electronic Digital Computing." John
Vincent Atanasoff. Iowa State University Department of Computer Science, n.d. Web. 10
Nov. 2015.
* Jones, Bradley L. "It's Happening Now: Perceptual Computing Is Real." WebOpedia.
WebOpedia, 26 Mar. 2015. Web. 10 Nov. 2015.
* Fitzpatrick, Alex. "Here's Why IBM's New Computer Chip Matters." TIME. Time, 9 July
2015. Web. 10 Nov. 2015.
