
Introduction to Computing and Information System (ICB10103)

Assignment 1

Title : Research on the history of information systems and their technology applications for the future.

Semester : 1 / January 2012
Student Name : Presilla Bunfat
Student ID : 52245112125
Lecturer's Name : Puan Siti Salwa Hasbullah

What is an Information System?

An Information System, commonly called an IS, can be described as a set of interrelated components that retrieve, process, store and finally distribute information among its users. An IS can also be defined as a system that assembles, stores, processes and delivers information to the users who wish to use it. An IS is mainly used by organizations, most commonly business enterprises, to run their operations. Over the years, the reliability of the underlying technology improved and there was a realization that IS had a social dimension, as many of the problems were identified as being people-related. The definition of an IS therefore became broader, and its components came to include people, data or information, procedures, software, hardware and communication. In general, an IS allows an organization to share information among its members and with the public, and to function as a whole.

People tend to have difficulty telling data and information apart because the two seem alike, even though they are not. To be exact, data is raw fact taken from various sources and cannot be fully trusted on its own. Information, on the other hand, is far more trustworthy because it has been processed and checked in various ways.

In an IS, three activities must be carried out to obtain the desired information: Input, Processing and Output, commonly called the IPO concept. The Input is data that has been provided but cannot yet be trusted because it has not been processed. After the data has been processed (Processing), the computer displays the final result, which is called the Output. This Output is then sent to the users who asked for it, and their responses to it are called feedback. However, an IS cannot be described simply as an Input-Processing-Output mechanism, because during this process the users need to be not just computer literate but also creative in designing the structure of the organization so that it can function as a whole.

Examples of information inputs would be transactions and events, which undergo processing in the form of sorting, listing, merging and updating, resulting in outputs such as detailed reports, lists and summaries. Another example would be a manufacturing environment, with information inputs such as design specifications, material requirements and standard operating procedures (SOPs). These would be processed by the information system using modeling and simulation techniques, resulting in standard production models along with the overall cost of the production process, which the information system calculates from a knowledge base containing material costs, hourly labor costs and other indirect costs, almost entirely eliminating a distinct costing function.
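To make the IPO concept more concrete, the following is a minimal, hypothetical Python sketch of the Input-Processing-Output cycle described above. The sample transactions and the sorting and summarising steps are illustrative assumptions, not part of any real system.

# Minimal, hypothetical sketch of the IPO (Input-Processing-Output) concept.
# The sample transactions and the processing steps are invented for illustration.

def collect_input():
    """Input: raw, unprocessed transaction data gathered from various sources."""
    return [
        {"item": "paper", "quantity": 10, "unit_price": 2.50},
        {"item": "ink",   "quantity": 3,  "unit_price": 15.00},
        {"item": "paper", "quantity": 5,  "unit_price": 2.50},
    ]

def process(transactions):
    """Processing: sort, merge and summarise the raw data into information."""
    totals = {}
    for t in sorted(transactions, key=lambda t: t["item"]):
        totals[t["item"]] = totals.get(t["item"], 0) + t["quantity"] * t["unit_price"]
    return totals

def produce_output(totals):
    """Output: a summary report delivered to the users who asked for it."""
    for item, total in totals.items():
        print(f"{item}: {total:.2f}")

if __name__ == "__main__":
    raw_data = collect_input()         # Input
    information = process(raw_data)    # Processing
    produce_output(information)        # Output (the users' reactions form the feedback)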
When it comes to the history of IS, most people believe that it started around 6,000 years ago, when humans learned to write and began to record things on cave walls, rocks and even books (papyrus, to be exact). One of the first devices was the abacus, which was built on the basic idea of an IS: users enter an amount (Input), calculate with the abacus (Processing), and finally obtain a result (Output). Telling other people about the result can be considered feedback. In this case, the users play an important role in the IPO concept.

The first information explosion took place between roughly 1450 and 1840, a period in which many inventors created devices that contributed to the rise of IS. The early IS gave rise to the development of written language, taxation, accountancy and banking (Steve Benson, pp. 6-7). The application of computing technology to information processing occurred at the end of the 1950s, and in the 1960s computer systems began to be used for truly commercial purposes; before that time they had been used in science and engineering, in the first phase of the information revolution. Just as the Industrial Revolution extended physical capabilities, so the information revolution extended mental capabilities, initially in the area of calculation, as problems that could take a human a lifetime to solve could be handled by a computer almost immediately. From there it was a short step to using computers to access, process and filter huge quantities of data. This did not in itself produce knowledge or insight, but it allowed mundane tasks to be automated and interesting questions to be raised and answered. Many scientists and engineers who had worked in the computer science field moved into information-related activities, bringing with them a mindset that was rigorous, logical and grounded in mathematical notation. This exerted a huge influence over the development of the discipline for many years to come.

Contrary to the speed of information today, just over forty years ago the business climate in the United States was experiencing post-war growth like it had never seen. Much of the expertise that grew the economy had been learned during World War Two, in tooling up the nation's industries to produce an effective war machine. The field that developed out of this push to win the war was Operations Research (OR). When the war ended, those involved with OR were released from government work, unleashing an experienced and highly skilled field, like no other in history, into business and industry, which launched the US into an era of prosperity and growth that lasted over twenty years. World War Two also saw the birth of the first practical computers, or Turing machines, which were responsible for cracking the German codes and giving the Allies advance warning of enemy movements. By today's standards these first practical computers were not that practical: they cost half a million dollars and were far less powerful than a pocket calculator that can now be bought for under ten dollars. However, these first computers gave operations researchers the power they needed to begin simulating larger and more complicated systems, which in business and industry greatly helped to turn capital expenditure into profitable ventures. This background in simulation, OR and new technologies gave birth to the studies that became known as Information Systems.

By the mid-sixties, IS was already forging its way into the business mainstream. While computers remained out of reach for most businesses, telecommunications made its mark with the TELEX machine. This gave businesses the ability to communicate within their own organizations anywhere in the world at any time and to pass instructions and information effectively. The use of computers in business and industry usually started in the accounting departments, as it was assumed that this area would know the most about using numerical machines; there was little understanding of how important databases could be to other areas of the business. Around this time a number of business schools began developing Management Information Systems (MIS) programs to meet the growing need for IS managers.

During the seventies, more upper managers recognized the importance of IS and the flexibility it was bringing to business. The TELEX became the standard for information transfer and the mainframe computer became the standard for database creation. As the need for organized and easy access to data became apparent, information-based businesses began moving the mainframes out from under accounting management and into their own department.

When it comes to predicting what kind of technology IS might possess five years onwards, there are many speculations. One of them is Audio Web Surfing, which is a much easier way of surfing than typing a web address on the keyboard.

Audio Web Surfing

With this technology, users' lives will become much easier. They will be able to surf the net in the blink of an eye: they only need to use their voice to tell the computer what they are looking for, and the computer will automatically search for it on the internet.
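As a rough illustration of how such voice-driven surfing could already be approximated today, here is a small Python sketch that combines the third-party SpeechRecognition package (using its Google Web Speech backend) with the standard webbrowser module. The choice of package and the search URL are assumptions for illustration only, not a description of any particular future product.

# Rough sketch of voice-driven web surfing.
# Assumes the third-party "SpeechRecognition" package and a working microphone.
import urllib.parse
import webbrowser

import speech_recognition as sr

def listen_for_query():
    """Capture a spoken command from the microphone and turn it into text."""
    recognizer = sr.Recognizer()
    with sr.Microphone() as source:
        print("Say what you are looking for...")
        audio = recognizer.listen(source)
    # Uses Google's free Web Speech API; may raise UnknownValueError or RequestError.
    return recognizer.recognize_google(audio)

def search_the_web(query):
    """Open the default browser with a search for the spoken query."""
    url = "https://www.google.com/search?q=" + urllib.parse.quote(query)
    webbrowser.open(url)

if __name__ == "__main__":
    query = listen_for_query()
    print("Searching for:", query)
    search_the_web(query)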

Multi Touch Computing or GUI/10

Other than Audio Web Surfing, Multi Touch Computing or GUI/10 is another technology that we might have in five to ten years from now. With GUI/10, users will no longer need a mouse to click on data on the computer; the mouse, as ubiquitous as it is, offers only two coordinates, while users are capable of so much more. Touch screen technology is only scratching the surface of intuitive approaches to interfacing, and there is a lot of room for growth here. The simple gesture support on the trackpad in OS X is already so good that using a gestureless laptop becomes unthinkable.

Artificial Intelligence (AI)

The term Artificial Intelligence was coined in 1956 by John McCarthy at the Massachusetts Institute of Technology. It is a branch of computer science that aims to make computers behave like humans. Artificial Intelligence includes programming computers to make decisions in real-life situations (for example, some expert systems help physicians diagnose diseases based on symptoms), programming computers to understand human languages (natural language processing), programming computers to play games such as chess and checkers (game playing), programming computers to hear, see and react to other sensory stimuli (robotics), and designing systems that mimic human intelligence by attempting to reproduce the types of physical connections between neurons in the human brain (neural networks). Natural-language processing would allow ordinary people who do not have any knowledge of programming languages to interact with computers.

Through nanotechnology, computing devices are becoming progressively smaller and more powerful, and everyday devices with embedded technology and connectivity are becoming a reality. Nanotechnology has led to the creation of increasingly smaller and faster computers that can be embedded into small devices. This has led to the idea of pervasive computing, which aims to integrate software and hardware into all man-made and some natural products. It is predicted that almost any item, such as clothing, tools, appliances, cars, homes, coffee mugs and even the human body, will be embedded with chips that connect the device to an infinite network of other devices. Hence, in the future, network technologies will be combined with wireless computing, voice recognition, Internet capability and artificial intelligence, with the aim of creating an environment where the connectivity of devices is embedded in such a way that it is not inconvenient or outwardly visible and is always available. In this way, computer technology will saturate almost every facet of our lives. What seems like virtual reality at the moment will become the human reality in the future of computer technology.
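The expert-system idea mentioned above can be illustrated with a very small, hypothetical rule-based sketch in Python. The symptom rules and diagnoses below are invented for illustration only and are not medical advice.

# Tiny, hypothetical rule-based "expert system" sketch.
# The symptom rules and diagnoses are invented for illustration only.

RULES = [
    ({"fever", "cough", "sore throat"}, "possible flu"),
    ({"sneezing", "runny nose"},        "possible common cold"),
    ({"rash", "itching"},               "possible allergic reaction"),
]

def diagnose(symptoms):
    """Return every diagnosis whose rule is fully matched by the given symptoms."""
    symptoms = set(symptoms)
    matches = [diagnosis for required, diagnosis in RULES if required <= symptoms]
    return matches or ["no rule matched; consult a physician"]

if __name__ == "__main__":
    print(diagnose(["fever", "cough", "sore throat", "headache"]))
    # -> ['possible flu']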

As a conclusion, while reading through the history of what IS really is, I have realized that IS has done many things and helps a lot of people in their daily lives. Users tend to use IS more for expanding business and industry into the global market than for anything else in history. As we can see nowadays, IS plays a very important role in many things. As proof, the backbone of IS is known as the World Wide Web (WWW), the Internet, or within a business a Local Area Network (LAN), along with a list of acronym buzzwords such as EDI, EIS, ERP, SCM and a host of others, describing new ways in which IS can be employed to grow an organization. By predicting what kind of technology we might have in five to ten years, we can see that the main purpose of all these predictions is to help users in their daily lives.
