
ASSIGNMENT

Course Code : MS - 07
Course Title : Information Systems for Managers
Assignment Code : MS-07/TMA/SEM - II/2016
Coverage : All Blocks
Note: Attempt all the questions and submit this assignment on or before 31st October, 2016
to the coordinator of your study centre.
1. Describe the management functions at various levels in the context of relationships
between management and information needs.
2. Describe the role of transaction processing system in management functions. Briefly
mention the subsystems of operations management.
3. How do information systems contribute to Total Quality Management (TQM)? Justify
your answer with examples.
4. Why is data mining being put to use in more and more businesses? Also, briefly mention
some popular data mining techniques.
5. Write short notes on:
a) Hyper Text Transfer Protocol (HTTP) & File Transfer Protocol (FTP)
b) Business software solutions provided by Microsoft
c) Features of JAVA

Answer
1. Describe the management functions at various levels in the context of relationships
between management and information needs.
Ans.: Information management is the application of management principles to the acquisition,
organization, control, dissemination and use of information relevant to the effective operation
of organizations of all kinds. 'Information' here refers to all types of information of value,
whether originating inside or outside the organization, including data resources, such as
production data; records and files relating, for example, to the personnel function; market
research data; and competitive intelligence from a wide range of sources. Information
management deals with the value, quality, ownership, use and security of information in the
context of organizational performance.
The 21st century has brought with it a new workplace, one in which everyone must adapt to a
rapidly changing society with constantly shifting demands and opportunities. The economy
has become global and is driven by innovation and technology, and organizations have to
transform themselves to serve new customer expectations. Today’s economy presents
challenging opportunities as well as dramatic uncertainty. The new economy has become
knowledge-based and is performance-driven. The themes in the present context are respect,
participation, empowerment, teamwork and self-management. In the light of these
challenges, a new kind of leader is needed to guide business through turbulence. Managers in
organizations perform this task.
A manager is someone who coordinates and oversees the work of other people so that
organizational goals can be accomplished. It is not about personal achievement but helping
others do their job. Managers may also have additional work duties not related to
coordinating the work of others.
An organization is a deliberate arrangement of people to accomplish some specific purpose.
Organizations share three common characteristics: (1) Each has a distinct purpose (2) Each is
composed of people (3) Each develops some deliberate structure so members can do their
work. Although these three characteristics are important in defining what an organization is,
the concept of an organization is changing. The characteristics of new organizations today
include flexible work arrangements, employee work teams, open communication systems,
and supplier alliances. Organizations are becoming more open, flexible, and responsive to
changes. Organizations are changing because the world around them has changed and is
continuing to change. These societal, economic, global, and technological changes have
created an environment in which successful organizations must embrace new ways of getting
their work done.
The importance of studying management in today’s dynamic global environment can be
explained by looking at the universality of management, the reality of work, and the rewards
and challenges of being a manager.
A further difficulty in defining information management arises out of the often synonymous
use of the term information resource (or resources) management (IRM), the term used by the
US National Commission on Federal Paperwork in its report (1977), where 'paperwork',
including electronic documents of all kinds, was defined as constituting the information in
IRM. This usage appears to limit the idea of IRM, but the report goes on to say that an IRM
function (in US government agencies) would incorporate a wide range of disparate activities,
including records management, library management, computer systems, printing and
reprography, microforms and word-processing centres. Schneyman (1985) elaborates on this
definition of IRM to cover five types of 'information resources': systems support, including
computers and telecommunications; processing data, images, etc.; conversion and
transformation, including reprographics; distribution and communication, including network
management and telecommunications; and, finally, retention, storage and retrieval, which
covers libraries, record centres, filing systems, and internal and external databases. He adds
that, 'IRM supports IM by providing the technical capability and overall guidance for IM to
do its job', which he defines as managing the ownership, content, quality and use of
information.
2. Describe the role of transaction processing system in management functions. Briefly
mention the subsystems of operations management.
Ans.: A transaction is an elementary activity conducted during business operations.
Transaction processing systems (TPS) process the company's business transactions and thus
support the operations of an enterprise. A TPS records a non-inquiry transaction itself, as well
as all of its effects, in the database and produces documents relating to the transaction. TPS
are necessary to conduct business in almost any organization today. TPSs bring data into the
organizational databases; these systems are also the foundation on which management-oriented
information systems rest.
Consider for a moment all the events that take place on a daily basis in an organization. Let's
take an electronics store as an example. A single store can easily carry 10,000 different items.
Throughout the day, customers come into the store, select a product and pay for it at the
checkout counter. Staff is continuously taking items from the stock room and placing them on
the shelves. When the stock runs low, new shipments are ordered. Other customers come in
to exchange items and deal with warranty issues.
All of these events are referred to as transactions, and keeping track of them requires a
transaction processing system. A transaction processing system, or TPS, is a system to
capture and process the detailed information necessary to update data on the fundamental
operations of an organization.
A transaction is essentially a single event that changes something. There are many different
types of transactions. For example, customer orders, receipts, invoices, payments, etc. The
actual processing of transactions includes the collection, editing, manipulation and storage of
data. The result of processing a transaction is that the records of an organization are updated
to reflect the new conditions at the time of the last processed transaction.
Consider the example of the electronics store. A customer buys a video game and pays for it
with cash at the register. This event is recorded as a sale transaction. However, it also triggers
other transactions.
First, the amount of cash at the register has just gone up. Second, the inventory of the
particular video game has gone down by one. These transactions are logically linked - they
occur on the same day at the same time and involve the same item. Linking the transactions
provides improved data consistency since one cannot exist without the other. The amount of
cash in the register cannot go up unless some transaction makes this happen.
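To make the linked-transaction idea concrete, the following is a minimal Java sketch (the class, item names and figures are invented for illustration, not taken from any particular TPS product). It records a sale and applies its two linked effects, the cash increase and the inventory decrease, together so the records stay consistent; a real TPS would do this inside an atomic database transaction rather than on in-memory data.

```java
// Minimal illustrative sketch of a TPS-style sale transaction.
// All names and values are hypothetical.
import java.util.HashMap;
import java.util.Map;

public class SaleTransactionDemo {

    private final Map<String, Integer> inventory = new HashMap<>(); // item -> units in stock
    private double cashInRegister = 0.0;

    // Records the sale and its linked effects in one step:
    // inventory goes down by one, cash in the register goes up by the price.
    public void processSale(String item, double price) {
        Integer stock = inventory.get(item);
        if (stock == null || stock <= 0) {
            throw new IllegalStateException("Item out of stock: " + item);
        }
        inventory.put(item, stock - 1);
        cashInRegister += price;
        System.out.println("SALE recorded: " + item + " at " + price);
    }

    public static void main(String[] args) {
        SaleTransactionDemo register = new SaleTransactionDemo();
        register.inventory.put("video game", 5);
        register.processSale("video game", 49.99);
        System.out.println("Stock left: " + register.inventory.get("video game")
                + ", cash in register: " + register.cashInRegister);
    }
}
```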
There are many different types of transaction processing systems, such as payroll, inventory
control, order entry, accounts payable, accounts receivable and others. Transaction processing
produces valuable input into many other systems in an organization, such as management
information systems and decision support systems. A TPS serves as the foundation for these
other systems. A TPS tracks routine operations but does not provide much support for
decision making.
For example, in the case of a bank account, a TPS keeps track of all the events associated
with a single account: deposits, withdrawals, transfers, fees, interest paid, etc. This provides a
good description of the account activity.
Now let's say the customer comes into the bank and requests a car loan. The account activity
is useful information but not enough for the bank to make a decision on the car loan. This
requires combining information from different sources and analyzing the financial profile of
the customer.

3. How do information systems contribute to Total Quality Management (TQM)? Justify
your answer with examples.
Ans.: “Total quality management” as a term to describe an organization's quality policy and
procedures has fallen out of favor as international standards for quality management have been
developed. When planning and implementing a total quality management system or quality
management strategy, there is no one solution for every situation. Each organization is unique
in terms of its culture, management practices, and the processes used to create and deliver its
products and services. The quality management strategy will therefore vary from organization
to organization; however, a set of primary elements should be present in some form.
The word "system" has been in use as a concept since the last century and has quickly spread
into wide use across many aspects of life: we speak of the economic system, the political
system and the information system. A system can be defined as "a set of interrelated elements
that interact together to achieve common goals and specific objectives", or as a set of
components or parts that are integrated and governed by particular relations and mechanisms
of action within a specific scope in order to achieve a particular goal. An information system
is a type of system designed to provide managers in the organization with the information
necessary for planning, organizing, directing and controlling the activity of the organization,
or to help them make decisions.
Information technology for Total Quality Management has been implemented extensively in
most organizations and has been widely researched. Many organizations are providing better
products and services by introducing information technology into Total Quality Management.
Global competition has enhanced the role of quality in the business world, and this
competition is adding pressure on organizations. These challenges and pressures have placed
a renewed focus on quality improvement for the long-term survival of the organization.
Technology acts as an enabling mechanism, which results in enriched jobs and increased job
satisfaction. TQM is a philosophy of management and a set of customer-centric practices for
delivering quality. TQM principles, practices and techniques can be applied to all functions
within an organization, including information systems, marketing, finance, and research and
development.
Total quality management is a management approach that originated in the 1950s and has
steadily become more popular since the early 1980s. Total Quality is a description of the
culture, attitude and organization of a company that strives to provide customers with
products and services that satisfy their needs. The culture requires quality in all aspects of the
company’s operations, with processes being done right the first time and defects and waste
eradicated from operations.
Total Quality (TQ) constitutes a new managerial method, a new system of values, new
priorities as a basis of decision making, a new way of managing human resources, a new and
more concrete approach to problem solving, as well as many other concepts. In other words,
TQM is a management approach for improving organizational performance that encompasses
a variety of topics, both technical and behavioral. Deming, for example, prescribed TQM in
14 points, which he claimed to be a set of principles for remaining competitive in providing
products and services (Deming, 1986). Anderson (1994) studied Deming's principles and
developed a conceptual framework for TQM using seven concepts, which include visionary
leadership, internal and external communication, learning, process management, continuous
improvement, employee fulfillment and customer satisfaction.
4. Why is data mining being put to use in more and more businesses? Also, briefly mention
some popular data mining techniques.
Ans.: Generally, data mining (sometimes called data or knowledge discovery) is the process
of analyzing data from different perspectives and summarizing it into useful information -
information that can be used to increase revenue, cut costs, or both. Data mining software is
one of a number of analytical tools for analyzing data. It allows users to analyze data from
many different dimensions or angles, categorize it, and summarize the relationships
identified. Technically, data mining is the process of finding correlations or patterns among
dozens of fields in large relational databases.
Although data mining is a relatively new term, the technology is not. Companies have used
powerful computers to sift through volumes of supermarket scanner data and analyze market
research reports for years. However, continuous innovations in computer processing power,
disk storage, and statistical software are dramatically increasing the accuracy of analysis
while driving down the cost.
For example, one Midwest grocery chain used the data mining capacity of Oracle software to
analyze local buying patterns. They discovered that when men bought diapers on Thursdays
and Saturdays, they also tended to buy beer. Further analysis showed that these shoppers
typically did their weekly grocery shopping on Saturdays. On Thursdays, however, they only
bought a few items. The retailer concluded that they purchased the beer to have it available
for the upcoming weekend. The grocery chain could use this newly discovered information in
various ways to increase revenue. For example, they could move the beer display closer to
the diaper display. And, they could make sure beer and diapers were sold at full price on
Thursdays.
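The diapers-and-beer pattern is the classic association-rule example. As a rough illustration of the arithmetic involved, the Java sketch below computes support and confidence for the rule "diapers -> beer" over a tiny, made-up set of baskets; real market-basket analysis runs the same counts over millions of scanner records.

```java
// Illustrative support/confidence calculation for the rule "diapers -> beer".
// The baskets are invented sample data, not real scanner records.
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class MarketBasketDemo {
    public static void main(String[] args) {
        List<Set<String>> baskets = Arrays.asList(
                new HashSet<>(Arrays.asList("diapers", "beer", "chips")),
                new HashSet<>(Arrays.asList("diapers", "beer")),
                new HashSet<>(Arrays.asList("diapers", "milk")),
                new HashSet<>(Arrays.asList("bread", "beer")));

        long withDiapers = baskets.stream().filter(b -> b.contains("diapers")).count();
        long withBoth = baskets.stream()
                .filter(b -> b.contains("diapers") && b.contains("beer")).count();

        double support = (double) withBoth / baskets.size();   // how often both appear together
        double confidence = (double) withBoth / withDiapers;   // how often beer appears given diapers

        System.out.printf("support = %.2f, confidence = %.2f%n", support, confidence);
        // Prints: support = 0.50, confidence = 0.67 for this sample data
    }
}
```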
Data mining derives its name from the similarities between searching for valuable business
information in a large database — for example, finding linked products in gigabytes of store
scanner data — and mining a mountain for a vein of valuable ore. Both processes require
either sifting through an immense amount of material, or intelligently probing it to find
exactly where the value resides. Given databases of sufficient size and quality, data mining
technology can generate new business opportunities by providing these capabilities:
Automated prediction of trends and behaviors. Data mining automates the process of finding
predictive information in large databases. Questions that traditionally required extensive
hands-on analysis can now be answered directly from the data — quickly. A typical example
of a predictive problem is targeted marketing. Data mining uses data on past promotional
mailings to identify the targets most likely to maximize return on investment in future
mailings. Other predictive problems include forecasting bankruptcy and other forms of
default, and identifying segments of a population likely to respond similarly to given events.

Automated discovery of previously unknown patterns. Data mining tools sweep through
databases and identify previously hidden patterns in one step. An example of pattern
discovery is the analysis of retail sales data to identify seemingly unrelated products that are
often purchased together. Other pattern discovery problems include detecting fraudulent
credit card transactions and identifying anomalous data that could represent data entry keying
errors.
Data mining techniques can yield the benefits of automation on existing software and
hardware platforms, and can be implemented on new systems as existing platforms are
upgraded and new products developed. When data mining tools are implemented on high
performance parallel processing systems, they can analyze massive databases in minutes.
Faster processing means that users can automatically experiment with more models to
understand complex data. High speed makes it practical for users to analyze huge quantities
of data. Larger databases, in turn, yield improved predictions.
Databases can be larger in both depth and breadth:
- More columns. Analysts must often limit the number of variables they examine when doing
hands-on analysis due to time constraints. Yet variables that are discarded because they seem
unimportant may carry information about unknown patterns. High performance data mining
allows users to explore the full depth of a database, without preselecting a subset of variables.
- More rows. Larger samples yield lower estimation errors and variance, and allow users to
make inferences about small but important segments of a population.
A recent Gartner Group Advanced Technology Research Note listed data mining and
artificial intelligence at the top of the five key technology areas that "will clearly have a
major impact across a wide range of industries within the next 3 to 5 years." Gartner also
listed parallel architectures and data mining as two of the top 10 new technologies in which
companies will invest during the next 5 years. According to a recent Gartner HPC Research
Note, "With the rapid advance in data capture, transmission and storage, large-systems users
will increasingly need to implement new and innovative ways to mine the after-market value
of their vast stores of detail data, employing MPP [massively parallel processing] systems to
create new sources of business advantage (0.9 probability)."
The technique that is used to perform these feats in data mining is called modeling. Modeling
is simply the act of building a model in one situation where you know the answer and then
applying it to another situation where you don't. For instance, if you were looking for a sunken
Spanish galleon on the high seas the first thing you might do is to research the times when
Spanish treasure had been found by others in the past. You might note that these ships often
tend to be found off the coast of Bermuda and that there are certain characteristics to the
ocean currents, and certain routes that have likely been taken by the ship’s captains in that
era. You note these similarities and build a model that includes the characteristics that are
common to the locations of these sunken treasures. With these models in hand you sail off
looking for treasure where your model indicates it most likely might be given a similar
situation in the past. Hopefully, if you've got a good model, you find your treasure.

This act of model building is thus something that people have been doing for a long time,
certainly before the advent of computers or data mining technology. What happens on
computers, however, is not much different than the way people build models. Computers are
loaded up with lots of information about a variety of situations where an answer is known and
then the data mining software on the computer must run through that data and distill the
characteristics of the data that should go into the model. Once the model is built it can then be
used in similar situations where you don't know the answer. For example, say that you are the
director of marketing for a telecommunications company and you'd like to acquire some new
long distance phone customers. You could just randomly go out and mail coupons to the
general population - just as you could randomly sail the seas looking for sunken treasure. In
neither case would you achieve the results you desire, and of course you have the
opportunity to do much better than random: you could use your business experience stored
in your database to build a model.
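As a small sketch of this model-building idea (the attributes and customer data are purely hypothetical), the Java example below "learns" from customers whose response to a past mailing is known and then predicts the likely response of a new customer using the nearest known case. This is one of the simplest possible models; it stands in for the far richer techniques a real data mining tool would apply.

```java
// Hypothetical model-building sketch: predict whether a new customer will respond
// to a mailing by looking at the most similar customer with a known answer
// (a 1-nearest-neighbour "model"). All data here is invented.
public class SimpleModelDemo {

    // Known customers: {monthly spend, years as customer} and whether they responded.
    private static final double[][] KNOWN = { {80, 5}, {20, 1}, {60, 4}, {10, 2} };
    private static final boolean[] RESPONDED = { true, false, true, false };

    // The "model": return the answer of the closest known customer.
    static boolean predict(double spend, double years) {
        int best = 0;
        double bestDistance = Double.MAX_VALUE;
        for (int i = 0; i < KNOWN.length; i++) {
            double d = Math.pow(KNOWN[i][0] - spend, 2) + Math.pow(KNOWN[i][1] - years, 2);
            if (d < bestDistance) {
                bestDistance = d;
                best = i;
            }
        }
        return RESPONDED[best];
    }

    public static void main(String[] args) {
        // Score a new customer the model has never seen.
        System.out.println("Likely responder? " + predict(70, 3));
    }
}
```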
5. Write short notes on:
a) Hyper Text Transfer Protocol (HTTP) & File Transfer Protocol (FTP)
Ans.: HTTP and FTP are both network protocols for file transfer. HTTP is short for Hyper
Text Transfer Protocol, and FTP is short for File Transfer Protocol. Both use TCP
(Transmission Control Protocol) to transfer files.
The difference between the two is that HTTP is a protocol used by the World Wide Web that
allows the transfer of files from a web server to a user’s web browser for viewing web pages
on the Internet, while the FTP protocol is used to transfer files to and from
an FTP server. FTP facilitates the transfer of files from one computer to another.
HTTP transfers only web page content to the browser for viewing; the transferred content is
not normally saved as a file. FTP, on the other hand, transfers the whole file to another
computer, where it is saved. HTTP does not require a password and user name to access the
server to transfer files, whereas the FTP protocol requires authentication.
HTTP is faster and more efficient for transferring smaller files, while FTP is faster and more
efficient in transferring larger files. HTTP is able to use a single connection to transfer
multiple files, while FTP requires a new connection to be created with each file transfer.
Both HTTP and FTP can be used for transferring files. One difference, from a user's
standpoint, is that HTTP uses port 80, while FTP uses port 21 for its command channel and
port 20 or a random port for its data channel, depending on the mode being used, i.e.
active or passive (Active vs. Passive FTP Simplified). For this reason, HTTP connections are
much easier to configure on firewalls.
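As an illustration of the HTTP side, the short Java sketch below downloads a page with the standard library over TCP port 80; the address is just a placeholder. An FTP transfer would instead open a control connection on port 21, authenticate, and move the file over a separate data connection.

```java
// Minimal HTTP GET using the JDK's standard networking classes.
// The URL is a placeholder; substitute any reachable HTTP resource.
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class HttpFetchDemo {
    public static void main(String[] args) throws Exception {
        URL url = new URL("http://example.com/");
        HttpURLConnection connection = (HttpURLConnection) url.openConnection();
        connection.setRequestMethod("GET"); // request the resource over TCP port 80

        System.out.println("HTTP status: " + connection.getResponseCode());
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(connection.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line); // content returned by the web server
            }
        }
        connection.disconnect();
    }
}
```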
b) Business software solutions provided by Microsoft
Ans.: Customer Relationship Management is a fast growing segment of Enterprise Software.
It is a strategic market for Microsoft that offers tremendous growth opportunity. Microsoft
CRM delivers best-of-breed user experiences and leverages Microsoft technologies to
transform the way customers manage their businesses. It is built by an agile team that
delivers in a fast-paced cloud services environment. CRM is one of the first Microsoft
products to offer a complete cloud service, and delivers true ‘Power of Choice’ by offering both cloud and on-
premises solutions via Outlook, web browser and mobile clients. CRM is one of the fastest
growing business applications from Microsoft and the fastest growing CRM solution in the
industry.
The Dynamics CRM team at MSIDC develops Sales, Customer Service and Marketing
applications. Along with applications, the Dynamics CRM team at IDC is working on a next
generation mobile enterprise application platform. The team has also contributed to the
development of integration with SharePoint for document collaboration scenarios in CRM. In
the latest release of Dynamics CRM, the MSIDC team has worked on the next-generation user
experiences in the Sales & Customer Service applications.
c) Features of JAVA
Ans.: The Java programming language was originally developed by Sun Microsystems. The
project was initiated by James Gosling, and Java was released in 1995 as a core component of
Sun Microsystems' Java platform (Java 1.0 [J2SE]).
The latest release of the Java Standard Edition is Java SE 8. With the advancement of Java
and its widespread popularity, multiple configurations were built to suit various types of
platforms, e.g. J2EE for enterprise applications and J2ME for mobile applications.
The new J2 versions were renamed as Java SE, Java EE and Java ME respectively. Java is
guaranteed to be Write Once, Run Anywhere.
Java has many features, which are also known as Java buzzwords. The features listed below
are simple and easy to understand; a small example illustrating a few of them follows the list.
- Simple
- Object-Oriented
- Platform independent
- Secured
- Robust
- Architecture neutral
- Portable
- Dynamic
- Interpreted
- High Performance
- Multithreaded
- Distributed
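A small example can make a few of these buzzwords concrete. The sketch below (illustrative only) uses the Thread class to show the multithreaded feature, an ordinary class implementing an interface for the object-oriented feature, and the same compiled bytecode reporting whichever operating system it happens to run on, which reflects the platform-independence promise behind "Write Once, Run Anywhere".

```java
// Illustrative sketch of a few Java features: object-oriented, multithreaded,
// and platform independent (the same bytecode runs on any JVM).
public class FeaturesDemo implements Runnable {

    private final String name;

    FeaturesDemo(String name) {
        this.name = name;
    }

    @Override
    public void run() {
        // Each thread reports the operating system the shared bytecode is running on.
        System.out.println(name + " running on " + System.getProperty("os.name"));
    }

    public static void main(String[] args) {
        new Thread(new FeaturesDemo("worker-1")).start(); // multithreading via java.lang.Thread
        new Thread(new FeaturesDemo("worker-2")).start();
    }
}
```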