
August 2009

Bachelor of Science in Information Technology (BScIT) – Semester 4


BT0047 – Management Information System – 2 Credits
(Book ID: B0048)
Assignment Set – 1 (30 Marks)

Answer all questions 5 x 6 = 30

Ques 1. Define MIS and its objectives. What are the characteristics of MIS?

Ans: A management information system (MIS) is a subset of the overall internal
controls of a business, covering the application of people, documents, technologies,
and procedures by management accountants to solve business problems such as
costing a product, service or a business-wide strategy. Management information
systems are distinct from regular information systems in that they are used to
analyze other information systems applied in operational activities in the
organization. Academically, the term is commonly used to refer to the group of
information management methods tied to the automation or support of human
decision making, e.g. Decision Support Systems, Expert systems, and Executive
information systems. It has been described as, "MIS 'lives' in the space that
intersects technology and business. MIS combines tech with business to get people
the information they need to do their jobs better/faster/smarter. Information is the
lifeblood of all organizations - now more than ever. MIS professionals work as systems
analysts, project managers, systems administrators, etc., communicating directly with
staff and management across the organization."

Definition
An 'MIS' is a planned system of collecting, processing, storing and disseminating
data in the form of information needed to carry out the functions of management. In a
way, it is a documented report of the activities that were planned and executed.
According to Philip Kotler "A marketing information system consists of people,
equipment, and procedures to gather, sort, analyze, evaluate, and distribute needed,
timely, and accurate information to marketing decision makers."

The terms MIS and information system are often confused. Information systems
include systems that are not intended for decision making. The area of study called
MIS is sometimes referred to, in a restrictive sense, as information technology
management. That area of study should not be confused with computer science. IT
service management is a practitioner-focused discipline. MIS has also some
differences with Enterprise Resource Planning (ERP) as ERP incorporates elements
that are not necessarily focused on decision support. Professor Allen S. Lee states
that "...research in the information systems field examines more than the
technological system, or just the social system, or even the two side by side; in
addition, it investigates the phenomena that emerge when the two interact."
An MIS provides the following advantages.

1. It facilitates planning: MIS improves the quality of plans by providing relevant
information for sound decision-making. Due to the increase in the size and complexity
of organizations, managers have lost personal contact with the scene of operations.

2. It minimizes information overload: MIS condenses large amounts of data into
summarized form and thereby avoids the confusion which may arise when managers
are flooded with detailed facts.

3. MIS encourages decentralization: Decentralization of authority is possible when
there is a system for monitoring operations at lower levels. MIS is successfully used
for measuring performance and making necessary changes in organizational plans
and procedures.

4. It brings coordination: MIS facilitates the integration of specialized activities by
keeping each department aware of the problems and requirements of other
departments. It connects all decision centers in the organization.

5. It makes control easier: MIS serves as a link between managerial planning and
control. It improves the ability of management to evaluate and improve performance.
The use of computers has increased data processing and storage capabilities and
reduced costs.

6. MIS assembles, processes, stores, retrieves, evaluates and disseminates
information.
Characteristics of a Management Information System

a) Provides reports with fixed and standard formats


b) Hard-copy and soft-copy reports
c) Uses internal data stored in the computer system
d) End users can develop custom reports
e) Requires formal requests from users
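The first three characteristics above can be illustrated with a small sketch: a fixed, standard report format generated from internal data stored in the system. The record fields and layout here are illustrative assumptions, not from the text.

```python
# Hypothetical MIS-style report: fixed, standard format built from
# internal records (the fields and departments below are made up).

records = [
    {"dept": "Sales",     "budget": 50000, "actual": 61250},
    {"dept": "Marketing", "budget": 30000, "actual": 27400},
]

def fixed_format_report(rows):
    # Fixed column widths give every run of the report the same layout.
    lines = [f"{'Department':<12}{'Budget':>10}{'Actual':>10}{'Variance':>10}"]
    for r in rows:
        variance = r["actual"] - r["budget"]
        lines.append(
            f"{r['dept']:<12}{r['budget']:>10}{r['actual']:>10}{variance:>10}"
        )
    return "\n".join(lines)

print(fixed_format_report(records))
```

The same function could emit the report as a hard copy (print) or soft copy (file), matching characteristic b).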

Management Information Systems for Competitive Advantage


a)Provides support to managers as they work to achieve corporate goals
b)Enables managers to compare results to established company goals and identify
problem areas and opportunities for improvement

MIS and Web Technology

a) Data may be made available from management information systems on a
company’s intranet
b) Employees can use browsers on their PCs to gain access to the data

Functional Aspects
a)MIS is an integrated collection of functional information systems, each
supporting particular functional areas.
Ques 2. Explain data processing.

Computer data processing is any process that uses a computer program to enter data
and summarize, analyze, or otherwise convert data into usable information. The
process may be automated and run on a computer. It involves recording, analyzing,
sorting, summarizing, calculating, disseminating and storing data. Because data are
most useful when well-presented and actually informative, data-processing systems
are often referred to as information systems. Nevertheless, the terms are roughly
synonymous, performing similar conversions; data-processing systems typically
manipulate raw data into information, and likewise information systems typically take
raw data as input to produce information as output.

Data processing may or may not be distinguished from data conversion, when the
process is merely to convert data to another format, and does not involve any data
manipulation.
Data analysis

When the domain from which the data are harvested is a science or an engineering
field, data processing and information systems are considered terms that are too
broad, and the more specialized term data analysis is typically used. This is a focus
on the highly specialized and highly accurate algorithmic derivations and statistical
calculations that are less often observed in the typical general business environment.
In these contexts data analysis packages like DAP, gretl or PSPP are often used. This
divergence of culture is exhibited in the typical numerical representations used in
data processing versus those used in data analysis: data processing's measurements
are typically represented by integers or by fixed-point or binary-coded decimal
representations of numbers, whereas the majority of data analysis's measurements
are often represented by floating-point representations of rational numbers.

Processing

Practically all naturally occurring processes can be viewed as examples of data
processing systems where "observable" information in the form of pressure, light,
etc. is converted by human observers into electrical signals in the nervous system as
the senses we recognize as touch, sound, and vision. Even the interaction of
non-living systems may be viewed in this way as rudimentary information processing
systems. Conventional usage of the terms data processing and information systems
restricts their use to refer to the algorithmic derivations, logical deductions, and
statistical calculations that recur perennially in general business environments,
rather than in the more expansive sense of all conversions of real-world
measurements into real-world information in, say, an organic biological system or
even a scientific or engineering system.

Elements of data processing


In order to be processed by a computer, data must first be converted into a machine-
readable format. Once data is in digital format, various procedures can be applied to
the data to get useful information. Data processing may involve various processes,
including:

•Data acquisition
•Data entry
•Data cleaning
•Data coding
•Data transformation
•Data translation
•Data summarization
•Data aggregation
•Data validation
•Data tabulation
•Statistical analysis
•Computer graphics
•Data warehousing
•Data mining
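Several of the steps in the list above (data entry, cleaning, validation, transformation, summarization) can be sketched as a tiny pipeline. The field values and validation rules are illustrative assumptions:

```python
# Minimal data-processing pipeline sketch (illustrative values and rules).

raw_entries = ["  42 ", "17", "abc", "-5", "88"]   # data entry (as text)

def clean(values):
    # Data cleaning: strip stray whitespace introduced at entry time.
    return [v.strip() for v in values]

def validate(values):
    # Data validation: keep only entries that look like integers.
    return [v for v in values if v.lstrip("-").isdigit()]

def transform(values):
    # Data transformation: convert text into a processable numeric type.
    return [int(v) for v in values]

def summarize(values):
    # Data summarization: condense detail into a few figures.
    return {"count": len(values), "total": sum(values),
            "mean": sum(values) / len(values)}

numbers = transform(validate(clean(raw_entries)))
print(summarize(numbers))
```

The point of the staged design is that each step in the list corresponds to one small, testable function.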

Ques 3. What are the recent developments in database technology?


A database is an integrated collection of logically related records or files consolidated
into a common pool that provides data for one or more uses. One way of
classifying databases involves the type of content, for example: bibliographic, full-
text, numeric, image. Other classification methods start from examining database
models or database architectures: see below.

Software organizes the data in a database according to a database model. As of
2009 the relational model occurs most commonly. Other models, such as the
hierarchical model and the network model, use a more explicit representation of
relationships.

Recent developments in database technology

During the past three decades, the database technology for information systems has
undergone four generations of evolution, and the fifth generation database
technology is currently under development. The transition from one generation to the
next has always been necessitated by the ever-increasing complexity of database
applications and the cost of implementing, maintaining, and extending these
applications. The first generation was file systems, such as ISAM and VSAM. The
second generation was hierarchical database systems, such as IMS and System 2000.
The third generation was CODASYL database systems, such as IDS, TOTAL, ADABAS,
IDMS, etc. The second and third generation systems realized the sharing of an
integrated database among many users within an application environment. The lack
of data independence and the tedious navigational access to the database gave rise
to the fourth-generation database technology, namely relational database
technology. Relational database technology is characterized by the notion of a
declarative query. Fifth-generation database technology will be characterized by a
richer data model and a richer set of database facilities necessary to meet the
requirements of applications beyond the business data-processing applications for
which the first four generations of database technology have been developed.

The transition from one generation to the next of the database technology has been
marked by the offloading of some tedious and repetitive bookkeeping functions from
the applications into the database system. This has made it easy for the application
programmers to program database application; however, it made the performance of
database systems a major problem, and required considerable research and
development to increase the performance of the new generation database systems to
an acceptable level. This point was particularly true with the transition into the area
of relational databases. The introduction of declarative queries in relational databases
relieved application programmers of the tedious chore of programming navigational
retrieval of records from the database. However, a major new component, namely the
query optimizer, had to be added to the database system to automatically arrive at
an optimal plan for executing any given query, such that the plan will make use of
appropriate access methods available in the system.
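The declarative style described above can be illustrated with Python's built-in sqlite3 module: the program states what rows it wants, and the engine's query optimizer decides how to fetch them, for instance whether to use the index created below. The table and data are illustrative.

```python
# Declarative query sketch using Python's standard-library sqlite3.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE parts (id INTEGER, name TEXT, weight REAL)")
conn.executemany("INSERT INTO parts VALUES (?, ?, ?)",
                 [(1, "bolt", 0.1), (2, "gear", 2.5), (3, "axle", 4.0)])
conn.execute("CREATE INDEX idx_weight ON parts (weight)")

# Declarative: no record-at-a-time navigation, no access path chosen
# by the programmer -- the optimizer picks the execution plan.
heavy = conn.execute(
    "SELECT name FROM parts WHERE weight > ? ORDER BY weight", (1.0,)
).fetchall()
print(heavy)
```

Contrast this with the navigational access of second- and third-generation systems, where the application itself would walk pointers from record to record.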

During the 1970s research and development activities in databases were focused on
realizing the relational database technology. These efforts culminated in the
introduction of commercially available systems in the late 70s and early 80s, such as
Oracle, SQL/DB, DB2 and INGRES. However, relational database technology, just
like each previous generation of database technology, was developed for
conventional business data-processing applications, such as inventory control,
payroll, accounts, and so on. Attempts to make use of relational database technology
in a wide variety of other types of application have quickly exposed several serious
shortcomings of the relational and past-generation database technology. These
applications include computer-aided design, engineering, software engineering and
manufacturing (CAD, CAE, CASE and CAM) systems and applications that run on
them; knowledge-based systems (expert systems and expert system shells);
multimedia systems which deal with images, voice, and textual documents; and
programming language systems. Relational and past-generation database systems
will henceforth be called conventional database systems.

Let us review several of the well-known shortcomings of conventional database
technology:

• A conventional data model, especially the relational model, is too simple
for modeling complex nested entities, such as design and engineering
objects, and complex documents.

• Conventional database systems support only a limited set of atomic data
types, such as integer, string, etc.; they do not support general data
types found in programming languages.

• The performance of conventional database systems, especially relational
database systems, is unacceptable for various types of compute-intensive
applications, such as simulation programs in computer-aided
design and programming language environments.

• Application programs are implemented in some algorithmic
programming language (such as COBOL, FORTRAN, C) with some
database language embedded in it. Database languages are very
different from programming languages, in both data model and data
structure. This impedance-mismatch problem motivated the
development of fourth-generation languages (4GL).

• The model of transactions supported in conventional database systems
is inappropriate for the long-duration transactions necessary in interactive,
cooperative design environments. Conventional database systems do
not support facilities for representing and managing the temporal
dimension in databases, including the notion of time, versions of
objects and schema, and change notifications.
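The impedance-mismatch point above can be made concrete with a sketch: a relational system returns flat tuples, while the application works with nested objects, so conversion code must be hand-written in both directions. All names here are illustrative.

```python
# Impedance-mismatch sketch: flat relational rows vs. nested objects.

# What a relational system returns: flat rows (assembly_id, part_name, qty).
rows = [(1, "bolt", 8), (1, "gear", 2), (2, "axle", 1)]

# What the program wants: a nested structure per assembly.
def rows_to_objects(flat_rows):
    assemblies = {}
    for assembly_id, part, qty in flat_rows:
        assemblies.setdefault(assembly_id, []).append(
            {"part": part, "qty": qty}
        )
    return assemblies

print(rows_to_objects(rows))
# The reverse mapping (objects back to rows for storage) must also be
# hand-written -- exactly the bookkeeping that 4GLs and object-oriented
# database systems aim to remove.
```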

The discovery of the shortcomings of conventional database technology has provided
impetus for database professionals for most of the 1980s to pave the way for the
fifth-generation of the database technology. The next-generation database
technology must necessarily build on conventional database technology and
incorporate solutions to many of the problems outlined above in order to meet
requirements of the current and newly emerging database applications. There are
currently at least two proposed approaches for transitioning from fourth-generation
database technology to the fifth-generation technology: extended relational database
technology and object-oriented database technology. The fundamental differences
between them are the basic data model and the database language. The extended
relational approach starts with the relational model of data and a relational query
language, and extends them in various ways to allow the modeling and manipulation
of additional semantic relationships and database facilities. POSTGRES is the best-
known next-generation database system which is based on the extended relational
approach. The object-oriented approach, adopted in MCC's ORION system and a
number of other systems (such as Onto, GemStone, IRIS, O2, ...), starts with an object-
oriented data model and a database language that captures it, and extends them in
various ways to allow additional capabilities.

One important point which we must recognize is that an object-oriented data
model is a more natural basis than an extended relational model for addressing some
of the deficiencies of the conventional database technology previously outlined; for
example, support for general data types, nested objects, and support for compute-
intensive applications. There are important differences between an object-oriented
data model and the relational data model. An object-oriented data model includes the
object-oriented concepts of encapsulation, inheritance and polymorphism; these
concepts are not part of the conventional models of data.
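A minimal sketch of the three object-oriented concepts named above, applied to database-style objects; the classes and fields are illustrative assumptions:

```python
# Encapsulation, inheritance, and polymorphism in a database-flavored sketch.

class StoredObject:
    def __init__(self, oid):
        self._oid = oid            # encapsulation: state held behind methods

    def describe(self):
        return f"object {self._oid}"

class Document(StoredObject):      # inheritance: Document is-a StoredObject
    def __init__(self, oid, title):
        super().__init__(oid)
        self.title = title

    def describe(self):            # polymorphism: same message, new behavior
        return f"document {self._oid}: {self.title}"

objects = [StoredObject(1), Document(2, "design spec")]
print([o.describe() for o in objects])
```

A relational model has no counterpart to any of the three: rows carry no behavior, tables do not inherit from one another, and every row of a table is handled uniformly.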
The difference between object-oriented database systems and non-object-oriented
database systems is that an object-oriented database system can directly support the
needs of the applications that create and manage objects that have the object-
oriented semantics, namely object-oriented programming languages or applications
designed in an object-oriented style.

Further, object-oriented programming languages may be extended into a unified
programming and database language. The resulting language is subject to the
problem of impedance mismatch to a far less extent than the approach of embedding
a current-generation database language in one of the conventional programming
languages. The reason is that an object-oriented programming language is built on
the object-oriented concepts, and object-oriented concepts consist of a number of
data modeling concepts, such as aggregation, generalization, and membership
relationships. An object-oriented database system which supports such a unified
object-oriented programming and database language will be a better platform for
developing object-oriented database applications than an extended relational
database system which supports an extended relational database language.

We believe that both the extended relational and object-oriented approaches are
viable, and that most likely systems adopting either approach will co-exist.
Ques 4. What are group DSS? Explain.

Group Decision Support Systems (GDSS) are a class of electronic meeting systems, a
collaboration technology designed to support meetings and group work. GDSS are
distinct from computer supported cooperative work (CSCW) technologies as GDSS are
more focused on task support, whereas CSCW tools provide general communication
support.

Group Decision Support Systems (GDSS) were referred to as a Group Support System
(GSS) or an electronic meeting system since they shared similar foundations.
However, today's GDSS is characterized by being adapted for a group of people who
collaborate to support integrated systems thinking for complex decision making.
Participants use a common computer or network to enable collaboration.

Significant research supports measuring impacts of:

•Adapting human factors for these technologies,
•Facilitating interdisciplinary collaboration, and
•Promoting effective organizational learning.

Group Decision Support Systems are categorized within a time-place paradigm.
Whether synchronous or asynchronous, the systems matrix comprises:

•same time AND same place
•same time BUT different place
•different time AND different place
•different time BUT same place

Several commercial software products support GDSS practices. One of the newest is
ynSyte's WIQ, a collaborative decision engine whose technology platform has
increased the speed of collaborative decisions by 51%. Demonstrating WIQ at a
recent MAFN meeting, ynSyte CEO Patricia Caporaso said the marketplace is eager
to see the application of social software to the decision-making process because it
transcends time and space.

There is also an initiative to create open-source software that can support similar
group processes in education, where this category of software has been called a
Discussion Support System. See CoFFEE.
Ques 5. Briefly explain prototyping.
Software prototyping, an activity during software development, is the creation
of prototypes, i.e., incomplete versions of the software program being developed. A
prototype typically simulates only a few aspects of the features of the eventual
program, and may be completely different from the eventual implementation.

The conventional purpose of a prototype is to allow users of the software to evaluate
developers' proposals for the design of the eventual product by actually trying them
out, rather than having to interpret and evaluate the design based on descriptions.

Prototyping can also be used by end users to describe and prove requirements that
developers have not considered, so "controlling the prototype" can be a key factor in
the commercial relationship between solution providers and their clients.

Prototyping has several benefits: The software designer and implementer can obtain
feedback from the users early in the project. The client and the contractor can
check whether the software matches the software specification according to which
the software program is built. It also allows the software engineer some insight into
the accuracy of initial project estimates and whether the deadlines and milestones
proposed can be successfully met. The degree of completeness and the techniques
used in the prototyping have been in development and debate since its proposal in
the early 1970s.

This process is in contrast with the 1960s and 1970s monolithic development cycle of
building the entire program first and then working out any inconsistencies between
design and implementation, which led to higher software costs and poor estimates of
time and cost. The monolithic approach has been dubbed the "Slaying the (software)
Dragon" technique, since it assumes that the software designer and developer is a
single hero who has to slay the entire dragon alone. Prototyping can also avoid the
great expense and difficulty of changing a finished software product.

Overview

The process of prototyping involves the following steps:

1. Identify basic requirements
a. Determine basic requirements, including the input and output information
desired. Details, such as security, can typically be ignored.

2. Develop initial prototype
a. An initial prototype is developed that includes only user interfaces.
3. Review
a. The customers, including end-users, examine the prototype and provide
feedback on additions or changes.
4. Revise and enhance the prototype
a. Using the feedback, both the specifications and the prototype can be
improved. Negotiation about what is within the scope of the
contract/product may be necessary. If changes are introduced, then a
repeat of steps #3 and #4 may be needed.
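The steps above can be sketched as a simple feedback loop; here the "feedback" is a canned list standing in for real user review, so this is an illustration of the control flow only:

```python
# Prototyping loop sketch: revise the prototype once per round of feedback.

def prototyping_cycle(requirements, feedback_rounds):
    prototype = list(requirements)          # step 2: initial prototype
    for feedback in feedback_rounds:        # step 3: review
        prototype.extend(feedback)          # step 4: revise and enhance
    return prototype

basic = ["enter order", "print invoice"]    # step 1: basic requirements
rounds = [["search orders"], ["export to CSV"]]
print(prototyping_cycle(basic, rounds))
```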

Types of prototyping

Software prototyping has many variants. However, all the methods are in some way
based on two major types of prototyping: Throwaway Prototyping and Evolutionary
Prototyping.

Throwaway prototyping

Also called close-ended prototyping. Throwaway or Rapid Prototyping refers to the
creation of a model that will eventually be discarded rather than becoming part of the
final delivered software. After preliminary requirements gathering is accomplished, a
simple working model of the system is constructed to visually show the users what
their requirements may look like when they are implemented into a finished system.

Rapid Prototyping involves creating a working model of various parts of the
system at a very early stage, after a relatively short investigation. The method used
in building it is usually quite informal, the most important factor being the speed with
which the model is provided. The model then becomes the starting point from which
users can re-examine their expectations and clarify their requirements. When this has
been achieved, the prototype model is 'thrown away', and the system is formally
developed based on the identified requirements.

The most obvious reason for using Throwaway Prototyping is that it can be done
quickly. If the users can get quick feedback on their requirements, they may be able
to refine them early in the development of the software. Making changes early in the
development lifecycle is extremely cost effective since there is nothing at that point
to redo. If a project is changed after considerable work has been done, then small
changes could require large efforts to implement since software systems have many
dependencies. Speed is crucial in implementing a throwaway prototype, since with a
limited budget of time and money little can be expended on a prototype that will be
discarded. Another strength of Throwaway Prototyping is its ability to construct
interfaces that the users can test. The user interface is what the user sees as the
system, and by seeing it in front of them, it is much easier to grasp how the system
will work.

It is asserted that revolutionary rapid prototyping is a more effective manner in which
to deal with user requirements-related issues, and therefore a greater enhancement
to software productivity overall. Requirements can be identified, simulated, and
tested far more quickly and cheaply when issues of evolvability, maintainability, and
software structure are ignored. This, in turn, leads to the accurate specification of
requirements and the subsequent construction of a valid and usable system from the
user's perspective via conventional software development models.

Prototypes can be classified according to the fidelity with which they resemble the
actual product in terms of appearance, interaction and timing. One method of
creating a low fidelity Throwaway Prototype is Paper Prototyping. The prototype is
implemented using paper and pencil, and thus mimics the function of the actual
product, but does not look at all like it. Another method to easily build high fidelity
Throwaway Prototypes is to use a GUI Builder and create a click dummy, a prototype
that looks like the goal system, but does not provide any functionality.

Not exactly the same as Throwaway Prototyping, but certainly in the same family, is
the usage of storyboards, animatics or drawings. These are non-functional
implementations but show how the system will look.

SUMMARY: In this approach the prototype is constructed with the idea that it will be
discarded and the final system will be built from scratch. The steps in this approach
are:

o Write preliminary requirements
o Design the prototype
o User experiences/uses the prototype, specifies new requirements
o Write final requirements
o Develop the real product

Evolutionary prototyping

Evolutionary Prototyping (also known as breadboard prototyping) is quite different
from Throwaway Prototyping. The main goal when using Evolutionary Prototyping is to
build a very robust prototype in a structured manner and constantly refine it. The
reason for this is that the evolutionary prototype, when built, forms the heart of the
new system, and the improvements and further requirements will be built on it.
When developing a system using Evolutionary Prototyping, the system is continually
refined and rebuilt.

"…evolutionary prototyping acknowledges that we do not understand all the
requirements and builds only those that are well understood."

This technique allows the development team to add features, or make changes that
couldn't be conceived during the requirements and design phase. For a system to be
useful, it must evolve through use in its intended operational environment. A product
is never "done;" it is always maturing as the usage environment changes…we often
try to define a system using our most familiar frame of reference---where we are now.
We make assumptions about the way business will be conducted and the technology
base on which the business will be implemented. A plan is enacted to develop the
capability, and, sooner or later, something resembling the envisioned system is
delivered.

Evolutionary Prototypes have an advantage over Throwaway Prototypes in that they
are functional systems. Although they may not have all the features the users have
planned, they may be used on an interim basis until the final system is delivered. "It is
not unusual within a prototyping environment for the user to put an initial prototype
to practical use while waiting for a more developed version…The user may decide
that a 'flawed' system is better than no system at all."

In Evolutionary Prototyping, developers can focus on developing the parts of the
system that they understand instead of working on developing a whole system.

To minimize risk, the developer does not implement poorly understood features. The
partial system is sent to customer sites. As users work with the system, they detect
opportunities for new features and give requests for these features to developers.
Developers then take these enhancement requests along with their own and use
sound configuration-management practices to change the software-requirements
specification, update the design, recode and retest.

Incremental prototyping

The final product is built as separate prototypes. At the end the separate prototypes
are merged in an overall design.

Extreme prototyping
Extreme Prototyping as a development process is used especially for developing web
applications. Basically, it breaks down web development into three phases, each one
based on the preceding one. The first phase is a static prototype that consists mainly
of HTML pages. In the second phase, the screens are programmed and fully functional
using a simulated services layer. In the third phase the services are implemented. The
process is called Extreme Prototyping to draw attention to the second phase of the
process, where a fully-functional UI is developed with very little regard to the services
other than their contract.
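The role of the simulated services layer can be sketched as follows: the screen code depends only on a service contract, backed first by a simulated implementation (phase 2) and later by the real one (phase 3). All class and method names here are illustrative assumptions.

```python
# Extreme Prototyping sketch: UI coded against a contract, not a backend.

class SimulatedOrderService:
    """Phase 2: canned data so the screens can be fully functional."""
    def list_orders(self):
        return [{"id": 1, "status": "open"}, {"id": 2, "status": "shipped"}]

class RealOrderService:
    """Phase 3: same contract, real backend (stubbed here)."""
    def list_orders(self):
        raise NotImplementedError("would query the production database")

def render_orders_screen(service):
    # The screen depends only on the list_orders() contract, so swapping
    # the simulated layer for the real one requires no UI changes.
    return [f"Order {o['id']}: {o['status']}" for o in service.list_orders()]

print(render_orders_screen(SimulatedOrderService()))
```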

Advantages of prototyping

There are many advantages to using prototyping in software development – some
tangible, some abstract.

Reduced time and costs: Prototyping can improve the quality of requirements and
specifications provided to developers. Because changes cost exponentially more to
implement as they are detected later in development, the early determination of what
the user really wants can result in faster and less expensive software.

Improved and increased user involvement: Prototyping requires user involvement and
allows users to see and interact with a prototype, enabling them to provide better and
more complete feedback and specifications. The presence of the prototype being
examined by the user prevents many misunderstandings and miscommunications
that occur when each side believes the other understands what they said. Since users
know the problem domain better than anyone on the development team does,
increased interaction can result in a final product that has greater tangible and
intangible quality. The final product is more likely to satisfy the users' desire for look,
feel and performance.

Disadvantages of prototyping

Using, or perhaps misusing, prototyping can also have disadvantages.

Insufficient analysis: The focus on a limited prototype can distract developers from
properly analyzing the complete project. This can lead to overlooking better solutions,
preparation of incomplete specifications or the conversion of limited prototypes into
poorly engineered final projects that are hard to maintain. Further, since a prototype
is limited in functionality it may not scale well if the prototype is used as the basis of
a final deliverable, which may not be noticed if developers are too focused on building
a prototype as a model.
User confusion of prototype and finished system: Users can begin to think that a
prototype, intended to be thrown away, is actually a final system that merely needs
to be finished or polished. (They are, for example, often unaware of the effort needed
to add error-checking and security features which a prototype may not have.) This
can lead them to expect the prototype to accurately model the performance of the
final system when this is not the intent of the developers. Users can also become
attached to features that were included in a prototype for consideration and then
removed from the specification for a final system. If users are able to require all
proposed features be included in the final system this can lead to conflict.

Developer misunderstanding of user objectives: Developers may assume that users
share their objectives (e.g. to deliver core functionality on time and within budget),
without understanding wider commercial issues. For example, user representatives
attending Enterprise software (e.g. PeopleSoft) events may have seen demonstrations
of "transaction auditing" (where changes are logged and displayed in a difference grid
view) without being told that this feature demands additional coding and often
requires more hardware to handle extra database accesses. Users might believe they
can demand auditing on every field, whereas developers might think this is feature
creep because they have made assumptions about the extent of user requirements. If
the solution provider has committed delivery before the user requirements were
reviewed, developers are between a rock and a hard place, particularly if user
management derives some advantage from their failure to implement requirements.

Developer attachment to prototype: Developers can also become attached to
prototypes they have spent a great deal of effort producing; this can lead to problems
like attempting to convert a limited prototype into a final system when it does not
have an appropriate underlying architecture. (This may suggest that throwaway
prototyping, rather than evolutionary prototyping, should be used.)

Excessive development time of the prototype: A key property of prototyping is that it
is supposed to be done quickly. If the developers lose sight of this fact,
they very well may try to develop a prototype that is too complex. When the
prototype is thrown away the precisely developed requirements that it provides may
not yield a sufficient increase in productivity to make up for the time spent
developing the prototype. Users can become stuck in debates over details of the
prototype, holding up the development team and delaying the final product.

Expense of implementing prototyping: The start-up costs for building a development
team focused on prototyping may be high. Many companies have development
methodologies in place, and changing them can mean retraining, retooling, or both.
Many companies tend to just jump into the prototyping without bothering to retrain
their workers as much as they should.

A common problem with adopting prototyping technology is high expectations for
productivity with insufficient effort behind the learning curve. In addition to training
for the use of a prototyping technique, there is an often overlooked need for
developing corporate and project specific underlying structure to support the
technology. When this underlying structure is omitted, lower productivity can often
result.

August 2009
Bachelor of Science in Information Technology (BScIT) – Semester 4
BT0047 – Management Information System – 2 Credits
(Book ID: B0048)
Assignment Set – 2 (30 Marks)
Answer all questions 5 x 6 = 30
Ques 1. Explain the various approaches to MIS development in an organization.
Definition of MIS:

• Provides information to support managerial functions like planning,
organizing, directing, and controlling.
• Collects information in a systematic and routine manner, in accordance
with a well-defined set of rules.
• Includes files, hardware, software, and operations research models for
processing, storing, retrieving and transmitting information to the users.

Characteristics of MIS:
• Management oriented: The system is designed from the top to work
downwards. This does not mean that the system provides information only
to the top management; other levels of management are also provided with
relevant information.
• Management directed: Because of the management orientation of MIS, it is
necessary that management continuously reviews the system. For example, in a
marketing information system, the management must determine what sales
information is necessary to improve its control over marketing operations.
• Integrated: The word 'integration' means that the system has to cover all
the functional areas of an organization so as to produce more meaningful
management information, with a view to achieving the objectives of the
organization. It has to consider the various sub-systems, their objectives and
information needs, and recognize the interdependence that these sub-systems
have amongst themselves, so that common areas of information are identified
and processed without repetition and overlapping. For example, in the
development of an effective production scheduling system, a proper balance
amongst the following factors is desired:

 Set up costs
 Overtime
 Manpower
 Production capacity
 Inventory level
 Money available
 Customer service.
• Common data flows: Building on the integration concept of MIS, the common data
flow concept avoids repetition and overlapping in data collection and storage,
combines similar functions, and simplifies operations wherever possible. For example,
in marketing operations, orders received for goods become the basis for billing
the goods ordered, setting up the accounts receivable, initiating production
activity, and sales analysis and forecasting.

• Heavy planning element: A management information system cannot be established
overnight; it takes almost two to four years to establish it successfully in an
organization. Hence, long-term planning is required for MIS development in
order to fulfill the future needs and objectives of the organization. The designer
of an information system should therefore ensure that it will not become
obsolete before it actually gets into operation. An analogous example of such a
feature may be seen in a transportation system, where a highway is designed not
to handle today's traffic requirements but the traffic requirements five to ten
years hence.

• Flexibility and ease of use: While building an MIS, provision is made for all
foreseeable changes that may occur in the future, to keep it flexible. A feature that
often goes with flexibility is ease of use: the MIS should incorporate all those
features that make it readily accessible and easily usable by a wide range of
users.

Ques 2. Explain information systems.

In a very broad sense, the term information system is frequently used to refer to the
interaction between people, processes, data and technology. In this sense, the term is
used to refer not only to the information and communication technology (ICT) an
organization uses, but also to the way in which people interact with this technology in
support of business processes. Some make a clear distinction between information
systems, ICT and business processes. Information systems are distinct from
information technology in that an information system is typically seen as having an
ICT component. Information systems are also different from business processes.
Information systems help to control the performance of business processes.

Alter argues for an information system as a special type of work system. A work
system is a system in which humans and/or machines perform work using resources
(including ICT) to produce specific products and/or services for customers. An
information system is a work system whose activities are devoted to processing
(capturing, transmitting, storing, retrieving, manipulating and displaying)
information.

Part of the difficulty in defining the term information system is due to vagueness in
the definition of related terms such as system and information. Beynon-Davies argues
for a clearer terminology grounded in systemics and semiotics. He defines an information
system as an example of a system concerned with the manipulation of signs. An
information system is a type of socio-technical system. An information system is a
mediating construct between actions and technology.

As such, information systems inter-relate with data systems on the one hand and
activity systems on the other. An information system is a form of communication
system in which data represent and are processed as a form of social memory. An
information system can also be considered a semi-formal language which supports
human decision making and action. Information systems are the primary focus of
study for the information systems discipline and for organizational informatics.

Ques 3. Explain the various models of DSS.

As with the definition, there is no universally accepted taxonomy of DSS either.
Different authors propose different classifications. Using the relationship with the user
as the criterion, Haettenschwiler[5] differentiates passive, active, and cooperative DSS.
A passive DSS is a system that aids the process of decision making, but that cannot
bring out explicit decision suggestions or solutions. An active DSS can bring out such
decision suggestions or solutions. A cooperative DSS allows the decision maker (or its
advisor) to modify, complete, or refine the decision suggestions provided by the
system, before sending them back to the system for validation. The system again
improves, completes, and refines the suggestions of the decision maker and sends
them back to her for validation. The whole process then starts again, until a
consolidated solution is generated.
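The cooperative exchange described above is essentially an iterate-until-consolidated loop between system and decision maker. The sketch below illustrates only that loop; the functions and numbers are invented for the example and do not come from any particular DSS product:

```python
# Illustrative sketch of the cooperative-DSS loop (all functions and
# numbers are hypothetical, not taken from a real DSS).

def propose(data):
    # Active-DSS step: derive an explicit suggestion from the data.
    return sum(data) / len(data)

def validate_and_improve(suggestion):
    # The system's turn: validate the refinement and adjust it;
    # here it simply snaps the value to a whole unit.
    return round(suggestion)

def cooperative_dss(data, refine, max_rounds=10):
    suggestion = propose(data)
    for _ in range(max_rounds):
        refined = refine(suggestion)              # decision maker's edit
        improved = validate_and_improve(refined)  # system's response
        if improved == refined:                   # consolidated solution
            return improved
        suggestion = improved
    return suggestion

# Simulated decision maker: add a 10% safety margin, capped at 120.
result = cooperative_dss([90, 100, 110], refine=lambda s: min(s * 1.1, 120))
```

The loop terminates when the system's validated value comes back from the decision maker unchanged, i.e. when a consolidated solution has been reached.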

Another taxonomy for DSS has been created by Daniel Power. Using the mode of
assistance as the criterion, Power differentiates communication-driven DSS, data-
driven DSS, document-driven DSS, knowledge-driven DSS, and model-driven DSS.

• A communication-driven DSS supports more than one person working on a
shared task; examples include integrated tools like Microsoft's NetMeeting or
Groove.
• A data-driven DSS or data-oriented DSS emphasizes access to and
manipulation of a time series of internal company data and, sometimes, external
data.
• A document-driven DSS manages, retrieves, and manipulates unstructured
information in a variety of electronic formats.
• A knowledge-driven DSS provides specialized problem-solving expertise
stored as facts, rules, procedures, or in similar structures.
• A model-driven DSS emphasizes access to and manipulation of a statistical,
financial, optimization, or simulation model. Model-driven DSS use data and
parameters provided by users to assist decision makers in analyzing a situation;
they are not necessarily data-intensive. Dicodess is an example of an open
source model-driven DSS generator.
Using scope as the criterion, Power differentiates enterprise-wide DSS and desktop
DSS. An enterprise-wide DSS is linked to large data warehouses and serves many
managers in the company. A desktop, single-user DSS is a small system that runs on
an individual manager's PC.
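A model-driven DSS from Power's taxonomy can be illustrated with a tiny financial model: the user supplies the parameters (discount rate, per-scenario cash flows) and the system evaluates and ranks the scenarios. The net-present-value formula is a standard textbook model; the scenario names and figures are invented for the sketch:

```python
# Hypothetical model-driven DSS: user-supplied parameters drive a
# standard net-present-value model; the data volume itself is small.

def npv(rate, cashflows):
    # Discount each year's cash flow back to the present.
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Invented scenario data; year 0 is the upfront investment.
scenarios = {
    "expand":  [-1000, 400, 400, 400, 400],
    "upgrade": [-500, 200, 200, 200],
}

ranked = sorted(scenarios, key=lambda s: npv(0.10, scenarios[s]), reverse=True)
best = ranked[0]
```

Note that the decision quality here depends entirely on the model and the user's parameters, not on large data volumes, which is the defining trait of the model-driven class.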

Architecture

[Figure: Design of a Drought Mitigation Decision Support System]


Three fundamental components of a DSS architecture are:
• the database (or knowledge base),
• the model (i.e., the decision context and user criteria), and
• the user interface.
The users themselves are also important components of the architecture.
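The three components can be sketched as cooperating objects. The class names, the "demand" data and the threshold below are assumptions made for illustration, not a standard DSS API:

```python
# Minimal sketch of the three DSS components as cooperating objects
# (names and data are invented for the example).

class Database:
    """Database/knowledge base component: holds the facts."""
    def __init__(self, records):
        self.records = records
    def query(self, key):
        return self.records.get(key, [])

class Model:
    """Model component: encodes the decision context and user criteria."""
    def __init__(self, threshold):
        self.threshold = threshold
    def evaluate(self, values):
        # Criterion: flag when the average exceeds the user's threshold.
        avg = sum(values) / len(values)
        return ("ALERT" if avg > self.threshold else "OK", avg)

class UserInterface:
    """User-interface component: turns model output into a message."""
    def render(self, status, avg):
        return f"{status}: average demand {avg:.1f}"

db = Database({"demand": [120, 140, 160]})
model = Model(threshold=130)
ui = UserInterface()
status, avg = model.evaluate(db.query("demand"))
message = ui.render(status, avg)
```

The user, the fourth element of the architecture, sits outside the code: it is the user who sets the threshold and acts on the rendered message.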

Development Frameworks
DSS systems are not entirely different from other systems and require a structured
approach. Such a framework includes people, technology, and the development
approach.
DSS technology levels (of hardware and software) may include:
• The specific application that will be used by the user: the part of the
system that allows the decision maker to make decisions in a particular
problem area and to act upon that problem.
• The generator: a hardware/software environment that allows people to
easily develop specific DSS applications. This level makes use of CASE tools or
systems such as Crystal, AIMMS, and iThink.
• Tools: lower-level hardware/software from which DSS generators are built,
including special languages, function libraries and linking modules.
An iterative developmental approach allows for the DSS to be changed and
redesigned at various intervals. Once the system is designed, it will need to be tested
and revised for the desired outcome.

Classifying DSS
There are several ways to classify DSS applications. Not every DSS fits neatly into one
category; many are a mix of two or more architectures in one.
Holsapple and Whinston classify DSS into the following six frameworks: Text-oriented
DSS, Database-oriented DSS, Spreadsheet-oriented DSS, Solver-oriented DSS, Rule-
oriented DSS, and Compound DSS.
A compound DSS is the most popular classification for a DSS. It is a hybrid system
that includes two or more of the five basic structures described by Holsapple and
Whinston.
The support given by DSS can be separated into three distinct, interrelated
categories: Personal Support, Group Support, and Organizational Support.
DSS components may be classified as:
• Inputs: Factors, numbers, and characteristics to analyze
• User Knowledge and Expertise: Inputs requiring manual analysis by the
user
• Outputs: Transformed data from which DSS "decisions" are generated
• Decisions: Results generated by the DSS based on user criteria
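The four component classes above can be traced through a miniature example. The loan-screening rule below (echoing the bank loan officer example in the Applications discussion) uses invented thresholds purely for illustration:

```python
# Hypothetical loan-screening rule tracing the four component classes:
# inputs -> outputs (transformed data) -> decision, with the officer's
# expertise applied as an optional override. Thresholds are invented.

def screen_loan(income, debt, credit_score, officer_override=None):
    # Inputs: factors and numbers to analyze.
    # Outputs: transformed data from which the "decision" is generated.
    debt_ratio = debt / income
    # Decision: result generated by the DSS based on the criteria.
    decision = "approve" if debt_ratio < 0.4 and credit_score >= 650 else "refer"
    # User knowledge and expertise: manual analysis may override a result.
    if officer_override is not None:
        decision = officer_override
    return decision

d1 = screen_loan(income=50000, debt=10000, credit_score=700)   # low ratio
d2 = screen_loan(income=50000, debt=30000, credit_score=700)   # high ratio
```

Keeping the override explicit reflects the DSS philosophy: the system supports, rather than replaces, the decision maker's judgment.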
DSSs which perform selected cognitive decision-making functions and are based on
artificial intelligence or intelligent agents technologies are called Intelligent Decision
Support Systems (IDSS).
The nascent field of Decision engineering treats the decision itself as an engineered
object, and applies engineering principles such as Design and Quality assurance to an
explicit representation of the elements that make up a decision.

Applications
As mentioned above, there are theoretical possibilities of building such systems in
any knowledge domain.
One example is the Clinical decision support system for medical diagnosis. Other
examples include a bank loan officer verifying the credit of a loan applicant or an
engineering firm that has bids on several projects and wants to know if they can be
competitive with their costs. DSS is extensively used in business and management.
Executive dashboard and other business performance software allow faster decision
making, identification of negative trends, and better allocation of business resources.
A growing area of DSS application, concepts, principles, and techniques is in
agricultural production, marketing for sustainable development. For example, the
DSSAT4 package, developed through financial support of USAID during the 1980s and
1990s, has allowed rapid assessment of several agricultural production systems around
the world to facilitate decision-making at the farm and policy levels. There are,
however, many constraints to the successful adoption of DSS in agriculture. DSS are
also prevalent in forest management where the long planning time frame demands
specific requirements. All aspects of Forest management, from log transportation,
harvest scheduling to sustainability and ecosystem protection have been addressed
by modern DSSs. A comprehensive list and discussion of all available systems in
forest management is being compiled under the COST action Forsys.

A specific example concerns the Canadian National Railway system, which tests its
equipment on a regular basis using a decision support system. A problem faced by
any railroad is worn-out or defective rails, which can result in hundreds of derailments
per year. Under a DSS, CN managed to decrease the incidence of derailments at the
same time other companies were experiencing an increase. DSS has many
applications that have already been spoken about. However, it can be used in any
field where organization is necessary. Additionally, a DSS can be designed to help
make decisions on the stock market, or deciding which area or segment to market a
product toward.
CACI has begun integrating simulation and decision support systems. CACI
defines three levels of simulation model maturity. “Level 1” models are traditional
desktop simulation models that are executed within the native software package.
These often require a simulation expert to implement modifications, run scenarios,
and analyze results. “Level 2” models embed the modeling engine in a web
application that allows the decision maker to make process and parameter changes
without the assistance of an analyst. “Level 3” models are also embedded in a web-
based application but are tied to real-time operational data. The execution of “level
3” models can be triggered automatically based on this real-time data and the
corresponding results can be displayed on the manager’s desktop showing the
prevailing trends and predictive analytics given the current processes and state of the
system. The advantage of this approach is that “level 1” models developed for the
FDA projects can migrate to “level 2 and 3” models in support of decision support,
production/operations management, process/work flow management, and predictive
analytics. This approach involves developing and maintaining reusable models that
allow decision makers to easily define and extract business level information (e.g.,
process metrics). “Level 1” models are decomposed into their business objects and
stored in a database. All process information is stored in the database, including
activity, resource, and costing data. The database becomes a template library that
users can access to build, change, and modify their own unique process flows and
then use simulation to study their performance in an iterative manner.
Benefits of DSS
• Improves personal efficiency
• Expedites problem solving (speed up the progress of problems solving in an
organization)
• Facilitates interpersonal communication
• Promotes learning or training
• Increases organizational control
• Generates new evidence in support of a decision
• Creates a competitive advantage over competition
• Encourages exploration and discovery on the part of the decision maker
• Reveals new approaches to thinking about the problem space
• Helps automate the managerial processes.

Ques 4. Write short notes on:

4a. Rapid Application Development (RAD)

The relative effectiveness of RAD


Rapid Application Development (RAD) refers to a type of software development
methodology that uses minimal planning in favor of rapid prototyping. The "planning"
of software developed using RAD is interleaved with writing the software itself. The
lack of extensive pre-planning generally allows software to be written much faster,
and makes it easier to change requirements.

Rapid Application Development is a software development methodology that involves
techniques like iterative development and software prototyping. According to Whitten
(2004), it is a merger of various structured techniques, especially data-driven
Information Engineering, with prototyping techniques to accelerate software systems
development.

In Rapid Application Development, structured techniques and prototyping are
especially used to define users' requirements and to design the final system. The
development process starts with the development of preliminary data models and
business process models using structured techniques. In the next stage, requirements
are verified using prototyping, eventually to refine the data and process models.
These stages are repeated iteratively; further development results in "a combined
business requirements and technical design statement to be used for constructing
new systems".

RAD approaches may entail compromises in functionality and performance in
exchange for enabling faster development and facilitating application maintenance.

The shift from traditional session-based client/server development to open,
sessionless and collaborative development like Web 2.0 has increased the need for faster
iterations through the phases of the SDLC. This, coupled with the growing utilization
of open source frameworks and products in core commercial development, has, for
many developers, rekindled interest in finding a silver bullet RAD methodology.
Although most RAD methodologies foster software re-use, small team structure and
distributed system development, most RAD practitioners recognize that, ultimately,
there is no single “rapid” methodology that can provide an order of magnitude
improvement over any other development methodology.
Agile software development
Pros: Minimizes feature creep by developing in short intervals, resulting in
miniature software projects, and releasing the product in mini-increments.
Cons: Short iterations may not add enough functionality, leading to significant
delays in final iterations. Since Agile emphasizes real-time communication
(preferably face-to-face), utilizing it is problematic for large multi-team
distributed system development. Agile methods produce very little written
documentation and require a significant amount of post-project documentation.

Extreme Programming (XP)
Pros: Lowers the cost of changes through quick spirals of new requirements. Most
of the design activity takes place incrementally and on the fly.
Cons: Programmers are required to work in pairs (which may be difficult for some
developers). There is no up-front "detailed design", which could result in
more redesign effort in the long run. The business champion attached to the
project full time can potentially become a single point of failure for the
project and a major source of stress for the team.

Joint Application Development (JAD)
Pros: Captures the voice of the customer by involving them in the design and
development of the application through a series of collaborative workshops
called JAD sessions.
Cons: The client may create an unrealistic product vision and request extensive
gold-plating, leading the team to over- or under-develop functionality.

Lean software development (LD)
Pros: Creation of minimalist solutions (i.e., needs determine technology) and
delivering less functionality earlier (as per the paradigm that 80% today is
better than 100% tomorrow).
Cons: Product may lose its competitive edge because of insufficient core
functionality and may exhibit poor overall quality.

Rapid Application Development (RAD)
Pros: Promotes a strong collaborative atmosphere and dynamic gathering of
requirements. The business owner actively participates in prototyping, writing
test cases and performing unit testing.
Cons: Dependency on strong cohesive teams and individual commitment to the
project. Success depends on disciplined developers and their exceptional
technical skills and ability to "turn on a dime". Decision making relies on the
feature functionality team and a communal decision-making process with a
lesser degree of centralized PM and engineering authority.

Scrum
Pros: Improvement in productivity in teams previously paralyzed by heavy
"process", ability to prioritize work, utilization of a backlog for completing
items in a series of short iterations or sprints, and daily measured progress
and communications.
Cons: Reliance on facilitation by a master who may lack the political clout to
remove impediments and deliver the sprint goal. Due to its reliance on self-
organizing teams and the rejection of traditional centralized "process
control", internal power struggles may paralyze the team.
All flavors of RAD have the potential for providing a good framework for faster product
development with improved code quality, but successful implementation and benefits
often hinge on project type, schedule, software release cycle and corporate culture. It
may also be of interest that some of the largest software vendors such as Microsoft
and IBM do not extensively utilize RAD in the development of their flagship products
and for the most part, they still primarily rely on traditional waterfall methodologies
with some degree of spiraling.

The table above contains a high-level summary of some of the major flavors of
RAD and their relative strengths and weaknesses (Table 1: Pros and cons of
various RAD flavors).

Criticism
Since rapid application development is an iterative and incremental process, it can
lead to a succession of prototypes that never culminate in a satisfactory production
application. Such failures may be avoided if the application development tools are
robust, flexible, and put to proper use. This is addressed in methods such as the 2080
Development method or other post-agile variants.

Practical implications with rapid development methodologies


When organizations adopt rapid development methodologies, care must be taken to
avoid role and responsibility confusion and communication breakdown within the
development team, and between the team and the client. In addition, especially in
cases where the client is absent or not able to participate with authority in the
development process, the system analyst should be endowed with this authority on
behalf of the client to ensure appropriate prioritisation of non-functional
requirements. Furthermore, no increment of the system should be developed without
a thorough and formally documented design phase.

4b. End User Computing


End User Computing (EUC) is a group of approaches to computing that aim at better
integrating end users into the computing environment or that attempt to realize the
potential for high-end computing to perform in a trustworthy manner in problem
solving of the highest order.

The EUC Ranges section describes two types of approaches that are at different ends
of a spectrum. A simple example of these two extremes can be drawn from the SQL context.
• The first approach would have canned queries and reports that for the most
part would be invoked with buttons and/or simple commands. In this approach,
a computing group would keep these canned routines up to date through the
normal development/maintenance methods.

• For the second approach, SQL administration would allow for end-user
involvement at several levels including administration itself. Users would also
define queries, though the supporting mechanism may be constrained in order
to reduce the likelihood of run-away conditions that would have negative
influence on other users. We see this already in some business intelligence
methods which build SQL, including new databases, on the fly. Rules might
help dampen effects that can occur with the open-ended environment. The
process would expect, and accommodate, the possibility of long run times,
inconclusive results and such. These types of unknowns are undecidable
'before the fact'; the need to do 'after the fact' evaluation of results is a prime
factor of many higher-order computational situations but cannot (will not) be
tolerated by an end user in the normal production mode.
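The two SQL extremes can be made concrete with Python's built-in sqlite3 module. The orders table, its columns, and the LIMIT guard are assumptions made for this sketch, not a prescription:

```python
import sqlite3

# Toy schema invented for the sketch.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("east", 100.0), ("west", 250.0), ("east", 50.0)])

# First approach: a canned, parameterized query kept up to date by the
# computing group; the end user just supplies a value via a button/form.
def sales_by_region(region):
    cur = conn.execute(
        "SELECT SUM(amount) FROM orders WHERE region = ?", (region,))
    return cur.fetchone()[0]

# Second approach: the end user composes the query; a row cap is one
# simple constraint against run-away result sets.
def ad_hoc(user_sql, max_rows=1000):
    cur = conn.execute(f"SELECT * FROM ({user_sql}) LIMIT {max_rows}")
    return cur.fetchall()

east_total = sales_by_region("east")
rows = ad_hoc("SELECT region, SUM(amount) FROM orders GROUP BY region")
```

The canned route keeps control with the computing group; the ad-hoc route hands expressive power to the end user while the wrapper dampens (but cannot eliminate) run-away behavior, mirroring the trade-off described above.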

Between these two extreme views of EUC there are many combinations. Some of the
factors contributing to the need for further EUC research are knowledge processing,
pervasive computing, issues of ontology, interactive visualization and analysis
coupling schemes (see Duck test), and the like.

EUC Ranges
EUC might work by one type of approach that attempts to integrate the human
interface ergonomically into a user centered design system throughout its life cycle.
In this sense, EUC's goal is to allow unskilled staff to use expensive and highly skilled
knowledge in their jobs, by putting the knowledge and expertise into the computer
and teaching the end user how to access it. At the same time, this approach is used
when highly critical tasks are supported by computational systems (commercial flight,
nuclear plant, and the like).

Another approach to EUC allows end users (SMEs, domain experts) to control and
even perform software engineering and development. In this case, it can be argued
that this type of approach results mainly from deficiencies in computing that could be
overcome with better tools and environments. But high-end roles for the computer in
non-trivial domains necessitate, at least for now, a fuller interchange (bandwidth
for conversation) that is situational and subject to near-exhaustive scrutiny; there
are limits to how far we can go, which raises the necessity for a behavioral
framework (also see black box below). Such a role cannot be filled by a pre-defined
system in today's world. In a sense, the computer needs to have the same credentials
as a cohort (the scientific method of peer review) in the discipline. This type of
computing falls on the more 'open' side of the fence, where scientific knowledge is
not wrapped within the cloak of IP.

In the first type of approach of EUC described above, it appears easier to teach
factory workers, for example, how to read dials, push buttons, pull levers, and log
results than to teach them the manufacturing process and mathematical models. The
current computing trend is to simulate a console with similar dials, sliders, levers, and
switches, which the end user is taught to use. To further reduce end user training,
computer consoles all contain components which are shaped, labeled, coloured, and
function similarly. EUC developers assume that once the end user knows what and
how a particular lever works, they will quickly identify it when it appears in a new
console. This means that once staff learns one console, they will be able to operate all
consoles. Admittedly each console will have new components, but training is limited
to those, not the whole console. This approach requires more than just Pavlovian
responses as the console content will have meaning that is of use and power to the
particular computing domain. That is, there may be training that reduces the time
between sensor reading and action (such as the situation for a pilot of a commercial
plane); however, the meaning behind the reading will include other sensor settings as
well as the whole context, which may be fairly involved.

Computing of this type can be labeled black box where trust will be an essential part,
behavioral analysis is the name of the game (see Duck test), and there is a disparate
(and very, very wide) gap between the domain and the computer-support ontologies.

In the other type of EUC described above, it has been argued that (a) teaching
programming and computing concepts to a domain expert (say, in one of the sciences
or engineering disciplines) and letting the expert develop rules (an activity that
can be subsumed under the topic of business rules) is easier than (b) teaching the
intricacies of a complex discipline to a computer worker. (b) is the normal approach
of the IT-driven situation; (a) has been the reality since day one of computing in many
disciplines. One may further argue that resolving the issues of (a) and (b) is not unlike
the interplay between distributed and centralized processing (which is an age-old concern
in computing). In this sense of EUC, there may be computer scientists supporting
decisions about architecture, process, and GUI. However, in many cases, the end user
owns the software components. One thrust related to this sense of EUC is a focus on
providing better languages to the user. ICAD was an example in the KBE context. Of
late, this discipline has moved to a co-joint architecture that features advanced
interactive domain visualization coupled with a complicated API accessed via VBA,
C++, and the like. This type of co-jointness is an example of a domain tool augmented
with non-trivial extensibility.

Trend
The historical view regarding end users is being eroded by the internet and wireless
communication, where the traditional end user is able to actively contribute and add
value to the computer system. Wikis are one example where end users provide the
content and free the webmaster to manage the site. Another example within the
computer field is free software, where end users can engage in all aspects of software
development, from feature requests, through testing and reviews, to usability,
documentation, and distribution. Music, pictures, and documents are remixed and
edited to satisfy personal taste and demand. The consequence is that many countries
and industries have been slow or unwilling to adjust to this emerging society, but
some have seen the potential and are exploring economic possibilities.
Another trend is where users specify, and even develop, rules that may be fairly
normal relationships (SQL) or hard-core numerical processes that require attention
to serious computational characteristics, such as ill-conditioning, parallelism and
similar issues of an ongoing nature.

Research
The human interface receives continuous attention as emerging interfaces reveal
more possibilities and risks. The quest to both internationalize (i18n) and localize
(L10n) software is hampered by computers designed for the English alphabet, but
other major languages, such as Chinese, Japanese, and Arabic, have different
requirements.

Other studies range from website accessibility to pervasive computing, with the focus
ranging from the human to the computer. The issue centers around how much the
human can safely and reliably adjust to the computer's I/O devices on the one hand,
and how unobtrusively the computer can detect the human's needs on the other.
Furthermore, issues related to computing ontologies (example: the Language/Action
perspective has found success in CRM, etc.) continue to be of interest to EUC.

Analysis

The concepts related to the end user cover a wide range (novice user to intellectual
borg—see Slogan 2), hence End User Computing can have a range of forms and
values. Most early computer systems were tightly controlled by an IT department;
'users' were just that. However, at any point in the evolution of computer systems
through time, there was serious work in several domains that required user
development. The dynamics of the power struggles between centralized and
decentralized computing have been a fact; this was partially due to the emergence of
the mid-sized computers (VAX, etc.). Then, the advent of the personal workstation
opened up the door, so to speak, since it allowed a more pervasive type of
computation to emerge. The recent advent of 'web' services has extended the issues
to a broader scope.

In the sense of serious domain computing and given the intertwining of computation
into all advanced disciplines, any tool (inclusive of any type of capability related to a
domain/discipline) that is provided by a computer becomes part of the discipline
(methodology, etc.). As such, the issue arises about how open the tool is to scrutiny.
Some disciplines require more understanding of the tool set than do others. That is,
tools that are operational in scope require less understanding than those that are
ontological. As an example of the latter type of influence on disciplines, consider the
impact that the computer has had on the scientific method. Some of the issues
related to End User Computing concern architecture (iconic versus language
interface, open versus closed, and others). These continue to be studied. Other issues
relate to IP, configuration, and maintenance. End User Computing allows more user
input into system affairs, ranging from personalization to full-fledged ownership of
the system.

Examples of End User Computing are systems built using the 4GLs, such as MAPPER
or SQL, or one of the 5GLs, such as ICAD. ICAD (in the KBE domain) stands as a prime
example since it is associated with the pervasive use of Lisp (one of the 3GLs) by
Engineers to accomplish remarkable effects through a long economic cycle.

Slogans

• Computing concerns and good End User Computing can be antithetically related.
• Good End User Computing practices might help temper things such as the AI
Winter.
• The computational needs to wed with the phenomenal (are 'borgs' inevitable?).
• There is always more than meets the eye (or, the GUI, or any interface, is only part
of the truth).

Ques 5 Discuss the Role of MIS in following business areas:


• Inventory information systems
• Marketing information systems
• R&D information systems

A management information system (MIS) is a subset of the overall internal
controls of a business covering the application of people, documents, technologies,
and procedures by management accountants to solve business problems such as
costing a product, service or a business-wide strategy. Management information
systems are distinct from regular information systems in that they are used to
analyze other information systems applied in operational activities in the
organization.[1] Academically, the term is commonly used to refer to the group of
information management methods tied to the automation or support of human
decision making, e.g. Decision Support Systems, Expert systems, and Executive
information systems.
It has been described as, "MIS 'lives' in the space that intersects technology and
business. MIS combines tech with business to get people the information they need to
do their jobs better/faster/smarter. Information is the lifeblood of all organizations -
now more than ever. MIS professionals work as systems analysts, project managers,
systems administrators, etc., communicating directly with staff and management
across the organization."

Inventory control system

An inventory control system is a process for keeping track of objects or materials. In
common usage, the term may
also refer to just the software components. Modern inventory control systems rely
upon barcodes, and potentially RFID tags, to provide automatic identification of
inventory objects. In an academic study performed at Wal-Mart, RFID reduced Out of
Stocks by 30 percent for products selling between 0.1 and 15 units a day. Inventory
objects could include any kind of physical asset: merchandise, consumables, fixed
assets, circulating tools, library books, or capital equipment. To record an inventory
transaction, the system uses a barcode scanner or RFID reader to automatically
identify the inventory object, and then collects additional information from the
operators via fixed terminals (workstations), or mobile computers.
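The transaction flow described above can be sketched in a few lines. This is an illustrative model only (the class and field names are assumptions, not a real product's API): the scanner supplies the barcode, the operator supplies the rest, and the system updates the on-hand count.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class InventoryTransaction:
    barcode: str      # captured automatically by a barcode scanner or RFID reader
    quantity: int     # entered by the operator at a terminal or mobile computer
    operator: str
    timestamp: datetime = field(default_factory=datetime.now)

# Hypothetical on-hand counts keyed by barcode.
stock = {"4006381333931": 120}

def record_issue(txn: InventoryTransaction) -> int:
    """Deduct the issued quantity and return the new on-hand count."""
    stock[txn.barcode] -= txn.quantity
    return stock[txn.barcode]

remaining = record_issue(InventoryTransaction("4006381333931", 5, "operator-17"))
print(remaining)  # 115
```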

Applications
An inventory control system may be used to automate a sales order fulfillment
process. Such a system contains a list of orders to be filled, and then prompts
workers to pick the necessary items, and provides them with packaging and
shipping information. Real time inventory control systems use wireless, mobile
terminals to record inventory transactions at the moment they occur. A
wireless LAN transmits the transaction information to a central database.
Physical inventory counting and cycle counting are features of many inventory
control systems which can improve the accuracy of an organization's records.
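A sales-order fulfillment process of the kind described above can be sketched as follows. The order data and pick-list format are hypothetical; a real system would draw orders from a database and push pick lists to workers' mobile terminals.

```python
# Illustrative queue of orders awaiting fulfillment.
orders = [
    {"order_id": "SO-1001", "lines": {"widget": 2, "gadget": 1}},
    {"order_id": "SO-1002", "lines": {"widget": 4}},
]

def pick_list(order):
    """Turn an order into picking instructions for a warehouse worker."""
    return [f"pick {qty} x {sku}" for sku, qty in order["lines"].items()]

for order in orders:
    print(order["order_id"], pick_list(order))
```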
Marketing information system

Marketing information systems are intended to support management decision
making. Management has five distinct functions, and each requires support from an
MIS: planning, organizing, coordinating, deciding and controlling.
Information systems have to be designed to meet the way in which managers tend to
work. Research suggests that a manager continually addresses a large variety of
tasks and is able to spend relatively brief periods on each of these. Given the nature
of the work, managers tend to rely upon information that is timely and verbal
(because this can be assimilated quickly), even if this is likely to be less accurate than
more formal and complex information systems. Managers play at least three separate
roles: interpersonal, informational and decisional. MIS, in electronic form or otherwise,
can support these roles in varying degrees. MIS has less to contribute in the case of a
manager's informational role than for the other two.
Three levels of decision making can be distinguished from one another:
• Strategic
• Control (or tactical)
• Operational.
Again, MIS has to support each level. Strategic decisions are characteristically one-off
situations. Strategic decisions have implications for changing the structure of an
organization and therefore the MIS must provide information which is precise and
accurate. Control decisions deal with broad policy issues and operational decisions
concern the management of the organization’s marketing mix.

A marketing information system has four components: the internal reporting system,
the marketing research systems, the marketing intelligence system and marketing
models. Internal reports include orders received, inventory records and sales invoices.
Marketing research takes the form of purposeful studies either ad hoc or continuous.
By contrast, marketing intelligence is less specific in its purposes, is chiefly carried
out in an informal manner and by managers themselves rather than by professional
marketing researchers.

R&D Management System

The phases of research and development vary, depending on the type of product.
Briefly, five stages can be classified: 1. idea generation stage, 2. planning stage,
3. design stage, 4. pilot production stage, and 5. initial production stage. Figure 1
illustrates each process and the corresponding responsible departments. The
emphasis of each stage is described below:

1. Idea generation stage
This stage is the origin of the research/development process. Ideas of the
product may originate from customers' responses collected by the sales
department, the market trend surveyed by the planning department, or the
strategic command of the top management. All of these factors are merely the
conceptual description of a new product. This stage focuses on describing the
product idea as clearly as possible. Formal written descriptions and the
business specifications must be prepared to evaluate the feasibility of the
following three aspects:

(1). Technological feasibility: such as engineering human resources, production
equipment, inspection capability, and the like.
(2). Marketing feasibility: such as strategies of the competitors, market
potentiality, and product positioning.
(3). Synthetic feasibility: such as cost estimation and return on investment
analysis.
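One simple way to operationalize the three-aspect evaluation above is a scored gate: each aspect is rated, and the idea proceeds only if every aspect clears a minimum threshold. The rating scale and threshold below are assumptions for illustration:

```python
# Hypothetical 0-10 rating per feasibility aspect; all aspects must pass.
THRESHOLD = 6

def evaluate_idea(scores: dict) -> bool:
    """Return True only if every feasibility aspect meets the threshold."""
    return all(rating >= THRESHOLD for rating in scores.values())

idea = {"technological": 8, "marketing": 7, "synthetic": 6}
print(evaluate_idea(idea))                      # True: all aspects pass
print(evaluate_idea({**idea, "marketing": 4}))  # False: marketing fails
```

A real evaluation would of course weigh the aspects against one another rather than apply a flat threshold, but the gate structure is the same.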

2. Planning stage
Once a decision is made, a project team should be organized to perform the
design tasks. A project leader must be appointed to ensure that the product is
completed with high quality, on schedule, and at low cost. In this stage, the
business specifications are converted into manufacturable engineering
specifications and the quality function deployment (QFD) can be used to assist
the transformation. A master schedule should be established for each project.
Schedule of the contractual design project should include the customer's
requirements and due date. The engineering specifications and design
schedule should be carefully reviewed to ensure that the design objective can
be achieved. Information should be collated to strengthen the design function and to
reduce research/development risk. Previous design projects and customers'
complaint records are two useful references.
3. Product design stage
The converted engineering specifications are the quality level that should be
accomplished in the design stage. The major tasks of this stage consist of:

(1). Preservation of complete design records, including numerical
computations, selection of components, simulation of interference between
parts, and so on.
(2). Construction and verification of prototypes.
(3). Production of design output, such as bill of materials, component and
assembly drawings, circuit charts, and software.
(4). Estimation of production cost.
(5). Application for patents.

The design output and records should be reviewed, and the results of FMEA or
FTA must be checked. The design review emphasizes the following points:

(1). Correspondence of design output with standards and engineering
specifications.
(2). Appropriateness of design processes.
(3). Manufacturability of design output.
(4). Testability of design output.
(5). Acceptability of cost variation.
(6). Purchasability of materials.
(7). Acceptable allowances of the design output.
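The review points above amount to a checklist that the design review meeting marks off and preserves as a record. A minimal sketch, with the point names paraphrased from the list (not taken from any real standard):

```python
# Illustrative design-review checklist derived from the points above.
REVIEW_POINTS = [
    "correspondence with standards and specifications",
    "appropriateness of design processes",
    "manufacturability",
    "testability",
    "acceptability of cost variation",
    "purchasability of materials",
    "acceptable allowances",
]

def review_record(results: dict) -> dict:
    """Build a pass/fail record; any unreviewed point defaults to a fail."""
    return {point: results.get(point, False) for point in REVIEW_POINTS}

record = review_record({"manufacturability": True, "testability": True})
print(sum(record.values()), "of", len(record), "points passed")  # 2 of 7
```

Defaulting unreviewed points to a fail ensures that nothing is silently skipped, which matches the requirement that all reviewed records be preserved.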

All reviewed records must be preserved. If the project leader observes a
significant deviation between planned schedule and actual schedule, the
remaining phases should be rescheduled accordingly. If any engineering
specification must be adjusted, the exact revision should be determined in the
review meeting.

4. Pilot production
When the design output is proven acceptable after modification, the next step
is to construct the prototype and perform the pilot production. This stage aims
to transfer the designed results from the R&D department to the production
department. Several tasks should be completed before pilot production;
therefore, a separate stage is implemented by some enterprises. The
preparatory tasks include the following:

(1). Preparation of quality control plan, such as QC engineering table,
inspection standard, and sampling.
(2). Preparation of test equipment.
(3). Design of production flow, such as layout, power, and construction works.
(4). Preparation of jigs and fixtures.
(5). Preparation of operation sheets.
(6). FMEA of processes.
(7). Procurement, installation, and examination of production and testing
equipment.
(8). Determination of quantity of prototype and pilot production.
(9). Training of relevant personnel.

As widely known, educated people usually demand an independent working
environment. Research/development staff are characterized by a disregard for
trifles and superficialities, whereas, at the same time, they are cautious and sharp.
They dislike being bound to a uniform management style, such as check in and out, standard
operation time, and the like. The main reason is that design processes involve
creation and imagination; excessive control will constrain the imagination and,
consequently, retard the creation. Therefore, when planning the design management
system, it is necessary to convince staff that reporting the progress is to seek
problems for management support, rather than to dig every design detail. Otherwise,
they may feel intruded. To accomplish this, the checking points of the design stages
should be carefully applied so that the required information can be extracted without
intruding on the design engineers. Furthermore, design engineers tend to strive for
technological leadership, and an appropriate reward system can be designed
according to their perception. To avoid trial and error, an external management
consultant can be engaged to accelerate the system's maturity.
