Ques 1. Define MIS and its objectives. What are the characteristics of MIS?
Definition
An MIS is a planned system of collecting, processing, storing and disseminating
data in the form of information needed to carry out the functions of management. In a
way, it is a documented report of the activities that were planned and executed.
According to Philip Kotler "A marketing information system consists of people,
equipment, and procedures to gather, sort, analyze, evaluate, and distribute needed,
timely, and accurate information to marketing decision makers."
The terms MIS and information system are often confused. Information systems
include systems that are not intended for decision making. The area of study called
MIS is sometimes referred to, in a restrictive sense, as information technology
management. That area of study should not be confused with computer science. IT
service management is a practitioner-focused discipline. MIS has also some
differences with Enterprise Resource Planning (ERP) as ERP incorporates elements
that are not necessarily focused on decision support. Professor Allen S. Lee states
that "...research in the information systems field examines more than the
technological system, or just the social system, or even the two side by side; in
addition, it investigates the phenomena that emerge when the two interact."
An MIS provides the following advantages.
• It makes control easier: MIS serves as a link between managerial planning and
control. It improves the ability of management to evaluate and improve performance.
The use of computers has increased the data-processing and storage capabilities and
reduced the cost.
Functional Aspects
a) MIS is an integrated collection of functional information systems, each
supporting particular functional areas.
Ques 2. Explain data processing.
Computer data processing is any process that uses a computer program to enter,
summarize, analyze or otherwise convert data into usable information. The
process may be automated and run on a computer. It involves recording, analyzing,
sorting, summarizing, calculating, disseminating and storing data. Because data are
most useful when well presented and actually informative, data-processing systems
are often referred to as information systems. The terms are roughly synonymous,
performing similar conversions: data-processing systems typically manipulate raw
data into information, and likewise information systems typically take raw data as
input to produce information as output.
Data processing may or may not be distinguished from data conversion, when the
process is merely to convert data to another format, and does not involve any data
manipulation.
Data analysis
When the domain from which the data are harvested is a science or an engineering
field, data processing and information systems are considered terms that are too
broad, and the more specialized term data analysis is typically used. This reflects a
focus on the highly specialized and highly accurate algorithmic derivations and
statistical calculations that are less often observed in the typical general business
environment. In these contexts data analysis packages like DAP, gretl or PSPP are
often used. This divergence of culture is exhibited in the typical numerical
representations used in data processing versus data analysis: data processing's
measurements are typically represented by integers or by fixed-point or binary-coded
decimal representations of numbers, whereas the majority of data analysis's
measurements are often represented by floating-point representations of rational
numbers.
Processing
Practically all naturally occurring processes can be viewed as examples of data
processing systems where "observable" information in the form of pressure, light,
etc. is converted by human observers into electrical signals in the nervous system as
the senses we recognize as touch, sound, and vision. Even the interaction of
non-living systems may be viewed in this way as rudimentary information processing
systems. Conventional usage of the terms data processing and information systems
restricts their use to refer to the algorithmic derivations, logical deductions, and
statistical calculations that recur perennially in general business environments,
rather than in the more expansive sense of all conversions of real-world
measurements into real-world information in, say, an organic biological system or
even a scientific or engineering system.
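The representation gap described above can be shown in a few lines of Python, with the standard decimal module standing in for fixed-point/BCD arithmetic (an illustrative substitution, not how business systems actually encode BCD):

```python
from decimal import Decimal

# Business data processing favours exact fixed-point arithmetic (here
# Python's Decimal stands in for fixed-point/BCD): repeated currency
# additions stay exact.
fixed_total = Decimal("0.10") + Decimal("0.10") + Decimal("0.10")

# Data analysis typically uses binary floating point, where 0.10 has no
# exact representation and tiny errors accumulate.
float_total = 0.10 + 0.10 + 0.10

print(fixed_total)           # exactly 0.30
print(float_total == 0.3)    # False, due to binary rounding
```

This is why monetary amounts in business data processing are held in integer or fixed-point form, while scientific measurements tolerate floating-point approximation.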
•Data acquisition
•Data entry
•Data cleaning
•Data coding
•Data transformation
•Data translation
•Data summarization
•Data aggregation
•Data validation
•Data tabulation
•Statistical analysis
•Computer graphics
•Data warehousing
•Data mining
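Several of the steps above (cleaning, validation, transformation, aggregation, summarization) can be sketched in Python; the sales records and field names below are invented for illustration:

```python
# Hypothetical raw sales records, as they might arrive from data entry.
raw_records = [
    {"region": "North", "amount": "120.50"},
    {"region": "north ", "amount": "80.00"},        # needs cleaning
    {"region": "South", "amount": "not-a-number"},  # fails validation
    {"region": "South", "amount": "200.00"},
]

# Data cleaning: normalize the region field.
cleaned = [{**r, "region": r["region"].strip().title()} for r in raw_records]

# Data validation: keep only records whose amount parses as a number.
def valid(r):
    try:
        float(r["amount"])
        return True
    except ValueError:
        return False

validated = [r for r in cleaned if valid(r)]

# Data transformation and aggregation: total amount per region.
totals = {}
for r in validated:
    totals[r["region"]] = totals.get(r["region"], 0.0) + float(r["amount"])

# Data summarization and tabulation: a simple report.
for region, total in sorted(totals.items()):
    print(f"{region}: {total:.2f}")
```

A real pipeline would add the remaining steps (coding, warehousing, mining), but the shape is the same: each stage consumes the previous stage's output.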
During the past three decades, the database technology for information systems has
undergone four generations of evolution, and the fifth generation database
technology is currently under development. The transition from one generation to the
next has always been necessitated by the ever-increasing complexity of database
applications and the cost of implementing, maintaining, and extending these
applications. The first generation was file systems, such as ISAM and VSAM. The
second generation was hierarchical database systems, such as IMS and System 2000.
The third generation was CODASYL database systems, such as IDS, TOTAL, ADABAS,
IDMS, etc. The second and third generation systems realized the sharing of an
integrated database among many users within an application environment. The lack
of data independence and the tedious navigational access to the database gave rise
to the fourth-generation database technology, namely relational database
technology. Relational database technology is characterized by the notion of a
declarative query. Fifth-generation database technology will be characterized by a
richer data model and a richer set of database facilities necessary to meet the
requirements of applications beyond the business data-processing applications for
which the first four generations of database technology have been developed.
The transition from one generation to the next of the database technology has been
marked by the offloading of some tedious and repetitive bookkeeping functions from
the applications into the database system. This has made it easier for application
programmers to program database applications; however, it made the performance of
database systems a major problem, and required considerable research and
development to increase the performance of the new generation database systems to
an acceptable level. This point was particularly true with the transition into the area
of relational databases. The introduction of declarative queries in relational databases
relieved application programmers of the tedious chore of programming navigational
retrieval of records from the database. However, a major new component, namely the
query optimizer, had to be added to the database system to automatically arrive at
an optimal plan for executing any given query, such that the plan will make use of
appropriate access methods available in the system.
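The shift from navigational retrieval to declarative queries can be sketched with Python's built-in sqlite3 module; the parts table and its contents are hypothetical:

```python
import sqlite3

# In-memory database with a hypothetical inventory table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE parts (name TEXT, qty INTEGER)")
conn.executemany("INSERT INTO parts VALUES (?, ?)",
                 [("bolt", 500), ("nut", 20), ("washer", 5)])

# Navigational style (pre-relational, emulated here): the application
# walks every record itself and applies the selection logic in code.
low_stock_nav = [name
                 for (name, qty) in conn.execute("SELECT name, qty FROM parts")
                 if qty < 50]

# Declarative style (relational): state *what* is wanted; the query
# optimizer chooses the access path.
low_stock_sql = [row[0] for row in
                 conn.execute("SELECT name FROM parts WHERE qty < 50")]

print(low_stock_nav == low_stock_sql)  # same result, different burden
```

In the declarative version the bookkeeping of how records are found has been offloaded to the database system, which is exactly why the query optimizer became necessary.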
During the 1970s research and development activities in databases were focused on
realizing the relational database technology. These efforts culminated in the
introduction of commercially available systems in the late 70s and early 80s, such as
Oracle, SQL/DS, DB2 and INGRES. However, relational database technology, just as
each previous generation of database technology, was developed for the
conventional business data-processing applications, such as inventory control,
payroll, accounts, and so on. Attempts to make use of relational database technology
in a wide variety of other types of application have quickly exposed several serious
shortcomings of the relational and past-generation database technology. These
applications include computer-aided design, engineering, software engineering and
manufacturing (CAD, CAE, CASE and CAM) systems and applications that run on
them; knowledge-based systems (expert systems and expert system shells);
multimedia systems which deal with images, voice, and textual documents and
programming language systems. Relational and past-generation database systems
will henceforth be called conventional database systems.
We believe that both the extended relational and object-oriented approaches are
viable, and that most likely systems adopting either approach will co-exist.
Ques 4 What are group DSS? Explain.
Group Decision Support Systems (GDSS) are a class of electronic meeting systems, a
collaboration technology designed to support meetings and group work. GDSS are
distinct from computer supported cooperative work (CSCW) technologies as GDSS are
more focused on task support, whereas CSCW tools provide general communication
support.
Group Decision Support Systems (GDSS) were formerly referred to as Group Support
Systems (GSS) or electronic meeting systems, since they share similar foundations.
However today's GDSS is characterized by being adapted for a group of people who
collaborate to support integrated systems thinking for complex decision making.
Participants use a common computer or network to enable collaboration.
There is also an initiative to create open-source software that can support similar
group processes in education, where this category of software has been called a
Discussion Support System. See CoFFEE.
Ques 5 Briefly explain prototyping.
Software prototyping, an activity during software development, is the creation of
prototypes, i.e., incomplete versions of the software program being developed. A
prototype typically simulates only a few aspects of the features of the eventual
program, and may be completely different from the eventual implementation.
Prototyping can also be used by end users to describe and prove requirements that
developers have not considered, so "controlling the prototype" can be a key factor in
the commercial relationship between solution providers and their clients.
Prototyping has several benefits: The software designer and implementer can obtain
feedback from the users early in the project. The client and the contractor can check
whether the software matches the software specification according to which the
program is built. It also allows the software engineer some insight into
the accuracy of initial project estimates and whether the deadlines and milestones
proposed can be successfully met. The degree of completeness and the techniques
used in the prototyping have been in development and debate since its proposal in
the early 1970s.
This process is in contrast with the 1960s and 1970s monolithic development cycle of
building the entire program first and then working out any inconsistencies between
design and implementation, which led to higher software costs and poor estimates of
time and cost. The monolithic approach has been dubbed the "Slaying the (software)
Dragon" technique, since it assumes that the software designer and developer is a
single hero who has to slay the entire dragon alone. Prototyping can also avoid the
great expense and difficulty of changing a finished software product.
Overview
Types of prototyping
Software prototyping has many variants. However, all the methods are in some way
based on two major types of prototyping: Throwaway Prototyping and Evolutionary
Prototyping.
Throwaway prototyping
Also called close-ended prototyping. Throwaway or Rapid Prototyping refers to the
creation of a model that will eventually be discarded rather than becoming part of the
final delivered software. After preliminary requirements gathering is accomplished, a
simple working model of the system is constructed to visually show the users what
their requirements may look like when they are implemented into a finished system.
The most obvious reason for using Throwaway Prototyping is that it can be done
quickly. If the users can get quick feedback on their requirements, they may be able
to refine them early in the development of the software. Making changes early in the
development lifecycle is extremely cost effective since there is nothing at that point
to redo. If a project is changed after considerable work has been done, then small
changes could require large efforts to implement since software systems have many
dependencies. Speed is crucial in implementing a throwaway prototype, since with a
limited budget of time and money little can be expended on a prototype that will be
discarded. Another strength of Throwaway Prototyping is its ability to construct
interfaces that the users can test. The user interface is what the user sees as the
system, and by seeing it in front of them, it is much easier to grasp how the system
will work.
Prototypes can be classified according to the fidelity with which they resemble the
actual product in terms of appearance, interaction and timing. One method of
creating a low fidelity Throwaway Prototype is Paper Prototyping. The prototype is
implemented using paper and pencil, and thus mimics the function of the actual
product, but does not look at all like it. Another method to easily build high fidelity
Throwaway Prototypes is to use a GUI Builder and create a click dummy, a prototype
that looks like the goal system, but does not provide any functionality.
Not exactly the same as Throwaway Prototyping, but certainly in the same family, is
the usage of storyboards, animatics or drawings. These are non-functional
implementations but show how the system will look.
SUMMARY: In this approach the prototype is constructed with the idea that it will be
discarded and the final system will be built from scratch.
Evolutionary prototyping
This technique allows the development team to add features, or make changes that
couldn't be conceived during the requirements and design phase. For a system to be
useful, it must evolve through use in its intended operational environment. A product
is never "done"; it is always maturing as the usage environment changes. We often
try to define a system using our most familiar frame of reference: where we are now.
We make assumptions about the way business will be conducted and the technology
base on which the business will be implemented. A plan is enacted to develop the
capability, and, sooner or later, something resembling the envisioned system is
delivered.
To minimize risk, the developer does not implement poorly understood features. The
partial system is sent to customer sites. As users work with the system, they detect
opportunities for new features and give requests for these features to developers.
Developers then take these enhancement requests along with their own and use
sound configuration-management practices to change the software-requirements
specification, update the design, recode and retest.
Incremental prototyping
The final product is built as separate prototypes. At the end the separate prototypes
are merged in an overall design.
Extreme prototyping
Extreme Prototyping as a development process is used especially for developing web
applications. Basically, it breaks down web development into three phases, each one
based on the preceding one. The first phase is a static prototype that consists mainly
of HTML pages. In the second phase, the screens are programmed and fully functional
using a simulated services layer. In the third phase the services are implemented. The
process is called Extreme Prototyping to draw attention to the second phase of the
process, where a fully-functional UI is developed with very little regard to the services
other than their contract.
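The simulated services layer of the second phase can be sketched in Python; the service, its contract, and the data are invented for illustration:

```python
# Phase 2 of Extreme Prototyping: the screens are fully functional but
# talk to a simulated services layer that honours only the contract.
class SimulatedOrderService:
    """Stands in for the real service; returns canned data."""
    def get_orders(self, customer_id):
        return [{"id": 1, "customer": customer_id, "total": 99.0}]

class RealOrderService:
    """Phase 3 swaps in a real implementation behind the same
    contract (left unimplemented in this sketch)."""
    def get_orders(self, customer_id):
        raise NotImplementedError("wired to the real backend in phase 3")

def render_orders(service, customer_id):
    # The 'screen' depends only on the service contract, so it is
    # unchanged when the simulated layer is replaced by the real one.
    return [f"Order #{o['id']}: {o['total']:.2f}"
            for o in service.get_orders(customer_id)]

print(render_orders(SimulatedOrderService(), "c-42"))
```

The design choice is that the UI is finished against the contract alone, which is why the second phase can proceed "with very little regard to the services".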
Advantages of prototyping
Reduced time and costs: Prototyping can improve the quality of requirements and
specifications provided to developers. Because changes cost exponentially more to
implement as they are detected later in development, the early determination of what
the user really wants can result in faster and less expensive software.
Improved and increased user involvement: Prototyping requires user involvement and
allows them to see and interact with a prototype allowing them to provide better and
more complete feedback and specifications. The presence of the prototype being
examined by the user prevents many misunderstandings and miscommunications
that occur when each side believes the other understands what was said. Since users
know the problem domain better than anyone on the development team does,
increased interaction can result in a final product that has greater tangible and
intangible quality. The final product is more likely to satisfy the users' desire for look,
feel and performance.
Disadvantages of prototyping
Insufficient analysis: The focus on a limited prototype can distract developers from
properly analyzing the complete project. This can lead to overlooking better solutions,
preparation of incomplete specifications or the conversion of limited prototypes into
poorly engineered final projects that are hard to maintain. Further, since a prototype
is limited in functionality it may not scale well if the prototype is used as the basis of
a final deliverable, which may not be noticed if developers are too focused on building
a prototype as a model.
User confusion of prototype and finished system: Users can begin to think that a
prototype, intended to be thrown away, is actually a final system that merely needs
to be finished or polished. (They are, for example, often unaware of the effort needed
to add error-checking and security features which a prototype may not have.) This
can lead them to expect the prototype to accurately model the performance of the
final system when this is not the intent of the developers. Users can also become
attached to features that were included in a prototype for consideration and then
removed from the specification for a final system. If users are able to require all
proposed features be included in the final system this can lead to conflict.
August 2009
Bachelor of Science in Information Technology (BScIT) – Semester 4
BT0047 – Management Information System – 2 Credits
(Book ID: B0048)
Assignment Set – 2 (30 Marks)
Answer all questions 5 x 6 = 30
Ques 1. Explain the various approaches to MIS development in an
organization.
Definition of MIS:
Characteristics of MIS:
• Management oriented: The system is designed from the top to work
downwards. This does not mean that the system is designed to provide
information directly to the top management. Other levels of management are
also provided with relevant information.
• Management directed: Because of the management orientation of MIS, it is
necessary that management continuously reviews the system. For example, in the
marketing information system, the management must determine what sales
information is necessary to improve its control over marketing operations.
• Integrated: The word 'integration' means that the system has to cover all
the functional areas of an organization so as to produce more meaningful
management information, with a view to achieving the objectives of the
organization. It has to consider various sub-systems, their objectives and
information needs, and recognize the interdependence that these sub-systems
have amongst themselves, so that common areas of information are identified
and processed without repetition and overlapping. For example, in the
development of an effective production scheduling system, a proper balance
amongst the following factors is desired:
• Set-up costs
• Overtime
• Manpower
• Production capacity
• Inventory level
• Money available
• Customer service
•Common data flows: In line with the integration concept of MIS, the common data
flow concept avoids repetition and overlapping in data collection and storage,
combining similar functions and simplifying operations wherever possible. For
example, in marketing operations, orders received for goods become the basis for billing of
goods ordered, setting up of the accounts receivable, initiating production
activity, sales analysis and forecasting etc.
•Flexibility and ease of use: While building an MIS, provision is made for
requirements that may arise in the future, to keep the system flexible. A feature
that often goes with flexibility is ease of use. The MIS should incorporate all
those features that make it readily accessible and easily usable by a wide range
of users.
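The common data flows characteristic above can be sketched in Python: one order record, captured once, feeds billing, accounts receivable and sales analysis without re-entry (the record fields and functions are illustrative assumptions):

```python
# One order record, captured once, shared by several functions.
order = {"order_id": 17, "customer": "ACME", "qty": 10, "unit_price": 4.5}

def billing(o):
    # Billing draws the amount from the shared record.
    return {"invoice_for": o["order_id"], "amount": o["qty"] * o["unit_price"]}

def accounts_receivable(o):
    # Receivables use the same record; no duplicate data entry.
    return {"customer": o["customer"], "due": o["qty"] * o["unit_price"]}

def sales_analysis(orders):
    # Sales analysis aggregates over the same records.
    return sum(o["qty"] * o["unit_price"] for o in orders)

print(billing(order)["amount"],
      accounts_receivable(order)["due"],
      sales_analysis([order]))
```

Because every function reads the one captured record, there is no repetition or overlapping in data collection, which is the point of the common data flow concept.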
In a very broad sense, the term information system is frequently used to refer to the
interaction between people, processes, data and technology. In this sense, the term is
used to refer not only to the information and communication technology (ICT) an
organization uses, but also to the way in which people interact with this technology in
support of business processes. Some make a clear distinction between information
systems, ICT and business processes. Information systems are distinct from
information technology in that an information system is typically seen as having an
ICT component. Information systems are also different from business processes.
Information systems help to control the performance of business processes.
Alter argues for an information system as a special type of work system. A work
system is a system in which humans and/or machines perform work using resources
(including ICT) to produce specific products and/or services for customers. An
information system is a work system whose activities are devoted to processing
(capturing, transmitting, storing, retrieving, manipulating and displaying)
information .
Part of the difficulty in defining the term information system is due to vagueness in
the definition of related terms such as system and information. Beynon-Davies argues
for a clearer terminology based in systemics and semiotics. He defines an information
system as an example of a system concerned with the manipulation of signs. An
information system is a type of socio-technical system. An information system is a
mediating construct between actions and technology.
As such, information systems inter-relate with data systems on the one hand and
activity systems on the other. An information system is a form of communication
system in which data represent and are processed as a form of social memory. An
information system can also be considered a semi-formal language which supports
human decision making and action. Information systems are the primary focus of
study for the information systems discipline and for organizational informatics.
Another taxonomy for DSS has been created by Daniel Power. Using the mode of
assistance as the criterion, Power differentiates communication-driven DSS, data-
driven DSS, document-driven DSS, knowledge-driven DSS, and model-driven DSS.
Architecture
Development Frameworks
DSS systems are not entirely different from other systems and require a structured
approach. Such a framework includes people, technology, and the development
approach.
DSS technology levels (of hardware and software) may include:
• The actual application that will be used by the user. This is the part of the
application that allows the decision maker to make decisions in a particular
problem area. The user can act upon that particular problem.
• Generator: a hardware/software environment that allows people to
easily develop specific DSS applications. This level makes use of case tools or
systems such as Crystal, AIMMS, and iThink.
• Tools: lower-level hardware/software used by DSS generators, including special
languages, function libraries and linking modules
An iterative developmental approach allows for the DSS to be changed and
redesigned at various intervals. Once the system is designed, it will need to be tested
and revised for the desired outcome.
Classifying DSS
There are several ways to classify DSS applications. Not every DSS fits neatly into one
category; many are a mix of two or more architectures in one.
Holsapple and Whinston classify DSS into the following six frameworks: Text-oriented
DSS, Database-oriented DSS, Spreadsheet-oriented DSS, Solver-oriented DSS, Rule-
oriented DSS, and Compound DSS.
A compound DSS is the most popular classification for a DSS. It is a hybrid system
that includes two or more of the five basic structures described by Holsapple and
Whinston.
The support given by DSS can be separated into three distinct, interrelated categories
: Personal Support, Group Support, and Organizational Support.
DSS components may be classified as:
• Inputs: Factors, numbers, and characteristics to analyze
• User Knowledge and Expertise: Inputs requiring manual analysis by the
user
• Outputs: Transformed data from which DSS "decisions" are generated
• Decisions: Results generated by the DSS based on user criteria
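The four component classes above can be sketched with a toy loan-screening decision in Python; the figures and the decision criterion are invented for illustration:

```python
# Toy sketch of the four DSS component classes listed above.
inputs = {"income": 52000, "requested_loan": 150000}   # Inputs: numbers to analyze
user_knowledge = {"max_loan_to_income": 3.0}           # User knowledge and expertise

def dss_output(data):
    # Output: transformed data from which the "decision" is generated.
    return data["requested_loan"] / data["income"]

ratio = dss_output(inputs)

# Decision: result generated by the DSS based on the user's criteria.
decision = ("approve" if ratio <= user_knowledge["max_loan_to_income"]
            else "refer to officer")

print(f"loan/income ratio = {ratio:.2f} -> {decision}")
```

Note that the DSS supports rather than replaces the decision maker: the threshold comes from the user's expertise, and the system only transforms inputs into a recommendation.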
DSSs which perform selected cognitive decision-making functions and are based on
artificial intelligence or intelligent agents technologies are called Intelligent Decision
Support Systems (IDSS).
The nascent field of Decision engineering treats the decision itself as an engineered
object, and applies engineering principles such as Design and Quality assurance to an
explicit representation of the elements that make up a decision.
Applications
As mentioned above, there are theoretical possibilities of building such systems in
any knowledge domain.
One example is the Clinical decision support system for medical diagnosis. Other
examples include a bank loan officer verifying the credit of a loan applicant or an
engineering firm that has bids on several projects and wants to know if they can be
competitive with their costs. DSS is extensively used in business and management.
Executive dashboard and other business performance software allow faster decision
making, identification of negative trends, and better allocation of business resources.
A growing area of DSS application, concepts, principles, and techniques is in
agricultural production, marketing for sustainable development. For example, the
DSSAT4 package, developed through financial support of USAID during the 80's and
90's, has allowed rapid assessment of several agricultural production systems around
the world to facilitate decision-making at the farm and policy levels. There are,
however, many constraints to the successful adoption of DSS in agriculture. DSS are
also prevalent in forest management where the long planning time frame demands
specific requirements. All aspects of Forest management, from log transportation,
harvest scheduling to sustainability and ecosystem protection have been addressed
by modern DSSs. A comprehensive list and discussion of all available systems in
forest management is being compiled under the COST action Forsys.
A specific example concerns the Canadian National Railway system, which tests its
equipment on a regular basis using a decision support system. A problem faced by
any railroad is worn-out or defective rails, which can result in hundreds of derailments
per year. Under a DSS, CN managed to decrease the incidence of derailments at the
same time other companies were experiencing an increase. DSS has many
applications, as already discussed. However, it can be used in any field where
organization is necessary. Additionally, a DSS can be designed to help
make decisions on the stock market, or deciding which area or segment to market a
product toward.
CACI has begun integrating simulation and decision support systems. CACI
defines three levels of simulation model maturity. “Level 1” models are traditional
desktop simulation models that are executed within the native software package.
These often require a simulation expert to implement modifications, run scenarios,
and analyze results. “Level 2” models embed the modeling engine in a web
application that allows the decision maker to make process and parameter changes
without the assistance of an analyst. “Level 3” models are also embedded in a web-
based application but are tied to real-time operational data. The execution of “level
3” models can be triggered automatically based on this real-time data and the
corresponding results can be displayed on the manager’s desktop showing the
prevailing trends and predictive analytics given the current processes and state of the
system. The advantage of this approach is that “level 1” models developed for the
FDA projects can migrate to “level 2 and 3” models in support of decision support,
production/operations management, process/work flow management, and predictive
analytics. This approach involves developing and maintaining reusable models that
allow decision makers to easily define and extract business level information (e.g.,
process metrics). “Level 1” models are decomposed into their business objects and
stored in a database. All process information is stored in the database, including
activity, resource, and costing data. The database becomes a template library that
users can access to build, change, and modify their own unique process flows and
then use simulation to study their performance in an iterative manner.
Benefits of DSS
• Improves personal efficiency
• Expedites problem solving (speeds up the progress of problem solving in an
organization)
• Facilitates interpersonal communication
• Promotes learning or training
• Increases organizational control
• Generates new evidence in support of a decision
• Creates a competitive advantage over competition
• Encourages exploration and discovery on the part of the decision maker
• Reveals new approaches to thinking about the problem space
• Helps automate the managerial processes.
The following table contains a high-level summary of some of the major flavors of
RAD and their relative strengths and weakness.
Table 1: Pros and Cons of various RAD flavors
Criticism
Since rapid application development is an iterative and incremental process, it can
lead to a succession of prototypes that never culminate in a satisfactory production
application. Such failures may be avoided if the application development tools are
robust, flexible, and put to proper use. This is addressed in methods such as the 2080
Development method or other post-agile variants.
The EUC Ranges section describes two types of approaches that are at different ends
of a spectrum. A simple example of these two extremes can use the SQL context.
• The first approach would have canned queries and reports that for the most
part would be invoked with buttons and/or simple commands. In this approach,
a computing group would keep these canned routines up to date through the
normal development/maintenance methods.
• For the second approach, SQL administration would allow for end-user
involvement at several levels including administration itself. Users would also
define queries though the supporting mechanism may be constrained in order
to reduce the likelihood of run-away conditions that would have negative
influence on other users. We see this already in some business intelligence
methods which build SQL, including new databases, on the fly. Rules might
help dampen effects that can occur with the open-ended environment. The
process would expect, and accommodate, the possibility of long run times,
inconclusive results and such. These types of unknowns are undecidable
'before the fact'; the need to do 'after the fact' evaluation of results is a prime
factor of many higher-order computational situations but cannot (will not) be
tolerated by an end user in the normal production mode.
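The first, "canned query" approach can be sketched with Python's sqlite3 module; the sales table, its data, and the LIMIT guard against runaway result sets are illustrative assumptions:

```python
import sqlite3

# The computing group maintains a fixed, parameterized query; the end
# user supplies only a parameter (e.g. via a button press).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("East", 100.0), ("East", 250.0), ("West", 75.0)])

# LIMIT bounds the result set, one way to reduce the likelihood of the
# run-away conditions mentioned above.
CANNED_QUERY = "SELECT amount FROM sales WHERE region = ? LIMIT 100"

def sales_by_region(region):
    # End users never edit the SQL; they only pick the region.
    rows = conn.execute(CANNED_QUERY, (region,)).fetchall()
    return sum(amount for (amount,) in rows)

print(sales_by_region("East"))  # 350.0
```

The second approach would instead let users compose their own SQL, with the supporting mechanism constraining what they can run.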
Between these two extreme views of EUC there are many combinations. Some of the
factors contributing to the need for further EUC research are knowledge processing,
pervasive computing, issues of ontology, interactive visualization and analysis
coupling schemes (see Duck test), and the like.
EUC Ranges
EUC might work by one type of approach that attempts to integrate the human
interface ergonomically into a user centered design system throughout its life cycle.
In this sense, EUC's goal is to allow unskilled staff to use expensive and highly skilled
knowledge in their jobs, by putting the knowledge and expertise into the computer
and teaching the end user how to access it. At the same time, this approach is used
when highly critical tasks are supported by computational systems (commercial flight,
nuclear plant, and the like).
Another approach to EUC allows end users (SMEs, domain experts) to control and
even perform software engineering and development. In this case, it can be argued
that this type of approach results mainly from deficiencies in computing that could be
overcome with better tools and environments. But high-end roles for the computer in non-trivial domains necessitate (at least, for now) a fuller interchange, with sufficient bandwidth for conversation, that is situational and subject to near-exhaustive scrutiny; there are limits on how far we can go, which raises the necessity for a behavioral framework (also, see black box below). Such a role cannot be filled by a pre-defined system in
today's world. In a sense, the computer needs to have the same credentials as does a
cohort (scientific method of peer review) in the discipline. This type of computing falls
on the more 'open' side of the fence where scientific knowledge is not wrapped within
the cloak of IP.
In the first type of approach of EUC described above, it appears easier to teach
factory workers, for example, how to read dials, push buttons, pull levers, and log
results than to teach them the manufacturing process and mathematical models. The
current computing trend is to simulate a console with similar dials, sliders, levers, and
switches, which the end user is taught to use. To further reduce end user training,
computer consoles all contain components which are shaped, labeled, coloured, and
function similarly. EUC developers assume that once the end user knows what and
how a particular lever works, they will quickly identify it when it appears in a new
console. This means that once staff learns one console, they will be able to operate all
consoles. Admittedly each console will have new components, but training is limited
to those, not the whole console. This approach requires more than just Pavlovian
responses as the console content will have meaning that is of use and power to the
particular computing domain. That is, there may be training that reduces the time between sensor reading and action (such as the situation for a pilot of a commercial plane); however, the meaning behind the reading will include other sensor settings as well as a whole context that may be fairly involved.
Computing of this type can be labeled black box where trust will be an essential part,
behavioral analysis is the name of the game (see Duck test), and there is a disparate
(and very, very wide) gap between the domain and the computer-support ontologies.
In the other type of EUC described above, it has been argued that (a) teaching programming and computing concepts to a domain expert (say, in one of the science or engineering disciplines) and letting the expert develop rules (this type of activity can be subsumed under the topic of business rules) is easier than (b) teaching the intricacies of a complex discipline to a computer worker. (b) is the normal approach in the IT-driven situation; (a) has been the reality since day one of computing in many disciplines. One may further argue that resolving the issues of (a) and (b) is not unlike the
interplay between distributed and centralized processing (which is an age-old concern
in computing). In this sense of EUC, there may be computer scientists supporting
decisions about architecture, process, and GUI. However, in many cases, the end user
owns the software components. One thrust related to this sense of EUC is a focus on
providing better languages to the user. ICAD was an example in the KBE context. Of
late, this discipline has moved to a co-joint architecture that features advanced
interactive domain visualization coupled with a complicated API accessed via VBA, C++, and the like. This type of co-jointness is an example of a domain tool augmented
with non-trivial extensibility.
Trend
The historical view regarding end users is being eroded by the internet and wireless
communication, where the traditional end user is able to actively contribute and add
value to the computer system. Wikis are one example where end users provide the
content and free the webmaster to manage the site. Another example within the
computer field is free software, where end users can engage in all aspects of software
development, from feature requests, through testing and reviews, to usability,
documentation, and distribution. Music, pictures, and documents are remixed and
edited to satisfy personal taste and demand. The consequence is that many countries
and industries have been slow or unwilling to adjust to this emerging society, but
some have seen the potential and are exploring economic possibilities.
Another trend is one where users specify, and even develop, rules that may be fairly normal relationships (SQL) or hard-core numerical processes that may require attention to serious computational characteristics, such as ill-conditioning, parallelism, and similar issues of an ongoing nature.
Research
The human interface receives continuous attention as emerging interfaces reveal
more possibilities and risks. The quest to both internationalize (i18n) and localize (L10n) software is hampered by computers designed around the English alphabet; other major languages, such as Chinese, Japanese, and Arabic, have different requirements.
Other studies range from website accessibility to pervasive computing, with the focus
ranging from the human to the computer. The issue centers around how much the
human can safely and reliably adjust to the computer's I/O devices on the one hand,
and how unobtrusively the computer can detect the human's needs on the other.
Furthermore, issues related to computing ontologies (example: the Language/Action
perspective has found success in CRM, etc.) continue to be of interest to EUC.
Analysis
The concepts related to the end user cover a wide range (novice user to intellectual borg; see Slogan 2), hence End User Computing can have a range of forms and
values. Most early computer systems were tightly controlled by an IT department;
'users' were just that. However, at any point in the evolution of computer systems
through time, there was serious work in several domains that required user
development. The dynamics of the power struggles between centralized and
decentralized computing have been a fact; this was partially due to the emergence of
the mid-sized computers (VAX, etc.). Then, the advent of the personal workstation
opened up the door, so to speak, since it allowed a more pervasive type of
computation to emerge. The recent advent of 'web' services has extended the issues
to a more broad scope.
In the sense of serious domain computing and given the intertwining of computation
into all advanced disciplines, any tool (inclusive of any type of capability related to a
domain/discipline) that is provided by a computer becomes part of the discipline
(methodology, etc.). As such, the issue arises about how open the tool is to scrutiny.
Some disciplines require more understanding of the tool set than do others. That is,
tools that are operational in scope require less understanding than those that are
ontological. As an example of the latter type of influence on disciplines, consider the
impact that the computer has had on the scientific method. Some of the issues
related to End User Computing concern architecture (iconic versus language
interface, open versus closed, and others). These continue to be studied. Other issues
relate to IP, configuration, and maintenance. End User Computing allows more user input into system affairs, ranging from personalization to full-fledged ownership of the system.
Examples of End User Computing are systems built using the 4GLs, such as MAPPER
or SQL, or one of the 5GLs, such as ICAD. ICAD (in the KBE domain) stands as a prime
example since it is associated with the pervasive use of Lisp (one of the 3GLs) by
Engineers to accomplish remarkable effects through a long economic cycle.
Slogans
Applications
An inventory control system may be used to automate a sales order fulfillment process. Such a system contains a list of orders to be filled, then prompts workers to pick the necessary items and provides them with packaging and shipping information. Real-time inventory control systems use wireless, mobile terminals to record inventory transactions at the moment they occur. A wireless LAN transmits the transaction information to a central database. Physical inventory counting and cycle counting are features of many inventory control systems which can improve the accuracy of an organization's inventory records.
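The flow described above (a terminal records each transaction against a central database, with cycle counts reconciled against recorded quantities) can be sketched in Python. The `stock` table, item names, and the use of SQLite in place of a production DBMS are assumptions for illustration:

```python
import sqlite3

# Hypothetical central database receiving transactions from mobile terminals.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE stock (item TEXT PRIMARY KEY, on_hand INTEGER)")
db.execute("INSERT INTO stock VALUES ('widget', 10)")

def record_transaction(item, change):
    """Record a pick (negative) or a receipt (positive) at the moment it occurs."""
    with db:  # the connection context manager commits each update atomically
        db.execute("UPDATE stock SET on_hand = on_hand + ? WHERE item = ?",
                   (change, item))

def cycle_count(item, counted):
    """Reconcile a physical cycle count against the recorded quantity."""
    (recorded,) = db.execute(
        "SELECT on_hand FROM stock WHERE item = ?", (item,)).fetchone()
    return counted - recorded  # non-zero signals an inventory discrepancy

record_transaction("widget", -3)   # a worker picks 3 widgets for an order
record_transaction("widget", +5)   # a shipment of 5 widgets arrives
print(cycle_count("widget", 12))   # counted 12, recorded 12 -> prints 0
```

Recording transactions the moment they occur is what keeps the `on_hand` figure trustworthy; the cycle count then serves as the independent check the paragraph above mentions.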
Marketing information systems are intended to support management decision
making. Management has five distinct functions and each requires support from an
MIS. These are: planning, organizing, coordinating, decision making, and controlling.
Information systems have to be designed to meet the way in which managers tend to
work. Research suggests that a manager continually addresses a large variety of
tasks and is able to spend relatively brief periods on each of these. Given the nature
of the work, managers tend to rely upon information that is timely and verbal
(because this can be assimilated quickly), even if this is likely to be less accurate than
more formal and complex information systems. Managers play at least three separate
roles: interpersonal, informational and decisional. MIS, in electronic form or otherwise,
can support these roles in varying degrees. MIS has less to contribute in the case of a
manager's informational role than for the other two.
Three levels of decision making can be distinguished from one another:
• Strategic
• Control (or tactical)
• Operational.
Again, MIS has to support each level. Strategic decisions are characteristically one-off
situations. Strategic decisions have implications for changing the structure of an
organization and therefore the MIS must provide information which is precise and
accurate. Control decisions deal with broad policy issues and operational decisions
concern the management of the organization’s marketing mix.
A marketing information system has four components: the internal reporting system,
the marketing research systems, the marketing intelligence system and marketing
models. Internal reports include orders received, inventory records and sales invoices.
Marketing research takes the form of purposeful studies either ad hoc or continuous.
By contrast, marketing intelligence is less specific in its purposes, is chiefly carried
out in an informal manner and by managers themselves rather than by professional
marketing researchers.
R&D Management System
The phases of research and development vary, depending on the type of product. Briefly, five stages can be identified: 1. idea generation stage, 2. planning stage, 3. design stage, 4. pilot production stage, and 5. initial production stage. Figure 1 illustrates each process and the corresponding responsible departments. The emphasis of each stage is described below:
2. Planning stage
Once a decision is made, a project team should be organized to perform the design tasks. A project leader must be appointed to ensure that the product is completed with high quality, on schedule, and at low cost. In this stage, the business specifications are converted into manufacturable engineering specifications, and quality function deployment (QFD) can be used to assist the transformation. A master schedule should be established for each project. The schedule of a contractual design project should include the customer's requirements and due date. The engineering specifications and design schedule should be carefully reviewed to ensure that the design objective can be achieved. Relevant information should be collected to support the design function and to reduce the research/development risk. Previous design projects and customers' complaint records are two useful references.
3. Product design stage
The converted engineering specifications are the quality level that should be
accomplished in the design stage. The major tasks of this stage consist of:
The design output and records should be reviewed, and the results of FMEA or
FTA must be checked. The design review emphasizes the following points:
4. Pilot production
When the design output is proven acceptable after modification, the next step
is to construct the prototype and perform the pilot production. This stage aims
to transfer the designed results from the R&D department to the production
department. Several tasks should be completed before pilot production;
therefore, a separate stage is implemented by some enterprises. The
preparatory tasks include the following: