
Chapter 10

Preparing The Systems Proposal

Systems Analysis and Design
Kendall & Kendall
Sixth Edition
Major Topics
Systems proposal
Determining hardware/software needs
Tangible and intangible costs and benefits
Using tables, graphs, and figures
Systems Proposal
In order to prepare the systems proposal, analysts
must use a systematic approach to:
Ascertain hardware and software needs.
Identify and forecast costs and benefits.
Compare costs and benefits.
Choose the most appropriate alternative.
Ascertaining Hardware and Software Needs
Steps used to determine hardware and software needs:
Inventory computer hardware currently available.
Estimate current and projected workload for the system.
Evaluate the performance of hardware and software using some
predetermined criteria.
Choose the vendor according to the evaluation.
Obtain hardware and software from the vendor.

Steps in Acquiring Computer Hardware and Software
Hardware Inventory
When inventorying hardware, check:
Type of equipment.
Status of equipment operation.
Estimated age of equipment.
Projected life of equipment.
Physical location of equipment.
Department or person responsible for equipment.
Financial arrangement for equipment.
Evaluating Hardware
Criteria for evaluating hardware:
Time required for average transactions (including time for
input and output).
Total volume capacity of the system.
Idle time of the central processing unit.
Size of memory provided.
People Who Evaluate Hardware
The people involved include:
Systems analysts.
Purchasing, Leasing, or Renting
There are three options for obtaining computer
hardware: purchasing, leasing, or renting.
Evaluating Hardware Support
When evaluating hardware vendors, the selection
committee needs to consider:
Hardware support.
Software support.
Installation and training support.
Maintenance support.
Performance of the hardware.
Software Alternatives
Software may be:
Custom created in-house.
Purchased as COTS (commercial off-the-shelf) software.
Provided by an application service provider (ASP).
Creating Custom Software
Purchasing COTS Packages
Using an ASP
Software Evaluation
Use the following criteria to evaluate software packages:
Performance effectiveness
Performance efficiency
Ease of use
Quality of documentation
Manufacturer support
Analytic Hierarchy Processing (AHP)
Analytic Hierarchy Processing requires decision
makers to judge the relative importance of each
criterion and to indicate their preference for each
alternative on each criterion.
A disadvantage of AHP stems from the pairwise
method used to evaluate alternatives.
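The pairwise method can be sketched in a few lines of Python. This is a minimal illustration, not the full AHP procedure: it approximates the priority weights by normalizing each column of a judgment matrix and averaging the rows (the exact method uses the principal eigenvector). The criteria and judgment values below are invented for the example.

```python
def ahp_weights(matrix):
    """Approximate AHP priority weights by normalizing each
    column of the pairwise matrix and averaging across rows."""
    n = len(matrix)
    col_sums = [sum(matrix[r][c] for r in range(n)) for c in range(n)]
    normalized = [[matrix[r][c] / col_sums[c] for c in range(n)]
                  for r in range(n)]
    return [sum(row) / n for row in normalized]

# Hypothetical judgments: cost is 3x as important as support and
# 5x as important as performance; support is 2x performance.
pairwise = [
    [1,   3,   5],    # cost
    [1/3, 1,   2],    # support
    [1/5, 1/2, 1],    # performance
]
weights = ahp_weights(pairwise)  # priority weights, summing to 1.0
```

With consistent judgments like these, the column-normalization shortcut and the eigenvector method agree closely; the pairwise burden the slide mentions is visible even here, since n criteria need n(n-1)/2 judgments.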
Expert Systems
Expert systems are rule-based reasoning systems
developed around an expert in the field.
Neural Nets
Neural nets are developed by solving a number of
specific type of problems and getting feedback on
the decisions, then observing what was involved in
successful decisions.
Recommendation Systems
Recommendation systems are software and
database systems that reduce the number of
alternatives by ranking, counting, or some other method.
A recommendation system does not use weights.
It simply counts the number of occurrences.
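A counting recommendation system of this kind is easy to sketch. The vendor names and evaluations below are hypothetical; the point is that alternatives are ranked purely by occurrence counts, with no weights.

```python
from collections import Counter

def rank_by_count(choice_lists):
    """Rank alternatives by how often they occur across all
    evaluators' lists -- counting only, no weighting."""
    counts = Counter()
    for choices in choice_lists:
        counts.update(choices)
    return counts.most_common()

# Each evaluator lists the alternatives they find acceptable.
evaluations = [
    ["VendorA", "VendorB"],
    ["VendorA", "VendorC"],
    ["VendorA", "VendorB"],
]
ranking = rank_by_count(evaluations)
# VendorA occurs 3 times, VendorB twice, VendorC once.
```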
The Web and Decision Making
The World Wide Web may be used to extract
decision-making information.
Push technologies automatically deliver new Internet
information to a desktop.
Intelligent agents learn your personality and
behavior and track topics that you might be
interested in based on what they have learned.
Identifying and Forecasting Costs and Benefits
Analysts may forecast the costs and benefits of a
prospective system through:
Graphical judgment.
Moving averages.
Analysis of time series.
Estimating Trends
Trends may be estimated using:
Graphical judgment.
The method of least squares.
Moving average method.
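The least-squares and moving-average methods can both be sketched briefly. The cost figures are illustrative: `least_squares_trend` fits a straight line to evenly spaced observations, and `moving_average` smooths them over a fixed window.

```python
def least_squares_trend(ys):
    """Fit y = a + b*x by least squares, with x = 0..n-1."""
    n = len(ys)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    return a, b

def moving_average(ys, window):
    """Average each consecutive run of `window` observations."""
    return [sum(ys[i:i + window]) / window
            for i in range(len(ys) - window + 1)]

costs = [100, 110, 120, 130]         # illustrative yearly costs
a, b = least_squares_trend(costs)    # exact line: a = 100, b = 10
smoothed = moving_average(costs, 2)  # [105.0, 115.0, 125.0]
```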
Costs and Benefits
Systems analysts should take tangible costs,
intangible costs, tangible benefits, and intangible
benefits into consideration to identify cost and
benefits of a prospective system.
Tangible Benefits
Tangible benefits are advantages measurable in
dollars that accrue to the organization through use
of the information system.
Increase in the speed of processing.
Access to information on a more timely basis.
Intangible Benefits
Intangible benefits are advantages from use of the
information system that are difficult to measure.
Improved effectiveness of decision-making processes.
Maintaining a good business image.
Tangible Costs
Tangible costs are those that can be accurately
projected by systems analysts and the business
accounting personnel.
Cost of equipment.
Cost of resources.
Cost of systems analysts' time.
Intangible Costs
Intangible costs are those that are difficult to
estimate and may not be known with certainty.
Cost of losing a competitive edge.
Declining company image.
Selecting the Best Alternative
To select the best alternative, analysts should
compare costs and benefits of the prospective
alternatives using:
Break-even analysis.
Cash-flow analysis.
Present value method.
Break-Even Analysis
Break-even analysis identifies the point at which the
total costs of the current system and the proposed
system are equal.
Break-even analysis is useful when a business is
growing and volume is a key variable in costs.
Break-Even Analysis
Payback determines the number of years of
operation that the system needs to pay back the
cost of investing in it.
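Payback can be sketched as a simple accumulation: count the years until net yearly benefits cover the initial investment. All dollar figures here are hypothetical.

```python
def payback_years(investment, yearly_benefit, yearly_cost):
    """Number of whole years of operation needed for the net
    yearly benefit to pay back the initial investment."""
    net = yearly_benefit - yearly_cost
    if net <= 0:
        return None  # the system never pays for itself
    years = 0
    balance = -investment
    while balance < 0:
        balance += net
        years += 1
    return years

# Hypothetical: a $50,000 investment with $20,000/yr benefits and
# $5,000/yr operating costs nets $15,000/yr, paying back in 4 years.
years = payback_years(50000, 20000, 5000)
```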
Break-Even Analysis Showing a Payback
Cash-Flow Analysis
Cash-flow analysis is used to examine the direction,
size, and pattern of cash flow associated with the
proposed information system.
Determine when cash outlays and revenues will
occur, both for the initial purchase and over the
life of the information system.
Present Value Method
A way to assess all the economic outlays and revenues
of the information system over its economic life
and to compare costs today with future costs and
today's benefits with future benefits.
Use present value when the payback period is long,
or when the cost of borrowing money is high.
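The present value method discounts each future amount back to today. A minimal sketch, with a hypothetical cash-flow series and discount rate:

```python
def present_value(amount, rate, year):
    """Discount a future amount back to today at the given rate."""
    return amount / (1 + rate) ** year

def net_present_value(cash_flows, rate):
    """cash_flows[0] occurs now; cash_flows[t] occurs t years out."""
    return sum(present_value(cf, rate, t)
               for t, cf in enumerate(cash_flows))

# Hypothetical: a $30,000 outlay now and $12,000 net benefit per
# year for three years, discounted at 10 percent.
npv = net_present_value([-30000, 12000, 12000, 12000], 0.10)
# npv < 0 here: the discounted benefits do not quite cover the
# outlay, so this project would not be justified at 10 percent.
```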
Selecting the Best Alternative
Guidelines to select the method for comparing alternatives:
Use break-even analysis if the project needs to be justified
in terms of cost, not benefits.
Use payback when the improved tangible benefits form a
convincing argument for the proposed system.
Selecting the Best Alternative
Guidelines to select the method for comparing
alternatives (continued)
Use cash-flow analysis when the project is expensive,
relative to the size of the company.
Use present value when the payback period is long or
when the cost of borrowing money is high.
Items in the Systems Proposal
When preparing a systems proposal, systems analysts
should arrange the following ten items in order:
Cover letter.
Title page of project.
Table of contents.
Executive summary (including recommendation).
Items in the Systems Proposal
When preparing a systems proposal, systems analysts should
arrange the following ten items in order (continued):
Outline of systems study with appropriate documentation.
Detailed results of the systems study.
Systems alternatives (three or four possible solutions).
Systems analysts' recommendations.
Items in the Systems Proposal
When preparing a systems proposal, systems analysts
should arrange the following ten items in order
(continued):
Assorted documentation.
Summary of phases.
Other material as needed.
Guidelines for Using Tables
Some guidelines to use tables effectively are:
Integrate the table into the body of the proposal.
Try to fit the entire table vertically on a single page.
Number and title the table at the top of the page.
Guidelines for Using Tables
Some guidelines to use tables effectively are (continued):
Make the title descriptive and meaningful.
Label each row and column.
Use a boxed table if room permits.
Use footnotes if necessary to explain detailed information
contained in the table.
Guidelines for Using Graphs
Some guidelines for using graphs are:
Choose a style of graph that communicates your intended
meaning well.
Integrate the graph into the proposal body.
Give the graph a sequential figure number and a
meaningful title.
Label each axis, any lines, columns, bars, and pieces of
the pie on the graph.
Guidelines for Using Graphs
Some guidelines for using graphs are (continued):
Include a key to indicate differently colored lines, shaded
bars, or crosshatched areas.
Types of Graphs
Line graphs
Column charts
Bar charts
Pie charts
Line Graphs
Used to show change over time
Changes of up to five variables on a single graph
May show when lines intersect
Line Chart Example
Column Charts
Show a comparison between two or more variables
Compare different variables at a particular point in time
Easier to understand than line graphs
Column Chart Example
Variations of Column Charts
100 percent stacked column charts, which show how different
variables make up 100 percent of an entity
Deviation Column Chart
Shows deviation from average
Bar Charts
Used to show one or more variables within certain
classes or categories during a specific time period
May be sorted or organized by:
Geographical order.
Progressive order.
Pie Charts
Used to show how 100 percent of a commodity is
divided at a particular point in time
Easier to read than 100 percent stacked column
charts or 100 percent subdivided bar charts
Disadvantage is they take a lot of room on the page
Pie Chart Example
Oral Presentations
When delivering the oral presentation, keep in mind
the principles of delivery:
Project your voice loudly enough so that the audience can hear you.
Look at each person in the audience as you speak.
Make visuals large enough so that the audience can see them.
Oral Presentations (Continued)

When delivering the oral presentation, keep in mind
the principles of delivery (continued):
Use gestures that are natural to your
conversational style.
Introduce and conclude your talk confidently.
Chapter 13
Designing Databases
Systems Analysis and Design
Kendall & Kendall
Sixth Edition
Major Topics
Key design
Using the database
Data warehouses
Data mining
Data Storage Design Objectives
The objectives in the design of data storage
organization are:
The data must be available when the user wants to use it.
The data must have integrity.
It must be accurate and consistent.
Efficient storage of data as well as efficient updating and retrieval.
Data Storage Design Objectives
The objectives in the design of data storage organization are
(continued):
Information retrieval must be purposeful.
The information obtained from the stored data must be in an
integrated form to be useful for:
Decision making.

Approaches to Data Storage

There are two approaches to the storage of data in
a computer system:
Store the data in individual files each unique to a
particular application.
Storage of data in a computer-based system involves
building a database.
A database is a formally defined and centrally controlled store of
data intended for use in many different applications.
A file can be designed and built quite rapidly, and the
concerns for data availability and security are minimized.
Analysts can choose an appropriate file structure
according to the required processing speed of the
particular application system.
Objectives of Effective Databases
The effectiveness objectives of the database include:
Ensuring that data can be shared among users for a
variety of applications.
Maintaining data that are both accurate and consistent.
Ensuring all data required for current and future
applications will be readily available.
Objectives of Effective Databases
The effectiveness objectives of the database include (continued):
Allowing the database to evolve and the needs of the
users to grow.
Allowing users to construct their personal view of the
data without concern for the way the data are physically stored.
Metadata is the information that describes data in
the file or database.
Used to help users understand the form and structure of
the data
Reality, Data, and Metadata
Entity-Relationship Concepts
Entities are objects or events for which data is
collected and stored.
An entity subtype represents data about an entity
that may not be found on every record.
Relationships are associations between entities.
An entity instance is a distinct collection of data for one
person, place, thing, or event.
Entity Subtype
An entity subtype is a special one-to-one relationship used to
represent additional attributes, which may not be present
on every record of the first entity.
This eliminates null fields on the primary database.
For example, a company that has preferred customers or
student interns may need special fields for those records.
Associative Entity
Associative Entity - links two entities
An associative entity can only exist between two entities.

Attributive Entity
An attributive entity describes attributes, especially
repeating elements.
Diagram Symbols
Relationships may be:
A single vertical line represents one.
A circle represents zero or none.
A crow's foot represents many.
A self-join is when a record has a relationship with
another record on the same file.
Entity-Relationship Diagram Example
Attributes, Records, and Keys
Attributes are characteristics of an entity,
sometimes called data items.
Records are a collection of data items that have
something in common.
Keys are data items in a record used to identify the record.
Key Types
Key types are:
Primary key, unique for the record.
Secondary key, a key which may not be unique, used to
select a group of records.
Concatenated key, a combination of two or more data
items for the key.
Foreign key, a data item in one record that is the key of
another record.
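Foreign keys can be illustrated with two small in-memory tables. The table and field names here are invented: `customers` is keyed by its primary key, and each order record carries a `customer_id` foreign key pointing at it.

```python
# Parent table, keyed by the primary key "C001", "C002", ...
customers = {
    "C001": {"name": "Ajax Ltd", "city": "Toledo"},
    "C002": {"name": "Bolt Inc", "city": "Dayton"},
}

# Child table: customer_id is a foreign key into customers.
orders = [
    {"order_id": "O1", "customer_id": "C001", "total": 250.0},
    {"order_id": "O2", "customer_id": "C002", "total": 90.0},
]

def customer_for_order(order):
    """Follow the foreign key to the parent record."""
    return customers[order["customer_id"]]

buyer = customer_for_order(orders[0])  # Ajax Ltd's record
```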
A file contains groups of records used to provide
information for operations, planning, management,
and decision making.
Files can be used for storing data for an indefinite
period of time, or they can be used to store data
temporarily for a specific purpose.
File Types
Types of files available are:
Master file.
Table file.
Transaction file.
Work file.
Report file.
Master and Transaction Files
Master files
Have large records
Contain all pertinent information about an entity
Transaction records
Are short records
Contain information used to update master files
File Organization
The different organizational structures for file design include:
Sequential organization.
Linked lists.
Hashed file organization.
A database is intended to be shared by many users.
There are three structures for storing database files:
Relational database structures.
Hierarchical database structures (older).
Network database structures (older).
Logical and Physical Database Design
Normalization is the transformation of complex user
views and data to a set of smaller, stable, and
easily maintainable data structures.
Normalization (Continued)
Normalization creates data that are stored only once
on a file.
The exception is key fields.
The data structures are simpler and more stable.
The data is more easily maintained.
Three Steps of Data Normalization
The three steps of data normalization are:
Remove all repeating groups and identify the primary key.
Ensure that all nonkey attributes are fully dependent on
the primary key.
Remove any transitive dependencies, attributes that are
dependent on other nonkey attributes.
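The first of these steps, removing repeating groups, can be sketched with plain Python dictionaries (the order data are invented): the repeating `items` group moves to its own table, linked back by `order_no` as a foreign key.

```python
# Unnormalized records: each order carries a repeating group of items.
unnormalized = [
    {"order_no": 1, "customer": "Ajax", "items": ["bolt", "nut"]},
    {"order_no": 2, "customer": "Bolt", "items": ["washer"]},
]

def to_first_normal_form(records):
    """Split the repeating item group into its own table,
    keyed back to the order by order_no."""
    orders, order_items = [], []
    for rec in records:
        orders.append({"order_no": rec["order_no"],
                       "customer": rec["customer"]})
        for item in rec["items"]:
            order_items.append({"order_no": rec["order_no"],
                                "item": item})
    return orders, order_items

orders, order_items = to_first_normal_form(unnormalized)
# orders has one row per order; order_items has one row per item.
```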
Three Steps of Normalization
Data Model Diagrams
Data model diagrams are used to show relationships
between attributes.
An oval represents an attribute.
A single arrow line represents one.
A double arrow line represents many.
Data Model Example
First Normal Form (1NF)
Remove any repeating groups.
All repeating groups are moved into a new table.
Foreign keys are used to link the tables.
When a relation contains no repeating groups, it is in
the first normal form.
Second Normal Form (2NF)
Remove any partial dependencies.
A partial dependency is when the data are only
dependent on a part of a key field.
A relation is created for the data that are only
dependent on part of the key and another for data
that are dependent on both parts.
Third Normal Form (3NF)
Remove any transitive dependencies.
A transitive dependency is when a relation contains
data that are not part of the entity.
The problem with transitive dependencies is
updating the data.
A single data item may be present on many records.
Entity-Relationship Diagram and
Record Keys
The entity-relationship diagram may be used to determine
record keys.
When the relationship is one-to-many, the primary key of the file at
the one end of the relationship should be contained as a foreign
key on the file at the many end of the relationship.
A many-to-many relationship should be divided into two one-to-
many relationships with an associative entity in the middle.
Guidelines for Creating Master Files or
Database Relations
Guidelines for creating master files or database
relations are:
Each separate entity should have its own master file or
database relation.
A specific, nonkey data field should exist on only one
master file or relation.
Each master file or relation should have programs to
create, read, update, and delete records.
Integrity Constraints
There are three integrity constraints that help to
ensure that the database contains accurate data:
Entity integrity constraints, which govern the composition
of primary keys.
Referential integrity, which governs the nature of
records in a one-to-many relationship.
Domain integrity.
Entity Integrity
Entity integrity constraints are rules for primary keys:
The primary key cannot have a null value.
If the primary key is a composite key, none of the fields in
the key can contain a null value.
Referential Integrity
Referential integrity governs the nature of records
in a one-to-many relationship.
Referential integrity means that all foreign keys in
one table (the child table) must have a matching
record in the parent table.
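That matching rule can be checked mechanically. A minimal sketch with hypothetical tables: any child row whose foreign key has no parent is an integrity violation.

```python
def referential_violations(child_rows, fk_field, parent_keys):
    """Return child rows whose foreign key matches no parent key."""
    return [row for row in child_rows if row[fk_field] not in parent_keys]

parents = {"C001", "C002"}                     # parent primary keys
children = [
    {"order_id": "O1", "customer_id": "C001"},
    {"order_id": "O2", "customer_id": "C999"}, # orphaned row
]
bad = referential_violations(children, "customer_id", parents)
# bad contains the single orphaned order, O2.
```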
Referential Integrity (Continued)
Referential integrity includes:
You cannot add a record to a child table unless there is a
matching record in the parent table.
You cannot change a primary key that has matching child table
records; those child records would then hold a foreign key for
a nonexistent record.
You cannot delete a record that has child records.
Referential Integrity
A restricted database updates or deletes a key only if
there are no matching child records.
A cascaded database will delete or update all child
records when a parent record is deleted or changed.
The parent triggers the changes.
Domain Integrity
Domain integrity defines rules that ensure that only
valid data are stored on database records.
Domain integrity has two forms:
Check constraints, which are defined at the table level.
Rules, which are defined as separate objects and may be used
within a number of fields.
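A check constraint is essentially a predicate evaluated before a value is stored. A sketch, with an invented `quantity` field whose valid domain is 1 through 999:

```python
def check_quantity(value):
    """Table-level check constraint: quantity must be an
    integer between 1 and 999 inclusive."""
    return isinstance(value, int) and 1 <= value <= 999

# Values inside the domain pass; everything else is rejected
# before it ever reaches the database.
assert check_quantity(50)
assert not check_quantity(0)
assert not check_quantity("many")
```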
Retrieving and Presenting Data
The guidelines to retrieve and present data are:
Choose a relation from the database.
Join two relations together.
Project columns from the relation.
Select rows from the relation.
Derive new attributes.
Index or sort rows.
Calculate totals and performance measures.
Present data.
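Several of these steps (join, select, project, sort) can be sketched on in-memory relations. The relations and field names below are invented:

```python
employees = [
    {"emp_id": 1, "name": "Lee",  "dept_id": 10},
    {"emp_id": 2, "name": "Kim",  "dept_id": 20},
    {"emp_id": 3, "name": "Park", "dept_id": 10},
]
departments = [
    {"dept_id": 10, "dept": "Sales"},
    {"dept_id": 20, "dept": "IT"},
]

# Join the two relations on dept_id.
joined = [dict(e, **d) for e in employees for d in departments
          if e["dept_id"] == d["dept_id"]]

# Select the Sales rows, project the name column, and sort.
sales = [row for row in joined if row["dept"] == "Sales"]
names = sorted(row["name"] for row in sales)
# names == ["Lee", "Park"]
```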
Denormalization is the process of taking the logical
data model and transforming it into an efficient
physical model.
Data Warehouses
Data warehouses are used to organize information
for quick and effective queries.
Data Warehouses and Database Differences
In the data warehouse, data are organized around major subjects.
Data in the warehouse are stored as summarized rather than
detailed raw data.
Data in the data warehouse cover a much longer time frame
than in a traditional transaction-oriented database.
Data Warehouses and Database
Differences (Continued)
Data warehouses are organized for fast queries.
Data warehouses are usually optimized for answering
complex queries, known as OLAP.
Data warehouses allow for easy access via data-mining software.
Data Warehouses and Database
Differences (Continued)
Data warehouses include multiple databases that
have been processed so that data are uniformly
defined, containing what is referred to as “clean” data.
Data warehouses usually contain data from outside sources.
Online Analytic Processing (OLAP)
Online analytic processing (OLAP) is meant to answer
decision makers’ complex questions by defining a
multidimensional database.
Data mining, or knowledge data discovery (KDD), is the
process of identifying patterns that a human is incapable
of detecting.
Data Mining Decision Aids
Data mining has a number of decision aids
available, including:
Statistical analysis.
Decision trees.
Neural networks.
Intelligent agents.
Fuzzy logic.
Data visualization.
Data Mining Patterns
Data mining patterns that decision makers try to
identify include:
Associations, patterns that occur together.
Sequences, patterns of actions that take place over a
period of time.
Clustering, patterns that develop among groups of people.
Trends, the patterns that are noticed over a period of time.
Web-Based Databases and XML
Web-based databases are used for sharing data.
Extensible markup language (XML) is used to define
data used primarily for business data exchange
over the Web.
Chapter 16
Quality Assurance
Through Software Engineering
Systems Analysis and Design
Kendall & Kendall
Sixth Edition
Major Topics
Six Sigma
Quality assurance
Structure charts
Data and control passing

Quality Assurance
Three quality assurance approaches through
software engineering have been developed to
evaluate the quality of the information system's
design and analysis.
Guidelines for Quality Software
Quality assurance approaches are:
Securing total quality assurance through designing
systems and software with a top-down and modular approach.
Documenting software with appropriate tools.
Testing, maintaining, and auditing software.
Six Sigma
Six Sigma is a culture built on quality.
Six Sigma uses a top-down approach.
Project leader is called a Black Belt.
Project members are called Green Belts.
Master Black Belts have worked on many projects.
There are seven steps in Six Sigma.
Steps of Six Sigma
Total Quality Management
Total quality management (TQM) is a conception of quality
as an evolutionary process toward perfection instead of
conceiving quality as controlling the number of defective
products produced.
The full organizational support of management and early
commitment to quality from the analyst and from the
business are necessary.
Structured Walkthroughs
One of the strongest quality assurance actions is
structured walkthroughs.
Walkthroughs use peer reviewers to monitor the
system's programming and overall development.
They point out problems, and allow the programmer
or analyst to make suitable changes.
Personnel Involved in Structured Walkthroughs
Structured walkthroughs involve at least four people:
The person responsible for the part of the system being reviewed.
A walkthrough coordinator.
A programmer or analyst peer.
A person to take notes about suggestions.
Top-Down and Bottom-Up Approaches
The bottom-up approach and the top-down
approach are available for quality system design.
The Bottom-Up Approach
The bottom-up design refers to:
Identifying the processes that need computerization as
they arise.
Analyzing them as systems.
Either coding them or purchasing COTS (commercial off-
the-shelf) software to meet the immediate problem.
Disadvantages of a Bottom-Up Approach
The disadvantages of a bottom-up approach to
design are:
There is a duplication of effort in purchasing software, and
entering data.
Much worthless data are entered into the system.
Overall organizational objectives are not considered and
therefore cannot be met.
The Top-Down Approach
Top-down design allows the systems analyst to
ascertain overall organizational objectives along
with ascertaining how they are best met in an
overall system.
The system is then divided into subsystems.
Using the Top-Down Approach
Advantages of the Top-down Approach
The advantages of a top-down approach to design are:
Avoiding the chaos of attempting to design a system “all
at once”.
The ability to have separate systems analysis teams
working in parallel on different but necessary subsystems.
Avoiding losing sight of system goals as a result of
getting mired in detail.
Disadvantages of the Top-Down Approach
The three disadvantages of a top-down approach are:
There is a danger that the system will be divided into the
wrong subsystems.
Once subsystem divisions are made, their interfaces may
be neglected or ignored.
The subsystems must be reintegrated, eventually.
Modular Programming and the Top-Down Approach
The modular programming concept is useful for a
top-down approach.
Once the top-down design approach is taken, the whole
system is broken into logical, manageable portions, or modules.
These modules should be functionally cohesive, accomplishing
only one function.
Advantages of Modular Programming
Advantages of modular programming are:
Modules are easier to write and debug.
Tracing an error in a module is less complicated.
A problem in one module should not cause problems in others.
Modules are easier to maintain.
Modules are easier to grasp because they are self-
contained subsystems.
Guidelines for Modular Programming
Four guidelines for correct modular programming are:
Keep each module to a manageable size.
Pay particular attention to the critical interfaces.
Minimize the number of modules the user needs to modify
when making changes.
Maintain the hierarchical relationships set up in the top-
down phases.
Linking Programs in Microsoft Windows
There are two systems to link programs in Microsoft Windows:
Dynamic Data Exchange (DDE) updates data in one
program based on data in another program.
Object Linking and Embedding (OLE) where an object in a
second program retains the properties of an object in
the first program.
Structure Charts
The recommended tool for designing a modular, top-down
system is a structure chart.
They help systems analysts by providing a picture of
modules and the relationships among those modules.
A structure chart consists of rectangular boxes, which
represent modules, connected by lines or arrows.
Data and Control Passing
Data and control passed between structure chart
modules is either a:
Data couple, passing only data, shown as an arrow with an
empty circle.
Control couple, passing switches or flags, shown as an
arrow with a filled-in circle.
Switches, which have only two values.
Flags, that have more than two values.
Structure Chart and Coupling
Control Coupling
Control flags should be passed up the structure chart.
Control modules make the decisions about which lower-
level modules should be executed.
Lower-level modules are functional, performing only one task.
Minimal Coupling
Systems analysts should keep the number of
couples to a minimum.
The fewer data couples and control flags one has in the
system, the easier it is to change the system.
Data and Control Passing (Continued)
Coupling between structure chart modules may be:
Data coupling, where only the data required by the module
are passed.
Stamp coupling, where more data than necessary are passed
between the modules.
Control coupling, where control flags or switches are passed.
Creating Reusable Modules
Data Flow Diagrams and Structure Charts
A data flow diagram may be used to create a
structure chart in the following two ways:
Indicating the sequence of the modules.
Indicating modules subordinate to a higher module.
Types of Modules
Modules fall into three classes:
Control modules, determining the overall program logic.
Transformational modules, changing input into output.
Functional modules, performing detailed work.
Improper Subordination
A subordinate module is one found lower on the
structure chart, called by another higher module.
Allowing a lower-level module to perform any
function of the calling, higher-level module, is
called improper subordination.
System Documentation
One of the requirements for total quality assurance is
preparation of an effective set of system documentation.
This documentation serves as:
A guideline for users.
A communication tool.
A maintenance as well as a development reference.
Forms of System Documentation
Documentation can be one of the following:
Procedure manuals.
The FOLKLORE method.
Pseudocode is an English-like code to represent the
outline or logic of a program.
It is not a particular type of programming code, but it
can be used as an intermediate step for developing
program code.
Procedure Manuals
Common English-language documentation
Background comments
Steps required to accomplish different transactions
Instructions on how to recover from problems
Online help may be available
“Read Me” files included with COTS software
Procedure Manuals (Continued)
The biggest complaints with procedure manuals are:
They are poorly organized.
It is difficult to find needed information.
The specific case in question does not appear in the manual.
The manual is not written in plain English.
Web Documentation
A Web site can help maintain and document the
system by providing:
FAQ (Frequently Asked Questions).
Help desks.
Technical support.
Fax-back services.
FOLKLORE Documentation
The FOLKLORE documentation method collects
information in the categories of customs, tales,
sayings, and art forms.
FOLKLORE Documentation
Choosing a Documentation Technique
Guidelines for choosing a documentation technique:
Is it compatible with existing documentation?
Is it understood by others in the organization?
Does it allow you to return to working on the system after
you have been away from it for a period of time?
Choosing a Documentation Technique
Guidelines for choosing a documentation technique (continued):
Is it suitable for the size of the system you are working on?
Does it allow for a structured design approach if it is
considered to be more important than other factors?
Does it allow for easy modification?
Testing Overview
The new or modified application programs,
procedural manuals, new hardware, and all system
interfaces must be tested thoroughly.
Testing Procedures
The following testing process is recommended:
Program testing with test data.
Link testing with test data.
Full system testing with test data.
Full system testing with live data.
Organizational Roles and Testing
Program Testing with Test Data
Desk check programs.
Test with valid and invalid data.
Check for errors and modify programs.
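Testing with valid and invalid data can be sketched as assertions against a routine. The routine here is hypothetical; the pattern is to exercise the normal path and each class of bad input.

```python
def parse_quantity(text):
    """Return the quantity as a positive int, or None for
    invalid input (the routine under test; hypothetical)."""
    try:
        value = int(text)
    except (TypeError, ValueError):
        return None
    return value if value > 0 else None

# Valid test data
assert parse_quantity("7") == 7
# Invalid test data: non-numeric, negative, missing
assert parse_quantity("abc") is None
assert parse_quantity("-3") is None
assert parse_quantity(None) is None
```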
Link Testing with Test Data
Also called string testing
See if programs can work together within a system
Test for normal transactions
Test with invalid data
Full System Testing with Test Data
Operators and end users test the system.
Factors to consider:
Is adequate documentation available?
Are procedure manuals clear?
Do work flows actually flow?
Is output correct and do the users understand the output?
Full System Testing with Live Data
Compare the new system output with the existing
system output.
Only a small amount of live data are used.
Maintenance
Maintenance is performed to:
Repair errors or flaws in the system.
Enhance the system.
Ensure feedback procedures are in place to communicate problems.
Auditing
There are internal and external auditors.
Internal auditors study the controls used in the system to
make sure that they are adequate.
Internal auditors check security controls.
External auditors are used when the system influences a
company’s financial statements.