
Laureate Online Education

Software Engineering

All rights reserved, 2002-2013. The Software Engineering module, in all its parts:
syllabus, guidelines, lectures, discussion questions, technical notes, images, and any
additional material, is copyrighted by Laureate Online Education.

Module: SE

Seminar 1

Seminar 1 - Introduction to Software Engineering and the Software Process


Gail Miles
Study Chapters 1 and 2
The first week is used to get acquainted with each other, with the discipline of
Software Engineering, and with the Software Process. We will start working on
mini team projects in week 2 once we are comfortable with each other.
PART 1: Overview: Introduction to Software Engineering
Software Engineering is not a continuation of learning skills found in programming. The
product of a professional programming effort is much more than a "program". The
software application product requires well-designed and documented applications which
simplify maintenance, are thoroughly tested for high reliability, and adhere strongly to
version control so later versions can be tested against early test cases to verify that new
changes do not compromise existing functions.
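The version-control discipline described above can be sketched in a few lines: test cases recorded against an early version are kept and re-run, unchanged, against every later version. Everything in this sketch is hypothetical; `price_with_tax` and its test values are invented for illustration, not part of any real product.

```python
# A minimal sketch of regression testing under version control: the cases
# written against version 1 are re-run unchanged against later versions.

def price_with_tax(price, rate=0.20):
    """Version 2 of a hypothetical function; behavior must not regress."""
    return round(price * (1 + rate), 2)

# Test cases recorded when version 1 shipped (kept under version control).
VERSION_1_CASES = [
    ((100.0,), {}, 120.0),
    ((0.0,), {}, 0.0),
    ((50.0,), {"rate": 0.10}, 55.0),
]

def run_regression_suite():
    """Re-run every historical case; any failure means a regression."""
    failures = []
    for args, kwargs, expected in VERSION_1_CASES:
        actual = price_with_tax(*args, **kwargs)
        if actual != expected:
            failures.append((args, kwargs, expected, actual))
    return failures

print("regressions:", run_regression_suite())  # an empty list means no regressions
```

In practice the suite grows with each release, so a change that quietly breaks an old function is caught the moment the historical cases are replayed.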
Software engineering is now recognized as an engineering discipline which builds on
the fundamentals of computer science, but where computer science is interested in the
theory of computing, software engineering deals with the practical side of software
development by concentrating on project management to facilitate the development of a
software product on time, within budget, complying with quality standards, and meeting
performance specifications.
Software engineering encompasses complex and non-trivial activities because of the
many skills required to manage, design, implement, test, install, and maintain a
programming product, and because the tools, although greatly improved, are not
universally used for the development effort.
Early Software Development Problems
The challenge of the first twenty-five of the computer industry's seventy years was to
develop hardware to be more reliable and less expensive. Originally, software was so
insignificant that vendors bundled the software with the hardware sale. With the birth of

the microchip, the cost of hardware development considerably decreased, but the cost
of software continued to rise. The industry refocused its efforts on the development of
software. The rapid growth of computer use was fueled with the introduction of the
microcomputer thirty-five years ago. Software became the key factor in growth of the
microcomputer.
A variety of reasons contributed to the dramatic rise in the cost of software. In the 1970s,
new software development companies proliferated to write software that could be
modified for a variety of tasks for a variety of clients. Organizations jumped at the chance
to buy software that was pre-packaged. This concept was never realized for
enterprise applications. In fact, the opposite became true. Organizations soon
recognized that, since the software was written for many customers, programmers were
needed to build enterprise interfaces and modify standard installs. Millions of lines of
code were developed or modified by in-house programmer teams. Most of this
software was written as a quick, temporary fix to a specific problem. Very little analysis
was done, and little or no documentation was written. The maintenance of these
applications became a nightmare, and since software does not wear out, quick fixes
often stayed in place for years. These have become the legacy systems which many
enterprises continue to maintain at a very high cost.
There were no defined standards for software development in the early systems.
Documentation was poor. The fact that Computer Science as a discipline was just
emerging in colleges and universities further compounded software development
problems. Most software programs were written by self-taught programmers. As a
group, internal technical staff members responsible for developing and maintaining in-house code were not trained software developers. Thus, they often did not have a
broad understanding of the need for standardized processes to improve maintainability.
Software was written for a specific hardware platform with no thought to
portability. When new equipment and/or operating systems were installed, major rewrites of the code were necessary. Maintenance of the programs continued to eat into
corporate revenues. More programmers were hired to maintain aging software. The
cost of software soared. The industry began to seek solutions for these problems.
By the late 1970s, the phrase "Software Crisis" was coined. The primary characteristics of
the problem included:
1. The inability to complete software projects on time. Some projects were reaching
years past their scheduled due date.
2. Low cost estimates for software projects. Some projects were costing 5 to 10 times
the original estimate.
3. Questionable software quality with continuing need for major software patches. It
was not unusual to have a version 4.0, 4.1, 4.3, 4.5 on a single product where each
version fixed a major problem in the software.

4. Weak project managers who had sound management skills but little understanding of
managing technical staff working on software projects.
Software was and continues to be a logical system as opposed to a physical system
(such as a building or a bridge). Since each piece of software is developed with mind
power, the development of software was and continues to be very labor intensive.
Compounding this was the inability to develop reusable code so every project was
reinventing the wheel.
As the microchip permeated the computer industry, the development of distributed
systems and personal computers exploded. This explosion of computing for the masses
was the catalyst for the growth of software companies. The rise of user-friendly software
in the 1980s turned application packages such as word processors, spreadsheets, and
databases into commodity products. The world was not prepared for the human
resource needs necessary to sustain the demand of software development.
A Solution
By the late 1970s, the computer industry was desperate to put procedures in place to
combat some of these problems. They pulled concepts from both management and
engineering to develop standards for software development. Fritz Bauer (at the First
NATO Conference on Software Engineering, 1968) provided an early definition of what
evolved as Software Engineering: "The establishment and use of sound engineering
principles in order to obtain economically software that is reliable and works efficiently
on real machines" (Randell, 1996). The assumption was that we could use the same
techniques used by engineers building physical structures to build (develop) software.
Software engineering provides standard structures for the development of software.
These structures include procedures which drive the development of software and the
techniques we follow to improve cost and time estimates. The key tools used to drive
the software engineering process have become CASE (Computer-Aided Software
Engineering) tools.
Systems Engineering
Systems engineering is the process of designing socio-technical (human, social, and
environmental) systems that produce a product. The primary focus is to understand the
system by following a standard process: functional and requirements analysis, which
includes designing, implementing, and testing to verify the product meets the
requirements. Activities such as the study of human factors, reliability, feasibility, and
safety are part of this overall effort.
The process of systems engineering is explained clearly in the text -- the final goal of
the formal process is to design a system that addresses as many variables as possible
to improve our ability to develop a successful system.

Evolving Software Engineering


The next sections discuss a few modern trends in Software Engineering.
Collaboration
Collaborative activities are critical in the current global climate producing today's
complex systems. Software engineers no longer develop and maintain a single, isolated
programming environment. Instead, we have interconnected heterogeneous systems
communicating together (Redmiles et al., 2006). At the heart of this trend is the
richness of web technologies and the move toward ubiquitous computing.
Globalization as well as technology innovation demands the ability to collaborate
geographically across many cultures and languages. The Internet is the driving force.
Thomas Friedman (2005) summed up the impact of the Internet very well: "It created a
global platform that allowed more people to plug and play, collaborate and compete,
share knowledge and share work, than anything we have ever seen in the history of the
world."
Underlying the need to build collaborative environments is the necessity to design,
implement, and effectively use collaborative technology tools such as document sharing
(SharePoint, Google Docs), extranets, groupware (MS Exchange, Lotus Notes/Domino),
broadband networks, and videoconferencing.
Pervasive Computing
Postnote (2006) defines pervasive computing as the integration of computing into the
everyday lives of people and their environment. We already see rapid development of
technologies. Computers are pervasive in the workplace, and now we find computers
where we play and study. The rise of mobile devices with solid internet access is
making pervasive computing a reality. We can see signs of this everywhere: Starbucks
(coffee houses), McDonald's (fast-food chains), even campgrounds that offer free wi-fi
connections.
The trend toward very small computers (the size of a coin) being embedded into
inanimate and animate objects is already seen in embedded medical devices (such as
medical diagnostic hand-helds), environmental monitoring, care for the elderly, etc.
Pressman (2005), a well-known software engineer, sees pervasive computing in two
phases:
Phase 1: Current state of pervasive computing
a. Mobile devices and connectivity -- network connectivity everywhere. Only a few
years ago, we worried about finding a network while traveling. Now we expect
the connectivity.
b. Context awareness -- the ability of our wireless devices to know and
automatically connect to networks in the environment.

c. Smart apps -- allow communication from one object to another. A simple
example is an app called Bump, available for the Android OS. When two
smartphones are close (and Bluetooth is turned on), they can exchange data
simply by tapping a button.
Phase 2: (in the next decade) [I think this will be much sooner and may already be
upon us]
a. Mobile user profiles will be automatically recognized by other objects.
b. Artificial Intelligence concepts integrated into devices so smart objects interact
with their environments based on situational characteristics.
The impact of pervasive computing on software engineers is huge: how do we test
environments that are constantly shifting? This also leads to the problem of
adaptability. Since the environments are rapidly changing, how do we adapt our
processing requirements? Adding to this is the complexity of communication as we face
these future challenges. Some of the answers to these questions lie in rapid
development methods (such as Agile) and the maturity of reuse.
Short video showing examples of pervasive computing of the future
http://www.youtube.com/watch?v=KsKne-fw-X0
Cloud Computing and SaaS (Software as a Service)
Cloud computing is the label we use to define the process where data is stored on
internet-connected servers. When a client needs the data, a copy is downloaded to the
individual desktop. We see its use in all types of systems: corporate, entertainment
(downloading movies from the web), gaming (Wii, PS3, Xbox 360), and handhelds
(United Parcel Service, etc.).
The main advantages include data at our fingertips regardless of geography, data
sharing (such as game information), and resource sharing. The pressure on Software
Engineering is the need for complex, highly reliable security systems built on an
infrastructure that may not be that reliable.

Professional and Ethical Responsibility.


All professions are governed by Codes of Conduct. Professional activities have the
power to affect outcomes. Thus, along with the special rights of a profession come
responsibilities (Johnson, 2009) to act ethically.
As Information Technology has matured, the need to build an infrastructure of Codes of
Conduct becomes paramount. In recent years, serious concern has been raised in the
computing industry because of the lack of standard professional ethics. This has been

exacerbated by the number of computer-related disasters and denial-of-service attacks.


Many of our software systems are categorized as critical systems. As such, failure of
system functions can easily cause millions of dollars of damage or even deaths.
Strong, widely-held Codes of Conduct and Professional Ethics must be used to guide
the professional behavior and practice of the profession.
The institutionalization of Codes of Conduct and Codes of Practice is important for the
profession to grow. Currently, there is a distinction between professions such as
Information Systems and controlled professions such as Medicine and Law, where the
loss of membership may also imply the loss of the right to practice (Davison et al.,
2001). In professions such as Information Technology, Software Engineering, and
Systems Analysis, there are organizations guiding professional ethics: the British
Computer Society (BCS), the Association for Computing Machinery (ACM), the Institute
of Electrical and Electronics Engineers (IEEE), the Association of Information
Technology Professionals (AITP), and the International Federation for Information
Processing (IFIP). Each of these organizations has a written Code of Conduct to which
members are expected to adhere. The difference between a standard profession such
as these and the controlled professions such as medicine and law has to do with what
happens when a member breaches the code. If there is a breach of ethics on the part of
a medical doctor, he or she loses the ability to practice medicine. A breach of ethics in
Information Technology by members of a society will remove them from the
professional organization but does not restrict them from practicing their profession.
The Association for Information Systems identifies several areas that are included in
most Codes of Conduct (Davison et al., 2001):
Academic honesty
Adherence to confidentiality agreements
Data privacy
Handling of human subjects
Impartiality in data analysis and professional consulting
Professional accountability
Resolution of conflicts of interest
Software piracy
Summary
Software development has many challenges in current complex software
environments. Developers will be required to develop more complex systems in
environments that are littered with risks, internal and external. Software engineering
techniques can provide practices by which this complexity can be managed to deliver
high-quality, bug-free software systems.

PART 2: Software Processes


Software Processes incorporate all the tasks necessary to produce a software product.
There are many different approaches used to model the software process, but key
activities are found in most approaches.
Software specification -- defining the functions and the properties attached to the functions.
Software design and implementation -- producing a model or representation of a process that
is then built.
Software validation -- validating the software to answer the question: Are we building the right
system?
Software evolution -- monitoring the evolution of the software to meet the needs of the
customer and the changing environment.

One of the first tasks in the development of a product is providing an abstract
representation of the software process. As with programming, there is always more
than one way to represent the process. These are called process models. The two
which are widely used are the waterfall model and the evolutionary development model.
The Agile model, which has rapidly gained popularity in the past five years, is currently
a form of the evolutionary model which usually has a faster development cycle
(Ambler, 2006). It is discussed in detail in the next seminar. Component-based
software development and reuse have also become popular approaches where large,
complicated software systems are created by choosing a variety of software modules
that are commercially available (COTS) (Qureshi and Hussain, 2008).
The waterfall model is the oldest and is still widely used in software engineering
(Parekh, 2005). The software process is divided into five distinct stages, each of which
spills into the next stage. You can see a graphical image of this model in Figure 2.1
of your Sommerville text.
The analysis or requirements definition phase of the process establishes the functional
requirements of the software system to be developed. This system analysis phase
defines the components of the system. These components include not only the
software to be produced, but also the hardware, data, and human systems. Once
system elements are defined, the requirements are defined along functional tasks.
The analysis phase requires a close relationship between the customer and the
developer to identify functional tasks, performance benchmarks, and user interface

requirements early in the process. Once the requirements are defined and the
customer has signed off on them, the process moves to the design phase. The design
phase translates the requirements into an architectural representation of the total
system which is used to map the coding process. The implementation phase converts
the design model into a working software model. The testing phase focuses on testing
both logical and functional tasks to discover errors, and finally the operation and
maintenance phase works to maintain a running product and further enhancements.
There are difficulties in the waterfall model (Lewallen, 2005). One major drawback is
the need to define a complete set of software requirements early in the development
cycle. It does not allow for on-going changes in requirements. This is especially
problematic with the complexity of software development of large projects where the
environment and tools are constantly evolving and changing. After looking at the
emerging trends, we can see where the waterfall approach would not be effective for
most of the rapidly developing complex systems. The nature of current software
development efforts requires software to be developed in phases where each phase
then determines the requirements of the next phase.
The evolutionary development process addresses some of these issues and allows
for an initial implementation of user requirements and new features added by customers
as they are defined (Figure 2.2. in Sommerville). The advantage here is the ability to
design specifications incrementally. The newer approaches to Project Management use
this approach. The Team Software Process (TSP), developed by the Software
Engineering Institute (Humphrey, 2000), is based on the cyclic development of software.
Agile Programming is a form of an evolutionary development process.
The disadvantage of phased development lies in the possibility of a poorly structured
system if not managed properly. It is also often difficult to determine the scope of the
project because customers could theoretically add requirements and features
indefinitely. Thus, it is best used for small systems, although it is evolving and has been
used for larger projects. The splintered development sometimes produces serious
problems for large projects. It appears, however, to be successful in bringing a project
to completion (Sünbül et al., 2001). The statistics collected on the Agile approach are
encouraging. A survey completed by Ambysoft found that Agile success rates (72%)
were better than those of the traditional approach (63%) (Ambler, 2007).
Component-based software engineering (CBSE) is based on the concept of code
reuse: using existing components of source code to develop new software systems
(Qureshi and Hussain, 2008). These components include not only programming code,
but also documentation, architectural design, and data. The underlying concept is that
code reliability can be increased and the cost reduced by using fully validated code and

tools. Sommerville addresses the basic foundation of COTS and CBSE in Chapter 16.
We will study this in detail in Seminar 6.

Process Activities
With any of these approaches, it is necessary to determine the problem space, define a
solution, implement it, and test the implementation. Thus, although developmental
approaches differ, they have processes in common that follow this sequence.
Software Specification
Software specification defines what services are required from the system. This
technique is called requirements engineering. Requirements engineering includes four main
phases:
Feasibility study -- We can assume that most projects are feasible given unlimited resources
and unlimited time. Thus, we need to determine whether the project is feasible given what we
have to work with.
Requirements Elicitation and Analysis -- Once we determine the project can be
accomplished with the available resources, the system requirements are identified. One of the
best methods is the use of a software prototype. We now have rapid prototyping tools allowing
us to simulate a working system without completed code.
Requirements Specification -- We define the project specifications using the system
requirements identified in the requirements analysis.
Requirements Validation -- We perform a validation check to guarantee that our system is
consistent and complete.

A graphical representation of these phases is shown on Figure 2.4 in Sommerville.
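A small sketch of what an automated requirements validation check might look like. The requirement fields used here (`id`, `description`, `depends_on`) are assumptions chosen for illustration, not a standard schema.

```python
# A minimal sketch of requirements validation: checking a requirement set
# for consistency (no duplicate ids, no dangling dependencies) and
# completeness (no empty descriptions). Field names are hypothetical.

def validate_requirements(requirements):
    """Return a list of problems found in a list of requirement dicts."""
    problems = []
    ids = [r.get("id") for r in requirements]
    # Consistency: no two requirements may share an id.
    for rid in set(ids):
        if ids.count(rid) > 1:
            problems.append(f"duplicate id: {rid}")
    known = set(ids)
    for r in requirements:
        # Completeness: every requirement needs a non-empty description.
        if not r.get("description", "").strip():
            problems.append(f"{r.get('id')}: missing description")
        # Consistency: dependencies must refer to requirements that exist.
        for dep in r.get("depends_on", []):
            if dep not in known:
                problems.append(f"{r.get('id')}: unknown dependency {dep}")
    return problems

reqs = [
    {"id": "R1", "description": "User can log in", "depends_on": []},
    {"id": "R2", "description": "", "depends_on": ["R1", "R9"]},
]
print(validate_requirements(reqs))
```

Real validation also involves human review against customer intent; a mechanical check like this only catches the structural problems.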


Once we have collected the project requirements, we use them to create a design.
Once a full software design is complete, it is given to the implementers to code the
solution. The design process is an iterative one where we start with the requirements
and progressively add details until it is ready for implementation.
There are four basic activities as a part of our progression:
Architectural Design (discussed in Seminar 3): builds the structure and relationships
in the software system. A good architectural design includes a representation of the
main communication and control structures (Architecture Discipline, 2010).

Interface Design: defines and models the interfaces between the components. Today,
we usually equate interface design with the user interface, but before the UI, we must
define how the software components fit together.
Component Design (discussed in Seminar 6): defines the pieces of the final solution.
No software system is developed as one large system. Instead, we design each of the
pieces (components) which we fit together into a final solution. A component may not
yet exist or we may use reusable components. The component design identifies these.
Database Design: defines the data structures and how data is represented in the
system.
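Interface design, in the sense above of defining how components fit together before any UI exists, can be illustrated with a minimal sketch. The `StorageBackend` contract and its in-memory implementation are hypothetical names invented for this example, not from the Sommerville text.

```python
# A sketch of a component interface: callers depend only on the contract,
# so one component can replace another without changing the rest.
from abc import ABC, abstractmethod

class StorageBackend(ABC):
    """Contract between the business-logic component and any storage component."""
    @abstractmethod
    def save(self, key: str, value: str) -> None: ...
    @abstractmethod
    def load(self, key: str) -> str: ...

class InMemoryStorage(StorageBackend):
    # One concrete component; a database-backed one could replace it later
    # without changing callers, because both honor the same interface.
    def __init__(self):
        self._data = {}
    def save(self, key, value):
        self._data[key] = value
    def load(self, key):
        return self._data[key]

store: StorageBackend = InMemoryStorage()
store.save("greeting", "hello")
print(store.load("greeting"))  # hello
```

The point is that the interface is designed first; which component sits behind it is a later, swappable decision.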

Prototyping
The sooner we can provide a visual "look and feel" of the proposed design, the better it
is for the customer and for the designer. The problem has been that, until recently, we
had no way to create an electronic mock-up of the design without considerable effort,
and once that effort was completed, designers were reluctant to throw away their
work. Now, we have very good, inexpensive software methods to develop software
prototypes that can be easily modified or thrown away. As we develop the prototype, we
don't concern ourselves with all the system constraints.
"A software rapid prototype is a dynamic visual model that provides a communication
tool for customer & developer that is more effective for displaying functionality than
narrative prose or static visual models" (Novac, 2007). Using prototypes provides a
method of mitigating the risk of change. Since we are prototyping pieces, it is not
necessary to represent the complete system. Thus, system prototyping provides a way
to cope with change since customer ideas can be implemented to check on the
requirements. The value of rapid prototyping is the ability to construct it quickly.
There are many prototyping tools. These tools range from simple word-processing tools
to sophisticated IDE environments. A developer may use several tools for one
implementation. (Bhagwat, 2003; Rees, 1994)
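A throwaway prototype in the spirit described above might be as simple as a function backed by canned data: enough for the customer to react to the interaction flow before any real system exists. The order-lookup scenario and all sample records here are invented for illustration.

```python
# A minimal throwaway prototype: the "system" is faked with canned data
# so the customer can see the look and feel of a lookup feature.
# No database, no error handling, no security -- deliberately.

FAKE_ORDERS = {
    "1001": {"item": "keyboard", "status": "shipped"},
    "1002": {"item": "monitor", "status": "processing"},
}

def lookup_order(order_id):
    """Prototype behavior only: answers from the canned records above."""
    order = FAKE_ORDERS.get(order_id)
    if order is None:
        return f"Order {order_id} not found."
    return f"Order {order_id}: {order['item']} ({order['status']})"

# A session a customer might walk through during a demo.
for oid in ["1001", "9999"]:
    print(lookup_order(oid))
```

Because nothing here is real, the whole sketch can be modified in minutes after customer feedback, or thrown away entirely, which is exactly the property rapid prototyping relies on.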
Prototyping Summary

A prototype can be used to provide end-users a concrete impression of the system's
capabilities. It is increasingly used for system development where rapid development is
essential. In fact, rapid development of prototypes is expected in most software
development activities.

References
Ambler, S. (2007) IT Project Success Rates Survey: August 2007. Ambysoft.
http://www.ambysoft.com/surveys/success2007.html


Ambler, S. (2006) Agile Model Driven Development (AMDD).
http://www.agilemodeling.com/essays/amdd.htm
Architecture Discipline (2010). Software Architecture and Related Concerns. Bredemeyer Consulting.
http://www.bredemeyer.com/whatis.htm
Bhagwat, A. (2003) Object-Oriented Analysis & Design: Tools. Cetus Team.
http://web.nchu.edu.tw/~jlu/cetus/oo_ooa_ood_tools.html
Davison, R., Kock, N., Loch, K., and Clarke, R. (2001) Research Ethics in Information Systems: Would a
Code of Practice Help? Communications of AIS, Volume 7, Article 4.
http://www.rogerclarke.com/SOS/cais0107.pdf
Friedman, T (2005) The World is Flat: A Brief History of the Twenty-First Century. Farrar, Straus and
Giroux. NY ISBN-13: 978-0-374-29278-2
Humphrey, W. (2000) Introduction to the Team Software Process. SEI Series in Software Engineering.
Addison Wesley. ISBN 020147719X.
Johnson, D. (2009). Computer Ethics. 4th Edition. Prentice Hall. ISBN-10: 0131112414


Lewallen, R. (2005) The General Model. Codebetter.com.
http://codebetter.com/blogs/raymond.lewallen/archive/2005/07/13/129114.aspx
Novac, A. (2007) Rapid Prototyping Page. Updated June 2007.
http://www.cc.utah.edu/~asn8200/rapid.html
Parekh, N. (2005) The Waterfall Model Explained. Buzzle.com. http://www.buzzle.com/editorials/1-5-2005-63768.asp
Postnote (2006). Pervasive Computing. Parliamentary Office of Science and Technology. May 2006, No
263. http://www.parliament.uk/documents/post/postpn263.pdf
Pressman, R. (2009). Emerging Trends in Software Engineering. Boca Raton, Florida.
http://www.jasst.jp/archives/jasst09e/pdf/A1.pdf
Qureshi, M & Hussain, S (2008). A reusable software component-based development process model.
Advances in Engineering Software. Vol 39, Issue 2 pg 88-94 ISSN 0965-9978.
Randell, B. (1996) "History of Software Engineering". Schloss Dagstuhl. The 1968/69 NATO Software
Engineering Reports. http://homepages.cs.ncl.ac.uk/brian.randell/NATO/NATOReports/
Redmiles, D, Cheng, L, Damian, Herbsleb, J., Kellogg, W. (2006). Panel: Collaborative Software
Engineering New and Emerging Trends. http://www.ics.uci.edu/~redmiles/publications/C068RCD+06.pdf


Sünbül, A., Weber, H., Padberg, J. (2001) Evolutionary Development of Business Process Centered
Architectures Using Component Technologies. Journal of Integrated Design & Process Science, Vol 5,
Issue 3 (August 2001).
Additional Reading
Rubinstein, D. (2007). Standish Group Report: There's Less Development Chaos Today. SDTimes
(March 1, 2007). http://www.sdtimes.com/content/article.aspx?ArticleID=30247
Standish Group (2009). New CHAOS numbers show startling results. April 23, 2009.
http://www1.standishgroup.com/newsroom/chaos_2009.php
The following books provide more information on the topic of software engineering:
Pressman, R. (2005), Software Engineering: A Practitioner's Approach (6th Edition), McGraw Hill. A good
textbook, but a little on the technical side.
Brooks, F. (1995) The Mythical Man-Month: Essays on Software Engineering, Addison Wesley. Fred
Brooks was the original Project Manager for the IBM System/360 hardware as well as for its OS/360
operating system, and although some of the book is dated, his observations are still widely accepted as
the foundation of software engineering. ISBN-10: 0201835959

Reading Assignments
The assigned readings for this seminar are Chapter 1 and Chapter 2
The assignments for Week 1 are in the Assignment file.

