
VISION BASED GAME DEVELOPMENT

USING HUMAN COMPUTER INTERFACE

By

S.KRISHNA MOORTHY (08BCT18)

B.LAVANYA (08BCT19)

C.PRAVEEN KUMAR (08BCT25)

R.SHANMUGA PRIYA (08BCT34)

A.VENKATESH (08BCT40)

Of

KARPAGAM COLLEGE OF ENGINEERING

COIMBATORE- 32

Submitted To The

FACULTY OF APPLIED SCIENCE

In partial fulfillment of the Requirements

For the Award of the Degree

Of

BACHELOR OF SCIENCE

In

COMPUTER TECHNOLOGY

APRIL 2011
BONAFIDE CERTIFICATE

Certified that this project report entitled “NETWORK GREEN ENERGY SAVING USING NETWORK VIRTUALIZATION” is the bonafide work of S.KRISHNA MOORTHY (08BCT18), B.LAVANYA (08BCT19), C.PRAVEEN KUMAR (08BCT25), R.SHANMUGA PRIYA (08BCT34), and A.VENKATESH (08BCT40), who carried out the research under my supervision.

Project Guide & Head of the Department


Dr. R. PARIMELAZHAGAN, M.Sc., M.Phil., Ph.D.,
Department of Applied Sciences,
Karpagam College of Engineering,
Coimbatore- 32.

Submitted for the project VIVA-VOCE examination held on _____________

Internal Examiner External Examiner


ACKNOWLEDGEMENT

If words are considered symbols of approval and tokens of acknowledgement, then words play the heralding role of expressing our gratitude to all who have helped us, directly or indirectly, during our project work.

It is our bounden duty to thank Dr. R. Vasantha Kumar, B.E. (Hons)., D.Sc., our managing trustee, for his endeavour to provide all the facilities we require and for the interest he showed in the welfare of the students. We also take the privilege of thanking our principal, Dr. G. Chandra Mohan, M.Tech., Ph.D., for his benevolent patronage.

We express our sincere gratitude and heartfelt thanks to our Head of the Department, Dr. R. Parimelazhagan, M.Sc., M.Phil., Ph.D., for providing the necessary facilities to carry out our project work and for his continued encouragement.

It is with a feeling of profound thankfulness and gratitude that we acknowledge the valuable guidance rendered to us by our internal guide, Dr. R. Parimelazhagan, M.Sc., M.Phil., Ph.D., for extending all support to us in the form of technical literature and excellent guidance.

We accord our sincere and heartfelt thanks and deep sense of gratitude to all the staff members of the Department of Applied Sciences for their constant support, valuable suggestions, and guidance.

Finally, we offer our heartfelt thanks to all our parents and friends for their moral support, without which we would not have been able to complete this project.

Last but not least, our work would be incomplete without expressing our gratitude to the various authors whose works we referred to in carrying out this project.
ABSTRACT

This graduation project aims to present an application that is capable of replacing the traditional mouse with the human face as a new way to interact with the computer. Facial
features (nose tip and eyes) are detected and tracked in real-time to use their actions as mouse
events. The coordinates and movement of the nose tip in the live video feed are translated to
become the coordinates and movement of the mouse pointer on the user’s screen. The
left/right eye blinks fire left/right mouse click events. The only external device that the user
needs is a webcam that feeds the program with the video stream. In the past few years, technology has become more advanced and less expensive. With the availability of high-speed processors and inexpensive webcams, more and more people have become interested in real-time applications that involve image processing. One of the promising fields in artificial intelligence is HCI (Human Computer Interface), which aims to use human features (e.g. face,
hands) to interact with the computer. One way to achieve that is to capture the desired feature
with a webcam and monitor its action in order to translate it to some events that communicate
with the computer.

In our work we try to assist people with hand disabilities that prevent them from using the mouse by designing an application that uses facial features (nose tip and eyes) to interact with the computer. The nose tip was selected as the pointing device;
the reason behind that decision is the location and shape of the nose; as it is located in the
middle of the face it is more comfortable to use it as the feature that moves the mouse pointer
and defines its coordinates, not to mention that it is located on the axis that the face rotates
about, so it basically does not change its distinctive convex shape which makes it easier to
track as the face moves. Eyes were used to simulate mouse clicks, so the user can fire click events simply by blinking.

This project is developed using JDK 1.5 & JMF.


TABLE OF CONTENTS

S.NO TITLE

1. INTRODUCTION

1.1 ORGANIZATION PROFILE

1.2 OVERVIEW

1.3 MODULES

1.3.1 Modules Description

1.4 OBJECTIVE

1.4.1 Background study

2. SYSTEM ANALYSIS

2.1 EXISTING SYSTEM

2.2 STUDY ON PROPOSED SYSTEM

2.3 DEFINING THE PROBLEM

3. SYSTEM SPECIFICATION

3.1 HARDWARE SPECIFICATION

3.2 SOFTWARE SPECIFICATION

3.2.1 Software Description

4. SYSTEM DESIGN

4.1 SYSTEM ARCHITECTURE

4.2 FUNDAMENTAL DESIGN CONCEPTS

4.2.1 Dataflow Diagram and Rules

4.2.2 Table Construction

4.2.3 Class Diagram

4.3 UML DIAGRAMS

4.4 TABLE DESIGN


5. SYSTEM IMPLEMENTATION

6. SYSTEM TESTING

6.1 TYPES OF TESTING

7. CONCLUSION AND FUTURE WORK

8. APPENDICES

8.1 APPENDIX I- SCREEN SHOTS

8.2 APPENDIX II- SAMPLE CODING

9. BIBLIOGRAPHY AND REFERENCES


LIST OF TABLES

S. No Table. No Table Name


1. 4.4.1 User registration
2. 4.4.2 Applications
3. 4.4.3 Files
4. 4.4.4 Settings
5. 4.4.5 Resources
CHAPTER 1
INTRODUCTION
1.1 Organization Profile:

OctaPace Solutions was established in 1997 at Coimbatore, providing software and web solutions to companies situated in and around Coimbatore and Tamil Nadu. Our company's developers are well trained and well equipped to fulfil the needs of our customers. Our web solutions target Windows Server and Linux Server, and we work with .NET and Java technologies.

As multiple skills and competencies combine to realize technology-driven

business transformations, software development continues to be the largest software

engineering activity across enterprises. Drivers for custom-built solutions for clients are

based on innovative use of technology to achieve competitive advantage and


differentiation. As organizations drive towards iteration of their business and IT

strategies, outsourcing IT application development allows focus on core businesses with

benefits across the business spectrum.

OctaPace solutions are robust, scalable, and integrate easily with a diverse range of products and technologies. OctaPace Solutions' expertise spans the entire gamut of application and custom development. At OctaPace Solutions, its wide range of technological expertise, application knowledge, and consulting experience enable it to develop and integrate robust and scalable e-business solutions that keep end customers' requirements in mind.

1.2 OVERVIEW OF THE PROJECT

The energy consumed by the burgeoning computing infrastructure worldwide

has recently drawn significant attention. While the focus of energy management has been

on the data-center setting, attention has also been directed recently to the significant

amounts of energy consumed by desktop computers in homes and enterprises. The usual

approach to reducing PC energy wastage is to put computers to sleep when they are idle.

However, the presence of the user makes this particularly challenging in a desktop

computing environment. Users care about preserving long-running network connections

(e.g., login sessions, IM presence, file sharing), background computation (e.g., syncing

and automatic filing of new emails), and keeping their machine reachable even while it is

idle. Putting a desktop PC to sleep is likely to cause disruption (e.g., broken connections),

thereby having a negative impact on the user, who might then choose to disable the
energy savings mechanism altogether. To reduce user disruption while still allowing

machines to sleep, one approach has been to have a proxy on the network for a machine

that is asleep. However, this approach suffers from an inherent tradeoff between

functionality and complexity because of the need for application-specific customization.

1.3 MODULES:

• User Manager

• Storage System

• State Manager

• State Finding

• Virtual Dispatcher

1.3.1 MODULES DESCRIPTION:

Module 1: User Manager

This module is used to manage the user details and their permission details. User

details include their unique userid, password and their working system information.

Module 2: Storage System

This module is the storage server, which stores all the user virtual machines. The storage system on the server hosts the guest VMs that have been migrated to it from (idle) desktop machines. The server also includes a controller, which is the brain of Network Green. The controller receives periodic updates from stubs on the desktops, reporting on the level of user and computing activity on each desktop. The controller also tracks resource usage on the server. Using all of this information, the controller orchestrates the migration of VMs to the server and back to the desktop machines, and manages the allocation of resources on the server. We have chosen a centralized design for the controller because it is simple and efficient, and also enables optimal migration decisions to be made based on full knowledge of the server.

Module 3: State Manager

This is the primary module, which manages the state of the user's running programs, each thread, and their preferences throughout the working session, saving them to the storage system. The entire set of UI and code-related preferences of the user's applications is monitored and managed by this state manager.

Module 4: State Finding

The presence of any UI activity initiated by the user through the mouse or the keyboard (e.g., mouse movement, mouse clicks, key presses) in the recent past (the activity window, set to 10 minutes by default) is taken as an indicator that the machine is active. Even though the load imposed on the machine might be rather minimal, we make this conservative choice to reflect our emphasis on minimizing the impact on the interactive performance perceived by the user. In the default policy, the presence of UI activity is taken as the only indicator of whether the machine is active. So, the absence of recent UI activity is taken as an indication that the machine is idle.

A more conservative policy, however, also considers the actual computational

load on the machine. Specifically, if the CPU usage is above a threshold, the

machine is deemed to be active. So, for the machine to be deemed idle, both the

absence of recent UI activity and CPU usage being below the threshold are

necessary conditions.
To avoid too much bouncing between the active and idle states, Network Green introduces hysteresis into the process by measuring the CPU usage as an average over an interval (e.g., 1 minute) rather than instantaneously.
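The conservative idle-detection policy above can be sketched in a few lines. This is an illustrative Python sketch, not the project's actual code; the window length, CPU threshold, and averaging interval are the defaults described above, and the function and parameter names are assumptions for demonstration:

```python
import time

ACTIVITY_WINDOW = 10 * 60   # seconds: no UI activity in this window may mean idle
CPU_THRESHOLD = 0.10        # assumed threshold: 10% average CPU usage
CPU_AVG_INTERVAL = 60       # average CPU over 1 minute to introduce hysteresis

def is_idle(last_ui_event, cpu_samples, now=None):
    """Return True when the machine is deemed idle under the conservative policy.

    last_ui_event -- timestamp of the most recent mouse/keyboard event
    cpu_samples   -- list of (timestamp, usage) pairs, usage in [0, 1]
    """
    now = time.time() if now is None else now
    # Condition 1: no UI activity within the recent activity window.
    if now - last_ui_event < ACTIVITY_WINDOW:
        return False
    # Condition 2: average CPU usage over the last interval below the threshold.
    recent = [u for (t, u) in cpu_samples if now - t <= CPU_AVG_INTERVAL]
    avg_cpu = sum(recent) / len(recent) if recent else 0.0
    return avg_cpu < CPU_THRESHOLD
```

Both conditions must hold, so a burst of background computation keeps the machine active even when the user has stepped away, and averaging the CPU over a minute damps rapid flips between the two states.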

Module 5: Virtual Dispatcher

This module is used to dispatch the saved virtual machine back onto the user's machine when the user starts the system. Dispatching of the virtual machine state starts only after the user is authenticated. The storage system stores the user's application preferences; the dispatcher then loads those preferences onto the user's computer. Users get their system back in the same state in which they last left it, with user activities and contents restored.

The dispatcher allows users to load their applications on a computer outside the company only if the administrator has granted permission for remote access.

1.4 OBJECTIVE

The main objective of this project, Network Green, is to create a system that saves desktop energy by employing a novel approach to minimizing user disruption and avoiding the complexity of application-specific customization. The basic idea is to virtualize the user's desktop computing environment by encapsulating it in a virtual machine (VM), and then migrating it between the user's physical desktop machine and a VM server, depending on whether the desktop computing environment is actively used or idle.


When the desktop becomes idle, say when the user steps away for several

minutes (e.g., for a coffee break), the desktop VM is migrated to the VM server and the

physical desktop machine is put to sleep. When the desktop becomes active again (e.g.,

when the user returns), the desktop VM is migrated back to the physical desktop machine.

Thus, even when it has been migrated to the VM server, the user's desktop environment remains alive (i.e., it is “always on”), so ongoing network connections and other activity (e.g., background downloads) are not disturbed, regardless of the application involved. The “always on” feature of Network Green allows energy savings whenever the opportunity arises, without having to worry about disrupting the user. Besides long idle periods (e.g., nights and weekends), energy can also be saved by putting the physical desktop computer to sleep even during short idle periods, such as when a user goes to a meeting or steps out for coffee. Indeed, our measurements indicate that the potential energy savings from exploiting short idle periods are significant.
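The idle/active migration cycle described in this objective can be modelled as a tiny state machine. The class and method names below are hypothetical and the actions are stubs; a real implementation would drive a hypervisor's live-migration API and the ACPI sleep interface instead:

```python
class NetworkGreenController:
    """Toy model of the migrate-on-idle cycle (illustrative only)."""

    def __init__(self):
        self.vm_location = "desktop"   # where the user's desktop VM runs
        self.desktop_power = "on"      # physical machine: "on" or "sleep"
        self.log = []

    def on_idle(self):
        # Desktop went idle: push the VM to the server, put the PC to sleep.
        if self.vm_location == "desktop":
            self.log.append("migrate VM desktop -> server")
            self.vm_location = "server"
            self.log.append("desktop -> S3 sleep")
            self.desktop_power = "sleep"

    def on_active(self):
        # User returned: wake the PC and pull the VM back.
        if self.vm_location == "server":
            self.log.append("wake desktop")
            self.desktop_power = "on"
            self.log.append("migrate VM server -> desktop")
            self.vm_location = "desktop"
```

Because the VM keeps running on the server while the physical machine sleeps, network connections and background activity survive the entire cycle.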

1.4.1 Background Study


PC Energy Consumption

Researchers have measured and characterized the energy consumed by

desktop computers. The typical desktop PC consumes 80-110 W when active and 60-80 W

when idle, excluding the monitor, which adds another 35-80 W. The relatively small

difference between active and idle modes is significant and arises because the processor itself

only accounts for a small portion of the total energy. In view of this, multiple S (“sleep”)

states have been defined as part of the ACPI standard. In particular, the S3 state (“standby”)

suspends the machine’s state to RAM, thereby cutting energy consumption to 2-3 W. S3 has
the advantage of being much quicker to transition in and out of than S4 (“hibernate”), which

involves suspending the machine’s state to disk.
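A rough back-of-the-envelope estimate of the savings follows from the figures above. The power values are midpoints of the ranges quoted in this section; the 16 idle hours per day is an illustrative assumption (a machine left on around the clock but used 8 hours a day), not a measurement from this project:

```python
IDLE_POWER_W = 70.0   # midpoint of the 60-80 W idle range quoted above
S3_POWER_W = 2.5      # midpoint of the 2-3 W S3 standby range quoted above

def annual_savings_kwh(idle_hours_per_day, days_per_year=365):
    """kWh saved per year by spending idle hours in S3 instead of idle-on."""
    saved_watts = IDLE_POWER_W - S3_POWER_W
    return saved_watts * idle_hours_per_day * days_per_year / 1000.0

# Assumed scenario: 16 idle hours/day spent in S3 rather than idle-on.
print(round(annual_savings_kwh(16), 1))   # prints 394.2
```

Nearly 400 kWh per machine per year under these assumptions, which is why even modest deployments of sleep-on-idle policies are attractive.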

Proxy-based Approach

As discussed above, the only way of cutting down the energy consumed by a PC is to put it to sleep. However, when a PC is put to sleep, it loses its network presence, resulting in

disruption of ongoing connections (e.g., remote-login or file-download sessions) and the

machine even becoming inaccessible over the network. The resulting disruption has been

recognized as a key reason why users are often reluctant to put their machines to sleep.

Researchers have found that roughly 60% of office desktop PCs are left on continuously.

CHAPTER 2

SYSTEM ANALYSIS

2.1 EXISTING SYSTEM


2.2 STUDY ON PROPOSED SYSTEM

2.3 DEFINING THE PROBLEM

The general approach to allowing a PC to sleep while maintaining some


network presence is to have a network proxy operate on its behalf while it is asleep. The
functionality of the proxy could span a wide range:

• WoL Proxy:

The simplest proxy allows the machine to be woken up using the Wake-on-
LAN mechanism supported by most Ethernet NICs. To be able to send the “magic” WoL
packet, the proxy must be on the same subnet as the target machine and needs to know the
MAC address of the machine. Typically, machine wakeup is initiated manually.
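The “magic” WoL packet has a simple, well-documented format: six 0xFF bytes followed by the target's MAC address repeated sixteen times, typically sent as a UDP broadcast (port 9 is conventional). A minimal sketch; the MAC address in any real use is the target machine's, and the function names here are our own:

```python
import socket

def build_magic_packet(mac: str) -> bytes:
    """Build a Wake-on-LAN magic packet: 6 x 0xFF, then the MAC 16 times."""
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    if len(mac_bytes) != 6:
        raise ValueError("MAC address must be 6 bytes")
    return b"\xff" * 6 + mac_bytes * 16

def send_wol(mac: str, broadcast: str = "255.255.255.255", port: int = 9):
    """Broadcast the magic packet on the local subnet over UDP."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(build_magic_packet(mac), (broadcast, port))
```

Because the packet is a link-local broadcast, it does not cross routers; this is exactly why the proxy must sit on the same subnet as the target machine.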

• Protocol Proxy:
A more sophisticated proxy performs automatic wakeup, triggered by a filtered subset of the incoming traffic. The filters could be configured based on user input and also on the list of network ports that the target machine was listening on before it went to sleep.
Other traffic is either responded to by the proxy itself without waking up the target machine
(e.g., ARP for the target machine) or ignored (e.g., ARP for other hosts).

• Application Proxy:

An even more sophisticated proxy incorporates application-specific stubs that


allow it to engage in network communication on behalf of applications running on the
machine that is now asleep. Such a proxy could even be integrated into an augmented NIC.
Enhanced functionality of a proxy comes at the cost of greater complexity, for instance, the
need to create stubs for each application that the user wishes to keep alive.
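The three-way decision a protocol proxy makes for each incoming packet (wake the machine, answer locally, or ignore) can be sketched as a simple classifier. The port numbers and parameter names below are illustrative assumptions, not part of any particular proxy implementation:

```python
def classify_packet(dst_port, is_arp_for_target, listening_ports):
    """Decide how the proxy handles a packet addressed to the sleeping machine.

    Returns one of: "wake", "respond", "ignore".
    """
    if is_arp_for_target:
        return "respond"   # proxy answers ARP itself; no wakeup needed
    if dst_port in listening_ports:
        return "wake"      # traffic for a service the machine was running
    return "ignore"        # everything else is dropped

# Example: ports the machine was listening on before sleep (e.g., SSH, SMB).
listening = {22, 445}
```

The application proxy extends this table with per-application stubs that answer on the machine's behalf, which is precisely where the functionality/complexity trade-off arises.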

Saving Energy through Consolidation:

Consolidation to save energy has been employed in other computing settings: data centers and thin clients. In the data-center setting, server consolidation is used to approximate energy proportionality by migrating computation, typically using virtualization, from several lightly-loaded servers onto fewer servers, and then turning off the servers that are freed up. Doing so saves not only the energy consumed directly by the servers but also the significant amount of energy consumed indirectly for cooling. Thin-client-based computing, an idea that is making a reappearance despite failures in the past, represents an extreme form of consolidation, with all of the computing resources being centralized.

While the cost, management, and energy savings might make the model
attractive in some environments, there remain questions regarding the up-front hardware
investment needed to migrate to thin clients. Also, thin clients represent a trade-off and may
not be suitable in settings where power users want the flexibility of a PC or insulation from
even transient dips in performance due to consolidation. So addressing the problem of energy
consumed by desktop PCs remains important.

2.4 DEVELOPING SOLUTION STRATEGIES

Network Green’s use of consolidation is inspired by the above work; a key


difference arises from the presence of users in a desktop computing environment. Unlike in a
data center setting, where machines tend to run server workloads and hence are substitutable
to a large extent, a desktop machine is a user’s personal computer. Users expect to have
access to their computing environment. Furthermore, unlike in a thin client setting, users
expect to have good interactive performance and the flexibility of attaching specialized
hardware and peripherals (e.g., a high-end graphics card). Progress on virtualizing high end
hardware, such as GPUs, facilitates Network-Green’s approach of running the desktop in a
VM. Central to the design of Network Green is preserving this PC model and minimizing
both user disruption and new hardware cost, by only consolidating idle desktops.

CHAPTER 3

SYSTEM SPECIFICATION

3.1 HARDWARE SPECIFICATION

• Processor : Intel Pentium 4, 2.5 GHz

• RAM : 1GB

• Hard Disk Drive : 80 GB

• Keyboard : 102 Keys

• Mouse : Optical Mouse

• Monitor : 15” color


3.2 SOFTWARE SPECIFICATION

• Operating System : Windows XP SP3

• IDE Used : Visual Studio 2010 (.NET Framework 4.0)

• Web Technologies : ASP.NET

• Language Used : Visual C#

3.2.1 SOFTWARE DESCRIPTION

• .NET FRAMEWORK 4.0

The Microsoft .NET Framework is a software framework that can be


installed on computers running Microsoft Windows operating systems. It includes a
large library of coded solutions to common programming problems and a virtual
machine that manages the execution of programs written specifically for the
framework. The .NET Framework is a Microsoft offering and is intended to be used
by most new applications created for the Windows platform. The framework's Base
Class Library provides a large range of features including user interface, data access,
database connectivity, cryptography, web application development, numeric
algorithms, and network communications. The class library is used by programmers,
who combine it with their own code to produce applications. Programs written for
the .NET Framework execute in a software environment that manages the program's
runtime requirements. Also part of the .NET Framework, this runtime environment is
known as the Common Language Runtime (CLR). The CLR provides the appearance
of an application virtual machine so that programmers need not consider the
capabilities of the specific CPU that will execute the program.

• WINDOWS PRESENTATION FOUNDATION:

This subsystem is a part of .NET Framework 3.0. The Windows


Presentation Foundation (or WPF) is a graphical subsystem for rendering user
interfaces in Windows-based applications. WPF, previously known as "Avalon", was
initially released as part of .NET Framework 3.0. Designed to remove dependencies
on the aging GDI subsystem, WPF is built on DirectX, which provides hardware
acceleration and enables modern UI features like transparency, gradients, and
transforms. WPF provides a consistent programming model for building applications
and provides a clear separation between the user interface and the business logic.

WPF also offers a new markup language, known as XAML, which is


an alternative means for defining UI elements and relationships with other UI
elements. A WPF application can be deployed on the desktop or hosted in a web
browser. It also enables rich control, design, and development of the visual aspects of
Windows programs.

It aims to unify a number of application services: user interface, 2D


and 3D drawing, fixed and adaptive documents, advanced typography, vector
graphics, raster graphics, animation, data binding, audio and video. Microsoft
Silverlight is a web-based subset of WPF that enables Flash-like web and mobile
applications with the same programming model as .NET applications.

• GRAPHICAL SERVICES

All graphics, including desktop items like windows, are rendered using
Direct3D. This aims to provide a unified avenue for displaying graphics and is the
enabling factor that allows 2D, 3D, media, and animation to be combined in a single
window. Allows more advanced graphical features when compared to Windows
Forms and its GDI underpinnings.

• MEDIA SERVICES

WPF provides an integrated system for building user interfaces with


common media elements like vector and raster images, audio, and video. WPF also
provides an animation system and a 2D/3D rendering system. WPF provides shape
primitives for 2D graphics along with a built-in set of brushes, pens, geometries, and
transforms. The 3D capabilities in WPF are a subset of the full-feature set provided by
Direct3D. However, WPF provides tighter integration with other features like user
interfaces, documents, and media. This makes it possible to have 3D user interfaces,
3D documents, or 3D media. There is support for most common image formats: BMP,
JPEG, PNG, TIFF, Windows Media Photo, GIF, and ICON.

• ANIMATIONS

WPF supports time-based animations, in contrast to the frame-based


approach. This decouples the speed of the animation from how the system is
performing. WPF supports low level animation via timers and higher level
abstractions of animations via the Animation classes. Any WPF element property can
be animated as long as it is registered as a Dependency Property. Animation classes
are based on the .NET type of property to be animated. For instance, changing the
color of an element is done with the Color Animation class and animating the Width
of an element is done with the Double Animation class.

• EFFECTS

WPF 3.0 provides for Bitmap Effects, which are raster effects applied to a Visual. These raster effects are written in unmanaged code and force rendering of the Visual to be performed on the CPU rather than hardware-accelerated by the GPU. Bitmap Effects were deprecated in .NET 3.5 SP1. The .NET Framework 3.5 SP1 adds the Effect class, which is a Pixel Shader 2.0 effect that can be applied to a Visual and allows all rendering to remain on the GPU. The Effect class is extensible, allowing applications to specify their own shader effects. .NET 3.5 SP1 ships with two built-in effects, BlurEffect and DropShadowEffect.

• ASP.NET

ASP.NET is a unified Web development model that includes the


services necessary for you to build enterprise-class Web applications with a minimum
of coding. ASP.NET is part of the .NET Framework, and when coding ASP.NET
applications you have access to classes in the .NET Framework. You can code your
applications in any language compatible with the common language runtime (CLR),
including Microsoft Visual Basic and C#. These languages enable you to develop
ASP.NET applications that benefit from the common language runtime, type safety,
inheritance, and so on.
Visual Web Developer is a full-featured development environment for creating
ASP.NET Web applications. Visual Web Developer offers you the following features:

o Web page design: A powerful Web page editor that includes WYSIWYG editing and an HTML editing mode with IntelliSense and validation.

o Page design features: Consistent site layout with master pages and consistent page appearance with themes and skins.

o Code editing: A code editor that enables you to write code for your dynamic Web pages in Visual Basic or C#. The code editor includes syntax coloration and IntelliSense.

o Testing and debugging: A local Web server for testing and a debugger that helps you find errors in your programs.

o Deployment: Tools to automate typical tasks for deploying a Web application to a hosting server or a hosting provider.
• VISUAL C#:

C# (pronounced "see sharp") is a multi-paradigm programming


language encompassing imperative, functional, generic, object-oriented (class-based),
and component-oriented programming disciplines. It was developed by Microsoft
within the .NET initiative and later approved as a standard by Ecma (ECMA-334) and
ISO (ISO/IEC 23270). C# is one of the programming languages designed for the
Common Language Infrastructure. C# is intended to be a simple, modern, general-
purpose, object-oriented programming language. Its development team is led by
Anders Hejlsberg. The most recent version is C# 3.0, which was released in
conjunction with the .NET Framework 3.5 in 2007. The next proposed version, 4.0, is
in development.

o FEATURES

By design, C# is the programming language that most directly reflects


the underlying Common Language Infrastructure (CLI). Most of its intrinsic types
correspond to value-types implemented by the CLI framework. However, the
language specification does not state the code generation requirements of the
compiler: that is, it does not state that a C# compiler must target a Common Language
Runtime, or generate Common Intermediate Language (CIL), or generate any other
specific format. Theoretically, a C# compiler could generate machine code like
traditional compilers of C++ or FORTRAN.

• Some notable distinguishing features of C# are:


o There are no global variables or functions. All methods and members must be
declared within classes. Static members of public classes can substitute for
global variables and functions.

o Local variables cannot shadow variables of the enclosing block, unlike C and
C++. Variable shadowing is often considered confusing by C++ texts.

o C# supports a strict Boolean datatype, bool. Statements that take conditions,


such as while and if, require an expression of a type that implements the true
operator, such as the boolean type. While C++ also has a boolean type, it can
be freely converted to and from integers, and expressions such as if(a) require
only that a is convertible to bool, allowing a to be an int, or a pointer. C#
disallows this "integer meaning true or false" approach on the grounds that
forcing programmers to use expressions that return exactly bool can prevent
certain types of common programming mistakes in C or C++ such as if (a = b)
(use of assignment = instead of equality ==).

o In C#, memory address pointers can only be used within blocks specifically
marked as unsafe, and programs with unsafe code need appropriate
permissions to run. Most object access is done through safe object references,
which always either point to a "live" object or have the well-defined null
value; it is impossible to obtain a reference to a "dead" object (one which has
been garbage collected), or to a random block of memory. An unsafe pointer
can point to an instance of a value-type, array, string, or a block of memory
allocated on a stack. Code that is not marked as unsafe can still store and
manipulate pointers through the System.IntPtr type, but it cannot dereference.

o Managed memory cannot be explicitly freed; instead, it is automatically


garbage collected. Garbage collection addresses the problem of memory leaks
by freeing the programmer of responsibility for releasing memory which is no
longer needed.

o In addition to the try...catch construct to handle exceptions, C# has a


try...finally construct to guarantee execution of the code in the finally block.

o Multiple inheritance is not supported, although a class can implement any


number of interfaces. This was a design decision by the language's lead
architect to avoid complication and simplify architectural requirements
throughout CLI.

o C# is more typesafe than C++. The only implicit conversions by default are
those which are considered safe, such as widening of integers. This is enforced
at compile-time, during JIT, and, in some cases, at runtime. There are no
implicit conversions between booleans and integers, nor between enumeration
members and integers (except for literal 0, which can be implicitly converted
to any enumerated type). Any user-defined conversion must be explicitly
marked as explicit or implicit, unlike C++ copy constructors and conversion
operators, which are both implicit by default.

o Enumeration members are placed in their own scope.

o C# provides properties as syntactic sugar for a common pattern in which a pair


of methods, accessor (getter) and mutator (setter) encapsulate operations on a
single attribute of a class.

o C# currently (as of 3 June 2008) has 77 reserved words.

o With Visual C#, developers can build solutions for the broadest range of clients, including Windows, the Web, and mobile or embedded devices. Using this elegant programming language and tool, developers can leverage their existing C++ and Java-language skills and knowledge to be successful in the .NET environment.
• DYNAMIC SUPPORT
o Visual C# 2010 provides support for late binding to dynamic types by introducing a new type, dynamic. This addition enables many new scenarios, including simplified access to COM APIs such as the Office Automation APIs, to dynamic APIs such as IronPython libraries, and to the HTML Document Object Model (DOM).
• LIVE SEMANTIC ERRORS
o The Live Semantic Errors feature has been enhanced in Visual C# 2010. The
use of wavy underlines to signal errors and warnings as you type has been
extended to include constructs that are outside of method bodies, such as
return types, parameter types, and default values in method declarations.

CHAPTER -4

SYSTEM DESIGN

4.1 SYSTEM ARCHITECTURE:


4.2 FUNDAMENTAL DESIGN CONCEPTS

System design sits at the technical kernel of software engineering, regardless of the
software process model that is used. It begins once the software requirements have been
analyzed and specified, and it underpins the building and verification of the software. Each
design activity transforms information in a manner that ultimately results in validated
computer software.

There are mainly three characteristics that serve as a guide for the evaluation of a
good design:

• The design must implement all of the explicit requirements contained in the analysis
model, and it must accommodate all of the implicit requirements desired by the
customer.
• The design must be a readable, understandable guide for those who generate code and
for those who test and subsequently support the software.
• The design should provide a complete picture of the software, addressing the data and
its functional and behavioral domains from the implementation perspective.

System design is thus the process of planning a new system to replace or
complement an existing one. The design is based on the limitations of the existing
system and the requirements specification gathered in the system analysis phase.

Input design is the process of converting the user-oriented description of the
computer-based business information into a program-oriented specification. The goal of
designing input data is to make the automation as easy and error-free as possible.

Logical Design of the system is performed: its features are described,
procedures that meet the system requirements are formed, and a detailed specification of the
new system is provided.

Architectural Design of the system includes the identification of software
components, decoupling and decomposing them into processing modules and conceptual data
structures, and specifying the relationships among the components.

Detailed Design is concerned with the methods involved in the packaging of
processing modules, the implementation of processing algorithms and data structures, and
the interconnections among modules and data structures.
External Design of software involves conceiving, planning and specifying the
externally observable characteristics of the software product. The external design begins in
the analysis phase and continues till the design phase.

In the design phase the following designs had to be implemented; each of
these designs was processed separately, keeping in mind all the requirements, constraints and
conditions. A step-by-step process was required to perform the design.

Process Design is the design of the process to be carried out; it is the designing
that leads to the coding. Here the conditions and constraints given in the system are
considered, and the design is done and processed accordingly.

The Output Design is the most important and direct source of information to
the user. Output design is an ongoing activity during the study phase. The objectives of
output design are to define the contents and format of all documents and reports in an
attractive and useful form.
4.2 DESIGN CONCEPTS

4.2.1 DATA FLOW DIAGRAM

The data flow diagram (DFD) is a graphical tool used for expressing
system requirements in a graphical form. The DFD also known as the “bubble chart” has the
purpose of clarifying system requirements and identifying major transformations that will
become programs in system design. Thus DFD can be stated as the starting point of the
design phase that functionally decomposes the requirements specifications down to the
lowest level of detail. The DFD consists of series of bubbles joined by lines. The bubbles
represent data transformations and the lines represent data flows in the system. A DFD
describes what data flows are, rather than how they are processed, so it does not depend on
hardware, software, data structures or file organization.

RULES USED FOR CONSTRUCTING A DFD

Processes should be named and numbered for easy reference, and each name
should be representative of the process. The direction of flow is from top to bottom and from
left to right; that is, data should flow from source to destination. When a process is
exploded into lower-level details, the sub-processes are numbered. The names of data stores,
sources and destinations are written in capital letters, while process and data flow names have
the first letter of each word capitalized. The DFD is particularly designed to aid
communication; if it contains dozens of processes and data stores, it becomes too unwieldy.
The rule of thumb is to explode the DFD down to a functional level. Beyond that, it is best to
take each function separately and expand it to show the explosion as a single process. If a
user wants to know what happens within a given process, the detailed explosion of that
process may be shown.

4.2.2 Table Design

Data Constraints
All businesses in the world run on business data being gathered, stored and
analyzed. Business managers determine a set of rules that must be applied to the data being
stored to ensure its integrity.

Types of Data Constraints

There are two types of data constraints that can be applied to data being
inserted into a database table. One type of constraint is called an I/O constraint; the other
is called a business rule constraint.
I/O Constraints

The input/output data constraint is further divided into two distinctly
different constraints.

The Primary Key Constraint

Here the data constraint attached to a column ensures:

• That the data entered in the table column is unique across the entire column.
• That none of the cells belonging to the table column are left empty.
The Foreign Key Constraint

A Foreign Key constraint establishes a relationship between records across a
master and a detail table. The relationship ensures that:
• Records cannot be inserted in a detail table if corresponding records in the master
table do not exist.
• Records of the master table cannot be deleted if corresponding records in the detail
table exist.
Business Rule Constraints

The database allows the application of business rules to table columns.
Business managers determine the business rules.
The Database allows programmers to define constraints at:

• Column Level
• Table Level

Column Level Constraints

If data constraints are defined along with the column definition when
creating or altering a table structure, they are column-level constraints.

Table Level Constraints

If data constraints are defined after defining all the table columns when
creating or altering a table structure, it is a table level constraint.

Null Value Concepts

A NULL value is different from a blank or zero. NULL values are treated
specially by the database. A NULL value can be inserted into columns of any data type.
Not Null Constraint Defined at the Column Level

When a column is defined as NOT NULL, that column becomes a
mandatory column. It implies that a value must be entered into the column if the record is to
be accepted for storage in the table.

The Primary Key Constraint

Primary Key Concepts


A primary key is one or more column(s) in a table used to uniquely identify
each row in the table. A primary key column in a table has special attributes:
• It defines the column as a mandatory column, i.e. the column cannot be left blank; the
NOT NULL attribute is active.
• The data held across the column MUST BE UNIQUE.

4.2.3 CLASS DIAGRAM:

Object role modeling can be used if you just want to model the classes and
their relationships.

Instance Level Relationships


A Link is the basic relationship among objects. It is represented as a line
connecting two or more object boxes, and it can be shown on an object diagram or class
diagram. A link is an instance of an association; in other words, it creates a relationship
between two classes.
Association

An Association represents a family of links. Binary associations (with two
ends) are normally represented as a line, with each end connected to a class box. Higher-order
associations can be drawn with more than two ends; in such cases, the ends are connected to
a central diamond.
An association can be named, and the ends of an association can be adorned
with role names, ownership indicators, multiplicity, visibility, and other properties. There are
five different types of association. Bi-directional and uni-directional associations are the most
common ones. For instance, a flight class is associated with a plane class bi-directionally.
Associations can only be shown on class diagrams. Association represents the static
relationship shared among the objects of two classes. Example: "department offers courses",
is an association relation.
Aggregation
Aggregation is a variant of the "has a" or association relationship; aggregation
is more specific than association. It is an association that represents a part-whole or part-of
relationship. As a type of association, an aggregation can be named and have the same
adornments that an association can. However, an aggregation may not involve more than two
classes.
Aggregation can occur when a class is a collection or container of other
classes, but where the contained classes do not have a strong life cycle dependency on the
container—essentially, if the container is destroyed, its contents are not.
In UML, it is graphically represented as a hollow diamond shape on the
containing class end of the tree of lines that connect contained class(es) to the containing
class.
Composition
Class diagram showing Composition between two classes at top and
Aggregation between two classes at bottom Composition is a stronger variant of the "owns a"
or association relationship; composition is more specific than aggregation. Composition
usually has a strong life cycle dependency between instances of the container class and
instances of the contained class(es): If the container is destroyed, normally every instance that
it contains is destroyed as well. Note that a part can (where allowed) be removed from a
composite before the composite is deleted, and thus not be deleted as part of the composite.
The UML graphical representation of a composition relationship is a filled
diamond shape on the containing class end of the tree of lines that connect contained class(es)
to the containing class.
Differences between Composition and Aggregation
When attempting to represent real-world whole-part relationships, e.g., an
engine is part of a car, the composition relationship is most appropriate. However, when
representing a software or database relationship, e.g., car model engine ENG01 is part of a
car model CM01, an aggregation relationship is best, as the engine, ENG01 may be also part
of a different car model. Thus the aggregation relationship is often called "catalog"
containment to distinguish it from composition's "physical" containment.
The whole of a composition must have a multiplicity of 0..1 or 1, indicating
that a part must belong to only one whole; the part may have any multiplicity. For example,
consider University and Department classes. A department belongs to only one university, so
University has multiplicity 1 in the relationship. A university can (and will likely) have
multiple departments, so Department has multiplicity 1..*.
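Following the car/engine example above, the difference can be sketched in C# (illustrative class names only):

```csharp
using System.Collections.Generic;

class Engine
{
    public string Model;
}

// Composition: the Car creates and owns its Engine. Nothing else holds a
// reference to it, so when the Car is garbage collected, the Engine goes too.
class Car
{
    private readonly Engine engine = new Engine { Model = "ENG01" };
}

// Aggregation: the catalog merely references engines created elsewhere;
// the parts can outlive the catalog ("catalog" containment).
class PartsCatalog
{
    private readonly List<Engine> engines = new List<Engine>();
    public void Add(Engine e) { engines.Add(e); }
}
```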
Class Level Relationships
Generalization
The Generalization relationship indicates that one of the two related classes
(the subtype) is considered to be a specialized form of the other (the super type) and
supertype is considered as 'Generalization' of subtype. In practice, this means that any
instance of the subtype is also an instance of the supertype. An exemplary tree of
generalizations of this form is found in binomial nomenclature: human beings are a subtype
of simian, which are a subtype of mammal, and so on. The relationship is most easily
understood by the phrase 'an A is a B' (a human is a mammal, a mammal is an animal).
The UML graphical representation of a Generalization is a hollow triangle
shape on the supertype end of the line (or tree of lines) that connects it to one or more
subtypes.
The generalization relationship is also known as the inheritance or "is a" relationship.
The supertype in the generalization relationship is also known as the "parent", superclass,
base class, or base type.
The subtype in the specialization relationship is also known as the "child",
subclass, derived class, derived type, inheriting class, or inheriting type. Note that this
relationship bears no resemblance to the biological parent/child relationship: the use of these
terms is extremely common, but can be misleading.
Generalization-Specialization relationship
Realization
In UML modeling, a realization relationship is a relationship between two
model elements, in which one model element (the client) realizes (implements or executes)
the behavior that the other model element (the supplier) specifies. A realization is indicated
by a dashed line with an unfilled arrowhead towards the supplier.
A realization is a relationship between classes, interfaces, components, and
packages that connects a client element with a supplier element. A realization relationship
between classes and interfaces and between components and interfaces shows that the class
realizes the operations offered by the interface.

Dependency
Dependency is a weaker form of relationship which indicates that one class
depends on another because it uses it at some point of time. Dependency exists if a class is a
parameter variable or local variable of a method of another class.
Multiplicity
The association relationship indicates that (at least) one of the two related
classes makes reference to the other. In contrast with the generalization relationship, this is
most easily understood through the phrase 'A has a B' (a mother cat has kittens, kittens have a
mother cat).
The UML representation of an association is a line with an optional arrowhead
indicating the role of the object(s) in the relationship, and an optional notation at each end
indicating the multiplicity of instances of that entity (the number of objects that participate in
the association).

4.3 UML DIAGRAMS


4.3.1 CLASS DIAGRAMS

USER MANAGEMENT:

APPLICATION SETTING:

4.3.2 SEQUENCE DIAGRAM


The sequence diagrams capture the following flows:

AddSettings — AppState creates a SettDataContext and a Setting entity, calls
Datas.Settings.InsertOnSubmit(Data), and returns "true"; on exception, the error text is
returned instead.

User Login — Login.button1_Click creates a StateNotify.AppState and calls
D.CheckUser(...) with the entered credentials; if the check does not return "error", a new
Form1 is created and initialized.

Form1 Load — Form1_Load creates a StateNotify.AppState and tries D.AddSettings(...) for
the current user session; if the settings already exist (an exception is raised),
D.ReturnSettings(...) retrieves the saved state instead.

Create User — AdminHOme opens the CreateUser form; CreateUser.button1_Click creates a
UsersDataContext and a UserTB entity and calls Datas.UserTBs.InsertOnSubmit(Data).

View Users — ViewUsers creates a UsersDataContext, projects the UserTB table with
Select, and materializes the result with ToList.
4.4 Table Design

4.4.1 Table Name: User Registration

Primary Key: Userid

Field Name Data Type


Userid Varchar(50)
Name Varchar(30)
Pass Varchar(25)
Emailid Varchar(50)
Contactnumber Bigint
Address Varchar(50)
City Varchar(30)
State Varchar(20)
Country Varchar(20)
Securityquestion Varchar(50)
Securityanswer Varchar(20)
Systemid Varchar(30)
Systemname Varchar(30)
Systemos Varchar(30)
Systemrights Varchar(30)
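The appendix code reaches tables such as this one through LINQ to SQL (`UsersDataContext`, `UserRegistration`). A hedged sketch of how the table could map to an entity class; the attributes come from `System.Data.Linq.Mapping`, and the designer-generated code in the project may differ:

```csharp
using System.Data.Linq.Mapping;

[Table(Name = "UserRegistration")]
public class UserRegistration
{
    // Userid is the primary key: mandatory (NOT NULL) and unique.
    [Column(IsPrimaryKey = true)]
    public string Userid { get; set; }

    [Column] public string Name { get; set; }
    [Column] public string Pass { get; set; }
    [Column] public string Emailid { get; set; }
    [Column] public long Contactnumber { get; set; }
}
```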

4.4.2 Table Name: Applications

Primary Key: Applicationid

Foreign Key: Userid

Field Name Data Type


Applicationid Varchar(50)
Appname Varchar(50)
Appversion Varchar(50)
Appsystem Varchar(50)
Userid Varchar(50)
4.4.3 Table Name: Files

Primary Key: Fileid

Foreign Key: Userid

Field Name Datatype


Fileid Varchar(50)
Name Varchar(50)
Path Varchar(50)
Lastwrite Varchar(50)
Lastaccess Varchar(50)
Userid Varchar(50)
Application Varchar(50)

4.4.4 Table Name: Settings

Primary Key: Settingsid

Foreign Key: Userid

Field Name Datatype


Settingsid Varchar(50)
Applicationid Varchar(50)
Userid Varchar(50)
Configuration Xml

4.4.5 Table Name: Resources

Primary Key: Resourceid

Foreign Key: Userid

Field Name Datatype


Resourceid Varchar(50)
Name Varchar(50)
Userid Varchar(50)
Applicationid Varchar(50)
Resources Xml

CHAPTER – 5
SYSTEM IMPLEMENTATION

After the successful study of requirement analysis the next step involved is the Design
and Development phase that practically helps to build the project.

The methods that are applied during the development phase


 Software Design
 Code Generation
 Software Testing

The project was developed using the Linear Sequential Model, also called the Classic Life
Cycle or the Waterfall Model. This is a sequential approach to software development that
begins at the system level and progresses through analysis, design, coding and testing.

 System / Information Engineering and Modeling

Because software is always part of a larger system, work begins by establishing
requirements for all system elements and then allocating some subset of these requirements to
software. A system view is essential when software must interact with other elements such as
hardware, people and databases.

 Software requirements analysis

Requirements gathering is intensified and focused specifically on software. To understand
the nature of the program to be built, the software engineer must understand the information
domain for the software, as well as the required function, behavior, performance and interface.

 Design of the project

The design process translates requirements into a representation of the software that can
be assessed for quality before coding begins. Like the requirements, the design is documented
and becomes part of the software configuration.
 Code Generation

The design must be translated into a machine-readable form. The code generation step
performs this task. If design is performed in a detailed manner, code generation can be
accomplished mechanistically.
After completing the design phase, code was generated in the Visual C# environment
and SQL Server was used to create the database. The server and the application were
connected through ADO.NET.
The purpose of a code is to facilitate the identification and retrieval of items of
information. Codes are built with mutually exclusive features. They are used to convey
operational distinctions and other information, and they also show interrelationships among
different items. Codes are used for identifying, accessing, sorting and matching records. The
coding scheme ensures that only one code value with a single meaning is applied to a given
entity or attribute, however it is described. Codes are also designed in a manner easily
understood and applied by the user.

The coding standards used in the project are as follows:


1. All variable names are chosen so that they represent the flow/function they
serve.
2. All functions are named so that they represent the operation they perform.
CHAPTER 6

SYSTEM TESTING AND MAINTENANCE


SYSTEM TESTING

Software testing is a critical element of software quality assurance and represents the
ultimate review of specification, design and code generation. Once the source code has been
generated, software must be tested to uncover as many errors as possible before delivery to
the customer. In order to find the highest possible number of errors, tests must be conducted
systematically and test cases must be designed using disciplined techniques.

6.1 TYPES OF TESTING


 White box Testing

White box testing, sometimes called glass box testing, is a test-case design method
that uses the control structure of the procedural design to derive test cases.
Using white box testing methods, the software engineer can derive test cases that
guarantee that all independent paths within a module have been exercised at least once,
exercise all logical decisions on their true and false sides, execute all loops at their boundaries
and within their operational bounds, and exercise internal data structures to ensure their validity.
“Logic errors and incorrect assumptions are inversely proportional to the probability that a
program path will be executed“.
The logical flow of a program is sometimes counterintuitive, meaning that
unconscious assumptions about the flow of control and data may lead to design errors that
are uncovered only once path testing commences.
“Typographical errors are random“
When a program is translated into programming-language source code, it is likely that
some typing errors will occur. Many will be uncovered by syntax and type-checking
mechanisms, but others may go undetected until testing begins. A typo is as likely to
exist on an obscure logical path as on a mainstream path.
 Black box Testing

Black box testing, also called as behavioral testing, focuses on the functional
requirements of the software. That is, black box testing enables the software engineer to
derive sets of input conditions that will fully exercise all functional requirements for a
program. Black box testing attempts to find errors in the following categories:
1. Incorrect or missing functions
2. Interface errors
3. Errors in data structures or external data base access
4. Behavior or performance errors
5. Initialization and termination errors
By applying black box techniques, a set of test cases was created that satisfies the
following criteria: test cases that reduce, by a count greater than one, the number of
additional test cases that must be designed to achieve reasonable testing, and test cases that
tell something about the presence or absence of classes of errors, rather than an error
associated only with the specific test at hand.
Black box testing is not an alternative to white box testing techniques. Rather, it is a
complementary approach that is likely to uncover a different class of errors than white box
methods.
 Validation Testing

Validation testing provides the final assurance that software meets all functional,
behavioral and performance requirements. Validation testing can be defined in many ways,
but a simple definition is that validations succeed when the software functions in a manner
that is expected by the user. The software once validated must be combined with other system
element.
System testing verifies that all elements combine properly and that overall system
function and performance is achieved. After the integration of the modules, the validation test
was carried out over by the system. It was found that all the modules work well together and
meet the overall system function and performance.
 Integration Testing

Integration testing is a systematic technique for constructing the program structure


while at the same time conducting test to uncover errors associated with interfacing. The
objective is to take unit - tested modules and build a program structure that has been dictated
by design. Careful test planning is required to determine the extent and nature of system
testing to be performed and to establish criteria by which the result will be evaluated.
All the modules were integrated after the completion of unit test. While Top - Down
Integration was followed, the modules are integrated by moving downward through the
control hierarchy, beginning with the main module. Since the modules were unit - tested for
no errors, the integration of those modules was found perfect and working fine. As a next step
to integration, other modules were integrated with the former modules.
After the successful integration of the modules, the system was found to be running
with no uncovered errors, and also all the modules were working as per the design of the
system, without any deviation from the features of the proposed system design.

 Acceptance Testing

Acceptance testing involves planning and execution of functional tests, performance


tests and stress tests in order to demonstrate that the implemented system satisfies its
requirements. When custom software is built for one customer, a series of acceptance tests
are conducted to enable the customer to validate all requirements.

In fact, acceptance testing incorporates test cases developed during integration testing
so that cumulative errors that might degrade the system over time are exposed. Additional
test cases are added to achieve the desired level of functional, performance and stress testing
of the entire system.
 Unit testing

Static analysis is used to investigate the structural properties of source code. Dynamic
test cases are used to investigate the behavior of source code by executing the program on the
test data. This testing was carried out during programming stage itself.
After testing each and every field in the modules, the modules of the project were tested
separately. Unit testing focuses verification efforts on the smallest unit of software design:
the field. This is known as field testing.
CHAPTER- 7

CONCLUSION AND FUTURE WORK

CONCLUSION:

Recent work has recognized that desktop computers in enterprise
environments consume a lot of energy in aggregate while still remaining idle much of the
time. The question is how to save energy by letting these machines sleep while avoiding user
disruption. Network Green uses virtualization to resolve this problem by migrating idle
desktops to a server, where they can remain “always on” without incurring the energy cost of
a desktop machine. The seamlessness offered by Network Green allows us to aggressively
exploit short idle periods as well as long ones. Data-driven analysis of more than 65,000
hours of desktop usage traces from 120 users, as well as a small-scale deployment of Network
Green on ten desktops comprising 3,200 user-hours over 28 days, shows that Network Green
can help desktops sleep for 86-88% of the time. This translates to estimated desktop energy
savings of 72-74% for Network Green, compared to 32% savings under existing power
management mechanisms.

FUTURE WORK:

Live migration assumes that the disk is shared between the source and
destination machines, say in the form of network-attached storage (NAS). This avoids the
considerable cost of migrating disk content. However, this is a limitation of our current
system since, in general, client machines have a local disk that applications (e.g.,
sharing of local files) need access to. Recent work has demonstrated the migration of VMs
with local virtual hard disks (VHDs) by using techniques such as pre-copying and mirroring
of disk content to keep the downtime under 3 seconds in a LAN setting. Note that since the
base OS image is likely to be already available at the destination node, the main cost is that of
migrating the user data.
CHAPTER 8

APPENDICES

8.1 APPENDIX I- SCREEN SHOTS

HOME PAGE:

ADMIN HOME:
USER CREATION:

MANAGING USERS:
USER DETAILS:

STATE MANAGER:
USER LOGIN:

BROWSER APPLICATION:
BROWSER APPLICATION STATE MANAGED:

USER STATE VIEWING:


LAST BROWSER STATE UPDATED:
USER LOGIN MEDIA APPLICATION:

MEDIA APPLICATION PAUSED STATE:


USER STATE VIEWING:
MEDIA PLAYER LAST STATE UPDATED:

MEDIA PLAYER RESUMED:


USER LOGIN DOCUMENT EDITOR APPLICATION:

DOCUMENT EDITOR:
DOCUMENT APPLICATION LAST STATE RETAINED:
8.2 APPENDIX II - CODE LISTINGS

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Windows;
using System.Windows.Controls;
using System.Windows.Data;
using System.Windows.Documents;
using System.Windows.Input;
using System.Windows.Media;
using System.Windows.Media.Imaging;
using System.Windows.Navigation;
using System.Windows.Shapes;
using System.Xml.Linq;
using Microsoft.Win32;

namespace NetworkGreenTestApp
{
/// <summary>
/// Interaction logic for MediaP.xaml
/// </summary>
public partial class MediaP : UserControl
{
public MediaP()
{
InitializeComponent();
}
private void button1_Click(object sender, RoutedEventArgs e)


{
StateNotify.AppState D = new StateNotify.AppState();
try
{
XElement SettingsData;
SettingsData = new XElement("Controls",
new XElement("Ctrl",
new XAttribute("name", "mediaply"),
new XAttribute("pos", "0"))
);
D.AddSettings(UserSession.inSession + "s2", "a2", UserSession.inSession,
SettingsData).ToString();
}
catch (Exception ex)
{

}
TimeSpan f = new TimeSpan();
char[] ch = { ':' };
StateNotify.Setting S = D.ReturnSettings(UserSession.inSession, "a2",
UserSession.inSession + "s2");
var t = S.Configuration;
if (t.Element("Ctrl").Attribute("name").Value == "mediaply")
{
string val = t.Element("Ctrl").Attribute("pos").Value.ToString();
f = new TimeSpan(long.Parse(val));
}
mediaElement1.Position = f;
mediaElement1.Play();
}

private void button2_Click(object sender, RoutedEventArgs e)


{
StateNotify.AppState D = new StateNotify.AppState();
XElement SettingsData;
SettingsData = new XElement("Controls",
new XElement("Ctrl",
new XAttribute("name", "mediaply"),
new XAttribute("pos", mediaElement1.Position.Ticks.ToString()))
);

D.UpdateSettings(UserSession.inSession + "s2", "a2", UserSession.inSession,


SettingsData).ToString();
MessageBox.Show(mediaElement1.Position.ToString());
}

private void button3_Click(object sender, RoutedEventArgs e)


{
mediaElement1.Stop();
}

private void button4_Click(object sender, RoutedEventArgs e)


{
OpenFileDialog dlg = new OpenFileDialog();

dlg.InitialDirectory = "F:\\";

dlg.Filter = "Media files (*.wmv)|*.wmv|All Files (*.*)|*.*";

dlg.RestoreDirectory = true;

if (dlg.ShowDialog() == true)


{

string selectedFileName = dlg.FileName;

label1.Content = selectedFileName;

mediaElement1.Source = new Uri(selectedFileName);

mediaElement1.Play();

}
}
}
}

using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.Linq;
using System.Text;
using System.Windows.Forms;

namespace NetworkGr
{
public partial class CreateUser : Form
{
public CreateUser()
{
InitializeComponent();
}

private void button2_Click(object sender, EventArgs e)


{
this.Close();
}

private void button1_Click(object sender, EventArgs e)


{
UsersDataContext Datas = new UsersDataContext();
UserRegistration Data = new UserRegistration();
Data.Email = txtemail.Text;
Data.Name = txtname.Text;
Data.Pass = txtpass.Text;
Data.Systemid = txtsystem.Text;
Data.UserId = txtuserid.Text;
Datas.UserRegistrations.InsertOnSubmit(Data);
Datas.SubmitChanges();
MessageBox.Show("User Created");
}
}
}

using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.Linq;
using System.Text;
using System.Windows.Forms;
using System.Xml.Linq;

namespace NetworkGreenTestApp
{
public partial class Form1 : Form
{
public Form1()
{
InitializeComponent();
}

        private void Form1_Load(object sender, EventArgs e)
        {
            StateNotify.AppState D = new StateNotify.AppState();
            try
            {
                // First run: register this form's control state with the state service.
                XElement SettingsData = new XElement("Controls",
                    new XElement("Ctrl",
                        new XAttribute("name", "toolStripComboBox1"),
                        new XAttribute("text", toolStripComboBox1.Text)),
                    new XElement("Ctrl",
                        new XAttribute("name", "toolStripLabel1"),
                        new XAttribute("text", toolStripLabel1.Text)),
                    new XElement("Ctrl",
                        new XAttribute("name", "Form1"),
                        new XAttribute("text", this.Text)));
                D.AddSettings(UserSession.inSession + "s1", "a1", UserSession.inSession,
                    SettingsData).ToString();
            }
            catch (Exception ex)
            {
                // Settings already exist: restore the previously saved control state.
                StateNotify.Setting S =
                    D.ReturnSettings(UserSession.inSession, "a1", UserSession.inSession + "s1");
                var t = S.Configuration;
                if (t.Element("Ctrl").Attribute("name").Value == "toolStripComboBox1")
                {
                    toolStripComboBox1.Text = t.Element("Ctrl").Attribute("text").Value;
                }
            }
        }

        private void textBox1_TextChanged(object sender, EventArgs e)
        {
            StateNotify.AppState D = new StateNotify.AppState();
            XElement SettingsData = new XElement("Controls",
                new XElement("Ctrl",
                    new XAttribute("name", "toolStripComboBox1"),
                    new XAttribute("text", toolStripComboBox1.Text)),
                new XElement("Ctrl",
                    new XAttribute("name", "toolStripLabel1"),
                    new XAttribute("text", toolStripLabel1.Text)),
                new XElement("Ctrl",
                    new XAttribute("name", "Form1"),
                    new XAttribute("text", this.Text)));

            D.UpdateSettings(UserSession.inSession + "s1", "a1", UserSession.inSession,
                SettingsData).ToString();
        }

        private void toolStripButton1_Click(object sender, EventArgs e)
        {
            webBrowser1.Navigate(toolStripComboBox1.Text);
        }

        private void toolStripComboBox1_Click(object sender, EventArgs e)
        {
        }
}
}
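The Form1 handlers above serialize the browser's toolbar state into a small XML document before passing it to the StateNotify service. At run time the settings payload has roughly the shape shown below; the attribute values here are illustrative placeholders, since the real text comes from the live controls:

```xml
<Controls>
  <Ctrl name="toolStripComboBox1" text="http://www.example.com" />
  <Ctrl name="toolStripLabel1" text="Address" />
  <Ctrl name="Form1" text="NetworkGreenTestApp" />
</Controls>
```

On restore, ReturnSettings hands back the same document, and the load handler reads the first Ctrl element's text attribute to repopulate the address box.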

using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.Linq;
using System.Text;
using System.Windows.Forms;
using System.Xml.Linq;
namespace NetworkGreenTestApp
{
public partial class DocumentEdi : Form
{
        public DocumentEdi()
        {
            InitializeComponent();
        }

        private void exitToolStripMenuItem_Click(object sender, EventArgs e)
        {
            this.Close();
        }

        private void cutToolStripMenuItem_Click(object sender, EventArgs e)
        {
            richTextBox1.Cut();
        }
        private void copyToolStripMenuItem_Click(object sender, EventArgs e)
        {
            richTextBox1.Copy();
            StateNotify.AppState D = new StateNotify.AppState();
            try
            {
                XElement SettingsData = new XElement("Controls",
                    new XElement("Ctrl",
                        new XAttribute("name", "doc"),
                        new XAttribute("pos", richTextBox1.Text)),
                    new XElement("Ctrl",
                        new XAttribute("name", "copy"),
                        new XAttribute("pos", Clipboard.GetText())));
                D.UpdateSettings(UserSession.inSession + "s3", "a3", UserSession.inSession,
                    SettingsData).ToString();
            }
            catch (Exception ex)
            {
            }
        }

        private void pasteToolStripMenuItem_Click(object sender, EventArgs e)
        {
            StateNotify.AppState D = new StateNotify.AppState();
            StateNotify.Setting S = D.ReturnSettings(UserSession.inSession, "a3",
                UserSession.inSession + "s3");
            var t = S.Configuration;
            // The "copy" entry is the second <Ctrl> element in the saved document,
            // so search by name instead of reading only the first element.
            var copy = t.Elements("Ctrl")
                .FirstOrDefault(c => (string)c.Attribute("name") == "copy");
            if (copy != null)
            {
                string val = (string)copy.Attribute("pos");
                // Clipboard.SetText(val);
                richTextBox1.AppendText(val);
            }
            richTextBox1.Paste();
        }

        private void deleteToolStripMenuItem_Click(object sender, EventArgs e)
        {
            // Remove the current selection rather than merely deselecting it.
            richTextBox1.SelectedText = string.Empty;
        }

        private void undoToolStripMenuItem_Click(object sender, EventArgs e)
        {
            richTextBox1.Undo();
        }

        private void redoToolStripMenuItem_Click(object sender, EventArgs e)
        {
            richTextBox1.Redo();
        }

        private void newToolStripMenuItem_Click(object sender, EventArgs e)
        {
        }

        private void saveToolStripMenuItem_Click(object sender, EventArgs e)
        {
        }
        private void DocumentEdi_Load(object sender, EventArgs e)
        {
            StateNotify.AppState D = new StateNotify.AppState();
            try
            {
                // First run: register the document and clipboard state.
                XElement SettingsData = new XElement("Controls",
                    new XElement("Ctrl",
                        new XAttribute("name", "doc"),
                        new XAttribute("pos", richTextBox1.Text)),
                    new XElement("Ctrl",
                        new XAttribute("name", "copy"),
                        new XAttribute("pos", "")));
                D.AddSettings(UserSession.inSession + "s3", "a3", UserSession.inSession,
                    SettingsData).ToString();
            }
            catch (Exception ex)
            {
                // Settings already exist: restore the saved document text.
                StateNotify.Setting S = D.ReturnSettings(UserSession.inSession, "a3",
                    UserSession.inSession + "s3");
                var t = S.Configuration;
                if (t.Element("Ctrl").Attribute("name").Value == "doc")
                {
                    richTextBox1.Text = t.Element("Ctrl").Attribute("pos").Value;
                }
            }
        }
        private void richTextBox1_TextChanged(object sender, EventArgs e)
        {
            StateNotify.AppState D = new StateNotify.AppState();
            try
            {
                XElement SettingsData = new XElement("Controls",
                    new XElement("Ctrl",
                        new XAttribute("name", "doc"),
                        new XAttribute("pos", richTextBox1.Text)),
                    new XElement("Ctrl",
                        new XAttribute("name", "copy"),
                        new XAttribute("pos", Clipboard.GetText())));
                D.UpdateSettings(UserSession.inSession + "s3", "a3", UserSession.inSession,
                    SettingsData).ToString();
            }
            catch (Exception ex)
            {
            }
        }
    }
}
BIBLIOGRAPHY

Books Referred:

1. Alex Homer, “Professional C#.NET 1.1”, 2004 Edition, Wrox Publications
2. Steven Holzner, “C#.NET Black Book”, 2003 Edition, Dreamtech Publications
3. Roger S. Pressman, “Software Engineering”, 2000 Edition, Dreamtech Publications
4. Karli Watson, Richard Anderson, “Professional ASP.NET 1.1”, 2004 Edition, Wrox Publications

Websites:

1. www.msdn.microsoft.com
2. www.vbcity.com
3. www.vbdotnetheaven.com
4. www.codeguru.com
