
THE UNIVERSITY OF POONCH RAWALAKOT

Assignment # 2

Course title: HRM


Submitted to: Ma’am Aiman
Submitted by: Huma Hameed
Roll no: (CS)112
Semester: BS(CS) 5th
Session: 2017-2021
Submission date: 10 April 2020
Dept: CS&IT
A: HOW STREAMING DESKTOP VIDEO WORKS

The video stream is compressed using a video codec such as H.264 or
VP8.

In the early days of streaming media -- the mid-to-late 1990s -- watching


videos and listening to music online wasn't always fun. It was a little like
driving in stop-and-go traffic during a heavy rain. If you had a
slow computer or a dial-up Internet connection, you could spend more
time staring at the word "buffering" on a status bar than watching videos
or listening to songs. On top of that, everything was choppy, pixelated
and hard to see.

Streaming video and audio have come a long way since then. According
to Bridge Ratings, 57 million people listen to Internet radio every week.
In 2006, people watched more than a million streaming videos a day on
YouTube [source: Reuters]. The same year, the television network ABC
started streaming its most popular TV shows over the Web, so people who
missed an episode could watch it again online, legally and for free.

The success of streaming media is pretty recent, but the idea behind it
has been around as long as people have. When someone talks to you,
information travels toward you in the form of a sound wave. Your ears
and brain decode this information, allowing you to understand it. This is
also what happens when you watch TV or listen to the radio.
Information travels to an electronic device in the form of a cable signal,
a satellite signal or radio waves. The device decodes and displays the
signal.

In streaming video and audio, the traveling information is a stream of
data from a server. The decoder is a stand-alone player or a plug-in that
works as part of a Web browser. The server, information stream and
decoder work together to let people watch live or prerecorded
broadcasts.

In this article, we'll explore what it takes to create this stream of ones
and zeros as well as how it differs from the data in a typical download.
We'll also take a look at how to make good streaming media files.
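The stream-versus-download distinction above can be sketched in a few lines of Python. This is a toy illustration, not how any real player is implemented: a download waits for the whole payload before playback starts, while a stream hands the player fixed-size chunks as they arrive.

```python
import io

def download_all(source: io.BytesIO) -> bytes:
    # A traditional download: wait for the entire file, then play it.
    return source.read()

def stream_chunks(source: io.BytesIO, chunk_size: int = 4):
    # Streaming: hand each chunk to the player as soon as it arrives,
    # so playback can begin before the transfer finishes.
    while True:
        chunk = source.read(chunk_size)
        if not chunk:
            break
        yield chunk

media = io.BytesIO(b"0101010101")
print(len(download_all(media)))   # the whole payload at once

media.seek(0)
for chunk in stream_chunks(media):
    print(chunk)                  # chunks arrive one at a time
```

The same idea scales up: a real player buffers a few chunks ahead, which is why a slow connection once meant staring at the word "buffering".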


B: Used in HRM

Streaming desktop video is used to facilitate distance learning and
training, or to provide corporate information to employees quickly and
inexpensively.

Academic Video Online (from Alexander Street) 


Academic Video Online delivers more than 67,000 titles spanning the
widest range of subject areas including anthropology, business,
counseling, film, health, history, music, and more. Academic Video
Online includes documentaries, interviews, performances, news
programs and newsreels, field recordings, commercials, and raw
footage. Patrons will find thousands of award-winning films, including
Academy®, Emmy®, and Peabody® winners, along with the most
frequently used films for classroom instruction, plus newly released
films and previously unavailable archival material.
Digital Campus by Swank Motion Pictures 
Please use the latest version of Firefox or Google Chrome for Digital
Campus access. Digital Campus, provided by Swank Motion Pictures,
Inc.® was created for professors and administrators to enhance
curriculum by providing students with access to course-related films.
Through Digital Campus, students can conveniently view assigned
films, freeing up valuable class time and eliminating the time constraints
of sharing copies.
Kanopy Streaming Video 
Kanopy provides streaming video access to award-winning
documentaries, training films and theatrical releases on every topic
imaginable. The Library is unable to grant additional licenses until
further notice. Please go to the Kanopy information page to view
currently licensed titles at: https://libguides.wilmu.edu/videos/kanopy
Films on Demand 
Films on Demand is a digital video database for streaming
educational video content. This collection contains videos in the
criminal justice and law category. Faculty may contact the Library
to request specific titles.

A: HOW NETWORK MONITORING WORKS

Network monitoring is the use of a system that constantly monitors
a computer network for slow or failing components and that notifies
the network administrator (via email, SMS or other alarms) in case of
outages or other trouble. It watches for internal problems such as
overloaded or crashed servers, rather than for network threats from
the outside.
For example, to determine the status of a web server, monitoring
software may periodically send an HTTP request to fetch a page.
For email servers, a test message might be sent through SMTP and
retrieved by IMAP or POP3.
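The web-server check described above can be sketched in Python. This is an illustrative toy, not code from any monitoring product; the helper name `check_http` is ours, and the example spins up a throwaway local HTTP server so it is self-contained.

```python
import http.server
import threading
import urllib.request

def check_http(url: str, timeout: float = 5.0) -> int:
    # Periodically issuing a request like this and recording the status
    # code is the basic web-server check described above.
    with urllib.request.urlopen(url, timeout=timeout) as response:
        return response.status

# Self-contained demo: serve pages from a throwaway local server.
server = http.server.HTTPServer(
    ("127.0.0.1", 0), http.server.SimpleHTTPRequestHandler
)
threading.Thread(target=server.serve_forever, daemon=True).start()

port = server.server_address[1]
status = check_http(f"http://127.0.0.1:{port}/")
print(status)  # 200 means the page fetch succeeded
server.shutdown()
```

A real monitor would run this check on a schedule and raise an alarm when the fetch fails or returns an error code.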
Commonly measured metrics are response time, availability and uptime,
although both consistency and reliability metrics are starting to gain
popularity. The widespread addition of WAN optimization devices is
having an adverse effect on most network monitoring tools, especially
when it comes to measuring accurate end-to-end delay because they
limit round-trip delay time visibility.
Status request failures, such as when a connection cannot be established,
the request times out, or the document or message cannot be retrieved,
usually produce an action from the monitoring system. These actions vary:
an alarm may be sent (via SMS, email, etc.) to the resident sysadmin, or
automatic failover systems may be activated to remove the troubled
server from duty until it can be repaired.
Monitoring the performance of a network uplink is also known
as network traffic measurement.

In order to know how network monitoring works, it is important to know
the significance of a network to an organization. Networks are the
lifeblood of any modern corporation, and slowdowns and breaches are
costly. Monitoring is the practice of watching the internal network as a
whole, including devices, traffic and servers. This helps identify and
address potential problems as they occur, preventing network issues. For
nearly all businesses, this monitoring occurs with the help of software
systems.

 Size and scale: Some network monitoring systems are simple,
pinging hosts to check for availability. Some are even achieved
using a patchwork of various software and hardware in tandem.
More advanced systems, on the other hand, monitor all areas of
even the most complex networks with a single comprehensive
system.
 Ease of use: Interfaces vary wildly depending on the type and
sophistication of the network monitoring system. While some offer
only simple alerts and command-based interfaces, others may
provide a graphical user interface to improve functionality. Many
modern network monitoring tools have web-based and mobile-
based interfaces.
 Automation: Basic monitoring systems rely on an administrator to
see results and act on them, but many companies are turning to
automated systems that handle events themselves. These systems
are designed to trigger events when network data falls outside set
parameters, functionally eliminating the middle man and
improving response time for network errors.

One important point to network monitoring systems is that they are not
necessarily security systems. While network monitoring can serve as a
helpful tool to protect against network gaps and slowdowns that could
lead to a breach, network monitoring systems are not intrusion detection
systems or intrusion prevention systems.
How this technology works

Network monitoring uses a variety of techniques to test the availability
and functionality of the network. Some of the more common general
techniques used to collect data for monitoring software are listed below:

 Ping: A ping is one of the most basic techniques that monitoring
software uses to test hosts within a network. The monitoring
system sends out a signal and records data such as whether the
signal was received, how long it took the host to receive the signal,
whether any signal data was lost and more.
 SNMP: Simple network management protocol (SNMP) monitors
individual devices in a network through monitoring software. In
this system, each monitored device has monitoring software
installed that sends information about the device’s performance to
a central SNMP manager. The manager collects this information in
a database and analyzes it for errors.
 Syslog: Syslog is an automated messaging system that sends
messages when an event affects a network device. Technicians can
set up devices to send out messages when the device encounters an
error, shuts down unexpectedly, encounters a configuration failure
and more.
 Scripts: In networks with gaps in network monitoring software
functionality, scripts may be used to fill small gaps. Scripts are
simple programs that collect basic information and instruct the
network to perform an action within certain conditions.
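The ping and scripting techniques above can be combined in a short sketch. A real ICMP ping needs raw-socket privileges, so a common scripted stand-in is a timed TCP connection attempt; `tcp_ping` is a hypothetical helper of ours, not a standard tool, and the local listener exists only to make the example self-contained.

```python
import socket
import time

def tcp_ping(host: str, port: int, timeout: float = 2.0):
    # Try to open a TCP connection and time it, a rough stand-in for
    # ICMP ping. Returns the round-trip time in seconds, or None if
    # the host did not answer.
    start = time.monotonic()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return time.monotonic() - start
    except OSError:
        return None

# Self-contained demo: "monitor" a listener on this machine.
listener = socket.socket()
listener.bind(("127.0.0.1", 0))
listener.listen(1)
port = listener.getsockname()[1]

rtt = tcp_ping("127.0.0.1", port)
print(rtt is not None)  # True: the host answered
listener.close()
```

A script like this, run on a schedule, is exactly the kind of small gap-filler the last bullet describes.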

Once this data is collected, the network monitoring software sends out
an alert if results don’t fall within certain thresholds. Network managers
will usually set these thresholds of acceptable performance,
programming the network software to send out an alert if its data
indicates slow throughput, high error rates, unavailable devices or slow
response times.
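The threshold logic just described can be sketched as follows. The metric names and limits here are hypothetical examples, not values from any real product.

```python
# Hypothetical thresholds of acceptable performance, as a network
# manager might configure them (metric name -> maximum allowed value).
THRESHOLDS = {
    "response_time_ms": 500,
    "error_rate_pct": 1.0,
    "cpu_usage_pct": 90,
}

def evaluate(sample: dict) -> list:
    # Compare one round of collected metrics against the thresholds
    # and return an alert message for every breach.
    alerts = []
    for metric, limit in THRESHOLDS.items():
        value = sample.get(metric)
        if value is not None and value > limit:
            alerts.append(f"ALERT: {metric}={value} exceeds limit {limit}")
    return alerts

sample = {"response_time_ms": 820, "error_rate_pct": 0.2, "cpu_usage_pct": 97}
for alert in evaluate(sample):
    print(alert)
```

In a full system, each alert string would be dispatched via SMS or email, or fed to an automated failover job.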

Internet server monitoring


Monitoring an internet server means that the server owner always knows
if one or all of its services go down. Server monitoring may be internal,
i.e. the web server software checks its status and notifies the owner if
some services go down, or external, i.e. a web server monitoring
company checks the service status at a certain frequency. Server
monitoring can encompass a check of system metrics, such as CPU
usage, memory usage, network performance and disk space. It can also
include application monitoring, such as checking the processes of
programs such as Apache, MySQL, Nginx, Postgres and others.
External monitoring is more reliable, as it keeps on working when the
server completely goes down. Good server monitoring tools also have
performance benchmarking, alerting capabilities and the ability to link
certain thresholds with automated server jobs, such as provisioning more
memory or performing a backup.
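One of the system metrics mentioned above, disk space, can be checked with Python's standard library alone. This is a minimal sketch; the `disk_alert` helper and its 90% limit are our own illustrative choices.

```python
import shutil

def disk_space_pct_used(path: str = "/") -> float:
    # One of the system metrics mentioned above: how full a disk is.
    usage = shutil.disk_usage(path)
    return usage.used / usage.total * 100

def disk_alert(path: str = "/", limit_pct: float = 90.0):
    # Return an alert string when usage crosses the threshold; a monitor
    # could link this to an automated job such as a cleanup or backup.
    pct = disk_space_pct_used(path)
    if pct > limit_pct:
        return f"ALERT: disk {path} is {pct:.1f}% full"
    return None

print(f"{disk_space_pct_used():.1f}% used")
```

CPU and memory checks follow the same pattern, though they usually need a third-party library or OS-specific calls.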
Servers around the globe
Network monitoring services usually have a number of servers around
the globe - for example in America, Europe, Asia, Australia and other
locations. By having multiple servers in different geographic locations, a
monitoring service can determine if a Web server is available across
different networks worldwide. The more locations used, the more
complete the picture of network availability.
Web server monitoring process
When monitoring a web server for potential problems, an external web
monitoring service checks a number of parameters. First of all, it
monitors for a proper HTTP return code. Per the HTTP specification (RFC
2616), a web server returns standard HTTP status codes. Analysis of the HTTP
codes is the fastest way to determine the current status of the monitored
web server. Third-party application performance monitoring tools
provide additional web server monitoring, alerting and reporting
capabilities.
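The reason a status code is such a fast health signal is that RFC 2616 groups codes by their first digit. A minimal classifier, for illustration only, might look like this:

```python
def classify_status(code: int) -> str:
    # RFC 2616 groups response codes by their first digit, which is why
    # checking the code is the fastest way to gauge a server's state.
    classes = {
        1: "informational",
        2: "success",
        3: "redirection",
        4: "client error",
        5: "server error",
    }
    return classes.get(code // 100, "unknown")

for code in (200, 301, 404, 503):
    print(code, classify_status(code))
```

A monitor that sees a run of 5xx codes knows immediately the problem is on the server side, without parsing any page content.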

B: Internet- and network-monitoring software used in HRM

Used to track employees' Internet and e-mail activities or to monitor
their performance.

Used well, technology makes HR practices more efficient. Before
the internet and email, connecting with job seekers meant a phone call,
face time or a letter. Software programs can even take over much of
face time or a letter. Software programs can even take over much of
the work in evaluating employees.

Data warehouse and computerized analytic program


In computing, a data warehouse (DW or DWH), also known as
an enterprise data warehouse (EDW), is a system used
for reporting and data analysis, and is considered a core component
of business intelligence.[1] DWs are central repositories of integrated data
from one or more disparate sources. They store current and historical
data in one single place[2] that are used for creating analytical reports for
workers throughout the enterprise.[3]
The data stored in the warehouse is uploaded from the operational
systems (such as marketing or sales). The data may pass through
an operational data store and may require data cleansing[2] for additional
operations to ensure data quality before it is used in the DW for
reporting.
Extract, transform, load (ETL) and extract, load, transform (ELT) are
the two main approaches used to build a data warehouse system.
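A minimal ETL sketch, assuming hypothetical operational records and an in-memory SQLite database standing in for the warehouse: extract the raw rows, transform (cleanse) them, then load them into a reporting table.

```python
import sqlite3

# Hypothetical records extracted from an operational system (e.g. sales).
raw_rows = [
    {"employee": "  Alice ", "region": "north", "amount": "1200"},
    {"employee": "Bob", "region": None, "amount": "950"},
]

def transform(row: dict) -> dict:
    # Data cleansing before the row reaches the warehouse:
    # trim names, default missing regions, cast amounts to numbers.
    return {
        "employee": row["employee"].strip(),
        "region": row["region"] or "unknown",
        "amount": float(row["amount"]),
    }

# Load into the warehouse table (an in-memory SQLite stand-in).
warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE sales (employee TEXT, region TEXT, amount REAL)")
warehouse.executemany(
    "INSERT INTO sales VALUES (:employee, :region, :amount)",
    [transform(r) for r in raw_rows],
)

total = warehouse.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)  # 2150.0
```

In ELT the order of the last two steps is swapped: raw rows are loaded first and cleansed inside the warehouse itself.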
History
The concept of data warehousing dates back to the late 1980s when IBM
researchers Barry Devlin and Paul Murphy developed the "business data
warehouse". In essence, the data warehousing concept was intended to
provide an architectural model for the flow of data from operational
systems to decision support environments. The concept attempted to
address the various problems associated with this flow, mainly the high
costs associated with it. In the absence of a data warehousing
architecture, the process of gathering, cleaning and integrating data
from various sources, usually long-standing operational systems
(usually referred to as legacy systems), had to be repeated for each
decision support environment.
Key developments in early years of data warehousing:
 1960s – General Mills and Dartmouth College, in a joint research
project, develop the terms dimensions and facts.
 1970s – ACNielsen and IRI provide dimensional data marts for
retail sales.
 1970s – Bill Inmon begins to define and discuss the term Data
Warehouse.
 1975 – Sperry Univac introduces MAPPER (MAintain, Prepare,
and Produce Executive Reports), a database management and
reporting system that includes the world's first 4GL. It is the first
platform designed for building Information Centers (a forerunner of
contemporary data warehouse technology).
 1983 – Teradata introduces the DBC/1012 database computer
specifically designed for decision support.
 1984 – Metaphor Computer Systems, founded by David Liddle and
Don Massaro, releases a hardware/software package and GUI for
business users to create a database management and analytic system.
 1985 – Sperry Corporation publishes an article (Martyn Jones and
Philip Newman) on information centers, where they introduce the
term MAPPER data warehouse in the context of information centers.
 1988 – Barry Devlin and Paul Murphy publish the article An
architecture for a business and information system, in which they
introduce the term "business data warehouse".
 1990 – Red Brick Systems, founded by Ralph Kimball, introduces
Red Brick Warehouse, a database management system specifically
for data warehousing.
 1991 – Prism Solutions, founded by Bill Inmon, introduces Prism
Warehouse Manager, software for developing a data warehouse.
 1992 – Bill Inmon publishes the book Building the Data
Warehouse.
 1995 – The Data Warehousing Institute, a for-profit organization
that promotes data warehousing, is founded.
Using Data Warehouse Systems in Human Resource Management
Due to significant growth, both organic and through acquisitions, in
recent years, Global Payments faced many challenges connecting various
data warehouse systems and applications throughout the organization.
Data sharing became a major issue. It is sometimes impossible to access
certain systems within the organization due to differences in technology
and security. Depending on each entity or individual to send their data
or reports carries a greater risk of incorrect data interpretation or
errors, and human resource management has been done via legacy data
warehouse systems. As demand for data collection and analytics has
grown, these traditional data warehouse systems have become inadequate
to manage these administrative activities effectively. This inadequacy is
manifested in higher costs and increased merchant attrition. In order to
manage our administration more effectively, reduce costs, and improve
customer retention, Global Payments must move to a web-based
application as outlined in this business case. Doing so will enable the
company to manage its administration from one central and common
platform.
Data warehouse and business intelligence systems are
commonly used in sales or marketing departments. In contrast,
their use in HR departments is relatively rare.
