
Project Report

On

REAL TIME VISUALIZATION OF DATA


THROUGH WEB BASED SOLUTIONS
Submitted to

INTEGRATED TEST RANGE


DEFENCE RESEARCH AND DEVELOPMENT
ORGANISATION
Chandipur
Balasore, Odisha.

Submitted by

Ambika Prasad Swain


Department of
Computer Science
VELLORE INSTITUTE OF TECHNOLOGY
Amaravati, Andhra Pradesh.
Under the Guidance of

Mr. SOURAV KAITY


Scientist “E”
Computer Data Processing
ITR, DRDO
Certificate

This is to certify that the project entitled “Real Time Visualization of Data through Web-Based Solutions” is a record of bonafide work carried out under the supervision and guidance of Mr. Sourav Kaity in partial fulfillment of the requirements of the Summer Training Program from 6th May 2019 to 6th June 2019 in the Computer Data Processing group, ITR (DRDO), Chandipur.

This report has fulfilled all the requirements of the institute and, in my opinion, has reached the standard required for submission.

Mr. Rakesh Barua                              Mr. Sourav Kaity
Scientist-‘F’                                 Scientist-‘E’
Group Director (CDP)                          Group Head (RTDP)

DATE:
ACKNOWLEDGEMENT

I express my deep sense of gratitude to Dr. B. K. Das, Outstanding Scientist and Director, ITR, for granting me this great opportunity to work on a project at ITR, an esteemed organization.

I express my deep regards and sincere thanks to my guide, Mr. Sourav Kaity, Scientist-‘E’, GH (RTDP), for his excellent guidance, motivation and gracious inspiration throughout the project work. Despite his busy schedule, he offered his knowledge and experience to me every day, which helped give the project a successful shape.

I thank the entire office of CDP, Mr. Rakesh Barua, Scientist-‘F’, Group Director (CDP), Mr. H. K. Ratha, Scientist-‘G’, Addl. Director (H), and Mr. B. Pattnaik, Scientist-‘F’, ADG (CDP), for their support, help and continuous co-operation in times of need, and also the rest of the office members of ITR. It is my great pleasure to take this opportunity to thank all those who helped me directly and indirectly in the preparation and completion of this project.

ITR, Chandipur
Balasore

AMBIKA PRASAD SWAIN


Reg no:17BCN7025
ABOUT ORGANISATION

The Defence Research and Development Organisation (DRDO) is the premier R&D organisation of the country. It works in various areas of military technology, weapons, missiles and other advanced military equipment to fulfil the needs of the armed forces. It has various laboratories situated across the country, with ITR being one of the missile test ranges.

The Integrated Test Range (ITR) is one of the laboratories of DRDO. It tests various missile systems and evaluates their performance through radar, telemetry and other instruments, so that a report can be prepared on how well a particular missile operates and what its shortcomings are. The DRDO constructed an interim facility adjacent to the Proof and Experimental Establishment (PXE) at Chandipur. In 1986, the Union Government announced plans to construct a National Test Range at Baliapal in Balasore district, the same district as Chandipur. There are various departments under ITR, such as Launch Complexes, Radar Systems, Computer Data Processing, Telecommand, Telemetry and Communication Systems.

Computer Data Processing (CDP) engages in the visualization of real-time data coming from a missile test. The data is collected, processed and analyzed so that useful information such as performance and shortcomings can be obtained. Whenever there is a mission, the data uploaded by the various sensors is collected and then analyzed as per requirement.
ABSTRACT

The main objective of this internship project was to create a web server that can process the given data, update it every 100 milliseconds and display it on the client side. The data must be retrieved from a database and projected graphically, and each section of the data must be updated on the client in real time within every 100 milliseconds. To achieve this, a server-client architecture with MongoDB as the database was used, and the server side was written in Node.js using an Express app. To keep the update time within the target, Rickshaw.js was used for charting, reducing the rendering load and allowing faster updating of the data.
CONTENTS

1. INTRODUCTION
1.1 Motivation
1.2 Problem Statement
1.3 Concepts
2. SOFTWARES AND TOOLS USED
3. INITIAL DEVELOPMENT OF SERVER-SENT EVENTS
3.1 Client Side
3.2 Building Real-Time Backends
4. ADVANCING FURTHER WITH GRAPHS AND CHARTS
5. OPTIMIZATION
5.1 Initial Problem
5.2 Reason for Optimization
6. CONNECTING TO MONGODB
6.1 HTTP Requests
6.2 Handling POST Requests and Middleware
6.3 Installing Mongoose
6.4 Saving Data to the Database
7. CONCLUSION
8. FUTURE WORK
9. REFERENCES
1. INTRODUCTION

MOTIVATION:
Generally, this project was taken up for two reasons. The first was that web-based development was something I had yet to explore, so it was a challenge to learn and expand my knowledge. The second was that ITR already had a working model of this project, but the data was not rendered close to every 100 ms and was not being retrieved from a database in real time.

PROBLEM STATEMENT:
The working model present at ITR was a server-client model in which data was uploaded to the server from a file, processed by the server and pushed to the client every 1000 ms. The target was therefore to first implement a database that would store real-time data and feed it to the server, and then have the server upload it to the client in a graphical format, rendering every 100 ms.

CONCEPTS USED:

The project discussed in detail in this report is built on web development basics: HTML, Cascading Style Sheets (CSS) and JavaScript.
HTML defines the content of a document, such as a heading or a paragraph, while CSS defines how HTML elements are displayed; it is used to control the style and layout of multiple web pages at once. JavaScript, in turn, defines the behavior of web pages, that is, how they respond to various actions.
The term web server can refer to either the hardware (the computer) or
the software (the computer application) that helps to deliver web
content that can be accessed through the Internet. The most common
use of web servers is to host websites, but there are other uses such as
gaming, data storage or running enterprise applications. Common web servers include:
 IIS (Internet Information Services): Microsoft Windows
 Apache Web Server: open source; cross-platform (UNIX, Linux, OS X, Windows)

Internet Information Services (IIS):

IIS is a web server that hosts web pages and web sites. Make sure the IIS role is installed with the ASP.NET sub-components. IIS has its own process engine to handle requests: when a request comes from a client to the server, IIS takes that request, processes it and sends the response back to the client.
Node.js:
Node.js is a free, open-source server environment which enables us to handle file requests. It sends the task to the computer's file system, and when the file system opens and reads the file, the server returns the content to the client. It runs single-threaded, non-blocking, asynchronous code, which is very memory efficient.

Express is a minimal and flexible Node.js web application framework that provides a robust set of features to develop web and mobile applications. It facilitates the rapid development of Node-based web applications.
 It allows middleware to be set up to respond to HTTP requests.
 It defines a routing table which is used to perform different actions based on the HTTP method and URL.
 It allows HTML pages to be rendered dynamically by passing arguments to templates.
Once Express is installed, it is saved locally in the node_modules directory. Some commonly used middleware packages are listed below, and a minimal Express server is sketched after the list.
 body-parser − This is a node.js middleware for handling
JSON, Raw, Text and URL encoded form data.
 cookie-parser − Parse Cookie header and populate
req.cookies with an object keyed by the cookie names.
 multer − This is a node.js middleware for handling
multipart/form-data.
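As an illustration of the points above, here is a minimal sketch of an Express server with the body-parser middleware wired in; the port number and route are arbitrary choices for the example, not taken from the project.

    // Minimal Express server sketch; the port and route are arbitrary choices.
    const express = require('express');
    const bodyParser = require('body-parser');

    const app = express();

    // Middleware: parse JSON and URL-encoded request bodies before routing.
    app.use(bodyParser.json());
    app.use(bodyParser.urlencoded({ extended: true }));

    // Routing table entry: respond to GET requests on the root URL.
    app.get('/', (req, res) => {
      res.send('Server is up');
    });

    app.listen(3000, () => console.log('Listening on port 3000'));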
MongoDB:
MongoDB is a cross-platform, document-oriented database program. Classified as a NoSQL database program, MongoDB uses JSON-like documents with optional schemas.

[Figure: front-end architecture diagram (React component / front end)]
2. SOFTWARES AND TOOLS USED

 Cmder: A console emulator acting as a terminal; it is extremely useful and user-friendly and provides a bash-like prompt using GNU Readline.
 Atom: Atom is a free and open-source text and source code editor for macOS, Linux and Microsoft Windows, with support for plug-ins written in Node.js and embedded Git control, developed by GitHub. Atom is a desktop application built using web technologies. Most of the extending packages have free software licenses and are community-built and maintained.
 Notepad: An alternative yet very useful text editor; Notepad or Notepad++ is easy and convenient to use.
 Nodemon: nodemon is a tool that helps develop node.js based
applications by automatically restarting the node application
when file changes in the directory are detected. Nodemon does
not require any additional changes to your code or method of
development. nodemon is a replacement wrapper for node, to
use nodemon replace the word node on the command line when
executing your script.
 Postman: Postman is a complete API development environment that integrates flexibly with the software development cycle. Postman's API features make it easy to design APIs and maintain multiple API versions with schema support and versioning.
 Robo 3T: Robo 3T (formerly Robomongo) is a free, lightweight GUI for MongoDB with an embedded shell.
 Scalable Vector Graphics: Scalable Vector Graphics (SVG) is an
XML-based vector image format for two-dimensional graphics
with support for interactivity and animation. SVG images and
their behaviors are defined in XML text files. This means that
they can be searched, indexed, scripted, and compressed. As
XML files, SVG images can be created and edited with any text
editor, as well as with drawing software.
3. INITIAL DEVELOPMENT

The typical interactions between browsers and servers consist of


browsers requesting resources and servers providing responses. But,
can we make our servers send data to clients at any time without
explicit requests?
The answer is yes! We can achieve that by using Server Sent Events, a
W3C standard that allows servers to push data to clients
asynchronously. This may sound like the annoying polling we would otherwise implement to get the progress status of a long-running server process, but thanks to SSE we don't have to implement polling to wait for a response from the server. We don't even need any complex or strange protocol; we can continue to use the standard HTTP protocol.
SSE opens a persistent connection that allows you to send data back to
the connected clients the second something is changed on the server.
The only caveat is that it doesn’t allow messages to go in the other direction. That’s not really a problem, though; we still have good old-fashioned Ajax techniques for that.

The Client:
The EventSource constructor initiates a connection with the server over good old HTTP or HTTPS. It has an API similar to WebSocket, and you can provide an onmessage handler for receiving data from the server. An annotated sketch showing the important events is given below.
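The original example was included as a screenshot; this is a minimal sketch of the client side, assuming the server exposes an /events endpoint on port 5000 (the endpoint path, port and JSON payload are assumptions for the example).

    // Client-side sketch: subscribe to Server-Sent Events.
    const source = new EventSource('http://localhost:5000/events');

    // Fired once the connection to the server is established.
    source.onopen = () => console.log('Event stream opened');

    // Fired every time the server writes a "data: ..." block.
    source.onmessage = (event) => {
      const payload = JSON.parse(event.data);   // assuming the server sends JSON
      console.log('New data received:', payload);
    };

    // Fired on network errors; the browser retries automatically.
    source.onerror = () => console.log('Connection error, retrying...');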

Sending Events from the Server:

When an HTTP request comes in from EventSource, it has an Accept header of text/event-stream. We need to respond with headers that keep the HTTP connection alive, and when we are ready to send data back to the client we write it to the response object in a special format: data: <data>\n\n.
Although SSE is wonderfully simple to implement on both the client and the server, as mentioned above, its one caveat is that it doesn’t provide a way to send data from the client to the server. Luckily, we can already do that with XMLHttpRequest or fetch. Our new-found superpower is to be able to push from the server to the client.

Building Real-Time Backends:

The server side of our application is a simple Node.js web server that responds to requests submitted to the events endpoint. To implement it, we create a new directory called real-time-sse-backend at the same level as the real-time-sse-app directory. At the beginning of the file, we import the http module and use its createServer method to run a web server whose behavior is described by the callback function passed as an argument. The callback function verifies that the requested URL is /events and, only in this case, initiates a response by sending a few HTTP headers. The headers sent by the server are very important in order to establish a live communication channel with the client.
In fact, the keep-alive value for the Connection header informs the client that this is a permanent connection. With that, the client knows that this is a connection that doesn't end with the first batch of data received.
The text/event-stream value for the Content-Type header determines the way the client should interpret the data it will receive. In practice, this value informs the client that this connection uses the Server-Sent Events protocol.
Finally, the Cache-Control header asks the client not to store data in its local cache, so that the data read by the client is really sent by the server and not some old, out-of-date data received in the past.
After sending these headers, the client using the EventSource() constructor will wait for events reporting newly available data. The rest of the function body schedules the execution of a few functions in order to simulate the change of a flight state.
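A minimal sketch of such a backend is given below, assuming the simulated flight state is pushed on a one-second timer; the port, payload shape and interval are assumptions for the example, not the project's exact values.

    // server.js sketch: SSE backend on the /events endpoint (port, payload
    // shape and timing are assumptions for the example).
    const http = require('http');

    http.createServer((req, res) => {
      if (req.url !== '/events') {
        res.writeHead(404);
        return res.end();
      }

      // Headers that establish the live SSE channel with the client.
      // (The CORS header discussed in the next paragraph would also go here.)
      res.writeHead(200, {
        'Connection': 'keep-alive',
        'Content-Type': 'text/event-stream',
        'Cache-Control': 'no-cache'
      });

      // Simulate a changing flight state and push it in the "data: ...\n\n" format.
      const timer = setInterval(() => {
        const state = { time: Date.now(), altitude: Math.random() * 1000 };
        res.write(`data: ${JSON.stringify(state)}\n\n`);
      }, 1000);

      // Stop pushing when the client closes the connection.
      req.on('close', () => clearInterval(timer));
    }).listen(5000, () => console.log('SSE server listening on port 5000'));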

Next we try to establish communication between the Node.js server and the client. In order to enable communication between our client app and our backend server, we need to make our server support Cross-Origin Resource Sharing (CORS). With this approach, the server authorizes a client published on a different domain to request its resources. To enable CORS in our Node.js server, we can simply add a new header to be sent to the client: the Access-Control-Allow-Origin header.
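Concretely, in the server sketch above this means adding one more header when the response is initiated; the wildcard origin is only for illustration, and a deployment would normally name the client's actual origin.

    // Headers for the /events response, now including the CORS authorization.
    res.writeHead(200, {
      'Connection': 'keep-alive',
      'Content-Type': 'text/event-stream',
      'Cache-Control': 'no-cache',
      'Access-Control-Allow-Origin': '*'   // or the specific client origin
    });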
4. ADVANCING FURTHER WITH GRAPHS
AND CHARTS
We could have used Plotly, but it works much better with the Angular CLI, and as we are not using the Angular CLI we went for an alternative and tried Canvas. The HTML <canvas> element is used to draw graphics, on the fly, via JavaScript.
The <canvas> element is only a container for graphics; we must use JavaScript to actually draw them.
Canvas has several methods for drawing paths, boxes, circles and text, and for adding images. A simple example is sketched below:
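The original example was a screenshot; a minimal sketch along the same lines follows, where the canvas size and the shapes drawn are assumptions for the illustration.

    // Canvas sketch: create a canvas, get its 2D context and draw on it
    // (the sizes and shapes are illustrative only).
    const canvas = document.createElement('canvas');
    canvas.width = 300;
    canvas.height = 150;
    document.body.appendChild(canvas);

    const ctx = canvas.getContext('2d');

    // Draw a filled rectangle and a line of text.
    ctx.fillStyle = 'steelblue';
    ctx.fillRect(20, 20, 120, 60);
    ctx.fillStyle = 'black';
    ctx.font = '16px Arial';
    ctx.fillText('Hello Canvas', 20, 110);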
The next step was to integrate everything. I wrote a small amount of HTML and CSS for a basic layout, added a list of random data values to the JS file and tried plotting it on a graph, and the output came out fine. Now we had to make a real-time application with the target that the update interval must be 100 ms. The graphs did show real-time data, but the rendering failed for intervals shorter than 1000 ms. In spite of trying various approaches, and even using SVG as an alternative, it did not quite meet the requirements.
5. OPTIMIZATION

INITIAL PROBLEM:
So, as we had that problem, we went down a different route. After a lot of research, I came to the conclusion that Rickshaw.js, an SVG charting library based on D3.js, might actually solve the problem. Rickshaw provides the elements you need to create interactive graphs: renderers, legends, hovers and range selectors. It is all based on D3 underneath, so graphs are drawn with standard SVG and styled with CSS. D3 stands for Data-Driven Documents. It is an open-source JavaScript library developed by Mike Bostock to create custom interactive data visualizations in the web browser using SVG, HTML and CSS. Its main features are listed below, followed by a small illustrative snippet.

D3 Features
 Uses Web Standards: D3 is an extremely powerful
visualization tool to create interactive data visualizations. It
exploits the modern web standards: SVG, HTML and CSS to
create data visualization.
 Data Driven: D3 is data driven. It can use static data or fetch it
from the remote server in different formats such as Arrays,
Objects, CSV, JSON, XML etc. to create different types of
charts.
 DOM Manipulation: D3 allows you to manipulate the
Document Object Model (DOM) based on your data.
 Data Driven Elements: It empowers your data to dynamically
generate elements and apply styles to the elements, be it a table,
a graph or any other HTML element and/or group of elements.
 Dynamic Properties: D3 gives the flexibility to provide
dynamic properties to most of its functions. Properties can be
specified as functions of data. That means your data can drive
your styles and attributes.
 Types of visualization: With D3, there are no standard
visualization formats. But it enables you to create anything from
an HTML table to a Pie chart, from graphs and bar charts to
geospatial maps.
 Custom Visualizations: Since D3 works with web standards, it
gives you complete control over your visualization features.
 Transitions: D3 provides the transition() function. This is quite
powerful because internally, D3 works out the logic to
interpolate between your values and find the intermediate states.
 Interaction and animation: D3 provides great support for
animation with functions like duration(), delay() and ease().
Animations from one state to another are fast and responsive to
user interactions.
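As a small illustration of the data-binding, attribute-based geometry and transition features listed above (the data values and geometry are arbitrary for the example):

    // D3 sketch: bind an array of values to SVG circles and animate them
    // (the values and geometry are arbitrary for the illustration).
    const values = [10, 20, 30];

    const svg = d3.select('body').append('svg')
      .attr('width', 300)
      .attr('height', 100);

    svg.selectAll('circle')
      .data(values)                         // data-driven: one circle per value
      .enter()
      .append('circle')
      .attr('cx', (d, i) => 50 + i * 80)    // geometry supplied as attributes
      .attr('cy', 50)
      .attr('r', 0)
      .style('fill', 'steelblue')           // styling supplied as styles
      .transition()                         // animate from one state to another
      .duration(500)
      .attr('r', d => d);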

REASON FOR OPTIMIZATION:


SVG provides different shapes like lines, rectangles, circles, ellipses etc. Hence, designing visualizations with SVG gives you more flexibility and power in what you can achieve. SVG is a text-based image format, similar in structure to HTML, which sits in the DOM and whose properties can be specified as attributes.
Each SVG element has its own properties, which include both geometry and style properties. All properties can be set as attributes, but generally we provide geometry properties as attributes and styling properties as styles. And since SVG sits in the DOM, we can use attr() and append() just as we did for HTML elements.
Now we had to render the charts every 100 ms, so by scheduling the chart render function at that interval we get an output with a 100 ms refresh rate, as in the sketch below.
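A minimal sketch of this approach, assuming Rickshaw is loaded on the page; the element id, chart size and simulated values are assumptions for the example, whereas in the project the values come from the SSE stream described earlier.

    // Rickshaw line chart re-rendered every 100 ms (element id, size and the
    // simulated data are illustrative; in the project the values come from SSE).
    const seriesData = [{ x: 0, y: 0 }];

    const graph = new Rickshaw.Graph({
      element: document.querySelector('#chart'),
      width: 600,
      height: 250,
      renderer: 'line',
      series: [{ color: 'steelblue', data: seriesData }]
    });
    graph.render();

    let t = 1;
    setInterval(() => {
      seriesData.push({ x: t++, y: Math.random() * 100 });
      if (seriesData.length > 100) seriesData.shift();   // keep a sliding window
      graph.update();                                    // redraw with the new data
    }, 100);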

[Screenshot: the final output as it appears on the client side.]
6. CONNECTING TO MONGODB

HTTP Requests:
GET
The GET method is used to retrieve information from the given server
using a given URI. Requests using GET should only retrieve data and
should have no other effect on the data.
POST
A POST request is used to send data to the server, for example,
customer information, file upload, etc. using HTML forms.
PUT
Replaces all current representations of the target resource with the
uploaded content.
DELETE
Removes all current representations of the target resource given by a
URI.

The main purpose of these request handlers is as follows: a GET request retrieves data from a web server by specifying parameters in the URL portion of the request, and this is the main method used for document retrieval. Once that is done, the object is created in MongoDB, as shown in the Robo 3T UI, so we know that we have received it. To check whether the request handlers are working, we can test them using Postman, through which we can issue any of the HTTP requests.
Handling POST requests and Middleware:
As we know, between any request and response there is a lot of code acting as middleware, which gets executed before a response is sent. Here, instead of writing a lot of this ourselves, we can use body-parser: it looks at incoming requests and, as soon as one arrives, parses the body and attaches it to the request object, which can then be accessed in the route handlers. To do all this we install body-parser through npm install body-parser and save it to our dependencies.
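As an illustration, a POST handler using this middleware could look like the sketch below; the /data route and the way the body is used are assumptions for the example.

    // POST handler sketch: body-parser exposes the parsed JSON as req.body
    // (the /data route and field usage are assumptions for the example).
    app.use(bodyParser.json());

    app.post('/data', (req, res) => {
      console.log(req.body);              // the parsed request object
      res.status(201).send('Data received');
    });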
Models and Data Schemas:
Unlike SQL databases, where you must determine and declare a table’s schema before inserting data, MongoDB’s collections by default do not require their documents to have the same schema. That is:

The documents in a single collection do not need to have the same set
of fields and the data type for a field can differ across documents
within a collection.
To change the structure of the documents in a collection, such as add
new fields, remove existing fields, or change the field values to a new
type, update the documents to the new structure.
This flexibility facilitates the mapping of documents to an entity or an
object. Each document can match the data fields of the represented
entity, even if the document has substantial variation from other
documents in the collection.

Installing Mongoose:
Mongoose is an Object Data Modeling (ODM) library for MongoDB
and Node.js. It manages relationships between data, provides schema
validation, and is used to translate between objects in code and the
representation of those objects in MongoDB.
A Mongoose model is a wrapper on the Mongoose schema. A
Mongoose schema defines the structure of the document, default
values, validators, etc., whereas a Mongoose model provides an
interface to the database for creating, querying, updating, deleting
records, etc.
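For example, a schema and model for the kind of time-stamped readings used in this project might be defined as sketched below; the field names and model name are assumptions, not the exact ones used in the project.

    // Mongoose schema/model sketch (field and model names are illustrative).
    const mongoose = require('mongoose');

    const readingSchema = new mongoose.Schema({
      time:  { type: Number, required: true },   // timestamp of the sample
      value: { type: Number, required: true }    // measured value
    });

    // The model wraps the schema and provides create/query/update/delete methods.
    const Reading = mongoose.model('Reading', readingSchema);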

Saving Data to the Database:


Here we first connect to MongoDB, but before that we must have MongoDB running in the background, which can be done by executing "C:\Program Files\MongoDB\Server\4.0\bin\mongo.exe" in the Cmder terminal. Meanwhile, we have to explicitly declare the connection to MongoDB in code, as sketched below.
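A minimal sketch of the connection and of saving one document, assuming a local database named visualdb and the Reading model sketched earlier (both names are assumptions for the example):

    // Connect to a local MongoDB database and save a document on each POST
    // (the database name and fields are assumptions for the example).
    mongoose.connect('mongodb://localhost:27017/visualdb', { useNewUrlParser: true })
      .then(() => console.log('Connected to MongoDB'))
      .catch(err => console.error('Connection failed:', err));

    app.post('/data', (req, res) => {
      const reading = new Reading({ time: req.body.time, value: req.body.value });
      reading.save()                              // persist the document
        .then(doc => res.status(201).send(doc))
        .catch(err => res.status(400).send(err));
    });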
Once this is done, we can send a POST request through Postman to check whether it is working, and on opening Robo 3T we can see that the request object has been received. With this in place we can add data to the database, and when we run the application the data is fetched from the database and rendered every 100 ms on the client side. [Screenshot: the running graph, which is the output on the client side.]
CONCLUSION

This project now meets the needs stated in the problem statement: not only can it access data from a database, it is also able to render the data every 100 ms without any loss. Instead of the typical Node-and-Express-only combination, a different route was taken that proved helpful in the optimization, as Rickshaw.js with SVG avoided a lot of data loss, and eventually everything fell into place as it should have. The project is still open to change: it can be modified in various ways and many other things can be improved or added, but for now it provides real-time visualization of data through a web server.
FUTURE WORK

Although the project was able to meet its requirements, there are still many things that can be improved or altered. As of now the database connectivity has some issues which can be rectified, or instead of MongoDB another compatible NoSQL database could be used. Next, the PUT requests have a number of bugs that can be fixed: once data was added, retrieving it was always a difficult task and sometimes actually failed, so that can certainly be rectified. Next, the current approach uses Rickshaw.js with SVG; instead, one could use CanvasJS, implement zooming and many other chart features, and improve the front end further by changing the style sheets. Finally, instead of showing the data on a 2D graph, a 3D graph with hover options and other functionality could be used, and the rendering time could be reduced even further with greater accuracy.
REFERENCES
https://www.youtube.com/watch?v=cOt8LfcA9wY&list=PL4cUxeGkcC9jBcybHMTIia56aV21o2cZ8&index=10

https://www.freecodecamp.org/news/introduction-to-mongoose-for-mongodb-d2a7aa593c57/

https://www.sitepoint.com/real-time-apps-websockets-server-sent-events/

https://www.youtube.com/watch?v=gOcGbphplfM

https://docs.mongodb.com/manual/tutorial/install-mongodb-on-windows/

https://auth0.com/blog/developing-real-time-web-applications-with-server-sent-events/
