CHAPTER 1
INTRODUCTION
Many enterprise tools are moving to Web GUIs for a better user interface, easy remote access, and the wide range of web design tools available. The shift is also driven by the fact that web design, thanks to its numerous JavaScript libraries, is far ahead of standalone desktop frameworks such as Tk and Qt. A Web GUI additionally allows access from handheld devices over the internet.

Web applications have several advantages over standalone applications, as they require no setup on the end user's device. Simple authentication using sessions and cookies can make the user experience seamless while keeping the application secure.
A REST API makes the application flexible, and the work can be easily ported to a mobile app. Flask makes developing a REST API straightforward; this, coupled with the multitude of libraries available in Python and Flask's lightweight nature compared to other Python frameworks, made Flask the go-to choice for development.
The application itself should be fluid and real time, and should cater to both technical and non-technical users. The dashboard should be easy to operate and rely heavily on visual information such as graphs. The UI should adapt seamlessly from a mobile to a tablet to a PC view. Since users may run a multitude of web browsers, the application must present a consistent UI that does not vary much across browsers. The application should also be a Single Page Application, with few reloads and network requests. The Grommet framework, developed by HPE, makes much of this functionality easy to deliver.

The application should also be able to analyze PCAP files quickly and present graphs to the user. C++ handles the bulk of the PCAP analysis, providing the raw performance and speed the project requires.

1.1 State-of-the-Art Development
In this section, studies conducted, optimizations made, and applications built using Flask, ReactJS, Expect and their extensions are discussed.

In [1], a brief study of Single Page Applications built with JavaScript was made. The work highlights building efficient Single Page Applications for the enterprise. Multiple JavaScript frameworks were tested, and the study concluded that either ReactJS or AngularJS is suitable for the job, with ReactJS judged the better choice as it is more a library and less a framework.

In [2], display and visualization libraries in JavaScript were studied. The agenda was to build a real-time display tool using open-source software. The work surveyed multiple libraries and chart types, and concluded that d3.js was the best of the open-source options, being both easy to use and powerful.

In [4], a study was made on binding C++ and Python. It focused mainly on the Python.h header and the distutils module, and then examined ways to include C++ code in Python and vice versa. While C++ functions were fast, they lacked functionality; Python functions were plentiful but slow. The work presented an overview of binding Python and C++ and thereby speeding up Python by a large margin. It also covered the numpy library for scientific computing use cases.
The above are some of the studies, works and research material on various JavaScript libraries and frameworks, Python Flask and its extensions, and their usage for different use cases in different fields.

1.2 Motivation
Over the last few years, Web technology has come to be of central prominence. Developing Web related applications is seen as a crucial skill in today’s IT world.

Managing scripts and keeping track of their logs is largely done on a command line interface or in a standalone application. This approach not only hinders user experience but can be tiresome when done manually. It also presents problems when triggers must be run on a remote machine.
In this project, a system is proposed that can trigger scripts remotely from a web-based interface. The user can keep a library of scripts and run them while watching the live flow of logs on the screen. The user is also notified if a script fails.
The system can also do PCAP analysis on the Web Interface and return intuitive graphs, charts and reports based on the analysis.

1.3 Problem Statement
In the existing system, the test scripts are run manually on a command line interface, which is tedious and prone to errors.

The task at hand is to develop a graphical user interface (a Single Page Application) for running the test scripts and for saving and displaying log files in real time. The UI must also allow the user to configure the scripts. The system must also notify the user of failed tests or warnings that occur during the tests.

The other requirement is to analyze a PCAP file, and extract some useful information from it. The extracted data is to be then fed to the UI for it to be visualized using graphs and tables.
1.4 Objective
The objectives of the system follow from the requirements of each module. The first objective is to provide a backend that can run the input scripts and store their logs. The second is to build a Single Page Application GUI to configure, start and stop the triggers and view their logs, and to serve as the front end for the PCAP analyzer. The third is to build a PCAP analyzer for offline PCAP files uploaded by the user.
1.5 Scope
This project is currently developed for a relatively small population of specialists, and the script runner handles only bash/Expect scripts, as requested. With minor modifications, however, the system can automate many types of scripts regardless of scripting language. The PCAP analyzer is also in its early stages, with analysis at an intermediate level.

1.6 Methodology
There are five basic modules in the system. The first is the Script Manager, the second the Trigger Manager, and the third Trigger Operations. All three are implemented with a mix of Expect and Python Flask. Expect is called through Python's subprocess module; it in turn runs the bash scripts uploaded by the user and checks for their successful completion. The conditions for success are determined by the user.
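The Expect invocation described above can be sketched in Python as follows; the function name, command shape and the convention that the Expect wrapper exits 0 on success are assumptions for illustration, not the project's actual code.

```python
# A minimal sketch of shelling out to Expect from the Flask backend.
# The command list shape (["expect", wrapper, script]) follows the
# text above; the function and argument names are hypothetical.
import subprocess

def run_trigger(cmd, timeout=300):
    """Run a trigger command, e.g. ["expect", "runner.exp", "user.sh"].

    Returns True when the process exits 0, i.e. when the Expect
    wrapper's user-defined success conditions were met.
    """
    result = subprocess.run(cmd, capture_output=True, text=True,
                            timeout=timeout)
    return result.returncode == 0
```

In the real system a call like this would be made once per configured thread, with the exit status driving the success/failure notification sent to the user.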

The fourth module is the Log Master, implemented with a mix of Python and terminal programs. When logs are requested, the position of the last read is fetched, and the lines between that position and the current end of the log are returned to the user, in the manner of a tail command.
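The incremental-read idea behind the Log Master can be sketched as below; the offset bookkeeping is an illustrative assumption, since the text says the real system delegates to a tail command.

```python
# A sketch of the Log Master idea: return only the lines appended
# to a log file since the previous read. Function and variable
# names are hypothetical.
_offsets = {}  # log path -> byte offset of the last read

def read_new_lines(path):
    """Return the lines appended to `path` since the last call."""
    last = _offsets.get(path, 0)
    with open(path, "r") as f:
        f.seek(last)          # jump past everything already delivered
        data = f.read()
        _offsets[path] = f.tell()  # remember the new end position
    return data.splitlines()
```

Called from a polling REST endpoint or a WebSocket handler, this delivers only the delta, keeping the live log view cheap.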

The fifth module, the PCAP analyzer, is built with a mixture of C++ and Python Flask. C++ performs the PCAP read, parse and initial analysis; the data is then passed to Python through the Python-C API for further analysis and binding to Flask's REST API. Python's distutils library and Python.h are used for the binding.

All of this is served through a GUI. Data is fetched from the REST API and from WebSockets exposed by Flask: the REST API provides on-demand functionality, while WebSockets provide real-time functionality. The framework used is Grommet, built by HPE on top of React. Additionally, Redux is used for the state store, React-Router for browser routing, and socket.io for the WebSocket connection.

1.7 Organization of the Report
This section gives the overall picture of the many chapters in this report.

Chapter 2 gives an overview of the project domain and describes the software and techniques used to carry out the project.

Chapter 3 is on Software Requirement Specification which describes the assumptions and dependencies, user characteristics, functional requirements and constraints of the project.

Chapter 4 is the High Level Design, which elucidates the design phase of the Software Development Life Cycle. It discusses design considerations such as architectural strategies, general constraints and development methods, and explains the project's System Architecture and Data Flow Diagrams.

Chapter 5 is Detailed Design which explicates the five project modules. The functionality of each module represented as a flowchart is explained in this section.
Chapter 6 is Implementation which describes the technology used in the system. This section also explains programming language, development environment, code conventions followed.

Chapter 7 is on Software Testing which elaborates the test environment and briefly explains the test cases which were tried out during unit, integration and system testing.

Chapter 8 is Experimental Results which mentions the results found by the experimental analysis on different data sets. It talks about the inference made from the results.

Chapter 9 is Conclusion conveying the summary, limitations and future enhancements of the project.

1.8 Summary
This chapter introduced the topic. The existing system for a script runner and PCAP analyzer was discussed, along with the requirements for a GUI-based system. Research work, study material and other applications built along similar lines were also reviewed. The chapter further discussed in detail the motivation, problem statement and objectives of the project.

CHAPTER 2
OVERVIEW OF FLASK FRAMEWORK, GROMMET AND PYTHON-C API
Flask is one of the most used Python frameworks for web development. Grommet is a UI framework built by HPE on top of React. The Python-C API (not to be confused with Cython) is a C API for interacting with Python. In the current system, the Python-C API is used to build a Python-importable .so file, which is imported in Flask.

2.1 Flask Framework
Flask is a micro web framework written in Python, based on the Werkzeug toolkit and the Jinja2 template engine. Flask is classified as a microframework because it does not require particular tools or libraries. It has no database abstraction layer, form validation, or other components where pre-existing third-party libraries provide common functions. However, Flask supports extensions that can add application features as if they were implemented in Flask itself. Extensions exist for object-relational mappers, form validation, upload handling, various open authentication technologies and several common framework-related tools. Routes are specified in Flask using the route decorator.
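As a minimal illustration of the route decorator, the sketch below registers one endpoint and exercises it with Flask's built-in test client, so no server needs to run; the endpoint name is hypothetical.

```python
# A minimal Flask route. The /health endpoint is made up for
# illustration; it is not part of the project described here.
from flask import Flask

app = Flask(__name__)

@app.route("/health")
def health():
    return "OK"

# Exercise the route without starting a server.
client = app.test_client()
response = client.get("/health")
```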

2.1.1 Jinja2 and REST API
Jinja2 is a template engine for binding HTML and Python, often used by Python web frameworks. Python data can be inserted into HTML with the {{ }} tag, and Python control structures with the {% %} tag. The variables are passed as arguments to Flask's render_template() function. Jinja2 also supports template inheritance, letting redundant code be reused; this is very useful in scenarios such as a page header and footer.
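The two tag types can be shown with a small standalone Jinja2 sketch; the script names are made up for illustration.

```python
# A minimal Jinja2 example: {% %} drives a loop, {{ }} inserts data.
from jinja2 import Template

page = Template(
    "<ul>{% for s in scripts %}<li>{{ s }}</li>{% endfor %}</ul>"
)
html = page.render(scripts=["ping.sh", "load.sh"])
```

Inside Flask, the same template text would live in a file and be rendered with render_template(), with the variables passed as keyword arguments.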

Flask can return either REST or HTML responses depending on the route. HTML is usually returned through Jinja2, while REST responses are returned through Flask's jsonify function, which builds a JSON response from a dictionary or list. WebSockets are implemented using the flask_socketio package. An overview of WebSockets and REST polling is shown in Figure 2.1 and Figure 2.2. To enable Cross-Origin Resource Sharing (CORS), the flask-cors package is used; CORS is a mechanism that allows restricted resources (e.g. fonts) on a web page to be requested from a domain other than the one from which the first resource was served.
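A hedged sketch of the jsonify pattern follows; the route path and the trigger fields are hypothetical, chosen only to echo the system's domain.

```python
# A minimal REST endpoint returning JSON via jsonify, exercised
# through the test client. Route and field names are hypothetical.
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/api/triggers")
def list_triggers():
    triggers = [{"name": "ping_test", "threads": 4, "iterations": 10}]
    return jsonify(triggers)  # serializes the list to a JSON response

client = app.test_client()
response = client.get("/api/triggers")
```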

2.1.2 Extensions
The Flask framework ships with only two bundled dependencies, Jinja2 and Werkzeug. Jinja2 handles template rendering, while Werkzeug is a WSGI utility library that also provides a lightweight development server. Flask is WSGI-ready, so porting it to Apache with a WSGI middleware such as mod_wsgi or uWSGI is easy.

Figure 2.1 Communication through Web Sockets

Figure 2.2 Communication through REST Polling

The Flask framework is not shipped with any other extensions; they are installed by the user. The extension library is well maintained and updated on a regular basis. Flask extensions are generally named Flask-Foo. Extensions used by our system include Flask-SocketIO and Flask-CORS. Some other popular Flask extensions are:
Flask-Admin: a simple and extensible admin interface.
Flask-Cache: cache support for Flask.
Flask-Celery: integration of Celery with Flask. Celery is an asynchronous task queue/job queue based on distributed message passing; it is focused on real-time operation but supports scheduling as well.
Flask-Login: user session management and login methods for Flask.
Flask-OAuth: OAuth support for Flask.
Flask-SQLAlchemy: SQLAlchemy support for Flask. SQLAlchemy is a popular Python package providing ORM (Object Relational Mapping), like Hibernate in Java. In addition, Flask-SQLAlchemy supports maintaining multiple databases in the app.
Flask-Uploads: adds easy file uploading to a Flask app.
Flask-User: user and account management for Flask, providing registration, username, email address, forgot-password and role-based authorization methods.

Not having the extensions preinstalled is one of the best features of Flask; it is what makes Flask truly lightweight yet powerful. Installing extensions is left entirely to the user. Most Flask extensions are reviewed and can be downloaded through PyPI.

2.2 Grommet
Grommet is an open-source UI framework built by HPE on top of the React library [23]. ReactJS was built by Facebook and open-sourced in 2013. Basic terms in React are component, props and state. A component is a reusable React class that holds state and props and renders some HTML. Props are the arguments passed to a component by its parent, and cannot be changed. State is temporary data stored by a component, used to track its current condition or to hold temporary calculated values. Grommet provides prebuilt React components, such as Box, which is equivalent to div in HTML but has extra properties like flexbox. Grommet also provides inbuilt styling, implemented with SCSS.
2.2.1 Virtual DOM
The Virtual DOM (Virtual Document Object Model) is an important feature of React. React keeps a copy of the DOM in memory, the Virtual DOM, computes the changes made to it, and applies only those changes to the DOM displayed in the browser. This lets the programmer write code as if the whole page were rendered on each change, while internally React updates only a sub-component, making rendering fast.

2.2.2 Component Lifecycle
React also provides a callback for every stage of a component's life; these are called lifecycle methods. The commonly used ones are explained in this section.

Figure 2.3 React Component Lifecycle methods
Figure 2.3 shows the lifecycle methods of a React component. The most commonly used are:
componentWillMount(): called before the component mounts, ahead of the initial render.
componentDidMount(): called just after the initial render.
render(): returns JSX elements; this is the most important lifecycle method.
componentWillReceiveProps(): called when the component receives props from its parent, or when the parent changes the value of the props.
componentWillUnmount(): acts as a destructor; used to clear set-interval timers for polling, etc.

2.2.3 Redux
React is often used in conjunction with Redux, which provides a container for storing the application state. Redux helps greatly with building a Single Page Application: most information can be requested at startup and kept in the Redux store, and the UI then simply displays information from the store.

2.3 Python-C API
The Python-C API binds Python with C. The API also works with C++, though by convention it is still called Python-C. There are two main reasons to use it. The first is to build fast Python libraries backed by C/C++, which is what this project exploits. The second is to use Python libraries from C/C++, a technique called embedding Python in an application.
2.3.1 Distutils
Distutils is used to build and install additional modules into an existing Python installation. A module may be written in Python, in C, or in a mixture of both. In the current scenario, distutils compiles the C/C++ program written against the Python-C API into a Python-importable .so file. Doing so gives us the raw power of C together with the huge library support of Python, truly the "best of both worlds" [8].
2.4 Summary
This chapter gave an overview of the Flask micro-framework and its popular extensions, with a brief note on their functionality. It also explained React and Redux and the React component lifecycle, and briefly described the Python-C API and building Python modules from C using distutils.

CHAPTER 3
SOFTWARE REQUIREMENTS SPECIFICATION FOR NETWORK TRAFFIC TRIGGER AND LOG ANALYZER
A Software Requirements Specification [20] is a detailed description of the behavior of the system to be constructed. It integrates the functional and non-functional requirements of the software. The functional requirements describe exactly what the software must do, while the non-functional requirements include constraints on the design or implementation of the system. A function is described as a set of inputs, a behavior, and outputs. A non-functional requirement specifies criteria by which the operation of a system can be judged, rather than specific behaviors.

3.1 Overall Description
This section describes the general factors that affect the system and its requirements. The software must be able to configure scripts and run them for the specified iterations and threads, with each thread generating a log on the local filesystem. The system should also analyze PCAP files quickly. This section also covers user characteristics, constraints on using the system, and the system's dependencies on other applications.

3.1.1 Product Perspective
The system should be versatile and easy to use. It should be flexible and the response time should be quick. The system is composed of several modules performing different tasks and they must be well co-ordinated. The system developed should be easy to deploy and maintain. The intended users are IT Analysts and the Quality Analysts, who can easily run and configure their scripts and get a GUI view for the scripts and logs. The scripts can be triggered on a remote System. The system is also intended for Network Analysts and System Admins who can analyse Packet Capture files.
3.1.2 Product Functions
The product is responsible for five functions, the primary one being managing the scripts written by the end user or using the scripts available in the library. The second deals with managing triggers and their configuration. The third allows the user to start and stop triggers. The system also lets the user view the logs generated when the scripts are run. The final function is to analyze PCAP files and show graphical information, such as charts, to the end user.
3.1.3 User Characteristics
The end users of the system, as mentioned before, map mostly to two roles: quality analysts and system admins. The system can help them with exploratory data analysis (EDA). In case of system or network problems, the system admins can run the scripts and, for further analysis, examine the Packet Capture files to detect anomalies in the capture.
3.1.4 Constraints and Dependencies
The network traffic trigger and log analyzer has several dependencies:
Flask should be installed on the system and configured with mod_wsgi.
Expect should be installed.
Apache should be running, listening on the correct port, and have the mod_wsgi module installed.
Grommet should be forked [24] and all Grommet dependencies installed.
The C packages libpcap-devel and python-dev must be installed.
All environment settings must be configured properly.
3.2 Specific Requirements
This section covers the software requirements in sufficient detail for designers to build a system that satisfies them. The product perspective and user characteristics do not state the actual requirements of the system, but rather how the product should behave with respect to user convenience. The specific requirements are concrete terms on which the customer and software provider can agree. The final system is expected to satisfy all the requirements mentioned here.

3.2.1 Functional Requirements
The functional requirements of the system are:
The system must automatically show the scripts written previously on the system and their configurations.

The system must be able to run the scripts according to the configurations and notify the user on success and failure of the scripts.

The system should also show the generated logs to the user on the Front End.

The system should be able to analyze a packet capture file and then show a detailed report of the capture to the user.

3.2.2 Performance Requirements
Performance requirements for the system include the following:
The scripts should be able to run concurrently
Packet Capture analysis should be done quickly.

Logs should be shown in real time.

3.2.3 Supportability
The system is web based. Thus, it should work with all standard browsers without much difference in appearance of the interface.

3.2.4 Software Requirements
The different software requirements required by the application are as follows:
Operating System
OS X Yosemite and higher versions
RHEL 7 or higher
Language : Python, Javascript, C++
Backend : Flask Framework (Build REST API) + Apache (Server).

FrontEnd : Grommet(HTML + CSS + Javascript) + http-server (Server)
IDE/tool
Vim 8.0.0
Atom 1.27.0
Sublime Text 3.1.0
3.2.5 Hardware Requirements
The following describe the hardware requirements for ideal running of the application.

Processors : 2.2 GHz Intel Core i7
Memory : 16 GB 1600 MHz DDR3
Storage : Does not affect performance
3.2.6 Design Constraints
The system is designed to be flexible. The design constraints include a user-friendly interface. The system should be able to run multiple scripts concurrently and must ensure that the application state is always consistent. It should take little time to analyze a PCAP file. The system must also be built to handle all possible inputs, so that the feature set is comprehensive enough for non-technical end users.

3.2.7 Interfaces
This section describes the interfaces in detail. The two types of interfaces involved are User Interfaces and Software Interfaces.

3.2.7.1 User Interfaces of the system
The five modules involved each have different user interfaces from which they accept parameters from the user.

The trigger operations module is shown on the landing page of the dashboard.

The trigger manager module can be reached by selecting edit trigger or add trigger.

The script manager module can be reached by selecting the add scripts option.

The log master module pops up as a console on starting a trigger, and shows the generated logs.

The PCAP Master module can be selected from the menu tab, where a PCAP file can be uploaded; after upload it is analyzed and the data is displayed on the UI.
3.2.7.2 Software Interfaces of the system
The following are software used in the system:
Redux middleware for application store.

Distutils for building python modules
3.2.8 Non-Functional Requirements
These requirements specify criteria used to judge or evaluate the operation of the system rather than specific behaviors, and are not directly concerned with the specific functions delivered by the system:
Efficiency: the system shall perform all internal operations at the best possible efficiency.
Availability: the system must be up and running at all times.
Security: the system must be well built, with no security threats.
Uniformity: the system must perform on any standard browser with no disparities.
Speed: the system must have short latency and be responsive.
3.3 Summary
The specific requirements and constraints that must be kept in mind while building the application have been detailed in this chapter. These include the hardware, software and functional requirements of the system. This chapter also cites the assumptions made by the developer for PCAP analysis. All of these must be managed while building and running the system.

CHAPTER 4
HIGH LEVEL DESIGN FOR SCRIPT RUNNER AND PCAP ANALYZER
Design is a significant phase in the development of software. It is essentially a creative procedure that describes the system organization and establishes that it satisfies the functional and non-functional system requirements [20]. Larger systems are divided into smaller sub-systems that provide related services. The output of the design phase describes the architecture of the software to be used for development. This section depicts the issues that must be covered or resolved before attempting a complete design solution. The detailed design includes an explanation of all the modules, throwing light on their purpose, functionality, inputs and outputs. The software requirements specification has been studied to design appropriate and efficient software able to handle a multitude of users from different user groups accessing the system simultaneously.

4.1 Design Considerations
There are several design consideration issues that need to be fixed before designing a solution for the system to be implemented. The following sections describe constraints that have heavy impact on the software, a method or approach used for the development and the architectural strategies. It also describes the overview of the system design.

4.1.1 General Constraints
General constraints which need to be considered to use the system are listed below:
The user should have knowledge of the required inputs and the formats of the inputs
The data entered by user must be legitimate
The user should have knowledge of writing Expect Scripts and some other general scripting language
4.1.2 Development Methods
The design method employed is highlighted in this chapter. The data flow model has been used for the development of the system. A data flow model describes the system in terms of the data transformations that take place as data is processed; its notations represent functional processing and data stores. Data flow models give a better understanding of how data is associated with a particular process by tracking it and providing documentation.

4.2 Architectural Strategies
This section describes the overall organization of the system and its high-level structure, and provides key insight into the mechanisms and strategies used in the system architecture.

4.2.1 Programming Language
The system involves two major segments, the frontend and the backend, built using JavaScript and Python respectively. Python and JavaScript are scripting languages that support a wide range of data types and application programming interfaces for handling data. The user interfaces are developed using HTML and CSS.

Flask is used in the backend to return REST responses and WebSockets. React fetches data from the REST API using Ajax and displays the components. Separating the front end and back end makes the app more portable: the dashboard can be accessed remotely from anywhere, and the REST API makes porting to a mobile app very easy.

4.2.2 User Interface Paradigm
The GUI is a Single Page Application. Each functionality has an option on the menu and a route. Grommet provides inbuilt React components and helps with the maintainability of the code. The SCSS is predefined in the components, and media queries are handled by Grommet, so switching from a mobile view to a tablet view to a PC view is seamless.

4.2.3 Error Detection and Recovery
Error detection and recovery is an important aspect of the implemented project. React helps ensure that required form fields are filled with relevant data, and provides default data for repetitive fields. Exceptions may occur if the user enters incompatible data; internal error detection is done using the exception handling clauses provided by Python. An occurrence of an error also triggers a warning message to the relevant developers, who can immediately look into and fix it.

4.2.4 Application State Management
Application state management is necessary for the efficiency of the program. It ensures that the current state of the application, and all data fetched through Ajax, is stored in a central container (Redux) for efficient implementation of the Single Page Application. This applies to all five main modules of the system. The store data is returned to the UI on request.

4.3 System Architecture
This section focuses on the basic structure of the system model. It identifies the major modules in the system and the communication flow among them. The approach used for development is object oriented: the system is divided into objects that represent real-world entities. The architecture is depicted in the figure below.

Figure 4.1 System Architecture of Script Runner and PCAP Analyzer
The system architecture is shown in Figure 4.1. The user can add, view or edit the scripts present in the system, which are stored locally on disk. The user can also compose triggers: scripts coupled with configurations that define their behavior. The default configurations are iterations and threads; the rest are user defined. The user can start or stop the triggers, and logs are generated in the process; lines appended to the logs are sent back to the user. The user can also upload Packet Capture files and analyze them; the analyzed data is modeled into graphs and visualizations and returned to the user.

4.4 Data flow diagram
A Data Flow Diagram (DFD) is a graphical representation of the "flow" of data through an information system [20]. Data flow models describe how data flows through a sequence of processing steps. A DFD is composed of four elements: process, data flow, external entity and data store. With a data flow diagram, users can easily visualize the operations within the system, what can be accomplished with it, and how it is implemented. DFDs give end users an abstract idea of the data given as input to the system and the effect that input ultimately has on the whole system.

4.4.1 Data flow diagram – Level 0
The level 0 DFD describes the general operation of the system. It represents the system and the user, and the inputs and outputs between them. The level 0 data flow diagram is shown in Figure 4.2.

Figure 4.2 Level 0 DFD of the Network Script Trigger and Packet Analyzer
4.4.2 Data flow diagram – Level 1
The Level 1 DFD describes the system in more detail than the Level 0 DFD, specifying the main modules involved, as shown in Figure 4.3. The manage scripts module handles script storage operations on the local disk and returns the contents of a script to the user (in the case of a view operation). The manage trigger module stores the configurations on disk, keeps the triggers in memory, and prepares a trigger for the script.

Figure 4.3 Level 1 DFD of Network Script Trigger and Packet Analyzer

In the trigger operations the user can start and stop a trigger, and the response is sent as a notification (success / failure). This step also generates logs, which are used in the next module. The Log Master module uses the log file and sends the appended logs back to the user. The PCAP Operations module deals with PCAP operations such as uploading a PCAP file and analyzing it. The user receives visualizations modeled from the analyzed PCAP file.

4.4.3 Data flow diagram – Level 2
The processes in level 1 are expanded here. The Level 2 DFD for manage scripts is shown in Figure 4.4, the same for manage triggers is shown in Figure 4.5, Trigger Operations is shown in Figure 4.6, Log Master is shown in Figure 4.7 and PCAP operations is shown in Figure 4.8.

Figure 4.4 Level 2 DFD for Script Manager Module
Figure 4.4 shows the level 2 DFD for the script manager. The user can add a script, edit an existing script or delete a script from the store. The user can also run a script with default configurations and check the behavior of the script on the remote system. The scripts are stored on disk.

Figure 4.5 Level 2 DFD for Trigger Manager Module
Figure 4.5 shows the level 2 DFD for the Trigger Manager. A trigger is a script coupled with the arguments the script takes and its configurations. The default configurations are iterations and threads. Triggers are kept in memory for quick use, and they are also stored locally for future use: they are read from disk on startup and saved back when the application is closed. While the application is running, the triggers are held in memory.
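This persistence scheme can be sketched as follows. This is an illustrative sketch, not the project's actual code: the store path, field names and helper names are assumptions.

```python
import json
import os

TRIGGER_STORE = "scripts/conf/triggers.json"  # hypothetical store path

# In-memory trigger store used while the application is running.
triggers = {}

def add_trigger(name, script, params, iterations=1, threads=1):
    """Couple a script with its parameters and default configurations."""
    triggers[name] = {
        "script": script,
        "params": params,
        "iterations": iterations,
        "threads": threads,
    }

def save_triggers(path=TRIGGER_STORE):
    """Persist the in-memory triggers to disk (called on shutdown)."""
    os.makedirs(os.path.dirname(path), exist_ok=True)
    with open(path, "w") as f:
        json.dump(triggers, f, indent=2)

def load_triggers(path=TRIGGER_STORE):
    """Read the saved triggers back into memory (called on startup)."""
    if os.path.exists(path):
        with open(path) as f:
            triggers.update(json.load(f))
```

Keeping the working copy in memory makes start/stop operations fast, while the JSON file preserves triggers across restarts.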

Figure 4.6 shows the level 2 DFD for the trigger operations module. A trigger can be started or stopped by the user. The user can either start all threads of a trigger or start just a single thread. Logs are produced when the script runs, and each thread generates a separate log. The success and failure codes are hard coded in the expect script, and the status notification of the trigger is sent back to the user.

Figure 4.6 Level 2 DFD for Trigger Operations Module
Figure 4.7 shows the level 2 DFD for the Log Master module. The logs generated by a trigger are received by the Log Master. The Log Master tails the logs and finds the lines appended to the log file since the previous view. The first view of a log file shows the last 30 lines; subsequent views show only the freshest logs. This is done by polling.
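The tailing behaviour described above can be sketched as follows. This is an illustrative sketch; the function name and the per-file offset bookkeeping are assumptions, not the project's actual code.

```python
import os

# Remember, per log file, the offset up to which the user has already read.
_last_read = {}

def get_logs(path, initial_lines=30):
    """Return the last 30 lines on the first view, then only appended lines."""
    with open(path) as f:
        if path not in _last_read:
            lines = f.readlines()          # first view: read the whole file
            _last_read[path] = f.tell()
            return lines[-initial_lines:]  # ...but show only the tail
        f.seek(_last_read[path])           # later views: read only new data
        fresh = f.readlines()
        _last_read[path] = f.tell()
        return fresh
```

The client polls this endpoint periodically; because the server only ever reads from the saved offset onward, each poll is cheap even for large log files.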

Figure 4.7 Level 2 DFD for Log Master Module

Figure 4.8 shows the level 2 DFD for the PCAP Analyser module. The user can upload or analyze a PCAP file. The file is first uploaded and then sent to the analysis pipeline, where it is first dissected and then the data analysis is done. The analysed data is then cleaned and processed for visualizations. The visualized data is then given to the client.

Figure 4.8 Level 2 DFD for PCAP Analyser Module

4.5 Summary
The above data models depict how data is processed by the system; this constitutes the analysis level. The notations applied above represent functional processing, data stores and data movement among the functions. The purpose of this chapter is to describe the major high-level processes in the system and their interrelation. All the above-mentioned levels of DFDs illustrate these.

CHAPTER 5
DETAILED DESIGN OF SCRIPT RUNNER AND PCAP ANALYZER
In the Detailed Design phase [20], the internal logic of every module specified in the High Level Design (HLD) is determined. Specifically, in this phase the design of each module and its low-level components and subcomponents is described. After the HLD is determined [21], a graphical representation of the software system being developed is drawn. Each module's input and output types, along with the data structures and algorithms used, are documented during the detailed design phase. The following sections provide this information for each module.

5.1 Structure Chart
The structure chart [20] shows the control flow among the modules in the system. It identifies all the modules and the interactions between them, as well as the identified sub-modules. The structure chart also shows the input to and the output generated by each module.

In the system, there are five sub-modules: Script Manager, Trigger Manager, Trigger Operations, Log Master and PCAP Master. The description of the sub-modules, the flow of data and the results of each are shown in Figure 5.1. Each module receives input, processes it and produces output which is shown to end users in the web dashboard. The Script Manager module starts with the user adding or editing scripts; the end result is that the script store changes accordingly. The Trigger Manager works like the Script Manager, but handles CRUD operations for triggers instead. The Trigger Operations module deals with the start and stop operations performed on a trigger. The Log Master module accepts logs and returns to the user the lines from the last log read to the newly appended log. The PCAP Master module deals with the upload of PCAP files and their analysis; the user receives visualizations in the dashboard if the action was Analyze.

Figure 5.1 Structure Chart of Script Trigger and PCAP Analyser

5.2 Functional Description of Modules
The internal working of each of the modules is explained in this section. It also describes the software component and subcomponent of the system.

5.2.1 Script Manager Module
This module deals with the CRUD operations performed on a script. It can also indirectly affect the triggers in the store. The flowchart for the same is shown in Figure 5.2.

Purpose: The purpose of this module is to make changes to the scripts already present in the store, to delete a script or to upload a new one.

Input: The user's action is taken as input from the UI. The file name is taken as input for editing or deleting a particular script, or the script itself is taken as input for inserting a new script.

Output: The status of the operation is displayed as a toast in the UI.

Functionality: The input is given to the module, which checks whether the action is Add, Edit or Delete. For an Edit or Delete action, the script name is taken as input to update or delete the script. For an Add or Edit option, the script is edited by the user. In the case of Delete, all triggers related to the script are also deleted.
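This add/delete flow, including the cascading delete of related triggers, can be sketched as follows. This is an illustrative sketch; the store location and the shape of the trigger store are assumptions.

```python
import os

SCRIPT_DIR = "scripts"   # hypothetical store location

# Hypothetical in-memory trigger store: trigger name -> script it wraps.
triggers = {"t1": "add.sh", "t2": "sub.sh"}

def add_script(name, contents):
    """Add (or edit) a script in the local store."""
    os.makedirs(SCRIPT_DIR, exist_ok=True)
    with open(os.path.join(SCRIPT_DIR, name), "w") as f:
        f.write(contents)

def delete_script(name):
    """Delete a script and cascade the delete to its triggers."""
    path = os.path.join(SCRIPT_DIR, name)
    if os.path.exists(path):
        os.remove(path)
    # Remove every trigger that refers to the removed script.
    for t in [t for t, s in triggers.items() if s == name]:
        del triggers[t]
```

The cascade keeps the trigger store consistent: a trigger is never left pointing at a script that no longer exists.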

Figure 5.2 Script Manager flowchart

5.2.2 Trigger Manager Module
This is the second module of the system, responsible for the CRUD operations on triggers. The flowchart for the same is shown in Figure 5.3. A script coupled with its parameters and configurations is called a trigger.

Purpose: The purpose of this module is to setup and manage the triggers available to the user.

Input: The user's action is taken as input, followed by the configurations, parameters and script path in the case of an add/edit operation. The trigger name is taken as input in the case of an edit/delete operation.

Output: A toast is shown in the UI to the user.

Functionality: The input is taken and the type of action is determined. A trigger is added by coupling a script with its parameters and trigger configurations. On delete, the trigger is removed. On edit, the trigger is updated based on the input given by the user.

Figure 5.3 Trigger Manager flowchart
5.2.3 Trigger Operation Module
This is the third module of the system, which handles the start/stop operations performed on a trigger. The flowchart for the same is shown in Figure 5.4.

Purpose: The purpose of this module is to let users start or stop triggers.

Input: The type of operation to be performed that is start/stop is given to the system through UI. The trigger name is also given to the system.

Output: The success or failure of the trigger is shown to the user through a message. A toast is also displayed when a trigger is started or stopped.

Functionality: The input is given to the module, which checks the current state of the trigger and starts or stops it as directed if it is safe to do so. The status of the trigger (Started / Stopped / Error) is returned to the user.
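The start/stop logic with its state check can be sketched as follows. This is illustrative only: the Trigger class, its state names and the use of a threading.Event are assumptions, not the project's actual implementation.

```python
import threading

class Trigger:
    """Sketch of a trigger whose threads run a script repeatedly."""

    def __init__(self, name, action, iterations=1, threads=1):
        self.name = name
        self.action = action          # stand-in for running the script
        self.iterations = iterations
        self.num_threads = threads
        self.state = "Stopped"
        self._stop = threading.Event()
        self._workers = []

    def _run(self):
        for _ in range(self.iterations):
            if self._stop.is_set():   # honour a stop request between runs
                return
            self.action()

    def start(self):
        if self.state == "Started":   # safety check: refuse a double start
            return "Error"
        self._stop.clear()
        self._workers = [threading.Thread(target=self._run)
                         for _ in range(self.num_threads)]
        for w in self._workers:
            w.start()
        self.state = "Started"
        return self.state

    def stop(self):
        self._stop.set()
        for w in self._workers:
            w.join()
        self.state = "Stopped"
        return self.state
```

Each worker thread would, in the real system, write to its own log file, which is what the Log Master module then tails.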

Figure 5.4 Trigger Operations flowchart
5.2.4 Log Master Module
This is the fourth module of the system, which checks the generated logs and returns the freshest ones to the user. The flowchart for the same is shown in Figure 5.5.

Purpose: The purpose of this module is to let users get logs in real time.

Input: The logs generated by the trigger are given to the module.

Output: The logs appended after the last read of the same log file are returned.

Functionality: The input is given to the module, which checks the point at which the log was last read and returns the lines from the last read line to the current last line of the log. The last read line is then set to the current last line.

Figure 5.5 Log Master flowchart

5.2.5 PCAP Master Module
This is the last module of the system, which deals with the management of PCAP files and their analysis. The flowchart for the same is shown in Figure 5.6.

Purpose: The purpose of this module is to let users upload PCAP files and analyse them.

Input: PCAP File is uploaded to the system.

Output: Visualizations based on the analyzed PCAP file and the report are returned to the user.

Functionality: The input is given to the module, which uploads the PCAP file if the action specified is Upload. If the action specified is Analyze, the PCAP file is looked up in the store and analyzed. The report generated from the analysis and visualizations such as graphs are returned to the user.
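The upload/analyze dispatch can be sketched as follows. This is illustrative: the store path and the pipeline callables are assumptions standing in for the real dissection, analysis and visualization steps.

```python
import os

PCAP_STORE = "data/pcaps"   # hypothetical store location

def upload_pcap(name, contents):
    """Store an uploaded PCAP file locally (the Upload action)."""
    os.makedirs(PCAP_STORE, exist_ok=True)
    with open(os.path.join(PCAP_STORE, name), "wb") as f:
        f.write(contents)

def analyze_pcap(name, dissect, analyze, visualize):
    """Look the file up in the store, then run the analysis pipeline
    (the Analyze action): dissect -> analyze -> visualize."""
    path = os.path.join(PCAP_STORE, name)
    if not os.path.exists(path):
        return {"error": "PCAP not found in store"}
    raw = dissect(path)           # in the project, this is the C++ parser
    report = analyze(raw)
    return {"report": report, "charts": visualize(report)}
```

Separating the two actions means a file can be uploaded once and analyzed many times without re-transferring it.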

Figure 5.6 PCAP Master flowchart

5.3 Summary
The internal working of the application's modules, with the necessary data flow through each of them, has been described in this chapter. A clear view of the control flow within the system was conveyed by the structure chart, with the functionality of its modules explained. The flowcharts explain the working of each module with the flow of control specified, which gives a complete understanding of its functioning.
CHAPTER 6
IMPLEMENTATION OF SERVICE TRIGGER AND NETWORK ANALYZER
The implementation phase is a significant phase in project development, as it delivers the final solution that solves the problem. In this phase the low-level designs are transformed into language-specific programs such that the requirements given in the software requirements specification [20] are satisfied. This phase entails the actual implementation of the ideas that were described in the analysis and design phases. The techniques and methods used for implementing software must support reusability and ease of maintenance, and should be well documented.

6.1 Programming Language Selection
The programming languages chosen to implement the project are C++, JavaScript and Python. JavaScript is one of the most useful languages in the UI development environment, with plenty of tools and libraries available. Some of the benefits of JavaScript that were key to choosing it are:
Simple, easy and highly readable program syntax with ES6
Extensive support for Web Applications
Platform independent
Myriad set of libraries for web-based tools.

Modular development thanks to React
Asynchronous programming support
Python is another highly readable and user-friendly language, and the Flask framework helps implement the REST paradigm in simple, unified projects. Some of the key benefits of Python which resulted in its selection for developing the dashboard are:
Easy to implement REST API
Multitude of packages and support for development
Extensive support for Web Application development
Easy to read syntax
Easy binding with C/C++
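The last point can be illustrated with the standard ctypes module, which lets Python call into C libraries directly. Here we call strlen() from the C library already loaded into the process (this assumes a Linux/Unix environment); binding a project-specific shared object, such as a compiled PCAP parser, works the same way via CDLL("libname.so").

```python
import ctypes

# CDLL(None) exposes symbols already loaded into the process,
# which on Linux includes the C standard library.
libc = ctypes.CDLL(None)
libc.strlen.argtypes = [ctypes.c_char_p]
libc.strlen.restype = ctypes.c_size_t

def c_strlen(s: str) -> int:
    """Length of a string as computed by the C library's strlen()."""
    return libc.strlen(s.encode())
```

Declaring argtypes and restype up front makes the foreign call type-safe on the Python side.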
6.2 Platform Selection
The backend of this system works only on Linux and Mac OS, as the code relies heavily on terminal commands. The system's front end, based on React, is designed to work on Linux, Mac OS and Windows operating systems, although development was done entirely on RHEL (Linux). Front-end testing was also done on Windows systems to ensure no incompatibility issues occurred. Since the front end accessed by the end user is a web dashboard built with portability in mind, any commonly used browser can be used to access it.

6.3 Code Conventions
This section discusses the coding standards followed throughout the project, along with the software applications necessary to complete it. Proper coding standards should be followed because a large project must be coded in a consistent style; this makes it easier to understand any part of the code without much difficulty. Code conventions are important because they improve the readability of software, allowing programmers to understand the code clearly.

6.3.1 Naming Conventions
Naming conventions help make programs understandable and easier to read. The names given to packages, scripts, graphs and classes should be clear and precise so that their contents can easily be understood. The project uses both JavaScript and Python, and the naming conventions followed in the two are slightly divergent from each other.

The conventions followed for this project in JavaScript are as follows:
Classes: Class names are nouns. The upper camel casing method is followed, in which the first letter of every word is capitalized, including the first word.

Example: ScriptManager.

Methods: Method names should be verbs. For methods, lower camel casing is followed: the first letter of every word is capitalized except for the first word.

Example: getValue( ).

Variables: Variable names must be short and meaningful.
Example: macAddress, which indicates the mac address of the device.

The conventions followed in Python for this project are as follows:
Classes: Class names are nouns. The upper camel casing method is followed, in which the first letter of every word is capitalized, including the first word.

Example: ScriptManager.

Methods: Method names should be verbs. The method name is a description of its function, with words connected by underscores.
Example: get_value( ).

Variables: Variable names must be short and meaningful; if multiple words are present, they are connected by underscores. All letters are lowercase.
Example: mac_address, which indicates the mac address of the device.

6.3.2 File Organization
The files used to implement the project were organized by type and kept in a certain order. In the backend (Flask framework), the files are arranged as follows:
All the script files are stored in the scripts folder of the application.

The Python files are present in the source (src) folder of the application.

Logs are present in the logs folder
All the data that is to be stored by scripts is in data folder
Trigger configurations are stored in scripts/conf folder
In the front end (React + Redux), the files are arranged as follows:
The distributable HTML, CSS and compiled JavaScript files are present in the dist folder.

Installed packages are in node-modules folder
All core JavaScript files are present in the source (src) folder. It contains all the components not linked with Redux in the components folder, and all components linked with Redux in the containers folder. All Redux actions are stored in the actions folder and all Redux reducers in the reducers folder. The entry point index.js is an immediate child of the src folder.
6.3.3 Declarations
Standard declaration conventions are followed in JavaScript, C++ and Python while coding. Standard names are given, which makes it easy to understand the role of each declared entity. Multiple declarations per line are avoided, both for ease of commenting and to reduce ambiguity.

6.3.4 Comments
Comments are a necessary part of any coding convention, as they improve the understandability of the code. In the project files, thanks to the integrated development environments, commented areas are printed in grey by default, so they are easy to identify.
In C++ and JavaScript, block comments start with '/*' and are delimited by '*/'; single-line comments start with '//'. In Python, comments start with '#'. Keyboard shortcuts and mouse options are provided to comment out or uncomment blocks of code with ease. Comments are used to explain what function a certain piece of code performs, especially if the code relies on implicit assumptions or otherwise performs subtle actions.

6.4 Difficulties Encountered and Strategies Used to Tackle Them
This section discusses the difficulties encountered in the development of the project. One of the main difficulties was the speed of reading and parsing PCAP files in Python.

6.4.1 Speed of parsing PCAP file
As all the popular Python PCAP parsing libraries take a long time to parse a PCAP file (~20 seconds for a 20 MB file), a custom PCAP parser was built in C++ which parses the file and returns the required information much faster: a 20 MB PCAP can be parsed in under 4 seconds, a speedup of roughly 5×.
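For reference, the classic libpcap capture format the parser reads has a fixed layout: a 24-byte global header beginning with the magic number 0xA1B2C3D4, followed by, for each packet, a 16-byte record header (timestamp, captured length, original length) and the captured bytes. A Python sketch of the same parsing logic is shown below; the project's C++ parser is not reproduced here, and the function name is illustrative.

```python
import struct

PCAP_MAGIC = 0xA1B2C3D4  # little-endian files store this byte-swapped

def parse_pcap(data):
    """Parse a classic libpcap capture held in memory: a 24-byte global
    header, then 16-byte per-packet headers followed by packet bytes.
    Returns (timestamp, captured_length) pairs."""
    magic = struct.unpack("<I", data[:4])[0]
    endian = "<" if magic == PCAP_MAGIC else ">"
    if struct.unpack(endian + "I", data[:4])[0] != PCAP_MAGIC:
        raise ValueError("not a pcap file")
    packets = []
    offset = 24                      # skip the global header
    while offset + 16 <= len(data):
        ts_sec, ts_usec, incl_len, orig_len = struct.unpack(
            endian + "IIII", data[offset:offset + 16])
        packets.append((ts_sec, incl_len))
        offset += 16 + incl_len      # jump over the captured bytes
    return packets
```

Because each record header carries the captured length, the parser can skip packet payloads entirely when only summary statistics are needed, which is where most of the speedup comes from.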
6.5 Summary
This chapter covered the programming languages used (JavaScript, Python and C++), the development environment and the code conventions followed during implementation of the application. It also explained the difficulties encountered in the course of implementation, such as the speed of parsing PCAP files, and the strategies used to handle them.

CHAPTER 7
SOFTWARE TESTING FOR SCRIPT RUNNER AND PCAP ANALYZER
The aim of software testing [21] is to detect defects or errors by testing the components of programs individually. During testing, the components are combined to form a complete system. At this stage, testing is concerned with demonstrating that the system meets its functional goals and does not behave in abnormal ways. Test cases are chosen to assure that system behavior can be tested for all combinations, and the expected behavior of the system under these combinations is given. Accordingly, test cases are selected with valid inputs whose outputs are on expected lines, invalid inputs for which suitable messages must be given, and inputs that do not occur very frequently, which can be regarded as special cases. Various test strategies are used, such as unit testing, integration testing, system testing and interface testing.

In this chapter, several test cases are designed to test the behavior of all modules. When all modules are implemented completely, they are integrated and deployed on the Tomcat server. Test cases are executed in the same environment, and mainly cover the functionality of all modules. Once the application passes all the test cases, it is deployed in the production environment for actual real-time use.

7.1 Test Environment
In this system all the inputs are given through the UI, built in HTML, CSS and Grommet. The output is also viewed through the GUI dashboard built using Grommet. Integration and system testing was done on a designated test node which was set up to simulate the production environment.
7.2 Unit Testing
A unit test is the verification effort on the smallest unit of software design: the software module. Unit testing ensures that bugs can be pinpointed easily, since the code under test is a small unit. This section describes some of the unit tests run, with test case details and brief explanations.
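As an illustration of the style of such a unit test, the following sketch tests the sample "add two numbers" flow from Table 7.1. The helper function is hypothetical and inlines the bash script so the example is self-contained; the project's actual tests are not reproduced here.

```python
import subprocess
import unittest

def run_add_script(a, b):
    """Run an inlined 'add two numbers' bash script and return the sum.
    Assumes bash is available (the backend targets Linux/Mac OS)."""
    out = subprocess.run(
        ["bash", "-c", "echo $(( $1 + $2 ))", "_", str(a), str(b)],
        capture_output=True, text=True, check=True)
    return int(out.stdout)

class TestScriptManager(unittest.TestCase):
    def test_add_script(self):
        # Expected output: the sum, with no exceptions (cf. Table 7.1)
        self.assertEqual(run_add_script(2, 3), 5)
```

Running a script through the same subprocess path the application uses keeps the unit test close to the real execution environment.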

7.2.1 Unit Testing of Script Manager Module
The following tables show the test cases on which the Script Manager module was tested, with details for each test case.

Table 7.1 Script Manager add Script test
Sl No. of Test Case 1
Name of Test Case Add Script test
Feature being Tested Adding of scripts to the store
Description Adding scripts to the local file system
Sample Input A bash script to add 2 numbers, with an expect script to check the output
Expected Output Store update with no exceptions
Actual Output Store update with no exceptions
Remarks Successful function flow

Table 7.1 shows the test case details for testing the Script manager add script sub module. This test was successful. Here, the input was a simple bash script which is used to add 2 numbers. An expect script is also used to get state of the script being run.

The Script Manager module is also tested for edit script test. This is described in Table 7.2.

Table 7.2 Script Manager Edit Script test
Sl No. of Test Case 2
Name of Test Case Edit script
Feature being Tested Edit the existing bash script
Description Edit the locally saved bash script with changes
Sample Input Change the addition script to subtraction script
Expected Output Script store updated
Actual Output Script store updated
Remarks Successful function flow
Table 7.2 shows the test case details of Script manager submodule. The input to this submodule is an edit request. The submodule edits a script stored locally. This test was successful.

7.2.2 Unit Testing of Trigger Manager Configuration Module
Unit testing of the trigger manager module involves two test types, each of which is explained with a sample test case below.

Table 7.3 Trigger manager add trigger test
Sl No. of Test Case 3
Name of Test Case Trigger manager add trigger
Feature being Tested Add trigger
Description Adding trigger with its configurations locally
Sample Input Addition script with configurations and iterations and threads
Expected Output Add trigger with no exceptions
Actual Output Add trigger with no exceptions
Remarks Successful function flow
Table 7.3 shows the test case details for the Trigger manager add trigger test. The input to this sub-module is the script and the trigger configuration entity. The sub-module uses the data to add a trigger to the local directory. This test was successful.

Table 7.4 Trigger Manager delete trigger test
Sl No. of Test Case 4
Name of Test Case Trigger manager delete trigger test
Feature being Tested Deleting a trigger
Description Delete a trigger from the trigger store
Sample Input Name of trigger
Expected Output Trigger store update with no exceptions
Actual Output Trigger store update with no exceptions
Remarks Successful function flow
Table 7.4 shows the test case details for Trigger manager delete trigger submodule. The submodule takes the input (Trigger Name) and runs the service to delete a trigger from the trigger store. This test was successful.
7.2.3 Unit Testing of Trigger Operation Module
The unit testing of trigger operation module involves two test cases which test the Starting of trigger and stopping of trigger. They are described in Tables 7.5 and 7.6 respectively.

Table 7.5 Trigger operations start trigger test

Sl No. of Test Case 5
Name of Test Case Trigger start
Feature being Tested Trigger start
Description Starts a trigger and changes the state of the trigger
Sample Input Trigger name
Expected Output Starts a trigger and changes the color next to trigger in the dashboard green
Actual Output Starts a trigger and changes the color next to trigger in the dashboard green
Remarks Successful function flow
Table 7.5 shows the Start Trigger test. The input to this sub-module is the name of the trigger. Once the input is given, the module starts the trigger according to the stored configurations and changes the state of the trigger accordingly.

Table 7.6 Trigger operations Stop Trigger test

Sl No. of Test Case 6
Name of Test Case Stop trigger test
Feature being Tested Stop trigger test
Description Stops the trigger and changes the state of trigger
Sample Input Trigger name
Expected Output Stops a trigger and changes the color next to trigger in the dashboard grey
Actual Output Stops a trigger and changes the color next to trigger in the dashboard grey
Remarks Successful function flow
Table 7.6 shows the test case details for the Stop trigger test. The input given to the sub-module is the name of the trigger. Once the input is given, the module stops the trigger according to the stored configurations and changes the state of the trigger accordingly. This service functioned successfully.

7.2.4 Unit Testing of Log Master Module
The unit testing of Log Master module involves two test cases which test the get Fresh logs and Get Initial logs. They are described in Tables 7.7 and 7.8 respectively.

Table 7.7 Get Fresh Logs test

Sl No. of Test Case 7
Name of Test Case Get Fresh Logs
Feature being Tested Getting newest logs from the log file
Description Return the appended logs to the user
Sample Input Log File
Expected Output Returns the appended logs to the user
Actual Output Returns the appended logs to the user
Remarks Successful function flow
Table 7.7 shows the Get Fresh Logs test. The input to this sub-module is the log file. Once the input is given, the sub-module finds the point where the logs were last read and returns the lines from that point to the current end of the log.

Table 7.8 Get initial logs test

Sl No. of Test Case 8
Name of Test Case Get Initial logs
Feature being Tested Getting logs for the first time
Description Returns logs when accessed for the first time
Sample Input Log file
Expected Output Returns the last 30 lines of log file
Actual Output Returns the last 30 lines of log file
Remarks Successful function flow
Table 7.8 shows the test case details for Get Initial Logs test. The input given to the submodule is the Log File. Once the input is given the module returns the last 30 lines of the log file.

7.2.5 Unit Testing of PCAP Master Module
The unit testing of PCAP Master module involves two test cases which test the PCAP upload and PCAP analysis. They are described in Tables 7.9 and 7.10 respectively.

Table 7.9 PCAP uploader test

Sl No. of Test Case 9
Name of Test Case PCAP upload
Feature being Tested Uploading of PCAP
Description Uploads the PCAP to store
Sample Input PCAP path
Expected Output Stores the PCAP to the store
Actual Output Stores the PCAP to the store
Remarks Successful function flow
Table 7.9 shows the PCAP uploader test. The input to this module is the path of the PCAP on the user's device. Once the input is given, the module receives the PCAP file from the user and stores it locally.

Table 7.10 PCAP Analyzer test

Sl No. of Test Case 10
Name of Test Case PCAP analyzer test
Feature being Tested PCAP analysis
Description Analyses PCAP and returns report
Sample Input PCAP file name
Expected Output Analyses the PCAP file and then returns report and visualizations of analysed data on the dashboard
Actual Output Analyses the PCAP file and then returns report and visualizations of analysed data on the dashboard
Remarks Successful function flow
Table 7.10 shows the test case details for the PCAP Analyzer test. The input given to the sub-module is the PCAP file name in the PCAP store. Once the input is given, the module analyses the PCAP file and then returns the report and visualizations of the analysed data on the dashboard. This service functioned successfully.

7.3 Integration Testing
Integration testing is a systematic technique for constructing the program structure while at the same time conducting tests to uncover errors associated with interfacing. The objective is to take unit-tested components and build a program structure.

7.3.1 Integration Testing for Script Manager
All the sub-modules were put together and integration testing was done on the overall Script Manager module. This is shown below in Table 7.11.

Table 7.11 Integration of Script Manager module

Sl No. of Test Case 11
Name of Test Case Integration of script manager module
Feature being Tested Integration of script manager module
Description Checks working of whole script manager module
Sample Input Script changes
Expected Output Script store changed accordingly
Actual Output Script store changed accordingly
Remarks Successful function flow
Table 7.11 shows the integration testing of script manager module. Script manager module updates the script store in the remote device accordingly.

7.3.2 Integration Testing for Trigger Manager module
The sub-modules of the trigger manager were put together and integration testing was done on the overall trigger manager module. This is shown in Table 7.12.

Table 7.12 Integration of Trigger Manager module

Sl No. of Test Case 12
Name of Test Case Integration of trigger manager module
Feature being Tested Integration of trigger manager module
Description Checks working of whole trigger manager module
Sample Input Scripts and configurations
Expected Output Trigger store updated accordingly
Actual Output Trigger store updated accordingly
Remarks Successful function flow
Table 7.12 shows the integration testing of trigger manager module. Trigger management is done successfully and the trigger store is updated accordingly.

7.3.3 Integration Testing for Trigger Operations Module
The sub-modules of the trigger operations module were put together and integration testing was done on the overall trigger operations module. This is shown in Table 7.13.

Table 7.13 Integration of Trigger Operation module
Sl No. of Test Case 13
Name of Test Case Integration of trigger operations module
Feature being Tested Integration of trigger operations module
Description Checks working of whole trigger operation module
Sample Input Start/Stop trigger
Expected Output Trigger start or stop and the status of trigger updated accordingly
Actual Output Trigger start or stop and the status of trigger updated accordingly
Remarks Successful function flow
Table 7.13 shows the integration testing of the Trigger Operations module. Trigger operations such as start all triggers, stop all triggers, start a trigger and stop a trigger are done successfully, and the status of each trigger is also updated.

7.3.4 Integration Testing for Log Master Module
The sub-modules of the Log Master module were put together and integration testing was done on the overall Log Master module. This is shown in Table 7.14.

Table 7.14 Integration of log master module
Sl No. of Test Case 14
Name of Test Case Integration of log master module
Feature being Tested Integration of log master module
Description Checks working of whole log master module
Sample Input Log file
Expected Output Returns the logs to the user according to the situation
Actual Output Returns the logs to the user according to the situation
Remarks Successful function flow
Table 7.14 shows the integration testing of the log master module. It takes the log file as input and returns the logs to the user according to the situation.

7.3.5 Integration Testing for PCAP Master Module
The sub-modules of the PCAP Master module were put together and integration testing was done on the overall PCAP Master module. This is shown in Table 7.15.

Table 7.15 Integration of PCAP Master module
Sl No. of Test Case 15
Name of Test Case Integration of PCAP master module
Feature being Tested Integration of PCAP master module
Description Checks working of whole PCAP master module
Sample Input PCAP file
Expected Output PCAP is uploaded and then analysed and then the analysed data is structured and visualizations are returned to the user
Actual Output PCAP is uploaded and then analysed and then the analysed data is structured and visualizations are returned to the user
Remarks Successful function flow
Table 7.15 shows the integration testing of the PCAP Master module. The PCAP file path on the user's device is taken as input; the file is uploaded, and the uploaded file is then read, parsed and analysed. The analysed data is then structured, visualized as graphs and returned to the user.

7.3.6 Integration of Script Runner and PCAP Analyzer System
The five main modules were put together and integration testing done on the overall Script Runner and PCAP analyser system. This is shown in Table 7.16.

Table 7.16 Integration of Script Runner and PCAP Analyzer system

Sl No. of Test Case 16
Name of Test Case Integration of Script Runner and PCAP Analyzer system
Feature being Tested Integration of Script Runner and PCAP Analyzer system
Description Checks working of whole system
Sample Input Configuration parameters, scripts, generated logs and PCAP file path
Expected Output Script changes, trigger changes, trigger notification message, appended logs, analysed PCAP data
Actual Output Script changes, trigger changes, trigger notification message, appended logs, analysed PCAP data
Remarks Successful function flow
Table 7.16 shows the integration testing of the Script Runner and PCAP Analyzer system. All the module flows work simultaneously and successfully.

7.4 System Testing
System testing [21] is the testing in which all the modules verified by integration testing are combined to form a single system. The system is tested so that all the units are linked properly to satisfy the user-specific requirements. This test helps in removing the remaining bugs and improves the quality and assurance of the system. System testing confirms the proper functionality of the system as a whole.

The whole system is evaluated in system testing, with all the main modules being tested. The system testing is as shown in Table 7.17.

Table 7.17 System testing
Sl No. of Test Case 17
Name of Test Case Script Runner and PCAP Analyzer system testing
Feature being Tested Script Runner and PCAP Analyzer system testing
Description Checks working of whole system
Sample Input Configuration parameters, scripts, generated logs and PCAP file path
Expected Output Script changes, trigger changes, trigger notification message, appended logs, analysed PCAP data
Actual Output Script changes, trigger changes, trigger notification message, appended logs, analysed PCAP data
Remarks Successful function flow
Table 7.17 shows the system testing. Here all the modules are combined and tested. The system should manage scripts, and all the CRUD operations on both scripts and triggers should be possible. The system should also be able to start or stop a trigger. Logs should be displayed live to the user on the dashboard. The system should accept a PCAP file from the user and be able to parse and analyse it. On testing, the system successfully met all the requirements.

7.5 Summary
This chapter covered the general testing process, which starts with unit testing of the five modules, followed by integration testing wherein the submodules and modules are merged together. System testing, where the entire system is tested for its functionality and correctness, was then performed. Finally, the functional testing of the user interfaces was performed in the test cases related to the dashboard. The tests proved successful in most test cases, and abnormal behavior was not traced in any of the five major flows.
CHAPTER 8
EXPERIMENTAL RESULTS AND ANALYSIS OF SCRIPT RUNNER AND PCAP ANALYZER
In the analysis of a process, experiments are commonly used to evaluate the inputs to the process, which largely determine its output. They also help pinpoint the inputs needed to achieve a desired result. The output obtained from the system is compared with the expected output to verify the correctness of the system [22]. There are several metrics for comparison, and analyzing the experimental output means verifying whether these evaluation metrics are satisfied. This chapter discusses the performance characteristics of the system.

8.1 Evaluation Metrics
Evaluation metrics are the criteria for testing different algorithms. The behavior of the algorithms or techniques can be determined using these metrics, and a given technique may satisfy only some of them. In this project, the outputs obtained from the different inputs given to the system are compared with the ideal output as per requirements, to check whether the metrics are satisfied. The system was evaluated based on how many of the flows worked perfectly in a given situation and on the response time of the flows. These are suitable measures of performance, since the different flows must work individually without leaving the store in an inconsistent state, and must work with the least possible latency as per requirements.

8.2 Experimental Dataset
All the modules should work with the least amount of delay and latency possible, but since the application currently supports only a single-user environment, noticeable latency appears mainly in the Front End rendering and the PCAP Analyzer modules. The experimental data set is therefore modeled with those two modules in mind.

Table 8.1 Sample inputs to Front End render module
Input Set Number Number of triggers
1 2
2 10
Table 8.2 Sample inputs to PCAP Analyzer module
Input Set Number Size of PCAP File Number of Packets
1 18 MB ~80000
2 2 MB ~9000
The Tables 8.1 and 8.2 show sample input sets for Front End render and PCAP Analyzer module respectively. The tables give a general idea about the formats of inputs fed to the two modules.

8.3 Performance Analysis
This section explains the experimental results of this project. The system successfully executed all the required flows and rendered the data in the Front End.

8.3.1 Front End Render
The front end render is based on the REST API. Hence, the higher the number of triggers stored, the higher the render time.

Table 8.3 Performance of Front End Render
Number of triggers Response Time(seconds)
2 1
10 1.5
Table 8.3 shows the performance of the front end render flow. Initially tested with 2 triggers, the response time for the render was 1 second. When the number of triggers was scaled five-fold to 10, the response time increased to 1.5 seconds. This implies that the system is stable and scalable, since only a 50% increase in time is seen with a five-fold increase in the number of triggers.
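Response times such as those in Table 8.3 can be collected with a small timing harness; `fake_render` below is a stand-in workload, since a real measurement would time the REST round trip itself:

```python
import time

def measure(fn, *args, repeats=3):
    """Return the best-of-N wall-clock time for fn(*args), in seconds."""
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        fn(*args)
        best = min(best, time.perf_counter() - start)
    return best

def fake_render(n_triggers):
    # Stand-in workload proportional to the number of triggers;
    # a real measurement would call the dashboard's REST endpoint.
    time.sleep(0.01 * n_triggers)

print(measure(fake_render, 2))
```

Taking the best of several runs reduces the influence of one-off delays such as scheduler jitter.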

Figure 8.1 Graph for performance analysis of Front End Render
8.3.2 PCAP Analyzer
The PCAP Analyzer module takes slightly longer than the other modules, as reading, parsing and analysis of the file take place in a pipeline. This results in a slightly higher latency. Latency in this module also comes from the presentation of the analyzed data using graphs.

Table 8.4 Performance of PCAP Analyzer
Size in MB Response Time(seconds)
2 1
20 2
Table 8.4 shows the performance of the PCAP Analyzer flow. Initially tested with a 2 MB PCAP file, the response time for report generation was 1 second for the reading, parsing and overall analysis of the file.

Figure 8.2 Graph for performance analysis of PCAP analyzer
When the size of the PCAP file was scaled ten-fold, i.e. to 20 MB, the response time increased to 2 seconds. This implies that the system is stable and scalable, since only a two-fold increase in time is seen with a ten-fold increase in file size.
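The scaling claim can be restated as effective throughput using the figures from Table 8.4:

```python
# Figures from Table 8.4: (file size in MB, response time in seconds)
small = (2, 1.0)
large = (20, 2.0)

size_ratio = large[0] / small[0]   # the file grew 10x
time_ratio = large[1] / small[1]   # the time grew only 2x

throughput_small = small[0] / small[1]   # 2.0 MB/s
throughput_large = large[0] / large[1]   # 10.0 MB/s

# Sub-linear growth in response time means throughput improves with size.
assert time_ratio < size_ratio
print(throughput_small, throughput_large)  # 2.0 10.0
```

The per-file fixed costs (upload handling, report setup) dominate for small captures, which is why the larger file achieves five times the throughput.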

8.4 Summary
The results obtained consisted of all the results expected at the beginning of the project. The render worked with no errors and an acceptable response time which was stable when scaled. PCAP analyzer also worked on the same lines but with slightly higher latencies. Thus, the system is overall in an efficient and ideal functional flow.

CHAPTER 9
CONCLUSION AND FUTURE ENHANCEMENT

The project was initiated because of the need to switch Enterprise Applications from a standalone UI to a more intuitive Web UI. A script runner and PCAP analyzer application was built. The project also emphasizes easy script management for end users and easy running of scripts, with a success or failure notification and a stream of live logs.

The front end module and the PCAP Analyzer module work with acceptable latency, and their response times remain stable as the workload scales. The system is efficient, reliable, consistent, flexible and user friendly.
9.1 Limitations of the Project
Although the working system is efficient enough as per requirements, there are a few cases of usage which will lead to loopholes in the functioning. Some of these limitations of the project are:
The system functions accurately when all inputs are accurate and in the correct format, but it cannot correct data that is slightly out of format in file uploads. In case of a format variance, an exception is generated, the flow is stopped, and the task is not completed.
Another flaw is that if the error cases are not properly defined in the expect script provided by the user, and the execution of the script reaches an unexpected state unknown to the user, then the program will wait indefinitely for the script to end, and the script has to be ended abruptly by the user.
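The first limitation could be softened by validating an upload before the analysis pipeline runs; a minimal magic-number check might look like the following (the function name and placement are hypothetical):

```python
# Accepted libpcap magic numbers: little/big endian, in microsecond
# and nanosecond timestamp variants.
PCAP_MAGICS = {b"\xd4\xc3\xb2\xa1", b"\xa1\xb2\xc3\xd4",
               b"\x4d\x3c\xb2\xa1", b"\xa1\xb2\x3c\x4d"}

def validate_pcap_upload(data: bytes) -> None:
    """Raise ValueError on malformed uploads before analysis starts.

    Illustrative sketch only; the real module's validation may differ.
    """
    if len(data) < 24:
        raise ValueError("file too small to hold a PCAP global header")
    if data[:4] not in PCAP_MAGICS:
        raise ValueError("unrecognized magic number: not a PCAP file")
```

Rejecting a bad file at upload time lets the dashboard show a clear error instead of a stopped flow halfway through the pipeline.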

9.2 Future Enhancements
To overcome the limitations of the project, the following will be taken up in the future:
The log file would have a watch daemon which tracks all the changes made to the log. If no changes are seen on the log for a particular amount of time, the script is automatically closed and an error is reported.
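Such a watch daemon could be approximated by polling the log file's size and aborting when it stays unchanged past a timeout; the names below are illustrative, not the planned implementation:

```python
import os
import time

def wait_for_activity(log_path, timeout, poll_interval=0.05):
    """Return True if the log file grows before `timeout` seconds of
    inactivity elapse, False if the script appears to have stalled.

    Illustrative sketch of the proposed watch daemon.
    """
    last_size = os.path.getsize(log_path)
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if os.path.getsize(log_path) != last_size:
            return True          # activity seen: the script is alive
        time.sleep(poll_interval)
    return False                 # no change within the window: stalled
```

A runner would call this in a loop while the expect script executes and terminate the script's process when it returns False, reporting an error to the user.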

9.3 Summary
This chapter gave an overview of the entire script runner and PCAP analyzer system, briefly stated the limitations of the project, and outlined future enhancements to overcome the limitations of the system built during the course of the project.

References

[1] G. Fink, I. Flatow, "Introducing Single Page Applications", in Pro Single Page Application Development, Apress, Berkeley, CA, 2014.

[2] M. Ferrill, "Lightweight Real-Time Display Tool Using Open Source Software", International Foundation for Telemetering, International Telemetering Conference Proceedings, 2013.

[3] D. Libes, "expect: Scripts for controlling interactive processes", Computing Systems, 1991.

[4] T. E. Oliphant, "Python for Scientific Computing", Computing in Science & Engineering, Vol. 9, No. 3, May-June 2007.

[5] K. Wagh, R. Thool, "A Comparative Study of SOAP Vs REST Web Services Provisioning Techniques for Mobile Host", Journal of Information Engineering and Applications, Vol. 2, No. 5, 2012.

[6] I. Rauf, F. Siavashi, D. Truscan, I. Porres, "An Integrated Approach to Design and Validate REST Web Service Compositions", Journal of Global Research in Computer Science, Vol. 2, No. 6, 2011, pp. 118-125.

[7] A. Jain, "Data visualization with the D3.js JavaScript library", Journal of Computing Sciences in Colleges, Vol. 30, No. 2, December 2014, pp. 139-141.

[8] C. Elsasser, "Need for Speed – Python meets C/C++", Springer CCIS, Vol. 238, 2011, pp. 412-416.

[9] L. S. Maurya, G. Shankar, "Maintainability Assessment of Web Based Application", Journal of Global Research in Computer Science, Vol. 3, No. 7, 2012.

[10] N. Synytskyy, J. R. Cordy, T. R. Dean, "Resolution of Static Clones in Dynamic Web Pages", IEEE 5th International Workshop on Web Site Evolution, Amsterdam, 2003, pp. 49-58.

[11] F. Ricca, P. Tonella, I. D. Baxter, "Web Application Transformations based on Rewrite Rules", Information and Software Technology, Vol. 44, No. 13, 2002, pp. 811-825.

[12] Q. Kong, Y. Cai, Q. Zhu, "The Case Study for the Basic Information Service of Job Post Resource Based on Web Mining", International Conference on Computer Science and Service System, 2012, pp. 1-4.

[13] M. Pilgrim, "Packaging Python Libraries", in Dive Into Python 3, Apress, 2005.

[14] A. Nitze, "Evaluation of JavaScript Quality Issues and Solutions for Enterprise Application Development", in Software Quality. Software and Systems Quality in Distributed and Mobile Environments: 7th International Conference, SWQD 2015, Vienna, Austria, January 20-23, 2015.

[15] V. Subramanian, "Routing with React Router", in Pro MERN Stack, Apress, Berkeley, CA, 2007.

[16] W. Roby, X. Wu, T. Goldina, E. Joliet, L. Ly, W. Mi, C. Wang, L. Zhang, D. Ciardi, G. Dubois-Felsmann, "Firefly: Embracing Future Web Technologies", Software and Cyberinfrastructure for Astronomy, 2016.

[17] S. Ali, L. C. Briand, H. Hemmati, R. K. Panesar-Walawege, "A Systematic Review of the Application and Empirical Investigation of Search-Based Test Case Generation", IEEE Transactions on Software Engineering, Vol. 36, 2010, pp. 742-762.

[18] R. Kutera, W. Gryncewicz, "Web Oriented Architectural Styles for Integrating Service e-Marketplace Systems", Seventh International Symposium on Business Modeling and Software Design, BMSD 2017.

[19] Jia Qiaojie, Li Juanli, Wang Yuanyuan, "Design and Implementation of Remote Online Examination System Based on Integration Framework", IEEE Xplore, 2012, pp. 1-4.

[20] Mu Huaxin, Jiang Shuai, "Design Patterns in Software Development", Proceedings of the IEEE 2nd International Conference on Software Engineering and Service Sciences, 2011, pp. 15-17.

[21] R. Patton, "Software Testing", Pearson Education India, 2nd Edition, 2006.

[22] S. Hyde, T. Dunning, "The Analysis of Experimental Data: Comparing Techniques", Proceedings of the Annual Meeting of the American Political Science Association, Boston, 2008, pp. 233-242.

[23] "A design system made for React.js", 2018. [Online]. Available: http://www.grommet.io/ [Accessed: 27-May-2018].

[24] "Focus on the essential experience", 2018. [Online]. Available: https://github.com/grommet/grommet [Accessed: 27-May-2018].

Appendix
Appendix A: Screenshots

The homepage that is displayed to all users when they log in to the dashboard is shown in Figure A.1. The main modules of the Script Runner and PCAP Analyzer system operate through three individual pages accessible as tabs from the home page.

Figure A.1 Homepage of Script Runner
The Script Manager module can be accessed through the 'Compose Trigger' tab. The Trigger Manager can be accessed by clicking any of the triggers displayed on the Home Page, or from the Quick Trigger tab. The Trigger Operations module can be accessed through the Start/Stop option beside the triggers. The Log Manager module can be accessed by starting any of the triggers; it pops up at the bottom when a trigger is selected. The Network Analyzer module can be accessed through the Network Analyzer tab on the menu bar.

Figure A.2 Trigger Manager edit Trigger
The Trigger Manager module is shown in Figure A.2. When a thread is selected, it shows the thread-wise options as in Figure A.3, and when the thread is started, a stream of live logs can be seen as shown in Figure A.4.

Figure A.3 Select Thread in Trigger Manager

Figure A.4 Start thread in Trigger Manager

The trigger operations can be activated from the Start/Stop option on the Home page. A trigger being started is shown in Figure A.5.

Figure A.5 On Start a Trigger
The final module is the Network Analyzer module. The Network Analyzer loads, parses and analyzes the PCAP file and creates a report using tables, charts and graphs, as shown in Figure A.6.

Figure A.6 Network Analyzer
