# CS432 Assignment 2 Solution and Discussion

• Assignment No. 02
Semester: Fall 2019
CS432 – Network Modeling and Simulation
Total Marks: 20
Due Date: 5th December, 2019
Assignment Objectives:

• To explore the capabilities of different network topologies.
• To calculate the latency and jitter of a network.

• Your assignment must be in .doc format (other formats such as scanned images, PDF, BMP, etc. will not be accepted).
• No assignment will be accepted through email.

Rules for Marking:

It should be clear that your assignment will not get any credit if:
• The assignment is submitted after the due date.
• The submitted assignment does not open or the file is corrupted.
• Your assignment is copied from the internet, the handouts, or any other student
(strict disciplinary action will be taken in this case).

Assignment

Question No. 1 [Marks: 5]

The following network shows data transmission between a LAN1 host and a LAN2 host. A number of packets are sent from LAN1 to LAN2. You are required to calculate the latency and jitter during the data transmission. Fill in the values in the following table with complete calculations (formula and answers).
Also write down the main difference between latency and jitter.

| Packet ID | Time at Point A | Time at Point B | Latency | Jitter |
|-----------|-----------------|-----------------|---------|--------|
| 1 | TA1 = 3 sec | TB1 = 6 sec | L1 = ? | |
| 2 | TA2 = 7 sec | TB2 = 12 sec | L2 = ? | J1 = ? |
| 3 | TA3 = 15 sec | TB3 = 22 sec | L3 = ? | J2 = ? |

Question No. 2 [Marks: 5]

Suppose a university has a "Research Center" network that comprises the main departments of the university. Answer the following questions by considering the topology of the given network.

1. What is the total data rate of the network?

2. Which application is generating the maximum load per user in Management department?

3. Which application is generating the minimum load per user in Computer Science department?

4. Which department is consuming maximum network bandwidth and why?

Question No. 3 [Marks: 5]
Suppose the utilization of a system is 45%; what will be the queue depth?
Question No. 4 [Marks: 5]
What are the steps required in the step-wise approach to measure the downtime cost of a system? Discuss them.

• Q.1 Solution:

| Packet ID | Time at Point A | Time at Point B | Latency | Jitter |
|-----------|-----------------|-----------------|---------|--------|
| 1 | TA1 = 3 sec | TB1 = 6 sec | L1 = TB1 - TA1 = 6 - 3 = 3 sec | |
| 2 | TA2 = 7 sec | TB2 = 12 sec | L2 = TB2 - TA2 = 12 - 7 = 5 sec | J1 = L2 - L1 = 5 - 3 = 2 sec |
| 3 | TA3 = 15 sec | TB3 = 22 sec | L3 = TB3 - TA3 = 22 - 15 = 7 sec | J2 = L3 - L2 = 7 - 5 = 2 sec |
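The table's arithmetic can be checked with a short script (the timestamps are hard-coded from the table above):

```python
# Per-packet latency L_i = TB_i - TA_i; jitter J_i = L_(i+1) - L_i.
arrivals_a = [3, 7, 15]   # times at Point A (seconds)
arrivals_b = [6, 12, 22]  # times at Point B (seconds)

latencies = [b - a for a, b in zip(arrivals_a, arrivals_b)]
jitters = [latencies[i + 1] - latencies[i] for i in range(len(latencies) - 1)]

print(latencies)  # [3, 5, 7]
print(jitters)    # [2, 2]
```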

Latency is the amount of time it takes for a packet to travel to its destination.
Jitter, by contrast, is the variation in latency over time: a connection whose delay swings widely between successive packets has high jitter.

Q.2 Solution:

1. What is the total data rate of the network?

2. Which application is generating the maximum load per user in Management department?

3. Which application is generating the minimum load per user in Computer Science department?

4. Which department is consuming the most network bandwidth and why?
Answer: The Math department is consuming the most network bandwidth because a total of 8 applications are running in that department.

Q. 3 Suppose the utilization of a system is 45% then what will be the queue depth?

Q.3 Solution:
Queue depth = utilization / (1 - utilization)
Utilization must be expressed as a fraction, so 45% = 0.45:
= 0.45 / (1 - 0.45)
= 0.45 / 0.55
≈ 0.818
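With utilization written as a fraction rather than a percentage, the queue-depth formula can be evaluated directly:

```python
# Queue depth = rho / (1 - rho), where rho is utilization as a fraction.
# Passing 45 instead of 0.45 would give a meaningless negative result.
utilization = 0.45  # 45%
queue_depth = utilization / (1 - utilization)

print(round(queue_depth, 3))  # 0.818
```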

Question No. 4
Write down the five steps of step-wise approach to measure the downtime cost of the system.
Q.4 Solution:
Step-wise approach to measure downtime cost
ii) Define what you protect
iv) Classify outage types
v) Calculate the cost

• Typical, approximate values for latency that you might experience include:

• 800 ms for satellite
• 120 ms for 3G cellular data
• 60 ms for 4G cellular data, which is often used for 4G WAN and internet connections
• 20 ms for an MPLS network such as BT IP Connect, when using Class of Service to prioritise traffic
• 10 ms for a modern Carrier Ethernet network such as BT Ethernet Connect or BT Wholesale Ethernet in the UK


• latency

Latency is sometimes considered the time a packet takes to travel from one endpoint to another, the same as the one-way delay.

More often, latency signifies the round-trip time: the time it takes for a packet to be sent plus the time it takes for it to return. This does not include the time required to process the packet at the destination.

Network monitoring tools can determine the precise round-trip time on a given network. Round-trip time can be calculated from the source, since the source records the time the packet was sent and computes the difference upon acknowledgement of its return. One-way delay between two endpoints, however, can be difficult to determine, as the sending endpoint does not know the arrival time at the receiving endpoint.
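The sender-side measurement described above can be sketched in a few lines. Here `send_request` is a hypothetical stand-in for any call that sends a packet and blocks until the acknowledgement returns:

```python
import time

def measure_rtt(send_request):
    """Round-trip time as seen by the sender: timestamp before sending,
    timestamp after the ack returns, take the difference."""
    start = time.monotonic()  # monotonic clock is immune to wall-clock jumps
    send_request()            # blocks until the acknowledgement comes back
    return time.monotonic() - start

# Usage with a stand-in that simulates a 50 ms round trip:
rtt = measure_rtt(lambda: time.sleep(0.05))
print(f"RTT: {rtt * 1000:.1f} ms")
```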

• @zareen
Jitter Calculation
Here’s an example. We have collected 5 samples with the following latencies: 136, 184, 115, 148, 125 (in that order, in ms). The average latency is 141.6 (add them, divide by 5). The jitter is calculated by taking the difference between consecutive samples.

136 to 184, diff = 48
184 to 115, diff = 69
115 to 148, diff = 33
148 to 125, diff = 23
(Notice how we have only 4 differences for 5 samples).
The total difference is 48 + 69 + 33 + 23 = 173, so the jitter is 173 / 4, or 43.25 ms.

We use this same mechanism no matter how many samples you have - it works on 5, 50 or 5000.
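The averaging mechanism above can be sketched as a short script (the samples are the ones from the example):

```python
# Jitter as the mean absolute difference between consecutive latency samples.
samples = [136, 184, 115, 148, 125]  # latencies in ms

avg_latency = sum(samples) / len(samples)
diffs = [abs(b - a) for a, b in zip(samples, samples[1:])]  # n-1 diffs for n samples
jitter = sum(diffs) / len(diffs)

print(avg_latency)  # 141.6
print(diffs)        # [48, 69, 33, 23]
print(jitter)       # 43.25
```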

• jitter

Before we go further, let's understand jitter.
Jitter is the amount of variation in latency/response time, in milliseconds. Reliable connections consistently report back the same latency over and over again. Lots of variation (or ‘jitter’) is an indication of problems.

Jitter shows up as different symptoms depending on the application you’re using. Web browsing is fairly resistant to jitter, but any kind of streaming media (voice, video, music) is quite susceptible to it.

Jitter is a symptom of other problems. It’s an indicator that there might be something else wrong. Often, this ‘something else’ is bandwidth saturation (sometimes called congestion) - or not enough bandwidth to handle the traffic load.

• @zareen please provide the solution for this

• @Hamna-Hashmi available!

• @zareen please provide the solution for this
