
ADCI Assignment

I have stumbled upon a question where I really can't figure out how the answer came about. I will post the question and answer below.
Consider a distributed system that has the following characteristics:

* Latency per packet (local or remote, incurred on both send and receive): 5 ms
* Connection setup time (TCP only): 5 ms
* Data transfer rate: 10 Mbps
* MTU: 1000 bytes
* Server request processing time: 2 ms
Assume that the network is lightly loaded. A client sends a 200-byte request message to a service, which produces a response containing 5000 bytes. Estimate the total time to complete the request in each of the following cases, with the performance assumptions listed above:
1) Using connectionless (datagram) communication (for example, UDP);
Answer: UDP: 5 + 2000/10000 + 2 + 5 × (5 + 10000/10000) = 37.2 milliseconds
We were not given any formula, so I am having trouble figuring out what the numbers in the above calculation actually mean.
2000/10000: I think the 10000 has to come from 10 Mbps × 1000; I just don't know what the 2000 means.
5 × (5 + 10000/10000): I know this has to be multiplied by 5 because the MTU is 1000 bytes and the response is 5000 bytes, but I just don't know what the numbers inside the parentheses mean.
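For what it's worth, here is my best guess at how the terms line up, assuming the exercise counts 10 bits per byte (that is the only way I can make 200 bytes come out to 2000 bits and 1000 bytes to 10000 bits) and expresses the 10 Mbps rate as 10,000 bits per millisecond:

* 5 = latency for the single request packet (200 bytes fits in one MTU)
* 2000/10000 = 0.2 ms to transmit 200 bytes × 10 bits at 10,000 bits/ms
* 2 = server request processing time
* 5 × (5 + 10000/10000) = five MTU-sized response packets (5000/1000), each paying 5 ms latency plus 1 ms transmission time

Total: 5 + 0.2 + 2 + 30 = 37.2 ms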
Here is how I calculate the totals for UDP and TCP:

Total time (UDP) = transmission time for the request message packet + server request processing time + transmission time for the response message

Total time (TCP) = connection setup time + transmission time for the request message packet + server request processing time + transmission time for the response message
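To double-check my decomposition, here is a minimal Python sketch of it. The 10-bits-per-byte constant is my own assumption (the problem never states it), and I am assuming TCP setup simply adds a flat 5 ms on top of the UDP case:

```python
import math

# All times in milliseconds. BITS_PER_BYTE = 10 is an assumption on my
# part; with the usual 8 bits per byte the UDP total would be 36.16 ms,
# not the 37.2 ms given in the answer.
LATENCY_MS = 5             # per packet, incurred on both send and receive
SETUP_MS = 5               # TCP connection setup time
RATE_BITS_PER_MS = 10_000  # 10 Mbps = 10,000,000 bits/s = 10,000 bits/ms
MTU_BYTES = 1000
PROCESSING_MS = 2
BITS_PER_BYTE = 10         # assumed convention, not stated in the problem

def one_way_ms(message_bytes):
    """Latency for each MTU-sized packet plus time on the wire."""
    packets = math.ceil(message_bytes / MTU_BYTES)
    transmit = message_bytes * BITS_PER_BYTE / RATE_BITS_PER_MS
    return packets * LATENCY_MS + transmit

def udp_total_ms(request_bytes, response_bytes):
    return one_way_ms(request_bytes) + PROCESSING_MS + one_way_ms(response_bytes)

def tcp_total_ms(request_bytes, response_bytes):
    # Assuming connection setup adds 5 ms and nothing else.
    return SETUP_MS + udp_total_ms(request_bytes, response_bytes)

print(udp_total_ms(200, 5000))  # 37.2 -> matches the given UDP answer
print(tcp_total_ms(200, 5000))  # 42.2 under my assumptions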
```