This question is one of several on my homework that I'm attempting to figure out. However, I am NOT asking you to solve it for me; I just want to be sure I know what I'm doing.
The question in particular is this:
Assume that you are connected to the homework server through two routers. How many milliseconds will it take to send the file? Assume that each router has a processing delay of 80 µs.
It's not as simple as just 160 µs (which I would then convert to ms), right? Because you first have to turn the message on your computer into a frame and then transmit it, and at each router the packet has to go up to the Network layer so the header can be checked to decide where it goes next. Is that lookup what the processing delay covers?
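To sanity-check my own reasoning, here's a small Python sketch of the delay model I currently have in mind (ignoring propagation and queuing delay). The file size and link rate are made-up placeholders, since the real values come from earlier parts of the assignment; only the 80 µs per-router processing delay is from the question:

```python
# Placeholder values -- the real L and R come from earlier parts of the assignment.
L = 8_000_000        # file size in bits (assumed)
R = 10_000_000       # link transmission rate in bits/s (assumed)
d_proc = 80e-6       # per-router processing delay: 80 µs (given in the question)

d_trans = L / R                  # transmission delay: time to push the file onto the link
d_total = d_trans + 2 * d_proc   # two routers, each adding one processing delay

print(f"total delay: {d_total * 1e3:.3f} ms")
```

Is that the right way to combine the given processing delay with the rest, or does the question really only want the processing component?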
And once it's received by the server, it has to be passed all the way back up to the application layer, correct? How do I find the total time based on the router processing delay?
Or am I COMPLETELY overthinking this problem and it really is just 160 µs converted to milliseconds?