What is HTTP?
Hypertext Transfer Protocol (HTTP) is a network communication protocol that transfers data between an HTTP client (a mobile application or a browser) and a server. HTTP follows a client/server architecture: the client sends a request to the server, and the server processes that request and sends a response back to the client.
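As a minimal illustration of this request/response cycle, the Go sketch below sends a GET request and prints parts of the server's response. The URL https://example.com is only a placeholder for whatever endpoint you want to query.

```go
package main

import (
	"fmt"
	"io"
	"net/http"
)

func main() {
	// The client sends an HTTP GET request to the server.
	// https://example.com is a placeholder URL.
	resp, err := http.Get("https://example.com")
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	// The server's response contains a status line, headers, and a body.
	fmt.Println("Status:", resp.Status)
	fmt.Println("Content-Type:", resp.Header.Get("Content-Type"))

	body, err := io.ReadAll(resp.Body)
	if err != nil {
		panic(err)
	}
	fmt.Printf("Body length: %d bytes\n", len(body))
}
```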
How to Understand HTTP Timings?
The following is an explanation of each timing (a measurement sketch in Go follows the list):
- DNS Resolution time: DNS resolution time is the time it takes to look up the IP address for the server's domain name. If your computer or browser has cached the DNS record, or if the DNS server is nearby, this time is very low.
- Connecting time: Connection time is the time it takes for the client to establish a TCP connection with the server. You can shorten connection overhead by reusing connections with Keep-Alive.
- TLS Setup time: TLS setup time (also known as TLS handshake time) is the time it takes to establish a secure connection over TLS/SSL. Before the browser can exchange encrypted data with the server, the two sides must complete several steps: verify the server's identity, agree on cryptographic algorithms, and exchange session keys.
- Sending time: Sending time is the time it takes to send the request data, including HTTP headers and any POST body, to the server. It depends mainly on your upload bandwidth.
- Waiting time: Waiting time is the time the server takes to process the client's request before the first byte of the response arrives (the time to first byte). It includes the time the server spends querying the database and applying business logic to the request.
- Receiving time: Receiving time is the counterpart of sending time; it is the time it takes to receive the response data, including HTTP headers and body, from the server. It depends mainly on your download bandwidth and the size of the response.
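One way to see these phases for yourself is Go's net/http/httptrace package, which fires callbacks at each stage of a request. The sketch below prints rough durations for DNS resolution, connecting, TLS setup, sending, waiting, and receiving; https://example.com is again just a placeholder, and the split of phases is an approximation of the timing categories above.

```go
package main

import (
	"crypto/tls"
	"fmt"
	"io"
	"net/http"
	"net/http/httptrace"
	"time"
)

func main() {
	var dnsStart, connStart, tlsStart, gotConn, wroteReq time.Time

	trace := &httptrace.ClientTrace{
		DNSStart: func(httptrace.DNSStartInfo) { dnsStart = time.Now() },
		DNSDone: func(httptrace.DNSDoneInfo) {
			fmt.Println("DNS resolution:", time.Since(dnsStart))
		},
		ConnectStart: func(network, addr string) { connStart = time.Now() },
		ConnectDone: func(network, addr string, err error) {
			fmt.Println("Connecting:", time.Since(connStart))
		},
		TLSHandshakeStart: func() { tlsStart = time.Now() },
		TLSHandshakeDone: func(tls.ConnectionState, error) {
			fmt.Println("TLS setup:", time.Since(tlsStart))
		},
		// Sending: from the moment a connection is available until the
		// request (headers and body) has been fully written.
		GotConn: func(httptrace.GotConnInfo) { gotConn = time.Now() },
		WroteRequest: func(httptrace.WroteRequestInfo) {
			wroteReq = time.Now()
			fmt.Println("Sending:", wroteReq.Sub(gotConn))
		},
		// Waiting: from the end of the request until the first response byte.
		GotFirstResponseByte: func() {
			fmt.Println("Waiting (TTFB):", time.Since(wroteReq))
		},
	}

	req, err := http.NewRequest(http.MethodGet, "https://example.com", nil)
	if err != nil {
		panic(err)
	}
	req = req.WithContext(httptrace.WithClientTrace(req.Context(), trace))

	start := time.Now()
	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	// Receiving: time spent reading the response body
	// (the headers were already read by Do).
	recvStart := time.Now()
	if _, err := io.Copy(io.Discard, resp.Body); err != nil {
		panic(err)
	}
	fmt.Println("Receiving:", time.Since(recvStart))
	fmt.Println("Total:", time.Since(start))
}
```

Note that the split is approximate, and with connection reuse (Keep-Alive) the DNS, connect, and TLS callbacks will not fire again on subsequent requests over the same connection.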
How Do HTTP Timings Help?
HTTP timings help detect bottlenecks. For example, if a DNS lookup takes longer than you expect, the cause may be your DNS provider or its caching settings. If you see a long time to first byte, check the network latency between the endpoints and the current server load. Slow content transfer can be caused by an inefficient response body, such as sending too much data (unused JSON properties, etc.), or by a slow connection.
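For bottleneck detection, the durations measured above can be compared against expected limits. The sketch below is a simple illustration of that idea; the threshold values are made up and should be tuned to your own endpoints and network conditions.

```go
package main

import (
	"fmt"
	"time"
)

// thresholds are illustrative values only; tune them for your own
// endpoints and network conditions.
var thresholds = map[string]time.Duration{
	"DNS resolution": 100 * time.Millisecond,
	"Connecting":     200 * time.Millisecond,
	"TLS setup":      300 * time.Millisecond,
	"Sending":        100 * time.Millisecond,
	"Waiting (TTFB)": 500 * time.Millisecond,
	"Receiving":      400 * time.Millisecond,
}

// flagBottlenecks reports which measured phases exceed their threshold.
func flagBottlenecks(measured map[string]time.Duration) {
	for phase, took := range measured {
		if limit, ok := thresholds[phase]; ok && took > limit {
			fmt.Printf("%s took %v (expected under %v) - investigate\n", phase, took, limit)
		}
	}
}

func main() {
	// Example measurements, e.g. collected with the httptrace sketch above.
	flagBottlenecks(map[string]time.Duration{
		"DNS resolution": 350 * time.Millisecond, // slow DNS: check provider/caching
		"Waiting (TTFB)": 900 * time.Millisecond, // slow server: check latency and load
		"Receiving":      120 * time.Millisecond,
	})
}
```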