The way people find information on the internet has changed completely in the last two decades. Today, Google (or a search engine in general) is the entry point to almost every website. Even your frequent visitors rarely remember your domain name; they reach you through a search engine. What used to be the address bar of the web browser is now known as the omnibar, because it merges the functionality of an address bar and a search field. So when people look for some information on the internet, they rarely visit a specific website directly; they go to a search engine, which presents them with the most relevant URLs. And because internet connections are now fast and data is cheap, users open the first few links in new tabs and start reading whichever loads first. So clearly, "the fastest loading webpage wins the visitor."
No one likes a slow loading website, and search engines are no exception. Google takes the performance of a webpage very seriously when indexing it and positioning it in search results. So let us agree on one thing: every website needs to load fast. Even if your web page is currently the only source of some information, if it loads slowly, a faster-loading competitor will appear sooner than you expect.
A lot happens between the moment you press Enter in your browser's address bar (or omnibar now) and the moment the page is rendered for you to start reading. We need to understand each of these steps and optimize them one by one.
Your browser first needs to find where your website is hosted, i.e. the IP address of the server. Nothing can be downloaded before the browser has resolved your domain name to an IP address. Some DNS providers are slower than others; in general, free DNS providers are slower than their premium counterparts. So, for faster DNS resolution, consider switching to a premium DNS provider such as Amazon Route 53 or DNS Made Easy.
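You can get a rough feel for resolution time from your own machine. The sketch below times a lookup using Python's standard resolver; it resolves `localhost` (which needs no network) purely as a safe demonstration, and you would substitute your own domain to measure what visitors actually experience:

```python
import socket
import time

def dns_lookup_time(hostname):
    """Time how long resolving a hostname to an IP address takes."""
    start = time.perf_counter()
    # getaddrinfo performs the actual lookup via the OS resolver
    infos = socket.getaddrinfo(hostname, 80, proto=socket.IPPROTO_TCP)
    elapsed_ms = (time.perf_counter() - start) * 1000
    return infos[0][4][0], elapsed_ms  # first resolved address, time in ms

# "localhost" resolves locally; substitute your own domain for a real test
ip, ms = dns_lookup_time("localhost")
print(f"localhost -> {ip} in {ms:.2f} ms")
```

Note that operating systems cache DNS results, so run the measurement several times (and ideally from locations near your visitors) before drawing conclusions.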
Once the IP address is known, the browser connects to your server and requests the page and the other resources it references. The time spent connecting to the server depends on several factors besides your visitor's internet speed: mainly the geographical location of the server (relative to the visitor's location), the hosting company's infrastructure, and the resources allotted to you by the hosting company. So you should carefully pick a good hosting company and a server location close to your target audience.
When the server receives a request, it performs a number of operations to collect the requested information and format it. Common operations include authenticating the user's privileges against the requested information, reading files from the file system, and connecting to the database server to fetch data. The total time consumed depends on multiple factors. The server machine should have good resources; since many disk read/write operations are involved, a server with an SSD is a big plus, and opting for an SSD over an older HDD delivers great performance value for the extra money spent. The algorithms and processes written by the developer are just as significant at this point: poorly written code can eat up all the resources of a well-provisioned server on a very simple request. So choose your technology partner/development company wisely.
A very significant part of the total time is spent transferring the content from the server to the user's machine. This time depends on several factors apart from the visitor's internet speed and the server's geographical location. Below are the most important ways to reduce the time spent downloading information from the server to the browser.
As you probably know, the size of a file can be reduced several-fold by compressing it (as zip, or any other compression method like gz or rar for that matter). The same technique is supported by your web browser and your server: the server can compress the content with gzip before sending it, and the browser decompresses it on receipt. That reduces the total amount of data to be transferred, so the transfer speeds up significantly. Therefore, every resource on your webpage that can be compressed should be served compressed whenever the browser supports it.
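To see why this matters so much for web pages, here is a minimal sketch using Python's standard gzip module on some repetitive HTML-like text (real markup compresses similarly well, because it is full of repeated tags and attributes):

```python
import gzip

# A chunk of repetitive HTML-like text standing in for a real page
page = ("<div class='item'><span>Hello, visitor!</span></div>\n" * 200).encode("utf-8")

compressed = gzip.compress(page)

print(f"original:   {len(page)} bytes")
print(f"compressed: {len(compressed)} bytes")
print(f"saving:     {100 * (1 - len(compressed) / len(page)):.0f}%")

# The browser advertises support via the "Accept-Encoding: gzip" request
# header; the server answers with "Content-Encoding: gzip". The round trip
# below is effectively what the browser does on receipt.
assert gzip.decompress(compressed) == page
```

In practice you do not write this yourself: web servers like nginx and Apache have gzip (and, more recently, Brotli) modules that you simply enable in configuration.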
They say "a picture is worth a thousand words." True, it tells more than a thousand words, but on a webpage its load is also more than a thousand words. Generally, more than 50% of the size of a web page comes from images, and many of them can be reduced to half their size (or less) while keeping almost the same visual quality. A good web graphic designer must know which format an image should be saved in, and at what quality, to make it look beautiful while staying small. Your development company must also know the latest and most efficient image formats for reducing load. For example, the WebP format developed by Google is now widely supported by most browsers and, used correctly, can reduce image sizes by a significant factor. But it is not supported by all browsers: some support it with a fallback URL, while Microsoft Edge with Application Guard does not support the fallback URL either (at the time of writing). So your developer must know the workarounds for adopting such new formats with proper browser fallbacks, and the ways to use them in CSS.
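For completeness, the standard fallback pattern in HTML is the `<picture>` element (the file names below are hypothetical):

```html
<!-- The browser uses the first <source> whose type it supports; browsers
     that do not understand <picture> at all render the plain <img>. -->
<picture>
  <source srcset="photo.webp" type="image/webp">
  <img src="photo.jpg" alt="Product photo">
</picture>
```

This is only the markup side; as noted above, a few environments ignore even this fallback, which is exactly why an experienced developer needs to test the workarounds.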
More requests to the server make the page load slower, even if the total content size is the same. The total number of requests for a web page can be significantly reduced by your developer by merging multiple CSS files into one, merging multiple JS files into one, using image sprites, and so on. That adds great value to the total loading speed of your webpage.
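Conceptually, merging stylesheets is nothing more than concatenation in a controlled order. The sketch below (with hypothetical file names) shows the idea; in practice a build tool such as webpack, esbuild, or Sass does this for you, along with minification:

```python
import tempfile
from pathlib import Path

def bundle_css(sources, out_path):
    """Concatenate several CSS files into one to cut the request count.

    Order matters: later rules override earlier ones of equal specificity,
    so list the files in the same order they were linked in the HTML.
    """
    merged = "\n".join(
        f"/* --- {Path(src).name} --- */\n{Path(src).read_text(encoding='utf-8')}"
        for src in sources
    )
    Path(out_path).write_text(merged, encoding="utf-8")
    return merged

# Demo with two tiny stylesheets in a temp directory (hypothetical names)
tmp = Path(tempfile.mkdtemp())
(tmp / "site.css").write_text("body { margin: 0; }\n")
(tmp / "page.css").write_text("h1 { color: navy; }\n")

merged = bundle_css([tmp / "site.css", tmp / "page.css"], tmp / "all.css")
print(merged)
```

One request for `all.css` now replaces two, at the same total byte count.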
Caching is a great tool to improve the performance of a website, and it exists on both the server side and the client side. A server-side cache stores the output of time-consuming processing so that on the next request the server does not repeat the heavy work but serves the response straight from the cache. A client-side cache stores downloaded content in your visitor's browser, so that when they open the same page (or another page that needs the same resources) next time, the browser does not download those resources again but reuses the stored copies. So static resources of the website should be served with a good client-side cache policy, and, depending on the application logic and process, a good server-side cache saves both time and resources, resulting in lower hosting costs and a faster website. But your developer must know how to make the browser load updated resources instead of cached ones when they change, and when to invalidate and regenerate the server-side cache. Overall, good caching policies on both the client side and the server side can make your website perform much faster while using fewer resources.
There are several tools that can check your website's performance and compare it with other websites, and they also give you suggestions to improve the speed. These tools cannot check or advise on server-side improvements such as server-side caching or efficient programming practices, but they give a lot of client-side suggestions. A few common tools where you can check your website's performance are Pingdom.com, Google PageSpeed Insights, and Google Chrome Lighthouse.
Please remember that the recommendations given by these free online tools are just that: recommendations generated by software algorithms, and not necessarily correct and/or achievable. As an example, let's walk through one such recommendation. (We consider only CSS here, but the same applies to JS and other static resources.)
On most pages we load CSS in two requests: one file common to the entire website (all the pages of that website) and one required only by that specific page.
Now, following the recommendation to reduce the number of requests, the tools would suggest merging these two requests into one (even though each of the two files is already the result of merging several smaller files).
But in that case, the one big CSS file would be different for each page (because the page-specific CSS differs from page to page), and client-side caching would be useless unless the visitor comes back to the very same page again and again.
So, the conclusion is: the recommendations from these free tools are good hints, but it is up to your developer to understand them and decide what is better for overall performance.