A search engine works in five main steps: 1. Discovery, 2. Crawling, 3. Indexing, 4. Relevancy, 5. Ranking.
When a new page is published on a website, search engine robots must first discover it. Website owners provide a sitemap so that search engines can find new pages on the site more easily.
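As an illustration, a sitemap is simply an XML file listing the URLs the owner wants discovered. The sketch below builds a minimal one with Python's standard library; the page URLs are hypothetical:

```python
import xml.etree.ElementTree as ET

# Hypothetical page URLs the site owner wants search engines to discover.
PAGES = [
    "https://example.com/",
    "https://example.com/blog/new-post",
]

def build_sitemap(urls):
    """Build a minimal sitemap.xml document from a list of page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap(PAGES)
print(sitemap)
```

A real sitemap usually also carries per-URL metadata such as the last modification date, but the `<loc>` entries above are the part that helps discovery.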
Crawling is the process of collecting all the linked web pages on a website using software called "spiders" or "crawlers." Googlebot, for example, crawls the content of each page, including its keywords and title tags.
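A toy sketch of the link-extraction part of crawling, using only Python's standard library (the HTML and URLs are made up; a real crawler would fetch pages over HTTP and follow each discovered link in turn):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collect the href targets of <a> tags, as a crawler would."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the page's own URL.
                    self.links.append(urljoin(self.base_url, value))

# A made-up page; every extracted link becomes a candidate for crawling next.
html = '<title>Home</title><a href="/about">About</a><a href="https://example.com/blog">Blog</a>'
parser = LinkExtractor("https://example.com/")
parser.feed(html)
print(parser.links)
```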
Once web pages are collected, they are thoroughly analyzed and stored in a database. This is referred to as "web indexing."
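The core data structure behind indexing is an inverted index: a map from each word to the pages that contain it. A minimal sketch, with made-up pages standing in for a real database:

```python
from collections import defaultdict

def build_index(pages):
    """Map each word to the set of page URLs containing it (an inverted index)."""
    index = defaultdict(set)
    for url, text in pages.items():
        for word in text.lower().split():
            index[word].add(url)
    return index

# Hypothetical crawled pages: URL -> page text.
pages = {
    "https://example.com/a": "search engines crawl the web",
    "https://example.com/b": "indexing makes search fast",
}
index = build_index(pages)
print(sorted(index["search"]))  # pages that mention "search"
```

With this structure, answering a query means looking up its terms rather than re-reading every page, which is what makes large-scale search fast.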
To answer a particular search, the engine first checks the relevancy of the pages it has indexed to find out which ones contain data related to the query.
Finally, the search engine ranks the relevant pages and displays the best results for the user's query; the page it judges to have the best information is placed at the top.
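The last two steps can be sketched together: score each page by how many query-term occurrences it contains (relevancy), then sort so the best match comes first (ranking). The pages and the scoring scheme here are illustrative only; real search engines use far more signals than term counts:

```python
def rank(query, pages):
    """Score each page by query-term overlap and return pages best-first."""
    terms = set(query.lower().split())
    scored = []
    for url, text in pages.items():
        words = text.lower().split()
        # Relevancy: count occurrences of any query term on the page.
        score = sum(1 for w in words if w in terms)
        if score > 0:  # keep only relevant pages
            scored.append((score, url))
    # Ranking: highest score first; that page is shown at the top.
    scored.sort(reverse=True)
    return [url for score, url in scored]

# Hypothetical indexed pages: URL -> page text.
pages = {
    "https://example.com/a": "how search engines rank search results",
    "https://example.com/b": "a page about cooking pasta",
    "https://example.com/c": "search basics",
}
print(rank("search rank", pages))
```

Note how the irrelevant cooking page is filtered out at the relevancy stage, while the two relevant pages are ordered by score at the ranking stage.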