"Search websites? Really? Are you kidding?" Searching for information on the Web has become a no-brainer. Just type your query into the search engine of your choice! Well, the choice might actually be your browser's, but even then, hardly anybody questions the procedure nowadays. Or thinks about it. Strangely enough, not many seem to question the results they get either. We are pretty sure that these results are expertly fetched and ranked, with the best matches on top of the list. Sort of, in Google we trust. Amen?
When we perform a web search, we rarely think about how the result list gets compiled. This knowledge, however, can help us understand how search engines tick and how we can use them to find what we are searching for.
To be included in a search engine's database, a web page needs to be indexed by a program usually called a bot, short for robot. The bot scans the content of the page just as it is found on the Web and tries to make out what it is about.
Since bots are not human, they can only rely on what's on the surface when analyzing a page. They look for words and word combinations, count their appearances and evaluate other factors. For example, words in headlines, set in bold font or emphasized otherwise, are considered more important. The same goes for special on-page areas like title and description.
As a result of this scan, the page gets included in the search engine's index under certain word combinations from its content. If someone then searches websites for those exact words, the engine may consider including the page in its result list.
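The weighting described above can be pictured as a toy indexer. Everything here is illustrative: the area names, the weights, and the `index_page` helper are my own assumptions for the sketch, not how any real engine works.

```python
import re
from collections import Counter

# Hypothetical weights: words in prominent areas count for more.
AREA_WEIGHTS = {"title": 3.0, "heading": 2.0, "bold": 1.5, "body": 1.0}

def index_page(areas: dict) -> Counter:
    """Score each word by its occurrences, weighted by where on the page it appears.

    `areas` maps an on-page area name ("title", "body", ...) to its text.
    """
    scores = Counter()
    for area, text in areas.items():
        weight = AREA_WEIGHTS.get(area, 1.0)
        for word in re.findall(r"[a-z0-9]+", text.lower()):
            scores[word] += weight
    return scores

page = {
    "title": "Search websites",
    "heading": "How search engines index pages",
    "body": "A bot scans the page and counts words.",
}
print(index_page(page).most_common(3))
```

Here "search" outscores every body-only word because it appears in both the title and a heading, which is roughly the effect headlines and emphasis have on a real index entry.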
The position at which the page is presented in this list is a mystery. Pages from larger and older websites with many inbound links tend to rank higher, but there is no guarantee. The pursuit of a higher ranking with search engines gave birth to a whole industry known as search engine "optimization".
"Manipulation" would have been a more proper term, even if it carries some negative connotations. For my part, I don't consider a reasonable human reaction to an arbitrary piece of software unethical.
Google Search Console doesn't even list queries my pages get impressions for, deeming them too obscure and personal. Last month I got clicks from search pages for "best time's to travel to antarctica" and "snæfellsnes peninsula day trip reykavik gray line 18000 isk". Repeating them here may give engines a good reason to include this very page next time somebody tries to search websites for the same.
As Google keeps track of both queries you make and pages you visit afterwards, your search results might very well be personalized according to your browsing preferences.
When you search websites, at least some of the results will be in list form. From 7 Deadly Sins to Fifty Shades of Grey, countable sells. Here's my take on it.
Not all queries are the same.
“In order to ask a question you must already know most of the answer.”
Queries for "brexit leave" and "brexit remain" will probably reveal different results about, not of, the recent British referendum.
Lists are a huge phenomenon of our time. Just have a quick look at the Internet, including this page.
I find it fascinating how modern search engines are able to compose a top 10 list out of the 75,000,000 or so hits an average query seems to return, all in a mere fraction of a second.
Occasionally, I am really surprised how spot-on the very top link on the first result page is. For the rest of the time, I am amused and/or mildly irritated by how irrelevant all results from the first page look to me.
An online article I read recently makes a similar point by stating that we are happy with the Top 10 presented to us "largely because we're unaware of the better results that are often buried out of sight."
“...lists are a form of cultural hysteria...”
Google developers often stress that the first page of results is composed to reflect as many search intents as possible. It can mean that only 2 out of 10 presented links fit yours.
Don't limit yourself – scroll beyond the first page.
You can challenge the results found by one search engine by asking several others the same question. The more often a specific page comes up in the result lists, the more relevant it may prove for your search intent.
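This cross-checking can be mechanized. A minimal sketch, assuming you have already fetched one list of result URLs per engine; the `cross_engine_rank` helper and the example domains are hypothetical:

```python
from collections import Counter

def cross_engine_rank(result_lists):
    """Rank URLs by how many engines returned them.

    `result_lists` holds one list of result URLs per engine;
    pages that appear in more lists rank higher.
    """
    votes = Counter()
    for results in result_lists:
        votes.update(set(results))  # one vote per engine, even if listed twice
    return votes.most_common()

# Hypothetical result lists from three different engines:
engines = [
    ["a.example", "b.example", "c.example"],
    ["b.example", "d.example"],
    ["b.example", "a.example"],
]
print(cross_engine_rank(engines))
```

In this toy data, `b.example` comes out on top because all three engines agree on it, which is exactly the signal the tip above relies on.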
If you are unhappy with the websites you find the first time around, modify your query and try anew. Repeat until you succeed.
If in despair, ask a real person working in the field you are researching. Chances are, a kindred spirit has a better insight into and understanding of your problem than a one-size-fits-all algorithm.
“Google can bring you back 100,000 answers. A librarian can bring you back the right one.”
I hope I'll live long enough to see someone push Google out of relevance. It's not as foolish an idea as it might seem. It has happened before, and it happens every day (Internet Explorer, anyone?).
"Someone" could just as well be something; however, I would prefer "it" to be a person rather than the European Commission (for example).
How about a public rating system for webpages? Visitors would rate them for content keywords, awarding thumbs-up or -down, or everything in-between.
Use the votes to search websites by keywords. Pages with the most ups should rank higher – simple.
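A minimal sketch of that voting scheme, with hypothetical names and data throughout. A real system would need persistence, identity, and spam defenses, but the core is just a tally per keyword:

```python
from collections import defaultdict

# votes[keyword][url] = net thumbs (ups minus downs); all data here is made up
votes = defaultdict(lambda: defaultdict(int))

def rate(keyword, url, thumbs_up):
    """Record one visitor's thumbs-up or thumbs-down for a page under a keyword."""
    votes[keyword][url] += 1 if thumbs_up else -1

def search(keyword, limit=10):
    """Return URLs rated under a keyword, best net score first."""
    ranked = sorted(votes[keyword].items(), key=lambda kv: kv[1], reverse=True)
    return [url for url, _ in ranked[:limit]]

rate("antarctica", "travel.example/antarctica", True)
rate("antarctica", "travel.example/antarctica", True)
rate("antarctica", "spam.example", False)
print(search("antarctica"))
```

Pages with the most ups float to the top; a downvoted page sinks without needing any content analysis at all.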
This can be easily implemented with a browser add-on. Something similar to WOT (Web of Trust), maybe even the same platform?
Just one thing, keep the bots out – add a captcha. And don't forget you read it here first!
This is a short introduction to what shall become my guide to some remarkable websites I consider worth visiting. Most of these websites are hard to search for and even harder to find. They are my private "search websites" results.
Consider the links below as the beginning.
Further publications in this section will first be made available to those who subscribe to the newsletter. (Hint: The subscription form is at the top of the screen.)
Tags: #humansarebetter #webwatch #caughtontheweb
Is it useful 👍? Awful 👎? Leave a message! Your comments help make this site better (and give me a kick—one way or another).