When we consume information through a search engine, we tend to select the top results most of the time. Search engine rankings, however, can be skewed by advertising revenue and by click bots that artificially inflate popularity. This project aims to empirically evaluate the fairness of search engines by submitting a common set of keywords to multiple search engines. In particular, the study will vary the user's geographic location using VPN services to observe how results are prioritized and presented to users.
You are expected to focus on the following tasks:
- Survey the literature on search engine fairness
- Design an experiment for gathering data from multiple search engines using multiple keywords
- Compare the business standings of companies appearing on the first page of results with those of companies appearing on subsequent pages
- Investigate the need for a decentralized and transparent search engine
- Propose a design for a decentralized search engine that enhances fairness
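As a rough illustration of the data-gathering and comparison steps above, the sketch below collects ranked results for each (engine, keyword, location) triple and compares rankings across locations with a simple top-k overlap score. All names here are assumptions: `fetch` stands in for whatever VPN-routed query mechanism the experiment adopts, and the stub data is purely illustrative.

```python
# Minimal sketch of the experiment, assuming a caller-supplied `fetch`
# function (e.g. a scraper routed through a VPN endpoint) that returns
# an ordered list of result domains for a given engine/keyword/location.

from itertools import product

def collect_results(fetch, engines, keywords, locations):
    """Build rows of (engine, keyword, location, rank, domain)."""
    rows = []
    for engine, kw, loc in product(engines, keywords, locations):
        for rank, domain in enumerate(fetch(engine, kw, loc), start=1):
            rows.append((engine, kw, loc, rank, domain))
    return rows

def top_k_overlap(list_a, list_b, k=10):
    """Fraction of domains shared by the top-k of two ranked result lists."""
    return len(set(list_a[:k]) & set(list_b[:k])) / k

# Usage with a stubbed fetcher standing in for real VPN-routed queries:
stub = {
    ("E1", "loans", "US"): ["a.com", "b.com", "c.com"],
    ("E1", "loans", "IN"): ["b.com", "d.com", "a.com"],
}
rows = collect_results(lambda e, q, l: stub[(e, q, l)],
                       ["E1"], ["loans"], ["US", "IN"])
overlap = top_k_overlap(stub[("E1", "loans", "US")],
                        stub[("E1", "loans", "IN")], k=3)
```

A tabular layout like this makes it straightforward to load the collected data into standard analysis tools and to add further comparison metrics (e.g. rank correlation) later.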
We plan to write a research article describing the findings of this work. If the research activities are carried out diligently, the project offers substantial scope for follow-on research.
Skills and experience
- Willingness to work on new areas and challenging problems
- Desire to read papers, develop software designs, and write software
Contact the supervisor for more information.