Blockchain Search Engines: Can Decentralization Defeat Google?

Maria Lobanova

Journalist, crypto investor

Searching for information on the World Wide Web has become routine for billions of users. Over the years, search engines have become so firmly entrenched in our lives that they wield unprecedented power over what we read, the information we consume, and the services we use. The ranking algorithms that generate search results, originally designed for the user's convenience, have turned into instruments of influence over public consciousness. Rather than providing impartial access to data, search engines now shape the information agenda, contributing to the polarization of society.

Though few people realize it, the Internet is gradually shifting from Web 2.0, built on the concept of social interactivity where users act as content creators, to Web 3.0, in which data is no longer stored exclusively on centralized servers. With the advent of a decentralized Internet, the search engines we use today will be replaced by new ones, free of Google's opaque algorithms and propensity for censorship. Indeed, the first decentralized search engines are already appearing on the market.

How Google went from revolutionary to dictator

In 1996, two Stanford computer science graduate students, Larry Page and Sergey Brin, started the BackRub research project, which aimed to create a tool for finding information on their university's home page. In 1998, they registered their solution under the Google name. In mid-2005, a year after a surprisingly successful IPO, Google's market capitalization hit a record $52 billion. Fifteen years later, Google has become the world's leading search engine, dominating almost every category: the most used on desktops and mobile devices, as well as the most popular search engine among advertisers. Today, Google controls the dissemination of almost everything, from product descriptions in online stores to local news stories. Meanwhile, criticism of Google and of the leaders in local search markets, such as Yandex in the Russian-speaking segment or Bing in China, grows every year.

The past decade has been marked by numerous high-profile investigations, probes, and lawsuits against major players in the search engine market. Google has been caught creating an information trap, or “filter bubble,” in which users receive information consistent with their previous search queries. The tech giant has also been found guilty of manipulating search results to promote its own Google Shopping service. Additionally, computer scientists have uncovered structural biases against women and ethnic minorities in Google's search algorithm and language models, as well as biases against small local media, whose articles are relegated to the last lines of Google News search results.

Among the latest high-profile scandals is an antitrust lawsuit brought against Google by a group of US attorneys general representing 17 states. The complainants' claims reveal startling details about Google's advertising business. In addition to imposing blatantly oppressive commission regimes on advertisers, accounting for 22-42% of ad spend, Google has been accused of:

  • manipulating advertising prices with the help of bogus buyers and sellers, run by an entire internal department called gTrade
  • colluding with Facebook
  • deliberately slowing down the loading of pages that do not use AMP (Accelerated Mobile Pages) technology in order to lock ad traffic into Google
  • violating the privacy rights of over 750 million Android users
  • lobbying against legislative initiatives to protect privacy

and much more…

The 173-page complaint prepared by the group of attorneys general examines in detail how the world's leading search engine compels Internet advertisers to use its platform. One description of an internal corporate exchange between Google executives in 2016 can be considered emblematic: participants note that Google earns “LOTS OF MONEY” from ad commissions, while admitting that the company operates the way it does, and this is a quote, because “we can afford it.”

Now that Google has secured its position as the world's dominant search engine, monopolizing over 90% of the global market, the devastating consequences of this state of affairs are being discussed not only by IT specialists and marketers but also by national regulators. “Google's dominant position and its trade agreements in the market create barriers for new or competing businesses, denying them access to consumers. This suppresses innovation and deprives consumers of choice, while the quality of service from the most dominant company is declining,” Rod Sims, chairman of the Australian Competition and Consumer Commission, recently noted.

Decentralization instead of dictatorship

Sometimes, to get out of a difficult situation, you have to hit rock bottom, and that could not be more true of today's search engine market. Despite Google's undisputed position, the balance of power could shift dramatically over the next 5-10 years. This will not happen because of pressure from global regulators on Alphabet, nor as a result of numerous fines or attempts to force the company to adopt more open practices. The reason will be the natural evolution of the World Wide Web itself: the transition from Web 2.0 to Web 3.0 that has been talked about for 20 years.

Internet search today is designed for client-server architectures and works with protocols such as TCP/IP, DNS, URL, and HTTP(S). When a query is entered in the search bar, the search engine generates a list of hyperlinks to third-party sites where the relevant content is located. The user clicks on one of them, and the browser directs them, via an IP address, to the specific server on the network where the content is stored. So, in Web 2.0, content depends on the server, on the physical location where it is stored. If that server is destroyed, the content is destroyed with it. The content may also be surreptitiously edited, since the URL pointing to it remains the same. Web content today is extremely vulnerable: it can be manipulated, deleted, or blocked at any time.
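
To see why location addressing is fragile, consider a minimal Python sketch (the URL is just a stand-in): the address names a server, not the bytes, so the same link can serve different content tomorrow.

```python
import hashlib
from urllib.request import urlopen

# In Web 2.0, a link identifies a server location, not the content.
# The URL below is a stand-in; any page behaves the same way.
URL = "https://example.com/"

body = urlopen(URL).read()
print("address:            ", URL)
print("content fingerprint:", hashlib.sha256(body).hexdigest())

# If the operator silently edits or removes the page, the address
# stays identical while the bytes behind it change or vanish;
# nothing in the link itself reveals the substitution.
```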

Web 3.0 works in a completely different way. Content is addressed by the hash of the content itself. Once a piece of content is located through its hash and downloaded, the user becomes one of its distribution points, much as torrent networks work. This approach effectively eliminates the threat of content being destroyed or manipulated, because if the content changes, its hash changes too. With a decentralized web, the architecture of the Internet itself will change: there will be no more sites as they are understood today. All content will be stored in a peer-to-peer (P2P) network, where it can be found without reference to a specific server location. The problem of “broken” links will disappear over time, as links to the original content will remain valid forever.
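
Here is a minimal sketch of content addressing, simplified to a bare SHA-256 digest (systems like IPFS wrap the digest in a multihash/CID, but the principle is the same): the address is derived from the bytes themselves, so an edit cannot hide behind an old link.

```python
import hashlib

def content_address(data: bytes) -> str:
    # Simplified content addressing: the address IS the hash.
    return hashlib.sha256(data).hexdigest()

original = b"Decentralized web content"
tampered = b"Decentralized web content (edited)"

print(content_address(original))  # the permanent link to these bytes
print(content_address(tampered))  # any edit produces a new address

# Retrieval inverts the scheme: ask the P2P network "who has this
# hash?", verify the received bytes by re-hashing them, and then
# serve the content onward to other peers, as torrent networks do.
```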

The new architecture of the World Wide Web will require new search engines. They will no longer be able to hide their indexing algorithms, as Google does today. There will be no need for crawlers that track changes to site content, no risk of censorship, and no risk of having private information stolen.

Visually, results in a decentralized search engine will differ little from those in the usual centralized format, but they offer several key advantages:

  • Search results will include the desired content itself, which can be read or viewed directly on the results page without navigating to another site.
  • Buttons for interacting with applications on any blockchain, and for making payments to online stores, can be integrated directly into search snippets.
  • In the content-oriented Web 3.0, search engines lose their supreme power over search results, which are instead generated by participants in peer-to-peer networks, whose preferences determine the ranking of cyberlinks.

Transparent ranking

The main obstacle when developing a search engine is designing a content ranking system. Google's own history began with the creation of the PageRank algorithm in the mid-1990s, but ranking systems designed for Web 2.0 are not suited to the content-oriented Web 3.0.
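
For context, here is a toy power-iteration sketch of the PageRank idea: a page's rank is the long-run probability that a “random surfer” following links lands on it. This is illustrative only; production ranking at Google has long been far more complex.

```python
def pagerank(links: dict[str, list[str]], damping: float = 0.85,
             iterations: int = 50) -> dict[str, float]:
    """Toy PageRank by power iteration over an out-link map."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outs in links.items():
            if not outs:                  # dangling page: spread evenly
                for p in pages:
                    new[p] += damping * rank[page] / n
            else:
                for out in outs:
                    new[out] += damping * rank[page] / len(outs)
        rank = new
    return rank

graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
print(pagerank(graph))  # "c" collects the most link weight
```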

Finding a mechanism to rank links in Web 3.0 is no trivial task. Having moved away from centralized indexing, developers must decide how to place the right to evaluate content in the hands of users, while also preventing malicious manipulation. Cyber, a decentralized search engine project, has come up with such a ranking solution for Web 3.0.

Cyber's ranking mechanics apply the principles of tokenomics, based on the idea that network participants themselves should have an interest in forming a knowledge graph that generates intelligent search results over the long term. Users will need V tokens (volts) to index content and A tokens (amps) to rank it. To receive V and A tokens, network participants must hold H (hydrogen) tokens in their wallets for a period of time. H, in turn, is produced by liquid staking of the main network token (BOOT for Bostrom and CYB for Cyber). Thus, Cyber users will be able to access the resources of the knowledge graph with a network token while receiving staking income similar to that of Polkadot, Cosmos, or Solana.
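
As a rough illustration of how these resource tokens could gate participation, here is a toy Python model of the flow just described. The class, the rates, and the 1:1 minting ratio are invented for the sketch and are not Cyber's actual protocol parameters.

```python
from dataclasses import dataclass

@dataclass
class Account:
    # Toy model of Cyber-style resource tokens; every rate and the
    # 1:1 minting ratio below are invented for illustration.
    boot_staked: float = 0.0   # liquid-staked base token (BOOT/CYB)
    hydrogen: float = 0.0      # H, minted against the stake
    volts: float = 0.0         # V, spent to index content
    amps: float = 0.0          # A, spent to rank content

    def stake(self, amount: float) -> None:
        """Liquid staking: H is minted against the staked base token."""
        self.boot_staked += amount
        self.hydrogen += amount          # assumed 1:1 for the sketch

    def hold(self, periods: int) -> None:
        """Holding H over time yields V and A (made-up 1% rate)."""
        self.volts += 0.01 * self.hydrogen * periods
        self.amps += 0.01 * self.hydrogen * periods

    def index(self, content_hash: str) -> None:
        """Spend one V to add content to the knowledge graph."""
        if self.volts < 1:
            raise RuntimeError("not enough V to index content")
        self.volts -= 1
        print(f"indexed {content_hash}")

acct = Account()
acct.stake(100)        # stake BOOT, receive H
acct.hold(5)           # hold H for 5 periods: earn 5 V and 5 A
acct.index("Qm...")    # spend 1 V to index a content hash
```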

The rank of the cyberlinks associated with an account depends on the number of tokens it holds. But if tokens have such an impact on ranking, isn't there a risk that results will be manipulated? To mitigate this, at the start of the project 70% of the tokens will be distributed to users of Ethereum and its applications, as well as to users of the Cosmos network. The airdrop will be carried out on the basis of an in-depth analysis of activity on those networks, so the bulk of the stake will go to users who have already demonstrated their ability to contribute value.
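
For intuition, here is a toy sketch of token-weighted ranking: each cyberlink's weight is the share of total stake held by the accounts that submitted it. This is a deliberate simplification, not Cyber's published ranking formula.

```python
from collections import defaultdict

def rank_cyberlinks(links: list[tuple[str, str]],
                    stake: dict[str, float]) -> dict[str, float]:
    """Weight each cyberlink by the stake of the accounts that
    submitted it (an illustrative simplification)."""
    weights = defaultdict(float)
    for account, link in links:
        weights[link] += stake.get(account, 0.0)
    total = sum(stake.values()) or 1.0
    return {link: w / total for link, w in weights.items()}

stake = {"alice": 70.0, "bob": 20.0, "carol": 10.0}
links = [("alice", "hashA->hashB"), ("bob", "hashA->hashB"),
         ("carol", "hashC->hashD")]
print(rank_cyberlinks(links, stake))
# {'hashA->hashB': 0.9, 'hashC->hashD': 0.1}
```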
