Researchers have developed new software that can repair roughly 90% of broken links on the internet. Everyone knows the frustration of following a link to an interesting website, only to discover that the page is no longer available and land on an error page instead.

The problem is even more serious in science, healthcare, industry, and other fields, where machines communicate with one another and discover that the resource an identifier points to has gone missing.

This can cause real problems when a computer is processing large amounts of financial or scientific data.

If the resource is still on a server somewhere, it can be retrieved by an algorithm that recreates the missing link.

This process has two limitations. First, it homes in on a single point of failure when there might be wider issues across a database. Second, it relies on knowledge of the destination data source.

The proposed algorithm exploits the fact that entities preserve their structure even after they move to a different location. It therefore creates a distinctive graph for each entity.
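As a rough illustration, the sketch below builds such a per-entity graph from RDF-style triples. The triple data, the ex: prefixes, and the build_entity_graph helper are hypothetical examples, not the researchers' exact construction.

```python
from collections import defaultdict

def build_entity_graph(triples, entity):
    """Collect the edges that describe one entity in an RDF-style triple store."""
    graph = defaultdict(set)
    for subject, predicate, obj in triples:
        if subject == entity:
            graph[predicate].add(("out", obj))      # entity --predicate--> obj
        elif obj == entity:
            graph[predicate].add(("in", subject))   # subject --predicate--> entity
    return graph

# Example: a tiny, made-up dataset describing one person entity.
triples = [
    ("ex:Alice", "ex:bornIn", "ex:Paris"),
    ("ex:Alice", "ex:worksFor", "ex:AcmeCorp"),
    ("ex:Bob",   "ex:knows",   "ex:Alice"),
]
print(dict(build_entity_graph(triples, "ex:Alice")))
```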

When a broken link is detected, the algorithm sets out to find the new link for the detached entity.
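A link check of this kind might look like the following sketch, which assumes links are plain HTTP(S) URIs and treats a 4xx/5xx response or an unreachable host as a broken link. The is_broken helper is an illustrative assumption, not the researchers' actual detection step.

```python
import urllib.request
import urllib.error

def is_broken(url, timeout=5):
    """Return True if the URL no longer resolves to a live resource."""
    request = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(request, timeout=timeout) as response:
            return response.getcode() >= 400
    except urllib.error.URLError:
        # 4xx/5xx responses and unreachable hosts both surface as URLError here.
        return True
```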

To this end, the crawler controller module searches for the superiors of each entity in the inferior data set and vice versa. After a few steps the search space is narrowed and the best candidate is chosen.
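The sketch below illustrates the general idea of narrowing and ranking candidates, reusing the entity graphs from the earlier example. Matching by simple edge overlap is an assumption made for illustration; the researchers' traversal of superior and inferior relations and their scoring function are not reproduced here.

```python
def signature(graph):
    """Flatten an entity graph into a comparable set of (predicate, direction, node) edges."""
    return {(predicate, direction, node)
            for predicate, edges in graph.items()
            for direction, node in edges}

def best_candidate(broken_graph, candidate_graphs):
    """Pick the candidate whose structure overlaps most with the detached entity's graph."""
    target = signature(broken_graph)
    scores = {entity: len(target & signature(graph))
              for entity, graph in candidate_graphs.items()}
    # Discard candidates that share no edges, then keep the best of the rest.
    scores = {entity: score for entity, score in scores.items() if score > 0}
    return max(scores, key=scores.get) if scores else None
```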

Researchers tested the algorithm on two snapshots of DBpedia, which together contain almost 300,000 person entities. The algorithm identified almost 5,000 entries that changed between the first and second snapshot, and it relocated 9 out of 10 of the broken links.
