Snowden Used a Common Web Crawler Tool to Collect NSA Files

Edward Snowden used ordinary, widely available software to gain access to at least 1.7 million secret files, The New York Times reported, quoting senior intelligence officials investigating the breach.


The New York Times on Sunday quoted a senior US intelligence official as saying: “We do not believe this was an individual sitting at a machine and downloading this much material in sequence,” adding that the process was “quite automated.”

Snowden reportedly configured the software with parameters specifying which subjects to search for and how far to follow links from one document to another. The whistleblower was reportedly able to gain access to 1.7 million files, including the NSA’s internal “wiki” materials, which are used to share data across the agency worldwide.

Officials added that the files were accessible because the Hawaii outpost had not been upgraded with the latest security measures.

The web crawler Snowden used was similar to, but not as advanced as, Googlebot, the crawler Google’s search engine uses to visit billions of web pages and download their contents for fast search results.
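The description reported by the Times, a set of seed subjects plus a limit on how far to follow links, matches the basic shape of a breadth-first crawler. The sketch below illustrates that general idea in Python; it is not the actual tool Snowden used, which has never been publicly identified, and the function names, keyword filter, and depth parameter are all illustrative assumptions.

```python
# Illustrative sketch of a depth-limited, keyword-filtering web crawler.
# Everything here is an assumption for explanation only; the real tool
# and its configuration were never disclosed.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects href targets from anchor tags on a fetched page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed_url, keywords, max_depth=2):
    """Breadth-first crawl from seed_url, recording pages that mention
    any of the given keywords and following links up to max_depth hops."""
    seen = {seed_url}
    queue = deque([(seed_url, 0)])
    matches = []

    while queue:
        url, depth = queue.popleft()
        try:
            page = urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except (OSError, ValueError):
            continue  # skip unreachable or malformed URLs

        # Keep the page if it mentions a subject of interest.
        if any(k.lower() in page.lower() for k in keywords):
            matches.append(url)

        # Stop following links once the configured depth is reached.
        if depth >= max_depth:
            continue
        extractor = LinkExtractor()
        extractor.feed(page)
        for link in extractor.links:
            absolute = urljoin(url, link)
            if absolute not in seen:
                seen.add(absolute)
                queue.append((absolute, depth + 1))

    return matches
```

Run against a single seed, for example `crawl("https://example.com", ["budget"], max_depth=1)`, such a loop fetches each page, saves it if it matches a subject, and fans out along its links until the depth limit is hit, which is the “quite automated” behavior the officials described.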

The whistleblower did raise some red flags while working in Hawaii, prompting questions about his activity, but he was able to deflect the criticism successfully.

According to Josh Levy of Free Press: “Since the first revelations last summer, hundreds of thousands of Internet users have come together online and offline to protest the NSA’s unconstitutional surveillance programs. These programs attack our basic rights to connect and communicate in private, and strike at the foundations of democracy itself. Only a broad movement of activists, organizations and companies can convince Washington to restore these rights.”



Author: Venkatesh Yalagandula
