Snowden Used Web Crawler to Collect Classified Files from NSA Systems
The New York Times reported that Edward Snowden, the former NSA contractor who exposed multiple top-secret US intelligence programs, used an automated web crawler to collect information from the National Security Agency's systems.
According to the Times, Snowden was able not only to access but also to download large volumes of top-secret information using this relatively unsophisticated software because he worked at an agency office in Hawaii that was last on the list to receive the latest security upgrades. The Guardian also reports that Snowden deliberately sought a position at the NSA's Hawaii office because it was known that the office had not yet received the security update, which made it possible for an insider to collect top-secret information with a web crawler without raising alarms. The Times reports that Snowden exploited the NSA's "rudimentary protections against insiders" to aggregate top-secret materials.
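The reports do not describe Snowden's tool in any technical detail, but the general technique they attribute to it, a program that fetches a page, extracts its links, and follows each not-yet-visited one, is a standard breadth-first web crawl. The sketch below is purely illustrative and assumes nothing about the actual software: the in-memory PAGES "site", the paths, and the function names are all invented, and a toy dictionary stands in for network fetches so the example is self-contained.

```python
from collections import deque
from html.parser import HTMLParser

# Hypothetical in-memory "intranet": page path -> HTML body.
# A stand-in for network fetches; all paths are invented.
PAGES = {
    "/index.html": '<a href="/docs/a.html">A</a> <a href="/docs/b.html">B</a>',
    "/docs/a.html": '<a href="/docs/c.html">C</a>',
    "/docs/b.html": '<a href="/index.html">home</a>',
    "/docs/c.html": "no outgoing links",
}

class LinkExtractor(HTMLParser):
    """Collects href targets from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

def crawl(start):
    """Breadth-first crawl: visit a page, queue every link not yet seen."""
    seen = {start}
    queue = deque([start])
    collected = []
    while queue:
        path = queue.popleft()
        body = PAGES.get(path, "")
        collected.append(path)  # a real crawler would save the document here
        parser = LinkExtractor()
        parser.feed(body)
        for link in parser.links:
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return collected

print(crawl("/index.html"))
# → ['/index.html', '/docs/a.html', '/docs/b.html', '/docs/c.html']
```

The point of the sketch is how little machinery such a crawl requires: a queue, a visited set, and a link extractor are enough to walk an entire internally linked document store, which is why the Times could plausibly describe the software as unsophisticated.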
The Guardian journalist Glenn Greenwald suggested that more NSA and government insiders may step forward with further top-secret revelations. "There will be more sources inside the government who are also inspired by Snowden's courage," he said in an interview on CNN's "Reliable Sources" program.