Edward Snowden used an inexpensive, readily available Web crawler tool — also known as a spider — to methodically browse and index a treasure trove of secret NSA files, The New York Times reported.
The implication is that he didn't personally search, sequentially or randomly, for the files he purloined. He appeared to have been interested in select material and used specific search terms — which focused in large part on U.S. military capabilities — to vacuum secrets from the agency's database.
The tool moved automatically from point to point within the NSA computer system, following links embedded in documents and copying roughly 1.7 million files along the way, the Times reported.
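The report does not name the tool Snowden used, but the behavior it describes — start from a seed document, follow every embedded link, and copy matching material — is the basic crawler pattern. As a minimal, hypothetical sketch (the document store, link syntax, and keyword filter here are illustrative assumptions, not details from the report), a breadth-first crawl over linked documents looks like this:

```python
# Illustrative sketch only: a breadth-first "spider" over an in-memory
# document store. Links between documents are written as [[doc-id]].
# This is NOT the tool described in the report, just the general pattern.
from collections import deque
import re

def crawl(documents, seed, keywords=None):
    """Follow embedded links from `seed`, copying matching documents.

    documents: dict mapping doc id -> text
    keywords:  optional search terms; if given, only documents containing
               at least one term are copied (all links are still followed)
    Returns the ids of copied documents in visit order.
    """
    seen = set()
    copied = []
    queue = deque([seed])
    while queue:
        doc_id = queue.popleft()
        if doc_id in seen or doc_id not in documents:
            continue
        seen.add(doc_id)
        text = documents[doc_id]
        # Copy the document if no filter is set, or if it matches a term.
        if not keywords or any(k.lower() in text.lower() for k in keywords):
            copied.append(doc_id)
        # Follow every embedded [[link]] to reach further documents.
        queue.extend(re.findall(r"\[\[(.+?)\]\]", text))
    return copied
```

With a keyword filter such as `["military"]`, the crawler still traverses every reachable document but copies only those matching the search terms, which mirrors the targeted collection the report describes.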
"We do not believe this was an individual sitting at a machine and downloading this much material in sequence," an official told the Times.
In the wake of the Bradley Manning WikiLeaks affair of 2010, the NSA was in the process of implementing security measures against an "insider attack," but Snowden operated — either by coincidence or design — out of a Hawaii outpost where internal security measures had not yet been upgraded.
Snowden knew that the NSA had only basic protections against insider penetration and that it did not entirely compartmentalize its secrets.
Still, the presence of Web-crawling spider software within the NSA's computer system should have set off alarms, sources told the Times. Snowden was challenged, at least once, in Hawaii for his activities but claimed he was doing routine network maintenance in his capacity as a systems administrator, the Times reported.
Besides his own passwords, it is likely Snowden used those of his co-workers and supervisors.
Snowden has denied specifically targeting military information as well as claims that he is working at the behest of a foreign power.
© 2014 Newsmax. All rights reserved.