Accurate, Focused Research on Law, Technology and Knowledge Discovery Since 2002

FCW – NSA shows how big ‘big data’ can be

FCW.com – Frank Konkel – “As reported by Information Week, the NSA relies heavily on Accumulo, “a highly distributed, massively parallel processing key/value store capable of analyzing structured and unstructured data,” to process much of its data. NSA’s modified version of Accumulo, based on Google’s BigTable data model, reportedly makes it possible for the agency to analyze data for patterns while protecting personally identifiable information – names, Social Security numbers and the like. Before news of Prism broke, NSA officials revealed a graph search it operates on top of Accumulo at a Carnegie Mellon tech conference. The graph is based on 4.4 trillion data points, which could represent phone numbers, IP addresses, locations, or calls made and to whom; connecting those points creates a graph with more than 70 trillion edges. For a human being, that kind of visualization is impossible, but for a vast, high-end computer system with the right big data tools and mathematical algorithms, some signals can be pulled out.”
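To illustrate the idea behind a BigTable/Accumulo-style key/value store holding a graph, here is a minimal Python sketch: each edge becomes one key/value entry whose key pairs the source vertex (row) with the destination vertex (column qualifier), and keeping the keys globally sorted means all edges leaving a vertex can be retrieved with a single range scan. The data and identifiers are hypothetical, and this is a toy in-memory model, not Accumulo's or the NSA's actual implementation.

```python
from bisect import bisect_left, bisect_right

# Hypothetical edge list, e.g. call records as (source, destination) pairs.
edges = [
    ("555-0100", "555-0111"),
    ("555-0100", "555-0123"),
    ("555-0111", "555-0199"),
]

# Accumulo-style layout: one sorted key/value entry per edge, keyed by
# (row, column qualifier) = (source, destination). Sorting guarantees that
# every edge leaving a given vertex is stored contiguously.
table = sorted(edges)

def neighbors(vertex):
    """Range-scan all entries whose row equals `vertex` (its outgoing edges)."""
    lo = bisect_left(table, (vertex, ""))
    hi = bisect_right(table, (vertex, "\uffff"))
    return [dst for _, dst in table[lo:hi]]

print(neighbors("555-0100"))  # → ['555-0111', '555-0123']
```

At trillions of edges the same sorted-key layout is what lets a distributed store answer "who is connected to whom" without scanning the whole table: a neighbor query touches only the contiguous slice of keys for one vertex.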