NIST Big Data Working Group (NBD-WG)



Big Data at NIST


There is broad agreement among commercial, academic, and government leaders about the remarkable potential of “Big Data” to spark innovation, fuel commerce, and drive progress. Big Data is the term used to describe the deluge of data in our networked, digitized, sensor-laden, information-driven world. The availability of vast data resources carries the potential to answer questions previously out of reach: How do we reliably detect a potential pandemic early enough to intervene? Can we predict new materials with advanced properties before these materials have ever been synthesized? How can we reverse the current advantage of the attacker over the defender in guarding against cybersecurity threats?

However, there is also broad agreement that Big Data can overwhelm traditional approaches. The rate at which data volumes, speeds, and complexity are growing is outpacing scientific and technological advances in data analytics, management, transport, and more.

Despite the widespread agreement on the opportunities and current limitations of Big Data, a lack of consensus on some important, fundamental questions is confusing potential users and holding back progress. What are the attributes that define Big Data solutions? How is Big Data different from the traditional data environments and related applications that we have encountered thus far? What are the essential characteristics of Big Data environments? How do these environments integrate with currently deployed architectures? What are the central scientific, technological, and standardization challenges that need to be addressed to accelerate the deployment of robust Big Data solutions?

NIST Big Data Public Working Group:

NIST is leading the development of a Big Data Technology Roadmap. This roadmap will define and prioritize requirements for interoperability, portability, reusability, and extendibility for big data analytic techniques and technology infrastructure in order to support secure and effective adoption of Big Data. To help develop the ideas in the Big Data Technology Roadmap, NIST is creating the Public Working Group for Big Data.

Scope: The focus of the NBD-PWG is to form a community of interest from industry, academia, and government, with the goal of developing consensus definitions, taxonomies, secure reference architectures, and a technology roadmap. The aim is to create vendor-neutral, technology- and infrastructure-agnostic deliverables that enable Big Data stakeholders to pick and choose the best analytics tools for their processing and visualization requirements on the most suitable computing platforms and clusters, while allowing value-added services from Big Data service providers and the flow of data between stakeholders in a cohesive and secure manner.


  • Develop Big Data Definitions
  • Develop Big Data Taxonomies
  • Develop Big Data Requirements
  • Develop Big Data Security and Privacy Requirements
  • Develop Big Data Security and Privacy Reference Architectures
  • Develop Big Data Reference Architectures
  • Develop Big Data Technology Roadmap

via NIST Big Data Working Group (NBD-WG).

New Language for Quantum Coding

Quantum software has finally left the dark ages with the creation of the first practical, high-level programming language for quantum computers. Although today’s devices are not ready for most practical applications, the language, called Quipper, could guide the design of these futuristic machines, as well as making them easier to program when they do arrive.

“It does all the nice features of a modern classical programming language, adapted to quantum computing,” says Bob Coecke of the University of Oxford, who was not involved in the work. “It’s a tour de force.”

This picture shows the general definition of a qubit (quantum bit) as the quantum state of a two-level quantum system (Photo credit: Wikipedia)

An important feature of a quantum computer is that its bits – known as qubits – can take the values 0 and 1 at the same time. This allows the computer to perform two or more computations simultaneously. But designing computer algorithms that make use of this quantum parallelism is tough.

So quantum programming has so far been mostly low-level, concerned with instructing the quantum logic gates that control the qubits.
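The superposition idea above can be illustrated with a toy classical simulation: a qubit is represented as a two-component vector of amplitudes, and a quantum logic gate (here the Hadamard gate, a standard single-qubit gate) is a matrix applied to that vector. This is only an illustrative sketch in Python, not Quipper and not the article's own code; the function names are made up for this example.

```python
import math

# Toy simulation of a single qubit. A qubit's state is a pair of
# amplitudes (for the basis states |0> and |1>); measurement
# probabilities are the squared magnitudes of those amplitudes.

def hadamard(state):
    """Apply the Hadamard gate: maps |0> to an equal superposition
    of |0> and |1>, which is the simplest example of a state that
    is 'both 0 and 1 at the same time'."""
    a, b = state
    s = 1 / math.sqrt(2)
    return [s * (a + b), s * (a - b)]

def probabilities(state):
    """Probability of measuring 0 or 1."""
    return [abs(amp) ** 2 for amp in state]

qubit = [1.0, 0.0]          # start in the definite state |0>
qubit = hadamard(qubit)     # now an equal superposition
print(probabilities(qubit)) # 0 and 1 are each measured with probability 0.5
```

Low-level quantum programming amounts to writing out sequences of such gate applications by hand; a high-level language like Quipper lets programmers compose these circuits with the abstractions of a modern programming language instead.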

New language helps quantum coders build killer apps – physics-math – 05 July 2013 – New Scientist