European consortium develops new ways for dealing with Big Data

Big Data is a major factor driving knowledge discovery and innovation in our information society. However, large amounts of data can be used efficiently only if algorithms for understanding them are available and if these algorithms can be deployed in highly scalable systems with thousands of hard drives. Big Data thus presents complex challenges for software developers, as the necessary algorithms can only be created with specialist skills in a wide range of fields, such as statistics, machine learning, visualization, databases, and high-performance computing.

The new BigStorage project, funded by the European Union, will therefore develop new approaches for dealing with Big Data over the next three years, ranging from basic theoretical research to the development of complex infrastructures and software packages. As an Innovative Training Network (ITN) of the European Union, the project also plays an important role in training researchers and developers in an international context. The various tasks are being addressed by a European consortium of research teams and industrial partners. The work undertaken at the Data Center of Johannes Gutenberg University Mainz (JGU) will focus on the impact of new storage technologies as well as the convergence of high-performance computing and Big Data.

Read more at: European consortium develops new approaches for dealing with Big Data (ScienceDaily).