Research generates vast quantities of data. Various institutions have come up with a plan to deal with them.
The volume of data worldwide is growing at a staggering pace. One study estimates that it will have reached 35 zettabytes, or 35 billion terabytes, by 2020, 44 times the amount recorded in 2009. Admittedly, not all of these data are important enough to keep. Research data, however, do need to be actively managed, that is, sorted, protected and made accessible. This process is known as data lifecycle management. A coordinated joint effort is crucial here, and research institutions and libraries are of course best placed to determine how this effort should be organised, since they are already used to managing large volumes of data.