Information is a sword that cuts two ways. For companies, it is undoubtedly the motor that powers their success. But in the era of digitalization, the deluge of information continues unabated, pushing firms to the very limits of their capacities. According to the EMC Digital Universe study, the quantity of data produced annually will increase tenfold by 2020, to 44 zettabytes – equal to 44 trillion gigabytes. How can you get on top of the huge volumes of information and data generated by business processes? The cloud offers a way out of the chaos of big data. Cloud storage as offered by the Open Telekom Cloud gives firms a resource for managing immense volumes of data.
Where does all of the data come from?
Communication systems within companies are the growth engine behind these stockpiles of data. E-mail remains one of the most popular of these systems. Technology market research firm The Radicati Group estimates that some 215 billion e-mails were sent every day in 2016. This daily volume is forecast to grow by some 10 billion annually between now and 2020. Adding to the countless electronic messages and attachments, there is also data from intranet solutions and collaboration platforms. This information is almost never deleted: instead, companies dutifully save more and more of it. Depending on their business model, firms have a lot of archiving to do. To get an idea of what this entails, simply think of a media company that permanently stores print, video, and audio data.
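The growth implied by those figures is easy to sketch. The following is a simple linear extrapolation of the Radicati numbers cited above, not a quote from the study itself:

```python
# Linear projection of daily e-mail volume from the figures above:
# roughly 215 billion e-mails per day in 2016, growing by about
# 10 billion per day each year through 2020.

daily_2016 = 215          # billions of e-mails per day in 2016
annual_growth = 10        # billions per day added each year

projection = {year: daily_2016 + annual_growth * (year - 2016)
              for year in range(2016, 2021)}

for year, volume in projection.items():
    print(f"{year}: ~{volume} billion e-mails per day")
```

By 2020 that puts daily traffic at roughly 255 billion e-mails – and, as noted, very little of it is ever deleted.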
Alongside conventional communication systems, new offerings such as connected cars (vehicles with internet access), predictive maintenance (forecasting models) and applications for the internet of things add yet more to the near-limitless quantity of data. This exponentially expanding universe of information provides the raw material for future business activities, but it also demands that storage capacities be available on a flexible basis. The challenge facing companies is to obtain a lot of storage, not just at a good price, but also in a manner that ensures it is available quickly and as needed.
Simple and secure external data storage
One way of mastering the volume of data is to keep internal operations "inhouse", i.e. within the company, while relocating some of the information to a suitable external storage option (cloud storage). The demand is there: as a NetApp survey shows, two-thirds of SMEs are not even sure whether they could fully restore their data from backups if their systems crashed.
Using public cloud resources relieves a company's internal IT unit of the need to establish short-term storage capacities that make no difference to competitiveness, and of developing and managing inhouse storage solutions. It also does away with continuous, expensive investments in storage hardware and in upskilling for storage system management – not to mention the anxiety of possible data loss.
Using cloud storage makes sense in a host of different industries, as demonstrated by five examples of real companies for which archiving data is essential.
1. A DVD evening that is 20,000 years long
It sounds almost unbelievable, but it's true: starting in 2022, researchers at Chile's Large Synoptic Survey Telescope (LSST) will begin compiling a map of the heavens over a period of ten years. This will generate a huge volume of images and other data. The plan is to take 2,000 shots – over 15 terabytes of data – every night and then analyze them. The end result of this titanic project will be a catalog of approx. 37 billion astronomical objects such as stars, galaxies, and asteroids.
To store this volume of information, you would need a total of 500 petabytes, i.e. 500 million gigabytes, of storage. To give you an idea of what this means, it is the equivalent of about 100 million DVDs, which would take some 20,000 years to watch end-to-end.
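The arithmetic behind the DVD comparison can be checked in a few lines. The 4.7 GB capacity of a single-layer DVD and the roughly two-hour running time per disc are our own assumptions, so the result lands near, rather than exactly on, the figures above:

```python
# Sanity check of the LSST storage comparison.
# Assumed: a single-layer DVD holds about 4.7 GB and plays for
# roughly 2 hours; the survey catalog needs 500 petabytes in total.

PB_IN_GB = 1_000_000                   # 1 petabyte = 1,000,000 gigabytes
catalog_gb = 500 * PB_IN_GB            # 500,000,000 GB, as stated above

dvds = catalog_gb / 4.7                # roughly 106 million DVDs
viewing_years = dvds * 2 / (24 * 365)  # non-stop viewing time in years

print(f"{dvds / 1e6:.0f} million DVDs, about {viewing_years:,.0f} years of viewing")
```

Depending on the disc capacity and film length assumed, the non-stop "DVD evening" comes out at somewhere between 20,000 and 25,000 years.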
Against this backdrop, it comes as no surprise that the astronomy project is one of the largest big data undertakings in the world. Another astronomy megaproject is pushing IT to its current limits: the plan to merge the data generated by 260,000 antennas located across the globe. As of today, it is still not clear just how this challenge will be met: given the inconceivably large quantities of data and frequencies that the antennas will gather, storing and processing the information is simply not possible (yet).
2. Humans as a data source
We'll stay with the natural sciences but take a step from astronomy to biology, where gene analysis has surged in importance within medicine. The technology for performing sequencing analyses has made nothing less than a quantum leap in recent years. At present, complete sequencing still costs 10,000 dollars – but that's nothing compared to the 10-million-dollar price tag of 2007.
Specialist providers long ago realized how lucrative partial gene analysis could be and now offer this service for 99 dollars. The full human genome comprises 3.27 billion base pairs – about 1.8 meters of genetic material. We could well see increased demand for gene analyses in the coming years: in one conceivable application, a large country such as China or the USA could spearhead a genetic typing program for its population.
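To see why genomics feeds the data deluge, consider a minimal storage estimate. The 2-bits-per-base encoding used here is a textbook lower bound, not a figure from the article; real sequencing pipelines store raw reads with high coverage and quality scores, which inflates files far beyond this minimum:

```python
# Lower-bound storage estimate for a single human genome.
# Assumed: 3.27 billion base pairs (the figure cited above) and a
# naive 2-bit encoding, since each base is one of A, C, G, or T.

base_pairs = 3_270_000_000
bits_per_base = 2

gigabytes = base_pairs * bits_per_base / 8 / 1e9   # bits -> bytes -> GB
print(f"Bare sequence: about {gigabytes:.2f} GB per genome")
```

Even at under one gigabyte per bare sequence, a population-scale typing program for hundreds of millions of people quickly reaches petabyte territory.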
3. Digitizing libraries
Let's take a look at the humanities now. These disciplines are currently undergoing a "quantitative revolution" that sees them turn their backs on traditional research methods. Historians are now analyzing data and statistics with the aid of computers, animation, and interlinked data points. Poring over dusty tomes of history in the library has almost been, well, consigned to the history books. The advantages of the new approach are obvious: big data makes it possible to apply text mining techniques to generate new links, evaluate merged data volumes, and ultimately produce new insights. Then there is also the option of creating and continuously modifying digital editions of publications. All in all, it represents a milestone in the development of computer-supported academic processes. Suitable data storage options are, again, a precondition for this paradigm shift.
4. "Airy" big data analyses
Moving from academia to industry: Vestas Wind Systems, based in Aarhus, Denmark, is one of the world's leading names in wind turbines. The firm has one particular advantage: it needs only a few hours to calculate how much energy any potential new wind farm can generate. This is possible thanks to gigantic databases of historical records of temperature, wind direction, and precipitation, which can be combined with information from satellite images, tide tables, and wind charts. A total of some 160 factors go into the company's big data analyses, among them the full complement of data (output, runtime, repairs, etc.) from Vestas' 50,000 existing turbines.
5. Music from the cloud
A glance at the music sector reveals how important the cloud has become in our day-to-day lives. 2015 marked a watershed, according to the International Federation of the Phonographic Industry (IFPI): for the first time ever, digital music platforms generated more revenue than their physical equivalents. Driven by the ubiquity of smartphones, streaming had overtaken CDs in just ten years. Today, even small streaming services provide access to millions of songs that users can call up whenever they want. Cloud storage is what makes this possible, ensuring that business models such as on-demand music streaming are viable in the long run.
Summary: There's no way around data storage
All of these examples demonstrate how significant the role of big data can be and what challenges companies and the IT sector are likely to face. Volume, velocity, and variety – these "three Vs" are the key terms in this process. One thing is already clear, however: there is no way of avoiding the cloud for anyone who wants to manage an apparently unlimited volume of data while remaining competitive and focused on the future. This realization isn't restricted to exotic pursuits like mapping the stars or sequencing genes.
We'll use a simple example to clarify the point. In the automotive sector, a host of information is amalgamated for the inspection and approval processes covering millions of goods produced by different companies. These range from small parts to finished cars and also comprise components and items from preliminary production stages. Using cloud storage can deliver a significant boost to efficiency here as well.
At a glance: the benefits of Open Telekom Cloud
- Security: Data is hosted in highly secure Telekom data centers in Germany.
- Scalability: Computing power and memory can be ordered and set up online and adapted flexibly at any time.
- Pricing models: We offer you flexible and fixed contractual periods as well as a combination of both models.
- No vendor lock-in: Open Telekom Cloud is based on OpenStack, a freely available open-source standard. You can change the provider at any time.
- Individual configuration: CPU, RAM, storage, network – put together a package that optimally matches your requirements.
- IaaS for all: Open Telekom Cloud is extremely flexible and therefore suitable for companies of every size.