CSC: Data Demands Continue Exponential Growth

LOS ANGELES, CA: This year’s annual Creative Storage Conference revealed that storage demands in the media and entertainment (M&E) industry are continuing on the steep upward trajectory that began around the turn of the millennium. Driven by higher-resolution content, higher frame rates, more bits per pixel and new immersive formats, the demand for storage could reach one exabyte per project within a decade.

Conference organizer Tom Coughlin of Coughlin Associates noted in his opening remarks that 4K production is becoming commonplace, 8K is getting off the ground and 16K (16,000 pixels on the long axis) is coming. Cameras routinely run at 40 and 60 frames per second, with some models capable of much faster rates. “There are cameras that can shoot at thousands of frames per second for special effects,” he said.

Looking 10 years down the road, he said, innovative projects (virtual or augmented reality, for example) produced at 16K by 8K resolution, 24 bits per pixel and 300 frames per second could generate data rates of 115 gigabytes per second, or 414 terabytes per hour. “I think we’ll be seeing single projects reaching an exabyte of raw content generated,” he said. In decimal terms, an exabyte is roughly one billion gigabytes.
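Those figures check out as straightforward arithmetic. Here is a minimal back-of-envelope sketch in Python, assuming uncompressed raw frames and decimal units (the variable names are illustrative, not from the conference):

```python
# Back-of-envelope check of the data rates cited above.
# Assumes uncompressed raw video: width x height x bits-per-pixel x fps.

width_px = 16_000            # 16K on the long axis
height_px = 8_000            # 8K on the short axis
bits_per_pixel = 24
frames_per_second = 300

bytes_per_frame = width_px * height_px * bits_per_pixel / 8   # 384 MB/frame
bytes_per_second = bytes_per_frame * frames_per_second
bytes_per_hour = bytes_per_second * 3_600

print(f"{bytes_per_second / 1e9:.1f} GB/s")     # ~115.2 GB/s
print(f"{bytes_per_hour / 1e12:.1f} TB/hour")   # ~414.7 TB/hour

# At that rate, one exabyte (1e18 bytes) of raw content accumulates in
# roughly 2,400 hours of shooting.
print(f"{1e18 / bytes_per_hour:,.0f} hours to reach 1 EB")
```

At roughly 415 TB per hour, a single project would cross the exabyte mark after about 2,400 hours of raw footage, which makes Coughlin’s 10-year projection less far-fetched than it first sounds.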

Every year, Coughlin reports on his latest survey of the M&E industry’s storage demands, and there is typically good and bad news regarding archiving. Nearly half of respondents said their annual archive growth rate is greater than 6 percent. Digital tape is favored by 40 percent of those surveyed, with LTO, at 72 percent, the predominant format; camera tape is a distant second. About 28 percent use hard disk drives, 16 percent local storage networks, 5 percent the cloud and 6 percent various optical discs.

“But about 46 percent of survey participants said they never update their archives. Shame on them,” said Coughlin.

Use of public and private cloud storage has taken off in the non-linear editing community. In 2012, 15 percent of respondents stored data in the cloud; now it’s 30 percent, he said. “Cloud storage is being used to create collaborative workflows for people across time and space. The more stuff there is in a cloud environment, the more things you tend to do there—because that’s where the data is.”

Back on terra firma, how data is being moved is also changing. Mike Oakes of DDN noted that the financial services industry is moving away from Fibre Channel as Ethernet speeds climb through 100 Gb/s, with 400 GbE on the horizon. “They need bandwidth, low latency and high transfer rates; some of that is going to push into our industry,” he said, adding, “I’m not saying one is better than the other.”

Last year, said Oakes, “About $21 billion was spent on Ethernet switches. That’s a lot of money. Same space, [Fibre Channel], was about $2.5 billion.”

“M&E is becoming a Moore’s Law-driven industry,” stated Neil Smith of LumaForge Systems. Intel co-founder Gordon Moore observed in 1965 that the number of transistors on a chip would double roughly every year, a pace he later revised to every two years, while the cost per component kept falling. “That applies to everything,” said Smith, including CPUs, GPUs, RAM and networking.

Smith highlighted LumaForge’s solution for VR and VFX: two compact machines communicating at 2,000 Mbps over a single Cat 7 cable. “In 30 years, we’ve gone from water-cooled mainframes to something shoebox-sized,” he said.

Cloud-based workflows came up in one panel discussion after another. As Oakes noted, the cloud offers efficiencies, such as a 24-hour global work cycle, with one location handing off a project at the end of its workday as another location begins its day. “I can get my stuff done that much quicker, and I can have a better quality product because my people aren’t tired.”

But reliance on the cloud also has its downside, as noted by cybersecurity expert Uzi Yair of GTB Technologies. Yair observed that a “frenemy” (an unwitting employee) could accidentally synchronize content to the internet and no one would be any the wiser. With a reported 12,000 cloud providers in addition to big names like Dropbox, he said, malware could potentially open its own account and reroute pirated content.

The conference often offers a glimpse of future tech. Two organizations are already storing data in synthetic DNA, encoding it in the molecule’s four bases, reported Oracle’s Brian Campanotti. “The density is 2 petabytes per gram of DNA material. MIT is showing two or three times that density.” The method offers a projected shelf life of 10,000-plus years. Of course, he noted, “We have proof that this will last millions of years. Maybe in the future, we won’t have to deal with migration challenges. But the truth of the matter is we will for the next few years.”
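For a sense of scale, here is a hedged back-of-envelope in Python using the density Campanotti quoted (the figures are the article’s; the calculation itself is illustrative):

```python
# Rough scale check on the DNA storage density quoted above (decimal units).

petabytes_per_gram = 2          # density cited by Campanotti
eb_in_pb = 1_000                # 1 exabyte = 1,000 petabytes

grams_for_one_eb = eb_in_pb / petabytes_per_gram
print(f"{grams_for_one_eb:.0f} grams of DNA to hold 1 EB")   # 500 g
```

By that measure, the exabyte-scale project Coughlin projected would fit in about half a kilogram of DNA.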

“How often do you replace your storage?” asked keynote speaker Alex Grossman of Quantum. “Every three to five years. Why? Because you have to migrate off it.”

With so much content being stored, archived and remonetized, automation now plays a key role, according to Grossman. “This is where things are changing. People tend to call this software-defined storage.”

There are numerous storage solutions available, based on components from the three hard-drive vendors still in business (Seagate, Toshiba and Western Digital), he noted. Head-to-media transfer speeds have remained fairly constant; the bottlenecks are elsewhere, making hybrid hard disk/solid-state drive storage solutions attractive. “Look at the workflow stages and optimize the storage for each stage,” he advised.

Creative Storage Conference
creativestorage.org