LOS ANGELES, CA—Data is said to be growing globally at a 40 percent compound annual rate and could total 44 zettabytes by 2020. Depending on the source, between 80 and 90 percent of today’s data was generated in just the past two years.
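The two figures are roughly consistent with each other: a 40 percent compound rate doubles the total about every two years. A back-of-the-envelope check, assuming the commonly cited baseline of roughly 4.4 zettabytes in 2013 (an assumption, not a figure from the conference):

```python
# Sanity check of the 40% CAGR claim against the 44 ZB projection.
# Assumption: ~4.4 ZB of data existed worldwide in 2013.
baseline_zb = 4.4
cagr = 0.40

total = baseline_zb * (1 + cagr) ** (2020 - 2013)
print(f"Projected 2020 total: {total:.1f} ZB")  # ~46.4 ZB, close to the quoted 44 ZB
```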
So where is this data? The recent 2016 Creative Storage Conference, marking its tenth anniversary, attempted to answer that question with a day of panels and presentations in Culver City, CA that considered the data storage and access trends relevant to the media and entertainment (M&E) industry.
The simple answer to the storage question—indeed, to computing needs in general—is the cloud. Representatives from Google at the conference revealed that the company sank nearly $10 billion into capital expenditure in 2015, including a trans-Pacific fiber link, and now apparently has the world’s largest distribution infrastructure. Speed is everything these days, so a network of over 100 edge locations in more than 30 countries ensures that data remains cached close to major populations, they said. Google is deeply embedded in the M&E industry, offering cloud services such as transcoding, archiving and data recovery, and peering directly with the big Hollywood studios.
Higher resolutions, high dynamic range, high frame rates and virtual reality are all starting to place greater demands on content workflows and distribution infrastructures. Videos are currently being uploaded to YouTube at the rate of 432,000 hours—that’s almost 50 years—per day. The Internet of Things will soon account for as much as 10 percent of all data.
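The hours-to-years conversion behind that upload figure checks out:

```python
# Convert YouTube's quoted upload rate into years of video per day.
hours_per_day = 432_000
years = hours_per_day / (24 * 365)
print(f"{years:.1f} years of video uploaded per day")  # ~49.3 years
```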
“We’re eating eight meals a day and wondering why we’re gaining weight,” said HGST’s Jeff Greenwald, noting that neither technology nor budgets are keeping pace with the data explosion. “Data is going to consume a larger part of the budget—or you’ll have to delete more or store less.”
While Google touted its cloud as more secure than on-premises solutions, there is still a need for local storage hardware. Alex Grossman of Symply argued that a Fibre Channel SAN works fine for large facilities, but for individuals and small workgroups, Thunderbolt 3 is a game-changer, avoiding the complexity, management overhead and fixed cost of entry associated with Fibre Channel. “There are no IP hassles,” he said, and it’s fast: 32 Gb/second, twice the speed of 16G Fibre Channel and roughly three times that of 10 GbE.
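Taken as nominal line rates in gigabits per second (Thunderbolt 3 carries 32 Gb/s of PCIe data over its 40 Gb/s link; real-world throughput on all of these links is lower), the comparison works out as:

```python
# Nominal line rates (Gb/s) behind Grossman's speed comparison.
tb3 = 32      # Thunderbolt 3, PCIe data portion of the 40 Gb/s link
fc16 = 16     # 16G Fibre Channel
tengbe = 10   # 10 Gigabit Ethernet

print(f"TB3 vs 16G FC:  {tb3 / fc16:.1f}x")    # 2.0x
print(f"TB3 vs 10 GbE: {tb3 / tengbe:.1f}x")   # 3.2x
```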
Throughout the day, presenters returned to the implementation of hybrid workflows that utilize the most appropriate solution for each application tier. Cloud, object storage and tape might be combined, for example, or solid-state drives deployed locally for their speed. Samsung now offers a 16 TB SSD, said the company’s Tien Shiah: “We have crossed the [storage capacity] threshold of hard disk drives.”
Rob Peglar of Micron made a case for NVM (non-volatile memory), otherwise known as persistent memory, which can now run faster than the Linux kernel can context switch. Faster storage needs a faster network, added Mellanox’s Rob Davis, advocating for NVMe, a protocol designed specifically for SSDs that avoids the bottleneck of legacy interfaces built for spinning disks.
Once data is stored, it needs to be searchable. “The key is metadata,” according to Graymeta’s Aaron Edell. “It’s a source of truth for everything that you are storing.”
“Companies are not going to accept content without proper meta tagging in the future,” agreed Jeff Stansfield of Advantage Video Systems. Google’s Jeff Kember promoted the company’s proven machine learning, familiar to many via Google Photos, which potentially offers automated meta tagging for moving images, too.
Roy Taylor of AMD offered an update on the virtual reality business, which has reportedly received $1.1 billion in investment this year to date. “AMD will power 83 percent of all VR,” he said, noting that the company’s technology powers all major games consoles.
AMD’s new Radeon RX 480 graphics card costs just $199 and is certified for the major VR systems. “We think it’s the right thing to do, to democratize VR,” he said. “It can only work if there are hundreds of millions involved.”
But it could take 35 years to achieve the horsepower—743 Teraflops—necessary to deliver the photorealistic 16K, 144 fps and zero latency required by the human eye, he cautioned. “No one company can do it by themselves,” he said, adding that AMD has joined the Immersive Technologies Council.
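Assuming “16K” means a 15360 × 8640 raster (an assumption; the term is not standardized), the pixel rate Taylor’s target implies can be worked out, along with the compute budget per pixel at 743 teraflops:

```python
# Pixel throughput implied by Taylor's target, assuming 16K = 15360 x 8640.
w, h, fps = 15360, 8640, 144
pixels_per_sec = w * h * fps
print(f"{pixels_per_sec / 1e9:.1f} Gpixels/s")  # ~19.1

# Compute budget per pixel at the quoted 743 TFLOPS target.
flops = 743e12
print(f"~{flops / pixels_per_sec:.0f} FLOPS per pixel")  # roughly 39,000
```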
Even if VR fails to catch on with consumers, despite the current hype, there is a future elsewhere, according to Taylor. VR is ideal for safety training, for instance, and is already being pioneered in surgery, mental health care, retail, design and architecture.
Consultant, educator and filmmaker Larry Jordan returned to the theme of solutions for small content creators on the final panel, Conversations with Independent Storytellers. “We’re talking real world, not enterprise,” he said, making a plea to the assembled manufacturers for affordable and appropriate storage products and tools for those on tight budgets.
“I have boxes and boxes of drives,” said Woody Woodhall of Allied Post Audio, but he has no need to store multiple terabytes on a single drive: “Audio doesn’t work that way.” A DNx file for a TV show might be 90 GB but the audio file is “3 GB, tops,” he said, noting that the smallest drive he can buy, other than a USB stick, stores over 60 GB.
Nor is it practical to receive high-quality movie files from remote clients, he said. “Because the cloud idea is spectacular, but the cloud reality is non-existent. Maybe it’s Time Warner Cable, my provider, but I can’t pull down a 1 TB file.”
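A rough calculation shows why a 1 TB transfer is a non-starter on a consumer connection (the 100 Mb/s downstream rate is an assumption, and a generous one for a 2016 cable tier):

```python
# How long a 1 TB transfer takes at an assumed 100 Mb/s downstream.
size_bits = 1e12 * 8   # 1 TB expressed in bits
rate_bps = 100e6       # 100 Mb/s sustained (optimistic for 2016 cable)

hours = size_bits / rate_bps / 3600
print(f"{hours:.1f} hours")  # ~22.2 hours, assuming no drops or throttling
```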