Content Capture At Creative Storage Conference

August 3, 2010, Culver City, CA. The 2010 Creative Storage Conference opened with a session on "Content Capture: Many Cameras, Many Effects, Many Storage Devices".

The transition from film to digital in movie making is driving storage requirements, especially for high-definition video, to increase dramatically in volume. The panel, moderated by Pallab Chatterjee, CTO at Silicon Map and Sr. Editor here at M&E Tech, gave their views on the logical alternatives before the group addressed common questions.
The panel comprised:
David Trumbo, CTO at Media Technology Market Partners
Jeff Ravencraft, president of the USB Implementers Forum (USB-IF) and senior technology strategist at Intel
Steve Lampen, multimedia manager at Belden
Joe Wojdacz, disruptive innovationist at After Camera Services
Alex Grossman, CEO at Active Storage

Trumbo suggested streaming tape as one alternative. He noted that video and data recording started on tape in the 1950s and did not begin the conversion to digital until the 1980s. Now, the LTO (Linear Tape-Open) Consortium is on version 5, with 1.5 TB of storage per cartridge. The standard calls for a data rate of 140 MB/s and includes integrated hardware encryption. With the latest version, the user gets LTFS (Linear Tape File System), which lives in the first partition and makes the tape look like a hard drive to the user. In addition, the media costs less than an equivalent hard drive.
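
To put those LTO-5 numbers in context, here is a rough sketch in plain Python arithmetic, assuming vendor-style decimal units and a drive that sustains the rated streaming speed:

```python
# Back-of-the-envelope check of the LTO-5 figures quoted above:
# 1.5 TB native capacity at a 140 MB/s native transfer rate.
capacity_bytes = 1.5e12       # 1.5 TB per cartridge (decimal units)
rate_bytes_per_sec = 140e6    # 140 MB/s sustained streaming

hours_to_fill = capacity_bytes / rate_bytes_per_sec / 3600
print(f"Time to fill one LTO-5 cartridge: {hours_to_fill:.1f} hours")
# -> roughly 3 hours of continuous streaming per cartridge
```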

Ravencraft offered the idea that storage does not have to be tied directly into the main storage and server systems. Instead, it can be attached via USB 3.0, which offers faster transfer rates and lower power consumption than previous versions. This option is particularly useful for off-site operations, where flash-based peripherals can upload files to the local editing stations. The interface has a complete ecosystem of support for applications, security, and certified products.
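
The practical difference for off-site offload can be sketched with rough numbers; the card size and sustained throughputs below are illustrative assumptions, not benchmarks:

```python
# Hypothetical offload of a camera card over USB. The sustained rates
# assumed here are illustrative real-world values, not the 5 Gb/s
# (USB 3.0) or 480 Mb/s (USB 2.0) signaling rates.
card_bytes = 64e9              # a 64 GB flash card (assumed size)
usb3_bytes_per_sec = 300e6     # assumed sustained USB 3.0 throughput
usb2_bytes_per_sec = 35e6      # assumed sustained USB 2.0 throughput

print(f"USB 3.0: {card_bytes / usb3_bytes_per_sec / 60:.1f} minutes")
print(f"USB 2.0: {card_bytes / usb2_bytes_per_sec / 60:.1f} minutes")
# -> roughly 3.6 minutes versus about 30 minutes for the same card
```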

Lampen noted that data communications require some type of cabling, whether copper or optical fiber. Ethernet runs at rates up to 10 Gb/s, with future extensions going to 40 and 100 Gb/s. Cat 6A cables are capable of carrying 40 Gb/s data rates at distances up to 50 meters, but need to be custom designed for the specific installation. For short runs of up to 10 meters, fiber optic cables can handle 100 Gb/s data rates. In most cases, the fiber itself is not the cost limiter; it's the cost and support of the box with the electro-optics.
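
To give those link speeds some scale, the sketch below estimates how long a 20 TB batch of footage (a figure that comes up later in the session) would take to move at raw line rate; real throughput would be lower once protocol overhead is included:

```python
# Illustrative transfer times over the link speeds mentioned above.
# Figures use raw line rates; protocol overhead would lengthen them.
def hours_to_move(terabytes: float, gigabits_per_sec: float) -> float:
    """Hours to move `terabytes` of data over a `gigabits_per_sec` link."""
    bytes_total = terabytes * 1e12
    bytes_per_sec = gigabits_per_sec * 1e9 / 8
    return bytes_total / bytes_per_sec / 3600

for link in (10, 40, 100):  # Gb/s
    print(f"{link:>3} GbE: {hours_to_move(20, link):.1f} hours for 20 TB")
# -> about 4.4, 1.1, and 0.4 hours respectively, before overhead
```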

Wojdacz opined that the industry is changing business models as awareness of digital issues increases. Many are now planning to use technology for content creation, for example GPU-based processes for storage and image processing. This large increase in data volume for acquisition and processing leads to an increased reliance on information technology specialists in the production process.

The storage functions are the linchpin of digital production and workflows. The state of the art in the digitization of images and audio is enabling the new processes to overtake film as a production medium. In addition to lower costs and shorter times to viewable results, the digital processes automatically synchronize the audio and images.

The challenges in moving to digital techniques are many. Shoot ratios, the amount of footage shot relative to what ends up in the final cut, are increasing dramatically because directors and producers can now attempt many more takes on digital than they would want to try on film. This additional material creates a data management issue. The metadata (file names, camera IDs, imaging data, lens data, and script supervision notes) is growing faster than anyone can imagine. The close link between set and post-production flows is broken. To make matters worse, not all of the necessary data is accessible or manageable. The various types of metadata are not uniform, and the data layout affects ingest performance.
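
As a concrete, purely illustrative picture of what that per-clip metadata can look like, the sketch below groups the categories the panel listed into one record; every field name and value is a made-up example, not part of any standard:

```python
from dataclasses import dataclass, field

# A hypothetical per-clip metadata record covering the categories
# named above. Field names and values are illustrative only.
@dataclass
class ClipMetadata:
    file_name: str                 # as recorded on the camera media
    camera_id: str                 # which camera body shot the clip
    imaging: dict = field(default_factory=dict)   # resolution, codec, frame rate
    lens: dict = field(default_factory=dict)      # focal length, aperture
    script_notes: str = ""         # script supervision comments

clip = ClipMetadata(
    file_name="A003C012.mov",
    camera_id="CAM-A",
    imaging={"resolution": "1920x1080", "fps": 23.976},
    lens={"focal_length_mm": 35, "t_stop": 2.8},
    script_notes="Take 4, preferred",
)
```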

GPUs have enabled interaction at full resolution, so digital images can be edited in real time. This moves a major part of the editing to the on-set production facilities and reduces the time in post production. Now, the production flow must include pre-visualization and a virtual cinematographer on the production set. Storage is now pervasive, for both current assets and for those created earlier.

Digital techniques have changed the value chain and production flows. More of the image processing is moving to the front end of the production flow into an overlapping, collaborative flow. The new flows have mixed the functions of on-set and post-production. The addition of more automation and integration across the processes will lead to increased accuracy and security while reducing production time schedules and costs.

Grossman echoed many of Wojdacz's points and added that storage is becoming an active function. The challenge is to understand how to invest in and use storage as the workflow evolves. The industry is moving toward greater collaboration while pushing the limits of available storage facilities as both capacity and volumes increase exponentially. For an HD program, the shoot ratio is going up because sometimes the cameras are never off. A non-HD program might use 1-2 TB, whereas a reality program in HD might generate 20 TB for a one-hour show.

The available tribal knowledge in HD is still evolving. A typical shoot uses multiple cameras, flooding the system ingest. Each camera might generate over 300 MB/s, all of which goes directly to multiple editors. This raw content is converted for playout and eventually archived. All of the steps (capture, edit, archive) require the movement of data.
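
Combining Grossman's 20 TB figure with the per-camera rate above gives a feel for the ingest load; the camera count below is an assumption made purely for the example:

```python
# Illustrative ingest arithmetic using the figures quoted above:
# each camera streaming about 300 MB/s, and 20 TB of raw material
# for a one-hour HD reality program.
camera_rate = 300e6          # bytes per second per camera
num_cameras = 4              # assumed camera count for this example
show_raw_bytes = 20e12       # 20 TB of raw footage

aggregate_rate = camera_rate * num_cameras
hours_to_accumulate = show_raw_bytes / aggregate_rate / 3600

print(f"Aggregate ingest: {aggregate_rate / 1e9:.1f} GB/s")
print(f"Recording time to reach 20 TB: {hours_to_accumulate:.1f} hours")
# -> 1.2 GB/s of ingest; roughly 4.6 hours of rolling cameras
```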

Finding the right piece of data is getting much harder given the vast volumes of data in place. Some type of asset management tool is necessary to manage the data and workflows. Existing workflows lack an open backup window, which only makes things more difficult. Before starting production, an important step is to plan and architect the storage system for flexibility and growth. Each facility is unique and has requirements that make storage design a one-off process rather than an easily replicated plug-in model. The industry needs standards that build flexibility and collaboration into the storage systems.

In response to a query on bit error rates in hard drives, the panel suggested verifying data integrity after each transfer. Most of the panel considered data corruption a smaller issue, with human error and data loss through renaming and de-duplication functions being the greater concern. Standard best practice calls for multiple storage locations and automatic backup.
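
A minimal sketch of post-transfer verification along the lines the panel suggested: hash the source and the copy and compare. The file paths are placeholders, and SHA-256 is just one reasonable choice of checksum.

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 8 * 1024 * 1024) -> str:
    """Return the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

source = Path("card/A003C012.mov")      # placeholder paths
copy = Path("ingest/A003C012.mov")
if sha256_of(source) != sha256_of(copy):
    raise RuntimeError(f"Checksum mismatch for {copy}")
```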

Metadata carries the details of production and technology but is a stumbling block for archive and backup operations. The panel responded that asset management needs to be integrated into the drives to tie everything together. Metadata is critical to the value of an archive and may become the "killer app" for smaller operators and consumers. Good practices are moving toward the independents, who rely on software and XML, while the big studios still create a book for everything. Discipline and following standard practices help to maintain the files and ease file management.
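
In the spirit of the XML-based practices mentioned above, here is a minimal, made-up example of writing a per-clip sidecar file; the element names follow no real schema and the values are placeholders:

```python
import xml.etree.ElementTree as ET

# Build a small, illustrative XML sidecar describing one clip.
clip = ET.Element("clip", attrib={"name": "A003C012"})
ET.SubElement(clip, "camera").text = "CAM-A"
ET.SubElement(clip, "lens", attrib={"focal_length_mm": "35"})
ET.SubElement(clip, "script_notes").text = "Take 4, preferred"

# Write the sidecar next to the media file it describes.
ET.ElementTree(clip).write("A003C012.xml", xml_declaration=True, encoding="utf-8")
```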

Archiving the files is another important issue. The panel noted that metadata is critical to maintaining the data through changing media and standards. The lack of metadata standards means that the service industry for transfers and conversions from one format to another will remain necessary for the medium term. There are some standards for aspects of the various technologies in use, but ideally the standards should be embedded into the applications so that users automatically buy in to the standard.

