camera Proposal

Introduction

This document describes the next generation network camera: a reimagining of the state of the art in camera technology, designed specifically as the front end of a file based workflow backed by a cloud infrastructure.

Sony cameras evolved from traditional broadcast designs when the need was to send an analog signal across a studio. Since then data transfer has evolved. Tape based workflows are dying out and being replaced with radically different methods. Post production functions are being moved from the post facility to the set. Today, the camera is only a part of the process and the true power is in the system.

What is the camera?

A camera provides the interface between the physical world and the digital world. It converts light, the image from the lens, into a data file. As such it provides the tools for the Director and Director of Photography (DP) to control the final image. It needs an interface to allow the DP to:

· Measure and control exposure

· Manage the color through look up tables (LUTs)

· Monitor camera and signal status and levels

There is a wealth of other information generated during shooting. In addition to its primary role of capturing the image, the camera simplifies and automates metadata collection and embeds it in the data stream. The metadata includes:

· Lens data (focal length, aperture, etc.)

· Camera setup parameters (exposure, etc.)

· Director of Photography input (LUTs etc.)

· GPS derived data

o Geolocation

o Time reference (precision reference to automate TC)

· Inertial, angular and motion data (similar to iPhone sensors)

· 3D rig metadata (e.g. interaxial distance)

· Slate data received wirelessly

· Additional production notes as needed
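The metadata bundle above could be carried as a simple structured record multiplexed into the output stream. A minimal sketch in Python follows; all field names are illustrative assumptions, not an actual Sony format:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class FrameMetadata:
    """Per-frame metadata record embedded alongside the RAW image data.

    Field names are hypothetical, chosen only to mirror the list above.
    """
    focal_length_mm: float      # lens data
    aperture_f_stop: float      # lens data
    exposure_time_s: float      # camera setup parameter
    lut_id: str                 # DP input: which LUT to apply downstream
    latitude: float             # GPS geolocation
    longitude: float            # GPS geolocation
    timecode: str               # GPS-derived precision time reference
    interaxial_mm: float = 0.0  # 3D rig metadata (0 for 2D shoots)
    notes: str = ""             # slate / production notes

    def to_bytes(self) -> bytes:
        """Serialize for multiplexing with the image data stream."""
        return json.dumps(asdict(self)).encode("utf-8")

# Example record for one frame
md = FrameMetadata(
    focal_length_mm=50.0, aperture_f_stop=2.8, exposure_time_s=1 / 48,
    lut_id="LUT-day-ext-01", latitude=34.05, longitude=-118.24,
    timecode="01:02:03:04",
)
payload = md.to_bytes()
```

Because the record travels with the image data, every downstream module sees the same creative decisions without a separate sidecar file.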

The camera is part of a complete system

Color and metadata management are critical functions performed by the camera system.

In the last century, Kodak was the authority in color management. In the 21st century, Sony should be that voice. Sony has the opportunity to redefine color management in a way that captures the creative decisions made during photography, carries those decisions and preserves them, and allows further refinement when needed in the post production process.

The camera is part of this process – but only a part; the true power is in the system.

The camera is a networked terminal in the system that converts information from the physical world into usable digital information. Done properly, as part of an integrated system, the camera allows Sony to control the images flowing through the post production process.

Sony needs to acquire the strength to be the 21st Century voice of color management, integrate the technology into its cameras and bring to market the systems that leverage the capability it provides.

Control of the camera is essential.

The camera does not stand alone. It is the front end of the production system and it is a key component in a complete system.

Figure 1. The camera System

Unlike conventional cameras, the camera as part of an overall system defers functions that can be done later to downstream components. It is a minimalist approach supported by processing power in the rest of the system. The key features are:

· A “Raw” signal output

· No onboard processing in the camera except as needed for local monitoring or transmission to storage

· Bundling of metadata into the output

These features result in significantly reduced size, weight, complexity and power consumption.

At its basic level the proposed camera consists of a set of modules that can be configured to provide the user just the functions that they need at that time.

Figure 2. Modular Construction

Camera modules include:

· The Imager module

o Lens mount

o Imager

o A/D converter

o RAW interface

· Local control module

· Monitor output module

o 4:2:2 720/1080

· Network interface adapter options

o 8Gbps dual link Fibrechannel

o Dual link 10Gbps Ethernet

· Storage adapter

o Accepts SSD media with capacities of 500GB to 1TB

· Wireless interface module(s)

o Remote control interface

o Opportunistic download

o Real time monitor feed

· Electronic viewfinder

· Power options

o One or more battery packs

o AC adapter

The camera has two logical data paths: one for control data and the other for image data. The control data path can either be multiplexed with the image data path or carried over a separate wired or wireless (WiFi, Bluetooth, etc.) link.
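When the control path is multiplexed with the image path, each payload needs a channel tag so the receiver can separate the two logical streams. A minimal framing sketch in Python (the tag values and the length-prefix layout are illustrative, not an actual protocol):

```python
import struct

# Two logical paths over one physical link: a one-byte channel tag and a
# four-byte length prefix frame each payload. Illustrative framing only.
CONTROL, IMAGE = 0x01, 0x02

def frame(channel: int, payload: bytes) -> bytes:
    """Wrap a payload with its channel tag and length for transmission."""
    return struct.pack(">BI", channel, len(payload)) + payload

def deframe(data: bytes) -> tuple:
    """Recover (channel, payload) from a framed message."""
    channel, length = struct.unpack(">BI", data[:5])
    return channel, data[5 : 5 + length]

# A control message (e.g. a LUT selection) shares the link with image data
# but is routed to the control path by its tag.
msg = frame(CONTROL, b'{"cmd": "set_lut", "lut_id": "A"}')
chan, payload = deframe(msg)
```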

For real time transfer, the image data path could be either 10GigE or 8Gbps Fibrechannel (dual link for 4k). 10GigE is the preferred interface because it allows the use of commodity IT hardware to build the infrastructure. Here data is passed in real time directly to the system.
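A back-of-the-envelope calculation shows why a single 10GigE link has the headroom for real time RAW transfer. The sensor dimensions, bit depth and frame rate below are illustrative assumptions, not product specifications:

```python
# Back-of-the-envelope RAW data rate for a single 4K stream.
# All parameters are illustrative assumptions.
width, height = 4096, 2160   # 4K photosite grid
bits_per_sample = 16         # linear RAW, one sample per photosite (Bayer)
fps = 24                     # cinema frame rate

bits_per_second = width * height * bits_per_sample * fps
gbps = bits_per_second / 1e9
print(f"Uncompressed RAW: {gbps:.2f} Gbps")  # ~3.40 Gbps

# Comfortably inside a 10GigE link; dual link 8Gbps Fibrechannel
# (16 Gbps aggregate) also has headroom.
assert gbps < 10
```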

For operation with a local storage cache, the image data path could be wireless 802.11n with typical data rates of around 145Mbps, and best case data rates of 600Mbps.

In this mode the camera allows “Opportunistic” download to nearby data storage – a local server that allows “dumping” and backup of the on-board solid state memory while shooting continues. This shortens the time required for data management during and after the shooting day. The transfer need not be real time capable; it could run much slower and still be valuable.
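Opportunistic download amounts to a background copy loop that drains closed clips from the on-board media to the local server as bandwidth allows. A minimal sketch, with all names hypothetical and stand-in directories in place of the SSD media and server:

```python
import shutil
import tempfile
import threading
from pathlib import Path
from queue import Queue

def opportunistic_download(clip_queue: Queue, server_dir: Path) -> None:
    """Background worker: copy closed clips from on-board SSD media to a
    nearby server while shooting continues. It need not keep up with the
    camera in real time; it drains the queue as bandwidth allows."""
    while True:
        clip = clip_queue.get()   # blocks until a clip file is closed
        if clip is None:          # sentinel: end of the shooting day
            break
        shutil.copy2(clip, server_dir / clip.name)
        # Only after a verified server copy would the camera reclaim
        # the corresponding on-board storage.

# Demo with temporary directories standing in for SSD media and server.
ssd = Path(tempfile.mkdtemp(prefix="ssd_"))
server = Path(tempfile.mkdtemp(prefix="server_"))
(ssd / "clip001.raw").write_bytes(b"raw frames")

clips: Queue = Queue()
worker = threading.Thread(target=opportunistic_download, args=(clips, server))
worker.start()
clips.put(ssd / "clip001.raw")   # the recording loop enqueues closed clips
clips.put(None)
worker.join()
```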

Remote control module

The remote control module sets the operational parameters of the camera. It does not modify the camera’s signal or data flow; instead it can:

· Allow for conventional “video operation”

· Create metadata – e.g. LUTs for later rendering

· Provide the look and feel of traditional video control

· Transmit the metadata to the camera or another module for multiplexing with the image file

It can be wirelessly connected to the camera through the camera’s control data path.

The remote control module might look and feel much like a remote control unit on a traditional camera; the primary difference is that the output of this unit is metadata that is embedded with the image data.

Director of Photography (DP) interface

The DP interface allows the DP to enter information directly into the camera, where it is multiplexed with the image to set parameters (e.g. LUTs) that direct monitoring and post production processing.

It would be a handheld wireless device providing:

· Camera status monitoring

· Image and exposure analysis

· Control over LUT selection

The DP interface might be an Android or similar application.

LUT Rendering Monitor

Camera system monitors receive image data with embedded metadata (e.g. LUTs). They apply LUTs and render the image data for real time display of the corrected image. When used with the remote control, it allows monitoring of the impact of real time camera parameter adjustment.
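The monitor's core rendering step can be sketched as a per-channel 1D LUT lookup. A minimal pure-Python illustration follows; real monitors would typically apply 3D LUTs in hardware, and the gamma curve here is just a stand-in for a creative LUT carried as metadata:

```python
def build_gamma_lut(gamma: float, size: int = 256) -> list:
    """1D LUT mapping linear 8-bit code values through a gamma curve.
    Stand-in for a creative LUT delivered in the metadata stream."""
    return [round(255 * (i / 255) ** (1 / gamma)) for i in range(size)]

def apply_lut(pixels: list, lut: list) -> list:
    """Render: look each code value up in the LUT (done per channel)."""
    return [lut[p] for p in pixels]

lut = build_gamma_lut(2.2)       # LUT selected by the DP, carried as metadata
frame = [0, 64, 128, 255]        # one channel of a tiny test frame
rendered = apply_lut(frame, lut)
```

Because the LUT arrives as metadata rather than being baked in, the monitor shows the corrected image while the recorded RAW data stays untouched.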

Storage Modules

Image data is recorded as a lossless raw data stream. Operational considerations (e.g. on location or a studio) will determine whether the recording is done on small portable media such as solid state storage, disk arrays or a modified SR tape deck recording a raw data stream.

Two types of storage module are:

· Recordable Media Dock

o For unloading SSD media

o eSata, NAS and USB 3.0 interfaces

o Add-on function to dump media to LTO-5

· Network Server Application

o Software running on Linux/Mac/Windows server

o Manages real time transfer of RAW images and metadata

o Manages opportunistic wireless transfer of RAW images and metadata

o Managed through UI and web services (Conductor)

Data movers for live transmission and/or live monitoring

These modules provide real time transmission of raw image data and metadata for live display and onward transmission. Different versions can deliver full quality or monitor quality as required.

Render module

This module is used for live production and is inserted in the data path at or before the vision mixer/switcher. It applies the accumulated LUTs in a system such as Ellcami and renders the image into a conventional HD-SDI format.

The module also serves a variety of post production roles, including:

· Feeding non-render capable monitors (e.g. consumer sets in offices or viewing rooms)

· Preparing dailies materials for use in editing systems

For Post Produced materials

Since the camera system tracks metadata with image files throughout the workflow, post production is quicker and therefore cheaper. For final release, images are rendered in the finishing stage or Digital Intermediate as appropriate.

Networking the Camera

The sea change between today’s cameras and the camera system is that image processing is moved from the camera into the system. This is enabled by networking the camera and re-assessing its role as a source of raw image data rather than a source of preprocessed image data with baked in LUTs.

RAW output from the camera can be transferred in real time using either Fibrechannel or 10Gbps Ethernet (10GigE). Fibrechannel is already used extensively for high data rate operations streaming synchronous data to and from spinning media. 10GigE can be used similarly, provided the network is configured appropriately: isolating camera data transmission from camera control and metadata traffic, using a non-blocking switch, and generally avoiding contention with other traffic.

The next two sections illustrate how the networked camera is used.

In the Studio

As an example the following diagram shows how an HDC1500 might be used in a studio configuration where the show is recorded as data files for subsequent post production.

Figure 3. File based configuration for HDC1500

Key aspects of this configuration are:

· Multiple levels of image processing

· Baked in LUT reduces downstream options

· Data format conversion from HD-SDI to file based

· Expensive dedicated ingest hardware (Omneon Spectrum in this example)

This next figure shows how this configuration looks with the camera system.

Figure 4. camera system for Studio Recording

Aspects of this configuration are:

· Versatile control of the camera by both Remote Control Module and DP Interface

· Direct data transfer of image files over network to recording module

· The network server can be constructed from commodity IT components running specialized Sony software.

On Location

A key aspect of shooting on location is the difficulty of getting sufficient bandwidth back to the post production facility so that post production can proceed. This next configuration takes advantage of the camera’s ability to opportunistically dump content from its attached solid state storage while continuing to shoot. In this case the content dump is to cloud storage to which the post production facility is also connected.

Figure 5. The camera attached to cloud storage

Conclusion: What does this mean for Sony’s products?

The camera breaks new ground, using modular system design to produce the best possible image quality with advanced workflows. It is designed to reach far beyond where Red has set the bar in file based camera technology: an uncluttered image capture device that defers image processing (applying LUTs, etc.) to downstream modules in the system.

The camera will be cheaper to make than its equivalent in today’s product line because of the reduced processing in the camera itself; however, Sony cannot rely on this to increase margins. Rather, it will allow Sony to continue to compete with new cameras like the Red Epic by offering not only better imaging technology but also a superior workflow.

Sony’s product lines are already moving from dedicated hardware to software processing on multipurpose platforms like Ellcami. The camera system will go further. Ellcami has an important role to play in image processing, but for functions like data recording, Sony’s new products will in part be software running on hardware built from commodity IT components.