Sunday, March 6, 2011

A few hundred of my closest strangers...

When I first heard about Twitter, I thought it was similar to Facebook's status update feature, but limited to 140 characters. It didn't appeal to me at the time.

Some time later, I noticed that a number of the bloggers I'd been following were making frequent updates on Twitter and only the occasional blog post. I decided to sign up on Twitter, follow the bloggers I liked, and even follow some of the people those bloggers liked.

The benefit of Twitter became clear to me when the annual NAB (National Association of Broadcasters) Conference began in Las Vegas. Many of the people I'd been following were at NAB, tweeting and posting pictures from the show floor. They also linked to press releases or websites that went into depth about interesting products and announcements.

I've expanded my follow list on Twitter considerably since then, and find it invaluable for keeping up with interesting trends in film, media and technology.

Saturday, January 1, 2011

Streamlining the future

I think future productions will have streamlined workflows where many tasks are automated all the way from preproduction to distribution. Much of the automation will happen in the background by collecting and repurposing metadata, which will let software exchange information more efficiently and consistently.

Here are some of my predictions of what could happen in the future. Some of these ideas already exist as products, services or research, and I've listed links.

Pre-production:

Let's start with the script. As a script is being developed and written, everything is tagged as metadata. Since most screenwriting programs already handle formatting, this is a background task the writer isn't aware of until they want to be.

Final Draft 8's file format is XML, so it can be opened in any text editor or imported into other screenwriting programs like Celtx and Adobe Story. This means better collaboration between writers, each using their preferred software.

Final Draft
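As a rough illustration, here's a small Python sketch that pulls scene headings and character names out of an FDX-style file. It assumes a simplified version of the format (Paragraph elements carrying a Type attribute and Text children), so a real file may need some adjustments, and the file name is made up.

# Sketch: summarizing a Final Draft (.fdx) XML script.
# Assumes Paragraph elements with a Type attribute ("Scene Heading", "Character", ...)
# and Text children holding the actual words. Real files may differ.
import xml.etree.ElementTree as ET
from collections import Counter

def summarize_script(fdx_path):
    root = ET.parse(fdx_path).getroot()
    scenes, characters = [], Counter()
    for para in root.iter("Paragraph"):
        text = "".join(t.text or "" for t in para.findall("Text")).strip()
        if not text:
            continue
        if para.get("Type") == "Scene Heading":
            scenes.append(text)
        elif para.get("Type") == "Character":
            characters[text] += 1  # count how often each character speaks
    return scenes, characters

scenes, characters = summarize_script("my_script.fdx")  # hypothetical file name
print(len(scenes), "scenes")
for name, speeches in characters.most_common(10):
    print(name, speeches)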

Collaboration begins with the script stored on a secure central server, where revisions and notes can be appended. Story meetings are held by remote video conferencing while everyone views and modifies the same documents, as in Adobe Story or Celtx Studio. This virtual collaboration will extend into the production meetings.

Adobe Story

Celtx Studio

After approval, script breakdown and budgeting can be streamlined when the tagged data is sorted into locations, cast, props, vfx sequences, etc. The tags are then used to data-mine various databases for props, locations and actors.
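As a sketch of the idea, the snippet below groups tagged elements into a breakdown by category; the sample data is hypothetical, since in practice the tags would come straight out of the tagged script.

# Sketch: turning tagged script elements into a breakdown by category.
# The sample data is hypothetical; a real pipeline would read it from the script's metadata.
from collections import defaultdict

tagged_elements = [  # (scene, category, item)
    ("1", "location", "Kitchen"),
    ("1", "cast", "JANE"),
    ("1", "props", "Coffee pot"),
    ("2", "location", "Alley"),
    ("2", "cast", "JANE"),
    ("2", "vfx", "Rain enhancement"),
]
breakdown = defaultdict(lambda: defaultdict(set))
for scene, category, item in tagged_elements:
    breakdown[category][item].add(scene)  # which scenes need this element
for category, items in sorted(breakdown.items()):
    print(category.upper())
    for item, scenes in sorted(items.items()):
        print(" ", item, "- scenes", ", ".join(sorted(scenes)))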

The producer finds tax incentives, vendor deals, crew/cast hires, etc., and has it all reflected in the budget in real time. Cost overruns can be tracked more precisely, and variations of the budget can be generated more efficiently.

Schedules can be viewed at either a global or departmental level, with real-time updates on tasks.

As actors get cast, they could be face-tagged as their characters. Those photos/videos can then be used for preliminary digital character model building if a digital double is needed.

FaceGen

As locations and props get selected, new photos/videos could be tagged with metadata that is sent to the art director, cinematographer, production designer and visual effects supervisor.

Make3d
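One way to do that tagging, sketched below, is to embed keywords directly into the scouting photos so each department can search for what it needs. This assumes the exiftool command-line tool is installed; the file name and keywords are made up.

# Sketch: embedding searchable keywords into a scouting photo with exiftool.
# Assumes exiftool is installed; the path and keywords are hypothetical.
import subprocess

def tag_photo(path, keywords):
    args = ["exiftool", "-overwrite_original"]
    args += ["-Keywords+=" + kw for kw in keywords]  # append IPTC keywords
    args.append(path)
    subprocess.run(args, check=True)

tag_photo("scouting/kitchen_01.jpg", ["location", "kitchen", "scene-1", "art-dept"])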

Some of those photos/videos could be used by the storyboard/concept artists and previsualization team to develop animatics of key scenes, which gives the vfx supervisor a better idea of what techniques to use for different shots.

Previsualization could play an important part in the story development process. George Lucas did this when developing the Star Wars prequels. As sequences are written, a previs team can animate and edit them together, giving a better idea of how the story is progressing.

Frameforge3D

Poser

SketchUp

The cinematographer could work with the colorist, using the photos/videos, to prep look development for the visual style. When the cinematographer does camera/lens tests, all media and metadata are preserved to test the workflow pipeline. This is when the editorial team is assembled to continue the workflow testing through post.

Iridas Onset

3cp.gammadensity.com


Testing becomes crucial, since many aspects of the pipeline, from cameras to NLEs, are computer based and may have compatibility issues to resolve.

Production:

The script supervisor has a live link to the production database, so all entries are maintained on set and remotely synced with the main database. The camera and sound reports are also synced to the main database.

www.scriptesystems.com

www.script-supervisors.com
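A live link like that doesn't need to be exotic; conceptually it's just posting each log entry to a central service as it's written. Here's a minimal sketch, with a made-up endpoint and field names standing in for whatever the production database actually exposes.

# Sketch: pushing a script supervisor's log entry to a central production database.
# The URL and field names are hypothetical stand-ins for a real service.
import json
import urllib.request

def push_log_entry(entry, endpoint="https://production-db.example.com/api/log"):
    data = json.dumps(entry).encode("utf-8")
    req = urllib.request.Request(endpoint, data=data,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:  # synced to the main database
        return resp.status

push_log_entry({"scene": "12A", "take": 3, "camera_roll": "A007",
                "sound_roll": "S004", "notes": "Preferred take; boom shadow at head."})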

Tablet devices get a real-time feed of video/audio from camera takes, so there's no need to crowd around the video village.

Light Iron Digital has a service called Outpost Mobile Systems, running their custom software called LiVE PLAY, which shows dailies from Red Cinema Cameras playing out to iPads.

LightIron Outpost2


Cards from the digital cinema cameras are offloaded at data stations. The cards are archived to local storage, in addition to being remotely archived over high-speed, encrypted data lines. The metadata from the camera is also archived and synced to the main production database; it includes camera and lens information, as well as information from any motion control device.
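The offload step is mostly about paranoia: copy everything, then verify it with checksums before anyone touches the card again. Here's a minimal sketch of that idea, with made-up paths.

# Sketch: offloading a camera card to local storage and verifying with checksums.
# Paths are hypothetical; a real tool would also log results and handle retries.
import hashlib
import shutil
from pathlib import Path

def md5sum(path, chunk=1 << 20):
    h = hashlib.md5()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def offload(card_root, archive_root):
    card_root, archive_root = Path(card_root), Path(archive_root)
    for src in card_root.rglob("*"):
        if not src.is_file():
            continue
        dst = archive_root / src.relative_to(card_root)
        dst.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dst)                  # copy with timestamps
        assert md5sum(src) == md5sum(dst), src  # verify before the card is recycled

offload("/Volumes/A007", "/mnt/archive/day_03/A007")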

A real-time dailies feed is also sent over high-speed encrypted data lines to the editorial room, along with the script supervisor, camera and sound reports and any camera metadata.

Post-production:

All on-set reports and metadata in the production database are pushed to the editorial room with the dailies. Dailies get face-tagged with character names, as well as scene, shot, take, location and other pertinent data.

The dailies also get tagged with script dialogue that matches the audio, as Adobe OnLocation currently does. OnLocation can even import a script from Adobe Story and match the dialogue with parts of the script.

Adobe Onlocation
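The matching itself can start out simple: compare a rough speech-to-text transcript of each take against the script's dialogue and keep the lines that line up. The sketch below does that with Python's difflib; the transcript and script lines here are made up, not how OnLocation actually works.

# Sketch: matching a rough audio transcript against script dialogue to tag a take.
# The transcript and script lines are hypothetical; a real system would get the
# transcript from speech-to-text and the lines from the tagged script.
import difflib
import string

def normalize(text):
    return text.lower().translate(str.maketrans("", "", string.punctuation)).strip()

def match_lines(transcript, script_lines, threshold=0.65):
    hits = []
    t = normalize(transcript)
    for character, line in script_lines:
        line_norm = normalize(line)
        ratio = difflib.SequenceMatcher(None, line_norm, t).ratio()
        if line_norm in t or ratio >= threshold:
            hits.append((character, line))
    return hits

script_lines = [("JANE", "I told you the coffee pot was broken."),
                ("MARK", "You told me a lot of things."),
                ("JANE", "And you never listen.")]
print(match_lines("you told me a lot of things and you never listen", script_lines))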

The editorial team would use their own asset management system for ingesting, transcoding and tracking assets. There are lots of asset management systems available; some, like Final Cut Server and CatDV Pro, are affordable.

Final Cut Server

CatDV Pro

Tactic

A number of NLEs, like Final Cut Pro and Adobe Premiere Pro, interact very well with other applications in their own product suites, but it's more important that they interact with competing applications, to minimize redundant work when switching between NLEs and other tools.

Adobe Premiere can read the Final Cut Pro XML format, so if the media is accessible to both applications, a Final Cut Pro timeline can be recreated in Premiere.
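To give an idea of how approachable that interchange format is, here's a small sketch that lists the clips in a Final Cut Pro 7 XML (XMEML) export, the same kind of file Premiere can import. It assumes the usual sequence/track/clipitem layout; real exports may differ, and the file name is made up.

# Sketch: listing clips in a Final Cut Pro 7 XML (XMEML) export.
# Assumes clipitem elements with name/start/end children and a file/pathurl;
# real exports may differ. The file name is hypothetical.
import xml.etree.ElementTree as ET

def list_clips(xmeml_path):
    root = ET.parse(xmeml_path).getroot()  # <xmeml version="...">
    for clip in root.iter("clipitem"):
        name = clip.findtext("name", default="?")
        start = clip.findtext("start", default="?")
        end = clip.findtext("end", default="?")
        path = clip.findtext("file/pathurl", default="")
        print(name, "timeline", start + "-" + end, path)

list_clips("rough_cut_v3.xml")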

Third-party developers like Automatic Duck and Intelligent Assistance specialize in these types of tools.

Automatic Duck

Intelligent Assistance

Since many digital cinema cameras have their own proprietary formats, it'll be important to have a high-quality intermediate format for mastering, so camera original material can be transcoded to the intermediate format with its metadata intact. This becomes especially important for visual effects and color grading.

There are some codecs, like CineForm RAW, Avid DNxHD and Apple ProRes, that can be used as an intermediate and will hopefully become more robust in the future.

DNxHD

ProRes

Neo4k Cineform Raw
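As a sketch of what that transcode step could look like, the snippet below batch-converts camera originals to DNxHD with ffmpeg while carrying the container metadata along. It assumes an ffmpeg build with a DNxHD encoder; the folder layout and encoder settings are made up and would need tuning for a real pipeline.

# Sketch: batch-transcoding camera originals to a DNxHD intermediate with ffmpeg.
# Assumes an ffmpeg build with a DNxHD encoder; paths and settings are hypothetical.
import subprocess
from pathlib import Path

def to_dnxhd(src, dst_dir):
    dst = Path(dst_dir) / (Path(src).stem + ".mov")
    subprocess.run([
        "ffmpeg", "-i", str(src),
        "-map_metadata", "0",             # carry container metadata from the source
        "-c:v", "dnxhd", "-b:v", "145M",  # e.g. DNxHD 145 for 1080p at 29.97
        "-pix_fmt", "yuv422p",
        "-c:a", "pcm_s16le",
        str(dst),
    ], check=True)
    return dst

Path("intermediates").mkdir(exist_ok=True)
for clip in Path("camera_originals").glob("*.MOV"):
    to_dnxhd(clip, "intermediates")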

Adobe is working on CinemaDNG, which is similar to the DNG format they developed for still cameras.

Cinema DNG

There are a lot more ideas for new workflows, techniques and technologies that I'll mention in the coming year, but this is a good start.

Tuesday, December 21, 2010

It's a good time for filmmakers. You can go to any mainstream electronics store, pick up an HD video camera, a pretty fast computer, and writing, editing, graphics and audio software at affordable prices, then go online and find hours of tutorials, inspirational short films, and forums to help you learn to make a film.

Now, this isn't anything new. Filmmakers have had affordable gear available for at least a decade. The tools were good and the quality decent to pretty good, but getting exceptional results required a lot of work.

The caveats of course were what you were willing to compromise on. The paradigm was good - cheap - fast... pick two.

As time passed, Moore's Law made things more interesting. For the amount of money we spent 10 to 15 years ago, today we get exponentially more powerful and affordable tools. Our paradigm is still good, cheap and fast, though now we can have all three at various price points.

These days we're seeing movie budgets from $0 to $300 million. The interesting thing is that 4K Red cameras, HD-capable DSLRs and prosumer HD cameras are used on both low- and high-budget productions.

Previously, our choice was to shoot DV, run it through some film-look recipe and transfer to film, ending up with footage that looked gritty and blown out, with obvious video artifacts. It was a compromise that sometimes worked and sometimes didn't.

The current crop of cameras is much closer to a cinematic aesthetic, so most viewers won't notice the difference. Even a consumer camera has some flavor of HD with a 24p mode. Transfer to film is optional, since digital projection is steadily being added to theaters. Even if a theatrical release isn't planned, HD quality can be maintained with both streaming/downloads and Blu-ray. Scaling the image down to something more manageable on YouTube and Vimeo actually enhances the look if done correctly.

On the post side, software like Apple Final Cut Pro and Adobe After Effects matured quickly and was put to use on major feature films and TV series. In response, some of the traditional heavy hitters have been restructuring their pricing. Software-only products like Avid Media Composer, DaVinci Resolve and Autodesk Smoke for Mac are affordable for boutiques and individuals, without compromising on functionality.

Enthusiasm has been high amongst professionals, especially those who opened boutique shops using powerful yet affordable hardware and software.

There has been some curious backlash, though, in forums, blogs and panels. Look around and you'll read about how film is still the gold standard for image acquisition, and how Media Composer is the editing system of choice for most professional feature film editors.

HD DSLRs are the latest tools to get the backlash. Highly compressed codecs, poor image processing and less-than-ideal form factors are some of the complaints. Yet on both YouTube and Vimeo there are many examples of stunning imagery. That seems to be what baffles the pros who don't want to compromise on image fidelity. The supporters of these cameras find they get a beautiful image for a very affordable price, so they are more than willing to deal with the shortcomings.

The uneasiness is to be expected. There are many examples of industries where expensive, unwieldy equipment run by experienced operators was replaced by user-friendly, inexpensive tools. Adding to the uneasiness are the recent closures of some established post houses.

Lost will be the apprentice/mentor model, where someone spends years honing their skills under the tutelage of a seasoned pro. Learning is now a mix of school, online tutorials, videos, books, and hands-on experience.

This DIY, learn-as-you-go method isn't perfect either, as some will just copy tutorials instead of learning the core fundamentals. New methods will emerge for figuring out who's skilled and who isn't.

One of the biggest benefits is an openness to teaching anyone, rather than keeping it a secret art passed only to insiders. Software developers have noticed too, and have been developing their own tutorials and special learning editions of their products. They also offer heavily discounted student editions to help grow their future user base.

Now that anyone can make a movie, there is a flood of content online, good and bad. This is no different from art, music, photography or literature. Not everyone will attempt to make a living at it, and not all of those who try will succeed. It'll definitely be exciting to see what creative work comes from users of these new tools.