
Overview of the Technologies and Trends Driving Innovation in Production and Post

Throughout 2020, the media and entertainment industry has undergone a revolutionary period of rapid change. Many technical achievements and innovations have kept broadcasters on air and allowed production and post operations to continue whilst still safeguarding staff and public health amidst a global pandemic. These unique circumstances have given rise to an extended period of innovation. Of course, the development and adoption of many of these technologies pre-dates the pandemic, but in many cases, COVID-19 has acted as an accelerant. Many of the successes of the past year will be built upon and shape the direction of innovation throughout 2021 and beyond.

With this in mind, it seems apt to take stock of some of the key technology trends in both production and post that are driving the industry forward.

Remote production can mean different things to different people, but it was generally associated with REMI (remote integration model) workflows. However, the increased need to distance crew and operators in 2020 has placed a new focus on technologies that decentralize workflows and allow more operators to move off-premises, ideally working from home or on the move. Secure remote desktop and virtual workstation software such as Teradici, as well as ultra-low-latency KVM solutions, have allowed operators to “remote in” to facilities and use existing production equipment. New possibilities are also being created by vendors’ efforts to add functionality to products and services, such as integration with leading video collaboration platforms like Microsoft Teams and Zoom for use in broadcasts and other productions. As in many other industries, the seismic shift to remote operation at scale seen in the past year has caused a cultural shift in attitudes towards the reliability and openness of remote working. Many mission-critical applications must still remain on-premises, but we expect technology providers to continue placing strategic emphasis on developing their products to facilitate and support even more ambitious remote production workflows in the future.

The use of cloud technologies has been integral to many remote production solutions, but it is also enabling more flexible and scalable opex models for production companies as well as VFX and post houses. Cloud storage for file-based workflows has become the most mature application, thanks to increasing adoption of Microsoft Azure, AWS, and other providers’ services, but it is now becoming easier to lift further processes into the cloud, especially with the maturing ecosystem of IP-native production equipment.

These innovations are helping to realise the goal of a complete end-to-end ecosystem of interoperable IP products, from signal capture all the way through to delivery. For example, the Grass Valley LDX 100 system camera can forgo a traditional base station and output an IP-native video signal directly from the camera, interfacing directly with an IP workflow. The arrival of new cameras such as this is significant because it allows the use of IP from the very beginning of the live video pipeline. More vendors will focus on adding compatibility with key IP protocols like NDI, SMPTE ST 2110, and SRT to gain a competitive advantage, whilst the production and post communities will increasingly leverage the IP workflows and protocols that prove most beneficial to their needs.
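As a concrete illustration of how accessible these IP protocols have become, an SRT contribution feed can be prototyped with commodity tools. The sketch below is a minimal example, assuming an ffmpeg build compiled with libsrt; the receiver hostname and port are hypothetical placeholders.

```shell
# Sender: generate a test pattern and push it over SRT in caller mode.
# Assumes ffmpeg was built with libsrt (verify with: ffmpeg -protocols | grep srt).
ffmpeg -re -f lavfi -i testsrc2=size=1280x720:rate=25 \
       -c:v libx264 -preset veryfast -tune zerolatency \
       -f mpegts "srt://receiver.example.com:9000?mode=caller&latency=120000"

# Receiver: listen for the incoming stream and play it.
# The latency option is specified in microseconds (here 120 ms).
ffplay "srt://0.0.0.0:9000?mode=listener"
```

SRT's recovery latency buffer is the key tuning parameter: a larger value tolerates more packet loss and jitter on the contribution link at the cost of additional end-to-end delay.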

The evolution of IP in media and entertainment production and post will increasingly allow the industry to draw on the benefits of IT technologies developed across other verticals. The convergence of technological innovation across a range of sectors, from mixed reality (XR) and real-time game-engine-powered graphics to the ongoing development of low-latency, IP-centric workflows, has broken down the barrier between physical and virtual realms.

Virtual events, remote crowds, and further experimentation with immersive video formats have all been explored as means to compensate for the absence of large-scale audiences from events during the course of the pandemic. Virtualization, in the sense of replacing dedicated hardware with software or browser-based applications, has been well documented in media production; the concept of virtual production, however, was still in its infancy prior to COVID-19.

One of the more established production roles that illustrates a method of virtual production is that of the esports observer. These crew members perform a similar role to camera operators, tracking key moments in competitive video gaming to give the audience at home the best seat in the house as the action unfolds. Understanding the movement and visual language unique to each game title means observers must be able to anticipate the action and position themselves accordingly, just as a camera operator would on the sidelines of the pitch, but as a spectator inside the game.

The proliferation of video collaboration, remote working, and the continued restrictions on social gatherings in some countries have all meant that an increasing amount of our time and interaction is mediated by digital and virtual environments. More virtual production techniques and roles will emerge to assist in representing XR experiences and interactions on our screens. This can be as simple as screen recording or capturing digital exchanges as they happen online, but the challenge for production and post companies is how they will use technologies to make their coverage of virtual and hybrid interactions feel authentic and visually engaging for viewers.

At the other end of the spectrum, the use of virtual sets is becoming more sophisticated. The first season of Disney’s landmark live-action Star Wars series The Mandalorian is often cited for its pioneering use of camera tracking, LED video walls, and Unreal Engine to wrap and position virtual backdrops around the talent in real time on set. Partnerships between companies like Disguise and Epic Games are helping to push forward the toolset that allows video-over-IP to work alongside virtually produced assets, bringing production and post teams closer together on set.

Lastly, but perhaps most importantly, the continued democratization of production and post tools means that anyone can now be a content creator. Two of the largest social media platforms, Instagram and TikTok, are now aggressively competing with one another to win the online short-form video audience. How are they doing that? Primarily by making their platforms the destination of choice for creators. Both apps now include in-app video editors, and although these are far more rudimentary than industry-standard NLEs like Premiere Pro or Avid Media Composer, they make both apps end-to-end production and publication tools. This will be hugely influential in the development and mindset of the next generation of creatives entering the production and post industry over the next decade. For Gen Z, their first video camera will be their smartphone, and their first experience of editing software will not even be Apple iMovie or Windows Movie Maker, but the UI of their preferred social media platform.

The ability for independent content creators to connect directly with an audience and monetize that relationship will ultimately be the most disruptive force in the video content supply chain, as consumers spend an increasing share of their watch time with user-generated content (UGC). In response, production and post-production companies face the challenge of keeping their services reactive and agile to the needs of a rapidly evolving market as more content owners position D2C offerings. It is clear that, in the current media and entertainment landscape, production and post companies able to keep pace with the current rate of technological innovation will be at an advantage.


Chris Evans

About the author

Chris Evans

Chris specialises in providing market insight and analysis across the professional video technology industry and video content supply chain. Chris draws on a background in video production to apply an end-to-end understanding of workflow, end-user needs, and product specific knowledge across a range of research methodologies and services.

His areas of expertise include: cloud technologies in live broadcast; virtual and remote production; user generated content and live streaming; the sustainable future of the video entertainment industry; large format and >4K video acquisition; vertical specific use cases for pro video products and services.

Chris joined Futuresource in 2017 as a member of the broadcast equipment team. As video technologies have proliferated into an everyday tool for a diversity of professional applications, Chris has taken leadership of Futuresource’s Professional Video services. Chris holds a Bachelor of Arts (BA) in Film and English from the University of Southampton.


