Digital Marketplaces Unleashed
In contrast to these solutions, which are partly promoted with the marketing buzzword “cloud” but do not always represent “real” cloud applications (at most, they can be assigned to the private cloud category), several vendors – some of them new to the market – offer “real” cloud services for media production. These services either cover fundamental functions of media production, such as encoding, transcoding or quality control, or provide more comprehensive cloud‐based solutions built on top of such fundamental functional services.
Examples of the former are “QCloud” by Tektronix, “Vidcheck OnDemand” by Vidcheck and the “MXF Analyser Cloud Service” by IRT for quality control, or “Zencoder” by Brightcove and “Vantage Cloud” by Telestream for transcoding. Cloud services for transcoding and quality control sometimes rely completely on cloud‐based workflows (e. g. QCloud processes media content in Amazon “EC2” instances and stores it in the Amazon cloud (“S3”)) but can also be realized as a hybrid scenario. These services are typically integrated via an API and use a pay‐per‐use billing model.
An example of solutions built on top of fundamental cloud services is “Pageflow”, a production and publication platform for multimedia content developed in collaboration with the German public‐service broadcaster Westdeutscher Rundfunk (WDR). It also uses Amazon’s “S3” cloud storage to store its media content (audio, video, pictures). From there the content is transferred to Brightcove’s “Zencoder” cloud service, where it is transcoded for various end devices (e. g. different formats, codecs, resolutions). “Pageflow” can be used as an entirely cloud‐hosted service at various tariffs or – thanks to its open source code – can be installed on one’s own servers for a hybrid cloud scenario. Another example of such solutions is Cube‐Tec’s “MXF Legalizer”, a tool that – in addition to quality control – repairs files in the cloud that are detected as damaged.
Established manufacturers like Imagine Communications are also moving to the cloud. They offer products that are purely cloud‐based, like CloudXtream, an end‐to‐end solution for media content distribution and monetization which comprises (even) live video, video on demand (VOD), cloud DVR (cDVR), dynamic ad insertion (DAI), packaging, encoding, transcoding, storage management, and cloud orchestration. Other services already exist as products or appliances and are now additionally offered as cloud services, such as Versio, a channel playout solution, or SelenioFlex for live encoding. All tools are offered as services on the Microsoft Azure cloud.
Two other providers of SaaS solutions for the media industry that rely on “Amazon Web Services” as the underlying cloud platform are Reelway with their product of the same name and Sony with the product “Ci”. Both solutions offer CMS (Content Management System) or MAM (Media Asset Management) functionality for media content stored in the cloud. “Reelway” offers various MAM functions such as cataloguing and the organization of and search for media assets. In addition, it offers a rough cut editor, the possibility to export edit decision lists (EDLs) and other features. Sony’s “Ci”, on the other hand, provides a number of functions for collaborative work in the acquisition domain along with the storage and organization of the clips. By means of specially developed camera adapters, proxies can be uploaded directly into the cloud from the filming location via wireless networks and then annotated, exchanged and edited (rough cut) with the help of the Ci applications.
“ioGates” regards itself as a MAM addition that aims to simplify complicated conversions and FTP‐based transfers by storing and processing the media content in the cloud. The focus lies on workflows in the TV service production segment, where it is necessary to coordinate or optimize processes across several service providers. The BMBF‐funded project “dwerft – linked film and tv services”, of which IRT is a partner, is doing research in this field. One of the project’s goals is the development of an extensible knowledge representation (ontology) that shall improve the lossless exchange of information (metadata) between different media production applications and services.
“Make.TV” [8] is a live video cloud solution that provides users with a cloud‐based live video production toolset. With this set of tools, consisting of “Acquire”, “Selector”, “Manager” and “Playout”, the user works in a shared environment to acquire, manage and distribute (near) live or recorded videos on the Internet, fully realized in the public cloud (see Fig. 64.3). For acquisition, different devices (smart phones and tablets using the “Streamtag App”, PCs, but also professional cameras with downstream and external encoders) can be used to stream signals into the live video cloud. Inside the live video cloud, these signals can be acquired, monitored, and routed by the “Acquire” tool. For organizing, selecting, monitoring, and routing live video feeds, “Selector” provides the user with a video router inside the cloud. Management tasks such as browsing, filtering, trimming, and downloading video feeds, as well as functions to receive alerts, transfer video feeds to MAM/PAM systems and ingest recorded video feeds, are provided by the “Manager” tool. For distribution, the “Playout” tool provides the user with a playout inside the cloud to schedule 24/7 live streams and distribute content simultaneously on multiple platforms, e. g. live via the Internet to the end customer, transferred to the broadcaster’s own production infrastructure or directly to a CDN. Simultaneously, input signals or the final production can be archived in a simple CMS in the cloud. The different tools are controlled via a browser‐based application on a tablet or PC using the proxy streams transcoded live in the cloud. In addition, signal paths for communication via intercom or chat between the production participants and red light signals (tally) are available. Of course, the produced signal can also be fed into existing production systems via a decoder gateway.
Therefore, the critical signal paths for maximum quality of the final product are those from the source signals to the mixer in the cloud and, of course, the distribution path to the signal destination. Note, however, that “live” is not equivalent to “low latency”.
Fig. 64.3 Make.TV Overview. (© http://www.make.tv)
Other examples of providers with a similar concept are VJU iTV Development GmbH with its product Streemfire, Media IT Profy with its product Broadcast.me or SintecMedia with its product CloudOnAir.
Last but not least, with their “Media Services” or “Media Tools”, Microsoft, Amazon and Google provide a range of different cloud services via their marketplaces that can be useful for broadcasters. While these are mainly SaaS offerings in the case of Google (e. g. research and verification with “Google Search”, visualizations with “Google Maps” and “Google Earth”, publication with “Google+” and live streaming with “YouTube” etc.), Microsoft with its cloud platform “Azure” offers services for the editing and preparation of content (live ingest, live encoding, upload, encoding, conversion, live streaming etc.). Azure also allows broadcasters to develop their own cloud‐based applications according to their requirements – even across several layers, for example as a hybrid solution. Such platforms can support broadcasters and media companies in creating custom SaaS applications according to their needs.
Furthermore, SaaS services for file synchronization and sharing that do not specifically target the media industry are becoming increasingly popular in media production as well. Thanks to “Dropbox” and others, these services have conquered the market in a very short time, have become part and parcel of everyday life and have also gained tremendous importance for broadcasting. Especially for journalists and editors who travel a lot and need access to their data anytime and anywhere, such a service enormously facilitates work. Data can be synchronized between different devices and shared with others, and collaborative work on documents becomes possible. However, recent data‐related scandals (cf. the iCloud hack in 2014) increase doubts about the security and reliability of such solutions, since the data is usually kept on the (cloud) provider’s servers and thus eludes direct control by the user. These and other questions about security and data protection still seem to be unresolved and therefore require special attention, especially when it comes to usage in broadcasting and journalism. At the same time, companies like Uniscon develop systems specifically designed to provide very high security for data stored in the cloud.
Products available on the market today already demonstrate that parts of professional media production can be implemented through Internet‐ or cloud‐based applications. This is especially the case for applications and services that have no special hardware requirements and need no, or only minimal, installation of client‐specific software. The tools are also usually designed so that the applications can be used without extensive training. Often, however, compromises (still) have to be made, especially with regard to quality and stability. In the short to medium term, cloud services – especially if used via public networks – do not yet represent a full replacement for current production infrastructures and at best serve as a supplement in appropriate use cases. On the other hand, many of today’s solutions demonstrate that in the long term more of these tools will be used in production, since they, above all, meet creative needs and are thus increasingly requested by the editorial departments.
Business and integration models are still to be confirmed. Many solutions are offered as complete cloud solutions with no or rather simple programming interfaces, e. g. for upload and download. They can be used quite easily as an extension to an existing broadcasting infrastructure. Connecting single modular media services from e. g. Azure, Amazon AWS, or Google and orchestrating them into an effective production workflow is, however, a much more complex integration task that requires in‐depth knowledge and a clear vision of how the final infrastructure of a media house should look, including any metadata. Cloud‐based services typically charge for CPU usage, CPU up‐time, and the amount of data downloaded rather than data uploaded. It is obvious that costs in a self‐orchestrated infrastructure are more complex to predict. Moreover, the cost model for cloud services drives the design of applications and workflows. Thus, today’s charging models may undergo substantial changes in the future.
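The three cost components mentioned above – CPU usage, instance up‐time and data download – can be made concrete with a small estimator. The unit prices below are purely illustrative placeholder values, not any provider’s actual tariff:

```python
# Illustrative pay-per-use cost estimator for a self-orchestrated cloud
# transcoding workflow. All prices are made-up placeholders; real tariffs
# differ per provider and change over time.

PRICE_CPU_HOUR = 0.05      # assumed price per CPU-hour of transcoding
PRICE_UPTIME_HOUR = 0.02   # assumed price per hour an instance runs
PRICE_EGRESS_GB = 0.09     # assumed price per GB downloaded
                           # (upload is often free of charge)

def estimate_cost(cpu_hours, uptime_hours, egress_gb):
    """Sum the three typical pay-per-use components."""
    return (cpu_hours * PRICE_CPU_HOUR
            + uptime_hours * PRICE_UPTIME_HOUR
            + egress_gb * PRICE_EGRESS_GB)

# Example: 100 h of CPU time, instances kept up for 120 h, 500 GB delivered
cost = estimate_cost(100, 120, 500)
print(f"estimated monthly cost: {cost:.2f} EUR")
```

Even this toy model shows why self‐orchestrated costs are hard to predict: the egress term depends on end‐user demand, which the media house does not control.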
64.3.2 Media Distribution
Cloud services have already been deployed very successfully in the field of media distribution. In fact, some cloud providers started to build up infrastructures for their own media distribution business and only later decided to offer parts of that infrastructure to the public for more general use. At first, those services focused on the transport of data, but in the meantime such offerings can consist of several cloud‐based services. The range extends from processing of video and audio (encoding and/or transcoding) to data storage and distribution via different streaming technologies, all the way to monitoring and troubleshooting. So‐called content delivery networks (CDNs) were created to deliver data reliably, securely and fast to the end user. Using optimized transport mechanisms (server management, routing, caching), the different kinds of data (video/audio, email, web pages etc.) are routed through several networks to the end user.
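The caching part of these transport mechanisms can be illustrated with a minimal sketch: an edge node keeps the most recently requested objects in a bounded LRU cache, so popular content is served locally instead of being fetched from the origin server on every request. This is a deliberately simplified model, not a real CDN implementation:

```python
from collections import OrderedDict

class EdgeCache:
    """Minimal LRU cache modeling a CDN edge node (illustrative only)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()
        self.hits = self.misses = 0

    def get(self, url, fetch_from_origin):
        if url in self.store:
            self.hits += 1
            self.store.move_to_end(url)       # mark as most recently used
            return self.store[url]
        self.misses += 1
        data = fetch_from_origin(url)         # expensive origin round trip
        self.store[url] = data
        if len(self.store) > self.capacity:   # evict least recently used
            self.store.popitem(last=False)
        return data

# Simulate a skewed request pattern: a few popular clips dominate traffic
cache = EdgeCache(capacity=2)
origin = lambda url: f"<video bytes of {url}>"
for url in ["a", "b", "a", "a", "c", "a", "b"]:
    cache.get(url, origin)
print(f"hits: {cache.hits}, misses: {cache.misses}")
```

The skew in real media traffic (a small share of clips attracting most views) is exactly what makes edge caching effective.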
Today, CDN services are essential for many content providers (e. g. VoD providers but also providers of linear programs) to handle the huge number of requests from their users. YouTube, for example, has more than a billion users worldwide. That corresponds to almost one‐third of all Internet users. Every day, YouTube clips with a total duration of several hundred million hours are played and billions of views are generated [9]. In Germany, major sports events like soccer championships or the Olympic Games regularly result in peak loads which the German public broadcasters have to handle. And the trend still points upwards: compared with the soccer World Cup in 2014, twice as much traffic was streamed during the European soccer championship in 2016.
In the field of video streaming, cloud encoding and transcoding have become increasingly popular. Apart from professional hardware encoders, more and more cloud encoding services are used. Whereas hardware‐based encoders need to be purchased, maintained and housed somewhere, the corresponding cloud services are easily accessible anywhere and anytime and offer a pay‐per‐use billing model. The better scalability of a process chain and the close link to CDNs are very positive effects as well.
Today, content providers have to make their content available in many different formats due to the constantly changing and versatile end user devices as well as the varying means of access to the content (e. g. smart TV, smart phone, laptop). Therefore, the streams have to be provided in various formats. VoD (Video on Demand) or live streams via e. g. HDS, HLS, MPEG‐DASH or HbbTV in SD, HD or even UHD are just a few of the requirements providers have to meet to remain competitive. The need for more scalability and large data volumes causes more and more content providers to move into the cloud.
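Providing the same content in several renditions typically means publishing a manifest that lists all available variants, from which the player picks one according to the device and network conditions. As a sketch, the following generates a minimal HLS master playlist (syntax per RFC 8216); the bitrate ladder and variant URIs are placeholder assumptions:

```python
# Assumed rendition ladder: (bandwidth in bit/s, resolution, variant URI).
# Values are illustrative, not a recommendation for any specific service.
LADDER = [
    (800_000,   "640x360",   "360p.m3u8"),
    (2_500_000, "1280x720",  "720p.m3u8"),
    (5_000_000, "1920x1080", "1080p.m3u8"),
]

def master_playlist(ladder):
    """Build a minimal HLS master playlist listing all variant streams."""
    lines = ["#EXTM3U"]
    for bandwidth, resolution, uri in ladder:
        lines.append(
            f"#EXT-X-STREAM-INF:BANDWIDTH={bandwidth},RESOLUTION={resolution}"
        )
        lines.append(uri)
    return "\n".join(lines) + "\n"

print(master_playlist(LADDER))
```

MPEG‐DASH follows the same principle with an XML manifest (MPD) instead of the playlist text format.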
As more and more parts of the distribution workflow are shifted to the cloud, it is essential for content providers to regain a certain degree of control over their streaming services, not least to guarantee the best possible service for the end user (cf. QoS – Quality of Service). Comprehensive monitoring from encoding to the end user is the only option to identify errors or problems and to react as quickly as possible. Some high‐end products are even a step ahead: they are able to respond nearly in real time and, if necessary, take measures to prevent problems or improve the situation at the end user side.
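One common end‐user indicator such monitoring derives is the rebuffering ratio, i.e. the share of a playback session spent stalled instead of playing. A minimal sketch of the computation; the event model (playback reports as play/stall intervals) is an assumption for illustration, not the scheme used by any particular monitoring product:

```python
# Each event: (type, duration in seconds). "play" = smooth playback,
# "stall" = a buffering interruption. The event model is illustrative.
def rebuffering_ratio(events):
    """Fraction of total session time spent rebuffering."""
    played = sum(d for t, d in events if t == "play")
    stalled = sum(d for t, d in events if t == "stall")
    total = played + stalled
    return stalled / total if total else 0.0

# Example session: two stalls during roughly seven minutes of playback
session = [("play", 120), ("stall", 3), ("play", 300), ("stall", 2)]
print(f"rebuffering ratio: {rebuffering_ratio(session):.1%}")
```

Aggregating such per‐session metrics across all viewers, regions and CDNs is what allows a provider to localize a fault, e. g. to a single edge cluster or encoder.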
Such business models are already pursued by several companies, with Conviva and Akamai as the main players. However, they do not necessarily meet all the requirements of e. g. the German public broadcasters. Therefore, IRT together with “Das Erste” developed a QoS monitoring plugin (ARGOS) [10] which suits the broadcasters’ needs as well as data protection regulations.
64.4 The Viewer and the Cloud
Of course, viewers and consumers have been using the cloud for a long time – consciously or, probably, unconsciously – in the form of social networks; services such as “Dropbox”, “Google Drive”, “OneDrive”, “Amazon Cloud Drive”; Google, Apple and Microsoft accounts for fixed or mobile devices; online video recorders and video stores or vendor‐specific accounts for smart TVs. Technically skilled users operate their own home NAS servers or streaming servers and make them available “on the go”. There is a fundamental need to have “one’s own” photos, video clips or TV content available anywhere and anytime. An increasing number of end devices include cloud‐based software solutions along with the hardware – from photo cameras to video editing to heating, lighting and TV control via home automation in the cloud. “User‐generated content” is distributed on virtual channels via “YouTube” and is marketed in a highly profitable way. Video and audio consumption over CDNs is widely established. In those cases, the main task of the cloud is to accept and store data, to transcode it into various representations and to deliver it as an uninterrupted stream.
However, cloud offerings are starting to reach clearly beyond such scenarios, especially when user or device identifications – or user data in general – become an integral part of the service: The “Second Screen Framework” of IRT and T‐Systems, developed for the ARD.Mediathek and “rbbtext” [11], establishes a virtual connection between smart TVs and mobile devices as a 2nd screen irrespective of the manufacturer. The ADAMS project [12] has developed cloud‐based mechanisms to continue media consumption across various devices while the user is moving through his home or within a city. CE manufacturers are starting to interconnect their media devices with household appliances in smart homes – in fact, these household devices themselves are more and more becoming media devices. Data exchange is usually handled through a public cloud operated by manufacturers, telecom companies and others.
There are various business models: sometimes cloud usage is free for the lifetime of the purchased device that connects into the cloud, and in some cases the user has to pay a regular fee. Models where the costs depend on usage and the amount of data are not yet common, but may emerge with services that require more intense data exchange.
In all cases, service integrity, data security and privacy are significant risks. Data transfers and logins to the cloud can be subject to hacking attacks. For many services, the user even accepts that the cloud operator analyses the user’s data and behavior as a basis for further business, i. e. the user “pays with his data”. However, experience tells us that viewers are willing to sacrifice personal data and take on risks in exchange for comfort and convenience. It is precisely this convenience and comfort at a comparably low cost, coupled with interesting performance features, that the consumer is already used to and that professional (media) applications will be measured against in the future, because A/V acquisition, storage, processing and distribution in high quality have long ceased to be a distinguishing feature of large media companies and can in principle be acquired from the cloud with a mouse click by anybody.
64.5 The Cloud from a Legal and Privacy Perspective
64.5.1 The Legal Background
(Public) cloud computing and the protection of personal data are two subjects that have triggered many discussions recently. Especially the Snowden revelations and the latest spectacular court decisions stirred up the controversy (cf. United States of America vs. Microsoft Corporation). The underlying issue is that cloud computing does not care about national borders. On the one hand, the cloud logic implies that the location of data processing is not guaranteed in advance. On the other hand, jurisdiction is traditionally bound to the location where the data processing takes place. Thus, up to now it was common practice for globally operating cloud providers that the location of the corporate headquarters determined the jurisdiction, and not the location of the data center itself. This was particularly important with regard to subcontractor structures. The latest court ruling in the so‐called “Microsoft Ireland” case between the Microsoft Corporation and the United States of America started to change that practice. The case began in 2013, when a warrant was issued by a New York court requesting Microsoft to hand over emails and private information associated with a certain account hosted in a Microsoft data center in Dublin. After Microsoft lost the first trial in 2014, the U.S. Court of Appeals for the 2nd Circuit ruled on July 14, 2016 that Microsoft is not required to comply with the warrant. The ruling is seen as a victory for the whole industry and as a landmark decision. During the trial, Microsoft got a lot of support from the industry (Amazon, Apple), the media (CNN, Washington Post) and even from the Irish and European governments. Now it remains to be seen what impact and consequences this ruling will have on jurisdiction and on the practice of global cloud computing services.