Chapter 10
Multimedia Cloud
• For applications that use the Platform-as-a-service (PaaS) cloud service model, the architecture
and deployment design steps are not required since the platform takes care of the architecture
and deployment.
• Component Design
• In the component design step, the developers have to take into consideration the platform-specific features.
• Platform Specific Software
• Different PaaS offerings such as Google App Engine, Windows Azure Web Sites, etc., provide platform specific
software development kits (SDKs) for developing cloud applications.
• Sandbox Environments
• Applications designed for specific PaaS offerings run in sandbox environments and are allowed to perform
only those actions that do not interfere with the performance of other applications.
• Deployment & Scaling
• Deployment and scaling are handled by the platform, while the developers focus on application development using the platform-specific SDKs.
• Portability
• Portability is a major constraint for PaaS-based applications, as it is difficult to move an application built with one platform's SDKs and services to a different platform (a minimal platform-specific example is sketched below).
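• As an illustration of the platform-specific development model described above, here is a minimal sketch of a web handler that could run on Google App Engine's Python 3 standard runtime. The route and response text are placeholders; an accompanying app.yaml declares the runtime, and deployment and scaling are then handled by the platform rather than by the developer.

    # main.py -- minimal sketch of an App Engine (Python 3 standard) handler.
    # App Engine's standard runtime serves a WSGI app object named `app`;
    # Flask is a common choice.  Route and response text are placeholders.
    from flask import Flask

    app = Flask(__name__)

    @app.route("/")
    def hello():
        # The platform (not the developer) decides how many instances run this.
        return "Hello from a PaaS-hosted application"

    # An app.yaml alongside this file declares the runtime, e.g.:
    #   runtime: python39
    # Deployment is a single platform command (gcloud app deploy);
    # scaling and load balancing are handled by App Engine.
    if __name__ == "__main__":
        # Local development server only; App Engine ignores this block.
        app.run(host="127.0.0.1", port=8080, debug=True)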
• Infrastructure Services
• In the Multimedia Cloud reference architecture, the first layer is the
infrastructure services layer that includes computing and storage
resources.
• Platform Services
• On top of the infrastructure services layer is the platform services
layer that includes frameworks and services for streaming and
associated tasks such as transcoding and analytics that can be
leveraged for rapid development of multimedia applications.
• Applications
• The topmost layer consists of applications such as live video streaming, video transcoding, video-on-demand, multimedia processing, etc.
• Cloud-based multimedia applications alleviate the burden of installing and maintaining multimedia applications locally on the multimedia consumption devices (desktops, tablets, smartphones, etc.) and provide access to rich multimedia content.
• Service Models
• A multimedia cloud can have various service models such as IaaS,
PaaS and SaaS that offer infrastructure, platform or application
services.
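• To make the layering above concrete, the sketch below is a toy model (all class and service names are made up for illustration) of how an application such as video-on-demand composes platform services that in turn run on infrastructure resources.

    # Illustrative only: a toy model of the three-layer multimedia cloud stack.
    # All class and service names are made up for this sketch.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class InfrastructureServices:          # Layer 1: compute and storage
        compute_instances: int = 2
        storage_buckets: List[str] = field(default_factory=lambda: ["media-store"])

    @dataclass
    class PlatformServices:                # Layer 2: streaming, transcoding, analytics
        infrastructure: InfrastructureServices
        services: List[str] = field(default_factory=lambda: ["streaming", "transcoding", "analytics"])

    @dataclass
    class Application:                     # Layer 3: e.g. video-on-demand
        name: str
        platform: PlatformServices

    vod_app = Application("video-on-demand", PlatformServices(InfrastructureServices()))
    print(vod_app.name, "uses", vod_app.platform.services)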
• Real Time Messaging Protocol (RTMP) is a protocol for streaming audio, video and data over the Internet.
• The plain version of the RTMP protocol works on top of TCP. RTMPS is a secure variant of RTMP that works over TLS/SSL.
• RTMP provides a bidirectional message multiplex service over a reliable stream transport, such as TCP.
• RTMP maintains persistent TCP connections that allow low-latency communication.
• RTMP is intended to carry parallel streams of video, audio, and data messages, with associated timing
information, between a pair of communicating peers.
• Streams are split into fragments so that they can be delivered smoothly.
• The size of the stream fragments is either fixed or negotiated dynamically between the client and server.
• Default fragment sizes used are 64 bytes for audio data and 128 bytes for video data.
• RTMP implementations typically assign different priorities to different classes of messages, which can
affect the order in which messages are enqueued to the underlying stream transport when transport
capacity is constrained.
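• The chunking and multiplexing behaviour described above can be pictured with the conceptual sketch below: messages are split into fragments of the default sizes and interleaved onto a single ordered transport, roughly what an RTMP peer does over its TCP connection. This is an illustration only, not an implementation of the RTMP wire format or its priority rules.

    # Conceptual sketch of RTMP-style chunking/multiplexing; not the real wire format.
    AUDIO_FRAGMENT_SIZE = 64    # default fragment size for audio messages (bytes)
    VIDEO_FRAGMENT_SIZE = 128   # default fragment size for video messages (bytes)

    def split_into_fragments(payload: bytes, fragment_size: int):
        """Split a message payload into fixed-size fragments."""
        return [payload[i:i + fragment_size]
                for i in range(0, len(payload), fragment_size)]

    def multiplex(audio_msg: bytes, video_msg: bytes):
        """Interleave audio and video fragments onto one ordered stream.

        Real RTMP implementations additionally prioritise classes of messages
        when transport capacity is constrained.
        """
        audio = split_into_fragments(audio_msg, AUDIO_FRAGMENT_SIZE)
        video = split_into_fragments(video_msg, VIDEO_FRAGMENT_SIZE)
        interleaved = []
        while audio or video:
            if audio:
                interleaved.append(("audio", audio.pop(0)))
            if video:
                interleaved.append(("video", video.pop(0)))
        return interleaved

    # Example: a 200-byte audio message and a 300-byte video message
    fragments = multiplex(b"\x00" * 200, b"\x01" * 300)
    print([(kind, len(frag)) for kind, frag in fragments])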
• HTTP Live Streaming (HLS) can dynamically adjust playback quality to match the available speed of
wired or wireless networks.
• HLS supports multiple alternate streams at different bit rates, and the client software can switch
streams intelligently as network bandwidth changes.
• HLS also provides for media encryption and user authentication over HTTPS, allowing publishers to
protect their work.
• The protocol works by splitting the stream into small chunks which are specified in a playlist file.
• A playlist file is an ordered list of media URIs and informational tags.
• The URIs and their associated tags specify a series of media segments.
• To play the stream, the client first obtains the playlist file and then obtains and plays each media
segment in the playlist.
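• A minimal sketch of that client behaviour, assuming a hypothetical playlist URL and using only the Python standard library: fetch the playlist, collect the media segment URIs, and fetch each segment in order. A real player would also parse tags such as #EXT-X-STREAM-INF to switch between alternate bit-rate streams as bandwidth changes.

    # Minimal HLS client sketch: fetch a playlist, then fetch each media segment in order.
    # The playlist URL is a hypothetical placeholder.
    from urllib.parse import urljoin
    from urllib.request import urlopen

    PLAYLIST_URL = "http://example.com/stream/playlist.m3u8"  # placeholder

    def fetch_text(url: str) -> str:
        with urlopen(url) as resp:
            return resp.read().decode("utf-8")

    def media_segment_uris(playlist_text: str):
        """Return media URIs from a playlist; lines starting with '#' are tags."""
        return [line.strip() for line in playlist_text.splitlines()
                if line.strip() and not line.startswith("#")]

    playlist = fetch_text(PLAYLIST_URL)
    for uri in media_segment_uris(playlist):
        segment_url = urljoin(PLAYLIST_URL, uri)        # URIs may be relative
        segment = urlopen(segment_url).read()           # a player would decode/play this
        print(f"fetched {segment_url}: {len(segment)} bytes")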
• HTTP Dynamic Streaming (HDS) enables on-demand and live adaptive bitrate video delivery of standards-based MP4 media (H.264 or VP6) over regular HTTP connections.
• HDS combines the advantages of HTTP (progressive download) and RTMP (streaming download), providing the ability to deliver video content in a streaming manner over HTTP.
• HDS supports adaptive bitrate streaming, which allows HDS to detect the client's bandwidth and computing resources and serve content fragments encoded at the most appropriate bitrate for the best viewing experience (a selection sketch follows this list).
• HDS supports high-definition video up to 1080p, with bitrates from 700 kbps up to and beyond 6
Mbps, using either H.264 or VP6 video codecs, or AAC and MP3 audio codecs.
• HDS allows leveraging existing caching infrastructures, content delivery networks (CDNs) and
standard HTTP server hardware to deliver on-demand and live content.
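• The adaptive-bitrate logic referenced above amounts to picking, for each fragment request, the highest-bitrate rendition the client can sustain. The sketch below shows one plausible selection rule; the rendition ladder and the safety margin are illustrative assumptions, not part of the HDS specification.

    # Illustrative adaptive-bitrate selection; renditions and margin are assumptions.
    RENDITION_BITRATES_KBPS = [700, 1500, 3000, 6000]   # available encodings, e.g. up to 1080p
    SAFETY_MARGIN = 0.8                                  # use only ~80% of measured bandwidth

    def choose_bitrate(measured_bandwidth_kbps: float) -> int:
        """Pick the highest rendition that fits within the usable bandwidth."""
        usable = measured_bandwidth_kbps * SAFETY_MARGIN
        candidates = [b for b in RENDITION_BITRATES_KBPS if b <= usable]
        return max(candidates) if candidates else min(RENDITION_BITRATES_KBPS)

    # The client re-evaluates this per fragment as its bandwidth estimate changes.
    for bandwidth in (900, 2500, 8000):
        print(bandwidth, "kbps ->", choose_bitrate(bandwidth), "kbps rendition")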
• Functionality
• The live video streaming application allows on-demand creation of video streaming instances in the cloud.
• Development
• The live streaming application is created using the Django
framework and uses Amazon EC2 cloud instances.
• For video stream encoding and publishing, the Adobe Flash Media Live Encoder and Flash Media Server are used.
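• A minimal sketch of the on-demand instance creation using boto3 follows; the AMI ID, key pair and security group are placeholders (assumptions), and the real application would use an image preconfigured with Flash Media Server and typically invoke this from a Django view.

    # Sketch: launch an EC2 instance for a new streaming session using boto3.
    # The AMI ID, key name and security group are placeholders (assumptions).
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    response = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",   # placeholder: image with Flash Media Server installed
        InstanceType="m3.medium",          # placeholder instance size
        MinCount=1,
        MaxCount=1,
        KeyName="streaming-key",           # placeholder key pair
        SecurityGroupIds=["sg-0123456789abcdef0"],  # placeholder: opens RTMP/HTTP ports
    )

    instance_id = response["Instances"][0]["InstanceId"]
    print("Launched streaming instance:", instance_id)
    # The application would then wait for the instance to become reachable and
    # point Flash Media Live Encoder at the server's RTMP publish URL.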
• Functionality
• The video transcoding application is based on the multimedia cloud.
• The transcoding application allows users to upload video
files and choose the conversion presets.
• Development
• The application is built upon the Amazon Elastic
Transcoder.
• Elastic Transcoder is a highly scalable, easy-to-use service from Amazon that converts video files from their source format into versions that will play back on devices such as smartphones, tablets and PCs.
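• A minimal sketch of submitting a transcoding job with boto3's Elastic Transcoder client follows; the pipeline ID, S3 object keys and preset ID are placeholders (assumptions), and in the real application they would come from the user's upload and the chosen conversion preset.

    # Sketch: create an Elastic Transcoder job with boto3.
    # Pipeline ID, object keys and preset ID are placeholders (assumptions).
    import boto3

    transcoder = boto3.client("elastictranscoder", region_name="us-east-1")

    job = transcoder.create_job(
        PipelineId="1111111111111-abcde1",          # placeholder pipeline (links input/output S3 buckets)
        Input={"Key": "uploads/source-video.mp4"},  # key of the uploaded file in the input bucket
        Output={
            "Key": "converted/source-video-720p.mp4",
            "PresetId": "1351620000001-000010",     # placeholder preset, e.g. a generic 720p preset
        },
    )

    print("Transcoding job submitted:", job["Job"]["Id"])
    # The application can poll read_job(Id=...) or use notifications configured on the
    # pipeline to detect when the converted file is available in the output bucket.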