
Multimedia Assignment 2 Hasham

1. Write about all video formats, their structure, and the things they store.
 Flash Video Format (.flv)
Because of the cross-platform availability of Flash video players, the Flash video format
has become increasingly popular.
Flash video is playable within Flash movie files, which are supported by practically
every browser on every platform. Flash video is compact: it uses compression from On2
and supports both progressive and streaming downloads.
 AVI Format (.avi)
The AVI format, which stands for Audio Video Interleave, was developed by Microsoft.
It stores data that can be encoded with a number of different codecs and can contain both
audio and video data. The AVI format usually uses less compression than some similar
formats and is very popular amongst internet users.
AVI files most commonly contain video encoded with the M-JPEG or DivX codecs, but
can hold almost any format.
The AVI format is supported by almost all computers using Windows, and can be played
on various players.
Some of the most common players that support the AVI format are:
- Apple QuickTime Player (Windows & Mac)
- Microsoft Windows Media Player (Windows & Mac)
- VideoLAN VLC media player (Windows & Mac)
- Nullsoft Winamp
 QuickTime Format (.mov)
The QuickTime format was developed by Apple and is a very common one. It is often
used on the internet, and for saving movie and video files.
The format contains one or more tracks storing video, audio, text or effects. It is
compatible with both Mac and Windows platforms, and can be played with the Apple
QuickTime Player.
 MP4 Format (.mp4)
This format is mostly used to store audio and visual streams online, most commonly
those defined by MPEG. It expands MPEG-1 to support video/audio "objects", 3D
content, low-bit-rate encoding and Digital Rights Management.
The MPEG-4 video format uses separate compression for audio and video tracks: video is
compressed with MPEG-4 video encoding, and audio is compressed using AAC compression,
the same type of audio compression used in .AAC files.
MP4 files are most commonly played with the Apple QuickTime Player or other movie
players. Devices that play mp4 files are also known as mp4 players.
 MPG Format (.mpg)
A common video format standardized by the Moving Picture Experts Group (MPEG); it
typically incorporates MPEG-1 or MPEG-2 audio and video compression and is often used
for creating downloadable movies. It can be played using the Apple QuickTime Player or
Microsoft Windows Media Player.
 Windows Media Video Format (.wmv)
The WMV format, short for Windows Media Video, was developed by Microsoft. It was
originally designed for internet streaming applications, and can now cater to more
specialized content. Windows Media is a common format on the internet, but Windows
Media movies cannot be played on non-Windows computers without an extra (free)
component installed. Some later Windows Media movies cannot be played at all on
non-Windows computers because no player is available.
Videos stored in the Windows Media format have the extension .wmv.
 3GP File Extension (.3gp)
The 3gp format is both an audio and video format that was designed as a multimedia
format for transmitting audio and video files between 3G cell phones and the internet. It
is most commonly used to capture video from a cell phone and place it online. This
format supports both Mac and Windows applications and can commonly be played in the
following:
- Apple QuickTime Player
- RealNetworks RealPlayer
- VideoLAN VLC media player
- MPlayer
- MIKSOFT Mobile 3GP Converter (Windows)
 Advanced Streaming Format (.asf)
ASF is a Microsoft-developed container format that underlies the WMV format. It is intended for
streaming and is used to support playback from digital media and HTTP servers, and to
support storage devices such as hard disks. It can be compressed using a variety of video
codecs.
The most common files types that are contained within an ASF file are Windows Media
Audio, and Windows Media video.
 Real Media Format (.rm)
RealMedia is a format which was created by RealNetworks. It contains both audio and
video data and is typically used for streaming media files over the internet.
RealMedia can play on a wide variety of media players for both Mac and Windows
platforms. RealPlayer is the most compatible.
 Flash Movie Format (.swf)
The Flash movie format was developed by Macromedia.
This format can include text, graphics and animation. In order to play it, web browsers
must have the Flash plug-in installed. The Flash plug-in comes preinstalled in the
latest versions of many popular web browsers.
 The Real Video Format
The RealVideo format was developed for the internet by RealNetworks. The format is used
for streaming video at low bandwidths, which sometimes causes the quality of the
videos to be reduced.
2. What is compression? Give an example.
Compression is an encoding of data so that it occupies fewer bits. This allows more efficient
storage and transmission of the data. The inverse process is called decoding. Software
and hardware that encode are called encoders, and those that decode are called decoders. Both
combined form a codec, which should not be confused with the terms data container or
compression algorithm.
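A quick way to see encoding with fewer bits in action (a lossless example using Python's built-in zlib, not a video codec):

```python
import zlib

# Highly redundant data compresses to far fewer bits; the decoder
# recovers the original exactly, so the process is lossless.
data = b"AAAAABBBBBCCCCC" * 100
compressed = zlib.compress(data)
restored = zlib.decompress(compressed)

print(len(data), "->", len(compressed))  # compressed size is much smaller
assert restored == data                  # decoding inverts encoding
```

Video codecs go further: they are lossy, discarding detail the eye barely notices to reach far higher compression ratios.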
The video compression algorithm used in numerous standards usually consists of the following
steps:
- Motion Estimation
- Motion Compensation and Image Subtraction
- Discrete Cosine Transform
- Quantization
- Run Length Encoding
- Entropy Coding
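The first of these steps, motion estimation, can be illustrated with a toy block-matching search (a minimal sketch, not a real encoder; real codecs match 8x8 or 16x16 luma blocks over much larger search windows):

```python
# For one 4x4 block of the current frame, search the previous frame
# for the offset (motion vector) with the lowest sum of absolute
# differences (SAD). Frames are plain lists of lists of pixel values.

def sad(prev, cur, dy, dx, by, bx, n):
    return sum(abs(prev[by + dy + i][bx + dx + j] - cur[by + i][bx + j])
               for i in range(n) for j in range(n))

def best_motion_vector(prev, cur, by, bx, n, search=2):
    candidates = [(dy, dx)
                  for dy in range(-search, search + 1)
                  for dx in range(-search, search + 1)
                  if 0 <= by + dy <= len(prev) - n
                  and 0 <= bx + dx <= len(prev[0]) - n]
    return min(candidates, key=lambda v: sad(prev, cur, v[0], v[1], by, bx, n))

# Previous frame: a bright 4x4 square on a dark 8x8 background.
prev = [[0] * 8 for _ in range(8)]
cur = [[0] * 8 for _ in range(8)]
for i in range(4):
    for j in range(4):
        prev[2 + i][2 + j] = 200   # square at (2, 2)
        cur[3 + i][3 + j] = 200    # same square moved to (3, 3)

print(best_motion_vector(prev, cur, 3, 3, 4))  # (-1, -1): block came from up-left
```

Motion compensation then subtracts the matched block, leaving a near-zero residual for the remaining steps to compress.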
3. Write about video compression.

H.264 is currently the highest quality video codec available, which means that it can usually
look better at the same file size, or the same at smaller file sizes, than other codecs. It's also
widely deployed, and most web browsers can play it (either natively, or via a Flash plugin),
and so can many mobile devices, like the iPhone and Android. But there are licensing issues
to consider.

Theora is a free and open video codec with no licensing issues. It is natively playable in
several browsers, including Firefox, Chrome, and Opera. But its compression is a bit worse than
H.264.

Cinepak This is an older codec that was developed by Radius Corp. for playing back video
on slow-speed CD-ROM drives. Cinepak provides relatively high quality and good
performance.

Indeo Video 3.2 An Intel-developed 24-bit video codec that first appeared in the early 1990s. Like
Cinepak, this codec was designed for video playback on slow CD-ROMs.

Indeo Video Interactive 4.1 This codec provides higher quality than the earlier Indeo
versions. It uses wavelet technology.

H.261
This standard is part of the ITU H.320 series of videoconferencing standards that is
similar to MPEG. It defines video compression in increments of 64 Kbits/sec, along with
additional specifications for luminance and chrominance information.

H.263
An ITU-T standard codec designed for delivering video over low-bitrate links such as
28.8-Kbit/sec modem connections.

TrueMotion RT
A codec developed by Duck Corp., it provides high-quality full-motion
video playback.

VDOnet VDOwave
Developed by VDO Corporation, this codec was designed to deliver
video that is optimized for the Internet.

DVI (Digital Video Interactive)
An Intel Corp.-developed video and audio compressor on a
chip.

VP8 is a newer codec. Like Theora, it is free and open. But it compresses better than Theora,
and about as well as the H.264 Baseline profile. H.264 is still better in its Main and High
profiles. But that will hopefully change as VP8, and VP8 encoders, mature. VP8 is expected
to be widely adopted in web browsers, and can be played today in the newest versions of
Chrome, Firefox, and Opera.

VP6 is supported natively in Flash 8, and so is widely used on the web. It is a good codec,
but doesn't compress as well as H.264. We encode to VP6 in the FLV container, with MP3
audio.

MPEG-4 is often used in a 3GP format on mobile devices.

WMV is a Windows codec. We specifically support WMV 8, which is confusingly the same
as WMV2.
One aim of MPEG-4 video compression is to support, on the one hand, applications with even
lower bandwidth consumption, e.g. mobile units, and on the other hand applications with
extremely high quality.

First a reduction of the resolution is done, which is followed by motion compensation in order to
reduce temporal redundancy.

The next steps are the Discrete Cosine Transform (DCT) and a quantization step, as used in
JPEG compression; this reduces the spatial redundancy (with respect to human visual perception).

The final step is an entropy coding using the Run Length Encoding and the Huffman coding
algorithm.
MPEG-4 treats video as a sequence of pictures. It essentially compares successive
images, saves one full picture, and then saves only the difference from each additional sequential
image, such as movement, thus saving time, memory space and processing power.
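The idea of storing only the differences can be sketched in a few lines (an illustrative toy using a one-dimensional "frame", not actual MPEG-4 syntax):

```python
# Two successive frames that differ in a single pixel produce a
# difference frame that is almost entirely zeros, which the later
# run-length and entropy coding stages compress very well.
frame1 = [10, 10, 10, 10, 50, 50, 50, 10]
frame2 = [10, 10, 10, 10, 50, 52, 50, 10]   # one pixel changed

diff = [b - a for a, b in zip(frame1, frame2)]
print(diff)  # [0, 0, 0, 0, 0, 2, 0, 0]

# The decoder reconstructs frame2 from frame1 plus the stored difference.
restored = [a + d for a, d in zip(frame1, diff)]
assert restored == frame2
```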
A higher compression rate is amongst the advantages of MPEG-4. It can sync audio and video, and is
great for real-time viewing. MPEG-4 was designed to support low-bandwidth applications.
 MPEG-4 uses media objects to represent aural, visual or audiovisual content. Media objects can be
synthetic like in interactive graphics applications or natural like in digital television. These media
objects can be combined to form compound media objects. MPEG-4 multiplexes and synchronizes
the media objects before transmission to provide QoS, and it allows interaction with the
constructed scene at the receiver.
 MPEG-4 organizes the media objects in a hierarchical fashion where the lowest level has primitive
media objects like still images, video objects, and audio objects. MPEG-4 has a number of primitive
media objects which can be used to represent 2 or 3-dimensional media objects. MPEG-4 also
defines a coded representation of objects for text, graphics, synthetic sound, talking synthetic
heads.
 MPEG-4 provides a standardized way to describe a scene. Media objects can be placed anywhere in
the coordinate system. Transformations can be used to change the geometrical or acoustical
appearance of a media object. Primitive media objects can be grouped to form compound media
objects. Streamed data can be applied to media objects to modify their attributes, and the user's
viewing and listening points can be changed to anywhere in the scene.
 The visual part of the MPEG-4 standard describes methods for compression of images and video,
compression of textures for texture mapping of 2-D and 3-D meshes, compression of implicit 2-D
meshes, and compression of time-varying geometry streams that animate meshes. It also provides
algorithms for random access to all types of visual objects as well as algorithms for spatial, temporal
and quality scalability, content-based scalability of textures, images and video. Algorithms for error
robustness and resilience in error prone environments are also part of the standard.
 For synthetic objects MPEG-4 has parametric descriptions of human face and body, parametric
descriptions for animation streams of the face and body. MPEG-4 also describes static and dynamic
mesh coding with texture mapping, texture coding with view dependent applications.
 MPEG-4 supports coding of video objects with spatial and temporal scalability. Scalability allows
decoding a part of a stream and constructing images with reduced decoder complexity (reduced
quality), reduced spatial resolution, reduced temporal resolution, or with equal temporal and spatial
resolution but reduced quality. Scalability is desired when video is sent over heterogeneous
networks, or when the receiver cannot display at full resolution.
The human eye has a lower sensitivity to color information than to dark-bright contrasts. A
conversion from the RGB color space into YUV color components helps to exploit this effect for
compression. The chrominance components U and V can be reduced (subsampled) to half of the
pixels in the horizontal direction (4:2:2), or to half of the pixels in both the horizontal and
vertical directions (4:2:0).
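The subsampling step can be sketched as follows (the RGB-to-YUV weights are the common BT.601-style approximations; the pixel values are made up):

```python
# 4:2:0 chroma subsampling sketch: luma (Y) is kept for every pixel,
# while U and V are averaged over each 2x2 block, so the chroma
# channels carry only a quarter of the samples.

def rgb_to_yuv(r, g, b):
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = -0.147 * r - 0.289 * g + 0.436 * b
    v = 0.615 * r - 0.515 * g - 0.100 * b
    return y, u, v

# A 2x2 block of similar reddish RGB pixels.
block = [[(200, 30, 30), (198, 32, 28)],
         [(202, 28, 31), (199, 31, 30)]]

yuv = [[rgb_to_yuv(*px) for px in row] for row in block]
luma = [[p[0] for p in row] for row in yuv]          # 4 Y samples kept
u420 = sum(p[1] for row in yuv for p in row) / 4     # 1 averaged U sample
v420 = sum(p[2] for row in yuv for p in row) / 4     # 1 averaged V sample

print(luma)
print(round(u420, 1), round(v420, 1))
```

Because the four pixels have nearly identical color, the single averaged chroma pair loses almost nothing visible.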
An MPEG video can be understood as a sequence of frames. Because two successive frames of a
video sequence often have small differences (except at scene changes), the MPEG standard offers a
way of reducing this temporal redundancy.
DCT allows, similar to the Fast Fourier Transform (FFT), a representation of image data in terms of
frequency components. So the frame-blocks (8x8 or 16x16 pixels) can be represented as frequency
components.
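A direct (unoptimized) implementation of the 2-D DCT-II over an 8x8 block shows this; real codecs use fast factorized versions of the same transform:

```python
import math

N = 8  # MPEG/JPEG-style 8x8 block

def dct2(block):
    """Naive 2-D DCT-II: pixel block -> frequency coefficients."""
    out = [[0.0] * N for _ in range(N)]
    for u in range(N):
        for v in range(N):
            cu = math.sqrt(1 / N) if u == 0 else math.sqrt(2 / N)
            cv = math.sqrt(1 / N) if v == 0 else math.sqrt(2 / N)
            out[u][v] = cu * cv * sum(
                block[x][y]
                * math.cos((2 * x + 1) * u * math.pi / (2 * N))
                * math.cos((2 * y + 1) * v * math.pi / (2 * N))
                for x in range(N) for y in range(N))
    return out

# A flat block has no spatial frequencies: all its energy lands in the
# single DC coefficient, every other coefficient is essentially 0.
flat = [[100] * N for _ in range(N)]
coeffs = dct2(flat)
print(round(coeffs[0][0]))      # 800 (8 * the average value)
print(round(coeffs[3][5], 6))   # ~0
```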
During quantization, which is the primary source of data loss, the DCT terms are divided by a
quantization matrix that takes human visual perception into account. The human eye is more
sensitive to low frequencies than to high ones. Higher frequencies end up with a zero entry after
quantization, which reduces the amount of data significantly.
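A sketch with made-up numbers (a 4x4 block for brevity; the quantization matrix values are illustrative, but follow the real pattern of small divisors for low frequencies and large ones for high frequencies):

```python
# DCT coefficients (low frequencies top-left, high frequencies bottom-right).
coeffs = [
    [620, -31, 12, 4],
    [-28,  15, -6, 2],
    [  9,  -5,  3, 1],
    [  3,   2, -1, 1],
]
# Illustrative quantization matrix: divisors grow with frequency.
quant = [
    [16, 24,  40,  64],
    [24, 40,  64,  96],
    [40, 64,  96, 128],
    [64, 96, 128, 160],
]

# Element-wise divide and round -- this is where the loss happens.
quantized = [[round(c / q) for c, q in zip(crow, qrow)]
             for crow, qrow in zip(coeffs, quant)]
for row in quantized:
    print(row)
# Only the low-frequency corner survives; everything else becomes 0.
```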
The entropy coding takes two steps: Run Length Encoding and Huffman coding. These are
well-known lossless compression methods, which can compress data, depending on its
redundancy, by an additional factor.
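Run Length Encoding on a quantized, zig-zag-scanned block is only a few lines (the input values here are made up but typical: one large DC term followed by long zero runs):

```python
from itertools import groupby

def rle(seq):
    """Collapse consecutive repeats into (value, run_length) pairs."""
    return [(value, len(list(run))) for value, run in groupby(seq)]

scanned = [39, -1, -1, 0, 0, 0, 0, 0, 0, 0, 2, 0, 0, 0, 0, 0]
print(rle(scanned))  # [(39, 1), (-1, 2), (0, 7), (2, 1), (0, 5)]
```

Huffman coding then assigns short bit codes to the most frequent pairs, squeezing out the remaining statistical redundancy.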