FFmpeg also includes other tools: ffplay, a simple media player, and ffprobe, a command-line tool to display media information. Among the included libraries are libavcodec, an audio/video codec library used by many commercial and free software products; libavformat (Lavf),[8] an audio/video container muxing and demuxing library; and libavfilter, a library for applying filters to audio and video through a GStreamer-like filtergraph.[9]
FFmpeg is part of the workflow of many other software projects: its libraries are a core part of media players such as VLC, and it has been included in core processing for YouTube and Bilibili.[10] Encoders and decoders for many audio and video file formats are included, making it highly useful for transcoding common and uncommon media files.
The project was started by Fabrice Bellard[11] (using the pseudonym "Gérard Lantau") in 2000, and was led by Michael Niedermayer from 2004 until 2015.[12] Some FFmpeg developers were also part of the MPlayer project.
The name of the project is inspired by the MPEG video standards group, together with "FF" for "fast forward", so FFmpeg stands for "Fast Forward Moving Picture Experts Group".[13] The logo represents a zigzag scan pattern that shows how MPEG video codecs handle entropy encoding.[14]
On March 13, 2011, a group of FFmpeg developers decided to fork the project under the name Libav.[15][16][17] The fork grew out of a dispute over project management, in which developers disagreed with the leadership of FFmpeg.[18][19][20]
On January 10, 2014, two Google employees announced that over 1000 bugs had been fixed in FFmpeg during the previous two years by means of fuzz testing.[21]
In January 2018, the ffserver command-line program – a long-time component of FFmpeg – was removed.[22] The developers had previously deprecated the program citing high maintenance efforts due to its use of internal application programming interfaces.[23]
The project publishes a new release every three months on average. While release versions are available from the website for download, FFmpeg developers recommend that users compile the software from source using the latest build from the Git version control system.[24]
Codec history
Two video coding formats with corresponding codecs and one container format have been created within the FFmpeg project so far. The two video codecs are the lossless FFV1 and the lossless and lossy Snow codec. Development of Snow has stalled and its bit-stream format has not been finalized, leaving it experimental since 2011. The multimedia container format called NUT is no longer being actively developed, but is still maintained.[25]
In summer 2010, FFmpeg developers Fiona Glaser, Ronald Bultje, and David Conrad announced the ffvp8 decoder. Through testing, they determined that ffvp8 was faster than Google's own libvpx decoder.[26][27] Starting with version 0.6, FFmpeg also supported WebM and VP8.[28]
In October 2013, a native VP9[29] decoder and OpenHEVC, an open-source High Efficiency Video Coding (HEVC) decoder, were added to FFmpeg.[30] In 2016 the native AAC encoder was considered stable, and support for the two external AAC encoders from VisualOn and FAAC was removed. FFmpeg 3.0 (nicknamed "Einstein") retained build support for the Fraunhofer FDK AAC encoder.[31] Since version 3.4 "Cantor", FFmpeg has supported the FITS image format.[32] Since version 4.1 "al-Khwarizmi" (November 2018), AV1 can be muxed into MP4 and Matroska, including WebM.[33][34]
Components
Command-line tools
ffmpeg is a command-line tool that converts audio or video formats. It can also capture and encode in real-time from various hardware and software sources[35] such as a TV capture card.
ffplay is a simple media player utilizing SDL and the FFmpeg libraries.
ffprobe is a command-line tool to display media information (text, CSV, XML, JSON); see also MediaInfo.
Libraries
libswresample is a library containing audio resampling routines.
libavresample is a library containing audio resampling routines from the Libav project, similar to libswresample from FFmpeg.
libavcodec is a library containing all of the native FFmpeg audio/video encoders and decoders. Most codecs were developed from scratch to ensure best performance and high code reusability.
libavformat (Lavf)[8] is a library containing demuxers and muxers for audio/video container formats (see the example following this list).
libavutil is a helper library containing routines common to different parts of FFmpeg. This library includes hash functions, ciphers, an LZO decompressor and a Base64 encoder/decoder.
libswscale is a library containing video image scaling and colorspace/pixelformat conversion routines.
libavfilter is the replacement for vhook, allowing video and audio to be modified or examined (for debugging) between the decoder and the encoder. Filters have been ported from many projects, including MPlayer and AviSynth.
libavdevice is a library containing audio/video I/O through internal and external devices.
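As an illustration of how these libraries fit together, the following minimal C program (a sketch for this article, not part of FFmpeg itself; it assumes FFmpeg 4.0 or later development headers and linking against libavformat and its dependencies) uses libavformat to open a media file and print a summary of its container and streams, much as ffprobe does:

    #include <libavformat/avformat.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        AVFormatContext *fmt_ctx = NULL;

        if (argc < 2) {
            fprintf(stderr, "usage: %s <media file>\n", argv[0]);
            return 1;
        }
        // Let libavformat probe the container format and open the file
        if (avformat_open_input(&fmt_ctx, argv[1], NULL, NULL) < 0) {
            fprintf(stderr, "could not open %s\n", argv[1]);
            return 1;
        }
        // Read a few packets so stream and codec parameters are filled in
        if (avformat_find_stream_info(fmt_ctx, NULL) < 0) {
            fprintf(stderr, "could not read stream info\n");
            avformat_close_input(&fmt_ctx);
            return 1;
        }
        // Print a summary of the container and its streams
        av_dump_format(fmt_ctx, 0, argv[1], 0);
        avformat_close_input(&fmt_ctx);
        return 0;
    }

Assuming pkg-config metadata for FFmpeg is installed, such a program can typically be built with: gcc probe.c -o probe $(pkg-config --cflags --libs libavformat).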
Supported hardware
CPUs
FFmpeg encompasses software implementations of video and audio compression and decompression algorithms. These can be compiled and run on diverse instruction sets.
There are a variety of application-specific integrated circuits (ASICs) for audio/video compression and decompression. These ASICs can partially or completely offload the computation from the host CPU. To use such an ASIC, a complete implementation of the algorithm is not needed; only support for its API is required.[37]
The following APIs are also supported: DirectX Video Acceleration (DXVA2, Windows), Direct3D 11 (D3D11VA, Windows), Media Foundation (Windows), Vulkan (VKVA), VideoToolbox (iOS, iPadOS, macOS), RockChip MPP, OpenCL, OpenMAX, MMAL (Raspberry Pi), MediaCodec (Android OS), V4L2 (Linux). Depending on the environment, these APIs may lead to specific ASICs, to GPGPU routines, or to SIMD CPU code.[41]
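Which of these APIs is usable in practice depends on how a particular FFmpeg binary was configured at build time. As a sketch (assuming libavutil from FFmpeg 3.4 or later, where the hwcontext API is available), the following program enumerates the hardware device types compiled into the running build, comparable to the list printed by ffmpeg -hwaccels:

    #include <libavutil/hwcontext.h>
    #include <stdio.h>

    int main(void) {
        enum AVHWDeviceType type = AV_HWDEVICE_TYPE_NONE;
        // Walk every hardware device type this FFmpeg build knows about
        // (for example vaapi, dxva2, videotoolbox or mediacodec)
        while ((type = av_hwdevice_iterate_types(type)) != AV_HWDEVICE_TYPE_NONE)
            printf("%s\n", av_hwdevice_get_type_name(type));
        return 0;
    }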
FFmpeg supports many common and some uncommon image formats.
The PGMYUV image format is a homebrew variant of the binary (P5) PGM Netpbm format. FFmpeg also supports 16-bit depths of the PGM and PPM formats, and the binary (P7) PAM format with or without an alpha channel, at a depth of 8 or 16 bits, for the pix_fmts monob, gray, gray16be, rgb24, rgb48be, ya8, rgba and rgb64be.
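These pix_fmt identifiers (such as rgb24 and gray) are the same names libswscale uses when converting between pixel formats. The following is a minimal sketch, not taken from FFmpeg's documentation, assuming libswscale and libavutil development headers; it converts a synthetic rgb24 image to gray:

    #include <libswscale/swscale.h>
    #include <libavutil/imgutils.h>
    #include <libavutil/mem.h>
    #include <stdio.h>
    #include <string.h>

    int main(void) {
        const int w = 64, h = 64;
        uint8_t *src[4], *dst[4];
        int src_linesize[4], dst_linesize[4];

        // Allocate an rgb24 source image and a gray (8-bit) destination image
        if (av_image_alloc(src, src_linesize, w, h, AV_PIX_FMT_RGB24, 16) < 0 ||
            av_image_alloc(dst, dst_linesize, w, h, AV_PIX_FMT_GRAY8, 16) < 0) {
            fprintf(stderr, "allocation failed\n");
            return 1;
        }
        memset(src[0], 255, (size_t)src_linesize[0] * h);  // fill source with white

        // Conversion context: rgb24 -> gray at the same resolution
        struct SwsContext *sws = sws_getContext(w, h, AV_PIX_FMT_RGB24,
                                                w, h, AV_PIX_FMT_GRAY8,
                                                SWS_BILINEAR, NULL, NULL, NULL);
        if (!sws)
            return 1;

        // Convert the whole image in one call
        sws_scale(sws, (const uint8_t * const *)src, src_linesize, 0, h,
                  dst, dst_linesize);

        sws_freeContext(sws);
        av_freep(&src[0]);
        av_freep(&dst[0]);
        return 0;
    }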
Output formats (container formats and other ways of creating output streams) in FFmpeg are called "muxers". FFmpeg supports, among others, muxers for MP4 (and QuickTime MOV), Matroska and WebM, MPEG-TS, FLV, Ogg, AVI and WAV.
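The full set of muxers available in a given binary depends on its build configuration. As a sketch (assuming libavformat from FFmpeg 4.0 or later), they can also be enumerated through the library API, equivalent to running ffmpeg -muxers on the command line:

    #include <libavformat/avformat.h>
    #include <stdio.h>

    int main(void) {
        void *opaque = NULL;
        const AVOutputFormat *mux;
        // List every muxer (output format) compiled into this FFmpeg build
        while ((mux = av_muxer_iterate(&opaque)))
            printf("%-16s %s\n", mux->name, mux->long_name);
        return 0;
    }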
UYVY 10 bpc without padding is supported as the bitpacked codec in FFmpeg. UYVY 10 bpc with 2 bits of padding is supported as the v210 codec in FFmpeg. 16 bpc (Y216) is supported as the targa_y216 codec in FFmpeg.
FFmpeg does not support IMC1-IMC4, AI44, CMYK, RGBE, Log RGB and other formats. It also does not yet support ARGB 1:5:5:5, 2:10:10:10, or other BMP bitfield formats that are not commonly used.
FFmpeg contains more than 100 codecs,[71] most of which use compression techniques of one kind or another. Many such compression techniques may be subject to legal claims relating to software patents.[72] Such claims may be enforceable in countries like the United States which have implemented software patents, but are considered unenforceable or void in member countries of the European Union, for example.[73][original research] Patents for many older codecs, including AC3 and all MPEG-1 and MPEG-2 codecs, have expired.[citation needed]
FFmpeg is licensed under the LGPL, but if a particular build is linked against any GPL libraries (notably x264), then the entire binary is licensed under the GPL.
FFmpeg is used by ffdshow, FFmpegInterop, the GStreamer FFmpeg plug-in, LAV Filters and OpenMAX IL to expand the encoding and decoding capabilities of their respective multimedia platforms.
As part of NASA's Mars 2020 mission, FFmpeg is used by the Perseverance rover on Mars for image and video compression before footage is sent to Earth.[79]
^"Download". ffmpeg.org. FFmpeg. Archived from the original on 2011-10-06. Retrieved 2012-01-04.
^FFmpeg can be compiled with various external libraries, some of which have licenses that are incompatible with the FFmpeg's primary license, the GNU GPL.