Network Device Interface
Network Device Interface (NDI) is a software specification developed by the technology company NewTek. It enables high-definition video to be sent, received, and exchanged over a computer network with low latency and high quality. The royalty-free specification supports frame-accurate switching, making it suitable for live video production environments.
Technology
NDI is designed to run over gigabit Ethernet[1] with the NDI codec.[2] It delivers 1080i high-definition video at variable data rates typically around 100 Mbit/s.[3]
By default, NDI uses multicast DNS to advertise sources on a local area network, such that NDI receivers can automatically discover and offer those sources. It also supports two other discovery modes (NDI Access, NDI Discovery Server) that allow for operations across subnets and without multicast DNS. Sources are created using an arbitrarily selected TCP port from a range of ports on the NDI senders. When a source is requested, a TCP connection is established on the appropriate port with the NDI receiver connecting to the NDI sender. NDI 3.x has options to use UDP multicast or unicast with forward error correction (FEC) instead of TCP, and can load balance streams across multiple network interface controllers (NICs) without using link aggregation. NDI version 4.0 introduces the Multi-TCP transport.
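As an illustration of this discovery step, the following minimal sketch enumerates NDI sources on the local network using the SDK's finder API. It assumes the NDI SDK headers and runtime are installed; the exact type and function names should be checked against the installed SDK version.

```cpp
// Minimal NDI source-discovery sketch (assumes the NDI SDK is installed).
#include <cstdint>
#include <cstdio>
#include <Processing.NDI.Lib.h>

int main() {
    if (!NDIlib_initialize()) return 1;                 // library or CPU not supported

    // Create a finder; by default it browses sources advertised via mDNS.
    NDIlib_find_instance_t finder = NDIlib_find_create_v2(nullptr);
    if (!finder) return 1;

    // Wait up to 5 seconds for sources to appear on the network.
    NDIlib_find_wait_for_sources(finder, 5000);

    uint32_t count = 0;
    const NDIlib_source_t* sources = NDIlib_find_get_current_sources(finder, &count);
    for (uint32_t i = 0; i < count; ++i)
        std::printf("Found source: %s\n", sources[i].p_ndi_name);

    NDIlib_find_destroy(finder);
    NDIlib_destroy();
    return 0;
}
```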
NDI carries video, multichannel uncompressed audio,[citation needed] and metadata. Metadata messages can be sent in both directions allowing the sender and receiver to message one another over the connection with arbitrary metadata in XML form.[4] This directional metadata system allows for functionality such as active tally information fed back to sources to understand that they are on-air. NDI also allows senders to determine the number of connected receivers, so they can skip unnecessary processing and network bandwidth utilisation when there are no NDI receiver clients connected. NDI Receivers can opt to connect to various combinations of streams, to support, for instance, audio-only or metadata-only connections where video is not required.
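The sender-side behaviour described above (skipping work when no receivers are connected, and reading back tally state) can be sketched roughly as follows. The function names follow the NDI SDK's documented C API but should be treated as an approximation and verified against the SDK headers.

```cpp
// Sketch: a sender loop that only renders while receivers are connected
// and reads back tally metadata (assumes the NDI SDK is installed).
#include <chrono>
#include <thread>
#include <Processing.NDI.Lib.h>

void run_sender_loop(NDIlib_send_instance_t sender) {
    while (true) {
        // How many receivers are currently connected? (0 ms = do not block)
        if (NDIlib_send_get_no_connections(sender, 0) == 0) {
            // Nobody is watching: skip rendering/encoding to save CPU and bandwidth.
            std::this_thread::sleep_for(std::chrono::milliseconds(250));
            continue;
        }

        // Tally state is fed back from receivers over the same connection.
        NDIlib_tally_t tally = {};
        if (NDIlib_send_get_tally(sender, &tally, 0)) {
            bool on_air = tally.on_program;   // e.g. drive a camera tally light
            (void)on_air;
        }

        // ... render a frame and submit it with NDIlib_send_send_video_v2() ...
    }
}
```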
The NDI software development kit (SDK) is available for Windows, Linux, and macOS,[5] and has also been ported to iOS, tvOS, Android, Raspberry Pi, and FPGA.[citation needed] The standard NDI SDK is available under a royalty-free proprietary license.[6] The NDI Advanced SDK gives OEMs direct access to compressed data in both directions, along with other features, under a commercial license.
Comparison of common IP video protocols
Other IP video protocols for use in professional video production (rather than IP video used for distribution to end users) include SMPTE 2022, SMPTE 2110, ASPEN (largely superseded by SMPTE 2110) and Sony NMI. There are clear differences in the technology used by these protocols.
| Protocol / Parameter | NDI | NDI HX | NDI HX2 | SMPTE 2022-6[7] | SMPTE 2110 | ASPEN[8] | NMI[9] | CDI[10] |
|---|---|---|---|---|---|---|---|---|
| Developed by | NewTek | NewTek | NewTek | SMPTE | VSF and SMPTE | ASPEN Community | Sony | AWS |
| Transport | TCP/UDP/Multi-TCP/Reliable UDP[A] | UDP (TCP) | TCP/UDP/Multi-TCP/Reliable UDP[A] | UDP (RTP) | UDP (RTP) | UDP | UDP | UDP / SRD |
| Image format | Size / Aspect Independent | Size / Aspect Independent | Size / Aspect Independent | SDI Formats only | Size / Aspect Independent | Size / Aspect Independent | ||
| Tally | Yes | Yes | Yes | No | No[B] | No | No | |
| Bidirectional device control | Yes | Yes | Yes | No | No | No | No | |
| Integrated Proxy | Yes | Yes | Yes | No | No | No | No | No |
| Integrated alpha channel | Yes | No | Yes | No | Yes | Yes | ||
| Compression | NDI Codec (SHQ 0/2/7)[11] | NDI|HX (H.264) | H.264/H.265 | NONE[C] | SMPTE 2110-22 spec.(JPEG XS, but not limited to) | NONE | NONE / LLVC Codec | NONE |
| Connection | Socket, Unicast / Multicast and FEC | Unicast / Multicast | Socket, Unicast / Multicast and FEC | Multicast | Multicast | Multicast | Multicast / ? | SRD |
| HD (1080i) data rate | ~100 Mbit/s | 8–20 Mbit/s | ~1–50 Mbit/s | >1.5 Gbit/s | >1.1 Gbit/s | >1.5 Gbit/s | >1.5 Gbit/s / up to 14:1[12] | >1.5 Gbit/s |
| Essence packing | Discrete audio, metadata and video frame packets, single connection | Modified RTSP/RTP type connections | Discrete audio, metadata and video frame packets, single connection | Packetized raw SDI bitstream | Discrete audio, video and Metadata on separate connections with different protocols | Multiple MPEG transport streams | Frame aligned 2022-6 / LLVC | Discrete audio, video and Metadata in SMPTE2110 Formats |
| Infrastructure | Gigabit / wireless / load balanced multi NIC / 10 Gbit | Gigabit / wireless | Gigabit / wireless | 10 Gbit minimum | 10 Gbit minimum | 10 Gbit minimum | Gigabit / 10 Gbit | 100 Gbit EFA / libFabric |
| Service Discovery | Bonjour (mDNS), NDI Access (manual), Discovery Server (NDI4) | automatic via HX driver | Bonjour (mDNS), NDI Access (manual), Discovery Server (NDI4) | NMOS[13] | AMWA IS-04 NMOS | JSON-RPC | Plug & play (NDCP) | NONE |
| API | Royalty-free license, SDK libraries for Win (x86), Mac, Linux (x86 & ARM), iOS,[14] FPGA reference | Hardware encode, decode with NDI libraries | Send with NDI Advanced SDK, receive with NDI libraries | SMPTE standard | SMPTE standard | SMPTE RDD | OPEN SOURCE |
- ^ a b NDI v1.0 was pure TCP. Later versions added options for UDP unicast and multicast with forward error correction (FEC). NDI 4.0 adds a 'Multi-TCP' transport and NDI 5.0 adds a 'Reliable UDP' transport.
- ^ Tally for SMPTE 2110 relies on external data sources, using proposed AMWA IS 07.
- ^ The TICO RDD35 codec can be used to compress UHD by 4:1 so an encoded stream can be carried along a SMPTE 2022-6 channel at the same uncompressed bandwidth as HD.[15] SMPTE 2110 with TR-03 also offers the potential to use TICO RDD35 and JPEG XS.[16] This requires a proprietary encoder and decoder which are generally implemented as silicon on each end.
History
NDI was publicly revealed by NewTek on 8 September 2015 and was demonstrated at the International Broadcasting Convention in Amsterdam that week.[17] The first device shown using NDI was the NewTek TriCaster, which delivered an NDI feed from each of its SDI inputs as well as four output feeds from its vision mixer. The TriCaster could also receive up to two NDI sources from other devices.[18]
NewTek had previously created a predecessor of NDI called AirSend to get video from external devices into their TriCaster products. AirSend had been implemented by a number of character generator (CG) vendors including Vizrt and Chyron. To bring these products into the NDI ecosystem quickly, NewTek created a replacement for the existing AirSend driver that could be installed on existing AirSend-compatible devices, instantly converting them into NDI-compatible devices with no change required from the original CG vendors.[19]
BirdDog was an early adopter and in 2018 released Studio NDI, an ASIC implementation of NDI. BirdDog went on to deliver NDI PTZ cameras, along with a host of software applications.[citation needed]
Another early adopter of NDI was vMix, a Windows-based vision mixer that offers NDI inputs and outputs.[20] A significant increase in the NDI installed base came when the live-streaming application XSplit added support for NDI.[21]
Later in 2016, NewTek delivered NDI 2.0, which added features including support for service discovery across subnets. In April of that year, Magewell announced integration of their PCIe and USB capture devices, allowing access to any video source on the network.[citation needed]
On 12 July 2017, NewTek announced NDI 3.0, which added multicast, NDI|HX and other new features, introducing support for specific PTZ cameras with H.264 chipsets and updated firmware.[22]
In April 2018 at the NAB Show, Microsoft announced they had added NDI support to Skype for Content Creators.[23] Other announcements at NAB in 2018 included NDI 3.5, and new NDI support from vendors including EVS[24] and Avid.[25]
Version 3.4 of FFmpeg added input and output support for NDI when optionally compiled by the end user against a non-open-source NewTek library.[26] However, NewTek was later found to be redistributing pre-compiled binaries of FFmpeg that incorporated its non-open-source library in violation of the GNU General Public License, causing the FFmpeg project to remove NDI support from its code base in March 2019.[27][28]
In April 2019, ahead of NAB, NewTek announced the addition of a Multi-TCP mode to NDI 4.0, which uses the hardware TCP offload engines present in silicon to assist lower-spec processors with NDI transport.[29] NDI 4.0 shipped in September 2019 to coincide with the IBC exhibition.
In July 2021, NDI 5 was released, adding reliable UDP transmission, redundant discovery server support, and the NDI 5 Tools (NDI Bridge, NDI Remote, NDI Audio Direct, FCP-X output).[30]
| Version | Released | Features and enhancements |
|---|---|---|
| 1.0 | April 2016 | Initial Release |
| 2.0 | Sept 2016 | Cross subnet support via Access Manager, ARM encoding support, iOS SDK |
| 3.0 | July 2017 | NDI-HX support, Option for multicast transport with FEC, PTZ support |
| 3.5 | June 2018 | Defaults to Unicast UDP transport method with FEC |
| 3.6 | July 2018 | Packet level bonding improvements |
| 3.7 | Sept 2018 | Discovery improvements |
| 3.8 | Nov 2018 | Performance Enhancements, NDI Embedded SDK for FPGA development |
| 4.0 | Sept 2019 | Multi-TCP Transport mode, HDR 16-bit support, Native synchronized recording, Discovery Server, NDI HX2[29] |
| 4.1 | Nov 2019 | Performance improvements |
| 4.5 | March 2020 | New Multi-TCP implementation, Lower latency with NDI-HX, Predictable port numbers for firewall rules, Improvements to codec quality |
| 4.6 | Nov 2020 | Can support NDI-HX2 encoding on PCs using NVENC[31] |
| 5.0 | July 2021 | Reliable UDP transmission, Redundant discovery server support, NDI 5 Tools (Bridge, Remote, Audio Direct, FCP-X output)[32] |
| 5.1 | Feb 2022 | NDI Bridge enhancements, DNS name support for Discovery Server, SDK improvements |
| 5.5 | Aug 2022 | NDI Router, Quad Inputs on NDI Webcam Input with 4K video support, NDI Remote adds Talkback and desktop sharing features, Auto start applications on system boot, NDI FreeAudio command line tool (found with SDK) |
| 6.0 | Apr 2024 | Added specification for NDI HDR metadata, Improved support for 16-bit colour formats, NDI Bridge Utility for Linux, NDI-KVM support for Video Monitor on macOS, NDI Router for macOS, NDI Studio Monitor on Windows enhanced with HDR support for display and recording |
| 6.1 | Dec 2024 | NDI Bridge improvements, NDI Bridge service, 16-bit colour formats available in FPGA platforms, dynamic bandwidth adjustment API in NDI Advanced SDK |
| 6.2 | June 2025 | NDI Receiver API (provides status, monitoring and control), Updated Discovery Server (provides the capabilities for NDI Receiver API), new Discovery application provided with NDI Tools. |
| 6.3 | Jan 2026 | NDI Sender Monitoring API (provides status and monitoring of sources), Updated Discovery Server (provides the capabilities for NDI Sender Monitoring), Enhanced SpeedHQ codec quality, HDCP compatibility with compliant encoders/decoders. |
Use in Wi-Fi and wide-area networks
NDI was designed to work on good-quality gigabit local area networks using TCP/IP and Bonjour (mDNS) technologies. To work across subnets that do not pass mDNS, NDI supports a mechanism known as NDI Access, which allows manual entry of the IP addresses of machines on other subnets that may be running NDI sources.
Some NDI adopters have run the protocol across fibre connections up to 15 km, although NDI's use of TCP makes it less suitable for long-distance, high-latency connections due to factors such as the bandwidth-delay product and TCP packet-loss recovery.[33] Later versions of NDI introduced additional transport protocols, including UDP, Multi-TCP and reliable UDP (QUIC), which support more diverse network characteristics. NDI|HX uses a lower data rate, making it easier to use on bandwidth-limited connections.
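To illustrate the bandwidth-delay product (BDP) issue mentioned above, consider a hypothetical 100 Mbit/s high-bandwidth NDI stream carried over a single TCP connection with a 50 ms round-trip time (figures chosen purely for illustration):

BDP = 100 Mbit/s × 0.050 s = 5 Mbit ≈ 0.6 MB

A single TCP connection must keep roughly this much data in flight (and buffered at each end) to sustain the stream, and any packet loss stalls delivery for at least a full round trip, which is one reason later NDI versions added UDP-based and multi-connection transports for long-distance links.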
Use in cloud-based infrastructure
NDI's compressed video and unicast transmissions make it suitable for cloud-based services such as AWS and Azure. When operating in cloud environments, the NDI Discovery service provides a solution compatible with the multicast restrictions common to cloud platforms. Starting with NDI 5, the Bridge tool allows remote locations to be connected, whether over the open Internet or via a VPN. NDI Bridge uses NDI-HX2 at selectable bitrates and codec types (H.264 or HEVC) as the transmission format between locations (High Bandwidth NDI transmission is also possible), while all other aspects of the NDI signal (metadata, alpha, tally, etc.) are preserved.
CPU architecture support
NDI, when running on x86 or x86-64 architectures, requires CPUs that include the SSSE3 instruction set. Most Intel CPU designs from 2006 onward include this instruction set; AMD added support in 2011. While not a requirement, NDI takes advantage of the Advanced Vector Extensions (AVX) and AVX2 instruction sets for additional performance.
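As a rough illustration of checking these requirements before loading the NDI runtime, the sketch below probes the relevant x86 instruction sets with the GCC/Clang __builtin_cpu_supports intrinsic; it is illustrative only and not part of the NDI SDK.

```cpp
// Sketch: probe x86 instruction sets before initialising NDI (GCC/Clang only).
#include <cstdio>

int main() {
#if defined(__x86_64__) || defined(__i386__)
    __builtin_cpu_init();
    const bool has_ssse3 = __builtin_cpu_supports("ssse3"); // required by NDI on x86
    const bool has_avx2  = __builtin_cpu_supports("avx2");  // optional, improves performance
    std::printf("SSSE3: %s, AVX2: %s\n",
                has_ssse3 ? "yes" : "no", has_avx2 ? "yes" : "no");
    if (!has_ssse3) {
        std::fprintf(stderr, "CPU lacks SSSE3; NDI will not run on this machine.\n");
        return 1;
    }
#endif
    // Safe to load and initialise the NDI runtime from here.
    return 0;
}
```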
NDI can run on 32- or 64-bit CPU architectures, although performance is increased when using 64-bit.
NDI 4.x and earlier had limited, generally encode-only, support for ARM. NDI 5 brought full encode and decode support on ARM-based processors that include Neon instructions, including Apple silicon processors.
NDI|HX devices are typically transmit-only and based on proprietary platforms with hardware H.264 encoder chips. Examples of NDI|HX devices include PTZ cameras and the NDI Connect Spark SDI-to-NDI|HX converter box.
With NDI 4.0, NewTek announced the addition of a Multi-TCP transport mode. This takes advantage of the hardware TCP acceleration built into silicon, which helps lower-spec processors handle heavy network load, in contrast to UDP, which does not benefit from this hardware acceleration.[34]
Metadata and extensions to the NDI specification
NDI supports arbitrary metadata as XML blocks, embedded in video and audio frames as well as in stand-alone metadata frames. The content of these metadata blocks falls into three families; a brief sender-side sketch follows the list below.
- Internal metadata used by NDI itself. These messages handle connectivity and some other fundamental tasks, such as tally, and are typically invisible to NDI clients.
- Defined public metadata. These messages include things like the NDI PTZ Protocol. They are defined by NewTek as part of the NDI SDK.
- Third-party metadata schemas. These messages are implemented in the same way as defined NDI metadata, but the content is based on third-party designs.[35][36]
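As a sketch of the third family, the snippet below attaches a small custom XML block to an existing NDI sender as a stand-alone metadata frame. The XML element shown is purely hypothetical, and the structure and function names follow the NDI SDK's documented C API but should be checked against the installed SDK headers.

```cpp
// Sketch: sending a custom (third-party) XML metadata frame over an
// existing NDI sender instance (assumes the NDI SDK is installed).
#include <cstring>
#include <Processing.NDI.Lib.h>

void send_custom_metadata(NDIlib_send_instance_t sender) {
    // Hypothetical third-party schema: a single root element, as NDI expects.
    static const char xml[] =
        "<example_vendor_scoreboard home=\"3\" away=\"1\" clock=\"12:45\"/>";

    NDIlib_metadata_frame_t frame = {};
    frame.p_data   = const_cast<char*>(xml);               // UTF-8, NULL-terminated XML
    frame.length   = static_cast<int>(std::strlen(xml));
    frame.timecode = NDIlib_send_timecode_synthesize;      // let the SDK timestamp it

    NDIlib_send_send_metadata(sender, &frame);
}
```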
References
- ^ ProVideo Coalition (8 September 2015). "NewTek announces NDI, an open protocol for IP production workflow by Allan Tépper".
- ^ "NDI Encoding/Decoding". NewTek Knowledge Base.
- ^ "NDI Network Bandwidth". NewTek Knowledge Base. Archived from the original on 2021-04-11. Retrieved 2017-03-09.
- ^ "NewTek Announces NDI – Open Protocol for IP Production Workflow - Technical Standards". Content-technology.com.
- ^ "NewTek NDI SDK". Newtek.com.
- ^ "NDI® license agreement". new.tk. Retrieved 3 June 2020.
- ^ "Numerical Index of Smpte Standards, recommended Practices, Engineering Guidelines and Registered Disclosure Documents" (PDF). Smpte.org. Retrieved 2017-05-23.
- ^ "ASPEN Community - Home". aspen-community.com.
- ^ "Sony launches Networked Media Interface in collaboration with leading broadcast industry organisations : Press : United Kingdom : Sony Professional". Sony.co.uk. Archived from the original on 2017-03-22. Retrieved 2017-03-21.
- ^ "Cloud Digital Interface". aws.amazon.com.
- ^ "SpeedHQ". wiki.multimedia.cx. Retrieved 2020-07-31.
- ^ "NMI Core - Nextera Video Inc". Nexteravideo.com.
- ^ "What are the Networked Media Open Specifications?". Nmos.tv.
- ^ "NewTek NDI SDK". NewTek.
- ^ "TICO Alliance - Technology". www.tico-alliance.org.
- ^ "SMPTE RDD 35 - TICO Lightweight Codec Used in IP Networked or in SDI Infrastructures - Engineering360". standards.globalspec.com.
- ^ "NewTek announces NDI, an open protocol for IP production workflow by Allan Tépper". Provideocoalition.com. 8 September 2015.
- ^ Kerschbaumer, Ken (September 19, 2016). "NewTek's IP Series Video-Production System Targets New Markets". Sports Video Group.
- ^ "NDI AirSend Updater-NewTek". pages.newtek.com.
- ^ Bridge, The Broadcast (18 April 2016). "StudioCoast vMix Adopts NewTek NDI Standard - The Broadcast Bridge - Connecting IT to Broadcast". Thebroadcastbridge.com.
- ^ "XSplit Broadcaster Adopts NewTek NDI for IP Production Workflow". Marketwired.com. 2016-04-12. Retrieved 2017-05-23.
- ^ "NewTek NDI® Version 3 Offers the Only End-to-End IP Video Solution for Product Manufacturers". NewTek.
- ^ "Make collaboration the heart of your live streams, podcasts, and videos with Skype for Content Creators". Skype Blogs. 6 April 2018.
- ^ Kerschbaumer, Ken (5 September 2017). "EVS X-One Production System Takes Software-Based Tech to New Level". Sports Video Group.
- ^ "Avid Media Composer Integrates NewTek NDI®".
- ^ Baker, Chuck (December 6, 2017). "FFmpeg 3.4 Adds NDI I/O". NewTek.
- ^ "NewTek distributing non-free FFmpeg build". FFmpeg bug trac. December 3, 2018.
- ^ "Remove libndi_newtek". FFmpeg github repo. March 9, 2019.
- ^ a b Kurz, Phil (April 3, 2019). "NewTek's Andrew Cross Discusses Acquisition, NAB Show Plans".
- ^ "NDI Tools | NDI.tv".
- ^ "NVIDIA Broadcast Features Now Integrated Directly in Top Streaming Applications".
- ^ "NDI Tools | NDI.tv".
- ^ Gatarski, Richard (9 May 2016). "Documentary: Stockholm Kista cross production using NDI over fiber". Weststreamu.se.
- ^ "NewTek's Andrew Cross Discusses Acquisition, NAB Show Plans". 3 April 2019.
- ^ "NDI Metadata Standards". Sienna. Retrieved 2023-09-27.
- ^ NDIS Software
Network Device Interface
Fundamentals
Definition and Core Principles
Network Device Interface (NDI) is a royalty-free, software-based specification developed by NewTek, now part of Vizrt, that enables the transmission of high-quality, low-latency video, audio, and metadata over standard IP networks.[7] As an open protocol, NDI facilitates bi-directional communication between multimedia devices, allowing them to discover each other and share content in real-time without requiring proprietary hardware.[1] This approach contrasts with traditional hardware-dependent systems by leveraging commodity Ethernet infrastructure, such as gigabit switches, to support scalable live production workflows.[7]

At its core, NDI operates on principles of efficiency and interoperability, using UDP and TCP protocols for both discovery and data transmission to achieve frame-accurate switching suitable for live environments.[1] Discovery is handled via multicast DNS (mDNS) over UDP, enabling zero-configuration device identification across the network, while transmission supports both unicast (point-to-point) and multicast (one-to-many) modes to optimize bandwidth usage.[7] Video, audio, and metadata are integrated into synchronized streams, with audio accommodating unlimited channels and metadata enabling features like tally lights or control signals, all encoded for low-latency delivery over existing LANs.[1]

The basic workflow of NDI begins at the source device, where content is encoded—typically using efficient compression like SpeedHQ for full-bandwidth streams—before being packetized and sent via UDP for speed or TCP for reliability.[7] These packets traverse the IP network, utilizing standard Ethernet cabling and switches without the need for specialized video transport hardware, and arrive at receiver devices for decoding and playback.[1] This end-to-end process ensures high fidelity and minimal latency—in High Bandwidth mode, approximately 16 scan lines—making NDI ideal for professional broadcast and AV applications.[1]

Key Features and Benefits
One of the primary advantages of NDI is its low-latency transmission capability; the High Bandwidth mode achieves sub-frame latency of approximately 16 scan lines, while NDI HX modes deliver under 100 ms glass-to-glass latency on average.[1][8] These performance levels facilitate real-time switching in live production environments and ensure frame-accurate video and audio synchronization, making it suitable for applications requiring immediate feedback without perceptible delays.[9]

NDI's royalty-free licensing model, including the open SDK available for developers, promotes widespread adoption by eliminating financial barriers to integration.[10] Launched in 2015 by NewTek, this approach has enabled thousands of hardware and software products to incorporate NDI without ongoing royalty fees, fostering an expansive ecosystem.[11] The protocol's scalability stands out, supporting virtually unlimited sources and receivers on a single network segment, limited primarily by available bandwidth rather than inherent protocol constraints.[12] This design allows for expansive multi-camera setups where multiple devices can send and receive streams simultaneously over standard Ethernet infrastructure.

Key benefits include significant cost savings through the elimination of traditional SDI cabling, as NDI leverages existing IP networks to reduce installation and maintenance expenses—often reported as one-tenth the cost of equivalent SDI systems.[13] It also offers enhanced flexibility in multi-camera configurations, enabling seamless reconfiguration without physical rewiring, and integrates natively with production software such as TriCaster and vMix for streamlined workflows.[14] Regarding quality preservation, NDI supports full-frame HD and 4K resolutions with minimal compression artifacts, particularly in its High Bandwidth variant, which uses efficient encoding to maintain visual fidelity comparable to uncompressed signals.[8]

History and Development
Origins and Early Adoption
The Network Device Interface (NDI) originated from NewTek's AirSend technology, developed in the early 2010s as an IP-based tool to transmit video from external devices into NewTek's TriCaster production systems, addressing the constraints of traditional wired connections in live video workflows.[15][16] AirSend enabled initial experiments with network video delivery, particularly for integrating sources in broadcast environments where cabling limitations hindered flexibility.[17] This precursor laid the groundwork for NDI by focusing on low-latency IP transmission to overcome the rigidity of Serial Digital Interface (SDI) standards, which relied on dedicated coaxial cables and struggled with scalability in dynamic production settings.[18]

NDI's development emphasized solving SDI's shortcomings in broadcast production, such as high cabling costs and limited multi-device connectivity, through an open IP protocol for uncompressed, high-quality video over standard networks. Internal testing at NewTek refined these capabilities in the lead-up to public release, culminating in the official launch announcement on September 8, 2015. The first public demonstration occurred at the IBC 2015 conference in Amsterdam, where NDI showcased real-time video sharing across devices, marking a pivotal shift toward IP-centric live production.[19][15]

Early adoption faced challenges like network configuration complexities and compatibility with existing hardware, but it gained traction through NewTek's TriCaster systems, which integrated NDI as a core feature by 2016 to enable seamless input/output over IP.[17] Third-party tools accelerated initial uptake, with vMix adding NDI support in March 2016 to facilitate multi-source live switching over networks, followed by XSplit's integration later that year for enhanced streaming workflows. These implementations highlighted NDI's versatility beyond NewTek ecosystems, fostering experimentation in small-scale broadcasts and events. Key milestones included the formation of NDI user communities to share best practices and troubleshoot adoption hurdles, alongside strategic partnerships following Vizrt's 2019 acquisition of NewTek, which expanded NDI's reach into broader visual storytelling applications.[20][21][22]

Version Timeline and Milestones
The development of Network Device Interface (NDI) has progressed through several major versions since its initial public release, each introducing enhancements to performance, compatibility, and functionality to meet evolving demands in video production workflows.[23][24]

NDI version 1.0 was released in April 2016 as the initial software development kit (SDK), providing foundational support for high-definition (HD) video transmission over IP networks using standard Ethernet infrastructure.[17] This version established the core protocol for low-latency, uncompressed video sharing among compatible devices. In September 2016, version 2.0 followed, incorporating basic support for 4K resolution and high dynamic range (HDR) workflows, along with cross-subnet discovery capabilities via the NDI Access Manager tool.[25] These additions expanded NDI's applicability beyond single-network environments. Version 3.0, launched in August 2017, focused on refining network discovery mechanisms and introducing access control features to enhance security and reliability in multi-device setups.[26] By April 2019, version 4.0 debuted at the NAB Show, introducing the NDI Access Manager for advanced security configurations, unlimited NDI channel recording, and improved synchronization with time-stamped metadata.[24][27] Version 5.0 arrived in July 2021, enabling full 4K at 60 frames per second (4K60p) support and the NDI Bridge utility for wide-area network (WAN) transmission, facilitating remote production over the internet.[28][19]

In April 2024, version 6.0 was unveiled at the NAB Show, adding native HDR metadata handling, 16-bit color depth for higher fidelity, and Linux support for the NDI Bridge.[29][30] Version 6.1, released in December 2024, extended 16-bit color support to field-programmable gate array (FPGA) platforms and introduced a dynamic bandwidth adjustment API for optimized transmission in variable network conditions.[23][31] Version 6.2, released in June 2025, incorporated the NDI Receiver API for enhanced device integration and an updated Discovery Server for improved network observability and management.[32][3] In September 2025, version 6.3 was released at IBC, building on the control layer from version 6.2 by enabling third-party applications to control NDI sources and receivers, further enhancing interoperability and management in complex workflows.[33]

Key milestones in NDI's adoption include its integration with Amazon Web Services (AWS) infrastructure in 2022, enabling resilient video switching and routing within virtual private clouds for cloud-based broadcasts.[34] This advancement supported scalable, remote production environments, as demonstrated in high-profile events like the National Hockey League's cloud-based game productions.[35]

Technical Specifications
Protocol Mechanics and Codec
The Network Device Interface (NDI) protocol employs UDP as its primary transport mechanism for delivering video and audio streams, enabling low-latency transmission over IP networks.[36] For device discovery, NDI utilizes multicast DNS (mDNS) to allow zero-configuration identification of sources and receivers on the local network without manual IP addressing.[37] Additionally, NDI supports multicast distribution, where a single source can efficiently deliver streams to multiple receivers using UDP, provided the network supports IGMP for subscription management; this reduces bandwidth overhead in multi-point scenarios.[38]

At the core of NDI's video transmission is the proprietary SpeedHQ codec, also referred to in earlier documentation as the Scalable High Quality (SHQ) codec, which provides variants optimized for different bandwidth needs.[39] SHQ 0 is designed for lower bandwidth applications, while SHQ 2 and SHQ 7 target higher quality outputs, achieving visually lossless compression through spatial techniques using discrete cosine transform (DCT).[40] Unlike inter-frame codecs such as H.264, SpeedHQ emphasizes intra-frame spatial compression to minimize encoding and decoding delays, resulting in ultra-low latency suitable for live production—typically 16 scan lines or less in software implementations.[40]

In terms of transmission mechanics, NDI encodes video frames at approximately 100 Mbit/s for 1080i resolutions in high-bandwidth mode, balancing quality and network efficiency.[41] Audio is transmitted as uncompressed pulse-code modulation (PCM) at 48 kHz with up to 16 channels in 32-bit floating-point format, ensuring synchronization with video.[42] Metadata, including tallies, PTZ controls, and timecode, is embedded via discrete XML-formatted packets sent over the same connection, allowing flexible integration without separate streams.[43]

For reliability, NDI incorporates forward error correction (FEC) in UDP-based streams, particularly for multicast, by adding redundant data packets that allow receivers to reconstruct lost information without retransmission requests.[36] In NDI version 5 and later, Reliable UDP (RUDP) enhances this with selective retransmissions, sequencing, and congestion control, providing TCP-like reliability while preserving UDP's low latency for error-prone environments.[36]

Supported Formats and Performance
NDI supports a wide range of video formats, accommodating resolutions from standard definition (SD) up to 8K and beyond, with frame rates scaling automatically to match source material, including progressive formats like 4K60p.[44][45] Color depths include 8-bit and 10-bit in YUV formats (e.g., UYVY with 4:2:2 subsampling) and RGBA for alpha channel support, with internal pipeline processing up to 16-bit for higher fidelity; as of NDI 6, 10-bit color transmission is natively supported in both High Bandwidth and HX formats.[46] In NDI 6, HDR is officially supported via HLG and PQ transfer functions, enabling 10+ bit color for enhanced dynamic range in both NDI High Bandwidth and HX formats. As of NDI 6.2 (2025), core specifications remain consistent with NDI 6, with added network monitoring features.[3][46] Audio transmission in NDI handles up to 16 channels of floating-point PCM audio, with a standard sample rate of 48 kHz for broadcast compatibility, though any input sample rate is accepted with automatic resampling as needed.[44][47]

Performance metrics for NDI emphasize low latency and efficient bandwidth usage, particularly on local area networks (LAN). On Gigabit Ethernet LANs, end-to-end latency typically measures under 16 ms for High Bandwidth streams, equivalent to about one frame at 60 fps, achieved through a technical latency of 16 scan lines.[44][48] Over wide area networks (WAN) using tools like NDI Bridge, latency increases to approximately 100 ms due to additional encoding and network traversal, suitable for remote production but less ideal for real-time interactivity.[49] Data rates vary by resolution and format; for example, a 1080p60 High Bandwidth stream requires around 100-150 Mbit/s, while 4K60p can reach 250 Mbit/s, and lower resolutions like SD use as little as 20 Mbit/s.[44][41] In NDI HX variants, bandwidth scales down to 50 Mbit/s or less for 1080p60, enabling higher stream density.[45]

Testing and scalability benchmarks require a minimum CPU with SSSE3 instruction set support, introduced in Intel processors in 2006, ensuring compatibility across modern x86 and ARM platforms with NEON.[50] On a standard Gigabit network, NDI High Bandwidth supports 6-8 simultaneous 1080p60 streams, while HX formats allow for 20+ streams; overall, systems can handle over 100 low-bandwidth or audio-only streams depending on hardware and configuration.[41] These rates leverage the SHQ codec for visually lossless compression with PSNR exceeding 70 dB.[44]

| Format Example | Resolution/Frame Rate | Bandwidth (Mbit/s) | Typical Latency (LAN) |
|---|---|---|---|
| SD | 720x480p30 | ~20 | <16 ms |
| HD | 1080p60 | 100-150 | <16 ms |
| UHD | 4K60p | ~250 | <16 ms |
| Audio-only | 16 ch, 48 kHz | <1 per channel | ~6 ms |
Comparisons with Other Protocols
NDI vs. SMPTE Standards
Network Device Interface (NDI) differs from SMPTE standards like ST 2022-6/7 and ST 2110 in its approach to IP video transport, balancing accessibility with performance for diverse production environments. NDI employs lightweight compression to enable efficient distribution over standard networks, while SMPTE ST 2022-6/7 encapsulates uncompressed SDI signals into IP packets for direct migration from legacy systems, and ST 2110 separates video, audio, and metadata into independent essence streams for greater flexibility in professional setups.[51][52][53] The following table highlights core protocol differences:

| Aspect | NDI | SMPTE 2022-6/7 | SMPTE 2110 |
|---|---|---|---|
| Compression | Yes, low-latency (e.g., SpeedHQ or H.264 for reduced bandwidth) | No, uncompressed video and audio | Variable: uncompressed (ST 2110-20), or low-latency compressed (ST 2110-22 with JPEG XS) |
| Multicast Support | Yes, over IP networks | Yes, IP multicast for distribution | Yes, native IP multicast for essence streams |
| Latency | Low, but elevated by compression processing | Low, akin to SDI transport | Sub-frame low, optimized for real-time synchronization via PTP |
Bandwidth and Latency Trade-offs
NDI employs lightweight compression via its SpeedHQ codec to significantly reduce bandwidth requirements compared to uncompressed video transport standards. For 1080p60 high-definition video, NDI typically consumes 100-150 Mbit/s per stream, enabling multiple simultaneous streams over a standard Gigabit Ethernet network.[41] In contrast, uncompressed HD video over IP, as in SMPTE ST 2110 workflows, demands approximately 1.5 Gbit/s, necessitating higher-capacity 10 Gbit/s infrastructure for efficient handling.[59] This compression efficiency allows NDI to support professional video distribution in bandwidth-limited environments without sacrificing visual fidelity, though it introduces a modest processing overhead.

The primary trade-off in NDI arises in latency, where compression encoding and decoding add delay to achieve bandwidth savings. NDI's technical latency is equivalent to 16 video scan lines, which for 1080p60 corresponds to approximately 0.25 ms (one frame is 16.7 ms), with end-to-end latency often around one frame (16.7 ms) in optimized hardware implementations.[40] Total latency can be modeled as the sum of encoding, network transit, and decoding delays (T_total = T_encode + T_network + T_decode); NDI minimizes the encoding component by using intra-frame compression that avoids inter-frame dependencies, achieving technical latency equivalent to 16 scan lines.[40][60]

Relative to alternatives, NDI offers lower latency than SRT, which typically exhibits 50-200 ms variable delay due to its adaptive buffering for unreliable networks, though SRT achieves this at much lower bandwidths of 2-8 Mbit/s for HD.[61] NDI's bandwidth remains higher than dedicated H.265 streaming solutions, which compress HD to 20-50 Mbit/s but incur 100-300 ms encoding/decoding latency from motion estimation.[62] For bandwidth-constrained scenarios, NDI supports optimization through modes like NDI HX3, which reduces usage to ~50 Mbit/s for 1080p60 while maintaining latency under 100 ms, or proxy streams via SpeedHQ profiles (e.g., lower-quality SHQ variants) to further limit consumption.[63][64]

Applications and Use Cases
In Local and Broadcast Networks
In local networks, Network Device Interface (NDI) facilitates seamless multi-camera switching in controlled environments such as television studios and outside broadcast (OB) trucks, enabling the transmission of high-quality video, audio, and metadata over standard Ethernet infrastructure.[2] For instance, in news production workflows, NDI integrates with systems like Vizrt's TriCaster, allowing producers to route multiple camera feeds and graphics sources dynamically without physical reconfiguration, supporting up to 16 or more inputs in a single production pipeline.[65] This approach is particularly valuable in compact OB trucks, where space constraints demand efficient signal routing; a notable example is the world's first fully NDI-based OB vehicle developed by Kiloview and Youku, which handles 20 channels of 4K or 70 HD streams over Ethernet, streamlining operations for live events like sports broadcasts.[66]

NDI's integration into broadcast environments often replaces traditional SDI matrices by leveraging IP workflows that support an effectively unlimited number of sources, limited only by network bandwidth rather than dedicated hardware ports.[67] In studio setups, this shift enables ad-hoc connections between devices, such as cameras, switchers, and monitors, fostering flexible production without the need for extensive routing switchers.[68] By consolidating audio, video, control, and tally signals into a single Ethernet cable, NDI significantly reduces cabling complexity and costs compared to SDI infrastructures, which require separate coaxial cables for each signal type.[69]

A key advantage in these local and broadcast applications is the support for hot-swappable devices, allowing cameras or peripherals to be added or removed during live productions without interrupting the workflow, as NDI automatically discovers and integrates new sources on the network.[2] This feature, combined with NDI's low-latency performance, ensures reliable real-time switching essential for time-sensitive broadcasts.[70]

In Wi-Fi, WAN, and Remote Production
NDI supports wireless transmission over Wi-Fi networks, particularly on the 5GHz band, which provides higher bandwidth suitable for short-range applications in environments free of significant obstacles.[71] This configuration enables low-latency video feeds from devices like cameras to receivers within close proximity, such as in small studios or event spaces, but the signal's range is limited compared to wired Ethernet, often extending only tens of meters indoors due to wall penetration issues.[72] Interference from other 2.4GHz or 5GHz devices, including overlapping Wi-Fi networks, can degrade performance, necessitating mitigation strategies like channel selection and dedicated access points to maintain stream stability.[73] For bandwidth-constrained Wi-Fi setups, the NDI HX codec reduces data rates to support reliable transmission over variable wireless connections.[74]

To extend NDI beyond local networks to wide area networks (WAN) and enable remote production, NDI Bridge was introduced in 2021 as part of NDI 5. This tool facilitates secure tunneling of full NDI streams over the public internet using 256-bit AES encryption and RUDP transport, allowing remote interconnection of NDI infrastructures without requiring VPNs.[75] It supports transcoding to H.264 or HEVC for efficient bandwidth use, preserving features like audio, metadata, and KVM control, and is configured in host-join modes for point-to-point or multipoint connections.[76] In field production scenarios, such as sports events, NDI Bridge enables real-time collaboration between on-site crews and distant control rooms, eliminating the need for costly satellite or microwave links by optimizing for variable internet conditions.[77]

Practical applications include remote interviews, where news teams use NDI Bridge to securely transmit high-quality feeds from global locations into central studios, ensuring synchronized audio and video with minimal latency.[75] For major events, NDI powered remote production at the 2024 Paris Olympics badminton competition, where Kiloview encoders distributed signals efficiently without overwhelming network bandwidth, supporting multi-camera setups for live coverage.[78] In remote production scenarios, RTMP streams from distant sources can be ingested using free tools like nginx with its RTMP module as a lightweight ingest server and OBS Studio with the NDI plugin to convert the stream to NDI output, enabling low-latency integration into NDI workflows over WAN connections. For detailed setup instructions, refer to the SDK, Tools, and Integration section.[79][80] These implementations highlight NDI's adaptability for WAN environments through features like dynamic buffer adjustments and codec selection, which maintain performance across fluctuating connections typical in remote field operations.[76]

In Cloud-Based Systems
Network Device Interface (NDI) has become integral to cloud-based video production workflows, enabling seamless integration with major cloud providers for distributed and scalable operations. In Amazon Web Services (AWS), NDI support is provided through AWS Elemental MediaConnect, which allows for the conversion of MPEG transport streams into NDI outputs, facilitating low-latency live video contribution directly to the cloud. This integration supports streams up to 1080p60 in AVC/HEVC formats, making it suitable for high-quality delivery within virtual private clouds (VPCs).[81][82] Microsoft Azure supports NDI through virtual machines and infrastructure running NDI-enabled software, where instances handle video switching and routing over IP networks, as demonstrated in setups for broadcast-quality streaming using VMs such as Standard_B2s running Windows or Linux.[83]

NDI's deployment in cloud virtual machines (VMs) enhances remote editing capabilities by allowing editors to access and manipulate high-definition video feeds in real-time from anywhere, without the need for physical hardware. Using Azure VMs, NDI tools like the Discovery Server and applications such as OBS Studio enable sender-receiver configurations for collaborative editing in virtual studios. This setup supports low-latency transmission for live events and post-production, bridging local capture with cloud processing. NDI Bridge can briefly extend WAN connectivity to these cloud environments for hybrid access.[83][34]

Key use cases include cloud-based live streaming for esports and sports events, where NDI enables efficient production at scale. For example, Blizzard Entertainment utilized NDI alongside SRT for cloud-based esports productions, allowing real-time video routing and mixing in distributed environments to handle multiple sources during live tournaments.[84] Hybrid workflows combining on-premises capture with cloud processing are also common; as of 2024, NDI 6 introduced enhancements like improved metadata handling and scalability for seamless transitions between local and remote setups in applications like live playout from on-site archives distributed via cloud.[23][85]

The benefits of NDI in cloud systems center on elastic scaling and global distribution, permitting dynamic resource allocation to match production demands without fixed infrastructure costs. This elasticity supports bursting to handle peak loads in live events, while global distribution leverages multi-region deployments—such as separate NDI flows in AWS across regions—for low-latency worldwide delivery and synchronization. Tools like dedicated NDI Discovery Servers in cloud VPCs ensure reliable multi-region sync, enhancing reliability for international collaborations.[86][82][87]

Hardware and Software Support
CPU and Platform Compatibility
The Network Device Interface (NDI) requires CPUs supporting the SSSE3 instruction set as a minimum for x86 architectures, equivalent to Intel Core 2 Duo or later processors, ensuring basic encoding and decoding functionality.[50] For ARM-based systems, NEON SIMD extensions are mandatory to handle NDI's video processing demands.[50] While these minimums suffice for standard-definition workflows, higher resolutions such as 4K benefit from AVX2 instructions on Intel platforms, which optimize performance through 256-bit vector operations and reduce latency in multi-stream scenarios.[88]

NDI maintains broad cross-platform compatibility across desktop, mobile, and embedded systems. It fully supports 64-bit versions of Windows 7 and later, macOS 10.13 (High Sierra) and newer, and Linux distributions including Ubuntu 18.04 LTS onward, with x64 providing optimal throughput on all OSes. Mobile integration is available via official apps for iOS and Android, enabling devices like smartphones to send or receive NDI streams directly.[6] Embedded platforms such as Raspberry Pi are compatible through ARM64 Linux builds, supporting lightweight NDI reception and transmission in resource-constrained environments.[11]

A 2024 update introduced full native support for ARM64 architectures across macOS, Linux, iOS, and Android, enhancing efficiency on Apple Silicon and other ARM processors without emulation overhead.[11] For FPGA implementations, NDI 6.1 and later versions provide IP cores optimized for Xilinx and Intel (formerly Altera) devices, enabling hardware-accelerated encoding and decoding in broadcast-grade setups as of 2025.[89]

SDK, Tools, and Integration
The NDI Software Development Kit (SDK) is a free resource provided by Vizrt for developers to integrate NDI functionality into their applications, enabling the sending and receiving of high-quality video and audio over IP networks.[11] Available for download from the official NDI website, the SDK supports multiple programming languages through its APIs, including C/C++ for low-level access, .NET for Windows-based development, and Java for cross-platform compatibility.[90] It includes core libraries for NDI send and receive operations, allowing developers to implement source discovery, stream transmission, and frame handling without proprietary hardware dependencies.[90]

Complementing the SDK, NDI offers a suite of free tools to facilitate testing, management, and deployment in production environments. NDI Studio Monitor provides a simple application for viewing and monitoring NDI streams, supporting audio and video playback with low-latency preview capabilities on Windows and macOS.[91] NDI Access Manager enables administrators to control source visibility and permissions across networks, ensuring secure and selective discovery of NDI endpoints.[92] The NDI Discovery Server tool extends multicast discovery to larger or segmented networks by centralizing source announcements, which is particularly useful in enterprise setups where mDNS may be limited.[93] These tools are bundled in the NDI Tools package, version 6.2.1 as of late 2025, and can be downloaded directly from the NDI site.[91]

For broader ecosystem adoption, NDI integrates seamlessly with third-party software through plugins and native support. Popular applications like OBS Studio incorporate NDI plugins for capturing and streaming NDI sources in live production workflows, while Wirecast from Telestream uses NDI for input and output in professional broadcasting.[91] Developers can embed NDI into custom applications via SDK plugins; for instance, the open-source KlakNDI plugin allows Unity projects to send and receive NDI streams, enabling real-time video feeds in AR/VR environments such as virtual production or immersive simulations.[94] This plugin-based approach simplifies integration, requiring minimal code changes to leverage NDI's plug-and-play interoperability.[95]

A practical example of using free tools for RTMP ingest and conversion to NDI output involves nginx with its RTMP module as a lightweight ingest server and OBS Studio with the NDI plugin to pull and output the stream as NDI. This setup enables remote or local stream ingestion into NDI workflows with low latency. The steps are:
1. Install nginx with the RTMP module.
2. Add an RTMP block to nginx.conf, for example:
rtmp {
server {
listen 1935;
chunk_size 4096;
application live {
live on;
allow publish all;
allow play all;
}
}
}
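3. In OBS Studio, add the published RTMP stream (for example, rtmp://<server-ip>/live/<stream-key>) as a Media Source, then enable NDI output through the NDI plugin (typically under Tools > NDI Output Settings) so the ingested stream reappears on the network as an NDI source; exact menu locations vary with the OBS and plugin versions in use.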
Advanced Capabilities
Metadata Handling
NDI's metadata handling employs an extensible XML-based format to embed ancillary data directly into streams, ensuring synchronization with video and audio frames for seamless integration in professional production environments. This structure allows metadata packets to be attached to specific frames, maintaining frame-accurate timing without disrupting the primary audiovisual content. The XML adheres to standards where a single root element encapsulates the data, and multiple elements are grouped using the <ndi_metadata_group> container introduced in NDI 6.0 for video frames and 6.1 for generic metadata.[43][98][99]
Key supported metadata types include tally signals for indicating on-air or preview status, under monitor display (UMD) information for labeling sources on production monitors, timecode for precise synchronization across devices, and captions for accessibility features. Tally and UMD are typically conveyed through connection-oriented XML elements, such as <ndi_product> with attributes like session_name for UMD labels, while timecode utilizes timestamp fields in tracking or frame metadata to align with frame timing. Captions, including legacy closed captioning and CEA-708 support, are encoded by representing SDI vertical ancillary data packets directly in XML format, enabling bidirectional transmission for real-time processing. These elements facilitate interoperability in broadcast setups, where, for instance, tally metadata can trigger lighting cues or camera indicators automatically.[98][100][101]
In practice, metadata is inserted frame-accurately at the source using the NDI SDK, allowing senders to attach XML payloads to outgoing frames for immediate synchronization. Receivers extract this data upon ingestion, enabling automation workflows such as auto-switching in video mixers based on tally states or overlaying UMD text on multiviewers without additional network latency. This bidirectional capability extends to control signals, where extracted metadata informs downstream decisions, like routing based on timecode alignment or rendering captions in sync with video playback.[102][99]
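A receiver-side counterpart can be sketched as follows: a polling loop that handles incoming metadata frames separately from video and audio. As above, the type and function names follow the NDI SDK's documented C API and should be verified against the SDK headers.

```cpp
// Sketch: extracting metadata frames on an NDI receiver
// (assumes the NDI SDK is installed and recv is an already connected instance).
#include <cstdio>
#include <Processing.NDI.Lib.h>

void poll_once(NDIlib_recv_instance_t recv) {
    NDIlib_video_frame_v2_t video = {};
    NDIlib_audio_frame_v2_t audio = {};
    NDIlib_metadata_frame_t meta  = {};

    switch (NDIlib_recv_capture_v2(recv, &video, &audio, &meta, 1000)) {
        case NDIlib_frame_type_metadata:
            std::printf("Metadata: %s\n", meta.p_data);   // XML payload, e.g. tally or UMD
            NDIlib_recv_free_metadata(recv, &meta);
            break;
        case NDIlib_frame_type_video:
            NDIlib_recv_free_video_v2(recv, &video);      // video handled elsewhere
            break;
        case NDIlib_frame_type_audio:
            NDIlib_recv_free_audio_v2(recv, &audio);      // audio handled elsewhere
            break;
        default:
            break;                                        // timeout / no data
    }
}
```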
Advanced features in NDI 6.0 and later enhance metadata for high-dynamic-range (HDR) workflows, incorporating the <ndi_color_info> element to specify transfer characteristics (e.g., BT.2100 HLG), color primaries, and luminance metadata synchronized per video frame. This ensures HDR streams maintain color accuracy across devices, with metadata payloads adding negligible overhead—typically up to 1 Mbit/s per stream—to the overall bandwidth while supporting complex professional applications. Integration with the NDI SDK simplifies embedding and parsing of these elements for custom automation.[103][98][41]
