Nice to see an article from my industry! ST 2110 is such a complex standard, and a lot of the hardware mentioned has been molded around dealing with it.
That rack cabling is a bit rough. Appreciate it's a live event (I've worked on them myself) but come on :)
Unfortunate typo in the headline, reproduced here and once in the article. This is not about email or spam.
> like why they use bundles of analog copper wire for audio instead of digital fiber
Good article. That quoted line got me to read it because I was curious why...
Neat post. I wonder what the drift is on those clocks.
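For reference: the clocks in an ST 2110 facility are disciplined with PTP (per ST 2059), so "drift" shows up as the estimated offset trending between corrections. A minimal Python sketch of the standard IEEE 1588 offset/delay arithmetic, with made-up timestamps (real values come off the wire):

    # IEEE 1588 (PTP) offset/delay estimation from one Sync/Delay_Req exchange.
    # t1: master sends Sync; t2: slave receives it;
    # t3: slave sends Delay_Req; t4: master receives it.
    # Assumes a symmetric network path; all values in nanoseconds (made up).
    def ptp_offset_and_delay(t1, t2, t3, t4):
        offset = ((t2 - t1) - (t4 - t3)) / 2  # slave clock minus master clock
        delay = ((t2 - t1) + (t4 - t3)) / 2   # one-way path delay estimate
        return offset, delay

    # Two exchanges one second apart; the change in offset per second is the
    # drift rate (ns/s, i.e. parts per billion).
    o1, _ = ptp_offset_and_delay(1_000, 1_130, 2_000, 2_110)
    o2, _ = ptp_offset_and_delay(1_000_001_000, 1_000_001_145,
                                 1_000_002_000, 1_000_002_110)
    print(o1, o2, o2 - o1)  # 10.0 17.5 7.5 -> ~7.5 ppb between corrections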
Fun to see.
PipeWire has some decent AES67 support for network audio, and some really interesting hardware has already been tested with it (rough sketch of the wire format below). Afaik no SMPTE 2110 support for the video side (the 2110-30 audio flavor is essentially AES67), but I don't really know.
I know it's not the use case, but I do wish compressed formats were better supported. Not really necessary for production, but these are sort of the only de facto broadly capable network protocols we have for AV, so it would expand the potential uses a lot IMO. There is JPEG XS compression (ST 2110-22), though implementations tend to be proprietary; generally the target seems to be uncompressed.
https://gitlab.freedesktop.org/pipewire/pipewire/-/wikis/AES...
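Part of why PipeWire can speak AES67 is that the wire format is simple: plain 16/24-bit PCM in RTP packets (RFC 3550 header, RFC 3190 L24 payload), with PTP for timing and SAP/SDP for discovery. A rough sketch of one 1 ms, 48 kHz L24 packet; the payload type here is a placeholder (real ones are dynamic and negotiated via SDP):

    import struct

    # One AES67-style RTP packet: 12-byte RFC 3550 header + L24 PCM payload.
    def rtp_l24_packet(seq, rtp_ts, ssrc, samples, payload_type=96):
        header = struct.pack("!BBHII",
                             0x80,                  # V=2, no padding/ext/CSRC
                             payload_type & 0x7F,   # marker bit 0
                             seq & 0xFFFF,
                             rtp_ts & 0xFFFFFFFF,
                             ssrc)
        # L24: 3 bytes per sample, big-endian, channels interleaved.
        payload = b"".join(s.to_bytes(3, "big", signed=True) for s in samples)
        return header + payload

    # 48 samples of mono silence = one 1 ms packet at 48 kHz.
    pkt = rtp_l24_packet(seq=0, rtp_ts=0, ssrc=0x12345678, samples=[0] * 48)
    print(len(pkt))  # 12 + 144 = 156 bytes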
I'm amused but not entirely surprised to see that live video production hasn't meaningfully progressed since I was involved 30+ years ago.
Yes, the technology has evolved: digital instead of analog (only partly: analog comms are used here because the digital/optical version "isn't redundant", lol, what?), higher resolution, digital overlays and effects, etc. But the basic process of a bunch of humans winging it and yelling to each other hasn't changed at all.
This is an industry ripe for massive disruption, and the first to do it will win big.
Seems like the classic overengineered legacy thing: it costs 100x normal production costs because it's a niche system, is 10x more complex than needed due to unnecessary perfectionism, and uses 10-100x more people than needed due to employment inertia.
A more reasonable approach would be to just use high-quality cameras, connect them to the venue's fiber Internet connection, and use a normal networked transport like H.265 in MPEG-TS over RTP (sports fans certainly don't care about recompression quality loss...). Do time sync by putting good clocks and A/V sync on each device and aligning on audio loud enough to be recorded by all of them, then mix, re-encode, and distribute on normal GPU-equipped datacenter servers using GPU acceleration.
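To be fair to the "align on audio" part, the core trick is easy to sketch: cross-correlate a shared loud event across two recordings and take the peak lag as the offset. A toy numpy version (fabricated signals; real footage would also need resampling and ongoing drift correction, which is exactly what genlock/PTP solve):

    import numpy as np

    # Estimate how many samples later a shared event appears in `a` than in `b`.
    # Assumes identical sample rates and negligible clock drift.
    def audio_offset_samples(a, b):
        corr = np.correlate(a, b, mode="full")
        return np.argmax(corr) - (len(b) - 1)

    rate = 48_000
    t = np.arange(2_000) / rate
    clap = np.exp(-400 * t) * np.sin(2 * np.pi * 1000 * t)  # fake transient
    a = np.concatenate([np.zeros(1000), clap])   # event at sample 1000
    b = np.concatenate([np.zeros(4000), clap])   # same event at sample 4000

    off = audio_offset_samples(a, b)
    print(off, off / rate * 1000, "ms")  # -3000 samples = -62.5 ms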
You might also want to mention AMWA NMOS, which is increasingly used alongside SMPTE 2110 in setups like this. NMOS (Networked Media Open Specifications) defines open, vendor-neutral APIs for device discovery, registration, connection management, and control of IP media systems. In practice, it's what lets 2110 devices automatically find each other, advertise their streams, and be connected or reconfigured via software.
The specs are fully open source and developed in the open, with reference implementations available on GitHub (https://github.com/AMWA-TV).
They define REST APIs, JSON schemas, certificate provisioning, and service discovery mechanisms (DNS-SD/mDNS), providing an open control framework for IP-based media systems.
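To make "connected or reconfigured via software" concrete, here's a rough sketch of driving one connection with the IS-04 Node API and IS-05 Connection API. The URL shapes follow the published specs; the node address and receiver ID are placeholders, and real code would discover the node via DNS-SD/mDNS rather than hard-coding it:

    import requests

    NODE = "http://192.0.2.10"  # placeholder; normally found via DNS-SD/mDNS

    # IS-04 Node API: list the senders this node advertises.
    senders = requests.get(f"{NODE}/x-nmos/node/v1.3/senders").json()
    sender = senders[0]

    # The sender's manifest_href points at its transport file (an SDP).
    sdp = requests.get(sender["manifest_href"]).text

    # IS-05 Connection API: stage the SDP onto a receiver and activate now.
    receiver_id = "11111111-2222-3333-4444-555555555555"  # placeholder UUID
    r = requests.patch(
        f"{NODE}/x-nmos/connection/v1.1/single/receivers/{receiver_id}/staged",
        json={
            "sender_id": sender["id"],
            "master_enable": True,
            "activation": {"mode": "activate_immediate"},
            "transport_file": {"data": sdp, "type": "application/sdp"},
        },
    )
    print(r.status_code)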