Introduction to Broadcast Studio Workflows

This is a short description of how broadcast studios work and which components are essential in order to record a technically good programme. The components are also documented with examples.

Description

A broadcast studio consists of many different components. Some of them are optional, and studios can produce programmes without them, but others are essential in order to fully operate a studio. The list below covers the essential components and briefly explains the role each plays in a studio.

Camera Channels: A system camera is a very obvious component in a broadcast studio, as it captures the live images. "Camera channel" means that the camera is a system camera connected to the machine room with a single cable carrying two-way signal flow. Video and audio signals travel from the camera to a CCU (Camera Control Unit), while camera comms channels, racking data and return video signals travel from the CCU back to the camera. All of these signals are carried on one cable (Triax, SMPTE Fibre or Multicore) rather than on a separate cable each. From the CCU, a video signal is then taken to feed the Vision Mixer.
Vision Mixer: A Vision Mixer takes all the video signals (cameras, playback machines etc.) and computer graphics signals for lower thirds, and lets the director or the person vision mixing decide which video to take to programme. In almost every case, every video signal going into the mixer has to be of exactly the same standard in order to produce a consistent output. Some state-of-the-art vision mixers have built-in up- and down-converters, so a video signal of a lower (or higher) resolution can be converted to match the other video signals.
Audio Mixer: An Audio Mixer is a machine that takes audio from various sources (e.g. microphones) and processes it so that it can be broadcast. The levels of the different microphones and other audio sources are also matched here to produce good-quality audio in a programme.
Microphones: A microphone converts voice and other sound into an electrical signal that travels down a cable to a destination. A typical destination is an audio mixer, but many cameras also have microphone inputs.
Comms (Camera/Presenter Talkback): The most important thing for a programme to happen successfully is communication between all participants. If communication between the director and the camera operators fails, the production is at serious risk, and if communication between the producer/director and the presenters fails, the presentation is at risk. A good means of communication between all participants is a centralised comms matrix, to which all the cameras, presenter earpiece channels, desk talkback panels and perhaps radio base stations are connected. The matrix software lets the operator configure which participants can speak to each other.
Monitor Screens: Monitor screens are used to watch the live video coming from the cameras, the vision mixer and any other video sources that need to be monitored. Modern devices called Multiviewers take in many video inputs and display them all on one screen, arranged in whatever order the user wants. A broadcast studio should also have a grading monitor for racking the cameras; this monitor is normally connected to a switcher, ideally the video matrix, so the vision engineer can switch between the cameras to rack them.
Speakers: Speakers are sound output devices and are essential so that the sound operator can hear what they are mixing and the production crew in the production control room knows what is happening in the show.
SPG (Sync Pulse Generator): A Sync Pulse Generator (SPG) generates a timing reference for broadcast equipment such as cameras, the vision mixer and other kit connected to the broadcast infrastructure. Without a sync pulse, each piece of equipment runs on its own timing, so its pictures start and end at different points from everyone else's, which causes problems when pictures are mixed together. Sync issues are easy to spot: if a camera's picture rolls down on the vision mixer preview, it is not synced (horizontal line sync issue).
DAs (Distribution Amplifiers): Every piece of kit has a limited number of signal outputs, and more outputs are often required, so distribution amplifiers are used to multiply those signals. The outputs of a DA have the same signal strength as the input. Analogue video DAs boost the analogue video signal; however, any distortion or noise in the signal is boosted as well and is worse at the DA output. Digital DAs work differently from analogue DAs and are used to distribute SDI signals: they regenerate the digital video signal with the help of the error correction embedded in SDI and remove minor errors. A common use of DAs is to distribute the sync pulse across the kit, because an SPG usually has no more than a few outputs but the signal is needed everywhere.
Video Matrix (Router): A Video Matrix, also called a Router, has a number of inputs and a number of outputs. Different video signals can be routed by pressing buttons instead of changing cables, and a single signal can be routed to multiple destinations at once. This matters in a broadcast studio because signal routes can be changed remotely without having to access the machine room and physically re-plug cables.
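The routing behaviour described above (press a button instead of re-plugging a cable, and send one source to many destinations) can be sketched in a few lines of Python. This is a minimal model, not a real router's control API, and all the input and output names are made up for illustration.

```python
# Minimal sketch of a video matrix (router): every output carries exactly
# one input at a time, but one input may feed many outputs.

class VideoMatrix:
    def __init__(self, inputs, outputs):
        self.inputs = list(inputs)
        # Each output starts unassigned (None = nothing routed to it yet).
        self.routing = {out: None for out in outputs}

    def route(self, source, *destinations):
        """Route one input to one or more outputs, like pressing
        crosspoint buttons on a router control panel."""
        if source not in self.inputs:
            raise ValueError(f"unknown input: {source}")
        for dest in destinations:
            if dest not in self.routing:
                raise ValueError(f"unknown output: {dest}")
            self.routing[dest] = source

    def source_of(self, destination):
        return self.routing[destination]


matrix = VideoMatrix(
    inputs=["CAM 1", "CAM 2", "SPG BLACK", "VT PLAYBACK"],
    outputs=["MIXER IN 1", "MIXER IN 2", "GRADING MONITOR", "MCR FEED"],
)

# One signal to multiple destinations, without touching a single cable:
matrix.route("CAM 1", "MIXER IN 1", "GRADING MONITOR")
matrix.route("CAM 2", "MIXER IN 2")
print(matrix.source_of("GRADING MONITOR"))  # CAM 1
```

The one-to-many fan-out in `route` is exactly why a matrix pairs naturally with DAs: the matrix decides where a signal goes, and DAs provide enough physical copies of it.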
All of the kit mentioned above is essential to make a broadcast studio work, and the way it is connected together is shown in the simple CAD drawing below. The drawing was made for a small broadcast studio's machine room with scalable core parts, such as the Router, Vision Mixer and reference DAs. All of these core parts work with expansion cards, so additional cards can be acquired later in case of an expansion. The system designed for this studio is not capable of recording programmes; it is meant for live shows, with the signal going to a Master Control Room (MCR).
[CAD drawing: Pathway 1]

Additional Equipment and Developments

There are masses of other pieces of kit and systems that can be added to the infrastructure to make the workflow easier and add further functions to the system. For example, a teleprompter could be added so the presenter of a show can read their lines off a screen mounted right on the lens of the camera in front of them. Another addition could be a Tally Router, which switches on the red or green light on top of a camera when that specific camera is on air or on preview; this shows the presenters and the crew inside the studio which camera is live. Tally Routers can also be connected to Multiviewers, which then indicate which camera is on air/preview by adding a red or green marker to that camera's picture. Furthermore, if the producers want to display lower thirds or other graphics on the show, a CG (Computer Graphics) machine has to be connected to the Vision Mixer.

If the studio wants to record shows, which almost all studios do, a means of recording has to be added. This can be an expensive VTR (Video Tape Recorder), which is old-fashioned in today's digital world, or a tapeless recording system that records onto SSDs (Solid State Drives).

The transition from tape to file-based recording has made workflows easier and much quicker. Tape recording was expensive: the VTRs required were very costly to acquire, and the tapes themselves were not cheap. Reusing a tape meant the new recording would not have the same quality as the first, and even just playing a tape out impaired the quality of its content. Tapes also had to be manually transported to MCRs and other departments, and editing required at least two VTRs (at least one of them with editing functionality), because the editing was also tape-based.

As digitisation progressed, devices became available to capture content onto computer hard drives and edit it digitally. Content stored on a central server could be accessed easily by different departments, and by different users at the same time. Digitisation has now reached the point where recordings no longer have to be made on tape at all: they can be made straight onto SSD recorders or servers. Recordings on SSDs can be copied and transferred to other machines quickly and easily, and recordings on servers can be accessed as soon as the recording finishes. This has changed the workflow a lot: work gets done quicker, and the costs of expensive VTRs, their servicing and tapes are saved. The same SSDs can be recorded on over and over again without losing quality, which saves a lot of money.
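The file-based workflow described above can be illustrated with a short Python sketch: a finished recording is copied from a (simulated) SSD recorder to a (simulated) central server, and a checksum confirms the copy is bit-identical, something a tape dub could never guarantee. The file name and the use of temporary directories are invented for the example.

```python
# Sketch of one step in a tapeless workflow: copy a finished recording
# to a shared location and verify the copy byte-for-byte.

import hashlib
import shutil
import tempfile
from pathlib import Path

def sha256_of(path):
    """Hash a file in chunks so large recordings don't fill memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

def transfer_recording(src, dest_dir):
    """Copy a recording and confirm the checksum matches the original."""
    dest = Path(dest_dir) / Path(src).name
    shutil.copy2(src, dest)
    if sha256_of(src) != sha256_of(dest):
        raise IOError("copy verification failed")
    return dest

# Simulate the studio SSD recorder and the central server as temp folders:
with tempfile.TemporaryDirectory() as ssd, tempfile.TemporaryDirectory() as server:
    recording = Path(ssd) / "show_20150801.mxf"
    recording.write_bytes(b"\x00" * 1024)  # stand-in for real video essence
    copy = transfer_recording(recording, server)
    print(copy.name)  # show_20150801.mxf
```

Unlike a tape dub, the verified copy is identical to the original, so any number of generations can be made with zero quality loss.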

Another very meaningful development in broadcast workflows is the integration of IP into systems. One example of what IP has replaced in broadcast is router control. In the early days, video routers could only be controlled with buttons on the device itself; later, remote control panels appeared, most commonly connected to the router via a BNC cable. In time, the connection between most routers and their remote panels moved to CAT5 cables, and IP started evolving in this area. Now it is possible to change router settings and signal routings via IP, remotely, from anywhere. Vision mixers, audio mixers, multiviewers, tally routers and other kit can also be controlled remotely via IP, which is game-changing: many things can be controlled easily and centrally from one PC or even a smartphone (a network or internet connection is required), and ever more kit is moving towards IP. Another area of the broadcast chain where IP is evolving is the transmission of content. As mentioned earlier, most content is now file-based (digitised) and can easily be transferred via IP over the internet or fibre links to remote premises. Even broadcasting itself has started happening over IP: internet providers who offer high-speed internet have begun offering IPTV to their customers. A popular part of the services they offer is VOD (Video On Demand), where the viewer requests content to be streamed to them.
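Remote router control over IP often comes down to short text commands sent to a TCP control port. The Python sketch below shows the idea; the command syntax (`ROUTE <output> <source>`), the port and the acknowledgement format are hypothetical and do not belong to any specific manufacturer's protocol.

```python
# Sketch of IP-based router control. Real routers each have their own
# control protocol; this one-line text command is a made-up stand-in.

import socket

def format_route_command(output, source):
    """Build a one-line routing command: b'ROUTE <output> <source>\\n'."""
    return f"ROUTE {output} {source}\n".encode("ascii")

def send_route(host, port, output, source, timeout=2.0):
    """Connect to the router's control port and send one crosspoint
    change, returning whatever acknowledgement the router sends back."""
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall(format_route_command(output, source))
        return sock.recv(64).decode("ascii").strip()

# From anywhere on the network (or over a VPN), one call replaces a walk
# to the machine room. Here we only build the command without sending it:
cmd = format_route_command(output=3, source=1)
print(cmd)  # b'ROUTE 3 1\n'
```

The same pattern (a small command set over TCP) is what makes it possible to drive many routers, mixers and multiviewers from one central PC or smartphone app.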

As the bandwidth requirement of content rises with the development of new video standards (e.g. UHD), alternative means of transmission have to be found. The bandwidth available at current terrestrial television transmission frequencies is not enough to handle an Ultra HD signal, so alternatives could be satellite broadcast or IPTV. The UK internet provider BT, which already offers IPTV and has the bandwidth to handle very high data rates, is launching its new UHD channel (BT Sports UHD) in August 2015. The channel will be offered only on IPTV, making it Europe's first UHD channel.
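The scale of the bandwidth problem can be shown with some rough arithmetic. All figures below are approximate ballpark values for illustration, not measurements of any particular service.

```python
# Rough arithmetic behind the UHD bandwidth problem.

# Uncompressed UHD: 3840x2160 pixels, 50 frames/s, 10-bit 4:2:2 sampling
# (about 20 bits per pixel once chroma subsampling is averaged in).
uncompressed_bps = 3840 * 2160 * 50 * 20
print(f"{uncompressed_bps / 1e9:.1f} Gbit/s uncompressed")  # 8.3 Gbit/s

# A broadcast-quality HEVC encode of that signal lands somewhere in the
# region of 20-30 Mbit/s (assumed midpoint used here).
hevc_bps = 25e6

# A UK DVB-T2 terrestrial multiplex carries roughly 40 Mbit/s in total,
# shared between every channel on that mux.
mux_capacity_bps = 40e6

print(f"{hevc_bps / mux_capacity_bps:.1%} of a whole mux for one channel")  # 62.5%
```

Even heavily compressed, a single UHD channel would consume most of a terrestrial multiplex, which is why high-capacity delivery paths like satellite and IPTV are the practical options.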

References

http://www.bbc.co.uk/rd/projects/ip-studio

http://m.broadcastnow.co.uk/5089197.article