Computer Hardware - Video Output
A podcast looking at the computer video output connection, from VGA to HDMI, with brief details of the era before it became a standard
https://griffcomm.transistor.fm/episodes/chat-atari-st
Full episodes https://griffcomm.tv/
- Today's podcast is about the video out connections used on computers, laptops and gaming consoles
- In the early years, the method used was old terrestrial radio broadcasting, allowing the TV to see the signal as if it came from a broadcast station
- An RF (Radio Frequency) encoder was used to connect to the TV, spliced into the antenna cable via an adapter box
- It emulated the signals that would have been received by the antenna on the roof; in the UK, UHF channel 36 was typically used (a quick sketch of the frequency involved follows below)
- By today's standards it was super low quality, as the video signal was encoded to radio and then decoded again by the TV
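- As a rough aside, here is a minimal sketch of where UK channel 36 sits in the spectrum, assuming the standard UK UHF plan (channel 21 vision carrier at 471.25 MHz, 8 MHz channel spacing); the formula is an assumption from that plan, not something covered in the episode.

```python
# Rough sketch: vision carrier frequency for a UK UHF channel.
# Assumes the standard UK plan: channel 21 = 471.25 MHz, 8 MHz spacing.

def uk_uhf_vision_carrier_mhz(channel: int) -> float:
    """Return the approximate vision carrier frequency in MHz for a UK UHF channel."""
    if not 21 <= channel <= 68:
        raise ValueError("UK UHF channels run from 21 to 68")
    return 471.25 + (channel - 21) * 8.0

print(uk_uhf_vision_carrier_mhz(36))  # ~591.25 MHz, the channel RF encoders commonly used
```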
- On researching this, I was surprised to see the 1979 Atari 400 computer also had a dedicated video out connection
- It had audio, composite video, and separate chroma and luminance signals, used for the higher-quality S-Video style of display
- Although still analogue, it was much better quality as the RF encoding step was not used
- The first Atari ST computers in the mid-1980s also had a video out connection, which could be used with compatible VGA displays via an adapter cable
- In 1987 the VGA (Video Graphics Array) connection became a standard
- The VGA signals were still analogue, with separate connections for Red, Green, Blue, horizontal sync and vertical sync
- It also had a few extra connections for data (the DDC lines), used to pass information such as the display's make, model and supported resolutions (its EDID) back to the computer; a small sketch of reading that data follows a couple of points below
- VGA was able to support resolutions up to 2048 x 1536, higher than the 1920 x 1080 that early HDMI supported
- Being analogue, it is not as sharp as today's digital HDMI connection, although some prefer the analogue look
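- Here is a minimal sketch of reading that display data (the EDID block) on Linux; the /sys path is an example and the connector name varies per machine, so treat the exact path as an assumption. The same 128-byte EDID block comes back whether the link is VGA's DDC pins, DVI, HDMI or DisplayPort.

```python
# Minimal EDID reader sketch (Linux example path; adjust for your connector).
# EDID is the 128-byte block the display sends back over the data lines.

def parse_edid(edid: bytes) -> None:
    assert edid[:8] == b"\x00\xff\xff\xff\xff\xff\xff\x00", "not an EDID block"

    # Manufacturer ID: bytes 8-9, three packed 5-bit letters (1 = 'A').
    word = (edid[8] << 8) | edid[9]
    maker = "".join(chr(0x40 + ((word >> shift) & 0x1F)) for shift in (10, 5, 0))

    # First detailed timing descriptor (byte 54 onwards) holds the preferred mode.
    d = edid[54:72]
    h_active = d[2] | ((d[4] & 0xF0) << 4)
    v_active = d[5] | ((d[7] & 0xF0) << 4)
    print(f"Manufacturer: {maker}, preferred mode: {h_active} x {v_active}")

# Example path; the file is empty if nothing is plugged into that connector.
with open("/sys/class/drm/card0-HDMI-A-1/edid", "rb") as f:
    parse_edid(f.read(128))
```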
- In April 1999, DVI (Digital Visual Interface) appeared and, as the name suggests, this was the first digital video connection
- The connector was white and wider than the blue VGA connector; there were 3 types: DVI-D, DVI-A and DVI-I
- DVI-A carried the older-style analogue VGA signals in the new, wider white DVI connector
- DVI-I supported both analogue and digital displays
- DVI-D was digital only
- Both DVI-A and DVI-I could be used with an adapter to convert them to the blue VGA connection type
- DVI-D supported up to 3840 x 2400, which is just over today's 4K size of 3840 x 2160 pixels (a quick pixel-count comparison follows below)
- There were other variations within DVI, such as dual link, but we will not cover those as DVI is no longer in common use
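- To put the resolutions mentioned above side by side, here is a quick back-of-the-envelope sketch; it simply counts pixels and estimates the raw data rate at an assumed 60 Hz refresh and 24-bit colour, ignoring blanking intervals, so the bandwidth figures are illustrative only.

```python
# Quick comparison of the resolutions mentioned above.
# Assumes 60 Hz refresh and 24-bit colour; ignores blanking intervals, so these
# are rough illustrative figures, not the official link bandwidths.

modes = {
    "VGA max (2048 x 1536)":    (2048, 1536),
    "Early HDMI (1920 x 1080)": (1920, 1080),
    "DVI-D max (3840 x 2400)":  (3840, 2400),
    "4K UHD (3840 x 2160)":     (3840, 2160),
}

for name, (w, h) in modes.items():
    pixels = w * h
    gbps = pixels * 60 * 24 / 1e9  # raw bits per second at 60 Hz, 24 bpp
    print(f"{name}: {pixels / 1e6:.1f} Mpixels, ~{gbps:.1f} Gbit/s raw")
```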
- December 2002 saw the introduction of the HDMI connector, which is still used today
- It was electrically compatible with DVI-D signals, so a DVI-D to HDMI adapter could be used; we have some here, although they are no longer used
- HDMI expanded on DVI-D in that it was more of a data connection than just video
- HDMI supports many data protocols, including CEC (Consumer Electronics Control), where the TV can send commands back to the PC
- For example, pressing the power button on the TV could also power cycle the PC (a rough sketch of sending these commands follows below)
- There are many others, which would require their own podcast
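- As a rough sketch of the CEC power control mentioned above, the snippet below pipes commands to libcec's cec-client console tool (assuming it is installed, for example via the cec-utils package, and that the HDMI hardware exposes CEC); the exact tooling is an assumption on my part, not something named in the episode.

```python
# Rough sketch: sending CEC power commands to the TV over HDMI via libcec's
# cec-client console. Logical address 0 is the TV.

import subprocess

def cec_command(command: str) -> None:
    """Pipe a single console command (e.g. 'on 0' or 'standby 0') to cec-client."""
    subprocess.run(
        ["cec-client", "-s", "-d", "1"],  # -s: single command mode, -d 1: quiet logging
        input=command.encode(),
        check=True,
    )

cec_command("on 0")        # ask the TV to power on
# cec_command("standby 0") # ...or put it into standby
```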
- DisplayPort also appeared a few years later and was aimed more directly at the PC market
- HDMI had many other features, including CEC and copyright protection (HDCP) to stop recording of the signal
- DisplayPort carried display, audio and some data only; the connector looks almost identical to HDMI, only with one corner being square
- HDMI and DisplayPort are not interchangeable (hence the different connectors)
- DisplayPort also has a dual-mode version (denoted DP++), which allows it to emulate HDMI, so a DisplayPort to HDMI cable or adapter can be used
- However, a DisplayPort display cannot be used with an HDMI output on the PC; I've been burned by this once and learned fast
- Today both USB-C and Thunderbolt can be used for a display, if the PC hardware supports it, via a cheap adapter cable
- Our small terminal PCs support 4 displays via 2 HDMI and 2 USB-C connections, all up to 4K resolution with no lag
- The USB-C port switches into a native video output (DisplayPort Alt Mode) rather than being seen as a USB device emulating a display, which allows the higher frame rates that games require (see the sketch below)
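- As a small illustration of how those connectors appear on the PC side, here is a Linux-only sketch listing each display connector and whether a screen is attached; the /sys/class/drm layout is a kernel convention, but card numbers and connector names vary between machines, so the paths are assumptions.

```python
# List the GPU's display connectors (HDMI, DP, eDP, ...) and their status on
# Linux via the kernel's DRM sysfs interface. Connector names and card numbers
# vary between machines, so this is an illustrative sketch only.

from pathlib import Path

for status_file in sorted(Path("/sys/class/drm").glob("card*-*/status")):
    connector = status_file.parent.name      # e.g. "card0-HDMI-A-1"
    state = status_file.read_text().strip()  # "connected" or "disconnected"
    print(f"{connector}: {state}")
```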
