Over the last couple of months I’ve been considering buying a TV to also use as a PC monitor. I was surprised by how little information there is online about doing this, so here’s what I’ve discovered.
My experience so far has been great.
You don’t have a telly?!
I’ve often been amazed by people’s reactions whenever I’ve told them that I don’t own a TV.
“You don’t have a telly?!”
Until recently I’ve not felt that I’ve really needed one. I do have a TV licence, as you still need one to watch shows on BBC iPlayer, but I watch those on my PC or Android tablet.
This is fine on my own, but watching films with my boys has been tricky. Most often we’ve been lying down, scrunched up, watching my laptop at the foot of the bed.
Moving out of hall into my own wee house added new dynamics, so I finally gave in and decided to buy a TV.
As I don’t have much space in my house and wouldn’t have anywhere to put it other than on my desk, I knew that it would have to double as a PC monitor.
So I knew my research had to cover two areas:
- Graphics cards
- 4K Ultra HD televisions
Put simply, a graphics card is the technology inside a computer that creates the images to output to a screen. Generally speaking, the better the card, the better quality the images and the higher resolution it can handle. This is especially true if you play computer games. The better the graphics card, the more detail you see and the smoother the games appear to play.
I had quite an old graphics card, an NVIDIA GeForce GTX 660 (2 GB), which at the time I bought it was only a few notches down from the top-level gaming cards available. It used a PCI Express 3.0 slot and supported a maximum digital resolution of 4K (4096 × 2160), which was fine for my dual-HD monitor setup (3840 × 1080). It served me well for four or five years and was exactly what I needed then.
But it was beginning to struggle with a few of the more modern games. Forza Horizon 3, for example, we were running at the lowest settings available within the game and even then it was complaining that the graphics card wasn’t coping.
This article from PC Gamer, “Should I use a 4K TV as a computer monitor?”, was very helpful in identifying the kind of graphics card to upgrade to:
You can use any TV with HDMI inputs in place of a computer display. If you’re looking at 4K TVs, you’ll want a graphics card with at least an HDMI 2.0 port (HDMI 2.0a or later for HDR10 displays). That allows for 4K at 60Hz, with 24-bit color. For GPUs, all the Nvidia 10-series parts and the GTX 950/960 can do this, and AMD’s RX Vega 56/64 and 500-series cards also support HDMI 2.0 or later.
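As a sanity check on that advice, here’s a rough back-of-the-envelope calculation of my own (the effective data rates are the commonly published figures for each HDMI version, not numbers from the article):

```python
# Does 4K at 60Hz with 24-bit colour fit down an HDMI cable?
# A rough sketch -- it ignores blanking intervals, so real
# requirements are a little higher than this raw figure.

def required_gbps(width, height, refresh_hz, bits_per_pixel):
    """Raw pixel data rate in Gbit/s."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

uhd_4k_60 = required_gbps(3840, 2160, 60, 24)   # ~11.9 Gbit/s

# Commonly quoted effective data rates (after encoding overhead):
HDMI_1_4 = 8.16   # Gbit/s -- why HDMI 1.4 tops out at 4K/30Hz
HDMI_2_0 = 14.4   # Gbit/s -- headroom for 4K/60Hz at 24-bit

print(f"4K@60Hz needs ~{uhd_4k_60:.1f} Gbit/s")
print(f"Fits HDMI 1.4? {uhd_4k_60 <= HDMI_1_4}")
print(f"Fits HDMI 2.0? {uhd_4k_60 <= HDMI_2_0}")
```

Which is exactly why the advice is HDMI 2.0 or later: 4K at 60Hz simply doesn’t fit through an HDMI 1.4 port.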
After a little research, I upgraded to an NVIDIA GeForce GTX 1060 with 6 GB of RAM. Another PCI Express 3.0 card but this time with an overclocked processor (so that it runs faster), three times as much memory, and support for 8K resolution (7680 × 4320 pixels). It also had the outputs that I needed (DVI and HDMI), giving me plenty of options.
That should do the job.
Top tip: I used PC Part Picker to identify graphics cards that would be compatible with my motherboard.
It took me about 10 minutes to remove the old card and install the new. I then downloaded and installed the latest drivers.
What a difference it made. On my 24″ HD monitor, I was able to run Forza Horizon with everything cranked up to high or ultra!
Time to choose a television.
4K Ultra HD TVs
While researching whether you could feasibly use a 4K TV as a monitor, besides the advice about using HDMI 2.0a, I discovered two further considerations:
- Signal lag
- Chroma subsampling 4:4:4
One of the biggest differences between traditional PC monitors and televisions is input lag. This makes sense: most of the time, the only input a TV needs to consider is the remote control, and most of us will happily wait a second or two while the channel changes.
The only other time a TV needs to consider input lag is when you plug a games console into it. When you tap left on your gamepad stick, you want your character to move left immediately.
A lot of modern TVs process the video signal to make the action in films look smoother, so you need to switch this off when plugging in a games console or computer. To make this easy, most TV manufacturers now include a ‘game’ mode.
This is important when connecting a PC to a TV because without game mode enabled even simple things like dragging your mouse across the screen show a noticeable lag: you move the mouse, there is a slight delay, and the pointer eases itself across the screen. It doesn’t take long for this to get annoying.
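To get a feel for why that delay is so obvious, it helps to think of lag in terms of frames. This little sketch uses ballpark figures of my own (roughly 20 ms in game mode versus 100 ms or more with motion smoothing), not measurements from any particular TV:

```python
# How many frames of delay does a given input lag amount to?
# Illustrative numbers only -- actual lag varies per TV and mode.

def frames_of_lag(lag_ms, refresh_hz):
    """Express input lag as a number of frames at a given refresh rate."""
    frame_time_ms = 1000 / refresh_hz   # ~16.7 ms per frame at 60Hz
    return lag_ms / frame_time_ms

print(frames_of_lag(20, 60))    # ~1.2 frames: barely noticeable
print(frames_of_lag(100, 60))   # ~6 frames: the pointer visibly trails the hand
```

At six frames behind, the pointer is always somewhere your hand was a tenth of a second ago, which is exactly the ‘easing across the screen’ effect.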
4:4:4 chroma subsampling
Because TVs are designed for watching fast-action images, they are not so good at displaying the sharp text that you might want to manipulate while using a computer.
So if you want to use a 4K TV as a monitor, you need to make sure it is capable of displaying sharp text. Whether it can comes down to something called ‘chroma subsampling’, although most TV documentation and specifications—disappointingly—won’t call it this. You may have to do some digging around in user manuals.
This article, “Chroma subsampling 4:4:4 vs 4:2:2 vs 4:2:0”, explains the importance of 4:4:4 chroma subsampling better than I ever could, but the thing to remember is that if you want to use a 4K (ultra HD) TV as a monitor, it must support 4:4:4 chroma subsampling.
For moving images such as TV shows and films this isn’t critical (4:2:2 and 4:2:0 do just fine), but for displaying crisp text, which is what most of us use our PCs for, it is essential that your screen supports 4:4:4.
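If you’re curious what subsampling actually does to text, here’s a toy sketch of my own (not anything a real TV runs): under 4:2:0, colour information is stored once per 2×2 block of pixels, so a one-pixel-wide coloured stroke gets averaged with its neighbours.

```python
# Toy illustration of 4:2:0 chroma subsampling: each 2x2 block of
# chroma values is replaced by the block average, smearing fine
# colour detail -- which is what makes small text look fuzzy.

def subsample_420(chroma):
    """Average chroma over 2x2 blocks (assumes even dimensions)."""
    out = [row[:] for row in chroma]
    for y in range(0, len(chroma), 2):
        for x in range(0, len(chroma[0]), 2):
            block = [chroma[y + dy][x + dx] for dy in (0, 1) for dx in (0, 1)]
            avg = sum(block) / 4
            for dy in (0, 1):
                for dx in (0, 1):
                    out[y + dy][x + dx] = avg
    return out

# Chroma for a 1-pixel-wide vertical stroke (1.0) on a plain background (0.0)
stroke = [[0.0, 1.0, 0.0, 0.0] for _ in range(4)]
print(subsample_420(stroke))
# The crisp stroke becomes a half-strength smear two pixels wide.
```

With 4:4:4 the chroma is stored per pixel, so nothing is averaged and the stroke stays sharp.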
I had to do a lot of detective work to figure this out for most of the TV models that I looked at. It is rarely included in the at-a-glance specifications for each TV. I often had to find the model on the manufacturers’ websites, download the manual and search through it.
It turns out that most modern television sets will support 4:4:4. After a lot of reading, I discovered that each manufacturer has a particular way of describing it.
- Sony TVs call it “HDMI enhanced format” and require you to set the picture mode to “graphics”.
- Samsung TVs call it “HDMI UHD color” and require you to select “PC” mode.
- LG TVs call it “HDMI ULTRA HD deep color” and require you to set the picture mode to “game”.
My experience so far…
Here’s what I have plugged into the TV:
- HDMI 1: PC
- HDMI 2: Blu-ray/DVD player
- HDMI 3: Google Chromecast Ultra
So far, my experience with the TV as a monitor has been great.
Picture mode is set to “Game” and “HDMI ULTRA HD deep color” is set to on for the input to ensure 4:4:4 chroma subsampling.
Within Windows’ display settings, I also have the HDR (high dynamic range) and WCG (wide colour gamut) settings turned on.
Forty-three inches is a good size for my desk. If anything, it is a little large and a curved model may have been better (though I couldn’t find any curved models smaller than 49″).
Picture quality has been superb. For writing and reading text documents and browsing I have no issues.
There is no discernible input lag on any of the games we’ve played (mostly LEGO, Call of Duty Black Ops, Fortnite, Overwatch and Forza).
As indicated above, with game mode enabled there is little noticeable lag when moving the mouse. Until the recent webOS update on the TV (v4.10.04), sticking the TV into any other mode (eco, sports, vivid, etc.) made it look like the mouse was slowly gliding through water; the latest update seems to have reduced the latency on other picture modes.
And with the NVIDIA GeForce GTX 1060 pushing the graphics the quality has been wonderful, with every game on high and/or ultra settings.
I would definitely recommend making the switch from HD monitors to a 4K ultra HD TV.