The vast majority of video displays use the same underlying hardware and logic, which I'm guessing is optimized for HDMI, whether they're marketed for watching TV and movies or for connecting to computers as "monitors".
The difference I've found is that the "smart OS" added to devices sold as TVs includes processing (motion smoothing/interpolation, sharpening, and the like) that makes sense when you're viewing content that constantly updates the entire screen, like most TV shows and movies. A computer, by contrast, typically updates only a small portion of the screen at a time. To use a device intended for TV and movies as a computer monitor, all you need to do is go into the settings and turn that processing off (on many sets, switching to a "PC" or "Game" mode does most of this in one step). Oddly enough, those are the features that make on-screen motion appear smoother while making static content (spreadsheets, program code, etc.) look blurry.
Also, things called "computer monitors" cost a lot more than things called "TVs", even though they have less logic inside them.
Personally, I use a 55" 4K TV as my computer monitor. I have 14 "spaces" (virtual desktops) set up, each holding open windows for related things, and the screen is big enough that I can spread things out and see parts of most windows rather than having everything shrunk down or stacked on top of each other.
The resolution of a 55" 4K TV (3840x2160) is exactly that of a 2x2 arrangement of 27" 1080p monitors (1920x1080 each), at a fraction of the price.
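If you want to sanity-check that claim, here's a quick back-of-the-envelope Python script (the sizes are nominal diagonals, so real-world numbers vary slightly with bezels and exact panel dimensions):

    import math

    def ppi(width_px, height_px, diagonal_in):
        """Pixels per inch, given a panel's resolution and diagonal size."""
        return math.hypot(width_px, height_px) / diagonal_in

    # A 2x2 grid of 27" 16:9 panels has a 54" diagonal, so a 55" 4K TV
    # gives you the same total pixels at nearly the same pixel density.
    print(f'55" 4K TV:       {ppi(3840, 2160, 55):.1f} PPI')   # ~80.1
    print(f'27" 1080p panel: {ppi(1920, 1080, 27):.1f} PPI')   # ~81.6

Same pixel count, nearly the same density, one set of bezels instead of four.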
Also, if you want to use a TV for your computer, you should buy one that's "dumb" -- i.e., one that's really cheap because it doesn't have a "smart OS" on it. You don't need any of that additional crap unless you also want to use it to watch TV and movies and get the best possible viewing quality. (I watch a lot of YouTube videos on my computer, mainly because I can't get YT on my TV. They look great in a 1080p-sized (1/4-screen) window.)
Seriously, the only difference between a "TV" and a "monitor" is that there's more stuff inside the "TV" devices, even though they're typically a lot cheaper.