Edit: I guess this post is free advertising for these shitters, so I will describe what I previously linked.
There is this TV you can get for free, but it has ads on screen constantly and it has a camera pointed at you to record your reactions to ads and to ensure you don't cover up the ad portion of the screen.
Exactly, what's the use of a smart TV when I have a game console capable of streaming everything a "Smart TV" can AND playing games/browsing the Web?
9/10 times people use a fire stick or cable box to watch TV anyway, all I need is volume, input selection, and power.
The use of a smart TV is to make manufacturers money by selling your personal data to advertisers as part of their post-purchase monetisation strategy. Literally admitted by Vizio's CEO (and Vizio is really just following other brands).
And if it upscales well, it's already got the processing power to have all the apps installed. So, cool, it doesn't have that, but it's kind of a waste that you don't... And it seems a lot more likely you're overestimating it.
But I have a feeling if you look up your TV on rtings you'll be surprised, or find out you bought a weird Black Friday model that's been feature-stripped to be as cheap and low quality as possible while keeping a couple of highlights to advertise on the box.
Smart TV defines the extra functionality, not the processing hardware.
If a smart TV has the Wi-Fi and streaming services disabled it is by definition NOT a smart TV.
If a smart TV has the Wi-Fi and streaming services disabled
If it's disabled that doesn't mean it disappears...
And no one has linked or provided the model number of a good 4K TV that doesn't have WiFi.
Maybe that's what's happening? People think not connecting to WiFi means something doesn't have WiFi?
Is that what you think?
Edit:
Oh, you're the one who said they own a TV like that...
You could easily prove your point by providing the model number, but you're not, you're just trying to argue.
I don't see the point in someone acting like that unless you're trolling, so if you don't want to say the model, I don't see any point in trying to help anymore.
It's good to see Sceptre is still kicking around, I couldn't find any info on what kind of upscaling they're using. But it's probably decent. Good dumb screens have been their niche for at least 20 years.
Everything nowadays includes microcontrollers or microprocessors, and often even in-silico (i.e. as hardware, not software) implementations of things like decoders.
However, there's a huge range of those things, and the vast majority don't have the processing power and/or hardware peripherals to support "Smart TV" functionality.
For example, that upscaling functionality can just be implemented in-silico in a separate chip, or as part of the die of a SoC microcontroller, for pennies, whilst the actual programmable part doesn't have anywhere near the power or memory needed for the code implementing a fancy UI (such as that of a VOD provider like Netflix), because that would cost tens or hundreds of dollars more (just go check the price of a standalone TV box).
The economics of the thing nowadays do make it worth it for a TV manufacturer to add the extra hardware needed to make the thing a Smart TV (the kinda crap kind only costs maybe $10 - $20 or so more), especially if they can use it to shove adverts in front of people to recoup it or sell "Smart TV" as a premium feature, but that's not at all the same as the hardware for hardcoded algorithms such as upscaling being capable of running the software that implements Smart TV functionality.
Your "argument" is built on top of a serious misunderstanding of modern digital systems.
Hardware upscaling isn't needed in a monitor (except maybe in really, really special situations) because it's almost invariably connected via a digital connection that supports multiple resolutions to a device (such as a computer) with more than enough processing power to do the upscaling itself.
The only situation I can think of where upscaling would be useful in a monitor is one with a VGA connection (mainly for use with really old computers), since that protocol is analog, so pretty much any random resolution can come down the pipe quite independently of the monitor's native resolution, and the digital side of the monitor is forced to adjust it (and a proper upscaling algorithm is a lot nicer than something like double-sampling the analog signal).
The previous poster was wrong about upscaling being common in computer monitors nowadays (I vaguely remember it in the early LCD monitor days because of that VGA problem), but that doesn't mean you're right that upscaling support being present in a device is the same as everything necessary for full Smart TV functionality being in there - I'm pretty sure upscaling comes as a hardware implementation of an algorithm, so it's not a generic computing unit with the right peripherals, computing power and memory to run an Android system or equivalent that just so happens to be running software upscaling.
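To make that last distinction concrete, here's a rough sketch of the kind of fixed arithmetic a hardware scaler bakes into silicon - a plain bilinear upscale, written in Python purely for illustration. Actual TV scalers use fancier filters than this, but the point stands either way: it's the same small pile of multiplies and adds repeated for every output pixel, which maps onto cheap dedicated logic, not onto a general-purpose computer that could also run a Netflix app.

```python
# Illustrative only: bilinear upscaling is a fixed per-pixel formula,
# exactly the kind of thing that fits in dedicated silicon for pennies.
def bilinear_upscale(img, out_w, out_h):
    """Upscale a 2D list of brightness values to out_w x out_h."""
    in_h, in_w = len(img), len(img[0])
    out = [[0.0] * out_w for _ in range(out_h)]
    for y in range(out_h):
        for x in range(out_w):
            # Map the output pixel back into source coordinates.
            sx = x * (in_w - 1) / (out_w - 1) if out_w > 1 else 0
            sy = y * (in_h - 1) / (out_h - 1) if out_h > 1 else 0
            x0, y0 = int(sx), int(sy)
            x1, y1 = min(x0 + 1, in_w - 1), min(y0 + 1, in_h - 1)
            fx, fy = sx - x0, sy - y0
            # Weighted blend of the 4 nearest source pixels:
            # a handful of multiplies and adds, every pixel, every frame.
            top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
            bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
            out[y][x] = top * (1 - fy) + bot * fy
    return out

# A 2x2 gradient upscaled to 3x3: corners survive, middles are blended.
small = [[0.0, 1.0],
         [2.0, 3.0]]
big = bilinear_upscale(small, 3, 3)
# big[1][1] is 1.5, the average of all four source pixels.
```

No branching on content, no OS, no UI - which is why a TV can ship this in a fixed-function block while still lacking the CPU, RAM and network stack a Smart TV platform needs.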