
New Micro PC from Dell.

Dell is clearly promoting it as something you can connect to your living room TV to play games and watch movies... but doesn't everyone who would be interested in such a thing already have a game console which can do that?
 
No.

I've got a fifteen year old Dell that I use for that. (Movies, no games).

It's tempting, but do I really need it?
 
I use an old Athlon Opteron x4 system for watching Netflix (it's in a big white desktop ATX Enlight case I should be using for something else). A small, out-of-the-way box like the Dell would look better for TV duty. I did have a small black mini-ATX box for Netflix, but the CPU fan was too loud. Home theater cases cost as much as that Dell does complete.
 
I wouldn't do anything fun with it, but it sure would be nice to have something dead quiet and not a big clunky thing like I have now, which may not even work anymore (power spike a couple weeks ago).
 
I've had something similar to this for a long time, the issue with it is presumably lack of 4K capability though.
 
If you needed something in that form factor with VGA out some of the Intel NUC models have VGA.

The NUC5PGYH currently runs a bit more, somewhere around $230

Bought one of those for my kid for Xmas, added a 128 GB SSD, and it's a pretty responsive machine. Perfect for her needs. I'd totally get one for my TV in the living room, except it won't output 4K, so I'll just stick with the quad core AMD system I have connected to it, though I am planning on adding a GTX 950 to it so I can get it to push out full 4K.
 
I'd totally get one for my TV in the living room, except it won't output 4K, so I'll just stick with the quad core AMD system I have connected to it...

Unless you have an unusually large TV and/or are viewing it from an unusually close distance, you won't even see the difference between 1080p Full HD and 2160p Ultra HD (so-called "4K" even though it's really only 3.84K):

[Attached image: resolution_chart.jpg -- chart of resolution vs. screen size and viewing distance]
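
For anyone curious where charts like that come from, here's a rough back-of-the-envelope sketch of the ~1-arcminute visual-acuity rule of thumb they're usually built on. The screen sizes and the acuity figure below are just illustrative assumptions, not numbers taken from the chart itself:

```python
import math

def max_useful_distance_ft(diagonal_in, h_res, aspect=(16, 9), acuity_arcmin=1.0):
    """Distance (feet) beyond which one pixel of this resolution subtends
    less than the ~1-arcminute limit of typical visual acuity."""
    aw, ah = aspect
    width_in = diagonal_in * aw / math.hypot(aw, ah)   # screen width in inches
    pixel_in = width_in / h_res                        # width of a single pixel
    distance_in = pixel_in / math.tan(math.radians(acuity_arcmin / 60.0))
    return distance_in / 12.0

for size in (47, 65, 82):                              # illustrative sizes only
    d1080 = max_useful_distance_ft(size, 1920)
    d2160 = max_useful_distance_ft(size, 3840)
    print(f'{size}" set: 1080p pixels blur together beyond ~{d1080:.1f} ft, '
          f'4K pixels beyond ~{d2160:.1f} ft')
```

By that rule a 65" 1080p panel is already at the eye's limit somewhere past eight feet or so, which is the kind of boundary those charts draw; real-world factors like contrast and panel quality obviously muddy it.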
 
I have an 82" TV...prices are dropping so much my next one will probably be 100". Also some people have projectors.
 
you won't even see the difference between 1080p Full HD and 2160p Ultra HD

You'd have to be nearly blind not to see the difference. I have a 65" 4K Sony Bravia that replaced the 47" 1080p TV I had before, and there is a significant, noticeable difference between the two. The 4K TV has a far sharper image, almost lifelike; the difference is akin to SD vs 720p. So I think this graph is wrong, or at least quite flawed. At the store I compared a 65" 1080p TV to a 4K set of the same size, and even there you could see the difference; it's staggering and noticeable from quite a ways away (across the store).

The reason why 4K is "not worth it" right now has more to do with the fact that there is very little content for it. Nearly nothing is 4K native at the moment, and streaming has obvious limitations here because of the bandwidth required to move that much data (although I did put up some 4K YouTube streams on my TV and they looked pretty damn nice). There is also no real standard for 4K yet, but one is coming, and once it arrives there will be a marked increase in native 4K content.
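
Just to put a rough number on the bandwidth point (my own ballpark arithmetic, not figures from anyone's post, and the compression ratio is purely an assumption to show the order of magnitude):

```python
# Uncompressed 3840x2160 video at 24 bits per pixel and 60 frames per second,
# versus what a typical streaming connection can actually sustain.
width, height = 3840, 2160
bits_per_pixel = 24          # 8 bits each for R, G, B
fps = 60

uncompressed_bps = width * height * bits_per_pixel * fps
print(f"Uncompressed 4K60: ~{uncompressed_bps / 1e9:.1f} Gbit/s")

compression_ratio = 500      # assumed ratio, just for scale
print(f"At ~{compression_ratio}:1 compression: "
      f"~{uncompressed_bps / compression_ratio / 1e6:.0f} Mbit/s")
```

Either way you're looking at a sustained connection in the tens of Mbit/s per stream, which plenty of households still can't reliably get.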

so-called "4K" even though it's really only 3.84K

When 1080p first started appearing, it was sometimes referred to as 2K by people in the AV industry, so the same thing happened when 4K was emerging. They needed something to call it, and 4K was close enough and much shorter than 2160p, so it just stuck. 720p could technically be called 1K. I would be willing to bet that when the standard is ratified and agreed upon, it will most likely be called 2160p and UHD.
 
Why do you convert to NTSC? That's gonna be a pretty fuzzy image.

NTSC is what I use, and no reason to "upgrade". I watch a total of about 100 hours of TV a year, and most of that is football, sometimes on a 13" black&white. :D
 
You'd have to be nearly blind not to see the difference. ... The 4K TV has a far sharper image, almost lifelike.

The real test will be to see if you can tell the difference between real "4K" program material and upscaled 1080p program material from your normal viewing distance.

It is also possible that TV manufacturers really don't care about the picture quality of 1080p TVs anymore since that's not the hot market segment right now, so they're putting cheaper LCD panels and cheaper image processing into 1080p sets in order to sell them at a lower price, and putting better quality components into the "4K" sets.

I've heard claims that the last generation of plasma TVs were also purposely cheapened-out in order to make LCD sets look better when viewed side-by-side in stores, in order to shake off the still-commonly-held belief that "plasma TVs look better than LCD TVs".

They needed something to call it, and 4K was close enough and much shorter than 2160p, so it just stuck.

There is actually a real 4K standard used in movie production: 4096x2160. I guess they had some computer geeks on hand who remembered that 4096 bytes = 4K, so that's where the name came from. But that's a slightly wider aspect ratio than 16:9, so when they cropped it down to 16:9 for consumer TVs it became 3840x2160 -- but they kept calling it "4K" anyway even though it's no longer 4000+ pixels across.
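
The arithmetic behind that crop is easy enough to check; a quick sketch using only the resolutions already mentioned:

```python
# DCI "true" 4K cinema container vs. consumer Ultra HD ("4K" TV) resolution
dci_w, dci_h = 4096, 2160     # digital-cinema 4K
uhd_w, uhd_h = 3840, 2160     # consumer UHD

print(f"DCI 4K aspect ratio: {dci_w / dci_h:.3f}:1")   # ~1.896:1, wider than 16:9
print(f"UHD aspect ratio:    {uhd_w / uhd_h:.3f}:1")   # ~1.778:1, i.e. 16:9
print(f"16:9 for comparison: {16 / 9:.3f}:1")

# Cropping DCI 4K to 16:9 at the same height keeps all 2160 rows but only
# 2160 * 16 / 9 = 3840 columns -- hence "3.84K" across.
print(f"Columns kept at 16:9: {int(uhd_h * 16 / 9)}")
```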

Anyway, a really big problem in the lack of true 4K program material is that ever since movie production switched from 35mm film to digital in the early 2000s, they've been using 2K resolution for the cinema masters. Even many movies being released today are still only produced in 2K. So it will be forever impossible to create true 4K versions of many movies from the 2000s and 2010s -- not like older movies where they can just re-scan the film at a higher resolution.

It's just like all the music in the '80s and '90s that was recorded in 16-bit/48 kHz digital. The big push today in the audiophile world is 24-bit/96 kHz "hi-res" music, but how are you going to get 24/96 out of material that was 16/48 to begin with? You can't, not like older albums where they can go back to the analog master tapes and re-transfer them at a higher sampling rate.
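
A minimal sketch of why those extra bits and samples can't be conjured up after the fact; the formulas are just the standard Nyquist and ~6 dB-per-bit rules of thumb, nothing specific to any particular recording:

```python
def nyquist_khz(sample_rate_khz):
    # Highest audio frequency a given sample rate can represent at all
    return sample_rate_khz / 2

def dynamic_range_db(bits):
    # Theoretical quantization dynamic range, roughly 6.02 dB per bit
    return 6.02 * bits

for label, rate_khz, bits in [("16-bit/48 kHz master", 48, 16),
                              ("24-bit/96 kHz 'hi-res'", 96, 24)]:
    print(f"{label}: content up to {nyquist_khz(rate_khz):.0f} kHz, "
          f"~{dynamic_range_db(bits):.0f} dB dynamic range")

# Upsampling a 16/48 master to 24/96 only interpolates between the samples
# that already exist and pads the extra bits: nothing above 24 kHz and no
# detail below the original noise floor ever comes back.
```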
 
Why do you convert to NTSC? That's gonna be a pretty fuzzy image.

NTSC is fine if you avoid connections that multiplex the signal, like straight coax or composite video. Component, RGB, and SCART can give you a nearly crystal-clear image if you use nice cables.
 
The real test will be to see if you can tell the difference between real "4K" program material and upscaled 1080p program material from your normal viewing distance.

I suppose, but if you can't tell the difference between upscaled 4K and native 4K material, then what's the problem? That would mean your upscaling algorithm is working as it should and is still giving you a better image than viewing the content at its native HD resolution. The Bravia TV I have does this really, really well, especially with 720p and 1080p content (anything lower than that will really depend on the actual content). The image is really quite awe inspiring.

Adding a 4K-capable GPU to the machine hooked to the TV really only shifts which device is doing the actual upscaling, and I'm not actually 100% sure that an Nvidia GPU will do it much better than the TV itself, but it does mean that any native 4K content I get will be pushed to the display at 4K and not downscaled to 1080p by the computer, then upscaled back to 4K by the TV.
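
If it helps, here's a toy illustration of why that last point matters. Nearest-neighbour resizing on a random frame is standing in for whatever the GPU or the TV actually does, so treat it as purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
native_4k = rng.integers(0, 256, size=(2160, 3840), dtype=np.uint8)  # fake 4K frame

def resize_nearest(img, h, w):
    # Crude nearest-neighbour resize, enough to show the information loss
    rows = np.arange(h) * img.shape[0] // h
    cols = np.arange(w) * img.shape[1] // w
    return img[rows][:, cols]

# Path A: the computer outputs the native 4K frame directly
direct = native_4k

# Path B: the computer downscales to 1080p, then the TV upscales back to 4K
round_trip = resize_nearest(resize_nearest(native_4k, 1080, 1920), 2160, 3840)

changed = np.mean(direct != round_trip)
print(f"Pixels altered by the 1080p round trip: {changed:.0%}")   # roughly 3/4
```

Real scalers are far smarter than this, but the principle is the same: once the frame has been squeezed down to 1080p, three quarters of the original samples are gone and the TV can only guess at them.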
 