TVs are already in place in every home as the viewing device for TV shows. Being used as a console display is an optional extra, but whether or not it's used for that, it will always be there for TV viewing.
PC monitors, on the other hand, are used exclusively as PC displays, so they're part of a connected package. They won't be used without a PC.
Of course, anybody upgrading their PC can carry their monitor over. However, this does lead to something I've been meaning to get off my chest: shouldn't the cost of the OS be factored into the PC? Unlike the display, the OS can't be transferred to the new rig. Maybe that was possible pre-2000, but when XP came along, Microsoft introduced the activation needed to keep the OS running, where each copy of Windows is tied to a hardware profile of the machine it was installed on.
So if I understand correctly, if someone took their XP, Vista, or Win7 disc, installed it on another machine, and then tried to activate it, a discrepancy would show up during the activation check: the copy had already been activated on a machine with completely different hardware (CPU, GPU, RAM, motherboard, etc.), preventing a successful activation.
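Roughly speaking, the idea is comparing hardware fingerprints. Here's a toy Python sketch of that concept; the component names, the hashing scheme, and the all-or-nothing comparison are purely my own illustration, not how Microsoft's actual activation system works (which uses its own hashing and tolerates some component changes):

```python
import hashlib

def hardware_hash(cpu: str, gpu: str, ram_gb: int, motherboard: str) -> str:
    """Toy fingerprint: hash a few component identifiers together.
    Illustrative only; real Product Activation uses its own scheme."""
    fingerprint = f"{cpu}|{gpu}|{ram_gb}|{motherboard}"
    return hashlib.sha256(fingerprint.encode()).hexdigest()

# Profile recorded when this copy of Windows was first activated (made-up specs)
original_rig = hardware_hash("Intel i5-2500K", "GTX 560 Ti", 8, "ASUS P8Z68")

# Profile of the new machine the same disc gets installed on (also made up)
new_rig = hardware_hash("Intel i7-4770K", "GTX 780", 16, "ASUS Z87-A")

# A mismatch is the "discrepancy" that blocks a second activation
if new_rig != original_rig:
    print("Activation refused: this copy is already tied to different hardware")
```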
MS set things up so that each machine needs its own copy of Windows, meaning a new copy has to be bought for a new machine. Maybe that's not the case with Linux, but let's not kid ourselves here: that OS hardly has the usage that Windows does, and games aren't made for it, just as surely as PC fans are quick to point out they aren't made for MacOS. OK, some games are, but nowhere near as many as Windows has available.
So, does the cost of a new copy of Windows factor into the cost of a new home-built PC?