I think it's safe to say that ANYBODY who remotely understands how digital video works is horrified by the way the American broadcast and TV industry has mishandled and mindlessly mangled a good thing. What follows are changes that would result in a superior TV product for everyone:
* The system would have four layers, spread among 2-4 physical components:
The transport layer. Its job would be to somehow acquire a 19.2 Mbit/second bitstream from somewhere and feed it to the decoder layer. Put in layman's terms, "the tuner". For example, one implementation might receive the over-the-air 8VSB signal and demodulate it to its 19.2 Mbit/second bitstream. Another implementation might perform a similar job, but decode a QAM-encoded bitstream delivered by the cable company. Hollywood would never allow it, but a DVD player would do the same thing if it were allowed to just read the bits off the disc, decode them, and spew them at the decoder layer. Or, getting a bit bolder, users could give Hollywood the finger and have a peripheral that plugged into the home LAN and gave the TV an IP address, so the bitstream could come from a PC located somewhere else in the house. Oh, I almost forgot... for legacy applications, a shitty NTSC tuner that would essentially video-capture the channels (one, two, or more for PIP applications, plus the composite and/or S-Video inputs) to MPEG-2 and mux them all into the 19.2 Mbit bitstream being fed to the decoder layer.
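If you want the transport-layer idea in code, here's a toy sketch. The whole point is that every source -- 8VSB tuner, QAM front-end, DVD drive, LAN peripheral -- reduces to the same contract: "hand the decoder a chunk of the bitstream." All class and method names here are invented for illustration; real hardware obviously doesn't look like this.

```python
# Hypothetical transport-layer contract: any source just produces
# MPEG transport-stream bytes for the decoder layer to consume.

class TransportSource:
    """Anything that can produce the raw 19.2 Mbit/s bitstream."""
    def read_chunk(self, nbytes: int) -> bytes:
        raise NotImplementedError

class EightVSBTuner(TransportSource):
    """Stand-in for an over-the-air 8VSB demodulator."""
    def __init__(self, rf_channel: int):
        self.rf_channel = rf_channel
    def read_chunk(self, nbytes: int) -> bytes:
        # A real tuner would demodulate RF here; we fake a TS packet.
        return b"\x47" + bytes(nbytes - 1)   # 0x47 = MPEG-TS sync byte

class LANSource(TransportSource):
    """Stand-in for the 'give the TV an IP address' peripheral."""
    def __init__(self, host: str, port: int):
        self.addr = (host, port)
    def read_chunk(self, nbytes: int) -> bytes:
        # A real implementation would read from a socket.
        return b"\x47" + bytes(nbytes - 1)

def feed_decoder(source: TransportSource, nbytes: int = 188) -> bytes:
    """The decoder layer doesn't care where the bits came from."""
    return source.read_chunk(nbytes)
```

Swapping cable for over-the-air (or a PC down the hall) is then just a matter of plugging in a different `TransportSource`.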
The controller layer. This layer basically acts like a cross between an ethernet switch and a remote control interface. All the transport layer components plug into it via firewire, as does the decoder layer. The remote control's IR sensor (or bluetooth transceiver, or whatever) is part of this component. Of course, this component enjoys full 2-way communication among all the components (with daisy-chained layers employing the chain-of-responsibility ("chain of command") design pattern to pass messages back and forth to components further out the chain). Just to give a few obvious communication examples, the remote control could ask the controller layer: "Is the display turned on? Which input source is currently selected? Is the volume muted?" and so on. 2-way querying means the remote could adapt itself in a context-sensitive manner (lighting and un-lighting relevant/irrelevant buttons, perhaps, or maybe embedding an LCD display in each button so labels could change on the fly).
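The chain-of-responsibility message passing is easy to sketch: a query walks the daisy chain until some component can answer it. The component names and the query vocabulary below are made up; the pattern itself is the point.

```python
# Toy chain-of-responsibility: each component either answers a query
# or forwards it to the next component down the daisy chain.

class Component:
    def __init__(self, name, answers=None, next_link=None):
        self.name = name
        self.answers = answers or {}   # query -> reply this component knows
        self.next_link = next_link     # next component in the chain

    def query(self, question):
        if question in self.answers:
            return (self.name, self.answers[question])
        if self.next_link is not None:
            return self.next_link.query(question)
        return (None, None)            # nobody in the chain knew

# Build a chain: controller -> decoder -> display
display = Component("display", {"power?": "on", "mode?": "480p60"})
decoder = Component("decoder", {"input?": "cable box"}, next_link=display)
controller = Component("controller", {"muted?": "no"}, next_link=decoder)
```

So when the remote asks the controller "is the display turned on?", the question gets forwarded along the chain and the display itself answers -- the remote never needs to know the topology.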
For most consumers, the transport and controller layers would be combined into a single component: "the cable box"
The decoder layer. Its job would be to demux the various MPEG bitstreams from the main 19.2 Mbit one, render the one(s) of interest into a bitmap, and send them on to the display layer. The demux function is important, because it's what enables more than one video stream to share a single 19.2 Mbit data pool. Put another way, when you tune an HDTV to channel "8.1" or "8.2", the decimal part is the demultiplexed subchannel. This is also the logical layer for deinterlacing to take place. Current HDTVs actually do deinterlacing at the display layer. More on this in a moment. This is also the layer responsible for transcoding framerates and resolutions (in other words, making 24fps pretend to work at 60fps).
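The subchannel idea in miniature: one mux, several programs, and "8.2" just names one of them. The data structures below are invented (a real decoder parses the PAT/PMT tables in the transport stream instead), but the selection logic is the same.

```python
# Toy demux: the decimal part of a channel label selects one program
# out of the shared 19.2 Mbit/s mux.

def parse_channel(label: str):
    """'8.2' -> (major, minor) = (8, 2)."""
    major, minor = label.split(".")
    return int(major), int(minor)

def demux(mux: dict, label: str):
    """Pick one program's stream out of the shared mux."""
    return mux[parse_channel(label)]

# A toy mux: channel 8 carries an HD feed plus an SD weather subchannel,
# both fitting inside the shared data pool.
mux = {
    (8, 1): {"codec": "mpeg2", "format": "1080i60", "bitrate_mbps": 14.0},
    (8, 2): {"codec": "mpeg2", "format": "480i60",  "bitrate_mbps": 4.0},
}
```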
The display layer. "The Screen". Basically, this is the hardware that takes the high-bandwidth decoded bitstream and somehow makes it visible. Low-end sets would obviously support 480i60 and 480p60. However, I'd have mid-priced sets include 480p72... and 540p50, 540p72, and 540p75 in addition to 1080i60. High-end displays, of course, would natively support 720p60 as well... but while we're at it, we might as well add 540p100, 540p96, and 480p96.
For most consumers, the decoder and display layers would be combined into a single component: "the TV"
The decoder itself would be smart enough to know when you're channel surfing and when you've settled down to watch something. When surfing, it would transcode everything to the display layer's current active mode. In other words, if the display layer is running 480p60, the decoder layer would transcode everything to 480p60. HOWEVER, when you're through surfing (perhaps indicating it by hitting a button on the remote), the decoder layer would ascertain the BEST display mode available to it for the currently-chosen content and THEN tell the display layer to change modes if appropriate. This is important. If the display had to change modes while surfing, tuning between a 480p60 channel and a 1080i60 channel might, and tuning to or from a 720p60 channel definitely would, force CRT-based displays to switch flyback transformer coils (480p60 requires 31.5 kHz; 1080i60 and 540p60 require 33.75 kHz; 720p60 requires a whopping 45 kHz of horizontal scan bandwidth) -- with the accompanying loud relay "thunk" and momentary screen blanking, just like a computer monitor switching between the normal Windows display and a full-screen command prompt. Current HDTVs avoid this by ALWAYS leaving the display in a single mode -- usually 1080i60 -- and mangling EVERYTHING into it.
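The surf-vs-settled policy is simple enough to write down. The mode table and function names below are hypothetical; the point is that mode switches (and the relay "thunk") only ever happen once surfing stops.

```python
# Sketch of the surf-vs-settled mode policy.

# Horizontal scan rate (kHz) each mode demands of a CRT.
SCAN_KHZ = {"480p60": 31.5, "1080i60": 33.75, "540p60": 33.75, "720p60": 45.0}

def choose_mode(surfing: bool, current_mode: str, content_mode: str) -> str:
    """While surfing, transcode into whatever mode is already active;
    once settled, switch the display to the content's best mode."""
    if surfing:
        return current_mode
    return content_mode

def needs_coil_switch(old_mode: str, new_mode: str) -> bool:
    """A flyback coil switch (the 'thunk') is only needed when the
    horizontal scan rate actually changes."""
    return SCAN_KHZ[old_mode] != SCAN_KHZ[new_mode]
```

Note that 1080i60 and 540p60 share a scan rate, so hopping between those two is silent, while any trip to 720p60's 45 kHz always costs a coil switch.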
I also included a 72 Hz 640/704 x 480 display mode. The inability of current HDTV displays to handle 480p72 is nothing short of criminal. Adding the capability would cost next to nothing (480p72's scan rate is close enough to 1080i60's to fudge the difference and let the two share a single flyback transformer), and it would enable judder-free DVD playback -- the Holy Grail of Home Theater. Instead of showing film's 24 frames per second by alternating between showing each film frame two and three times in a row, the display could cleanly show each frame three times in a row and avoid the judder artifacts that make film look jerky and "unreal" compared to video.
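The judder argument in numbers: 3:2 pulldown at 60 Hz has to repeat film frames unevenly, while 72 Hz divides evenly by 24 and shows every frame exactly three times. A small cadence generator makes the difference visible:

```python
# How many times each of film's 24 frames gets shown in one second,
# for a given display refresh rate.

def pulldown_cadence(film_fps: int, display_hz: int):
    """Return per-frame repeat counts; the leftover repeats are
    spread across the first frames (this is the source of judder)."""
    counts = [display_hz // film_fps] * film_fps
    for i in range(display_hz % film_fps):
        counts[i] += 1
    return counts

cadence_60 = pulldown_cadence(24, 60)  # uneven: a mix of 2s and 3s -> judder
cadence_72 = pulldown_cadence(24, 72)  # uniform: every frame shown 3 times
```

At 60 Hz the counts mix 2s and 3s (60 isn't a multiple of 24); at 72 Hz they're all 3s, which is exactly the clean 3:3 presentation argued for above.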
Astute readers have by now realized that I've included a few PAL-friendly 540p modes. Aside from making it easy to watch foreign videos, this would serve a far more important purpose that Hollywood wouldn't necessarily approve of... If you've got a DVD player hacked to be region-free (or have just ripped, decrypted, and deinterlaced a Region 2 DVD outright into nice, clean, 24fps progressive source), you've got a FAR better source video to show on a high-end HDTV system. Regardless of whether a film is burned to DVD as PAL or NTSC, the source framerate is 24fps. As it happens, PAL discs are encoded with higher resolution (576 active lines vs NTSC's 480), and their cadence shows each frame exactly twice instead of alternating two and three repeats per frame to fill NTSC's 60 fields per second.
Here are a few typical consumer setups as I envision them:
"Joe Sixpack Special" -- Cable Box/control combo, connected to decoder/display combo via a single firewire cable. The DVD player has a S-Video cable plugged into the cable box (or maybe firewire). The cable box/control has a power cord and normal 75-ohm coax cable.
"Soccer Mom Workhorse" -- same as above... bigger screen. Maybe a videogame console or two plugged into the cable box as well -- with a kludgy switchbox mechanism the family endures because they're too lazy to read the manual that came with the Controller their college-age older son bought them for Christmas. He's pissed and horrified, and can't believe his family and younger siblings can all be so stupid and would rather put up with an ugly kludge than read the 30 page manual, 2/3 of which was nothing but universally-ignored Hollywood-mandated warnings against using the product in ways they disapprove of.
"Single 20/30something Male with Money to Burn" -- cable box, PVR, DVD player, and videogame console all plugged into the Controller via firewire. Controller connected to decoder/display via firewire. Single coax cable w/RCA connector from decoder/display to Home Theatre Amp (could be optical, but audio is pretty low-bandwidth; for a 3-foot cable, going optical is basically ego masturbation. In theory, the 5.1 decoding could be handled by the decoder/display and fed via a half-dozen RCA audio cables, but why?...) Oh, yeah... and the old PC with DVD-ROM drive and firewire card he uses to rip, decode, and transcode Region 2 PAL discs into nice, clean, higher-res progressive 24fps source. It's definitely NOT realtime (he needs to rip the disc and let it do the transcoding overnight), and Hollywood officially regards him as a dangerous terrorist worthy of execution, but his friends rave about how much better movies look on his system. Oh, and the new crop of Congressmen and Senators elected in '06 under "Contract With America II" kept their promise and didn't stop with abolishing the DMCA... out for Hollywood's blood, they ruthlessly gutted American copyright law in general, stopping just short of outright legalizing file-sharing... so it's legal :-)
"Rich Home Theatre fanatic" -- this is a guy who thinks nothing about dropping $30,000 for a military flight-simulator-grade 8,000 lumen front projector capable of 1920x1080@72fps. The decoder is a separate $4,000 component. Hey, a decade earlier, this guy paid $5,000 for a turntable that couldn't even play 45rpm records...
How is this different from CURRENT systems?
* current systems combine my proposed transport layer with the decoder layer, but move the deinterlacing to the display. The ONLY benefit of this approach is that it lets you sell someone a display without an HDTV tuner that can still make a half-assed attempt at making NTSC look less bad. On the other hand, it means adding ANOTHER deinterlacing circuit to the progressive DVD player (usually settling for two mediocre subsystems instead of one GOOD one). Likewise, the MPEG decoding has to be duplicated in the tuner, the DVD player, etc. It also means that TV-based deinterlacers have to work from an analog signal, even if it was a nice, clean, digital bitstream just a few feet and nanoseconds earlier. Yuck.
* REAL, honest-to-god 720p60 is still almost unheard of. Why? Stupid consumers think 1080i60 is better because 1080 is a bigger number than 720, and 720p60 costs a LOT more to implement than 1080i60. Why? 1080i60 allegedly has 1920 horizontal pixels vs 480p60's 640 or 704 horizontal pixels, but in the Real World, most of 1080i's horizontal detail is smeared and blurred down to 1000-1400 anyway. It doesn't matter today, though... no display smaller than 8 feet diagonal with current dot pitch or DLP/LCD resolution can really show more than 1280 anyway. Likewise, 1080i60 draws 540 lines each 1/60th of a second, while 480p60 draws 480 lines every 1/60th of a second. In other words, in any given 1/60th of a second, 1080i60 is only drawing 12.5% more lines than 480p60... and much of THAT extra detail is filtered out anyway to prevent horizontal lines that are a single line high from appearing in the display and flickering badly (think: radar weather map). As a result, 1080i60 requires 33.75 kHz of horizontal bandwidth vs 480p60's 31.5 kHz. On the other hand, in 1/60th of a second, 720p displays need to draw 720 lines. We've already established that 1080i60's real horizontal detail is only 1000-1400 pixels (vs 720p's 1280 horizontal pixels) anyway, so NOW it should be obvious why 720p60 requires 45 kHz of horizontal bandwidth and takes a lot more work for the display to render. 720p60 might require a comparable MPEG-2 bitrate to encode as 1080i60, but you can't fool Mother Nature. She knows which one conveys more REAL information.
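Don't take the scan-rate figures on faith -- multiply total scanlines (visible plus vertical blanking) by the rate at which full frames are painted. The total-line counts below are the standard raster totals (525 for 480-line modes, 750 for 720p, 1125 for 1080i); for interlaced modes the frame rate is 30, since each frame's lines are spread across two 1/60-second fields.

```python
# Verify the horizontal scan frequencies quoted in the text.

def hsync_khz(total_lines: int, frames_per_sec: int) -> float:
    """Horizontal scan frequency in kHz: every line of every frame
    must be drawn once per frame period."""
    return total_lines * frames_per_sec / 1000.0

rate_480p  = hsync_khz(525, 60)   # progressive: 60 full frames/sec
rate_1080i = hsync_khz(1125, 30)  # interlaced: 30 frames = 60 fields/sec
rate_720p  = hsync_khz(750, 60)   # progressive: 60 full frames/sec
```

These come out to 31.5, 33.75, and 45.0 kHz -- matching the flyback figures above, and showing why 720p60 is the expensive one.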
* 72 Hz video modes don't exist. Manufacturers' main excuse is that there are no 72 Hz broadcast modes. Big f**king deal. All the ATSC spec means is that a network can't BROADCAST 72 Hz video over the public airwaves. If somebody wants to feed it to their display from a DVD player, or a cable company wants to send premium pay-per-view movies to customers at 72 Hz over privately-owned cable, it's none of the FCC's business. If displays capable of 480p72 existed, I GUARANTEE that 72 Hz progressive DVD players would appear within a year, and the display would be an INSTANT cult success among HTPC owners (who can get 72 Hz output now, if only there were a display bigger than 21" capable of displaying it).