When people talk about how technology has changed our lives, the best-known examples are usually runaway successes: The internet, the smartphone, Wi-Fi, or perhaps video chatting (given that we’re all doing a lot more of that recently). But for every invention that has made a successful splash, there are tons that simply went kerplunk.
You can find these tech fails in almost every category, but we’ve decided to focus on the ones from the TV world. Given TV’s long-standing role in our culture, its failures read like a history lesson in what doesn’t work.
Beta vs. VHS
In what might be the most famous format battle of all time, the Beta versus VHS competition of the 1980s was epic. Sony’s Beta format (also known as Betamax) was arguably superior to JVC’s VHS videotape format. Beta was smaller and offered higher audio and video quality, but Sony steadfastly refused to affordably license Beta to other consumer electronics companies, whereas JVC took the opposite approach.
This led to a massive surge in the number of VHS machines on the market, which in turn depressed demand for rentals of Beta-format movies. That progression, plus the adoption of VHS by a certain adult-oriented industry we won’t name here, eventually put the final nail in Beta’s consumer coffin.
Curiously, Beta’s superiority kept the format alive and well in the professional broadcast community, where it remained the dominant tape-based medium for decades after its demise in the living room.
Rear-projection TVs
When it comes to TVs, size has always mattered. But cathode-ray tube (CRT) technology, which dominated the TV industry well into the 1990s, became exorbitantly expensive and difficult to produce at screen sizes larger than 32 inches. Plasma TVs were a solution to this problem, but early models were priced well beyond the reach of most buyers, and they suffered from poor brightness and bad burn-in. LCD TVs weren’t viable yet either.
Into this void popped rear-projection TVs. In theory, they were a brilliant solution: Stick an RGB video projector at the back of a cabinet and get it to fire a reversed video image at a translucent screen. The result was like a miniature movie theater, and it cost way less to manufacture on an inch-by-inch basis than any of the competing technologies.
Unfortunately, these rear-projection TVs suffered from terrible off-angle viewing, an odd rainbow effect caused by the color wheels some models used, and a projection system that could be easily bumped out of alignment. By the mid-2000s, LCD and plasma started to show up in sizes and at prices that made rear-projection TVs look like the stop-gap solution they were, relegating the technology to history’s trash heap.
Honorary mention: Widescreen CRT TVs. Immediately before CRT TVs took their final bow, a few TV makers introduced 16:9 format tube-based TVs. They looked great — especially with widescreen DVDs — but they were expensive and couldn’t compete with rear-projection, plasma, and LCD on image size.
HD-DVD vs. Blu-ray
With the advent of high-def resolutions like 720p and 1080p, the writing was on the wall for the highly successful DVD format. It was going to be replaced by a new disc-based medium that could handle these higher resolutions, and as with Beta vs. VHS, it became clear we were in for another format war. In one corner was HD-DVD, the Toshiba-led high-def disc. In the other, Sony’s Blu-ray. Sony had learned its lesson from the Betamax fiasco, and it embarked on a campaign to enlist support for Blu-ray from all of the major studios.
Though the fight was heated at times (Microsoft infamously backed HD-DVD with an add-on drive for its Xbox 360 console), by the time CES 2008 rolled around, it was obvious that Sony had won the war this time, and despite some advantages, HD-DVD died a relatively quick death.
3D TV
In movie theaters, modern 3D projection has been a game-changer. Especially when combined with the higher brightness and larger image size of the IMAX format, 3D gave movie lovers a whole new reason to see flicks on the big screen. So it stood to reason that if home TVs could offer the same experience, it would be met with an equal amount of enthusiasm.
Well, not quite. Despite a huge push into 3D by virtually every TV maker, 3D TV failed spectacularly. While 2010 marked the year 3D TVs became mainstream, it was already clear by 2013 that the technology was in trouble. As of 2019, there wasn’t a single 3D TV on the market.
Why did it die? A number of factors played roles. There were two kinds of 3D technology (never a good thing): Active and passive. Active 3D used an expensive set of glasses to achieve stereoscopic vision by syncing the projection of left/right images on screen with a matching “flicker” of the lenses on the 3D glasses. It cut down on the available brightness but preserved full resolution. It also suffered from crosstalk whenever that sync drifted.
Passive 3D is what cinemas use: It relies on inexpensive polarized lenses to separate the left and right images projected simultaneously on the screen. Brightness was better than with active 3D, but resolution took a hit.
Neither system worked very well if you weren’t seated dead-center, and most folks started to wonder why they needed a 3D TV when most of the content they watched was in 2D.
Curved TVs
Curved TVs showed up right around the time that 3D TV was gasping its final breaths. The idea was that if you could warp the edges of the screen toward the viewer, it would create a more immersive — almost 3D — experience by making all portions of the image equally distant from your eyes.
The reality for most folks was “meh.” Visually striking as a design, curved TVs didn’t really achieve the immersion they promised, and they introduced an awkward asymmetry for anyone who wasn’t sitting dead-center on the couch. We tried out a lot of them and found it hard to recommend them over their flat-screened brethren.
You might still be able to find a curved TV from Samsung if you really want one, but you’d better act fast: Curved TV is now an example of “just because you can, doesn’t mean you should.”
Though essentially extinct in the TV world, curved screens are still a hot commodity in the computer monitor realm.
21:9 aspect ratio TVs
TV screens are still a compromise of sorts. At 16:9, they’re now the same ratio as all high-def formats like 720p, 1080p, 4K, and 8K. However, they’re still taller and narrower than 21:9 Cinemascope, the ratio used by some of the most epic films in history. Raiders of the Lost Ark, Jaws, The Matrix, Alien, and Blade Runner are all examples.
When viewed in their original format on a 16:9 TV, these classics produce small black bars at the top and bottom of the screen. The bars are hardly noticeable in a darkened room on an OLED TV, but a few manufacturers believed there was demand for a TV that required no such sacrifice, so they created “ultrawide” 21:9 models.
Unfortunately, most video content is not shot at 21:9, which means a 21:9 TV will still give you black bars. They appear on the sides instead of at the top and bottom, and they appear a lot unless you’re strictly screening movies shot at 21:9. Needless to say, most people decided they could live with the occasional top-and-bottom bars.
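The trade-off is simple geometry. As a minimal sketch (the function name and example resolutions are just illustrative), here’s how you can compute how thick the bars get when content of one aspect ratio is fitted onto a panel of another:

```python
def letterbox_bars(screen_w: int, screen_h: int, content_ratio: float):
    """Return (top/bottom bar, left/right bar) thickness in pixels per side
    when content of the given aspect ratio is fitted inside the screen."""
    screen_ratio = screen_w / screen_h
    if content_ratio > screen_ratio:
        # Content is wider than the screen: letterbox bars on top and bottom.
        scaled_h = screen_w / content_ratio
        return (round((screen_h - scaled_h) / 2), 0)
    else:
        # Content is narrower than the screen: pillarbox bars on the sides.
        scaled_w = screen_h * content_ratio
        return (0, round((screen_w - scaled_w) / 2))

# A 2.39:1 film on a 16:9 4K panel (3840 x 2160): bars top and bottom.
print(letterbox_bars(3840, 2160, 2.39))    # → (277, 0)

# Ordinary 16:9 content on a 21:9 ultrawide panel (2560 x 1080): bars on the sides.
print(letterbox_bars(2560, 1080, 16 / 9))  # → (0, 320)
```

As the second call shows, a 21:9 panel trades thin top-and-bottom bars on movies for thick side bars on everything else — which is exactly why most buyers stuck with 16:9.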
It’s worth noting that the ultrawide 21:9 format — like curved screens — has proven very popular in computer monitors, especially for those who like to game or multitask.
Cameras in TVs
If every laptop, tablet, and smartphone on the planet comes with a forward-facing camera, then why not put them on a smart TV too? That was the thinking back when smart TVs were emerging as a category and their always-on internet connections meant they could offer services like Skype. Early gesture-recognition software was baked into these TVs too, which let folks control various functions just by waving their hands in the air.
Unfortunately, the gesture recognition was a bit flaky, and it wasn’t long before security experts noted that those cameras were all too easily hacked by resourceful bad actors. This combination put something of a chill on early camera-equipped smart TVs and they quickly fell out of favor. That said, new models are beginning to appear — with a twist. Instead of a camera lens always pointed into the room (bedroom?), cameras are now being added as motorized modules that disappear when not in use, giving owners a bit more confidence that their TVs aren’t perpetually spying on them.
Perhaps this isn’t so much a fail as a feature whose time has now finally come, after a painful early prototype stage.
Gesture control
As we alluded to above, gesture control was made possible with the addition of cameras to smart TVs. But the most popular gesture control system, by far, was Microsoft’s Xbox Kinect. The motorized, camera-based Xbox accessory became the most rapidly adopted consumer technology in history (beating the DVD), with 35 million units sold from 2010 to 2017.
Its depth-sensing camera system made it possible to play video games with your whole body, and it didn’t require any other gear like sensor mats, balance boards, or handheld controllers.
The Kinect’s eventual demise was a classic case of overpromising and underdelivering on Microsoft’s part. Early ad campaigns showed gamers scanning their real-world skateboards to be used in simulated skating games and women excitedly trying on virtual outfits. None of these magical scenarios were realized. In the end, only a handful of games ever made good use of the Kinect’s abilities.
The last straw is a familiar one: Concerns over the Kinect’s always-on mic, when used with the Xbox One, proved to be one downside too many for Xbox fans.
Voice control
This probably belongs in the same category as TV cameras. Early voice-control systems — and even some current ones, ahem, Bixby — were pretty horrible. A limited set of actions, combined with hit-and-miss voice recognition that was often more miss than hit, didn’t do much to win fans.
Thankfully, Apple, Google, and Amazon have all created excellent voice-recognition devices that can be connected to myriad gadgets, including TVs. And not all TV-based voice systems were problematic: We have to give props to both LG and Roku, which launched their voice command systems later than their competitors but made them work much more effectively.