Yes, we have been through this with:
- sound cards
- joystick interfaces
- memory standards
- graphics cards (2D and 3D)
...and even keyboards!
That's why we had hoped the lesson had been learned!
But the lesson was learned: don't rush into forcing a standard before there's been an adequate phase of maximum innovation.
All these things were annoying for consumers and developers at the time, but the end result was better for it: companies could experiment without having to stick to the lowest common denominator.
This happens with most technology: it starts off very proprietary, then becomes more compatible with other brands as the best options become clear and consumer interest increases.
TVs and monitors did the same thing; they definitely weren't agnostic!
We had SECAM in France, Russia and parts of Africa. PAL was in Europe and Australia. NTSC was in America.
If you had an Amiga, playing an American game on a PAL TV resulted in slight slow motion (NTSC was 60 Hz, PAL was 50 Hz, but games didn't use variable-frame-rate code; they assumed a fixed rate based on vsync). European games would run 20% faster on an NTSC TV, but you'd lose the bottom of the screen (PAL had a vertical resolution of 576 lines, NTSC was 480).
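If you want to check the maths on that, the speed change for a vsync-locked game is just the ratio of the two refresh rates (a quick Python sketch; the numbers come straight from the broadcast standards):

```python
# Frame-rate-locked games run at the TV's vsync rate, so the perceived
# speed change is simply target_rate / original_rate.
NTSC_HZ = 60  # NTSC field rate
PAL_HZ = 50   # PAL field rate

# European (PAL) game on an NTSC TV: runs faster.
pal_on_ntsc = NTSC_HZ / PAL_HZ   # 1.2 -> 20% faster

# American (NTSC) game on a PAL TV: runs slower.
ntsc_on_pal = PAL_HZ / NTSC_HZ   # ~0.833 -> roughly 17% slow motion

print(f"PAL game on NTSC TV: {pal_on_ntsc:.0%} speed")
print(f"NTSC game on PAL TV: {ntsc_on_pal:.1%} speed")
```

Same arithmetic explains why PAL conversions of NTSC music sounded slightly flat: everything, audio timing included, was scaled by the same ratio.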
On a TRS-80 Color Computer 2, American games looked very different on a PAL TV, due to resolution and NTSC colour artifacts: on NTSC, alternating black and white pixels appeared solid red or blue depending on the order (BWBWBW looked red while WBWBWB looked blue), but on PAL they just looked grey. Some American games relied on that, so on PAL it was hard to see what was going on.
Then we had RF video, composite video, component video, S-Video and SCART (a huge socket which potentially carried them all). We had multisync and fixed-sync monitors. We had CGA, EGA and VGA monitors. Then VGA, DVI, HDMI and DP cables.
Even now we still have differences. G-Sync only works with Nvidia graphics cards; FreeSync only works with AMD cards. (By "work" I mean do its actual adaptive-sync thing. They both work as just a normal monitor if you connect the wrong card.) Some monitors are 10 bit per channel (over a billion colours), some are 8 bit per channel (16 million colours), and some are a pathetic 6 bit with dithering (262,144 colours).
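Those colour counts fall straight out of the bits per channel: three channels, each with 2^bits levels. A quick sanity check:

```python
def colours(bits_per_channel: int) -> int:
    """Total displayable colours for an RGB panel with the given bit depth."""
    # Three channels (R, G, B), each with 2**bits_per_channel levels.
    return (2 ** bits_per_channel) ** 3

print(colours(10))  # 1,073,741,824 -> "over a billion"
print(colours(8))   # 16,777,216   -> the "16 million" figure
print(colours(6))   # 262,144 (dithering fakes the rest)
```

Which is also why 6-bit panels lean so hard on temporal dithering: they're trying to fake the missing two bits per channel.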
Then we could get into the entire aspect ratio thing. One common resolution on 4:3 monitors was 1280x1024, but that's a 5:4 ratio, so pixels weren't square. Movies came as widescreen, letterboxed or pan-and-scan. Playing a game on a widescreen monitor could be Horizontal+ (you see more horizontally) or Vertical- (you see less vertically). Most 3D games were H+, but BioShock was V- (and boy did the widescreen monitor owners whine about it!)
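The H+/V- difference is just which field-of-view axis the game holds fixed. A small Python sketch of the standard FOV geometry (function names are mine, not from any game engine):

```python
import math

def hor_plus_hfov(vfov_deg: float, aspect: float) -> float:
    """H+: vertical FOV stays fixed; horizontal FOV grows with aspect ratio."""
    half_v = math.radians(vfov_deg) / 2
    return math.degrees(2 * math.atan(math.tan(half_v) * aspect))

def vert_minus_vfov(hfov_deg: float, aspect: float) -> float:
    """V-: horizontal FOV stays fixed; vertical FOV shrinks as the screen widens."""
    half_h = math.radians(hfov_deg) / 2
    return math.degrees(2 * math.atan(math.tan(half_h) / aspect))

# H+ game designed around a 60-degree vertical FOV:
print(hor_plus_hfov(60, 4 / 3))    # ~75.2 degrees horizontal on 4:3
print(hor_plus_hfov(60, 16 / 9))   # ~91.5 degrees horizontal on 16:9 (you see more)

# V- game designed around a 90-degree horizontal FOV:
print(vert_minus_vfov(90, 4 / 3))  # ~73.7 degrees vertical on 4:3
print(vert_minus_vfov(90, 16 / 9)) # ~58.7 degrees vertical on 16:9 (you see less)
```

So a V- game on a widescreen monitor genuinely cropped the view compared to 4:3, which is exactly what the BioShock crowd was whining about.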
There's also HDCP, the DRM that stops shows from playing on your TV if the content chain doesn't like it. That's pretty similar to the VR situation of games not working on unapproved HMDs.
TV is definitely not the role model for VR to follow.
VR HMDs should be agnostic, eventually. But they need time first, so new discoveries aren't held back.