AJH wrote:The Rift installs an always on piece of software that is monitoring your computer even when you aren't using it and has a ToS that allows it to report about your activities back to Facebook so they can track what you are doing.
Just FYI ... one does not ever need to log in to use the Rift software. It's just that most people do, without knowing they don't need to.
But yes, Facebook's Terms are utter nonsense. The lawsuits are already beginning because of the original promises Oculus made, which Facebook is still bound by (don't get me started on the legalese). I hope Zuckerberg et al. get smacked, and hard.
AJH wrote:Peripheral exclusivity has never, not once, been done before in the PC space in any way I'm aware of
Actually, that's utterly false, and at odds with longstanding PC reality. Heck, even Apple is an "exclusive" PC vendor today, and Intel's #1 reseller, having long since taken over from Dell back in the '00s as Dell became more subsidized by Microsoft than by Intel (for a long time, Dell was the one vendor more subsidized by Intel).
There is always a crapload of AMD v. nVidia exclusivity going on; Microsoft even encourages it, while the OpenGL Architecture Review Board (ARB) has trouble "getting the kids to sit at the table." It shows up in everything from individual titles to hardware like G-Sync v. FreeSync, and that's hardly the first instance.
BTW, this is not limited to the Windows world either. Oracle is doing this in the Linux world as well, while Red Hat, Attachmate (SuSE AG) and others agree to work together, even when they start separate initiatives.
But ... I agree that most peripheral vendors who have tried exclusivity have found themselves either "giving in" eventually, or going out of business within a year or two. So do not make claims that are not true at all ...
focus on the facts.
I.e., demonizing things to make them seem far worse than reality only undermines the real, valid arguments some of us in the industry are trying to make. I have the same issue with people who demonize Microsoft with untrue statements when there are clear, real issues that should be the focus; the untrue statements just undermine the real ones.
AJH wrote:and certainly no major way and is EXTREMELY bad for consumers and game makers alike.
Of course! Zuckerberg et al. are applying their Facebook nonsense to the PC. And they will get smacked in the US courts. Money doesn't always win, especially once class-action lawsuits start. And the US has a crapload of lawyers.
Roger Wilco Jr wrote:If by PC you mean IBM clone
And that's really the thing ... there is really no such thing as an "IBM clone," and there hasn't been for a long time ... certainly not into the 21st century at all, not even for compatibility, since '04 at the latest.
I.e., even compatibility with the old .COM object, 16-bit (64 KiB) segment, 20-bit (1 MiB) rationalized Real86-mode addressing of the PC BIOS died early last decade with the x86-64 platform, which can no longer link and load 64 KiB .COM or even 1 MiB .EXE Real86-mode objects when in 48-bit (256 TiB) "Long Mode" addressing ... and that's when a lot of Win16/Win32s/Win32e support (really Win16, with the CPU shunted between Real86 and Protected386, aka 386 Enhanced Mode) died with XP.
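For anyone who never dealt with it, the segment:offset arithmetic above is easy to sketch. This is just a toy model of the 20-bit Real86 address calculation (the function name and the A20 flag are my own illustration, not any real API):

```python
MIB = 1 << 20  # 20-bit address space: 1 MiB

def real86_address(segment, offset, a20_enabled=False):
    """Physical address for a 16-bit segment:offset pair in Real86 mode.

    The CPU forms a 20-bit address as (segment * 16) + offset.  With the
    A20 line masked (original IBM PC behavior), addresses wrap at 1 MiB;
    with A20 enabled, FFFF:0010 and above reach just past 1 MiB (the
    "High Memory Area" trick).
    """
    assert 0 <= segment <= 0xFFFF and 0 <= offset <= 0xFFFF
    addr = (segment << 4) + offset
    return addr if a20_enabled else addr % MIB
```

E.g., `real86_address(0xFFFF, 0x0010)` wraps to 0 with a masked A20 line but yields 0x100000 with it enabled; a 48-bit Long Mode address space simply has no notion of this arithmetic, which is why those old objects can't load there.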
That's why 64-bit uEFI really came about, even if most people only use the Compatibility Support Module (CSM), which provides the 16-bit BIOS interfaces for the initial Power On Self Test (POST) and boot compatibility until NTOSKRNL.EXE loads. It's really been capable since 2009, and even NT6.0 (Vista x64) and NT6.1 (7/2008 x64) have native support, but the PC OEMs didn't "get on board" until Microsoft forced it with NT6.2 (Windows 8). Also, it has nothing to do with SecureBoot, except that SecureBoot requires uEFI (people get a lot of those details mixed up too).
That's when even "boot"-time compatibility with the old IBM BIOS finally died, even though it had really been dead since the 20th century.
And BTW, that's all ignoring the Advanced RISC Computing (ARC) efforts that came out of the early '90s, when "everyone was going to standardize on Stanford MIPS" (don't get me started -- when I worked with MIPS in '07, they were just a tiny shadow of their former self). That firmware and those APIs were reborn as solutions for Itanium, then Intel's EFI, and now the consortium uEFI that all PCs had by the late '00s. Very little has changed, but it was required for 64-bit, and the industry was already headed there for 32-bit in the '90s.
Only Microsoft Windows required such compatibility, not other OSes ... yes, even on the 32-bit (IA-32), later 64-bit (x86-64/AMD64 and IA32e/EM64T), x86 compatible Instruction Set Architecture (ISA) known traditionally as an x86 PC.
E.g., Linux ran on 32-bit x86 and 64-bit non-x86 "PCs" (virtually no different) with ARC (and other) firmware in the '90s. In the Windows world, both Digital and SGI even put the effort into 32-bit Windows NT4-5 (4/2000/XP) so that NTLDR (the NT Boot Loader, now BOOTMGR in NT6+/Vista+) and the Hardware Abstraction Layer (HAL) ran on 32-bit PCs with 64-bit ARC firmware. Yes, legacy-BIOS-free, because NT was originally designed for non-BIOS platforms too, even if Microsoft dropped that altogether by the late '90s, and is now paying a price (e.g., Windows phones, as the x86-compatible Atom is badly behind 64-bit, speculative, out-of-order ARMv8 designs, and Intel has had to admit it).
So it's really all about x86-specific, 16-bit (Win16/Win32s/Win32e) and, later, 32-bit Windows (Win32/x86), which still used a 16-bit boot path for the last dozen years, with NT now using a Win64/x86-64 kernel (but not shipping many native "Long Mode" objects, long story). At most, the only lasting legacy of anything IBM ...
... was the "API" of the original 8-bit ST506 disk interface and the 16-bit [IBM PC/]AT Attachment (ATA), which has been extended into 32-bit logical addressing, with enhancements to the point that we're starting to hit its limits.
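To put numbers on those addressing limits, here's a quick back-of-the-envelope sketch (assuming the longstanding 512-byte logical sector; `capacity_bytes` is just an illustrative helper, not any real API):

```python
SECTOR = 512  # bytes per logical sector, the longstanding ATA default

def capacity_bytes(lba_bits):
    """Maximum addressable capacity for a given LBA address width."""
    return (1 << lba_bits) * SECTOR

# Original INT 13h/CHS ceiling: 1024 cylinders x 16 heads x 63 sectors
chs_limit = 1024 * 16 * 63 * SECTOR   # 504 MiB
lba28 = capacity_bytes(28)            # 128 GiB -- the old "137 GB" barrier
lba48 = capacity_bytes(48)            # 128 PiB -- where ATA sits today
```

Each extension was just "stretch the same old API a bit further"; drives blew past the 28-bit limit years ago, and 48-bit is comfortable for now, but that pattern is exactly the ST506/ATA legacy described above.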
But even PCIe/NVMe is finally replacing ATA/AHCI. Its precursor was on Linux over a dozen years ago, and CE-Windows picked up some support (with many fixed requirements), but NT-Windows did not until Intel formalized NVMe and uEFI became the mainstay. Now we're finally seeing Windows move into the 21st century, as the commodity hardware components already have.
Again, sorry to be anal (not aiming this at anyone), but a 100% "IBM PC" really doesn't exist any more, and it's really Windows that hadn't moved into the 21st century until just recently. Linux has been 64-bit clean since 1994 (thanks to Alpha), and has had many other options, especially once AMD x86-64 hit in '04. That's when everyone doing high-end game development switched to GNU/Linux, which is why OpenGL still rules and we've had "PCs" with lots of hardware capabilities that "Windows PCs" are only now getting.
It's only the commodity pricing of AMD x86-64 that finally pushed Microsoft to "care." Not even five years earlier, in 1999, when they lost virtually every moderate-to-high-end engineering app to GNU/Linux and POSIX/OpenGL APIs, did they care. Only when Epic MegaGames literally lambasted Windows on 64-bit AMD v. GNU/Linux did they "wake up," and they have been trying to "catch up" ever since. Even the latest Windows 10 and Windows Graphics Foundation (WGF) looks like crap next to Linux's Cairo vector rendering, which has been around since '03 and widely implemented in widgets and icons since '05. Windows is just that far behind.
And it only survives on Microsoft's control over OEM and retail distribution, because that's where 90% of consumers get their hardware from. But not so much their software anymore.
Roger Wilco Jr wrote:I can't remember exactly, but leaving out old Nintendos and Comodores, I think there have been plenty of peripherals that I've bought for my clones through the decades that wouldn't work on Macs or Amigas and etc. I recently bought a driving wheel and there were PC only versions that would not work on an Xbox or PS4 - but there were other versions for those. I've never been much of a platform gamer and I guess cross platform compatibility, in either games or hardware, just isn't that important to me, although it may be a stupid business decision and I probably wouldn't do it myself if it was easily avoidable.
Yeah, he's literally making crap up. I have an EE degree with a focus on computer architecture, and "grew up" around a lot of '90s to early '00s developments, including working right in the semiconductor industry as well as dealing with "sourcing" when doing a lot of embedded work.
E.g., just today I was "discussing" with someone why Rijndael was picked as the Advanced Encryption Standard (AES) by the US NIST ...
... after he claimed it was the US NSA that picked it. I'm so tired of total ignorance and guilt-by-association, which included him claiming Red Hat -- an American company -- could not be 100% Open Source and was a puppet of the NSA. I was working at a revolutionary, fabless, asynchronous (clockless) design firm at the time of AES' selection, and the choice had everything to do with smart cards. I.e., cryptography that uses lots of adds (which includes subtracts) throws off one heck of an EMF signature, because "ripple adders" in clocked boolean logic (CBL) are so predictable, which is why our work would be used in various solutions (even if we were victims of the .COM bust and got downsized, the legacy lives on). Being a Netscape iPlanet/Red Hat Certificate System (upstream "Dogtag") SME today, I really get tired of this "the NSA has your keys" nonsense, which mixes together about 12 different concepts and conspiracy theories that have nothing to do with each other when it comes to software and open systems, let alone open source.
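To make the ripple-adder point concrete: the switching activity (and thus the EMF) a ripple-carry adder throws off depends on how far the carry propagates, and that distance depends on the operands. Here's a toy software model of that data dependence (purely illustrative; real side-channel analysis of hardware is far more involved):

```python
def carry_chain(a, b, width=8):
    """Length of the longest consecutive carry-propagation run when
    computing a + b bit-by-bit, as a ripple-carry adder would."""
    carry, run, longest = 0, 0, 0
    for i in range(width):
        s = ((a >> i) & 1) + ((b >> i) & 1) + carry
        carry = s >> 1
        run = run + 1 if carry else 0
        longest = max(longest, run)
    return longest

# Same operation, wildly different carry behavior:
quiet = carry_chain(0b01010101, 0b00101010)  # no carries at all -> 0
noisy = carry_chain(0b11111111, 0b00000001)  # carry ripples all 8 bits -> 8
```

In clocked logic the worst case sets the clock period, but the switching energy per cycle still tracks the actual carry activity of the secret operands; that operand-dependent leakage is exactly what clockless designs were attacking in smart cards.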
The semiconductor industry is extremely predictable ... it's utterly supply-side economics (don't get me started). That's why when the Semiconductor Industry Association (SIA) predicts something, they are spot-on, because demand matters little. Demand only defines price, not so much adoption, with few exceptions. Tablet sales were easily predictable years before Apple even created the iPad, especially since most early designs were sporting either GNU/Linux (the original Cyrix reference design in the late '90s) or Pen Computing (the latter finally got Microsoft to settle, and then even absorb them, after Microsoft blatantly misappropriated their IP during prior negotiations). They just didn't penetrate the consumer market, because the hardware is never the seller.
It's the software ... which is where the app store comes in. And that's why Apple quickly realized what Cyrix et al. had done for other industries by 1999 (e.g., heavy logistics tracking, military applications, etc.), and realized the few music players that came out of that work just weren't penetrating; they needed a virtual, on-line storefront. The Tablet was always the next step, after testing with an initial, smaller device that could be made more cheaply for consumers (most "industry tablets" cost many thousands at the time).
Ergo ... the simple music player, with the simple MP3 ASIC decoding logic that ARM had already created for other people prior to Apple's entry.
If I hadn't been scrambling for money in my late 20s, running back to NASA for a public-sector job after getting laid off not once (March 2001, after the late-2000 .COM downturn) but twice (after 9/11, when sales went flat at my friend's place), I honestly wish I had put money into developing Digital Rights Management ... using an Open Source solution, offering myself as an SME for implementation. E.g., the people who came up with UltraViolet, especially seeing the MPAA go for it for movies, were brilliant and earned consumer trust, and I honestly wish the RIAA had come up with something similar for music.
But instead, Apple got control, along with Amazon, and then Google started leveraging its on-line monopoly to make its push ... including into other areas where Facebook is at too.
Hence, the bigger point ...

Roger Wilco Jr wrote:AJH wrote:... but they also have previously said they wouldn't do what they did, so it doesn't carry a lot of weight with anyone in the enthusiast community since they've already shown they have no problem screwing over their paying customers and developers if it is in their own self interest. That's why there is shattered trust with the critical early adopter market ...
This probably bothers me most, but then again they didn't make and break any promises to me, so I'm not taking it so personally. So if two years from now Oculus comes out with the most kick-ass headset that beats all the competitors hands down, then I'll be taking a strong look at their TOS.
Again, American lawyers are already making this an issue in the US. I expect Zuckerberg, no matter how rich, will feel that eventually.
Roger Wilco Jr wrote:AJH wrote:Yes, once touch gets here they'll have a partial answer to Vive's roomscale, but they also backed the wrong horse there. The industry is rapidly showing that room scale matters a lot more than seated experiences and while Rift does have a slight advantage at seated, Vive has a considerable advantage, even after Touch at doing room scale.
I've heard people ask if VR is just a fad. If anything is a fad, I think roomscale may be the 3DTV of the VR marketplace. Once they get touch feed back gloves figured out, that let you see "your hands" truly interacting with 3D objects, like keyboards and flight controllers, as well as door knobs and light switches - basically the holodeck experience inside a headset - then I'll think they really have something. I'll I've seen of room scale and hand controllers is people throwing their hands around apparently shooting at things (from the hip or thug style). I mean it may be a great new experience, but I'm pretty sure the sit down simulator experience is here to stay.
Of course, I haven't even tried it yet, so I may be (am) talking out my ass.
VR is the final shift that 3DTV was never able to make. 3DTV just added depth. VR "changes the living room."
It will be slow, and it will change many times, but it's coming. The days of the TV -- as a distribution solution for entertainment and media -- are already numbered. Heck, even "Big Media" here in the US has been under constant attack. From the Big 4 networks to ESPN, they are feeling it.
After the Baby Boomer generation -- which currently has the most purchasing power, even in its retiring years -- is gone from the US, the small Gen-X cohort will be at the mercy of newer Millennial trends, a generation even bigger than the Boomers. That's when you're finally going to see traditional US media -- a massive industry (monetarily) that drives everything world-wide (even if the US is not the center of the universe, the most money in traditional media is US-heavy) -- flop and consolidate fully, while the US Congress tries to find some way to tax on-line services to make up for the shortfall in advertising and related traditional services.
Just look at the top 5 Internet companies in the US. They suck up 80% of all advertising on the Internet. The app store is about control. And that's why Facebook is using Oculus to try to push exclusives. It's really about control.
Apple was first to realize this, even though they were way, way late to the music player and, then, the Tablet. People think otherwise, just like they think Microsoft was first with the Office Suite, the Internet Browser and the Themed Music Player for the PC (all developed on and released for UNIX or Linux first). No, they were last (fifth or sixth); they bought products and used their distribution channels to push them to consumers better. The app store is the new distribution channel, and why Microsoft is becoming a cloud company, as their OEM and retail distribution locks have long been broken.
And that's why Facebook is desperately trying to make the Rift "the device," as it feels VR is the forthcoming entertainment change for the "new living room" of 2025+. It isn't going to be a "viewing" room. It's going to be an "interactive play room."