Andrew Kemendo’s article on Virtual Reality Pop, “Here’s What It’s Going to Take For Augmented Reality to Take Over The World,” explores the technology trajectory of AR and what it will take for consumers to adopt and use it widely.
With rumors swirling about Apple’s plans in Augmented Reality (AR), as well as recent teeth-gnashing about the state of Magic Leap, it’s beyond time for the AR community to have a real discussion about what it’s going to take for Augmented Reality to become the primary computing environment worldwide.
I’ve spoken in the past about what I call the “AR stack” — that is, the set of technologies necessary to achieve wide functionality and applicability for the average consumer. However, I wanted to expand on that and build a very basic reference identifying key concepts necessary for AR that people can easily cite when trying to understand this topic. Read the complete article on Virtual Reality Pop.
Virtual reality (VR) typically refers to computer technologies that use software to generate realistic images, sounds, and other sensations that replicate a real environment (or create an imaginary setting) and simulate a user’s physical presence in that environment. VR has been defined as “…a realistic and immersive simulation of a three-dimensional environment, created using interactive software and hardware, and experienced or controlled by movement of the body” or as an “immersive, interactive experience generated by a computer”.
A person using virtual reality equipment is typically able to “look around” the artificial world, move about in it, and interact with features or items depicted on a screen or in goggles. Most 2016-era virtual realities are displayed either on a computer monitor, a projector screen, or with a virtual reality headset (also called a head-mounted display, or HMD). HMDs typically take the form of head-mounted goggles with a screen in front of the eyes. Read more about virtual and augmented reality on Wikipedia.