100% valid and up-to-date GCFW Cheatsheet with verified answers

killexams.com is the source of the latest and valid GCFW PDF download, with practice test questions and answers for candidates to download, study and use to pass the GCFW exam. We recommend practicing with our real GCFW questions to improve your knowledge of the GCFW objectives and pass the exam with high marks. You will have no difficulty recognizing the GCFW questions in the actual exam, so you can answer them quickly and earn a good score.

Exam Code: GCFW Practice exam 2022 by Killexams.com team
GIAC Certified Firewall Analyst
Behind the Doors of Meta's Top-Secret Reality Labs

Mark Zuckerberg sat across from me, controlling objects on a screen with small motions of his fingers. Taps, glides, pinches. On his wrist was a chunky band that looked like an experimental smartwatch: It's Meta's vision of our future interactions with AR, VR, computers and just about everything else.

"It'll work well for glasses…I think it'll actually work for everything. I think in the future, people will use this to control their phones and computers, and other stuff…you'll just have a little band around your wrist," Zuckerberg said, right before he demoed the neural wristband. His hand and finger movements seemed subtle, almost fidgety. Sometimes nearly invisible.

Neural input devices are just one part of Meta's strategy beyond VR, and these wristbands were among the tech I got to see and try during a first-ever visit to Meta's Reality Labs headquarters in Redmond, Washington. The trip was the first time Meta's invited journalists to visit its future tech research facility, located in a handful of nondescript office buildings far north of Facebook's Silicon Valley headquarters. 

Entrance to a red building

Entering Meta Reality Labs in Redmond, Washington.

Scott Stein/CNET

The last time I visited Redmond, I was trying Microsoft's HoloLens 2. My trip to Meta was a similar experience. This time, I was demoing the Meta Quest Pro, a headset that blends VR and AR into one device and aims to kick off Zuckerberg's more work-focused metaverse strategy. 

Meta's newest Connect conference news is focused on the Quest Pro, and also on new work partnerships with companies like Microsoft, Zoom, Autodesk and Accenture, targeting ways for Meta to maybe dovetail with Microsoft's mixed reality ambitions.

I also got to look at a handful of experimental research projects that aren't anywhere near ready for everyday use but show glimpses of exactly what Meta's shooting for next. These far-off projects, and a more-expensive Quest Pro headset, come at a strange time for Meta, a company that's already spent billions investing in the future of the metaverse, and whose most popular VR headset, the Quest 2, has still sold fewer than 20 million units. It feels like the future isn't fully here yet, but companies like Meta are ready for it to be.

I experienced a number of mind-bending demos with a handful of other invited journalists. It felt like I was exploring Willy Wonka's chocolate factory. But I also came away with the message that, while the Quest Pro looks like the beginning of a new direction for Meta's hardware, it's nowhere close to the end goal.

Researchers wearing prototype wristbands controlling a video game

A demo of EMG wristbands measuring motor neurons, at Meta Reality Labs Research

Meta

Neural inputs: Wristbands that adapt to you

"Co-adaptive learning," Michael Abrash, Meta's Reality Labs' chief scientist, told me over and over again. He was describing the wristbands that Meta has discussed multiple times since acquiring CTRL-Labs in 2019. It's a hard concept to fully absorb, but Meta's demo, shown by a couple of trained researchers, gave me some idea of it. Wearing the bulky wristbands wired to computers, the wearers moved their fingers to make a cartoon character swipe back and forth in an endless-running game. Then, their movements seemed to stop. They became so subtle that their hands barely twitched, and still they played the game. The wristbands use EMG, or electromyography (the electrical measurement of muscles) to measure tiny muscle impulses.

A feedback-based training process gradually allowed the wearers to start shrinking down their actions, eventually using only a single motor neuron, according to Thomas Reardon, Reality Labs' Director of Neuromotor Interfaces and former CEO of CTRL-Labs, who talked us through the demos in Redmond. The end result looks a little like mind reading, but it's done by subtly measuring electrical impulses showing an intent to move. 
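Meta hasn't published how its decoding actually works, so as a purely illustrative sketch of that co-adaptation loop, here is a toy Python detector whose threshold drifts toward each confirmed gesture's amplitude, letting progressively smaller impulses register as the same intent. The signal model, constants and names are all assumptions for illustration, not anything from Reality Labs.

import numpy as np

rng = np.random.default_rng(0)

def detect_gestures(envelope, init_threshold=1.0, adapt_rate=0.7, floor=0.05):
    """Toy co-adaptive detector: each confirmed gesture nudges the
    threshold toward a fraction of the observed peak, so progressively
    smaller muscle impulses still register as the same intent."""
    threshold = init_threshold
    events = []
    for t, sample in enumerate(envelope):
        if sample > threshold:
            events.append(t)
            target = max(0.5 * sample, floor)  # never adapt below a noise floor
            threshold += adapt_rate * (target - threshold)
    return events, threshold

# Simulated rectified EMG envelope: gesture peaks that shrink over time
# (the wearer learns to move less) on top of low-level noise.
envelope = 0.02 * rng.random(500)
for i, peak in enumerate(np.linspace(2.0, 0.3, 10)):
    envelope[50 * i + 25] = peak

events, final_threshold = detect_gestures(envelope)
print(f"detected {len(events)} gestures, final threshold {final_threshold:.2f}")

Run as-is, all ten shrinking gestures are caught and the detection threshold ends roughly an order of magnitude below where it started; the real system presumably adapts a far richer decoder than a single threshold.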

Mark Zuckerberg demos a neural input wristband with a computer

Mark Zuckerberg using the EMG wristband in a demo in front of a handful of journalists during my visit. 

Meta

When Zuckerberg demonstrated the wristband, he used a similar set of subtle motions, though they were more visible. The wristband's controls feel similar to a touch-based trackpad or air mouse, able to identify pressure-based pinches, swipes and gestures.

"In the long run, we're going to want to have an interface that is as natural and intuitive as dealing with the physical world," Abrash said, describing where EMG and neural input tech is aiming. 

Typing isn't on the table yet. According to Zuckerberg, it would require more bandwidth to get to that speed and fidelity: "Right now the bit rate is below what you would get for typing quickly, but the first thing is just getting it to work right." The goal, at some point, is to make the controls do more. Meta sees this tech as truly arriving in maybe five to six years, which feels like an eternity. But it will likely line up, should that timeframe hold, with where Meta sees its finalized AR glasses becoming available.

Someone wearing a neural input wristband

The EMG wristband looks like a large prototype smartwatch, with sensors in the segmented strap.

Scott Stein/CNET

Zuckerberg says the wristbands are key for glasses, since we won't want to carry controllers around, and voice and hand tracking aren't good enough. But eventually he plans to make these types of controls work for any device at all, VR or otherwise. 

The controls look like they'll involve an entirely different type of input language, one that might have similarities to existing controls on phones or VR controllers, but which will adapt over time to a person's behavior. It seems like it would take a while to learn to use. 

"Most people are going to know a whole lot about how to interact in the world, how to move their bodies," Reardon said to me. "They're going to understand simple systems like letters. So let's meet them there, and then do this thing, this pretty deep idea called co-adaptation, in which a person and a machine are learning together down this path towards what we would call a pure neural interface versus a neural motor interface, which blends neural decoding with motor decoding. Rather than saying there's a new language, I'd say the language evolves between machine and person, but it starts with what people do today."

Demonstrating an EMG wristband in front of a computer screen showing feedback

A demonstration showing how feedback can lead to the wristband sensing smaller and smaller motions.

Meta

"The co-adaptation thing is a really profound point," Zuckerberg added. "You don't co-adapt with your physical keyboard. There's a little bit of that in mobile keyboards, where you can misspell stuff and it predicts [your word], but this is a lot more."

I didn't get to wear or try the neural input wristband myself, but I got to watch others using them. Years ago at CES, I did get to briefly try a different type of wrist-worn neural input device for myself, and I got a sense of how technologies like this actually work. It's different from the head-worn device by Nextmind (since acquired by Snap) I tried a year ago, which measured eye movement using brain signals. 

The people using the Meta wristbands seemed to make their movements easily, but these were basic swiping game controls. How would it work for more mission-critical use in everyday AR glasses? Meta's not there yet: According to Zuckerberg, the goal for now is to just get the tech to work, and show how adaptive learning could eventually shrink down response movements. It may be a while before we see this tech in action on any everyday device, but I wonder how Meta could apply the principles to machine learning-assisted types of controls that aren't neural input-based. Could we see refined controllers or hand tracking combinations arrive before this? Hard to tell. But these bands are a far-off bet at the moment, not an around-the-corner possibility.

A man wearing a mask and headphones in a testing room with speakers

I wear a spatially-trackable headset which creates audio effects I can't distinguish from the speakers in the room.

Meta

Super-real 3D audio 

A second set of demos I tried, demonstrating next-generation spatial audio, replicated research Meta talked about back in 2020 -- and which it originally planned on showing off in-person before COVID-19 hit. Spatial audio is already widely used in VR headsets, game consoles and PCs, and on a variety of everyday earbuds such as AirPods. What Meta's trying to do is not just have audio that seems like it's coming from various directions, but to project that audio to make it seem like it's literally coming from your physical room space.

A visit to the labs' soundproof anechoic chamber -- a suspended room with foam walls that block reflections of sound waves -- showed us an array of speakers designed to help study how sounds travel to individual ears, and to explore how sounds move in physical spaces. The two demos we tried after that showed how ghostly-real the sounds can feel.

In a soundproof room with a tower full of speakers

Inside Meta's anechoic chamber, where a massive speaker array is used to help create spatial audio profiles.

Scott Stein/CNET

One, where I sat down in a crowded room, involved me wearing microphones in my ears while the project leads moved around me, playing instruments and making noises at different distances. After 40 seconds of recording, the project leads played back the audio to me with over-ear headphones… and parts of it sounded exactly like someone was moving around the room near me. What made it convincing, I think, were the audio echoes: The sense that the movement was reverberating in the room space. 

A second demo had me wearing a 3D spatial-trackable pair of headphones in a room with four speakers. I was asked to identify whether music I heard was coming from the speakers, or my ears. I failed. The music playback flawlessly seemed to project out, and I had to take off the headphones to confirm which was which as I walked around.

According to Michael Abrash's comments back in 2020, this tech isn't as far away from becoming a reality as the neural wristbands. Meta plans to have phone cameras eventually help tune personal 3D audio, much like the feature Apple just added to its newest AirPods, but with the added benefit of realistic room-mapping. Meta's goal is to have AR projections eventually sound convincingly present in any space, and it's a goal that makes sense: A world of holographic objects will need to feel anchored in reality. Although if future virtual objects sound as convincingly real as my demos did, it might become hard to distinguish real sounds from virtual ones, which raises a whole bunch of other existential concerns.
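Meta didn't detail the rendering behind these demos, but two of the basic cues any spatial audio system leans on are straightforward to sketch: interaural time difference (sound reaches one ear slightly later than the other) and distance-based level falloff. The geometry and constants below are illustrative assumptions; real systems layer personalized HRTFs and room reverberation, the part that made the demos convincing, on top of this.

import numpy as np

SPEED_OF_SOUND = 343.0  # m/s
SAMPLE_RATE = 48_000    # Hz
HEAD_RADIUS = 0.0875    # m, roughly half the distance between the ears

def render_binaural(mono, source_xy):
    """Place a mono signal at source_xy (meters; listener at the origin,
    ears on the x-axis) using per-ear arrival delay and 1/distance gain."""
    ears = {"left": np.array([-HEAD_RADIUS, 0.0]),
            "right": np.array([+HEAD_RADIUS, 0.0])}
    channels = []
    for ear in ears.values():
        dist = np.linalg.norm(np.asarray(source_xy) - ear)
        delay = int(round(dist / SPEED_OF_SOUND * SAMPLE_RATE))
        gain = 1.0 / max(dist, 0.1)  # clamp so nearby sources don't blow up
        channels.append(np.pad(mono, (delay, 0))[: len(mono)] * gain)
    return np.stack(channels, axis=1)

# Half a second of a 440 Hz tone placed about 2 m away, to the listener's right:
t = np.arange(int(0.5 * SAMPLE_RATE)) / SAMPLE_RATE
stereo = render_binaural(np.sin(2 * np.pi * 440 * t), source_xy=(1.5, 1.3))
print(stereo.shape)  # (24000, 2); the right channel is louder and arrives earlier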

Wearing a VR headset and talking with a head on a computer screen

My conversation with an avatar so realistic it felt like I was in the same room with them.

Meta

Talking to photo-real avatars

I'm in a dark space, standing across from a seemingly candle-lit and very real face of someone who was in Meta's Pittsburgh Reality Labs Research offices, wearing a specially built face-tracking VR headset. I'm experiencing Codec Avatars 2.0, a vision of how realistic avatars in metaverses could get.

How real? Quite real. It was uncanny: I stood close and looked at the lip movement, his eyes, his smiles and frowns. It felt almost like talking with a super-real PlayStation 5 game character, then realizing over and over again this is a real-time conversation with a real person, in avatar form. 

I wondered how good or limited face tracking could be: After all, my early Quest Pro demos using face tracking showed limits. I asked Jason, the person whose avatar I was next to, to make various expressions, which he did. He said I was a bit of a close-talker, which made me laugh. The intimate setting felt like I had to get close and talk, like we were in a cave or a dimly lit bar. I guess it's that real. Eventually, the realism started to feel good enough that I started assuming I was having a real conversation, albeit one with a bit of uncanny valley around the edges. It felt like I was in my own living video game cutscene.

Meta doesn't see this coming into play for everyday headsets any time soon. First of all, standalone VR headsets are limited in their processing power, and the more avatars you have in a room, the more graphics get taxed. Also, the tracking tech isn't available for everyone yet. 

Man with a VR headset, chatting with a realistic avatar on a screen

Trying out a chat with an Instant Codec Avatar, created with a phone-made head scan.

Meta

A more dialed-down version was in my second demo, which showed an avatar created from a phone-camera face scan using a new technology called Instant Codec Avatars. The face looked better than most scans I'd ever made myself, but I felt like I was talking with a frozen and only slightly moving head. The end result was less fluid than the cartoony Pixar-like avatars Meta uses right now.

Actor being scanned by a room full of cameras

An actor who was 3D scanned ahead of time using an array of cameras. I saw his rendered avatar layered with digital clothing.

Scott Stein/CNET

One final demo showed a full-body avatar (legs, too!) that wasn't live or interactive. It was a premade 3D scan of an actor using a special room with an array of cameras. The demo focused on digital clothes that could realistically be draped over the avatar. The result looked good up close, but similar to a realistic video game. It seems like a test drive for how digital possessions could someday be sold in the metaverse, but this isn't something that would work on any headset currently available.

3D scanning a shoe with a phone

My sneaker gets 3D scanned using Meta's new phone-based capture tech.

Scott Stein/CNET

3D scanning my shoes (plus, super-real cacti and teddy bears)

Like a volunteer in a magic show, I was asked to remove one of my shoes for a 3D scanning experiment. My shoe ended up on a table, where it was scanned with a phone camera -- no lidar needed. About half an hour later, I got to look at my own shoe in AR and VR. 3D scanning, like spatial audio, is already widespread, with lots of companies focused on importing 3D assets into VR and AR. Meta's research is aiming for better results on a variety of phone cameras, using a technology called neural radiance fields. Another demo showed a whole extra level of fidelity.

A tablet showing a 3D image of a shoe

My shoe, after being scanned, in AR.

Scott Stein/CNET

A couple of prescanned objects, which apparently took hours to prepare, captured the light patterns of complex 3D objects. The results -- which showed furry, spiky, fine-detailed objects including a teddy bear and a couple of cacti -- looked seriously impressive on a VR headset. The curly fur didn't seem to melt or matte together like most 3D scans; instead it was fluffy, seemingly without angles. The cactus spines spread out in fine spiky threads.

Of all the demos I tried at the Reality Labs, this was maybe the least wowing. But that's only because there are already, through various processes, lots of impressive 3D-scanned and rendered experiences in AR and VR. It's not clear how instant or easy it could be to achieve Meta's research examples in everyday use, making it hard to judge how effective the function is. For sure, if scanning objects into virtual, file-compatible versions of themselves gets easier, it'll be key for any company's metaverse ambitions. Tons of businesses are already aiming to sell virtual goods online, and the next step is letting anyone easily do it for their own stuff. Again, this is already possible on phones, but it doesn't look as good…yet.
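Meta hasn't shared its capture pipeline, but the neural radiance field technique named above has a well-known core: march a camera ray through space, query a learned field for density and color at sample points, and alpha-composite the result. Here is a minimal sketch of that compositing step, with an analytic stand-in "field" where a trained network would normally sit; everything here is a textbook illustration, not Meta's code.

import numpy as np

def toy_field(points):
    """Stand-in for a trained radiance field: a fuzzy red sphere at the
    origin. Returns (density, rgb) per point; a real NeRF runs an MLP here."""
    r = np.linalg.norm(points, axis=-1)
    density = 8.0 * np.exp(-((r / 0.5) ** 2))
    rgb = np.broadcast_to([1.0, 0.2, 0.2], points.shape)
    return density, rgb

def render_ray(origin, direction, near=0.0, far=4.0, n_samples=64):
    """Classic volume rendering: composite color along one camera ray."""
    ts = np.linspace(near, far, n_samples)
    delta = ts[1] - ts[0]
    density, rgb = toy_field(origin + ts[:, None] * direction)
    alpha = 1.0 - np.exp(-density * delta)  # per-segment opacity
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alpha[:-1]]))  # light surviving so far
    weights = trans * alpha
    return (weights[:, None] * rgb).sum(axis=0)

# A ray fired from z = -2 straight through the sphere picks up its color:
print(np.round(render_ray(np.array([0.0, 0.0, -2.0]),
                          np.array([0.0, 0.0, 1.0])), 3))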

Meta's Michael Abrash stands in front of a wall of VR headsets

Chief Scientist Michael Abrash talks to us in front of a wall of prototype VR and AR headsets.

Meta

What does it all mean?

The bigger question on my mind, as my day ended at Meta's facilities and I called a Lyft from the parking lot, was what it all added up to. Meta has a brand-new Quest Pro headset, which is the bleeding-edge device for mixing AR and VR together, and which offers new possibilities for avatar control with face tracking.

The rest of the future remains a series of question marks. Where Meta wants to spread out its metaverse ambitions is a series of roads that are still unpaved. Neural inputs, AR glasses, blends of virtual and real sounds, objects and experiences? These could still be years away. 

In a year when Meta has seen its revenue drop amid inflation and an economic downturn, even as it makes sizable bets on the metaverse, will all of these projects be fulfilled? How long can Meta's long-game metaverse visions be sustained?

Prototype glasses-sized VR headset

Meta's prototype VR sunglasses, the "North Star" for what the tech aims to become.

Scott Stein/CNET

Abrash talks to us once more as we gather for a moment before the day's end, bringing back a connecting theme: that immersive computing will, eventually, be a true revolution. Earlier on, we had stopped at a wall full of VR and AR headsets, a trophy case of all the experimental prototypes Meta has worked on. We saw mixed reality ones, ones with displays designed to show eyes on the outside, and ones so small they're meant to be the dream VR equivalent of sunglasses. 

It made me think of the long road of phone design experimentation before smartphones became mainstream. Clearly, the metaverse future is still a work in progress. While big things may be happening now, the true "smartphones" of the AR and VR future might not be around for a long while to come.

"The thing I'm very sure of is, if we go out 20 years, this will be how we're interacting," Abrash said in front of the headset wall. "It's going to be something that does things in ways we could never do before. The real problem with it is, it's very, very hard to do this."

11 Best Virtual Reality Stocks to Buy

In this piece, we will take a look at the 11 best virtual reality stocks to buy. For more stocks, head on over to 5 Best Virtual Reality Stocks to Buy.

Advances in semiconductor fabrication and manufacturing have enabled chip makers to squeeze unthinkable amounts of computing power into pieces of silicon the size of a human thumbnail. This growth has spawned entire industries of its own, and one such sector is the virtual reality segment of the broader technology industry.

Virtual reality, as the name suggests, refers to technologies that create an artificial representation of reality for users to immerse themselves in, whether for entertainment or productivity. This is achieved through headsets, processors, and software, with different companies providing different parts of the stack.

The virtual reality industry was estimated to be worth $4.4 billion in 2020, and at a massive compound annual growth rate (CAGR) of 44.8%, the segment could be worth a whopping $84 billion by 2029, according to a research report from Fortune Business Insights. Driving this growth will be several factors, such as demand for virtual training platforms that let firms prepare their employees for complex tasks without investing in physical infrastructure. This allows companies in industries such as automobile manufacturing to reduce worker injuries and conduct factory personnel training safely.

Another research report, this time from Valuates Report, analyzes both the virtual and augmented reality markets. Augmented reality is a subset of virtual reality that serves as a 'bolt-on' to existing reality instead of rendering a completely new environment. This research firm believes the markets were worth $14 billion in 2020 and, at a strong CAGR of 41%, will grow to $454 billion by the end of 2030.
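Both projections rest on the standard compound-growth formula, future = present * (1 + CAGR)^years, which can also be inverted to check how long a given climb should take. The sketch below uses the figures quoted above; exact end values depend on the base year and horizon each report assumes.

import math

def project(present, cagr, years):
    """future = present * (1 + r)^n, the standard CAGR projection."""
    return present * (1.0 + cagr) ** years

def years_to_reach(present, future, cagr):
    """Invert the formula: n = ln(future / present) / ln(1 + r)."""
    return math.log(future / present) / math.log(1.0 + cagr)

# Fortune Business Insights figures: $4.4B growing at a 44.8% CAGR toward $84B.
print(f"{years_to_reach(4.4, 84.0, 0.448):.1f} years")   # ~8.0 years
print(f"${project(4.4, 0.448, 8):,.1f}B after 8 years")  # ~$85B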

Therefore, looking at these estimates, it's clear that virtual reality has a bright future ahead of it, despite the bloodbath in technology stocks this year. Today's piece will look at the key players in the industry, and some well-known firms on the list are Advanced Micro Devices, Inc. (NASDAQ:AMD), Meta Platforms, Inc. (NASDAQ:META), and Microsoft Corporation (NASDAQ:MSFT).

Photo by mahdis mousavi on Unsplash

Our Methodology

We took a look at the virtual reality industry and current trends to pick out firms that are offering creative products and services in the space. We preferred companies with strong financial performance, technological advantages, and relevance to current industry dynamics. These stocks are ranked via hedge fund sentiment gathered through Insider Monkey's 895-fund survey for this year's second quarter.

11. Tencent Holdings Limited (OTCMKTS:TCEHY)

Number of Hedge Fund Holders: 2

Tencent Holdings Limited (OTCMKTS:TCEHY) is a Chinese conglomerate that holds stakes in several companies, including a large minority stake in the video game developer Epic Games. The firm is headquartered in Shenzhen, the People's Republic of China.

Epic Games is one of the most well known game developers in the world, which rose to fame due to its Fortnite gaming brand. The company, like other game developers, is also targeting the metaverse industry which is seeing strong interest from large firms. Sony and The Lego Group invested a whopping $2 billion in Epic Games in 2022 to spur metaverse development.

Additionally, Epic Games' Unreal Engine, which is used by video game developers to develop their products, is capable of developing assets that support 3D visualization and augmented and virtual realities. Insider Monkey's Q2 2022 survey of 895 hedge funds revealed that two had invested in Tencent Holdings Limited (OTCMKTS:TCEHY).

Along with Meta Platforms, Inc. (NASDAQ:META), Advanced Micro Devices, Inc. (NASDAQ:AMD), and Microsoft Corporation (NASDAQ:MSFT), Tencent Holdings Limited (OTCMKTS:TCEHY) is a top virtual reality stock.

10. MicroVision, Inc. (NASDAQ:MVIS)

Number of Hedge Fund Holders: 4

MicroVision, Inc. (NASDAQ:MVIS) is an American company that develops sensors used in automobiles. Additionally, it develops a scanning technology that enables the creation of large images for a full field of view, as well as display concepts, designs, and modules that are used in augmented and virtual reality headsets. The firm is headquartered in Redmond, Washington.

MicroVision, Inc. (NASDAQ:MVIS)'s lidar systems scored a big win in September 2022, when chip giant NVIDIA Corporation announced that the MAVIN DR dynamic view lidar would be supported by NVIDIA's DRIVE AGX platform. This will improve highway safety for vehicles.

By the end of its second fiscal quarter, MicroVision, Inc. (NASDAQ:MVIS) had $93 million in cash, which is important given the company's weak operating income profile. The firm has invested some of this into treasury securities, and its latest quarterly operating costs stood at $9.7 million, giving it plenty of runway. Four out of the 895 hedge funds polled by Insider Monkey for their June quarter of 2022 portfolios had invested in the company.

MicroVision, Inc. (NASDAQ:MVIS)'s largest investor in our database is Daniel S. Och's Sculptor Capital which owns 572,200 shares that are worth $2.1 million.

9. Matterport, Inc. (NASDAQ:MTTR)

Number of Hedge Fund Holders: 7

Matterport, Inc. (NASDAQ:MTTR) is an American company that caters to the front end of the virtual reality space. Its software applications allow developers to capture the depth and imagery of a physical space to create a virtual reality environment. The firm is headquartered in Sunnyvale, California.

Matterport, Inc. (NASDAQ:MTTR) reported a strong second fiscal quarter earlier this year, which, despite negative revenue growth, saw the firm expand its presence in the market. At earnings, the firm announced that its subscribers grew by a massive 52% annually to stand at 616,000 during the quarter.

Matterport, Inc. (NASDAQ:MTTR) also counts some of the largest companies in the world as its customers, with firms such as Procter & Gamble, Sealy, and Netflix among the 23% of Fortune 1000 firms that use the company's products. Additionally, the firm's latest quarter saw it grow its services revenue by 74% and its subscription revenue by 20%.

Insider Monkey took a look at 895 hedge funds for their second quarter of 2022 holdings to discover that 7 had invested in Matterport, Inc. (NASDAQ:MTTR).

Matterport, Inc. (NASDAQ:MTTR)'s largest investor is Chase Coleman and Feroz Dewan's Tiger Global Management LLC which owns 3.6 million shares that are worth $13 million.

8. Unity Software Inc. (NYSE:U)

Number of Hedge Fund Holders: 23

Unity Software Inc. (NYSE:U) is a software platform provider whose products allow its customers to develop 2D and 3D content for a wide variety of gadgets and devices such as smartphones, tablets, computers, gaming consoles, and virtual and augmented reality platforms. The firm is headquartered in San Francisco, California.

Unity Software Inc. (NYSE:U) is also aggressively targeting growth, with its research and development expenses during its second fiscal quarter representing close to 73% of its revenue. This opens up a large opportunity for explosive growth in the future, should these investments bear fruit.

Needham set a $50 share price target for the company in October 2022, stating that its software platform is one of the best in the world and will benefit from the strong growth in the demand for 3D content. 23 out of the 895 hedge funds polled by Insider Monkey during the second quarter of this year had invested in Unity Software Inc. (NYSE:U).

Out of these, Jim Davidson, Dave Roux, and Glenn Hutchins's Silver Lake Partners is Unity Software Inc. (NYSE:U)'s largest investor. It owns 34 million shares that are worth $1.2 billion.

7. Roblox Corporation (NYSE:RBLX)

Number of Hedge Fund Holders: 26

Roblox Corporation (NYSE:RBLX) develops and operates an online platform whose studio tools allow developers to create and run virtual 3D environments. The firm is headquartered in San Mateo, California, in the United States.

Roblox Corporation (NYSE:RBLX) posted record high revenues of $600 million in its second fiscal quarter, which enabled it to cross $1 billion in revenue for the first half of this year. The company's intense focus on its products led it to spend months developing a voice chat feature before finally rolling it out to users. Additionally, it has a creative advertising strategy, placing ads in unique environments that let users interact with them and then make potential purchases.

Roblox Corporation (NYSE:RBLX)'s platforms are also attractive for advertisers, since they provide a large base of young users who are yet to cement their buying preferences. Needham reduced the company's share price target to $53 from $55 in September 2022, stating that its advertising platform is one of a kind. Insider Monkey's Q2 2022 survey of 895 hedge funds saw 26 holding a stake in the company.

Roblox Corporation (NYSE:RBLX)'s largest investor is Jim Simons' Renaissance Technologies which owns 11.5 million shares that are worth $380 million.

6. Sony Group Corporation (NYSE:SONY)

Number of Hedge Fund Holders: 26

Sony Group Corporation (NYSE:SONY) is a Japanese multinational conglomerate that designs and sells consumer electronics products and owns video game development platforms. The company is headquartered in Tokyo, Japan.

Sony Group Corporation (NYSE:SONY) operates on the hardware side of the virtual reality ecosystem, as it designs and sells the PlayStation PS VR headset. The headset has two display modes, 3D and 2D. The former lets users view content in HDR at 90Hz or 120Hz, and the latter lets them play games in HDR at 24Hz, 60Hz, and 120Hz.

When compared to some other virtual reality companies that have weak financials, Sony Group Corporation (NYSE:SONY) is an established player that has sold millions of units of its gaming consoles and brings in close to $100 billion in revenue each year. By the end of this year's second quarter, 26 of the 895 hedge funds surveyed by Insider Monkey had bought the company's shares.

Out of these, Sony Group Corporation (NYSE:SONY)'s largest investor is Mario Gabelli's GAMCO Investors which owns 1.7 million shares that are worth $146 million.

Along with Advanced Micro Devices, Inc. (NASDAQ:AMD), Meta Platforms, Inc. (NASDAQ:META), and Microsoft Corporation (NASDAQ:MSFT), Sony Group Corporation (NYSE:SONY) is a VR stock you must look at.

Click to continue reading and see 5 Best Virtual Reality Stocks to Buy.


Disclosure: None. 11 Best Virtual Reality Stocks to Buy was originally published on Insider Monkey.

Augmented Reality Must Live Up To Its Name

For artists, technologists, engineers, advertisers and dreamers, augmented reality (AR) is the holy grail of digital experience. This tech promises to make magic real: to manifest whatever we can imagine in physical space.

But we're not there yet. Today, most of what is called AR is not worthy of the name. Rather than being an augmentation of reality, it is a poor facsimile of a powerful idea.

So much more is possible.

In the past few weeks, the world has woken up to the boundless potential of AR to transform how we live, learn, work and interact. Apple's CEO Tim Cook says we will end up wondering how we ever lived without it.

But as we take the first steps into this bright future, it is more critical than ever that we wake up to a fundamental truth: No matter how vivid our digital creations, AR will fall short of its full promise unless and until those creations can be accurately placed in the real world and, more importantly, fully shared with others.

It's Not 'Real' if It Can't Be Shared

Imagination is hard-wired into the human psyche. From early childhood, we embellish our outer worlds with elements of our inner lives. But since there is no way for those around us to tap into those private imaginings, they remain wholly subjective and unverifiable.

Whether or not a sensory experience is shared by others has a critical impact on whether we ourselves believe it to be real. If you are the only one in a crowded room to hear a whispering voice, you will feel isolated and strange. You may start to question your own perception — perhaps even your sanity.

But if others around you say they've heard it too, you're back on solid ground. What you've experienced is valid and therefore must be real.

This is what is known as intersubjectivity, the process of sharing knowledge and experiences with others.

Today, the vast majority of AR tech does not support intersubjective experiences. Indeed, it is often little more than gimmicky filters on our solitary devices that are difficult to share.

If I conjure up a fire-breathing dragon in my back garden, there is no way for me to photograph myself with it or to impress you with the breadth of its wingspan. And if I can't share the magic, it becomes no more satisfying than watching a YouTube video that can't be shared or scrolling through Facebook alone.

Shared Magic Is Real Magic

And while AR does have the potential to work real magic — to port the products of our imaginations into the physical world — the examples we have access to today are often no more remarkable than the artificial backgrounds on Zoom.

If we want AR to enable a true augmentation of reality, we need to use tech that supports shared digital experiences in the physical world.

Given AR's potential to transform everything from how we train fighter pilots to how doctors collaborate on cases, it is crucial that we address the issue of shared AR now, or important interactive experiences will not be possible.

Positioning, Positioning, Positioning

The answer is surprisingly simple. It all boils down to precise positioning.

Many assume that objects and environments that exist in AR are automatically anchored in a fixed location and that it should be easy for multiple people to experience the same things in the same places. The truth is this is never the case.

There are apps that offer rough estimates of where AR objects are placed in physical spaces, but these are nowhere near accurate enough. You and your friend may be viewing the same AR unicorn in all its sparkling detail. But while your device may show it standing solidly by the door, hers may show it floating near the ceiling.

When positioning fidelity is this low, intersubjectivity simply isn't possible. While you may be together, your experience cannot meaningfully be described as a shared reality.

This shortcoming becomes particularly jarring when you and a friend or colleague try to engage in a shared physical activity involving digital equipment. Virtual tennis is an impossibility when the ball is in one place for you and somewhere else for your opponent. The same goes for racing digital cars. The list goes on.

Precise Location Is Key to Shared Experiences in AR

The reason it's been so difficult to position AR objects in physical space until now is that our mobile devices don't share a consistent, precise coordinate system.

It's true that smartphones come equipped with GPS, which does make it possible to establish shared geographical parameters to some degree. But for a host of reasons, GPS is far from exact enough for true intersubjectivity.

GPS may be able to establish that an AR object is in a given house, but not whether it is in the bedroom or bathroom. Never mind whether it is sitting on top of a table or under it.

The logical solution to this would be a more precise version of GPS. That, however, would mean a system that is completely unaffected by those factors that hinder GPS fidelity, which range from signal blockage by physical obstacles to poor weather or even solar storms. Smartphone GPS is usually accurate to within a 4.9-meter radius, but only under a clear sky and away from buildings, bridges and trees.

The near-term fix for AR's location problem is much simpler, and billions of dollars less expensive. Rather than spending years on creating a hyper-accurate coordinate system, we should move away from geographical anchors altogether.

Instead of two devices trying to pinpoint their respective locations on a map, they merely need to establish where they are relative to one another. In other words, rather than relying on a fixed set of coordinates, devices should be equipped with technology that can create shared, one-off coordinate systems on an as-needed basis.

Say you and I want to race our digital Ferraris along a beach. With this technology, all we'd need to do is synchronize our devices so they "agree" on their relative positions. Once they have an accurate sense of where they are in an ephemeral space, shared reality is possible.
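Auki Labs hasn't published its protocol here, but the geometric core of that synchronization step can be sketched with ordinary rigid transforms: if both devices know their pose relative to one mutually observed anchor, either one can express the other's coordinates in its own frame with a single matrix product. The poses and names below are invented for illustration.

import numpy as np

def pose(yaw_deg, tx, ty, tz):
    """A 4x4 rigid transform: rotation about the vertical (z) axis plus
    translation. Enough for a flat-ground, two-phone example."""
    c, s = np.cos(np.radians(yaw_deg)), np.sin(np.radians(yaw_deg))
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
    T[:3, 3] = [tx, ty, tz]
    return T

# Hypothetical poses of each device in the shared anchor's frame
# (say, both phones observed the same visual marker):
anchor_T_a = pose(yaw_deg=30, tx=2.0, ty=0.0, tz=1.4)    # device A
anchor_T_b = pose(yaw_deg=-45, tx=-1.0, ty=3.0, tz=1.5)  # device B

# Device B's pose expressed in device A's frame:
a_T_b = np.linalg.inv(anchor_T_a) @ anchor_T_b

# A can now place a point from B's world (say, B's digital Ferrari)
# correctly in its own view:
ferrari_in_b = np.array([0.0, 0.5, -0.2, 1.0])  # homogeneous point in B's frame
print(np.round(a_T_b @ ferrari_in_b, 2))

Once that one-off transform is agreed, both devices render the same object at the same physical spot without either ever needing its global coordinates.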

The larger-scale, more complex AR environments I foresee in the future may well one day require a universal 3D positioning system that uses powerful consensus algorithms and persistent location anchors.

But for today's augmented reality to be more than a buzzword, we need to focus on precise positioning and the technologies we can use right now to precisely share location and invite others into our enhanced reality. With these tools, we can transform AR from a gimmick into a technology that enhances all of our lives.

(Johannes Davidsson is the Head of Business Development at Auki Labs, an AR tech company creating a decentralized protocol for collaborative spatial computing.)

Augmented reality glasses can provide information on objects. (AFP / Josep LAGO)
Save $350 on this Fitness Reality 4000MR Magnetic Rower for Prime Day

Save a bundle on great fitness equipment this Prime Day, including $350 off this Fitness Reality 4000MR Magnetic Rower, a massive 41% reduction on the full price. The 4000MR offers 10 preset workout programs and 5 customizable programs to keep you busy, and has a chain-driven dual transmission mechanism to provide the strength needed for an intense workout.

The large contoured cushion seat (13.5” L x 10” W) provides extra comfort for a long workout, and the raised seat height of up to 22.5” makes it easy to get on and off the rower. Ball-bearing seat rollers make for smooth rowing strokes, and the highly visible backlit 5” LCD displays distance, time, count, calories burned, RPM, watts, and tension levels to help you track your progress. This rower is suitable for users up to 6’5” and 300 lbs, making it a perfect addition to any home gym.