11 Best Virtual Reality Stocks to Buy

In this piece, we will take a look at the 11 best virtual reality stocks to buy. For more stocks, head on over to 5 Best Virtual Reality Stocks to Buy.

Advances in semiconductor fabrication and manufacturing have enabled chip makers to squeeze unthinkable amounts of computing power into pieces of silicon the size of a human thumbnail. This growth has also spurred industries of its own, and one such sector is the virtual reality segment of the broader technology industry.

Virtual reality, as the name suggests, refers to technologies that create an artificial representation of reality for users to immerse themselves into - whether for entertainment or productivity needs. This is achieved through headsets, processors, and software, with different companies providing different technologies for the processes.

The virtual reality industry was estimated to be worth $4.4 billion in 2020, and through a massive compounded annual growth rate (CAGR) of 44.8%, the segment could be worth a whopping $84 billion in 2029, according to a research report from Fortune Business Insights. Driving this growth will be several factors, such as the demand for virtual training platforms that let firms prepare their employees for complex tasks without investing in physical infrastructure. This allows companies in industries such as automobile manufacturing to reduce worker injuries and conduct factory personnel training safely.
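As a quick sanity check, projections like these follow the standard compound-growth formula FV = PV × (1 + r)^n. A minimal sketch in Python; note that the quoted endpoint lines up only if we assume the 44.8% rate compounds over the eight annual periods from 2021 through 2029, which is an interpretation on our part rather than something the report states:

```python
def project_market_size(present_value_bn: float, cagr: float, years: int) -> float:
    """Project a market size forward using compound annual growth."""
    return present_value_bn * (1 + cagr) ** years

# $4.4B base, 44.8% CAGR, eight compounding periods (2021 -> 2029)
projected = project_market_size(4.4, 0.448, 8)
print(f"${projected:.0f}B")  # $85B, in line with the ~$84 billion estimate
```

The same formula applied to the Valuates figures ($14 billion at a 41% CAGR over a decade) lands in the same neighborhood as that report's $454 billion estimate.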

Another research report, this time from Valuates Reports, analyzes both the virtual and augmented reality markets. Augmented reality is a subset of virtual reality that serves as a 'bolt on' to existing reality instead of rendering a completely new environment. This research firm believes that the markets were worth $14 billion in 2020 and through a strong CAGR of 41%, they will grow to sit at $454 billion by the end of 2030.

Therefore, looking at these estimates, it's clear that virtual reality has a bright future ahead of it, despite the bloodbath in technology stocks this year. Today's piece will look at the key players in the industry and some well known firms in the list are Advanced Micro Devices, Inc. (NASDAQ:AMD), Meta Platforms, Inc. (NASDAQ:META), and Microsoft Corporation (NASDAQ:MSFT).

Photo by mahdis mousavi on Unsplash

Our Methodology

We took a look at the virtual reality industry and current trends to pick out which firms are currently offering creative products and services in the industry. We preferred companies with strong financial performance, technological advantages, and relevance to current industry dynamics. These stocks are ranked via hedge fund sentiment gathered through Insider Monkey's 895 fund survey for this year's second quarter.

11. Tencent Holdings Limited (OTCMKTS:TCEHY)

Number of Hedge Fund Holders: 2

Tencent Holdings Limited (OTCMKTS:TCEHY) is a Chinese conglomerate that holds stakes in several companies, including a major stake in the video game developer Epic Games. The firm is headquartered in Shenzhen, the People's Republic of China.

Epic Games is one of the most well known game developers in the world, which rose to fame due to its Fortnite gaming brand. The company, like other game developers, is also targeting the metaverse industry which is seeing strong interest from large firms. Sony and The Lego Group invested a whopping $2 billion in Epic Games in 2022 to spur metaverse development.

Additionally, Epic Games' Unreal Engine, which is used by video game developers to develop their products, is capable of developing assets that support 3D visualization and augmented and virtual realities. Insider Monkey's Q2 2022 survey of 895 hedge funds revealed that two had invested in Tencent Holdings Limited (OTCMKTS:TCEHY).

Along with Meta Platforms, Inc. (NASDAQ:META), Advanced Micro Devices, Inc. (NASDAQ:AMD), and Microsoft Corporation (NASDAQ:MSFT), Tencent Holdings Limited (OTCMKTS:TCEHY) is a top virtual reality stock.

10. MicroVision, Inc. (NASDAQ:MVIS)

Number of Hedge Fund Holders: 4

MicroVision, Inc. (NASDAQ:MVIS) is an American company that develops sensors used in automobiles. Additionally, it also develops a scanning technology that enables the creation of large images for a full field of view. It also develops displays concepts, designs, and modules that are used in augmented and virtual reality headsets. The firm is headquartered in Redmond, Washington.

MicroVision, Inc. (NASDAQ:MVIS)'s lidar systems scored a big win in September 2022, when chip giant NVIDIA Corporation announced that the MAVIN DR dynamic view system would be supported by NVIDIA's Drive AGX platform. This will improve highway safety for vehicles.

By the end of its second fiscal quarter, MicroVision, Inc. (NASDAQ:MVIS) had $93 million in cash, which is important given the company's weak operating income profile. The firm has invested some of this into treasury securities, and its latest quarterly operating costs stood at $9.7 million - giving it plenty of runway room. Four out of the 895 hedge funds polled by Insider Monkey for their June quarter of 2022 portfolios had invested in the company.

MicroVision, Inc. (NASDAQ:MVIS)'s largest investor in our database is Daniel S. Och's Sculptor Capital which owns 572,200 shares that are worth $2.1 million.

9. Matterport, Inc. (NASDAQ:MTTR)

Number of Hedge Fund Holders: 7

Matterport, Inc. (NASDAQ:MTTR) is an American company that caters to the front end of the virtual reality space. Its software applications allow developers to capture the depth and imagery of a physical space to create a virtual reality environment. The firm is headquartered in Sunnyvale, California.

Matterport, Inc. (NASDAQ:MTTR) reported a strong second fiscal quarter earlier this year which, despite negative revenue growth, saw the firm expand its presence in the market. In its earnings release, the firm announced that its subscribers grew by a massive 52% annually to stand at 616,000 during the quarter.

Matterport, Inc. (NASDAQ:MTTR) also counts some of the largest companies in the world as its customers, with firms such as Procter & Gamble, Sealy, and Netflix part of the 23% of the Fortune 1000 firms that use the company's products. Additionally, the firm's latest quarter also saw it grow its services revenue by 74% and its subscription revenue by 20%.

Insider Monkey took a look at 895 hedge funds for their second quarter of 2022 holdings to discover that 7 had invested in Matterport, Inc. (NASDAQ:MTTR).

Matterport, Inc. (NASDAQ:MTTR)'s largest investor is Chase Coleman and Feroz Dewan's Tiger Global Management LLC which owns 3.6 million shares that are worth $13 million.

8. Unity Software Inc. (NYSE:U)

Number of Hedge Fund Holders: 23

Unity Software Inc. (NYSE:U) is a software platform provider whose products allow its customers to develop 2D and 3D content for a wide variety of gadgets and devices such as smartphones, tablets, computers, gaming consoles, and virtual and augmented reality platforms. The firm is headquartered in San Francisco, California.

Unity Software Inc. (NYSE:U) is also aggressively targeting growth, with its research and development expenses during its second fiscal quarter representing close to 73% of its revenue. This opens up a large opportunity for explosive growth in the future, should these investments bear fruit.

Needham set a $50 share price target for the company in October 2022, stating that its software platform is one of the best in the world and will benefit from the strong growth in the demand for 3D content. 23 out of the 895 hedge funds polled by Insider Monkey during the second quarter of this year had invested in Unity Software Inc. (NYSE:U).

Out of these, Jim Davidson, Dave Roux, and Glenn Hutchins's Silver Lake Partners is Unity Software Inc. (NYSE:U)'s largest investor. It owns 34 million shares that are worth $1.2 billion.

7. Roblox Corporation (NYSE:RBLX)

Number of Hedge Fund Holders: 26

Roblox Corporation (NYSE:RBLX) is an online entertainment platform operator and developer whose studio allows developers to create and operate virtual 3D environments. The firm is headquartered in San Mateo, California, the United States.

Roblox Corporation (NYSE:RBLX) posted record high revenues of $600 million in its second fiscal quarter, which enabled it to cross $1 billion in revenue for the first half of this year. The company's intense focus on its products saw it develop its voice chat feature for months before finally rolling it out to users. Additionally, it has a creative advertising strategy, which creates a unique environment that lets users interact with the ad and then make potential purchases.

Roblox Corporation (NYSE:RBLX)'s platforms are also attractive for advertisers since they provide a large user base of young users that are yet to cement their buying preferences. Needham reduced the company's share price target to $53 from $55 in September 2022, stating that its advertising platform is one of a kind. Insider Monkey's Q2 2022 895 hedge fund survey saw 26 having held a stake in the company.

Roblox Corporation (NYSE:RBLX)'s largest investor is Jim Simons' Renaissance Technologies which owns 11.5 million shares that are worth $380 million.

6. Sony Group Corporation (NYSE:SONY)

Number of Hedge Fund Holders: 26

Sony Group Corporation (NYSE:SONY) is a Japanese multinational conglomerate that designs and sells consumer electronics products and owns video game development platforms. The company is headquartered in Tokyo, Japan.

Sony Group Corporation (NYSE:SONY) operates on the hardware side of the virtual reality ecosystem, as it designs and sells the PlayStation VR headset. This headset has two modes: a 3D mode that lets users view content in HDR at 90Hz or 120Hz refresh rates, and a 2D mode that lets them play games in HDR at 24Hz, 60Hz, and 120Hz.

When compared to some other virtual reality companies that have weak financials, Sony Group Corporation (NYSE:SONY) is an established player that has sold millions of units of its gaming consoles and brings in close to $100 billion in revenue each year. By the end of this year's second quarter, 26 of the 895 hedge funds surveyed by Insider Monkey had bought the company's shares.

Out of these, Sony Group Corporation (NYSE:SONY)'s largest investor is Mario Gabelli's GAMCO Investors which owns 1.7 million shares that are worth $146 million.

Along with Advanced Micro Devices, Inc. (NASDAQ:AMD), Meta Platforms, Inc. (NASDAQ:META), and Microsoft Corporation (NASDAQ:MSFT), Sony Group Corporation (NYSE:SONY) is a VR stock you must look at.

Click to continue reading and see 5 Best Virtual Reality Stocks to Buy.


Disclosure: None. 11 Best Virtual Reality Stocks to Buy was originally published on Insider Monkey.

Fri, 14 Oct 2022 05:37:00 -0500
Augmented Reality Must Live Up To Its Name

For artists, technologists, engineers, advertisers and dreamers, augmented reality (AR) is the holy grail of digital experience. This tech promises to make magic real: to manifest whatever we can imagine in physical space.

But we're not there yet. Today, most of what is called AR is not worthy of the name. Rather than being an augmentation of reality, it is a poor facsimile of a powerful idea.

So much more is possible.

In the past few weeks, the world has woken up to the boundless potential of AR to transform how we live, learn, work and interact. Apple's CEO Tim Cook says we will end up wondering how we ever lived without it.

But as we take the first steps into this bright future, it is more critical than ever that we wake up to a fundamental truth: No matter how vivid our digital creations, AR will fall short of its full promise unless and until these creations can be accurately placed in the real world and, more importantly, fully shared with others.

It's Not 'Real' if It Can't Be Shared

Imagination is hard-wired into the human psyche. From early childhood, we embellish our outer worlds with elements of our inner lives. But since there is no way for those around us to tap into those private imaginings, they remain wholly subjective and unverifiable.

Whether or not a sensory experience is shared by others has a critical impact on whether we ourselves believe it to be real. If you are the only one in a crowded room to hear a whispering voice, you will feel isolated and strange. You may start to question your own perception — perhaps even your sanity.

But if others around you say they've heard it too, you're back on solid ground. What you've experienced is valid and therefore must be real.

This is what is known as intersubjectivity, the process of sharing knowledge and experiences with others.

Today, the vast majority of AR tech does not support intersubjective experiences. Indeed, it is often little more than gimmicky filters on our solitary devices that are difficult to share.

If I conjure up a fire-breathing dragon in my back garden, there is no way for me to photograph myself with it or to impress you with the breadth of its wingspan. And if I can't share the magic, it becomes no more satisfying than watching a YouTube video that can't be shared or scrolling through Facebook alone.

Shared Magic Is Real Magic

And while AR does have the potential to work real magic — to port the products of our imaginations into the physical world — the examples we have access to today are often no more remarkable than the artificial backgrounds on Zoom.

If we want AR to enable a true augmentation of reality, we need to use tech that supports shared digital experiences in the physical world.

Given AR's potential to transform everything from how we train fighter pilots to how doctors collaborate on cases, it is crucial that we address the issue of shared AR now or important interactive experiences will not be possible.

Positioning, Positioning, Positioning

The answer is surprisingly simple. It all boils down to precise positioning.

Many assume that objects and environments that exist in AR are automatically anchored in a fixed location and that it should be easy for multiple people to experience the same things in the same places. The truth is this is never the case.

There are apps that offer rough estimates of where AR objects are placed in physical spaces but these are nowhere near accurate enough. You and your friend may be viewing the same AR unicorn in all its sparkling detail. But while your device may show it standing solidly by the door, hers may show it floating near the ceiling.

When positioning fidelity is this low, intersubjectivity simply isn't possible. While you may be together, your experience cannot meaningfully be described as a shared reality.

This shortcoming becomes particularly jarring when you and a friend or colleague try to engage in a shared physical activity involving digital equipment. Virtual tennis is an impossibility when the ball is in one place for you and somewhere else for your opponent. The same goes for racing digital cars. The list goes on.

Precise Location Is Key to Shared Experiences in AR

The reason it's been so difficult to position AR objects in physical space until now is that our mobile devices don't share a consistent, precise coordinate system.

It's true that smartphones come equipped with GPS, which does make it possible to establish shared geographical parameters to some degree. But for a host of reasons, GPS is far from exact enough for true intersubjectivity.

GPS may be able to establish that an AR object is in a given house, but not whether it is in the bedroom or bathroom. Never mind whether it is sitting on top of a table or under it.

The logical solution to this would be a more precise version of GPS. That, however, would mean a system that is completely unaffected by those factors that hinder GPS fidelity, which range from signal blockage by physical obstacles to poor weather or even solar storms. Smartphone GPS is usually accurate to within a 4.9-meter radius, but only under a clear sky and away from buildings, bridges and trees.

The near-term fix for AR's location problem is much simpler, and billions of dollars less expensive. Rather than spending years on creating a hyper-accurate coordinate system, we should move away from geographical anchors altogether.

Instead of two devices trying to pinpoint their respective locations on a map, they merely need to establish where they are relative to one another. In other words, rather than relying on a fixed set of coordinates, devices should be equipped with technology that can create shared, one-off coordinate systems on an as-needed basis.

Say you and I want to race our digital Ferraris along a beach. With this technology, all we'd need to do is synchronize our devices so they "agree" on their relative positions. Once they have an accurate sense of where they are in an ephemeral space, shared reality is possible.
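The idea above can be made concrete: if each device can estimate the pose (rotation and position) of one shared reference point, then composing one device's view of that anchor with the inverse of the other's yields the transform between the two device frames, with no global map needed. A minimal sketch with NumPy; the anchor poses, object position, and axis conventions here are made-up illustrative values, not any real AR API:

```python
import numpy as np

def pose_to_matrix(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector translation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def relative_transform(anchor_in_a: np.ndarray, anchor_in_b: np.ndarray) -> np.ndarray:
    """Transform mapping points from device B's frame into device A's frame.

    Both devices observe the same physical anchor, so composing A's view of the
    anchor with the inverse of B's view yields the B -> A mapping.
    """
    return anchor_in_a @ np.linalg.inv(anchor_in_b)

# Illustrative poses: device A sees the anchor 2 m ahead (+z); device B sees it 1 m to its right (+x).
anchor_in_a = pose_to_matrix(np.eye(3), np.array([0.0, 0.0, 2.0]))
anchor_in_b = pose_to_matrix(np.eye(3), np.array([1.0, 0.0, 0.0]))

b_to_a = relative_transform(anchor_in_a, anchor_in_b)

# A virtual object placed at B's origin lands 2 m ahead of A and 1 m to A's left.
obj_in_b = np.array([0.0, 0.0, 0.0, 1.0])
print(b_to_a @ obj_in_b)  # [-1.  0.  2.  1.]
```

Once both devices agree on this transform, every shared object only needs to be expressed in one frame; the other device converts on the fly.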

The larger-scale, more complex AR environments I foresee in the future may well one day require a universal 3D positioning system that uses powerful consensus algorithms and persistent location anchors.

But for today's augmented reality to be more than a buzzword, we need to focus on precise positioning and the technologies we can use right now to precisely share location and invite others into our enhanced reality. With these tools, we can transform AR from a gimmick into a technology that enhances all of our lives.

(Johannes Davidsson is the Head Of Business Development at Auki Labs, an AR tech company creating a decentralized protocol for collaborative spatial computing.)

The augmented reality glasses can provide information on objects AFP / Josep LAGO
Thu, 13 Oct 2022 20:39:00 -0500
Behind the Doors of Meta's Top-Secret Reality Labs

Mark Zuckerberg sat across from me, controlling objects on a screen with small motions of his fingers. Taps, glides, pinches. On his wrist was a chunky band that looked like an experimental smartwatch: It's Meta's vision of our future interactions with AR, VR, computers and just about everything else.

"It'll work well for glasses…I think it'll actually work for everything. I think in the future, people will use this to control their phones and computers, and other stuff…you'll just have a little band around your wrist," Zuckerberg said, right before he demoed the neural wristband. His hand and finger movements seemed subtle, almost fidgety. Sometimes nearly invisible.

Neural input devices are just one part of Meta's strategy beyond VR, and these wristbands were among the tech I got to see and try during a first-ever visit to Meta's Reality Labs headquarters in Redmond, Washington. The trip was the first time Meta's invited journalists to visit its future tech research facility, located in a handful of nondescript office buildings far north of Facebook's Silicon Valley headquarters. 

Entrance to a red building

Entering Meta Reality Labs in Redmond, Washington.

Scott Stein/CNET

The last time I visited Redmond, I was trying Microsoft's HoloLens 2. My trip to Meta was a similar experience. This time, I was demoing the Meta Quest Pro, a headset that blends VR and AR together into one device and aims to kick off Zuckerberg's ambitions to a more work-focused metaverse strategy. 

Meta's Connect conference news is focused on the Quest Pro, and also on new work partnerships with companies like Microsoft, Zoom, Autodesk and Accenture, targeting ways for Meta to maybe dovetail with Microsoft's mixed reality ambitions.

I also got to look at a handful of experimental research projects that aren't anywhere near ready for everyday use but show glimpses of exactly what Meta's shooting for next. These far-off projects, and a more-expensive Quest Pro headset, come at a strange time for Meta, a company that's already spent billions investing in the future of the metaverse, and whose most popular VR headset, the Quest 2, has sold fewer than 20 million units. It feels like the future isn't fully here yet, but companies like Meta are ready for it to be.

I experienced a number of mind-bending demos with a handful of other invited journalists. It felt like I was exploring Willy Wonka's chocolate factory. But I also came away with the message that, while the Quest Pro looks like the beginning of a new direction for Meta's hardware, it's nowhere close to the end goal.

Researchers wearing prototype wristbands controlling a video game

A demo of EMG wristbands measuring motor neurons, at Meta Reality Labs Research


Neural inputs: Wristbands that adapt to you

"Co-adaptive learning," Michael Abrash, Meta's Reality Labs' chief scientist, told me over and over again. He was describing the wristbands that Meta has discussed multiple times since acquiring CTRL-Labs in 2019. It's a hard concept to fully absorb, but Meta's demo, shown by a couple of trained researchers, gave me some idea of it. Wearing the bulky wristbands wired to computers, the wearers moved their fingers to make a cartoon character swipe back and forth in an endless-running game. Then, their movements seemed to stop. They became so subtle that their hands barely twitched, and still they played the game. The wristbands use EMG, or electromyography (the electrical measurement of muscles) to measure tiny muscle impulses.

A feedback-based training process gradually allowed the wearers to start shrinking down their actions, eventually using only a single motor neuron, according to Thomas Reardon, Reality Labs' Director of Neuromotor Interfaces and former CEO of CTRL-Labs, who talked us through the demos in Redmond. The end result looks a little like mind reading, but it's done by subtly measuring electrical impulses showing an intent to move. 
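Meta hasn't published its decoding pipeline, but the classic EMG signal chain the demo builds on (rectify the raw signal, smooth it into an envelope, and fire an input event when the envelope crosses a threshold) can be sketched in a few lines. The sample traces and threshold below are fabricated for illustration, not real sensor data:

```python
import numpy as np

def emg_envelope(signal: np.ndarray, window: int = 5) -> np.ndarray:
    """Rectify raw EMG and smooth it into an amplitude envelope with a moving average."""
    rectified = np.abs(signal)
    kernel = np.ones(window) / window
    return np.convolve(rectified, kernel, mode="same")

def detect_intent(signal: np.ndarray, threshold: float = 0.5) -> bool:
    """Flag an input event when smoothed muscle activity crosses the threshold."""
    return bool(np.any(emg_envelope(signal) > threshold))

# Synthetic traces: near-silence at rest vs. a brief burst of muscle activity.
rest = 0.05 * np.sin(np.linspace(0, 10, 50))
burst = rest.copy()
burst[20:30] += np.array([0.0, 0.8, -0.9, 1.0, -1.0, 0.9, -0.8, 0.7, -0.6, 0.5])

print(detect_intent(rest))   # False
print(detect_intent(burst))  # True
```

The co-adaptive part would live on top of this: the threshold and features shrink over time as the wearer learns to produce smaller, cleaner impulses, which is what the demo's feedback loop was showing.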

Mark Zuckerberg demos a neural input wristband with a computer

Mark Zuckerberg using the EMG wristband in a demo in front of a handful of journalists during my visit. 


When Zuckerberg demonstrated the wristband, he used a similar set of subtle motions, though they were more visible. The wristband's controls feel similar to a touch-based trackpad or air mouse, able to identify pressure-based pinches, swipes and gestures.

"In the long run, we're going to want to have an interface that is as natural and intuitive as dealing with the physical world," Abrash said, describing where EMG and neural input tech is aiming. 

Typing isn't on the table yet. According to Zuckerberg, it would require more bandwidth to get to that speed and fidelity: "Right now the bit rate is below what you would get for typing quickly, but the first thing is just getting it to work right." The goal, at some point, is to make the controls do more. Meta sees this tech as truly arriving in maybe five to six years, which feels like an eternity. But it will likely line up, should that timeframe hold, with where Meta sees its finalized AR glasses becoming available.

Someone wearing a neural input wristband

The EMG wristband looks like a large prototype smartwatch, with sensors in the segmented strap.

Scott Stein/CNET

Zuckerberg says the wristbands are key for glasses, since we won't want to carry controllers around, and voice and hand tracking aren't good enough. But eventually he plans to make these types of controls work for any device at all, VR or otherwise. 

The controls look like they'll involve an entirely different type of input language, one that might have similarities to existing controls on phones or VR controllers, but which will adapt over time to a person's behavior. It seems like it would take a while to learn to use. 

"Most people are going to know a whole lot about how to interact in the world, how to move their bodies," Reardon said to me. "They're going to understand simple systems like letters. So let's meet them there, and then do this thing, this pretty deep idea called co-adaptation, in which a person and a machine are learning together down this path towards what we would call a pure neural interface versus a neural motor interface, which blends neural decoding with motor decoding. Rather than saying there's a new language, I'd say the language evolves between machine and person, but it starts with what people do today."

Demonstrating an EMG wristband in front of a computer screen showing feedback

A demonstration showing how feedback can lead to the wristband sensing smaller and smaller motions.


"The co-adaptation thing is a really profound point," Zuckerberg added. "You don't co-adapt with your physical keyboard. There's a little bit of that in mobile keyboards, where you can misspell stuff and it predicts [your word], but this is a lot more."

I didn't get to wear or try the neural input wristband myself, but I got to watch others using them. Years ago at CES, I did get to briefly try a different type of wrist-worn neural input device for myself, and I got a sense of how technologies like this actually work. It's different from the head-worn device by Nextmind (since acquired by Snap) I tried a year ago, which measured eye movement using brain signals. 

The people using the Meta wristbands seemed to make their movements easily, but these were basic swiping game controls. How would it work for more mission-critical everyday use in everyday AR glasses? Meta's not there yet: According to Zuckerberg, the goal for now is to just get the tech to work, and show how adaptive learning could eventually shrink down response movements. It may be a while before we see this tech in action on any everyday device, but I wonder how Meta could apply the principles to machine learning-assisted types of controls that aren't neural input-based. Could we see refined controllers or hand tracking combinations arrive before this? Hard to tell. But these bands are a far-off bet at the moment, not an around-the-corner possibility.

A man wearing a mask and headphones in a testing room with speakers

I wear a spatially-trackable headset which creates audio effects I can't distinguish from the speakers in the room.


Super-real 3D audio 

A second set of demos I tried, demonstrating next-generation spatial audio, replicated research Meta talked about back in 2020 -- and which it originally planned on showing off in-person before COVID-19 hit. Spatial audio is already widely used in VR headsets, game consoles and PCs, and on a variety of everyday earbuds such as AirPods. What Meta's trying to do is not just have audio that seems like it's coming from various directions, but to project that audio to make it seem like it's literally coming from your physical room space.

A visit to the labs' soundproof anechoic chamber -- a suspended room with foam walls that block reflections of sound waves -- showed us an array of speakers designed to help study how sounds travel to individual ears, and to explore how sounds move in physical spaces. The two demos we tried after that showed how ghostly-real the sounds can feel.

In a soundproof room with a tower full of speakers

Inside Meta's anechoic chamber, where a massive speaker array is used to help create spatial audio profiles.

Scott Stein/CNET

One, where I sat down in a crowded room, involved me wearing microphones in my ears while the project leads moved around me, playing instruments and making noises at different distances. After 40 seconds of recording, the project leads played back the audio to me with over-ear headphones… and parts of it sounded exactly like someone was moving around the room near me. What made it convincing, I think, were the audio echoes: The sense that the movement was reverberating in the room space. 

A second demo had me wearing a 3D spatial-trackable pair of headphones in a room with four speakers. I was asked to identify whether music I heard was coming from the speakers or the headphones. I failed. The music playback flawlessly seemed to project out, and I had to take off the headphones to confirm which was which as I walked around.

According to Michael Abrash's comments back in 2020, this tech isn't as far away from becoming a reality as neural wristbands. Meta's plans are to have phone cameras eventually be able to help tune personal 3D audio, much like Apple just added to its newest AirPods, but with the added benefit of realistic room-mapping. Meta's goal is to have AR projections eventually sound convincingly present in any space: It's a goal that makes sense. A world of holographic objects will need to feel anchored in reality. Although, if future virtual objects sound as convincingly real as my demos were, it might become hard to distinguish real sounds from virtual, which brings up a whole bunch of other existential concerns.
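The rendering step behind demos like these is well established: a dry mono signal is convolved with a left-ear and a right-ear impulse response that together encode direction and room reverberation, which is exactly why the in-room echoes made the playback so convincing. A toy sketch; the impulse responses below are fabricated stand-ins, not measured HRTFs:

```python
import numpy as np

def binauralize(mono: np.ndarray, ir_left: np.ndarray, ir_right: np.ndarray) -> np.ndarray:
    """Render a mono signal to stereo by convolving with per-ear impulse responses."""
    left = np.convolve(mono, ir_left)
    right = np.convolve(mono, ir_right)
    return np.stack([left, right])

# Fabricated impulse responses: the right ear hears the source slightly later
# and quieter, as if the source sat off to the listener's left.
ir_left = np.array([1.0, 0.0, 0.0, 0.3])   # direct sound plus a small reflection
ir_right = np.array([0.0, 0.6, 0.0, 0.2])  # delayed, attenuated copy

mono = np.sin(2 * np.pi * 440 * np.arange(1024) / 44100)  # 440 Hz test tone
stereo = binauralize(mono, ir_left, ir_right)
print(stereo.shape)  # (2, 1027): each channel is len(mono) + len(ir) - 1 samples
```

Personalization, the part Meta wants phone cameras to handle, amounts to measuring or estimating these impulse responses for an individual's ears and the room they are standing in.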

Wearing a VR headset and talking with a head on a computer screen

My conversation with an avatar so realistic it felt like I was in the same room with them.


Talking to photo-real avatars

I'm in a dark space, standing across from a seemingly candle-lit and very real face of someone who was in Meta's Pittsburgh Reality Labs Research offices, wearing a specially built face-tracking VR headset. I'm experiencing Codec Avatars 2.0, a vision of how realistic avatars in metaverses could get.

How real? Quite real. It was uncanny: I stood close and looked at the lip movement, his eyes, his smiles and frowns. It felt almost like talking with a super-real PlayStation 5 game character, then realizing over and over again this is a real-time conversation with a real person, in avatar form. 

I wondered how good or limited face tracking could be: After all, my early Quest Pro demos using face tracking showed limits. I asked Jason, the person whose avatar I was next to, to make various expressions, which he did. He said I was a bit of a close-talker, which made me laugh. The intimate setting felt like I had to get close and talk, like we were in a cave or a dimly lit bar. I guess it's that real. Eventually, the realism started to feel good enough that I started assuming I was having a real conversation, albeit one with a bit of uncanny valley around the edges. It felt like I was in my own living video game cutscene.

Meta doesn't see this coming into play for everyday headsets any time soon. First of all, standalone VR headsets are limited in their processing power, and the more avatars you have in a room, the more graphics get taxed. Also, the tracking tech isn't available for everyone yet. 

Man with a VR headset, chatting with a realistic avatar on a screen

Trying out a chat with an Instant Codec Avatar, created with a phone-made head scan.


A more dialed-down version was in my second demo, which showed an avatar that had been created by a face scan using a phone camera using a new technology called Instant Codec Avatars. The face looked better than most scans I'd ever made myself. But I felt like I was talking with a frozen and only slightly moving head. The end result was less fluid than the cartoony Pixar-like avatars Meta uses right now.

Actor being scanned by a room full of cameras

An actor who was 3D scanned ahead of time using an array of cameras. I saw his rendered avatar layered with digital clothing.

Scott Stein/CNET

One final demo showed a full-body avatar (legs, too!) that wasn't live or interactive. It was a premade 3D scan of an actor using a special room with an array of cameras. The demo focused on digital clothes that could realistically be draped over the avatar. The result looked good up close, but similar to a realistic video game. It seems like a test drive for how digital possessions could someday be sold in the metaverse, but this isn't something that would work on any headset currently available.

My sneaker gets 3D scanned using Meta's new phone-based capture tech.

Scott Stein/CNET

3D scanning my shoes (plus, super-real cacti and teddy bears)

Like a volunteer in a magic show, I was asked to remove one of my shoes for a 3D scanning experiment. My shoe ended up on a table, where it was scanned with a phone camera -- no lidar needed. About half an hour later, I got to look at my own shoe in AR and VR. 3D scanning, like spatial audio, is already widespread, with lots of companies focused on importing 3D assets into VR and AR. Meta's research is aiming for better results on a variety of phone cameras, using a technology called neural radiance fields. Another demo showed a whole extra level of fidelity.

My shoe, after being scanned, in AR.

Scott Stein/CNET

A couple of prescanned objects, which apparently took hours to prepare, captured the light patterns of complex 3D objects. The results -- which showed furry, spiky, fine-detailed objects including a teddy bear and a couple of cacti -- looked seriously impressive on a VR headset. The curly fur didn't seem to melt or matte together like most 3D scans; instead it was fluffy, seemingly without angles. The cactus spines spread out in fine spiky threads.

Of all the demos I tried at Reality Labs, this was maybe the least wowing. But that's only because there are already, through various processes, lots of impressive 3D-scanned and rendered experiences in AR and VR. It's not clear how quick or easy it would be to achieve Meta's research-grade results in everyday use, which makes the technique hard to judge. For sure, if scanning objects into virtual, file-compatible versions of themselves gets easier, it'll be key for any company's metaverse ambitions. Tons of businesses are already aiming to sell virtual goods online, and the next step is letting anyone easily do it for their own stuff. Again, this is already possible on phones, but it doesn't look as good…yet.

Chief Scientist Michael Abrash talks to us in front of a wall of prototype VR and AR headsets.


What does it all mean?

The bigger question on my mind, as my day ended at Meta's facilities and I called a Lyft from the parking lot, was what it all added up to. Meta has a brand-new Quest Pro headset, which is the bleeding-edge device for mixing AR and VR together, and which offers new possibilities for avatar control with face tracking.

The rest of the future remains a series of question marks. Where Meta wants to spread out its metaverse ambitions is a series of roads that are still unpaved. Neural inputs, AR glasses, blends of virtual and real sounds, objects and experiences? These could still be years away. 

In a year in which Meta has seen its revenue drop while making sizable bets on the metaverse amid inflation and an economic downturn, are these projects all going to be fulfilled? How long can Meta's long-game metaverse visions be sustained?

Meta's prototype VR sunglasses, the "North Star" for what the tech aims to become.

Scott Stein/CNET

Abrash talks to us once more as we gather for a moment before the day's end, bringing back a connecting theme, that immersive computing will be a true revolution, eventually. Earlier on, we had stopped at a wall full of VR and AR headsets, a trophy case of all the experimental prototypes Meta has worked on. We saw mixed reality ones, ones with displays designed to show eyes on the outside, and ones so small they're meant to be the dream VR equivalent of sunglasses. 

It made me think of the long road of phone design experimentation before smartphones became mainstream. Clearly, the metaverse future is still a work in progress. While big things may be happening now, the true "smartphones" of the AR and VR future might not be around for a long while to come.

"The thing I'm very sure of is, if we go out 20 years, this will be how we're interacting," Abrash said in front of the headset wall. "It's going to be something that does things in ways we could never do before. The real problem with it is, it's very, very hard to do this."

Sat, 15 Oct 2022
IBM’s former CEO downplays the importance of a college degree for six-figure earning ‘new collar’ jobs that now make up half of its workers

A four-year bachelor’s degree has long been the first rung to climbing America’s corporate ladder.

But the move to prioritize skills over a college education is sweeping through some of America’s largest companies, including Google, EY, Microsoft, and Apple. Strong proponents say the shift helps circumvent a needless barrier to workplace diversity.

“I really do believe an inclusive diverse workforce is better for your company, it’s good for the business,” Ginni Rometty, former IBM CEO, told Fortune Media CEO Alan Murray during a panel last month for Connect, Fortune’s executive education community. “That’s not just altruistic.”

Under Rometty’s leadership in 2016, tech giant IBM coined the term “new collar jobs” in reference to roles that require a specific set of skills rather than a four-year degree. It’s a personal commitment for Rometty, one that hits close to home for the 40-year IBM veteran.

When Rometty was 16, her father left the family, leaving her mother, who’d never worked outside the home, suddenly in the position to provide.

“She had four children and nothing past high school, and she had to get a job to…get us out of this downward spiral,” Rometty recalled to Murray. “What I saw in that was that my mother had aptitude; she wasn’t dumb, she just didn’t have access, and that forever stayed in my mind.”

When Rometty became CEO in 2012 following the Great Recession, the U.S. unemployment rate hovered around 8%. Despite the influx of applicants, she struggled to find employees who were trained in the particular cybersecurity area she was looking for.

“I realized I couldn’t hire them, so I had to start building them,” she said.

In 2011, IBM launched a corporate social responsibility effort called the Pathways in Technology Early College High School (P-TECH) in Brooklyn. It’s since expanded to 11 states in the U.S. and 28 countries.

Through P-TECH, Rometty visited “a very poor high school in a bad neighborhood” that received the company’s support, as well as a community college where IBM was offering help with a technology-based curriculum and internships.

“Voilà! These kids could do the work. I didn’t have [applicants with] college degrees, so I learned that propensity to learn is way more important than just having a degree,” Rometty said.

Realizing the students were fully capable of the tasks that IBM needed moved Rometty to return to the drawing board when it came to IBM’s own application process and whom it was reaching. She said that at the time, 95% of job openings at IBM required a four-year degree. As of January 2021, less than half do, and the company is continuously reevaluating its roles.

For the jobs that now no longer require degrees and instead rely on skills and willingness to learn, IBM had always hired Ph.D. holders from the very best Ivy League schools, Rometty told Murray. But data shows that the degree-less hires for the same jobs performed just as well. “They were more loyal, higher retention, and many went on to get college degrees,” she said.

Rometty has since become cochair of OneTen, a civic organization committed to hiring, promoting, and advancing 1 million Black individuals without four-year degrees within the next 10 years.

If college degrees no longer become compulsory for white-collar jobs, many other qualifications—skills that couldn’t be easily taught in a boot camp, apprenticeship program, or in the first month on the job—could die off, too, University of Virginia Darden School of Business professor Sean Martin told Fortune last year.

“The companies themselves miss out on people that research suggests…might be less entitled, more culturally savvy, more desirous of being there,” Martin said. Rather than pedigree, he added, hiring managers should look for motivation.

That’s certainly the case at IBM. Once the company widened its scope, Rometty said, the propensity to learn quickly became more of an important hiring factor than just a degree.


Sun, 16 Oct 2022
How will Meta handle a dose of Apple Reality?

Apple is preparing to introduce its first-generation mixed-reality device following years of development. Meta is upset and is trying to score a few punches before its competitor enters the ring.

Fri, 14 Oct 2022

Meta’s virtual reality project will finally have legs – literally

A year after changing its name, the company formerly known as Facebook has revealed its plans to give the metaverse legs – literally.

Mark Zuckerberg’s virtual reality project is getting a raft of additions including a $1,499 (£1,356) “pro” headset, integration with Microsoft Office and the sitcom The Office, and, yes, the ambulatory appendages.

Legs join shoulders and knees, though not yet toes, as part of an upcoming visual overhaul of the avatars in Meta’s Horizon virtual worlds, Zuckerberg revealed. Currently, other users simply hover slightly above the ground, with heads, arms and torsos rendered in a cartoony style but bodies ending at the waist. As a result, legs are “probably the most requested feature on our roadmap”, the chief executive and co-founder said. “But seriously, legs are hard, which is why other virtual reality systems don’t have them either.”

The company’s systems will now try to guess the position of users’ legs and feet using a number of inputs, from direct visual tracking using front-facing cameras to more advanced attempts to predict their movement with just the motion of the head and hands, based on models of human anatomy.

Derision of Horizon’s avatars has prompted irritation from Zuckerberg in the past. In August, a post from the Facebook founder of his glassy-eyed figure standing in front of a virtual Eiffel Tower went viral on social media, with people mocking the vaguely soulless appearance of the virtual world. In response, he shared a render of a more realistic version of his virtual face a few days later. “I know the photo I posted earlier this week was pretty basic – it was taken very quickly to celebrate a launch,” he said. “The graphics in Horizon are capable of much more.”

While legs may have been the most-requested feature, the star of Meta’s Connect event was the Quest Pro headset, a new business-focused device that will sell for $1,499 and push what is possible in virtual reality forward. The headset introduces two new headline features to Meta’s VR lineup: eye tracking and “passthrough” mixed reality.

The former uses tiny cameras mounted on the inside of the headset to track where in the virtual world a user is looking. That lets developers offer experiences that respond to a user’s attention, from virtual characters that react to being looked at to interfaces that can be activated with a glance. But it also enables whole new levels of surveillance, with advertisers potentially able to assess exactly who has looked at what promotions for how long.

Passthrough mixed reality attempts to offer a similar experience to devices such as Magic Leap and Microsoft’s HoloLens AR glasses, layering a virtual experience on top of the real world. But rather than experimenting with holographic lenses like those two devices, the Quest Pro uses high-resolution front-facing cameras to simply record the real world and then display it on the interior screens. That turns a display technology challenge into one of computing speed, since the device needs to be able to process and display the live footage rapidly enough to have zero lag, or users would get horrible motion sickness.

Neither feature is cheap, and Zuckerberg implied the Quest Pro will be sold at a loss despite costing $1,100 more than its mainstream Quest 2 headset. “The strategy overall is not to make money on the hardware,” he told tech site The Verge, though “there are lots of different ways to basically do the accounting on this.”

But the market for the Pro is professional users. “If I could give all of our engineers a device and have them be 3% more productive, I’d give them a $1,500 device, for sure,” he added. To that end, the company announced new deals with partners including NBC, which will bring experiences based on The Office sitcom to the platform, as well as Microsoft, which is bringing a version of Office, Teams and even Xbox Game Pass to the Quest platform.

Wed, 12 Oct 2022 · Alex Hern
How MTV’s ‘The Challenge’ Became the Reality Show for Sports Fans (Sports Illustrated)

Fri, 14 Oct 2022

See Which Of The Latest 13F Filers Holds IBM

In terms of shares owned, we count 6 of the above funds having increased existing IBM positions from 06/30/2022 to 09/30/2022, with 2 having decreased their positions. Looking beyond these ...

Thu, 13 Oct 2022

Better Buy: IBM Stock vs. 2-Year Treasury Notes

Investors this year increasingly turned away from dividend stocks in favor of the rising yields being offered on bonds. Given that investors can now earn a 4.3% return on a 2-year Treasury note, many prefer that guaranteed return to the risks of putting money into the stock market.

International Business Machines (IBM) offers a dividend yield that exceeds that bond return. But with a bear market in progress, are investors better served to take a chance on the cloud stock or to take the 4.3% return at virtually zero risk?

IBM and its dividend

IBM didn't participate in the bull market of the 2010s. The stock dropped as its tech businesses suffered a considerable growth slowdown. In an effort to change that, IBM pivoted into the cloud computing sector aggressively, in part via its $34 billion purchase of Red Hat in 2019. Grand View Research forecasts a compound annual growth rate of 16% through 2030 for the cloud industry. Growth like that could certainly help both IBM and its stock.

Also, IBM spun off its managed infrastructure business into a new public company, Kyndryl. This business was less of a fit with the parent company amid its pivot to the cloud. Separating it off should make it easier for IBM to grow its revenue.

Time will tell if these moves can help the stock price recover. Nonetheless, IBM currently pays its shareholders $1.65 per share every quarter, or $6.60 per share annually. At the current stock price, that adds up to a yield of 5.6% per year. Moreover, depending on your financial situation, the IRS may tax your dividends at a lower capital gains rate, which can offer an added advantage.
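
As a back-of-the-envelope check on that arithmetic, here is a minimal sketch; the share price below is a hypothetical figure chosen to be consistent with the roughly 5.6% yield the article cites, not a quoted market price.

```python
# Dividend yield = annual dividend / share price.
quarterly_dividend = 1.65               # per the article
annual_dividend = quarterly_dividend * 4

share_price = 117.86                    # hypothetical price implying ~5.6%
yield_pct = annual_dividend / share_price * 100

print(round(annual_dividend, 2))        # 6.6
print(round(yield_pct, 1))              # 5.6
```

Dividing the same $6.60 payout by a lower price produces a higher percentage, which is why the yield rises when the stock falls.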

Additionally, IBM has raised its payout annually for 27 consecutive years, making it a Dividend Aristocrat. That status carries some weight: many income investors are more inclined to buy and hold IBM stock because of it. And since losing Dividend Aristocrat status tends to hurt a stock, management will probably prioritize keeping it by continuing to raise those payouts.

Investors can also reinvest their dividend payments into more IBM stock. However, such newly purchased shares will pay the dividend yield prevailing at the time of purchase. That yield will rise if the stock falls, since investors can buy the same cash return at a lower price. Conversely, cash yields will drop if the stock rises, but those investors still benefit since the stock has increased in value.

What to know about 2-year Treasury notes

U.S. Treasury notes offer more stability than stocks such as IBM. Investors who purchase the 2-year Treasury note receive semiannual interest payments. At the current interest rate of 4.3%, investors will receive a 2.15% cash return on their invested amount in each of the first three six-month periods. In the fourth period, when the note matures, investors receive the final 2.15% payment along with the return of their principal.
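
The payment schedule described above can be sketched in code; the $1,000 face value is assumed purely for illustration.

```python
# A 2-year Treasury note at 4.3% pays four semiannual coupons of 2.15% each,
# with the principal returned alongside the final coupon.
face = 1000.0
annual_rate = 0.043
coupon = face * annual_rate / 2         # $21.50 every six months

cash_flows = []
for period in range(1, 5):
    payment = coupon
    if period == 4:                     # maturity: principal comes back too
        payment += face
    cash_flows.append(round(payment, 2))

print(cash_flows)                       # [21.5, 21.5, 21.5, 1021.5]
print(sum(cash_flows) - face)           # 86.0 of total interest over two years
```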

Investors should also be aware that bond values can fluctuate. If interest rates rise, the value of the bond will fall; the opposite will happen if rates drop. This affects investors if they decide to sell the bond early. Upon maturity, the note will return to its par (or nominal) value.
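
The sensitivity of a note's resale value to prevailing rates follows from standard present-value discounting. The sketch below prices a hypothetical $1,000 note under three market-rate scenarios; the rate shifts are illustrative, not forecasts.

```python
def note_price(face, coupon_rate, market_rate, years=2, freq=2):
    """Discount each coupon and the principal at the prevailing market rate."""
    coupon = face * coupon_rate / freq
    n = years * freq
    r = market_rate / freq
    pv_coupons = sum(coupon / (1 + r) ** t for t in range(1, n + 1))
    pv_principal = face / (1 + r) ** n
    return pv_coupons + pv_principal

print(round(note_price(1000, 0.043, 0.043), 2))  # 1000.0, priced at par
print(round(note_price(1000, 0.043, 0.053), 2))  # below par: rates rose
print(round(note_price(1000, 0.043, 0.033), 2))  # above par: rates fell
```

At maturity the discounting horizon shrinks to zero, which is why the price converges back to par regardless of interim swings.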

Additionally, bond interest payments are subject to federal income tax but exempt from state and local taxes. In some cases, this is higher than taxes on dividends. Still, bond issuers are obligated to make such payments. In contrast, IBM faces no legal obligation to continue its dividend.

Also, like with a stock, investors can reinvest their interest payments into more notes or other forms of Treasury bonds. However, those purchases will be subject to the prevailing interest rates at that time.

IBM or the 2-year Treasury note?

Investors who lack much risk tolerance should choose the Treasury note. Given its guaranteed return, they will not have to worry about volatility.

Nonetheless, for investors comfortable with buying stocks, IBM is a surprisingly strong buy. The cloud industry is in growth mode, which should propel IBM stock to a long-awaited turnaround. Moreover, IBM has repeatedly shown it wants to hold on to its Dividend Aristocrat status. This should give its income investors returns that are not only larger than the bonds offer, but also likely to increase in size.

Will Healy has no position in any of the stocks mentioned. The Motley Fool has no position in any of the stocks mentioned. The Motley Fool has a disclosure policy.

Thu, 13 Oct 2022 · Will Healy
IBM veteran joins Red Hat C-suite in major executive shakeup

IBM subsidiary Red Hat is making key changes to its executive leadership – shifting current CFO Carolyn Nash into the Chief Operating Officer role as it gives its finance and operations ...

Wed, 12 Oct 2022