There is no better option than our PL-200 braindumps and examcollection

You will get an exact replica of the PL-200 real exam questions that you will face in the actual test. We maintain a database of PL-200 exam questions: a large question bank highly pertinent to PL-200, contributed by test takers who attempted the PL-200 exam and passed with high scores.

Exam Code: PL-200 Practice Test 2022
PL-200 Microsoft Power Platform Functional Consultant

Exam ID : PL-200
Exam Name : Microsoft Power Platform Functional Consultant

Candidates for this test perform discovery, capture requirements, engage subject matter experts and stakeholders, translate requirements, and configure Power Platform solutions and apps. They create application enhancements, custom user experiences, system integrations, data conversions, custom process automation, and custom visualizations.

Candidates implement the design provided by and in collaboration with a solution architect and the standards, branding, and artifacts established by User Experience Designers. They design integrations to provide seamless integration with third party applications and services.

Candidates actively collaborate with quality assurance team members to ensure that solutions meet functional and non-functional requirements. They identify, generate, and deliver artifacts for packaging and deployment to DevOps engineers, and provide operations and maintenance training to Power Platform administrators.

Power Platform Functional Consultants should be familiar with Dynamics 365 model-driven applications and should have experience using the Power Platform components to extend and customize Dynamics 365 model-driven applications.

Configure the Common Data Service (25-30%)
Create apps by using Power Apps (20-25%)
Create and manage Power Automate (15-20%)
Implement Power Virtual Agents chatbots (10-15%)
Integrate Power Apps with other apps and services (15-20%)

Configure the Common Data Service (25-30%)
Manage an existing data model
 assign a type for an entity including standard, activity, or virtual
 configure entity ownership
 create new entities or modify existing entities
 determine which type of relationship to implement including 1:N and N:N
 configure entity relationship behaviors including cascading rules
 create new relationships or modify existing relationships
 create new fields or modify existing fields
 create alternate keys for entities
 configure entity properties
Create and manage processes
 define requirements for business rules
 define and implement business rule logic
 define the scope for business rules
 configure and test business rules
 configure a synchronous classic workflow
Configure Common Data Service settings
 configure Relevance Search
 configure auditing
 perform data management tasks
 configure duplicate detection settings
Configure security settings
 create and manage business units
 create and manage security roles
 create and manage users and teams
 create and manage field security profiles
 configure hierarchy security
Create apps by using Power Apps (20-25%)
Create model-driven apps
 create and configure forms
 create and configure views
 create and configure charts
 create and configure dashboards
 configure site maps
 select applicable assets for an app including entities, forms, views, business process flows, dashboards, and charts
 share a model-driven app
Create canvas apps
 create a canvas app
 configure the Common Data Service as a data source for an app
 create canvas app screens
 implement form navigation, formulas, variables and collections, and error handling
 build reusable components and component libraries
 configure offline capabilities for apps
 run Power Automate flows based on actions that occur in a canvas app
 interpret App Checker results and resolve identified issues
 test, monitor, and share apps
Create portal apps
 create a portal app
 expose Common Data Service data
 configure portal web pages, forms, and navigation
 configure portal security including web roles and page access
Create and manage Power Automate (15-20%)
Create flows
 describe types of flows and flow components
 trigger a flow by using Common Data Service connectors
 run actions by using the Common Data Service connector
 implement logic control
 implement dynamic content and expressions
 interpret and act on Flow Checker results
 activate and deactivate flows
 interpret flow analytic data
Create and manage business process flows
 configure a business process flow
 add business rules, workflows, and action steps to a business process flow
 define stages and steps
 configure parallel branches
 manage the business process flow entity for a business process flow
Build UI flows
 describe types of UI flows
 identify use cases for UI flows
 differentiate between attended and unattended UI flows
 record business process tasks
Implement Power Virtual Agents chatbots (10-15%)
Create chatbot
 assign a chatbot to an environment
 publish a chatbot
 share a chatbot
 add chatbots to Teams and other channels
 monitor and diagnose bot performance, usage, and topic usage
Configure topics
 define topic conversation triggers
 create questions, messages, and conditions
 extract topics from a web page
 implement greetings, escalations, error messages, and statuses
 call a Power Automate flow to run an action
Configure entities
 create custom entities
 implement entities in conversations
 implement variables to store data
Integrate Power Apps with other apps and services (15-20%)
Integrate Power BI with Power Apps
 create Power BI visualizations
 create data flows and schedule data flow runs
 filter data
 build reports and dashboards
 publish and share reports and dashboards
 add Power BI tiles to model-driven apps and canvas apps
 add canvas apps to a Power BI dashboard
 trigger Power Automate flows from Power BI alerts
Implement AI Builder
 determine which AI Builder model type to use
 create an AI Builder model
 prepare source data for use by models
 train, test, and publish a model
 consume a model by using Power Apps
 consume a model by using Power Automate
Integrate Power Apps with Microsoft 365
 add apps to Microsoft Teams
 create a Teams app from a Power Apps app
 create an app directly in Teams
 configure app policies
 create a Teams channel by using Power Automate
 configure and use Microsoft Word and Microsoft Excel templates

A Surgeon Goes Hands-on With Microsoft’s Hololens


Microsoft’s Hololens has been getting a tremendous amount of attention over the past few years. Hype has been steadily accelerating about the technological, financial, and social potential for augmented reality, especially given the recent frenzy surrounding Pokemon Go. To clarify, while “mixed reality” is probably a more accurate term to describe a technology that blends simulated objects with your surroundings in an almost indistinguishable fashion, we will use the term “augmented reality” in this post as it is still more common. If you don’t have the time to read this lengthy hands-on, here is the low-down: Hololens is a very impressive technology with very compelling medical use-cases, although it currently has some limitations that seem like they will be addressed in later generations of the device. Keep in mind there are additional headsets to keep an eye on, including, but not limited to, the Daqri, Epson’s Moverio, Magic Leap, ODG’s R-7 Smartglasses, and the Meta 2.


The health technology sector is very excited about the potential of augmented reality for a variety of applications. Health-related AR startups are already getting a lot of buzz and funding. One of the most talked about applications is the holographic anatomy educational program, which is the result of a partnership between Microsoft and Case Western Reserve University. Reading about the various applications and the technology is exciting; however, it is hard to get an idea of the potential of the Hololens, and its limitations, without trying it yourself.


Thanks to Neil Gupta, organizer of the Boston Augmented/Mixed Reality Meetup, and Microsoft, I was able to get hands on (or head in?) a Hololens and try it out for myself. We tried the Hololens in a sizable enclosed space that proved to be ideal for its tracking and projectional capabilities. The headset is pretty ergonomic. After a little fiddling with the size and the angle of an unconventional oblique strap that keeps it in place, it was easy to forget it was on your head. I didn’t feel that it was incredibly secure, however, and I imagine that during a long surgery you would need someone to adjust it occasionally.

Once the headset was secure I was ready to go. It only took a short period to get used to the gesture system, which is very limited at this point, but functional. The gesture for a click involves holding your thumb and index finger in an “L” shape and then touching them together. This interacts with whatever you are looking at. So, you target with your gaze, and then interact with a click gesture. There is an additional gesture, which is almost like a “right-click,” that consists of taking a clenched fist and then opening it slowly with your palm facing the ceiling. There is a sweet spot where your hand needs to be in order for these gestures to be recognized by the Hololens. You can even type using a virtual keyboard using this gesture system. The whole set-up works pretty reliably once you get the hang of it, but it is disappointing that the hand-tracking is not more sophisticated at this time. There is also the option to use voice recognition, but I mainly stuck with the gestures. I imagine the hand-tracking will improve in time.
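That targeting model (aim with your gaze, commit with the pinch gesture) can be sketched as a tiny per-frame loop. This is an illustrative sketch only, not HoloLens SDK code; the class and function names here are invented for the example:

```python
class Hologram:
    """A selectable virtual object at some distance along the gaze ray."""
    def __init__(self, name, depth):
        self.name = name
        self.depth = depth        # distance from the user along the gaze ray
        self.clicked = False

    def on_click(self):
        self.clicked = True


def raycast(intersected):
    """Return the nearest hologram the gaze ray passes through, if any."""
    return min(intersected, key=lambda h: h.depth) if intersected else None


def update_frame(intersected, pinch_detected):
    """One frame of gaze-and-commit: target with the gaze, commit with the pinch."""
    target = raycast(intersected)
    if target is not None and pinch_detected:
        target.on_click()
    return target
```

The point of the pattern is that the hand gesture carries no position of its own; all spatial intent comes from where you are looking, which is why the "sweet spot" for hand recognition matters less than it first seems.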


One thing I did immediately was set up my virtual holographic space. This was incredibly cool. I placed various virtual animals and objects around the room, and also could place highly functional browser windows wherever I liked. The browser windows in particular really impressed me. The text and images were high resolution and very readable. When looking at them I truly never got the sense that these were holograms. It felt like these windows were in the same space with me. Some of the holographic objects worked better than others. A combination of the object color/design and the lighting/background I placed them in front of affected the “realness” of the objects. But overall these very basic functions of the Hololens impressed me the most. I also should mention that the tracking is pretty unbelievable. A major part of the technology behind the Hololens is sensing where you are in 3D space and adjusting the projections of the holograms accordingly. While some other headsets are having issues with this particular challenge at the moment, this is not a problem for the Hololens. Moving around the room, even when moving briskly, does not cause any jitter or apparent shift in the position/rotation of the surrounding holograms.

I then spent some time playing “Project X-ray,” a pretty fun space invader type game in which spider robots explode through the wall attacking you. The game is a blast, and you have to actually physically dodge laser beams as you try and take out these robots. Not something you really want to do with anyone watching you, I might add. The illusion, once again, is very impressive here, as these virtual holes in the wall the aliens popped through at some points seemed really quite convincing.


Now it’s time to talk about the elephant in the room, which is the Hololens’ limited field-of-view (FOV). You only see holograms in a small box in the center of your vision (image source: The Verge). So when you see promotional videos with a room filled with holograms, it’s a little misleading. While the object data for those holograms are there, you really have to scan your head around to see it all. Now, from a gaming/entertainment perspective this may be a deal-breaker at the moment, but for medical applications this might not be a big deal at all.

I won’t lie, I was pretty excited after my brief experience with the Hololens. I can already see it being useful for medical applications in its current state. I’m a surgeon, so I’ve been thinking about ways I can use this in an OR. During surgery, a major challenge is controlling the location and content of visual displays in the room. There are usually several monitors that can display a camera feed, vital signs, or imaging data. Getting the circulating nurse, who sometimes is unfamiliar with the system, to try and change out the feeds, reposition the monitors, and/or open available imaging can get frustrating very quickly, especially during a stressful case. From what I have experienced, the Hololens already has the ability to completely eliminate this cumbersome process and allow you to create a custom holographic layout of patient-related information. Furthermore, thanks to the gesture system, you can cycle through the data and imaging at your leisure, which will benefit surgeon and patient alike. What’s unclear to me at the moment is how lighting conditions in the operating room, which can be very harsh, will affect the quality of the holographic projections. I’m also curious to see if it affects the gesture-recognition capabilities. Finally, I’m a little worried that the tinted visor of the Hololens may decrease a surgeon’s ability to view the operating field clearly, which may limit its uses for surgical applications. It also doesn’t seem like it would protect your eyes sufficiently from bodily fluids that might splash up during a case.

Some other AR applications for the operating room I’m pretty excited about include using surgical navigation technology to “see through” patients. So, for example, with an AR headset I would see the bony structure of a patient’s spine through his body using reconstructed CT scans. Another exciting application is the ability for surgeons to guide other surgeons remotely, also called telementoring.

The future of augmented reality for medicine and surgery is bright and exciting. The technology is already at the point where it has some very compelling uses. There are some limitations with the current state of the technology, but these problems appear to be relatively minor in the grand scheme of things, and will likely be overcome in the next few years. The augmented/mixed reality headset may one day join the mask and surgical cap as necessities for the OR.

Flashback: Controlling Augmented Reality: A Surgeon’s Perspective

Justin Barad, July 13, 2022
A former Amazon exec's new book explains why the coming metaverse — "the next internet" — is still far off, and what exactly it may look like
  • In "The Metaverse," Matthew Ball provides an entertaining guide to the coming alternative virtual world.
  • The book is optimistic about the metaverse's potential impact, yet describes monumental obstacles standing in its way.
  • This is an opinion column. The thoughts expressed are those of the writer.

The polarization of the current era extends far beyond traditional politics to matters of taste, technology, the environment, and economics. The result is an increasing inability to engage in civil discourse on any number of subjects that have significant implications for our collective future. The title of a new book exploring the potential impact of virtual and augmented reality technologies — "The Metaverse: And How It Will Revolutionize Everything" — seems to reflect this tendency toward polemic and hyperbole.

Happily, however, the book — written by consultant and investor Matthew Ball, the former global head of strategy for Amazon Studios — demonstrates that a true believer can still make a meaningful contribution to everyone's understanding of an important topic.

Ball achieves this feat by being clear and transparent on his assumptions and grounding his discussion in real data and facts. His analysis then manages to integrate logic, humor, and an appropriate degree of skepticism of the most extreme claims, often made by those with vested interests. The result is an entertaining and thought-provoking guide to the coming alternative virtual world that should prove indispensable to not just users and developers but investors, competitors, and regulators.

"The Metaverse" is organized in three parts that explain in turn what it is, what's needed for it to actually come into existence and, finally, why we should care.

The term "metaverse" itself was coined 30 years ago in a sci-fi novel called "Snow Crash" by Neal Stephenson. Despite being around for so long, there is still little consensus on what precisely it is. Ball suggests that this may be in part because the very companies who most view the metaverse as both a threat and an opportunity for their existing businesses — Facebook and Microsoft are extreme examples — each propose starkly conflicting definitions that fit "their own worldviews and/or the capabilities of their companies."  The resulting confusion encourages conflation of these concepts with blockchain and Web 3.0, which "The Metaverse" does a particularly good job of untangling.

The almost 50-word definition of the metaverse offered up by Ball may be a mouthful, but it usefully highlights all the key attributes needed to establish a ubiquitous, fully-functional virtual 3D ecosystem that can accommodate an unlimited number of simultaneous participants. Ball closely examines the technical and practical constraints on realizing each of these essential features. This exercise may come across as too painfully detailed for some — the longest chapter examines the payment mechanisms needed to support a parallel metaverse economy — but it is critical to an informed view of what the metaverse can actually be. 

Despite his overall optimism about the ultimately revolutionary impact of the metaverse,  Ball does not downplay the monumental technological obstacles to achieving this vision. "Its arrival remains far off," Ball concedes, "and its effects largely unclear." Some of the constraints, like bandwidth and computing power, may be eventually overcome through creativity and persistence. Others, like the speed of light, which poses a significant challenge to maintaining real-time interactive renderings of multiple individuals separated by many thousands of miles, are likely to remain stubbornly resistant to human innovation. 

As incumbent platforms and new disrupters scramble to build their own unique piece of virtual real estate, Ball also grapples with what may be the biggest challenge to creating an all-encompassing metaverse — getting the resulting cacophony of independent virtual worlds to communicate with each other. Solving the problem of interoperability will entail the establishment of a raft of agreed technical standards and potentially a meaningful dose of government intervention. We will get a very small taste of what the latter may look like as the European Union interoperability requirements for messaging apps come into effect in the coming years.

As effective as "The Metaverse" is at describing the primary drivers behind, and key elements of, the billions being invested in what he refers to as "the next internet," it's less convincing on the claim that it will actually "revolutionize everything." The vast majority of economic value being generated today from these technologies relates to gaming applications. Although it is true that gaming is no longer primarily the province of antisocial teenage boys — indeed the sector has now surpassed Hollywood and the music industry combined — the relative paucity of compelling use cases beyond that makes one question just how truly revolutionary it will be. Indeed, for most of the other applications described or envisaged — whether in medicine, education or otherwise — it is less than clear that a full-blown metaverse is actually required.

What's more, for those of us concerned with the increase in anti-social behavior in the wake of a pandemic that discouraged face-to-face human interaction, the eventual rise of the metaverse elicits as much foreboding as awe. Ball is right to be focused on regulatory approaches to avoid corporate gatekeepers as dominant in the virtual as in the physical world. But an economically significant parallel universe that facilitates anonymity raises a host of regulatory and social concerns well beyond competition and innovation that require as much, if not more, attention.

Luckily, given how long it will be for the metaverse to become a reality, we have time to get its regulation right for a change. Anyone committed to trying to do that could do worse than starting with a copy of The Metaverse.

Jonathan A. Knee is a professor of professional practice at Columbia Business School and a senior advisor at Evercore. His most recent book is "The Platform Delusion: Who Wins and Who Loses in the Age of Tech Titans."

July 31, 2022
How a Groundbreaking Documentary Shot in Virtual Reality Used Cinematic Techniques in a Digital World

In the opening minutes of “We Met in Virtual Reality,” a bunch of avatars resembling animals and anime characters enter an open world based on “Jurassic Park,” hop into vehicles, and speed around the landscape with glee as a handheld camera tracks their moves. Later, that same camera visits house parties, dance classes, and a marriage ceremony.

Anyone who hasn’t strapped on a VR headset might think they were watching a low-budget animated movie with glitchy effects, but “We Met in Virtual Reality” is actually a groundbreaking documentary shot exclusively in VRChat, the popular VR social platform. The feature-length debut of UK-based filmmaker Joe Hunting stems from his experiences roaming VRChat over the course of three years, during which time he befriended many of the communities within. Hunting, who supports himself in part by working as a VR event photographer, has provided the most robust opportunity to experience the nature of a social platform that only VR users can truly grasp.

A far cry from the dystopian headlines about the future of the metaverse, the movie offers a more celebratory look at the potential of forging bonds and finding a sense of purpose in an all-digital space. Hunting’s camera features everyone from deaf characters to lovers from different parts of the world who met and forged their bond entirely in the digital ether. Inspired by documentaries like “Paris Is Burning” and “Bombay Beach,” Hunting bridges the gap between a new frontier of digital storytelling and traditional cinema while capturing the potential of VR on a personal scale.

“It is a complete other world at this point,” Hunting told IndieWire. “That’s what I wanted to investigate — not what it’s like to arrive as a first-timer, but the community that already exists there.”

I first met Hunting in VRChat as well, ahead of the movie’s Sundance premiere in January. In avatar form, we stood under a large tree in a vast orange meadow surrounded by distant mountains as Hunting’s own virtual cameras recorded the conversation.

“We Met in Virtual Reality” director Joe Hunting, left, in VRChat with the author of this article in avatar form.

VRChat, which has been around since 2014, is one of several major social platforms accessible by consumer-grade headsets (and in 2D mode via computer). However, while Microsoft’s Altspace has hosted thousands of users for events like Virtual Burning Man and Meta is investing major resources in growing Facebook Horizons, VRChat remains the most robust option for large-scale design work and maintains the largest user base. Hunting first tried it out in 2018 after reading an article about how some users dealt with mental health problems by socializing on the platform. “My curiosity about it exploded,” he said. “I wanted to tell more stories not just about VR but how people use this space as a separate world from our world to find joy in their lives.”

To make the movie, he used both a native camera that exists in VRChat and a more advanced third-party one called VRCLens developed by a Japanese VRChat user who goes by the name Hirabiki. Though it costs less than $10 online, the virtual camera simulates a wide array of real-world effects that enabled Hunting to bring a genuine cinematic touch to his filmmaking process.

“It allows much greater control and behaves more like a physical cinema camera,” Hunting said. “The essential reason I operated with this camera was to fly my lens as a drone, see peaking and zebra displays, and have more accuracy over my focal length and aperture. Those tools fundamentally gave me more immersion in the filmmaking process.” Hunting mainly used focal lenses within the 20mm to 50mm range, resulting in an intimate feel that often has the familiar quality of cinema verité despite the unusual setting. There are even a few striking moments when Hunting racks focus to capture characters located far apart from each other. “The recipe for success was using focus peaking on my camera display to provide me accuracy on where my line of focus was,” he said. “I had two monitors, one clean feed and another littered in guidelines, like genuine cinematography processes.”
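Those focal-length and aperture choices behave in VRCLens much as they would on a physical camera. As a purely illustrative aside (standard thin-lens optics, not VRCLens's actual internals, with a conventional 0.03 mm circle of confusion assumed), the hyperfocal distance shows why a wide lens keeps far more of a scene in focus than a longer one at the same aperture, which is what makes rack-focus on the longer lens read so clearly:

```python
def hyperfocal_mm(focal_mm, f_number, coc_mm=0.03):
    """Hyperfocal distance in mm: focus here and everything from half
    this distance out to infinity is acceptably sharp."""
    return focal_mm ** 2 / (f_number * coc_mm) + focal_mm

# At f/2.8, a 20mm lens holds focus from about 2.4m to infinity,
# while a 50mm lens must be focused within a much narrower band.
wide = hyperfocal_mm(20, 2.8)   # roughly 4.8 m
tele = hyperfocal_mm(50, 2.8)   # roughly 29.8 m
```

The same relationship is why Hunting's 20mm to 50mm range gives him both deep-focus coverage shots and shallow, subject-isolating close-ups.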

“We Met in Virtual Reality” director Joe Hunting IRL

courtesy of HBO

The only real practical barrier to his approach was that he couldn’t shoot with multiple cameras at the same time while using a headset tethered to a single computer. “I would block out the cinematography of a scene and move between each position with the conversation or action,” he said. “I was constantly shooting with the edit in mind.”

Social platforms in VR can be overwhelming to newcomers in part because they can feel awfully busy at first. Rooms filled with avatars mean that snippets of conversations can seep into one person’s audio (just like, you know, real life). For documentary purposes, this created additional recording hurdles. “There isn’t a precise way to mic someone in VRChat specifically,” Hunting said. “It was important to me that we recorded the dialogue of the subjects through their own headset microphones to immerse audiences in an authentic experience of VRChat, and to embrace the spatial tones in each of their voices. All the dialogue is recorded in real time using OBS, and when background noises occurred, I could typically do a retake in the moment or fix in post.”

Hunting found many of his subjects organically by wandering public worlds on VRChat, though with time he was invited to Discord servers that allowed him to become more entrenched in specific communities, eight of which are shown in the movie. “I essentially lived in VR for a year,” he said. “I filmed almost every day. When I wasn’t doing interviews I was out location scouting or just getting to know different communities. It was a complete journey of exploration.”

Hunting was in touch with VRChat administrators for the project, but the platform allows creators to retain ownership of their avatars as well as the worlds they built, which meant that the documentary didn’t need any official approval from the company itself. “The film is not associated with VRChat in any way,” Hunting said. “Our relationship is simply mutual respect and understanding.”

In true documentary form, he had all his subjects sign release forms that included both their real names (though none are mentioned in the movie) and VRChat usernames. “I took the same care and honesty that any documentarian would,” Hunting said. “It was really valuable to me to make someone aware they were being filmed for a documentary and the full breadth of the context they were in and not just exploit someone because they were using anonymity.”

“We Met in Virtual Reality”

courtesy of HBO

Over the course of the movie, subjects share experiences they’ve had with depression, anxiety, and loneliness, some of which is specific to the pandemic. “We are talking about real traumas in the film so it was important to treat that carefully,” Hunting said. “The film is about the truth of their experiences in VR.”

In the wake of the positive reception for the movie, Hunting is doubling down on the potential of filmmaking in VRChat with plans for additional features and series in the works. “I think there is going to be a boom of VR filmmaking to come,” he said. “I really want the documentary to educate people on the positive impact of VR. It gives you a tangible space in this gorgeous environment, and it allows you to feel present. Presence was a big inspiration for making the film.”

“We Met in Virtual Reality” is now streaming on HBO Max.


July 29, 2022
A new milestone in augmented reality: Functional contact lenses


Thirty years ago, a group of human test subjects volunteered to try something for the first time – they climbed into an exoskeleton, pressed their face to a vision system, and manually interacted with a mixed reality of real and virtual objects. They were testing a prototype augmented reality system at Air Force Research Laboratory (AFRL) that enabled users to engage with virtual objects merged with the real world.

The system filled half a room and cost nearly $1 million to build, but it worked – demonstrating for the first time that AR technology can enhance human performance in real-world tasks.

Last week, a new milestone in AR technology was achieved, and it highlights how far the field has come over the last few decades. I’m talking about the first real test of an augmented reality contact lens. It happened in a research lab at Mojo Vision in Saratoga, California, and it wasn’t a crude bench test of oversized hardware with wires dangling. No, this was an actual test of an AR contact lens worn directly on the eye of a real person for the very first time. 

Phenomenal power, teeny-tiny living space

As someone who has been involved in AR technology from the beginning, I need to highlight this milestone. Creating an AR contact lens is very difficult.



When I tell that to people, they usually focus on the display technology. The ability to put a high-resolution display on a tiny transparent lens is a daunting prospect, but it’s still the least difficult piece of the puzzle. The harder part is that the tiny lens, which needs to sit comfortably on the human eye, has to communicate wirelessly with external devices and be fully powered without a physical tether of any kind. This is very challenging, and yet it’s what Mojo Vision achieved in their latest demonstration.

Of course, the display technology is impressive too.  According to the company, the Mojo Lens has a 14,000 pixel-per-inch microLED display with a pixel pitch of 1.8 microns.  To put this in context, an iPhone 13 with a Super Retina XDR Display has 460 pixels per inch. In other words, the Mojo Lens display hardware has about 30 times the pixel density of a new iPhone.
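The quoted figures are internally consistent, which is worth checking: a 1.8-micron pixel pitch implies roughly 14,000 pixels per inch. A quick back-of-envelope script (the constants are the article's quoted numbers, not independently measured values):

```python
# Sanity-check the quoted Mojo Lens display figures.
MICRONS_PER_INCH = 25_400

pixel_pitch_um = 1.8   # quoted Mojo Lens pixel pitch
iphone_ppi = 460       # quoted iPhone 13 Super Retina XDR density

mojo_ppi = MICRONS_PER_INCH / pixel_pitch_um
density_ratio = mojo_ppi / iphone_ppi

print(round(mojo_ppi))       # ~14111 ppi, matching the "14,000 ppi" claim
print(round(density_ratio))  # ~31, i.e. roughly 30x the iPhone's linear density
```

Note that this is the linear (pixels-per-inch) ratio; the areal pixel-count ratio would be far larger.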

In addition, these lenses include an ARM processor with a 5GHz radio transmitter, along with an accelerometer, gyroscope and magnetometer to track eye movements.

And all of this sits directly on your eye. 
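The article doesn't describe Mojo's tracking algorithms, but fusing gyroscope and accelerometer readings into an orientation estimate is commonly done with a complementary filter. The sketch below is a generic, single-axis illustration of that technique, not Mojo's implementation:

```python
def complementary_filter(prev_angle_deg, gyro_rate_dps, accel_angle_deg,
                         dt_s, alpha=0.98):
    """One fusion step for a single axis: integrate the gyro for
    short-term accuracy, and pull toward the accelerometer's
    gravity-derived angle to cancel long-term gyro drift."""
    gyro_estimate = prev_angle_deg + gyro_rate_dps * dt_s
    return alpha * gyro_estimate + (1 - alpha) * accel_angle_deg

# Example: starting at 0 deg, the gyro reads 100 deg/s for 10 ms,
# while the accelerometer still reports 0 deg.
angle = complementary_filter(0.0, 100.0, 0.0, 0.01)
print(angle)  # 0.98 -> the filter mostly trusts the integrated gyro
```

In practice, a magnetometer supplies an absolute heading reference in the same way the accelerometer supplies an absolute tilt reference.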

And that’s still not the hardest part. In my mind, the most challenging hurdle in achieving AR contact lenses is power. According to the company, the Mojo Lens includes medical-grade microbatteries. It’s not clear what the current battery life is for today’s prototype, but according to Mojo, their product objective is power management that enables all-day wear.

The future is AR

I’m sure there’s a long road ahead to get from today’s prototypes to widespread deployment of low-cost contacts that bring immersive AR capabilities to people around the world, but I firmly believe this is where the industry is headed. In fact, I predict that AR eyewear – first as glasses and then as contacts – will replace the mobile phone as our primary interface to digital content within 10 years. 

Furthermore, I believe augmented reality will change society significantly, as it will transform digital assets from artifacts we selectively access into seamless features of our physical surroundings. 

A few years ago I wrote a futurist narrative entitled “Metaverse 2030” that attempts to accurately portray what life will be like when AR contact lenses become commonplace. The piece suggests that in the next decade, mainstream consumers will get fitted for new contact lenses whenever they sign up for a mobile subscription. Will this really happen within the next 10 years? Only time will tell.  

But one thing is certain today – over the last 30 years, the technologies to enable immersive AR have been invented at an impressive pace, taking the field from a room full of expensive Air Force hardware in 1992, to tiny transparent lenses that fit on the surface of your eyes in 2022. And along the way, there have been many significant innovations, from the Microsoft HoloLens and Magic Leap headset to Pokémon Go and Snap AR.

With so much impressive engineering happening in labs all around the world, I remain confident that AR will replace the mobile phone as the platform of our lives within the next 10 years. 

Louis Rosenberg, Ph.D. is a longtime researcher and entrepreneur in fields of augmented reality, virtual reality and artificial intelligence, and has been awarded over 300 patents for his work in those fields. He is currently CEO and chief scientist of Unanimous AI. He began his career developing early AR technologies at NASA and Air Force Research Laboratory (the Virtual Fixtures project). Rosenberg earned his Ph.D. from Stanford University and was a professor at California State University.



PTC named augmented reality leader in new report

For the fourth consecutive year, PTC has been named the overall leader and a best in class vendor for its Vuforia augmented reality (AR) solution suite in PAC’s Innovation RADAR report.

The prestigious report assessed AR vendors’ support for connected workers in two criteria categories: competence and market strength.

The Farnborough-based company received the highest scores for both categories, whilst also garnering positive feedback from AR client references for addressing the highest priority industrial and enterprise AR use cases, and supporting a wide range of AR hardware options, from smartphones and tablets to wearable devices, such as the Microsoft HoloLens 2 and the RealWear Navigator 500.

“Our analysis shows that PTC’s Vuforia continues to be the most comprehensive enterprise AR solution suite available in the market,” said the report’s lead analyst and author, Arnold Vogt. “It is well positioned to help its customers and their connected workers realise tangible value from AR for use cases like training, service and repair, and manufacturing quality inspection and validation.”

Michael Campbell, executive vice president and general manager of augmented reality at PTC, added: “Earning this recognition as the overall leader in PAC’s latest AR report reinforces the value that we deliver to our Vuforia customers and the strengths of our product and go-to-market strategies.”

He concluded: “We’re pleased to be at the forefront of the growing enterprise AR market as use cases expand, new hardware providers enter the space, and industrial companies realise increasing value from these solutions.”

The full report, “Open Digital Platforms for the Industrial World 2022,” is available here.

Consultancy firm PAC delivers focused and objective responses to the growth challenges of software and IT service vendors worldwide.


AMPD Ventures says subsidiary Departure Lounge’s Metastage secures certification from Microsoft Mixed Reality Capture Studios

AMPD Ventures Inc (CSE:AMPD, OTCQB:AMPDF) announced that its subsidiary Departure Lounge’s Metastage Volumetric Capture Stage has received its certification to operate from Microsoft Mixed Reality Capture Studios. 

One of the highest-resolution volumetric video capture stages in the world, the 6,000-square-foot stage is now open for business, the company said.

The stage features 106 12-megapixel machine-vision cameras to capture dynamic, ultra-high fidelity human performances using Microsoft’s capture technology and its Azure rendering pipeline. The three-dimensional holograms that are created can be placed into fully digital and blended reality locations, narratives, games, and experiences. 
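Those camera specs imply an enormous raw data stream. The estimate below uses only the quoted camera count and resolution; the frame rate is an assumption added for illustration, not a figure from the article:

```python
# Rough raw-throughput estimate for the quoted capture rig.
cameras = 106           # quoted camera count
megapixels_each = 12    # quoted per-camera resolution
fps = 30                # assumed frame rate (not stated in the article)

pixels_per_frame = cameras * megapixels_each * 1_000_000
pixels_per_second = pixels_per_frame * fps

print(f"{pixels_per_frame:,}")     # 1,272,000,000 pixels per synchronized frame
print(f"{pixels_per_second:.2e}")  # ~3.82e+10 pixels/s at the assumed 30 fps
```

Numbers at this scale explain why such rigs lean on dedicated rendering pipelines (here, Microsoft's Azure pipeline) rather than processing footage locally.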

“The Departure Lounge and Metastage teams have been working exceptionally hard over the last year to build and test the stage here in Vancouver,” Adam Rogers, VP Creative and Head of Studio at Departure Lounge said in a statement. 

READ: AMPD Ventures says its AMPD Technologies subsidiary enters C$1.8M binding memorandum of understanding with Unleash Future Boats 

“Receiving the certification is the final step in our buildout phase, and means we are now open for business as the official and exclusive provider of Microsoft’s world-leading volumetric capture technology in Western Canada,” he added. 

The Metastage Vancouver holographic capture stage is the nucleus of a synergistic set of technologies housed under one roof at the Departure Lounge studio. Departure Lounge is developing groundbreaking pipelines and workflows in virtual and immersive production to help facilitate the next generation of immersive content. 

The studio has already garnered significant interest from digital content creators in the area, with a steady stream of companies visiting the studio for preview tours and demonstrations. 

“I am exceptionally pleased that we are now Microsoft certified,” said James Hursthouse, CEO of Departure Lounge. “The SIGGRAPH 2022 Conference and Exhibition on Computer Graphics and Interactive Techniques will attract thousands of digital content creation professionals to Vancouver during the second week of August, and we’re excited to be able to welcome local and international visitors to our studio to witness the magic of holographic capture first-hand. We will be running open house tours throughout the week.” 

Steve Sullivan, general manager, Microsoft Mixed Reality Capture Studios, added: “We’re thrilled that Metastage Canada is now open for business, and we look forward to supporting Departure Lounge and Metastage in their mission to deliver the world’s premier volumetric capture solutions to customers in Canada and beyond.”

Metastage Canada is housed at the Departure Lounge facility on campus at the Centre for Digital Media in downtown Vancouver. Departure Lounge also offers cutting-edge markerless motion capture technology in partnership with, a film-production grade interactive LED wall in partnership with ARwall, as well as a team of creative professionals that develop applications showcasing the assets generated by the various technologies. 

Departure Lounge, which was acquired by AMPD Ventures in December 2021, is bringing together the experience and expertise of its founding team to develop a cohesive range of metaverse-focused technology and content opportunities, including a joint venture with 4D holographic capture pioneers, Metastage Inc, to bring their world-leading holographic capture platform to Canada. 

Metastage specializes in bringing the best of the real world into the metaverse. The company has holographically captured people (and animals) from all walks of life, from athletes and actors to executives and training instructors.

AMPD is a next-generation infrastructure company specializing in providing high-performance computing solutions for low-latency applications. With state-of-the-art, high-performance computing solutions hosted in sustainable urban data centres, AMPD is leading the transition to the next generation of computing infrastructure as “the hosting company of the metaverse”. 


Nextech AR Announces Update Of Proposed Arrangement To Spin Out Real-World Augmented Reality Spatial Mapping Platform ARway

-Increases dividend shares to Nextech shareholders again to 4 million shares +25% from the 3,200,000 previously announced
-Execution of Arrangement Agreement To Spin Out Real-World Augmented Reality Spatial Mapping Platform ARway

TORONTO, August 02, 2022--(BUSINESS WIRE)--Nextech AR Solutions Corp. ("Nextech" or the "Company") (OTCQB: NEXCF) (CSE: NTAR) (FSE: N29) is pleased to announce that it has entered into an arrangement agreement dated effective July 29, 2022 with its wholly-owned subsidiary 1000259749 Ontario Inc. ("Spinco") and a special purpose financing company, 1373222 B.C. Ltd. ("FinanceCo"). The Arrangement Agreement, a copy of which will be available under the Company’s profile on SEDAR, sets out the terms on which Nextech will complete a plan of arrangement (the "Arrangement") under the Business Corporations Act (British Columbia) whereby the Company’s ARway platform and associated assets will be spun out to Spinco.

The Arrangement, if completed, will result in, among other things, Nextech receiving an aggregate of 16,000,000 common shares of Spinco ("Spinco Shares") and an aggregate of 4,000,000 Spinco Shares being distributed directly to the shareholders of Nextech on a pro rata basis. Nextech currently intends to transfer an aggregate of 3,000,000 of the 16,000,000 Spinco Shares which it receives pursuant to the Arrangement to certain service providers in consideration of past services and other indebtedness.

Evan Gappelberg, Chief Executive Officer of Nextech, comments: "This spin-off transaction is quickly moving forward and will provide Nextech shareholders ownership in yet another exciting public company with a spatial computing platform which I believe is perfectly positioned for growth in the rapidly emerging Web3.0 world. Once approved, this spinco will be another major milestone for Nextech and its valued shareholders. This transaction is anticipated to enable the realization of the true potential of the ARway (formerly ARitize Maps) assets under a highly experienced and focused management team, unlocking the true value of our real-world spatial computing platform." He continues: "After 16 years of walking around with a smartphone in our pockets, the next generation of devices is about to emerge in the form of computer glasses. Computer glasses will become the device of choice for spatial computing and the spatial web, which is the next evolution of our computer-driven world. I believe it will become a major focus of investors when Apple Computer releases a pair of computer glasses, which is widely anticipated and expected to happen in the next 12 months. Once fully launched, ARway, spatial computing and the spatial web will have unlimited potential and use cases including healthcare, education, sports venues, hospitals, campuses, trade shows, theme parks, airports, museums, warehouse wayfinding and entertainment, forever changing our world into a metaverse with spatially aware computers."

Spinco intends to seek a listing of its common shares on the Canadian Securities Exchange ("CSE") following the completion of the Arrangement.

About ARway

ARway (previously ARitize Maps) is a mobile app and studio, all-in-one no code real-world Metaverse creation tool, with self-generating augmented reality ("AR") mapping solutions for consumers and brands alike. The ARway offering will be paired with a no-code web based Creator Portal and SDK to form the Metaverse Experience Builder Platform (MEBP). Creators can map, author and publish various Metaverse experiences ranging from wayfinding, to an array of AR experiences for exclusive branded activations.

With the ARway mobile app, anyone can spatially map their location within minutes, and populate it with interactive 3D content, augmented reality navigation, audio, text, images and more. Nextech provides a number of pre-loaded 3D objects, and creators can also upload their own OBJ/GLB files, and create their own 3D objects to populate their Metaverse. The platform has a Visual Position System ("VPS") which Nextech refers to as Mapping and Localization where users can map and enable VPS in any area through the platform. Occlusion, Depth Sensing and segmentation are also available. Users can share their Metaverse with others, creating a new level of immersive interactivity for social, branding, advertising, gaming and more Metaverse experiences. Features in the ARway Creator Portal will include:

  • AR NAVIGATION: Brands and creators can now author augmented reality navigation paths for large scale maps in real time.

  • MULTIPLE CREATORS: Creators can now collaborate in the authoring of Metaverse experiences from across the globe in real time.

  • VERSION CONTROL: The option to save map edits and version control, which will allow creators to control what changes to the maps will be released publicly.

  • ANALYTICS: Creators can gauge the success of their creations against set objectives by analyzing consumption data.

Paul Duffy, the President of Nextech, recently presented an exclusive demo of ARway on the Wall Street Reporter’s Next Super Stock Live!

The app has successfully been used and showcased at major events including:

  • "Reality Hack" at MIT: Nextech teamed up with the Massachusetts Institute of Technology (MIT) for the XR Hackathon, "Reality Hack" where ARway was used as the main Metaverse platform. MIT event organizers used ARway for their participants, providing engaging event information, immersive event updates, and indoor augmented reality wayfinding, allowing participants to navigate their way around the event. In addition, hackers got access to the ARway platform, where they used the Company’s immersive technology to build their projects. A team that used ARway received the silver prize at the hackathon.

  • RC Show by Restaurants Canada: The RC Show is one of the biggest events of the year for the foodservice and hospitality industry. As an official partner of the event, Nextech had its ARway 3D/AR technology on full display to the entire food, beverage and restaurant industry. View these video reels of ARway experiences of Bothwell Cheese and a Wine Showcase.

The ARway app has an unlimited number of use cases for augmenting physical spaces in the Metaverse, including gamification, events and tradeshows, art galleries, universities, retail stores, shopping centres, office buildings, transport, public spaces, sports stadiums, museums, restaurants, rental properties, real estate, and more. With value propositions spanning multiple industries and use cases, this app opens Nextech’s 3D/AR technology solutions to new markets, for personal and professional use by creators, brands and companies alike.

Additional Terms of the Arrangement

The Arrangement will also be subject to, among other things, approval of the CSE and the Supreme Court of British Columbia. Full details of the proposed Arrangement, the business of Spinco, the Private Placement and the ARway application will be contained in a notice of meeting and information circular, which will be mailed to shareholders and filed on SEDAR in due course.

The directors and officers of Spinco on closing of the Arrangement are anticipated to be Evan Gappelberg (director and Chairman), Paul Duffy (director and President), Belinda Tyldesley (director and Corporate Secretary) and Andrew Chan (Chief Financial Officer).

Evan Gappelberg – Director and Chairman. Mr. Gappelberg is an accomplished entrepreneur with an expertise in creating, funding and running start-ups, and he has extensive experience both as a hands-on operating executive and as a public markets professional. He is founder and currently serves as the Chief Executive Officer and a director of Nextech. He was also co-founder and CEO of an app development company which created, published and owns over 500 successful apps for both Apple's iTunes store and the Google Play store. Prior to being a successful entrepreneur, Mr. Gappelberg worked on Wall Street and has more than 25 years of extensive experience as both a hedge fund manager and Senior Vice President of Finance. He has extensive capital markets relationships, know-how and experience in all operational facets of managing a public company.

Paul Duffy – Director and President. Creator of the HumaGram and inventor of the patent for Holographic Telepresence over the Internet (TOIP), Mr. Duffy is a serial entrepreneur with over 25 years of experience in successfully starting, expanding, diversifying and selling global technology companies. He currently serves as the President and a director of Nextech. Mr. Duffy co-founded Corporate Communications Interactive (CCI) in 1992 and grew it to one of the largest online learning and communication companies in North America. With clients such as AT&T, GE, IBM, Microsoft, Pearson Education and Manulife Financial, CCI was sold to SkillPath Seminars in 2003. Mr. Duffy is also a former member of the Board of Governors for the Michener Institute for Applied Health Sciences, and holds a Bachelor of Science in Applied Computer Science from Ryerson University.

Belinda Tyldesley – Director and Corporate Secretary. Mrs. Tyldesley is the President of Closing Bell Services, a consulting company that provides corporate secretarial services. Mrs. Tyldesley has extensive experience across all sectors of the economy with regulatory compliance in all Canadian jurisdictions and reporting issuers listed on the Toronto Stock Exchange (TSX), the TSX Venture Exchange (TSX-V), Canadian Securities Exchange (CSE) and the NEO Exchange (NEO), as well as providing legal assistance and secretarial services. Mrs. Tyldesley holds an Associate Diploma in Business Legal Practice from Holmesglen College in Melbourne, Australia. She currently serves as the Corporate Secretary and a director of Nextech.

Andrew Chan – Chief Financial Officer. Mr. Chan has over 20 years of experience across finance, accounting, business analytics, and strategy, focusing on the technology and financial services sectors with half of his career serving high-growth, public technology companies. After over a decade in public accounting (including 9 years at Ernst & Young), Andrew moved into senior finance positions with Real Matters Inc. (TSX: REAL) and goeasy ltd. (TSX: GSY) – both offering technology solutions for the financial services industry – where he was involved in several financings, transactions and acquisitions with an aggregate value of well over a billion dollars. Mr. Chan has successfully integrated and led finance-related functional groups including treasury and banking, corporate reporting and budgeting and was instrumental in forging strong relationships with business unit leaders to enable successful revenue forecasting and delivery. He currently serves as the Chief Financial Officer of Nextech. Mr. Chan is a Chartered Professional Accountant (CPA, CA) and also holds a Bachelor of Commerce degree specializing in accounting and finance from the University of Toronto.

The Arrangement is subject to regulatory approval, including the approval of the CSE, court and shareholder approval of the Arrangement, and standard closing conditions. The Arrangement cannot close until the required shareholder, regulatory and court approvals are obtained. There can be no assurance that the Arrangement will be completed as proposed, or at all.

Further details about the proposed Arrangement will be provided in a disclosure document to be prepared and filed in connection therewith. Investors are cautioned that, except as disclosed in the disclosure document to be prepared in connection with the Arrangement, any information released or received with respect to the foregoing matters may not be accurate or complete and should not be relied upon.

About Nextech

Using breakthrough artificial intelligence ("AI"), Nextech is able to quickly, easily and affordably ARitize (transform) vast quantities and varieties of existing assets at scale making products, people and places ready for interactive 3D use, giving creators at every level all the essential tools they need to build out their digital AR vision in the Metaverse. Its platform agnostic tools allow brands, educators, students, manufacturers, creators, and technologists to create immersive, interactive and the most photo-realistic 3D assets and digital environments, compose AR experiences, and publish them omnichannel. With a full suite of end-to-end AR solutions in 3D Commerce, Education, Events, and Industrial Manufacturing, Nextech is well positioned to meet the needs of both commercial brands and other Metaverse contributors. Nextech funds the development of its AR and Metaverse growth initiatives through its e-Commerce platforms, which currently generate most of its revenue. Nextech's e-commerce platforms include: ("VCM"), ("IPL") and ("TruLyfe"). VCM and product sales of residential vacuums, supplies and parts, and small home appliances sold on Amazon. These e-commerce platforms serve as an incubator for developing and testing Nextech's leading edge AR, AI and machine learning applications for powering next-generation e-commerce technology.

To learn more, please follow Nextech on Twitter, YouTube, Instagram, LinkedIn, and Facebook, or visit the Company’s website:

Cautionary Statements

The CSE has not reviewed and does not accept responsibility for the adequacy or accuracy of this release.

This press release contains forward-looking information based on current expectations. Statements about the closing of the Arrangement and Private Placement, expected terms and structure of the Arrangement and Private Placement, the number of securities that may be issued in connection with the Private Placement and the parties’ ability to satisfy closing conditions and receive necessary approvals, as well as the prospective nature of the products of Nextech and Spinco and the potential growth of the associated markets on a going forward basis, are all forward-looking information. These statements should not be read as guarantees of future performance or results. Such statements involve known and unknown risks, uncertainties and other factors that may cause actual results, performance or achievements to be materially different from those implied by such statements. Although such statements are based on management’s reasonable assumptions, there can be no assurance that the Arrangement or Private Placement will occur or that, if the Arrangement and/or Private Placement does occur, it will be completed on the terms described above. None of Nextech, FinanceCo nor Spinco assumes any responsibility to update or revise forward-looking information to reflect new events or circumstances unless required by law.

In the event that insiders of Nextech receive any Spinco Shares in connection with the Transaction, it may be deemed to be a "related party transaction" within the meaning of Multilateral Instrument 61-101 Protection of Minority Security Holders in Special Transactions ("MI 61-101"). The Company will provide further details of the applicability of MI 61-101 and any requisite additional details in due course.



Nextech AR Solutions Corp.
Lindsay Betts, Investor Relations Contact
Evan Gappelberg, CEO and Director
866-ARITIZE (274-8493) Ext 7201
