There’s a lot to talk about after our time with Quest Pro. In our prior article we talked about the experience using Meta’s new MR headset. Here we’ll get into the nitty gritty of the headset’s capabilities and performance.
Key Quest Pro Coverage:
Quest Pro Revealed – Full Specs, Price, & Release Date
Quest Pro Hands-on – The Dawn of the Mixed Reality Headset Era
Touch Pro Controllers Revealed – Also Compatible with Quest 2
As often happens with hands-on demos, I wasn’t able to sit down and really test everything I would have liked about the headset (on account of being walked through several demos in a row), but I soaked up as much as I could about how the Quest Pro looked and felt to use.
One of my biggest surprises about the headset is that the resolving power isn’t actually much better than Quest 2’s. That made sense once Meta revealed that Quest Pro shares nearly the same resolution as Quest 2. Granted, the company claims the lenses have greater clarity at the center and periphery, but in any case it isn’t the kind of leap that’s going to make the headset great for reading text or using it like a computer monitor.
I suspect this decision was related to the resolution of the passthrough cameras (not to mention the extra processing power required to drive the headset’s 10 on-board cameras). After all, if you had a super high resolution display but lower resolution cameras, the outside world would look blurry against the sharper virtual objects.
Speaking of passthrough… while Quest Pro finally gets a full-color view, it’s not quite perfect. Because of all the adjustments the headset is doing to render a geometrically correct passthrough view, the implementation ends up with a few artifacts that manifest as color fringing around certain objects—like a faint outline of missing color.
My best guess is this happens because a mono RGB camera supplies the color information, which is then projected over top of a stereo view… at some angles the color information is simply not present. This didn’t defeat the purpose of passthrough AR by any means (nor dampen my appreciation for finally seeing in color), but it’s something I’d like to see fixed in future headsets.
As for the lenses, there’s no doubt that Meta has managed to compact the optical stack while retaining essentially the same kind of performance as Quest 2… or potentially better; Meta says Quest Pro has up to 75% better contrast and a 30% larger color gamut thanks to 500 local dimming elements in the backlight, though I haven’t gotten to put this to the test just yet.
Similarly, the removal of Fresnel lenses should eliminate glare and god rays in theory, but I wasn’t able to pull up the right content to see whether they’ve been replaced with other kinds of artifacts. One thing I did notice is that the lenses can reflect ambient light if angled toward direct light sources… luckily the headset comes with peripheral blinders if you want to cut this down and be more immersed.
Quest Pro isn’t just a big upgrade to the headset; the accompanying Touch Pro controllers have some interesting capabilities that I didn’t expect.
With essentially the same handle as before, they still feel great in the hand, maybe even better than my favorite Touch controller (the original Touch v1) thanks to a closer center of gravity and a nice weight from an on-board rechargeable battery and improved haptic engines.
The single biggest improvement to the controllers is surely the addition of on-board inside-out tracking. Not only does this remove the ring to make the controllers more compact and less likely to bump into each other, but now they can track anywhere around you, rather than going haywire if they leave sight of the headset’s cameras for too long. It’s early to say (and Meta has made no mention of it) but this could even open up the controllers to functioning like extra body trackers.
I didn’t get to put the controller tracking to the test with something demanding like Beat Saber, but until I can, I’m hoping Meta was smart enough to make sure these could hold up to the Quest platform’s most popular game.
The new capabilities on the Touch Pro controller are hit or miss for me so far.
First is the pinch sensor, which lets you push down on the thumb rest to register an input. Combined with squeezing the index finger, this creates a pretty natural pinch gesture. It feels a little novel, but I could see it being used as an easy way to emulate pinch inputs in hand-tracking apps without developers needing to make changes. The gesture also provides a clearer single point of interaction than pulling a trigger or pressing a button, both of which are often abstracted from the real position of your fingers.
As for the attachable stylus tip which goes on the bottom of your controller… I’m not really sold. Personally I find holding the controller upside down to use as a bulbous white-board marker to be fairly unergonomic. It’s a neat idea in theory—and I love that the stylus tip is pressure sensitive for added control—but I’m not sure the headset yet has the precision needed to really pull this off.
In both demos that used the controller as a stylus, the virtual surface I was expected to draw on had drifted just far enough from the physical surface it was supposed to represent that the stylus didn’t think it was close enough to start making a mark… even though I was physically touching the controller to the surface.
That might be an implementation issue… after all, the pressure-sensitive tip should be able to clearly inform the system of when you are making contact and when you aren’t… but even so, once I recalibrated the surfaces and tried to draw again, I saw the surface drift fairly quickly (not by much, but even a centimeter of mismatch makes using a stylus feel weird). This might work fine for coarse annotations, like a shape here, or a few words there, but it’s far from something like a Wacom tablet.
As for the haptics… in my short time with them it seemed like there are multiple haptic engines inside, making the controller capable of a broader range of haptic effects, but there wasn’t a moment where I felt particularly wowed compared to what I’ve felt on Quest 2.
Granted, haptics are often one of the most underutilized forms of XR output, and often the last to be considered by developers given the difficulty of authoring haptic effects and the peculiarities of different haptic engines in different controllers. I hope this will become a more obvious upgrade as developers have more time to play with the system and find where its capabilities are best employed.
One last thing about the Touch Pro controllers… they’re also compatible with Quest 2 (unfortunately not Quest 1). Not only does this reduce the potential for fragmentation between different controller capabilities, but it means some of the new goodness of Quest Pro can come to Quest 2 users who don’t want to drop $1,500 on the complete package.
I definitely give Meta credit here for a pro-customer move. Now if they really want my praise… it would be amazing if they made Touch Pro controllers compatible with any headset. In theory—because the controllers track their own position and don’t rely on unique LED patterns or headset-based CV processing—they should be able to simply report their position to a host system, which can integrate the information as needed. It’s a stretch, but it would be great if Meta offered the Touch Pro controllers’ capabilities to any headset that wanted to implement them, creating a larger ecosystem of users with matching controller capabilities.
Quest Pro is no doubt more compact and balanced than any headset Meta has made previously, but it’s also heavier at 722 grams to Quest 2’s 503 grams.
Granted, this is another instance where Meta’s decision to put a cheap strap on Quest 2 comes back to bite them. Despite not being able to say that Quest Pro is lighter, it might in fact be the more comfortable headset.
While ergonomics are hard to judge without hours inside the headset, what’s immediately clear is that Quest Pro is more adjustable, which is great. The headset has both a continuous IPD adjustment (supporting 55–75mm) and a continuous eye-relief adjustment. Not to mention that the on-board eye-tracking will tell you when you’ve got the lenses in the ideal position for your eyes. Ultimately this means more people will be able to dial in the best position for both visuals and comfort, and that’s always a good thing.
But, it has to be said, I have an issue with ‘halo’ headstraps generally. The forehead pad has a specific curve to it and thus wants to sit on your forehead in the spot that best matches that curve… but we all have somewhat different foreheads, which means that specific spot will be different from user to user. With no way to adjust the lenses up and down… you might have to pick between the ‘best looking’ and ‘most comfortable’ position for the headset.
I’ll have to spend more time with Quest Pro to know how much of a problem this is. And while I’d love to see other headstrap options as accessories, a halo-style strap might be a necessity for Quest Pro considering how much of the face the headset is attempting to track with internal cameras.
With digital disruptors eating away at market share and profits hurting from prolonged, intensive cost wars between traditional competitors, businesses had been looking to reduce their cost-to-income ratios even before COVID-19. When the pandemic happened, the urgency hit a new high. On top of that came the scramble to digitize pervasively in order to survive.
But there was a problem. Legacy infrastructure, being cost-inefficient and inflexible, hindered both objectives. The need for technology modernization was never clearer. However, what wasn’t so clear was the path to this modernization.
Should the enterprise rip up and replace the entire system or upgrade it in parts? Should the transformation go “big bang” or proceed incrementally, in phases? To what extent should they shift to the cloud, and to which type? And so on.
The Infosys Modernization Radar 2022 addresses these and other questions.
Currently, 88% of technology assets are legacy systems, half of which are business-critical. An additional concern is that many organizations lack the skills to adapt to the requirements of the digital era. This is why enterprises are rushing to modernize: The report found that 70% to 90% of the legacy estate will be modernized within five years.
Different modernization approaches have different impacts. For example, non-invasive (or less invasive) approaches involve superficial changes to a few technology components and impact the enterprise only in select pockets, so they entail less expenditure. These methods may be considered when the IT architecture is still acceptable, the system is not overly complex, and the interfaces and integration logic are adequate.
But since these approaches modernize minimally, they are only a stepping stone to a more comprehensive future initiative. Some examples of less and non-invasive modernization include migrating technology frameworks to the cloud, migrating to open-source application servers, and rehosting mainframes.
Invasive strategies modernize thoroughly, making a sizable impact on multiple stakeholders, application layers and processes. Because they involve big changes, like implementing a new package or re-engineering, they take more time and cost more money than non-invasive approaches and carry a higher risk of disruption, but also promise more value.
When an organization’s IT snarl starts to stifle growth, it should look at invasive modernization by way of re-architecting legacy applications to cloud-native infrastructure, migrating traditional relational database management systems to NoSQL-type systems, or simplifying app development and delivery with low-code/no-code platforms.
From the above discussion, it is apparent that not all consequences of modernization are intentional or even desirable. So that brings us back to the earlier question: What is the best modernization strategy for an enterprise?
The truth is that there’s no single answer, because the choice of strategy depends on the organization’s context, resources, existing technology landscape, and business objectives. However, if the goal is to minimize risk and business disruption, then some approaches are clearly better than others.
In the Infosys Modernization Radar 2022 report, 51% of respondents taking the big-bang approach frequently suffered high levels of disruption, compared with 21% of those who modernized incrementally in phases. This is because big bang calls for completely rewriting enterprise core systems, an approach often likened to changing an aircraft engine mid-flight.
Therefore big-bang modernization makes sense only when the applications are small and easily replaceable. But most transformations entail bigger changes, tilting the balance in favor of phased and coexistence approaches, which are less disruptive and support business continuity.
Phased modernization progresses towards microservices architecture and could take the coexistence approach. As the name suggests, this entails the parallel runs of legacy and new systems until the entire modernization — of people, processes and technology — is complete. This requires new cloud locations for managing data transfers between old and new systems.
The modernized stack points to a new location with a routing façade, an abstraction that talks to both modernized and legacy systems. To embrace this path, organizations need to analyze applications in-depth and perform security checks to ensure risks don’t surface in the new architecture.
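To make the routing façade concrete, here is a minimal sketch in Python. The URLs, module names, and MIGRATED_MODULES set are hypothetical placeholders for illustration (not part of any Infosys offering); a production façade would more likely live in an API gateway or reverse proxy than in application code.

```python
# Minimal sketch of a routing façade for phased (coexistence) modernization.
# All names and URLs below are hypothetical placeholders.

LEGACY_URL = "https://legacy.internal.example.com"   # old core system
MODERN_URL = "https://api.example.com"               # new cloud-native stack

# Modules that have already been re-architected as microservices.
MIGRATED_MODULES = {"quotes", "policies"}

def route(module: str, path: str) -> str:
    """Return the backend URL that should serve this request.

    Traffic for migrated modules goes to the modernized stack; everything
    else still goes to the legacy system, so both run in parallel until
    the transition is complete.
    """
    base = MODERN_URL if module in MIGRATED_MODULES else LEGACY_URL
    return f"{base}/{module}/{path.lstrip('/')}"

if __name__ == "__main__":
    print(route("quotes", "v1/123"))  # served by the modernized stack
    print(route("claims", "v1/456"))  # still served by the legacy system
```

The design point is that callers never need to know which system serves them, so modules can migrate one at a time without disrupting the business.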
Strategies such as the Infosys zero-disruption method frequently take the coexistence approach, since it suits more invasive types of modernization. Planning the parallel operation of old and new systems until IT infrastructure and applications complete their transition is critical.
The coexistence approach enables a complete transformation to make the application scalable, flexible, modular and decoupled, utilizing microservices architecture. A big advantage is that the coexistence method leverages the best cloud offerings and gives the organization access to a rich partner ecosystem.
An example of zero-disruption modernization that I have led is the transformation of the point-of-sale systems of an insurer. More than 50,000 rules (business and UI) involving more than 10 million lines of code were transformed using micro-change management. This reduced ticket inventory by 70%, improved maintenance productivity by about 10% and shortened new policy rollout time by about 30%.
Technology modernization is imperative for meeting consumer expectations, lowering costs, increasing scalability and agility, and competing against nimble, innovative next-generation players. In other words, it is the ticket to future survival.
There are many modernization approaches, and not all of them are equal. The big-bang approach, for example, while quick and sometimes more affordable, carries a significant risk of disruption. Since a single hour of critical system downtime can cost as much as $300,000, maintaining business continuity during transformation is a top priority for enterprises.
The phased coexistence approach mitigates disruption to ensure a seamless and successful transformation.
Gautam Khanna is the vice president and global head of the modernization practice at Infosys.
By now, building a tech stack is a given for restaurants of all sizes, but as we’ve previously stated, investing in technology does not take a one-size-fits-all approach. While national restaurant brands like Domino’s and Chipotle have become leaders in cutting-edge foodservice technology, how does a smaller brand compete with fewer resources? According to Peter Baghdassarian, co-owner of seven-unit Armenian kabob chain, Massis Kabob, you have to do your homework, know your company’s needs, and not be afraid of investing ahead of the curve.
Baghdassarian said that Massis Kabob, which has been serving Mediterranean food in California mall food courts since 1976, was one of the first restaurants to invest in digital video menu boards around 17 years ago.
“This was a huge deal at the time, and we had to partner with a company in Taiwan to make them because it was so hard to get them here,” he said. “The menu boards increased our business because our food was so hard to explain through regular menu displays.”
When Baghdassarian’s father first opened Massis Kabob, most customers were unfamiliar with Mediterranean food and had never eaten kabobs, Baghdassarian said, so eventually the digital menu boards helped to explain the menu of shish kabobs, pita wraps, and combo plates. This is exactly how Baghdassarian approaches all aspects of food service technology: will this make my employees’ lives easier? Is it solving a problem? If not, it goes in the garbage, he said.
Currently, Massis Kabob has a very simple tech stack: they use Toast for POS and third-party delivery integration, Incentivio for building out and operating their loyalty app, and are currently looking for a new partner for scheduling software.
“When we wanted to swap out our POS system, we did not have a full-time IT guy to help us like a big chain would,” Baghdassarian said. “If the tech requires a mouse and a computer in an office, my manager is not going to use it. We don’t have that kind of setup. It has to be on a phone or tablet and has to be very intuitive.”
While it might sound like Baghdassarian is skeptical of a lot of technology, he just knows exactly what he wants and what works for a counter-service kabob restaurant. Massis Kabob opened a new flagship location last month in Glendale. The 3,500-square-foot store is the largest of the chain’s seven locations and can accommodate on-premises patrons as well as lanes for pickup and delivery. Unlike many of Massis Kabob’s larger restaurant industry colleagues, however, the off-premises-focused location does not have a drive-thru lane because “that’s just not practical in Los Angeles.”
As the restaurant industry has become more off-premises-focused and people increasingly discover and interact with restaurants through apps, Baghdassarian said it has been an adjustment for the brand. “We’re not flipping burgers,” he said, adding that making kabob orders from scratch takes more effort and exact timing, and it’s not a continuous process like other types of quick service.
“We tell people their food will be ready in 12 minutes, because otherwise it will be sitting and getting cold,” Baghdassarian said. “We call our customers personally when their food is ready for pick up, and even though it’s one more step, it lets us do more quality control than our competitors.”
Of course, this process is much easier now with the launch of Massis Kabob’s new app, which became available this year and gives the brand a one-on-one relationship with customers through crucial customer data.
“We’re glad to be working [with Incentivio] because I avoid having to hire four guys to sit around in an office doing data analysis,” Baghdassarian said. “McDonald’s might have a group of data scientists doing custom loyalty stuff for them, but I’m not going to have the time or half a million dollars to spend on that.”
As for Baghdassarian’s final bits of advice for investing in tech as a smaller restaurant brand? Don’t necessarily go for the biggest companies in technology just because you recognize the names, make sure the technology is user-friendly, and take advantage of your more compact size to add more personal touches:
“One time we had a tech failure on the part of a third-party delivery company and a customer’s food didn’t get picked up by a driver,” he said. “We looked him up in the database and noted that he was a regular customer and had spent thousands of dollars with us. So, I picked up the bag of food and drove 20 minutes to hand-deliver it myself.”
Contact Joanna at [email protected]
Find her on Twitter: @JoannaFantozzi
Rajat Bhargava is an entrepreneur, investor, author and currently CEO and cofounder of JumpCloud.
From the 1980s until the mid-2000s, the monoculture around Microsoft ruled. Users logged into Windows-managed computers and used Office and Windows File Server; businesses relied on Microsoft Active Directory (AD) to manage user identity and access.
Then, IT evolved. On-premises environments and closed systems gave way to the flexibility of the cloud. Organizations adopted Mac- and Linux-based systems. Software as a service (SaaS) environments exploded. Data centers started to be replaced by infrastructure as a service (IaaS) providers. Now, Gartner predicts that over 95% of new digital workloads will be deployed on cloud-native platforms by 2025, a dramatic increase from 30% in 2021.
With cloud servers preferred for data processing and storage, web applications now dominate the market. This shift happened in part because wired connections gave way to wireless networks and people became more mobile through smartphones; meanwhile, Google Workspace (aka G Suite, Google Apps) and M365 (aka Office 365) became as popular as machine-based Office applications in the enterprise space.
In this environment, organizations can’t be bound to anachronistic approaches as businesses shift to the cloud and globally distributed workforces. Now’s the time for companies—especially small and medium-sized enterprises (SMEs)—to approach IT with an open mind and an open approach.
“Open” in this context doesn’t mean porous or loose; it represents scalability, flexibility and agility in terms of changes in technology and developments in the stack. An open approach improves end user experience, worker productivity and satisfaction. An open approach to IT can be a critical tool in helping organizations establish zero-trust security without sacrificing the agility and flexibility made possible by the cloud.
In this article, I’ll offer some tips to getting started with this approach.
Modernizing IT stacks means making sure that work—remote and hybrid—functions well. Employees care about doing their job; they want easy access to the resources they need. IT teams want a similarly streamlined experience and assurance that company data remains secure without impacting productivity. My company’s survey of 506 SME IT admins found that nearly 75% prefer a single solution for managing employee identities, access and devices over juggling a number of different solutions. An open directory platform approach incorporates a cloud-hosted “virtual” domain that meets this need, offering the flexibility and security necessary to support modern workplaces.
This means creating an IT environment that consumes identities wherever they live: not just employee identities but also device identities. This allows your system to receive information from authorized sources anywhere. On the outgoing side, it means creating a single source of user identity that can be propagated to other devices, other users or an authorized network.
Identity as a service and cloud directories are vital tools that enable an open approach. Look for those that offer fluidity and the flexibility to change resources any time (for example, from M365 to Google Workspace or vice versa).
Flexible Security Layers
Instead of traditional perimeters, an open approach favors the creation of virtual offices and security perimeters around each employee—and whatever devices they use. Being open doesn’t equate to a cavalier security approach; it’s a way to offer authorized access to resources anywhere, in a manner that is convenient and tracked for compliance and overall visibility.
Security layers can evolve with each organization’s need and should include:
• Identity layer: A cloud directory houses authentication credentials and establishes centralized access control across user identity, admin access, service accounts and machines. Centering identity within a cloud directory allows SME teams to draw a security perimeter around each employee, enabling updates without disruption and providing access to on-prem and cloud-based resources.
• Device layer: Most IT environments operate within an ever-evolving state of company-issued, personal and mobile devices running some combination of Mac, Windows or Linux systems. In this complicated device ecosystem, organizations should extend user identity to establish device trust, meaning that a device is known and its user is verified. A mobile device management solution (MDM) is one option that can install a remote agent to handle basics—including multifactor authentication (MFA) and permissions—zero-touch onboarding and remote lock, restart or wipe. Determine the control level you need in your device environment, factoring in options like how you honor employee device choice and how you manage your bring your own device (BYOD) policy.
• IT resource layer: In office environments, employees generally use a form of single sign-on (SSO) to log into their desktop at designated workstations and then get instant access to applications and shared files and servers. In remote, hybrid and other modern IT environments, SSO should include everything from SaaS apps to systems, files, infrastructure and shared networks. Some organizations use SSO solely for web-based applications, while others centralize identity and extend it to virtually any IT resource through authentication protocols like LDAP, SAML, OpenID Connect, SSH, RADIUS and REST (a toy sketch of the centralized-identity idea follows this list).
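To illustrate the trust relationship behind centralized identity (one directory authenticates the user once, and every resource verifies the result instead of keeping its own password store), here is a deliberately simplified, standard-library-only Python sketch. It is not how any particular product or protocol works; real deployments use standards like SAML or OpenID Connect, and the secret, user name and token lifetime below are hypothetical:

```python
# Toy illustration of a single source of identity: a central directory
# issues one signed token, and any resource (app, server, network gateway)
# can verify it without keeping its own password store. Real deployments
# use standard protocols (SAML, OpenID Connect, RADIUS); this stdlib-only
# sketch only shows the trust relationship. The secret is hypothetical and
# would be shared with resources out of band.
import hashlib
import hmac
import time

DIRECTORY_SECRET = b"hypothetical-shared-secret"

def issue_token(user: str, ttl_seconds: int = 3600) -> str:
    """The directory signs (user, expiry) after authenticating the user."""
    expiry = str(int(time.time()) + ttl_seconds)
    payload = f"{user}|{expiry}"
    sig = hmac.new(DIRECTORY_SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}|{sig}"

def verify_token(token: str) -> str | None:
    """Any resource checks signature and expiry; returns the user or None."""
    user, expiry, sig = token.rsplit("|", 2)
    payload = f"{user}|{expiry}"
    expected = hmac.new(DIRECTORY_SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if hmac.compare_digest(sig, expected) and int(expiry) > time.time():
        return user
    return None

token = issue_token("alice")
print(verify_token(token))  # "alice": the same token works at every resource
```

Because every resource shares trust in the directory’s signature rather than managing its own credentials, a single identity can travel across apps, devices and networks, which is the essence of extending SSO beyond web applications.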
Given security, ongoing monitoring and compliance needs, visibility is critical to an open IT approach. Considering the breadth of access transactions, businesses should look for a holistic solution with broad coverage.
Basic event logging data is table stakes, and IT solutions should include a method for capturing discrete and unique log formats. That includes logs from SSO and from cloud RADIUS for network connection, LDAP and device connections—any log format for resources deployed in your stack.
Log analysis and management solutions can be expensive, challenging to implement and hard on admins who must manage custom feeds for each authentication protocol. So consider options that offer a wide range of analysis by enriching raw data with additional data points and sessionizing it through post-processing. Such information provides admins with broad insight across their entire IT environment, not just into a particular service or user.
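As a rough illustration of what enriching and sessionizing raw events might look like, here is a short Python sketch. The event fields, directory lookup and 30-minute inactivity gap are hypothetical choices for the example, not the behavior of any specific logging product:

```python
# Toy example: enrich raw auth events with directory data, then group each
# user's events into "sessions" split wherever activity pauses too long.
from itertools import groupby

SESSION_GAP = 30 * 60  # seconds of inactivity that starts a new session

# Hypothetical directory used for the enrichment step.
DIRECTORY = {"alice": {"dept": "eng"}, "bob": {"dept": "sales"}}

raw_events = [
    {"user": "alice", "ts": 1000, "source": "ldap"},
    {"user": "alice", "ts": 1500, "source": "sso"},
    {"user": "alice", "ts": 9000, "source": "radius"},  # starts a new session
    {"user": "bob",   "ts": 1200, "source": "sso"},
]

def sessionize(events):
    """Enrich each event, then split per-user streams into sessions."""
    sessions = []
    ordered = sorted(events, key=lambda e: (e["user"], e["ts"]))
    for user, stream in groupby(ordered, key=lambda e: e["user"]):
        current = []
        for event in stream:
            event = {**event, **DIRECTORY.get(user, {})}  # enrichment
            if current and event["ts"] - current[-1]["ts"] > SESSION_GAP:
                sessions.append(current)
                current = []
            current.append(event)
        sessions.append(current)
    return sessions

for session in sessionize(raw_events):
    print(session[0]["user"], f"{len(session)} event(s)")
```

The payoff of post-processing like this is that an admin sees coherent per-user activity spanning LDAP, SSO and RADIUS, rather than disconnected entries from each service.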
For many organizations, extending closed legacy systems was a necessity. In the age of hybrid and remote work, it’s proving more of a liability than an asset. An open approach allows companies to embrace a diverse, modern IT environment that can keep pace with what users need, keeping them and company data secure at every access point.
Major chemical companies are backing pyrolysis plants that convert plastic waste into hydrocarbon feedstocks that can be turned into plastics again. The process uses high temperatures in the absence of oxygen to break down plastics into a mixture of smaller molecules known as pyrolysis oil. But the practice has its critics, according to a cover story in Chemical & Engineering News.
Proponents of pyrolysis argue that the process can make up for the shortcomings of traditional recycling, which captures only about 9% of plastics in the U.S., according to the U.S. Environmental Protection Agency. But environmentalists are not yet convinced, and a growing number of jurisdictions, such as California, don't consider pyrolysis recycling at all, writes Senior Editor Alex Tullo. Critics say that pyrolysis facilities can't actually accept the mixed plastic waste that they claim to, as residual contaminants gum up the process too much. A second charge is that pyrolysis is really just incineration. Another concern is scale. Pyrolysis and other forms of chemical recycling have roughly 120,000 t of capacity currently onstream in the U.S.—a minuscule fraction (roughly 0.2%) of the 56 million t of overall plastics production in North America in 2021.
Industry executives say they are more committed than ever to recycling and are eager to practice pyrolysis at large scale. Firms are building facilities that are bigger than before to increase capacity. Many companies are attempting to take in more mixed waste, with approaches such as using catalysts and adsorbents to filter out particulate matter and eliminate the most reactive compounds from the feedstock stream. And interest in pyrolysis is taking off, with petrochemical companies building infrastructure to process the products of pyrolysis plants and large engineering companies licensing technology to third parties that want to get into the business. How the technology works in the real world will go a long way to determining the public's perception of the plastics industry.
Citation: Controversial approach aims to expand plastics recycling (2022, October 12) retrieved 17 October 2022 from https://phys.org/news/2022-10-controversial-approach-aims-plastics-recycling.html
When people believe that a door is closing—that they have a limited amount of time left to enjoy something, such as dining out or traveling—they gravitate to the comfort of something familiar rather than the excitement of something new, according to research published by the American Psychological Association.
In eight experiments with nearly 6,000 total participants, researchers explored whether people tend to prefer novel, exciting experiences, such as trying a new restaurant, or familiar ones, such as returning to an old favorite—and whether those preferences shift with the amount of time people believe that they have left to enjoy similar experiences.
The research was published in the Journal of Personality and Social Psychology.
Previous research has found that, on average, people tend to opt for novel and exciting experiences over familiar ones. They would rather enjoy a new movie than rewatch something they've already seen, for example, given equal access to both. However, study authors Ed O'Brien, Ph.D., and Yuji Katsumata Winet, of the University of Chicago Booth School of Business, suspected that "perceived endings" might affect those choices by nudging people to return to a meaningful old favorite.
In the first experiment, the researchers asked 500 online participants and 663 college and business school students to read hypothetical scenarios in which they were given the choice between a new experience and a familiar, beloved one—such as reading a new novel versus rereading an old favorite, or visiting a new city versus revisiting a city they loved.
Half the participants were simply asked to make the choice, while the other half were instructed to imagine that it was the last chance that they would have for a while to travel or read a novel. Overall, across all the situations, participants in the "endings" groups were more likely to choose familiar activities compared with participants in the control groups.
In the next set of experiments, the researchers moved beyond hypothetical questions to explore people's behavior in lab and real-life settings. In one, for example, participants were told they would be given a gift card to a restaurant and that the gift card needed to be used in the next month.
Then, half the participants were told to reflect on how few opportunities they would have for going to restaurants in the next month and specific things that might prevent them from going to restaurants. Finally, participants were asked whether they would prefer a gift card to a restaurant they'd visited before or one that was new to them. Overall, 67% of the participants in the "endings" condition preferred a gift certificate to a familiar restaurant, compared with just 48% of those in the control condition.
Finally, the researchers explored why perceived endings seemed to push participants toward familiar things. They found evidence that it was not simply because the familiar experiences were a safe bet that participants knew they would enjoy, but also because they were more likely to find those familiar things personally meaningful.
"Our findings unveil nuance to what people really mean by ending on a high note," said Winet. "Endings tend to prompt people to think about what's personally meaningful to them. People like ending things on a meaningful note as it provides psychological closure, and in most cases old favorites tend to be more meaningful than exciting novelty."
"The research is especially interesting because, on the surface, it runs counter to the idea of the bucket list, whereby people tend to pursue novelty—things they've never done but have always wanted to do—as they approach the end of life," O'Brien said. "Here we find that, at least in these more everyday ending contexts, people actually do the opposite. They want to end on a high note by ending on a familiar note."
The researchers noted that the findings could help people better structure their time to maximize their enjoyment of experiences, for example by visiting an old favorite attraction on the last rather than the first day of a vacation. Retailers and marketers, too, could take advantage—a café slated to close for renovations might put more of its favorite dishes on the menu rather than try new items for sale.
And perhaps, according to the researchers, such psychological framings could be useful for addressing larger societal problems. "Nudging people toward repeat consumption by emphasizing endings and last chances could subtly encourage sustainable consumption by curbing the waste that necessarily accumulates from perpetual novelty-seeking," Winet said.
Citation: When endings approach, people choose the familiar over the novel (2022, October 6) retrieved 17 October 2022 from https://medicalxpress.com/news/2022-10-approach-people-familiar.html
Labskin is delighted to announce our selection by The Intelligence Advanced Research Projects Activity (IARPA), part of the US Office of the Director of National Intelligence, to join a team of experts to develop new ways to evaluate radiation exposure in civilians and military personnel.
Labskin is a key member of a consortium selected to develop these technologies in collaboration with a multidisciplinary team of experts, including professors from Columbia University in New York, Georgetown University in Washington DC, and the Georgia Institute of Technology, scientists from the American Type Culture Collection (ATCC), and computer scientists and researchers from project lead ARETE Associates, a defense contractor specializing in sensing solutions and machine learning algorithms.
In a project worth $800k, starting immediately, Labskin will help develop this technology into minimally invasive testing for radiation under a program known as Targeted Evaluation of Ionizing Radiation Exposure (TEI-REX), which aims to develop novel approaches to evaluating organisms exposed to low-dose ionizing radiation. Labskin’s expertise in skin research and microbiology, coupled with Skin Trust Club’s, is essential to the project.
The goal of the project is to develop a new biodosimetry standard that could be applied to maintain the safety of military and civilian populations working or living in close proximity to ionizing radiation sources, such as nuclear plants, nuclear vessels and ammunition. Labskin’s contribution is the creation of a simple non-invasive swab test that collects signatures from the skin surface, allowing machine learning algorithms to detect and quantify the impact of any amount of radiation exposure on the skin microbiome.
“This is a unique opportunity to revolutionize the way we test for radiation exposure. Labskin and Skin Trust Club are at the forefront of an increasing number of cutting-edge technologies that are changing our world. This technique can also be applied to detect the impact of pollution or a variety of chemicals on the environment. Furthermore, this type of testing could be used to detect exposure to these kinds of events not only in humans but also in complex ecological systems such as soil, crops or sediments.”
David Caballero-Lima, Chief Scientist, Labskin
“We are committed to the success of this very exciting project. The inclusion of artificial intelligence and the opportunity to work with ARETE Associates, with their vast experience in complex AI applications, will result in further advances in how AI can be used in conjunction with our skin model at scale. This project coincides with the completion of the expansion of our US labs in Delaware, which will greatly help the implementation of this large project. We believe our proven ability to transition technology to the field with Skin Trust Club will be invaluable as we progress this project.”
Colin O’Sullivan, Chief Information Officer, Labskin
Oct 07, 2022, 12:30am (updated Oct 07, 2022)
Fairlawn is taking a proactive approach to cracking down on catalytic converter thefts as the state continues to see an uptick.
Fairlawn officials say they are etching serial numbers onto the part for free.
Catalytic converters are often stolen because they contain precious metals.
Officials say scrap yards won't take them if they have serial numbers on them.