Submitted by HP Inc.
By James McCall, Chief Sustainability Officer
In 2021, HP announced a range of ambitious climate action targets, including a commitment to be net zero by 2040, a full decade ahead of the Paris Agreement's 2050 goal. We’ve published our Sustainable Impact Report for over 20 years and have actively worked to reduce our footprint for decades. That’s because it’s in our company’s DNA to push toward the goal of being the most sustainable and just tech company in the world.
We’ve had many successes. Last year HP was one of only 14 companies worldwide, and the sole tech firm, to receive a prestigious Triple A rating in Climate, Water, and Forest benchmarks from the not-for-profit Carbon Disclosure Project (CDP) — our third year in a row. And because consumers care about their footprint and want their purchases to have a positive impact, whether they’re buying a new computer, printer, or coffee machine, sales related to our sustainability efforts have more than tripled, hitting $3.5 billion in fiscal year 2021.
But we realize there is more to be done to reach our goal of cutting our absolute greenhouse gas emissions 50% by 2030, which means minimizing Scope 1, 2, and 3 emissions across our end-to-end value chain.
Scope 1 emissions are from HP’s direct operations. Scope 2 are indirect emissions, such as the electricity that powers our operations. Scope 3 relates to activities not controlled by HP, such as “upstream” emissions from our supply chain and “downstream” emissions from customer use of our products. Together, Scopes 1, 2, and 3 represent the cradle-to-grave emissions of our products, and nearly all our emissions (99%) are Scope 3, with almost 70% of those coming from our supply chain and 30% from customer use.
Tackling Scope 3 emissions
With our supply chain representing over two-thirds of our emissions, our mandate was clear: To reduce the footprint of our printers, computers, and monitors, we had to reduce the footprint of the components, manufacturing, assembly, and transportation of those items. We have hundreds of suppliers, so we needed to take a data-based approach to this problem. We examined our supply chain data and found that our 30 largest partners were responsible for nearly 80% of the Scope 3 emissions from our directly contracted suppliers’ operations. If we could assist those 30 companies in becoming more eco-friendly, the results would be far-reaching. To help these suppliers reach the next level of success, we leaned into the philosophy of “Give a man a fish and you feed him for a day; teach a man to fish and you feed him for a lifetime.” Not only would helping the suppliers help HP, but it would benefit their bottom line, other customers, the communities where they operate, and the planet as a whole. So we got to work.
Because HP has stressed responsible sourcing, human rights, and sustainability as part of our supplier selection, many of our partners already had a strong base but needed extra support. Building off our real-world learnings within HP, we partnered with them to create environments where they could adopt long-lasting environmentally conscious approaches that would be best for their unique businesses.
Over the last two years, HP has brought in top-tier environmental groups such as the CDP and World Wildlife Fund (WWF) to host virtual workshops for those 30 suppliers. Participants learned about energy efficiency, renewable energy, setting science-based targets, external reporting, and more.
At the same time, we asked our partners to disclose their footprint using CDP Supply Chain reporting tools. Nearly 200 suppliers (representing over 95% of our yearly spending) are currently doing so. This transparency helps HP better understand our footprint and informs the broader tech industry utilizing this supply chain.
Tackling the rest
The results have been incredible: Twenty of our top 30 suppliers have formally committed to setting meaningful greenhouse gas reduction targets following the Science Based Targets initiative. We are also proud that 100% renewable electricity now powers the final assembly of over 95% of our worldwide PC and display products. HP and our supply chain partners are making substantial progress, but there’s still much more to do. It’s vital that we address the “upstream” supply chain adding to our footprint. However, we cannot reach net zero without also tackling the 30% of our emissions generated during the ongoing customer use of our products. The good news is that customers are actively seeking sustainable choices on shelves, online, or as part of enterprise purchases of printers, computers, and monitors. Our goal is to help them do just that—to make the home, office, or hybrid work setup of the future the most sustainable ever.
How HP is building a sustainable and ethical supply chain
Hundreds of suppliers make up HP’s supply chain — one of the largest in the IT industry — and the company’s commitment to make ethical, sustainable, and resilient products protects its business and brand, strengthens customer relationships, and creates opportunities to innovate.
HP works with peers across the IT industry to engage the entire supply chain in efforts to eradicate minerals that directly or indirectly support armed groups and to promote responsible sourcing of minerals regardless of origin. In the European Union, for example, we support the Conflict Minerals Regulation, which focuses on responsible smelter sourcing regardless of country of mineral origin, including conflict-affected and high-risk areas (CAHRAs) worldwide.
We summarize supplier performance using Sustainability Scorecards, designed to incentivize suppliers and drive ongoing improvement through consistent, comprehensive, and actionable feedback. The results contribute to a supplier’s overall procurement score, which impacts their relationship with HP and ongoing business.
In collaboration with NGO partners and other external organizations, we provide programs designed to help suppliers continually improve along their sustainability journey. These programs focus on areas such as worker well-being, rights and responsibilities, and environmental, health, and safety (EHS) awareness. In 2021, there was a 114% increase in factory participation in HP’s Supply Chain Sustainability Programs.
We partner with logistics suppliers that have the same environmental mindset as HP to provide solutions to reduce CO2 impact, such as biofuels for ocean freight and electric vehicles for road freight. We are also investigating Sustainable Aviation Fuel for air freight. Additionally, in the United States, HP is a Gold Level Sponsor of Truckers Against Trafficking (TAT), which helps combat human trafficking by educating and mobilizing our trucking supplier network, in coordination with law enforcement agencies.
Our Amplify Impact program invites partners to help drive meaningful change across the global IT industry. Partners that pledge will tap into our extensive knowledge, training, and resources to assess and work to improve their own sustainability performance. To date, 1,400 channel partners have been trained, educated, and empowered through HP Amplify Impact.
Retail sale / Customer use
HP Planet Partners is the company’s return-and-recycling program for computer equipment and printing supplies. HP ink and LaserJet cartridges returned through HP Planet Partners go through a multiphase “closed loop” recycling process. Recycled plastic from empty cartridges is used to create new Original HP cartridges and other everyday products.
Post-sale / End-of-use
We develop services that aim to keep products in use longer, offer service-based solutions, and recapture products and materials at end of use. For instance, through our HP Device Recovery Service we buy used devices securely to provide them new purpose, extend their life spans, and reduce negative environmental impact. Customers receive reverse logistics, data sanitization with a certificate, a sustainability benefit report, and the fair-market value of the device.
HP Inc. creates technology that makes life better for everyone, everywhere. Through our portfolio of printers, PCs, mobile devices, solutions, and services, we engineer experiences that amaze. More information about HP (NYSE: HPQ) is available at www.hp.com.
Sustainable Impact at HP, Inc.
Sustainable Impact is our commitment to create positive, lasting change for the planet, its people and our communities. Click here for more information on HP’s Sustainable Impact initiatives, goals and progress.
More from HP Inc.
For a long time, boutique builders have been the only way to get a desktop PC that you could quickly service yourself. Over the last few years, HP’s Omen gaming brand has made considerable strides to incorporate easily upgradable and replaceable components and standardized parts into its line of gaming PCs. Admittedly, this approach inherently risks turning a computer into yet another beige box PC that looks like every other desktop. For that reason, I was excited to hear that the upcoming Omen 45L would feature HP’s existing Omen design language, with a user-friendly, slightly custom design. On paper, the HP Omen 45L strikes a balance between mainstream accessibility and uniqueness that stands out from the rest of the field. HP sent me an Omen 45L for review, the first gaming desktop from a major OEM I’ve used in a very long time. Today I’ll share my main takeaways from my experience with the system.
As configured, my HP Omen 45L was spec’d to the gills with an Intel i9-12900K, 64GB of HyperX DDR4 RAM, 2TB WD Black NVMe SSD and an NVIDIA RTX 3090 GPU. Regardless of the configuration, it ships with an 800W 80 Plus Gold-rated PSU and a case with a Cryo Chamber—one of the main reasons why I was excited about the desktop. The Cryo Chamber isolates the 12900K and the rest of the system components, allowing them to cool separately from the radiator. This design also allows plenty of airflow around the GPU and RAM to ensure the components don’t affect the CPU’s cooling. Additionally, I can attest that the gap between the Cryo chamber and the main chamber of the case serves nicely as a handle, making it easier to carry. As configured, the system’s MSRP was $4,049.99, but it is currently on sale for $3,549.99 (as of July 15th, 2022). It was an interesting choice to see HP go with DDR4 on this system as the Intel 12th Gen processors and Z690 motherboards are also capable of DDR5. I believe that HP likely made this decision mostly due to cost.
In addition to the desktop, HP completed the Omen gaming experience by sending me the Omen 27c monitor and HyperX keyboard and mouse. As far as the Omen 27c monitor’s specs go, I think it’s a very nice monitor. However, I do think HP should offer a higher tier monitor beyond the 1440P curved and 4K 27” monitors it offers today. The 27c monitor fits in with the 25L, 30L and 40L Omen PCs. HP needs a bigger, higher quality gaming monitor, like the Omen X Emperium it developed three years ago as a part of NVIDIA’s line of BFGD TVs. While those BFGDs were admittedly a bit overpriced and underwhelming, there are just so many epic gaming monitors out there now. I’d love to see HP throw its hat into the ring with a halo monitor product.
The Design and Build Quality
The overall design and build quality of the Omen 45L was quite good for a major OEM, though the bar admittedly isn’t very high. The nice thing is that HP designed the case itself for the Omen, allowing it to really fit nicely into the overall Omen design language. The Omen 45L is elegant, but simple. The same could be said for the 27c monitor, which had lots of very square and angular aspects to it. I love the nod to the Omen brand in the RGB logo on the front along with the three RGB ring fans. It was an interesting choice for HP not to put RGB on the rear exhaust fan when the other fans and the CPU block have it; for a small added cost, lighting that fan would improve the look of the complete system. Overall, I think the design and integration of the Omen 27c monitor complements the desktop extremely well.
Featuring a blend of brushed metal with glass, the quality of the case itself felt extremely high. That said, I thought the power button was in an odd location and could have been larger and had a more tactile feel. I appreciated HP’s use of a GPU bracket to secure the GPU during shipping and to prevent sagging. However, I believe using the bracket to also route the power cables would have given the system a cleaner appearance. If not that, sleeved power cables would have been nice to improve the premium feel of the system. The previously mentioned RGB Omen-branded CPU cooler is a very nice touch and fits in very well with the overall design language. Still, if you can see it, you end up seeing a lot of the other power and fan cables that aren’t sleeved. It looks a bit like something someone would have built at home without much attention paid to the appearance of cables. This has generally been a problem with many PC OEMs of varying sizes, but boutiques tend to get this part right most of the time. I would encourage HP to look at what boutique builder Maingear has done with its Stealth technology in collaboration with Gigabyte. HP could help that approach grow into a standard, making cleaner desktops more cost-effective and common.
HP’s system design has four USB ports on the front, only two of which are 5 Gbps, and six USB ports on the back: two USB 2.0 Type-A ports, one 5 Gbps and one 10 Gbps Type-A port, and Type-C ports at the same two speeds. I think that in this regard HP is just hitting the bare minimum of what’s necessary and should try to do better. Sure, I have seen many other major OEMs do the same thing on the rear I/O, but ultimately HP Omen should be different. As a gamer myself, I can never have enough USB ports on the back of my machine. Having just built my own Z690 system, the ASUS ROG board had considerably more and faster USB ports. I think a lot of users will be pretty disappointed once they find out how much slower their PC’s I/O is compared to boutique and custom PCs and how many fewer ports they have in comparison.
Hands on Experience
The setup was extremely easy and simple, and I really liked that the system was up-to-date when it arrived. I also appreciated that it didn’t feel necessary to set up an account with the Omen Gaming Hub. Speaking of the Omen Gaming Hub, it was nice to have the ability to manage both the desktop and monitor from a single place. That includes the light controls, though I think they could be a little more user friendly and granular. As far as the Omen Light Studio specifically goes, I think it would be nice to have HyperX software built-in so that people who buy HyperX accessories for their HP Omen PC don’t need to load any additional software.
A system with these specs isn’t going to have any trouble playing the latest games, especially since it was attached to a 1440P 27” monitor. Honestly, the 3090 was almost overkill for every game at that resolution; I had no issues running all my games, including Battlefield 2042, at max settings without a single glitch. I would probably recommend the 4K Omen 27 monitor or a Samsung Odyssey G9 if you really want to push the NVIDIA RTX 3090 to its limits. The Omen 27c monitor that HP shipped to me with the system was a nice gaming display, but I was quite surprised by the amount of edge backlight bleed. I would have expected more from a high-end monitor.
The HP Omen overclocking utility uses Intel XTU to benchmark and set performance, with a single-click ‘Turbo Mode’ that allowed me to increase the RAM performance from 3200 MT/s to 3733 MT/s. This delivered a negligible performance increase compared to overclocking the CPU, which requires more granular and painstaking increases of the CPU clock speed. I don’t recommend overclocking a system you want to last you a long time; usually, the risk outweighs the benefits. That said, the Omen 45L has enough cooling for users to push the clock speed a little more; I’d like to see HP offer more automatic overclocking like we see from some of the motherboard vendors.
Regarding genuine gaming performance, I played Battlefield 2042 online in a 64-person server at ultra settings. I got an average of 105 fps, so I probably occasionally hit the refresh-rate limit of this monitor in less graphically intensive scenes. Overall, if you plan to run your games at max settings, the 1440P monitor may be a great fit if you have a powerful GPU inside like an RTX 3080 or 3090 (HP does not offer AMD GPUs on this system). Regarding temps during heavy gaming sessions, the GPU peaked at 73C and the CPU around 68C, which makes sense when you consider the sheer size of the radiator in the Omen 45L’s ‘Cryo Chamber.’ The design of the 45L enables the GPU to get ample fresh air without interfering with the CPU’s fresh air, enabling both to run cool and quiet during gaming sessions. I did not get to evaluate HP’s support as I did not encounter any issues, but I consider that to be a good thing for this review.
HP’s Omen 45L impressed me on paper when it was first announced, and it’s quite clear that it is even more impressive in real life. While the Omen 45L is quite large, that is also what enables it to be such a powerful, cool and quiet gaming powerhouse. With a top-spec machine utilizing the latest and greatest chips from Intel and NVIDIA, it is a competent gaming machine that looks great and is reasonably priced for a major OEM. That said, I think that gamers will balk at the lack of I/O on the back of the machine, which falls short of a boutique or custom-built machine. Even compared to Dell’s Alienware Aurora R13 and R14, it has considerably fewer and slower ports on the front and back. I would also like to see HP integrate HyperX more into the brand and user experience, so it is easier for users to manage all their hardware in one place. I’m genuinely excited about what HP has done with the Omen 45L, and it is among my top recommendations for a major OEM system, but as always, there is still room for improvement.
Note: Moor Insights & Strategy writers and editors may have contributed to this article.
Moor Insights & Strategy, like all research and tech industry analyst firms, provides or has provided paid services to technology companies. These services include research, analysis, advising, consulting, benchmarking, acquisition matchmaking, and speaking sponsorships. The company has had or currently has paid business relationships with 8×8, Accenture, A10 Networks, Advanced Micro Devices, Amazon, Amazon Web Services, Ambient Scientific, Anuta Networks, Applied Brain Research, Applied Micro, Apstra, Arm, Aruba Networks (now HPE), Atom Computing, AT&T, Aura, Automation Anywhere, AWS, A-10 Strategies, Bitfusion, Blaize, Box, Broadcom, C3.AI, Calix, Campfire, Cisco Systems, Clear Software, Cloudera, Clumio, Cognitive Systems, CompuCom, Cradlepoint, CyberArk, Dell, Dell EMC, Dell Technologies, Diablo Technologies, Dialogue Group, Digital Optics, Dreamium Labs, D-Wave, Echelon, Ericsson, Extreme Networks, Five9, Flex, Foundries.io, Foxconn, Frame (now VMware), Fujitsu, Gen Z Consortium, Glue Networks, GlobalFoundries, Revolve (now Google), Google Cloud, Graphcore, Groq, Hiregenics, Hotwire Global, HP Inc., Hewlett Packard Enterprise, Honeywell, Huawei Technologies, IBM, Infinidat, Infosys, Inseego, IonQ, IonVR, Infiot, Intel, Interdigital, Jabil Circuit, Keysight, Konica Minolta, Lattice Semiconductor, Lenovo, Linux Foundation, Lightbits Labs, LogicMonitor, Luminar, MapBox, Marvell Technology, Mavenir, Marseille Inc, Mayfair Equity, Meraki (Cisco), Merck KGaA, Mesophere, Micron Technology, Microsoft, MiTEL, Mojo Networks, MongoDB, MulteFire Alliance, National Instruments, Neat, NetApp, Nightwatch, NOKIA (Alcatel-Lucent), Nortek, Novumind, NVIDIA, Nutanix, Nuvia (now Qualcomm), onsemi, ONUG, OpenStack Foundation, Oracle, Palo Alto Networks, Panasas, Peraso, Pexip, Pixelworks, Plume Design, PlusAI, Poly (formerly Plantronics), Portworx, Pure Storage, Qualcomm, Quantinuum, Rackspace, Rambus, Rayvolt E-Bikes, Red Hat, Renesas, Residio, Samsung Electronics, Samsung Semi, SAP, SAS, Scale Computing, Schneider Electric, SiFive, Silver Peak (now Aruba-HPE), SkyWorks, SONY Optical Storage, Splunk, Springpath (now Cisco), Spirent, Sprint (now T-Mobile), Stratus Technologies, Symantec, Synaptics, Syniverse, Synopsys, Tanium, Telesign, TE Connectivity, TensTorrent, Tobii Technology, Teradata, T-Mobile, Treasure Data, Twitter, Unity Technologies, UiPath, Verizon Communications, VAST Data, Ventana Micro Systems, Vidyo, VMware, Wave Computing, Wellsmith, Xilinx, Zayo, Zebra, Zededa, Zendesk, Zoho, Zoom, and Zscaler. Moor Insights & Strategy founder, CEO, and Chief Analyst Patrick Moorhead is an investor in dMY Technology Group Inc. VI, Dreamium Labs, Groq, Luminar Technologies, MemryX, and Movandi.
Steve Nuñez is a technologist-turned-executive currently working as a management consultant, helping senior executives apply artificial intelligence in a practical, cost-effective manner. He takes an incremental approach to AI adoption, emphasizing the organizational change required for analytics and AI to become part of the company DNA.
Before moving to consulting, Steve led the professional services and technical pre-sales organizations in Asia Pacific for MapR, a “big data unicorn” acquired by Hewlett Packard Enterprise. While leading the field organization, Steve served clients including Toyota, Bank of China, Philips, Samsung, and the government of India on its biometric ID program.
Steve has been a contributing editor and reviewer for InfoWorld since 1999.
Reverb is HP’s second VR headset, and this time around the company is aiming mainly at the enterprise market, though it isn’t shying away from selling individual units at a consumer price point. As the highest-resolution headset presently available at that price point, it has a unique selling point among all others, though the usual compromises of Windows Mixed Reality still apply.
To be up front, the HP Reverb headset itself is a solid improvement over its predecessor by most measures. The new design is comfortable and feels higher quality. The new displays and lenses offer a considerably better looking image. And on-board audio is a huge plus. However, while its hardware has improved in many ways, it’s still a ‘Windows Mixed Reality’ headset, which means it shares the same irksome controllers as all Windows VR headsets.
Reverb’s headlining feature is its high-resolution LCD displays, which are significantly more pixel dense than any headset in its class. On paper, we’re talking about 2,160 × 2,160 per display, which is a big step up over the next highest resolution headsets in the same class—the Valve Index, showcasing a resolution of 1,440 × 1,600 per display (also LCD, which means full RGB sub-pixels), and HTC Vive Pro’s dual 1,440 × 1,600 AMOLEDs, which feature an RGBG PenTile pixel matrix. Among the three, Reverb has a little more than twice the total number of pixels.
There’s no doubt that Reverb’s displays are very sharp, and very pixel dense. It’s impossible to focus on a single pixel, and the screen door effect (unlit spaces between pixels) is on the verge of being difficult to see. It has the best resolving power of any headset in its class, which means textures, edges, and text are especially crisp.
Unfortunately, overall clarity is held back in a large way by plainly visible mura. At a glance, mura can look similar to the screen door effect (in the way that it’s ‘locked’ to your face and reduces clarity) but is actually a different artifact resulting from poor consistency in color and brightness across the display. It ends up looking like the display is somewhat cloudy.
As HP is mostly pushing Reverb for enterprise, they probably aren’t terribly concerned with this—after all, text legibility (a major selling point for enterprise customers) gets a big boost from the headset’s high resolution whether or not mura is present. For anyone interested in Reverb for visual immersion, though, the mura unfortunately hampers the clarity the headset might otherwise deliver.
There are also a few other curious visual artifacts. There’s a considerable amount of chromatic aberration outside of the lenses’ sweet spot. There’s also subtle—but noticeable—pupil swim (varying distortion across the lens that appears as motion as your eye moves across the lens). In most headsets, these are both significantly reduced via software corrections, and I’m somewhat hopeful that they could be improved with better lens correction profiles for Reverb in the future. While I couldn’t spot any obvious ghosting or black smear, interestingly Reverb shows red smear, which is something I’ve never seen before. It’s the same thing you’d expect with black smear (where dark/black colors can bleed into brighter colors when you move your head, especially white), but in Reverb it manifests most when red (or any color substantially composed of red, including white) shares a boundary with a dark/black color. In my testing this hasn’t led to any significant annoyance but, as ever, it could be bothersome in some specific content.
From a field of view standpoint, HP claims 114 degrees diagonally for Reverb, which is higher than what’s typically quoted for headsets like the Rift (~100) and Vive (~110). Nobody in the industry really seems to agree what amounts to a valid field of view measurement though, and to my eyes, Reverb’s field of view falls somewhere between the two. So whether you call it 105 or 114, Reverb is in the same field of view class as most other PC VR headsets. These are Fresnel lenses, which means they are susceptible to god rays, which are about as apparent on Reverb as on the latest headsets like the Rift S, and a bit less prevalent than on the original Rift and Vive.
Reverb’s other big feature is its major ergonomic redesign. HP has ditched the halo headstrap approach seen on every other Windows VR headset and instead opted for a much more (original) Rift-like design, including on-ear headphones. At least to my head, Reverb’s ergonomics feel like a big improvement over HP’s original Windows VR headset.
I found it quite easy to use for an hour or more while maintaining comfort. As with all headsets of this design, the trick is knowing how to fit it right (which isn’t usually intuitive). New users are always tempted to tighten the side straps and clamp the headset onto their face like a vice, but the key is to find the spot where the rear ring can grip the crown of your head, then tighten the top strap to ‘lift’ the visor so that it’s held up by ‘hanging’ from the top strap rather than by sheer friction against your face. The side straps should be as loose as possible while still maintaining stability.
I was able to get Reverb to feel very comfortable, but I’m a little concerned that the headset won’t easily accommodate larger heads or noses. Personally speaking, I don’t fall on either end of the spectrum for head or nose size, so I’m guessing I’m fairly average in that department. Even so, I had Reverb’s side straps as loose as they would possibly go in order to get it to fit well. If I had a bigger head, the straps themselves wouldn’t have more room to accommodate; all the extra space would be made up by further stretching the springs in the side struts, which would put more pressure on my face than is ideal.
I also felt like I was pushing the limits of the headphones and the nose gap. The best fit for the headphones is to have them all the way in their bottom position; if there were a greater distance between the top of my head and my ears, or if I preferred the top strap adjusted more tightly, the headphones wouldn’t be able to extend far enough down to be centered on my ears.
With the nose gap, I was feeling a bit of pressure on the bridge of my nose, and actually opted to remove the nose gasket entirely (the piece of rubber that blocks light), which gave me just enough room to not feel like the headset was in constant contact with the bridge of my nose. If you have a larger nose or a greater distance between the center of your eye and your nose’s bridge, you might find the nose gap on Reverb annoyingly small.
As with most other Windows VR headsets, Reverb lacks a hardware IPD adjustment, which means only those near to the headset’s fixed IPD setting will have ideal alignment between their eyes and the optical center of the lenses. We’ve reached out to HP to confirm the headset’s fixed IPD measurement, though I expect it to fall very close to 64mm. If you are far from the headset’s fixed figure, you’ll unfortunately lose out on some clarity.
So, if it fits, Reverb from a hardware standpoint is a pretty solid headset, and the singular choice for anyone prioritizing resolution over anything else. However, Reverb can’t escape the caveats that come with all Windows VR headsets.
Mostly that’s the controllers and their tracking. Reverb uses the same Windows VR controllers as every other Windows VR headset except for Samsung (which has slightly different controllers). Yes, they work, but they are the worst 6DOF controllers on the market. They’re flimsy, bulky, and not very ergonomic. They actually track quite well from a performance standpoint, but their tracking coverage hardly extends outside of your field of view, which means they lose tracking any time your hands linger outside of the sensor’s reach, even if that means just letting them hang naturally down by your sides.
The tracking coverage issue is primarily driven by the tracking system used in every Windows VR headset: a two-camera inside-out system. HP says Reverb’s tracking is identical to the first-generation headsets, and as such, Reverb’s two cameras lose controller tracking as often as its Windows VR contemporaries. Luckily, the headtracking itself is pretty darn good (on par with Rift S in my experience so far), and so is controller tracking performance when near the headset’s field of view. For content where your hands are almost always in your field of view (or only leave it briefly), Windows VR controller tracking can work just fine. In fact, Reverb holds up very well when playing Beat Saber on its highest difficulty because your hands don’t spend much time outside of the field of view before entering it again (to slice a block). But there’s tons of content where your hands won’t be consistently held in the headset’s field of view, and that’s when things can get annoying.
For all of its downsides, the Windows VR tracking system also means that Reverb gets room-scale 360 tracking out of the box and doesn’t rely on any external sensors. That’s great because it means relatively easy setup, and support for large tracking volumes.
The compromises on the controller design and tracking were easy to swallow considering how inexpensively you could find a Windows VR headset ($250 new in box is not uncommon). But Reverb has introduced itself as the new premium option among Windows VR headsets at $600, which shines a much brighter light on the baggage that comes with every Windows VR headset to date.
While Windows Mixed Reality—which is built into Windows and comes with its very own VR spatial desktop—is the native platform for Reverb and all other Windows VR headsets, there’s an official plugin that makes it compatible with most SteamVR content, which vastly expands the range of content available on the headset.
Disclosure: HP provided Road to VR with a Reverb headset.
Modernisation of data architecture is key to maximising value across the business.
Dietmar Rietsch, CEO of Pimcore, identifies best practices for organisations to consider when managing modern enterprise data architecture
Time and again, data has been touted as the lifeline that businesses need to grow and, more importantly, differentiate and lead. Data powers decisions about their business operations and helps solve problems, understand customers, evaluate performance, improve processes, measure improvement, and much more. However, having data is just a good start. Businesses need to manage this data effectively to put it into the right context and figure out the “what, when, who, where, why and how” of a given situation to achieve a specific set of goals. Clearly, a global, on-demand enterprise survives and thrives on an efficient enterprise data architecture that serves as a source of product and service information to address specific business needs.
A highly functional product and master data architecture is vital to accelerate time-to-market, improve customer satisfaction, reduce costs, and acquire greater market share. It goes without saying that data architecture modernisation is the true endgame to meet today’s need for speed, flexibility, and innovation. Many enterprises now find themselves in a data swamp and must determine whether their legacy data architecture can handle the vast amount of data accumulated and address their current data processing needs. Upgrading their data architecture to improve agility, enhance customer experience, and scale fast is the best way forward. In doing so, they must follow best practices that are critical to maximising the benefits of data architecture modernisation.
Below are the seven best practices that must be followed for enterprise data architecture modernisation.
Enterprises gain a potent competitive edge by enhancing their ability to explore data and leverage advanced analytics. To achieve this, they are shifting toward denormalised, mutable data schemas with fewer physical tables to maximise performance. Using flexible and extensible data models instead of rigid ones allows for more rapid exploration of structured and unstructured data. It also reduces complexity, as data managers do not need to insert abstraction layers, such as additional joins between highly normalised tables, to query relational data.
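As a minimal sketch of this idea, the hypothetical SQLite example below stores product events in one wide, denormalised table so queries need no joins or extra abstraction layers. All table and column names here are illustrative assumptions, not any vendor’s actual schema.

```python
import sqlite3

# Hypothetical example: one wide, denormalised "product_events" table
# instead of separate product/customer/event tables joined at query time.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE product_events (
        event_id   INTEGER PRIMARY KEY,
        product    TEXT,      -- denormalised: name stored inline
        category   TEXT,      -- instead of a foreign key to a lookup table
        customer   TEXT,
        event_type TEXT,
        payload    TEXT       -- semi-structured data kept as JSON text
    )
""")
conn.execute(
    "INSERT INTO product_events VALUES (1, 'LaserJet', 'printer', 'acme', 'view', '{}')"
)

# A typical exploration query needs no joins or abstraction layers:
rows = conn.execute(
    "SELECT product, COUNT(*) FROM product_events GROUP BY product"
).fetchall()
print(rows)  # [('LaserJet', 1)]
```

The trade-off is classic: duplicated values and mutable rows in exchange for simpler, faster analytical queries.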
Data architects are moving away from clusters of centralised enterprise data lakes to domain-based architectures. Herein, data virtualisation techniques are used throughout enterprises to organise and integrate distributed data assets. The domain-driven approach has been instrumental in meeting specific business requirements to speed up the time to market for new data products and services. For each domain, the product owner and product team can maintain a searchable data catalog, along with providing consumers with documentation (definition, API endpoints, schema, and more) and other metadata. As a bounded context, the domain also empowers users with a data roadmap that covers data, integration, storage, and architectural changes.
This approach significantly reduces the time spent on building new data models in the lake, usually from months to days. Instead of creating a centralised data platform, organisations can deploy logical platforms that are managed within various departments across the organisation. For domain-centric architecture, a data infrastructure as a platform approach leverages standardised tools for the maintenance of data assets to speed up implementation.
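A hypothetical example of the searchable catalog entry a domain’s product team might publish, with illustrative field names rather than any specific catalog product’s schema:

```python
# Hypothetical catalog entry for one dataset owned by an "orders" domain.
# All names, endpoints, and fields are illustrative assumptions.
catalog_entry = {
    "domain": "orders",
    "dataset": "orders.daily_shipments",
    "definition": "One row per shipment, refreshed nightly.",
    "owner": "orders-data-team@example.com",
    "api_endpoints": ["https://api.example.com/v1/orders/shipments"],
    "schema": {
        "shipment_id": "string",
        "order_id": "string",
        "shipped_at": "timestamp",
        "carrier": "string",
    },
}

def find_datasets(catalog, domain):
    """Let consumers search the catalog for datasets owned by a domain."""
    return [entry["dataset"] for entry in catalog if entry["domain"] == domain]

print(find_datasets([catalog_entry], "orders"))  # ['orders.daily_shipments']
```

The point is that consumers discover and understand a domain’s data through published metadata, rather than by asking the owning team or reading its internal tables.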
The implications of data silos for the data-driven enterprise are diverse. Silos hinder business operations and data analytics initiatives because unstructured, disorganised data cannot be reliably interpreted. Organisational silos also make it difficult for businesses to manage processes and make decisions with accurate information. Removing silos allows businesses to make more informed decisions and use data more effectively. A solid enterprise architecture must therefore eliminate silos, starting with an audit of internal systems, culture, and goals.
A crucial part of modernising data architecture involves making internal data accessible to the people who need it, when they need it. When disparate repositories hold the same data, the resulting duplicates make it nearly impossible to determine which copy is authoritative. In a modern data architecture, silos are broken down, and information is cleansed and validated to ensure that it is accurate and complete. In essence, enterprises must adopt a complete and centralised master data management (MDM) and product information management (PIM) solution to automate the management of all information across diverse channels in a single place and enable the long-term dismantling of data silos.
With the advent of real-time product recommendations, personalised offers, and multiple customer communication channels, the business world is moving away from legacy systems. For real-time data processing, modernising data architecture is a necessary component of the much-needed digital transformation. With a real-time architecture, enterprises can process and analyse data with zero or near-zero latency. As such, they can perform product analytics to track behaviour in digital products and obtain insights into feature use, UX changes, usage, and abandonment.
The deployment of such an architecture starts with the shift from a traditional model to one that is data-driven. To build a resilient and nimble data architecture model that is both future-proof and agile, data architects must integrate newer and better data technologies. In addition, streaming models, or a combination of batch and stream processing, can be deployed to meet multiple business requirements with high availability and low latency.
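As an illustrative sketch only (not any specific streaming product), the Python generator below processes events one at a time and emits a rolling average the moment each event arrives, which is the stream-processing counterpart of waiting for a nightly batch job.

```python
from collections import deque

def stream_average(events, window=3):
    """Emit a rolling average immediately as each event arrives,
    rather than recomputing everything in a later batch run."""
    recent = deque(maxlen=window)  # bounded state: only the last N events
    for value in events:
        recent.append(value)
        yield sum(recent) / len(recent)

# Hypothetical event stream, e.g. per-minute feature-usage counts.
events = [10, 20, 30, 40]
print(list(stream_average(events)))  # [10.0, 15.0, 20.0, 30.0]
```

Production systems hold similar bounded state inside a stream-processing framework, but the per-event, low-latency shape of the computation is the same.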
Data today is no longer limited to structured data that can be analysed with traditional tools. As a result of big data and cloud computing, the sheer amount of structured and unstructured data holding vital information for businesses is often difficult to access for various reasons. The data architecture should therefore be able to handle data from both structured and unstructured sources. Unless enterprises do so, they miss out on essential information needed to make informed business decisions.
Data can be exposed through APIs so that direct access to view and modify data can be limited and protected, while enabling faster and more current access to standard data sets. Data can then be reused easily across teams, accelerating access and enabling seamless collaboration among analytics teams. By doing this, AI use cases can be developed more efficiently.
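A minimal sketch of this pattern, with hypothetical data and function names: callers read a standard data set through a small read-only API and receive a JSON copy, never a handle to the underlying store.

```python
import json

# Hypothetical internal data set; in practice this would live in a
# database or data lake behind the API, not in process memory.
_PRODUCTS = {"p1": {"name": "LaserJet", "stock": 42}}

def get_product(product_id):
    """Read-only API endpoint: returns a JSON copy of the record,
    so callers cannot modify the underlying store directly."""
    record = _PRODUCTS.get(product_id)
    if record is None:
        return json.dumps({"error": "not found"}), 404
    return json.dumps(record), 200

body, status = get_product("p1")
print(status, body)  # 200 {"name": "LaserJet", "stock": 42}
```

Exposing only such an interface (typically over HTTP in a real deployment) lets teams reuse the same governed data set without each one reaching into the source system.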
Cloud computing is probably the most significant driving force behind a revolutionary new data architecture approach for scaling AI capabilities and tools quickly. The declining costs of cloud computing and the rise of in-memory data tools are allowing enterprises to leverage the most sophisticated advanced analytics. Cloud providers are revolutionising how companies of all sizes source, deploy and run data infrastructure, platforms, and applications at scale. With a cloud-based PIM or MDM, enterprises can take advantage of ready-to-use, pre-configured solutions, wherein they can seamlessly upload their product data, automate catalog creation, and enrich it for diverse marketing campaigns.
With a cloud PIM or MDM, enterprises can eliminate the need for hardware maintenance, application hosting, version updates, and security patches. From a cost perspective, the low subscription cost of cloud platforms benefits small businesses, which can scale their customer base cost-effectively. Cloud-based data platforms also bring a higher level of control over product data and security.
Businesses often have to move beyond legacy data ecosystems offered by prominent solution vendors to scale applications. Many organisations are moving toward modular data architectures that use the best-of-breed and, frequently, open source components that can be swapped for new technologies as needed without affecting the other parts of the architecture. An enterprise using this method can rapidly deliver new, data-heavy digital services to millions of customers and connect to cloud-based applications at scale. Organisations can also set up an independent data layer that includes commercial databases and open source components.
Data is synchronised with the back-end systems through an enterprise service bus, and business logic is handled by microservices that reside in containers. Aside from simplifying integration between disparate tools and platforms, API-based interfaces decrease the risk of introducing new problems into existing applications and speed time to market. They also make the replacement of individual components easier.
Modernising data architecture allows businesses to realise the full value of their unique data assets, create insights faster through AI-based data engineering, and even unlock the value of legacy data. A modern data architecture permits an organisation’s data to become scalable, accessible, manageable, and analysable with the help of cloud-based services. Furthermore, it ensures compliance with data security and privacy guidelines while enabling data access across the enterprise. Using a modern data approach, organisations can deliver better customer experiences, drive top-line growth, reduce costs, and gain a competitive advantage.
LIBERTY LAKE, Wash. - August 2, 2022 - (Newswire.com)
Vega Cloud, Inc., the cloud automation and optimization platform for highly regulated and performance-driven industries, today announced a collaboration with HP Teradici, creator of the industry-leading PCoIP® remote display protocol, to develop services that will provide businesses the freedom to efficiently deliver high-performance digital workstations, even over the most challenging network conditions.
Vega Cloud will leverage HP Anyware* (formerly Teradici CAS) and PCoIP® technology to develop two digital workstation services: Secure WorkRemote (SWR) for regulated industries and Vega Atelier remote studios for resource-intensive industries like Media and Entertainment. Vega Cloud has proven its all-inclusive digital workstations provide robust security and sustainability performance anywhere, on any device, from any network. Because digital desktops are critical business assets, healthcare and financial institution customers demand detailed evidence that they comply with applicable laws and regulations. Vega Cloud consistently exceeds these information security and performance requirements and will continue to evolve as guidelines change.
Secure WorkRemote and Vega Atelier remote studios incorporate multilayered security and IP protection. Each workstation is deployed inside a zero-trust isolation pool to trap and inoculate bad actors. The Teradici PCoIP® protocol and AES 256 encryption ensure that only pixels leave the environment. Both services have built-in optimization to auto-scale memory, graphics, and CPU to efficiently power through intensive applications.
The platforms contain a robust catalog of pre-configured desktop SKUs so administrators can quickly deploy and decommission workstations, significantly reducing operational management and support. Operational metrics and key performance indicators can be monitored by workgroup or discipline with drag-and-drop simplicity and automated workflows. Administrators can be confident that controls around desktop delivery and compliance are applied appropriately and rapidly.
"Hybrid workplaces operate on several different networks and this complexity has outstripped perimeter-based network security," said Kris Bliesner, CEO, Vega Cloud. "The strategic partnership with HP Teradici and Vega Cloud focuses on data and service protection and expands protection and performance to include all workstation assets (devices, infrastructure components, applications, virtual and cloud components) and end users. This is a paradigm-shifting approach to digital workstations powered by Cloud Infrastructure and Optimized by Vega's Platform."
"HP Anyware provides the security and performance that Vega Cloud needs for its new services to excel in the most demanding use cases," said Paul Austin, Worldwide Head of Teradici Channel Sales at HP. "Vega Cloud's capabilities align with our focus on delivering secure digital workspace solutions, and we're proud to work with them on enabling the industries we serve together to realize the benefits of hybrid work."
About Vega Cloud
Vega is a Software as a Service (SaaS) platform for enterprise cloud management. Vega unlocks the power of public cloud infrastructure, giving businesses the freedom to innovate and efficiently deliver world-class products and services. Vega combines scale management with speed and efficiency to drive business outcomes that align with end-user goals. Public Cloud infrastructure isn't just for architects or DevOps engineers anymore. Vega puts the power of fully optimized, fully managed infrastructure to work for your business.
About HP Teradici
HP Teradici is the creator of the PCoIP remote display protocol, which delivers digital workspaces from the data center or public cloud to end users with the highest levels of security, responsiveness, and fidelity. HP Anyware (formerly Teradici CAS), which won an Engineering Emmy® from the Television Academy in 2020, powers the most secure remote solutions with unparalleled performance for even the most graphics-intensive applications. HP Teradici technology is trusted by leading media companies, design houses, financial firms and government agencies and is deployed to millions of users worldwide. For more information, visit www.teradici.com or www.hp.com/anyware.
Connect with HP Teradici on Twitter @Teradici, LinkedIn, YouTube and the Teradici blog.
For more information about Vega SWR and Vega Atelier, please visit https://www.vegacloud.io/secureworkremote and https://www.vegacloud.io/vega-atelier.
Teradici and PCoIP are trademarks of HP, Inc., and are registered in the United States and/or other countries. Any other trademarks or registered trademarks mentioned in this release are the intellectual property of their respective owners.
*Network access required. HP Anyware supports Windows®, Linux® and MacOS® host environments and Windows, Linux, MacOS, iOS®, Android®, and Chrome OS® end-user devices. For more on the system requirements for installing HP Anyware, refer to the Admin Guides at: https://docs.teradici.com/find/product/cloud-access-software
Original Source: Vega Cloud Announces Global Strategic Collaboration With HP Teradici for the Next Generation of Hybrid Digital Workspace Performance and Security
Manufacturers do not always ask the right question when they start to consider additive manufacturing (AM) for particular parts, said Claudia Galdini from HP during Advanced Manufacturing last week.
“Instead of asking ‘How could I switch this design to additive?’ the question could be ‘How can I take the full benefit of this 3D printer with a design that is made exactly for that?’,” she said during her session, which provided a ‘purchaser’s checklist’ for AM.
Taking such an approach helps maximise the benefits of the technology, she said. Advantages can include a boost to sustainability thanks to low wastage and high recyclability. Printing in one piece can make parts lighter and improve performance, while eliminating joints helps prevent leaks when working with fluids.
More training is needed to help organisations optimise the introduction of AM, however. “There are so many technologies on the market that it can be overwhelming looking for the best one for you,” she said.
“Using the machines themselves does not require that much training. What we believe needs to be intensified is, for example, the ‘design for additive’ courses, things like that, because sometimes we’ve been in contact with clients that really get the full benefits of our technology, but at the beginning didn’t even think of implementing some function of additive. That’s really just a matter of growing up, both in your education and your enterprise, without really having the chance to experiment. So it’s all about trying and failing, experimenting.”
Content published by Professional Engineering does not necessarily represent the views of the Institution of Mechanical Engineers.
A wide variety of malware and vulnerability exploits can be bought with ease on underground marketplaces for about $10 (£8.40) on average, according to new statistics – only a few pennies more than the cost of London’s most expensive pint of beer.
The average price of a pint of beer has risen by 70% since the 2008 financial crisis and earlier this year, researchers at customer experience consultancy CGA found one pub in London charging £8.06. The researchers, perhaps sensibly, did not name the establishment in question.
But according to a new report, The evolution of cybercrime: why the dark web is supercharging the threat landscape and how to fight back, produced by HP’s endpoint security unit HP Wolf Security, the price of cyber criminality is tumbling, with 76% of malware advertisements, and 91% of exploits, found to retail for under $10.
Meanwhile, the average cost of an organisation’s compromised remote desktop protocol (RDP) credentials clocked in at just $5 (£4.20) – a far more appealing price for a beer as well, especially in London.
Vulnerabilities in niche systems, predictably, went for higher prices, and zero-days, vulnerabilities yet to be publicly disclosed, still fetch tens of thousands of pounds.
HP Wolf’s threat team got together with forensic specialists Forensic Pathways and spent three months scraping and analysing 35 million posts on dark web marketplaces and forums to understand how cyber criminals operate, gain each other’s trust, and build their reputations.
And unfortunately, said HP senior malware analyst and report author Alex Holland, it has never been easier or cheaper to get into cyber crime.
“Complex attacks previously required serious skills, knowledge and resource, but now the technology and training is available for the price of a gallon of gas,” said Holland. “And whether it’s having your company and customer data exposed, deliveries delayed or even a hospital appointment cancelled, the explosion in cyber crime affects us all.
“At the heart of this is ransomware, which has created a new cyber criminal ecosystem rewarding smaller players with a slice of the profits. This is creating a cyber crime factory line, churning out attacks that can be very hard to defend against and putting the businesses we all rely on in the crosshairs.”
The exercise also found many cyber criminal vendors bundling their wares for sale. In what might reasonably be termed the cyber criminal equivalent of a supermarket meal deal, the buyers receive plug-and-play malware kits, malware- or ransomware-as-a-service (MaaS/RaaS), tutorials, and even mentoring, as opposed to sandwiches, crisps and a soft drink.
In fact, the skills barrier to cyber criminality has never been lower, the researchers said, with only 2-3% of threat actors now considered “advanced coders”.
And like people who use legitimate marketplaces such as eBay or Etsy, cyber criminals value trust and reputation, with over three-quarters of the marketplaces and forums requiring a vendor bond of up to $3,000 to become a licensed seller. An even bigger majority – over 80% – used escrow systems to protect “good faith” deposits made by buyers, and 92% had some kind of third-party dispute resolution service.
Every marketplace studied also provides vendor feedback scores. In many cases, these hard-won reputations are transferable between sites, with the average lifespan of a dark web marketplace clocking in at less than three months.
Fortunately, protecting against such increasingly professional operations is, as ever, largely a case of mastering the basics of cyber security: adopting multi-factor authentication (MFA), improving patch management, limiting the risks posed by employees and suppliers, and being proactive in gleaning threat intelligence.
Ian Pratt, HP Inc’s global head of security for personal systems, said: “We all need to do more to fight the growing cyber crime machine. For individuals, this means becoming cyber aware. Most attacks start with a click of a mouse, so thinking before you click is always important. But giving yourself a safety net by buying technology that can mitigate and recover from the impact of bad clicks is even better.
“For businesses, it’s important to build resiliency and shut off as many common attack routes as possible. For example, cyber criminals study patches on release to reverse-engineer the vulnerability being patched and can rapidly create exploits to use before organisations have patched. So, speeding up patch management is important.
“Many of the most common categories of threat, such as those delivered via email and the web, can be fully neutralised through techniques such as threat containment and isolation, greatly reducing an organisation’s attack surface, regardless of whether the vulnerabilities are patched or not.”