The cult classic movie Office Space is a scathing critique of life for software engineers in a cubicle farm, and it did get a lot of things right even if it didn’t always mean to. One of those is the character of Tom Smykowski whose job is to “deal with the customers so the engineers don’t have to”. The movie treats Tom and his job as a punchline in a way, but his role is actually very important for most real businesses that rely on engineers or programmers for their core products.
Engineers can have difficulty relating to customers, and often don’t have the time (or even willingness) to handle the logistics of interacting with them in the first place. Customers, meanwhile, can become frustrated trying to understand engineers or to communicate their ideas clearly to them. A person like Tom Smykowski is often necessary to bridge the gap and smooth out the rough edges on both sides, but in the Linux world there are very few Toms to rely on. The customers, or users, have to deal directly with the engineers in many situations, and it’s not working out very well for either group. Linux has a marketing problem, and it needs a marketing solution if it ever wants to increase its market share in the PC realm.
If you’ve ever gone further into the diverse and layered world of Linux than installing a pre-packaged distribution like Ubuntu or Mint, you’ve probably come across someone who insists that the proper way to refer to “Linux” is actually “GNU/Linux”, or who has gone on a rant about how binary blobs are dangerous, or any number of other topics. Any of these points may in fact be valid, but they will instantly turn away anyone who is simply looking for a quality piece of software and may not yet care about the finer points of FOSS or the motivations of the people involved in creating it. These developers and coders should be commended and respected for what they have brought into the world, but they can’t be expected to market their products effectively, because they aren’t marketers. Their beliefs about software are passionately and firmly held, but leading with them isn’t a good way of interacting with the outside world. The core problem is that people with deep knowledge of a subject often have difficulty relating that knowledge to the general public, and they need some help.
Let’s look a little deeper into this problem as it relates to Linux and take a broad overview of current operating system usage rates. For desktops and laptops, Windows has 87% of the market, with macOS trailing at around 10% and Linux under 4%. Both Microsoft and Apple have huge marketing budgets and also benefit from some institutional advantage here. But if we look at systems that do not rely on marketing for sales, such as the supercomputing or server worlds, Linux is dominant in every way. Virtually 100% of supercomputers now use Linux. How you define a webserver is contentious, and Linux figures range from 70% to 98% depending on whether you count cloud services and subdomains, but in any case Linux runs the vast majority of the web. Even smartphones are dominated by the Linux-powered Android, with about 65% of devices, 20% using iOS, and the rest being an amalgamation of fading Blackberries, Windows Phones, and others.
From these numbers we can infer that there is some intrinsic benefit to working in a Linux environment. Not only is it dominant when raw computing ability is needed, whether in a supercomputer or a webserver, but it must also work effectively as a personal computing platform, otherwise Android wouldn’t be so popular on smartphones and tablets. From there it follows that the only reason Microsoft and Apple dominate the desktop world is that they have marketing groups behind their products, which provide customers with a comfortable customer-service layer between themselves and the engineers and programmers at those companies, and which also drown out the message that Linux even exists in the personal computing realm.
To give an example of how frustrating the jargon in the Linux world can be, take a look at Puppy Linux, a version of Linux specifically designed to run from a jump drive or on legacy hardware. It’s been around since the early 2000s, so it’s not new to the game. Its main features are its small size and the ability to save its state to the jump drive it’s installed on, preserving settings and files between reboots and across different machines.
The installation process is not straightforward, despite its age, and requires two separate jump drives or a single jump drive and a computer with Puppy already installed. It seems as though the website for the distribution should have directions, or at least link to the directions. Instead, the front page is largely a treatise on how Puppy Linux isn’t actually a “distribution” per se, and a technical description of what does and doesn’t count as a true Linux distribution.
Confusingly, underneath this paragraph is a set of download links labeled “Official Distributions”. This is a perfect example of the customer having too much direct interaction with the engineers. It’s as if we had to listen to a lecture on the difference between Phillips and Torx screws before being allowed to use a screwdriver for a simple task. We need to know how to install and use the software first; we can investigate its nuances and ideology once we know how to use it.
Of course we’re picking on Puppy Linux a little to help illustrate a point, but this trend is far from rare in the Linux world. On the other hand, a counterexample of how even a simple buffer between users and developers can work, and work well, can be found at Canonical, the company that manages the Ubuntu distribution. Their home page is informative, easy to understand, and not cluttered by jargon. The download page is clearly located, as are directions for installing the software. There are some hiccups, like the 64-bit versions being labeled “AMD” despite running on Intel hardware, a needless holdover from the time when 32-bit processors were the norm. Nonetheless, it’s a great example of how smooth a Linux distribution can be when a group of people who understand users’ needs and wants act as a Tom Smykowski-like layer between the creators of the software and its users.
Part of the problem, too, is that Linux and most of its associated software is free and open source. What is often a strength when it comes to the quality, flexibility, and customizability of the software becomes a weakness when there’s no revenue coming in to fund a marketing group that could address this core communications issue between potential future users and the creators of the software. Canonical, Red Hat, SUSE and others have all had varying degrees of success, but this illustrates another problem: the splintered nature of open-source software fragments not just the software itself but also the resources behind it.
Imagine if there were hundreds of different versions of macOS that all Apple users had to learn about before deciding which one was best for their needs. Instead, Apple has maintained its unity and is all the better for it, from a user’s point of view. Apple also has an annual operating budget of $71 billion compared to Canonical’s $6.2 million, which surely doesn’t hurt either, and further cements the point that marketing (and budget size) matters.
Now, I am making a few assumptions here, namely that “the Linux community” is a monolithic bloc rather than a loose confederation of people who have specific, often unrelated, interests within the computing world. There is no single point-of-contact for all things Linux-related, and that makes it a little difficult to generalize about the entire community as a whole. To that end, there is no single “goal” of the Linux community and no one in it may even care about having a 1-2% market share in the personal computing arena.
As an electrical engineer and someone who occasionally has difficulty with pointers when stumbling through code, I am admittedly on the outskirts of the community as a whole, but this critique comes from a place of respect and admiration for everyone who has made it possible for me to use free software, even if I have to work hard to figure things out sometimes. I have been using Linux exclusively since I ditched XP for 5.10 Breezy Badger and would love to live in a world where I’m not forced into the corporate hellscape of a Windows environment every day for no other reason than most people already know how to use Windows.
With a cohesive marketing strategy, I think this could become a reality, but it won’t happen through passionate essays on “free as in freedom” or the proper way to pronounce “GNU” or the benefits of using Gentoo instead of Arch. It’ll only come if someone can unify all the splintered groups around a cohesive, simple message and market it to the public. We need someone who can turn something like a “Jump to Conclusions Mat” into a million dollars.
Linux, or GNU/Linux to acknowledge the large number of packages from the GNU OS that are commonly used alongside the Linux kernel, is a hugely popular open-source UNIX-like operating system.
The Linux OS kernel was first released in 1991 by Linus Torvalds, who still oversees kernel development as part of a large development community. Linux runs most of the cloud, most of the web, and pretty much every noteworthy supercomputer. If you use Android or one of its derivatives, your phone runs an OS with a modified Linux kernel, and Linux is embedded in everything from set-top boxes to autonomous cars. About 1.3 million people even use it to play games on Steam.
The world of Linux is a little more complicated than that of Windows or macOS, however. The open source nature of the Linux kernel and most of its applications allows anyone to freely modify them, which has resulted in a proliferation of different versions geared towards specific functions. Each of these distributions (or ‘distros’) uses the core Linux kernel and usually some GNU packages, and then a selection of software packages variously developed internally, taken from an upstream distro, or built from other open-source software.
Thus, Pop!_OS, for example, shares most of its software with Ubuntu, from which it descends. Ubuntu was originally a fork of Debian, and still contains a large percentage of the same codebase, regularly synced. All three also use the Linux kernel and numerous GNU software packages. You can roll your own distro if you like, customised to include whatever software your use case, philosophy or personal preference demands.
This can lead to complaints about fragmentation from both users and developers targeting the platform. However, many of these distributions are closely related and the underlying Linux operating system means that - much like its Unix-compatible POSIX-compliant relatives such as OpenBSD and macOS - once you understand the fundamentals of using GNU/Linux, you can apply that knowledge to any other Linux OS and be confident that everything will work more or less as you expect.
One last point to note is that while all Linux distros rely to some extent on voluntary contributions from a community of developers for their continued development and stability, some distros are backed by large commercial software development organisations, with Canonical (which develops Ubuntu) and Red Hat being key examples. Because they benefit from full-time corporate support and upkeep, these distros are often updated more frequently than at least some of their community rivals and may be better options for businesses who prioritise stability.
Although desktop Linux is a comparatively niche use case compared to the operating system’s ubiquitous server presence, it’s also the most fun and rewarding. An Ubuntu-based distro is currently your best bet if you want things to just work with a minimum of faff, but our favourites also include distros like Arch and Slackware, which actively encourage you to cultivate a deeper understanding of the OS underlying your desktop.
System76’s Pop!_OS is one of the most comfortable choices for desktop Linux users who just want to get on with things. It’s based on Ubuntu, but strips out some of the more controversial elements, such as Ubuntu’s default Snap package system, while adding useful features such as out-of-the-box support for Vulkan graphics. Its target audience is developers, digital artists, and STEM professionals.
Pop!_OS has a particularly pleasant graphical installation interface, designed to be quick and approachable. Its slick Cosmic desktop is based on GNOME, and vaguely reminiscent of macOS’s GUI layout. Future iterations are set to ship with an entirely new window manager, developed in-house by System76. System76 is also an OEM and makes laptop, desktop and server systems, all of which run the distro by default.
Arch is a thoroughly modern, rolling-release distro that nonetheless aims to provide a classic Linux experience, giving you as much hands-on control over your OS and its configuration as possible. You’ll have to choose your own desktop environment after installation, for example.
Its official repositories typically update quickly, but these exist alongside the bleeding-edge community-driven AUR (Arch User Repository), from which you can compile packages and install them as usual via the Pacman package manager. For those who don’t want to dive straight into the DIY ethos, Manjaro is the most popular of its derivatives, built to be more beginner-friendly, with a graphical installation interface and quality-of-life tools for driver management.
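As an illustration, building and installing a package from the AUR typically follows this pattern (the package name here is just an example; always review user-contributed build recipes before running them):

```shell
# Fetch the build recipe for an AUR package (example name -- substitute your own)
git clone https://aur.archlinux.org/yay.git
cd yay

# Inspect the PKGBUILD before building -- always worth doing with community recipes
less PKGBUILD

# Build the package and install it (resolving dependencies) via pacman
makepkg -si
```

Helpers like yay then automate this clone-build-install loop for subsequent AUR packages.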
Based on Debian, Canonical’s Ubuntu Linux shares a significant chunk of its architecture and software, such as the friendly apt package management system. But it brings a lot of unique features to the table. Canonical’s Snap packages, for example, are designed to make it easy to package and distribute tamper-proof software with all necessary dependencies included, making it extremely well-suited to office workstations.
Ubuntu operates on a fast development cycle, particularly compared to Debian’s slow but stable releases. It also cheerfully provides proprietary drivers and firmware where needed, and, although Ubuntu itself is fully free, Canonical is here to make a profit, meaning that enterprise-grade support contracts are available, and the developers’ approach to security is tuned to the needs of business.
One of the longest-established distros, dating from 1993, Debian has numerous popular derivatives, from Ubuntu to Raspberry Pi OS. It introduced the widely-used and much-cloned apt package management system for easy software installation and removal, and to this day prioritises free, open and non-proprietary drivers and software, as well as wide-ranging hardware support.
While Ubuntu and Red Hat are tailored to enterprise, Debian remains a firmly non-profit project dedicated to the principles of the free software movement, making it a good choice for GNU/Linux purists who want a stable OS that’s nonetheless comfortable to use, with a variety of popular GUIs to choose from.
Another 1993-vintage distro, Slackware (no relation to the popular collaboration platform) is still very much alive and kicking, despite a website whose front page was last updated in 2016. That’s set to change soon with the imminent release of Slackware 15.0, which those who want the latest features can already access in the form of Slackware-Current.
As you might gather from the slow release cycle, Slackware is built for long-term stability. It also maintains several classic Linux features that other distros have abandoned, making it a popular choice with many old-school users for that very reason. It uses a BSD-style init system and a hands-on ncurses installation interface, is deliberately “UNIX-like” and, most notably, eschews the now-ubiquitous systemd that originated at Red Hat, so you’ll be using init scripts rather than systemctl commands to manage services. Refreshingly, it boots to the command line by default, but you can choose from a range of desktop environments. You’ll probably also want to add a package manager such as swaret.
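To illustrate the difference, managing a service such as the SSH daemon looks roughly like this on Slackware compared with a systemd-based distro (exact script names and paths can vary between releases):

```shell
# Slackware: services are plain rc scripts under /etc/rc.d
/etc/rc.d/rc.sshd start        # start the SSH daemon now
/etc/rc.d/rc.sshd stop         # stop it
chmod +x /etc/rc.d/rc.sshd     # mark the script executable so it runs at boot

# Equivalent commands on a systemd-based distro
systemctl start sshd
systemctl enable sshd
```

The rc scripts are ordinary shell files, so customising a service means editing a script you can read end-to-end.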
If you want a “pure” and slightly old-school Linux experience, Slackware is an excellent choice and a great way of getting a handle on the underpinnings of Linux as an OS. It’ll run on almost anything, from a 486 to a Raspberry Pi, to your latest gaming PC, with support for x86, amd64, and ARM CPUs.
Not every PC is an eight-core gaming behemoth with 32GB RAM and the latest graphics card. But versatile, lightweight Linux distributions mean that an underpowered netbook or Windows XP-era PC can be brought back into use as a genuinely functional home computer, with all the security updates and modern software support that you’ll need.
One of the best-known lightweight Linuxes, Puppy Linux isn’t a single distribution, but rather a collection of different Linux distros, each set up to provide a consistent user experience when it comes to look, feel, and features. Official versions are currently available based on Ubuntu, Raspbian, Debian, and Slackware, with both 32- and 64-bit versions available for most of these.
They’re all designed to be easy to use for even non-technical people, small - around 400MB or less in size - and equipped with everything you’ll need to make a PC functional. Having to choose your Puppy can be a little confusing, but there’s a guide to help you through it. Although 32-bit CPUs are supported, you’ll want an Athlon processor or later for the latest versions to run well. For more modern systems, note that Puppy doesn’t support UEFI, so switch your BIOS into legacy mode before installation. ARM architecture is also supported in the form of Raspberry Pi builds.
Ubuntu MATE - pronounced mah-tay like the hot beverage - isn’t the absolute lightest-weight distro around, requiring at least 1GB RAM and a 64-bit Core 2 Duo equivalent processor. It is nonetheless a superb choice if you need to bring an elderly home PC or underpowered laptop back into viable use.
The MATE desktop environment is popular with Windows XP veterans and comes with tweak tools already installed for easy customisation. And as it’s an Ubuntu variant, you get that distro’s wide-ranging repositories, excellent hardware support and easy gaming, with a user interface that’s a bit lighter and more comfortable for Linux newcomers.
Although a 32-bit x86 distribution is no longer available, you will find both 64- and 32-bit versions for Raspberry Pi, and versions specifically designed for a small range of pocket PCs.
A few versions of this ultra-lightweight distro are available to download: A fully functional command line OS image (16MB), a GUI version (21MB), and an installation image (163MB) that’ll support non-US keyboards and wireless networking, as well as giving you a range of window managers to choose from.
As you’d assume from its minuscule file size, Tiny Core doesn’t come with much software by default, but its repositories include the usual range of utilities, browsers and office software that you’ll need to make use of your PC. You can run it from a USB drive or CD, or stick it on a hard disk, and it’ll work on any x86 or amd64 system with at least 46MB of RAM and a 486DX processor, although 128MB of RAM and a Pentium 2 are recommended. ARM builds are also available, including Raspberry Pi support.
In practice, most distros that are good on the desktop are entirely adequate for use as part of your enterprise server infrastructure, although you’ll probably want to install a version without a graphical desktop for most use cases. If you operate an enterprise server, you’ll want something with stable Long Term Support versions, responsive security updates, and that’s familiar enough to make it easy to troubleshoot. Right now, Ubuntu and Red Hat derivatives are particularly solid choices.
Red Hat Enterprise Linux is synonymous with big business. Although RHEL’s source code is, of course, open, it uses significant non-free, trademarked and proprietary elements, and updates that you need a subscription to access. Red Hat emphasises security, hands-on subscriber support and regulatory-compliant technologies and certification. Its developers also put a lot of effort into its enterprise-grade GUI, which can be more comfortable for those who’d rather not do all their configuration at the command line.
Red Hat itself - now a subsidiary of IBM - has contributed important elements to Linux as a whole. With CentOS, the fully free community Red Hat derivative, having given up Long Term Support (LTS) versions in favour of a rolling-release model (via CentOS Stream), RHEL is perhaps the best option for consistent, long-term stability for anyone who requires a Red Hat based Linux distribution for business use.
Fortunately for SMBs, the no-cost version of RHEL has been expanded to compensate for the loss of traditional CentOS, allowing individual developers and small teams with up to 16 production systems to get a free subscription, providing access to the distro’s update repositories.
Amazon’s own Red Hat derivative, Amazon Linux, is designed to work optimally on the cloud service provider’s platform. It supports all features of Amazon’s EC2 instances and its repositories include packages designed to seamlessly integrate with AWS’s many other services. Long Term Support versions are available, making it an appealing CentOS replacement, as long as you’re happy moving your machines to the AWS cloud.
Although its VM image and containerised versions are designed first and foremost for deployment on AWS, you can download VM images for on-premises use if you want them.
While Amazon Linux is based on CentOS, its successor, Amazon Linux 2022, is built on Fedora, but re-specced as a server distro.
While most desktop Linuxes are just as capable as servers, we’re going out of our way to recommend Ubuntu for both, as it’s incredibly easy to roll out a wide range of secure and fully-functional servers from its packages. It’s also free and conspicuously quick when it comes to security updates.
Its Long Term Support versions get five-year security and ten-year extended maintenance guarantees. As well as x86 architecture, it’s available for ARM, as well as IBM’s POWER server and Z mainframe platforms, although its legacy hardware support pales in comparison to Debian’s.
Ubuntu is entirely free for everyone, but you can subscribe to Canonical’s commercial support if you need it, and Ubuntu’s popularity means that it’s widely supported by third-party firms and community forums.
Some people use Linux because it’s free, or because it’s fun to tinker with, or because they don’t like being beholden to a large corporate entity. Others use Linux for security: either to maintain it or to test it. There are a number of distros designed for those who want to lock down their privacy and security at all costs, as well as distros built for infosec professionals who need to make use of more specialised tools.
If you work on other people’s computers or on public networks and you’d like to minimise the risk of your identity, communications and data being compromised, TAILS is the OS-on-a-stick for you.
Based on Debian, TAILS’ most distinctive feature is that it routes all internet traffic via TOR by default and, when used as a live distro, it lives on an 8GB+ USB stick and runs in RAM, leaving no trace on the host PC unless you deliberately choose to leave one. The 1.2GB live image includes a GNOME 3 desktop environment, with all the conveniences of a modern desktop Linux.
Kali is not your everyday desktop distro - it isn’t recommended for all use cases. But for those looking for pen testing and red-team-oriented security functions, it’s a great choice. It’s based on Debian and ships with a lightweight desktop environment by default, with GNOME and KDE Plasma versions also available.
The main attraction here is the ready-to-go collection of security tools. A wide range of 32- and 64-bit images is available for various platforms and use cases, covering everything from password cracking to VoIP research and RFID exploitation. In total, Kali comes with around 600 security tools, though very few use cases will need all of them. There is also specialist hardware support, such as Kali NetHunter for Android and a number of ARM images, including one for Apple M1 machines.
There’s a guide to help you work out your exact requirements, such as storage (2GB to 20GB) and which security tools you need. If you opt for the basic installation, you can use metapackages to pull down exactly the tools you want.
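For instance, rather than installing the full toolset, you can pull in a themed bundle of tools via apt metapackages (kali-tools-top10 and kali-tools-wireless are real examples; see the guide for the full list):

```shell
# List the available Kali metapackages
apt-cache search kali-tools

# Install a curated subset instead of the full ~600-tool set
sudo apt install kali-tools-top10      # the most commonly used tools
sudo apt install kali-tools-wireless   # wireless assessment tools
```

This keeps the base install lean while letting you add whole tool categories in one command.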
ParrotOS may be a single distro, but it comes in two editions; both are based on Debian’s Testing branch and available with the MATE, KDE and XFCE desktop environments.
The Home Edition is a lightweight OS for daily use with a specific focus on privacy, and it can also serve as the basis of a quickly-assembled pen-testing toolkit for those who need one.
Other tools, such as the TAILS amnesiac live distro, provide greater privacy, but ParrotOS comes with some decent pre-installed capabilities, including secure file sharing, cryptography tools, end-to-end encrypted comms, and AnonSurf for those who want to proxy all online traffic through the TOR network.
For that extra bit of security, there’s ParrotOS Security Edition, a Kali Linux alternative. It comes with pen-testing and digital forensics tools, such as network sniffers and port scanners, as well as car-hacking features. ParrotOS is a community project, so there are no enterprise options of the kind you’d find with Kali, but it stays close to GNU/Linux conventions and has a fairly large community of users to turn to for support or advice.
Red Hat has named Matt Hicks as its president and chief executive officer succeeding Paul Cormier who is shifting into the chairman’s post.
Hicks previously served as Red Hat’s executive vice president of products and technologies, joining the company in 2006 as a developer on the IT team.
He quickly rose via leadership positions across the organisation and was a foundational member of the engineering team that developed Red Hat OpenShift.
“When I first joined Red Hat, I was passionate about open source and our mission, and I wanted to be a part of that. I am humbled and energised to be stepping into this role at this moment,” Hicks said. “There has never been a more exciting time to be in our industry and the opportunity in front of Red Hat is vast. I’m ready to roll up my sleeves and prove that open source technology truly can unlock the world’s potential.”
Cormier took on the president and CEO mantle in 2020 and has been with the open source specialist for 21 years.
“Matt is the exemplification of a true Red Hatter and is absolutely the right person to step into this role,” Cormier said. “His experience across different parts of our business has given him depth and breadth of knowledge about how we can best work together to scale and remain the open hybrid cloud leader.
“He understands our product strategy and the direction the industry is moving in a way that’s second to none.”
Cormier played an instrumental role in the expansion of Red Hat’s portfolio to a full, modern IT stack based on open source innovation. His effort to transform Red Hat Linux from a freely downloadable operating system into the subscription-based Red Hat Enterprise Linux (RHEL) was a pivotal moment.
As chairman, Cormier will serve as Red Hat’s strategic touchstone and key advisor. His focus will continue to be on scaling the company and accelerating customer adoption of open source technology.
Having led more than 26 acquisitions at Red Hat, Cormier will also work closely with Red Hat leadership on future merger and acquisition strategy. He will continue to work alongside IBM chairman and CEO, Arvind Krishna in this capacity with both Cormier and Hicks reporting to Krishna.
“As chairman, I’m excited to get to work with our customers, partners, and Matt in new ways,” Cormier said. “My focus moving forward will be on helping customers drive innovation forward with a hybrid cloud platform built on open source technology. Open source technology has won the innovation debates and whatever the future looks like, it’s going to be built on open source technology and Red Hat will be there.”
Customer attrition is a normal part of business—no company expects to keep all its customers for a lifetime.
But what many ICT Solution Providers have seen over the past few years goes beyond normal or expected rates of customer churn. As we highlighted in the first installment of Dicker Data’s ‘Meeting of the Minds’ series, many Partners are watching as key customers go to market for new providers—or are lured away by competitors—at higher rates than ever before.
What is driving this trend? And what are your options to create greater stickiness when one of your key customers starts reviewing their ICT Providers? We’ll take a closer look in this article.
A growing body of research—not to mention, plenty of anecdotal evidence—suggests that demand for cloud solutions continues to increase amongst Australian businesses.
Gartner, for instance, reports that end-user spending on public cloud services in Australia is expected to rise 17.6 per cent over last year to a total of $18.7 billion in 2022. Further data from Forrester finds that, “in 2022, 30% of APAC firms with cloud-first strategies will shift to cloud-native.”
In many instances, business users’ expectations for technology are set at home and brought from there into the office. Providers who do not provide the type of agility customers increasingly seek—especially those who only provide basic IT or break/fix services—may see higher rates of churn, lost renewals, and lost tenders compared to those offering more sophisticated solutions that meet modern expectations.
View our video on the Australian IT Provider Market Opportunity in SMB for more information.
Despite growing evidence that business consumers are increasingly demanding cloud-centric solutions, many Partners are hesitant to move investments in private cloud infrastructure to public cloud platforms like Azure.
Some Partners view SPLA as a comfortable, less risky option, while others actively decline pursuing customer opportunities on Azure due to the potential for cost creep.
To better understand the factors that are driving this resistance, Jakub Wolinski, Cloud Services Manager at Dicker Data, surveyed SPLA-based MSPs on their evaluations and considerations of an Azure-first business model.
Perhaps unsurprisingly, cost management and profit margin considerations led the list of limiting factors, with customer questions, contractual obligations, and technical support also well-represented in participant responses.
It is important to note, however, that when pressed for more information, the majority of participants said they believed Microsoft Azure “is always less profitable for SPLA Partners”. Additionally, we found that many respondents who listed costs as their major consideration had last reviewed the platform 3-5 years ago.
Although the mechanisms involved in building economies of scale and reducing operating costs are different with Microsoft Azure, changes to Microsoft’s technical solutioning mean that it is now possible to create profitable business models that work on Azure.
The key to Partner success is not looking at Azure as a ‘like-for-like’ comparison, but rather, creating a new business model in parallel with existing offerings.
Essentially, the decision to move to Azure isn’t a ‘lift and shift’ conversation. For many Partners, initiating a data centre migration will require an all-in business model change before they see the economies of scale and profitability they’re currently accustomed to.
While a business model overhaul may seem daunting on paper, our survey also found that Partners with non-cloud customers in managed service arrangements were more frequently approached by vendors to “resell Azure Subscriptions (69%) versus looking at building a profitability-based business for migrating entire private cloud (25%)”. This highlights that one of the major blockers to moving to cloud is the challenge of proving a sustainable business model.
Through Dicker Data’s Microsoft team, Partners have the opportunity to review their managed service business as a whole, as well as receive support in launching new profitable cloud offerings.
Taking this proactive approach might sound tedious. However, it safeguards against scenarios where key customers (i.e. those that cover your infrastructure costs) begin reviewing their providers or, worse, leave altogether. Utilising vendor funding and programs allows you to prepare now for migrations to Azure that make commercial sense one to three years in the future.
Watch our short video on how you can access customer funding and programs from Dicker Data & Microsoft.
Remember that 17.6% rise Gartner mentioned? As you know, many of your clients have KPIs to strengthen their business in the areas of customer experience, digital transformation, and growth. Incumbent Providers that aren’t offering solutions that support these goals may be especially vulnerable to the loss of key customers.
One way to think about ensuring your position—either by winning on your current offerings or through new, competitive solutions—is to focus on what you already know works. Tap into your sales team for feedback—especially in regional areas and states such as Queensland, Western Australia, and South Australia, where established relationships carry weight.
For example, you already know that your customers don’t care where their data is hosted—just that it can be accessed anytime, from anywhere, in a secure and cost-effective manner. Extending that line of thinking all the way through to deploying new cloud solutions could keep your business from being cut out of the conversation when your existing customers evaluate their IT requirements and decide to go to market.
As we look ahead, news of consolidation in Australia’s ICT industry might seem challenging. Yes, many businesses are considering hyperscalers’ offerings as a part of the conversation, often due to their greater degree of visibility. But there is still a role to play for local Partners—especially those who understand the specific needs of their communities and can build one-to-one relationships in a way that the major hyperscalers can’t.
As Australian businesses' security requirements expand due to pending legislation and government policies, Partners with private or hybrid cloud environments may increasingly be called on to support customers’ security requirements. As this demand grows, Partners will need to assess their existing security posture and determine whether or not the costs to provide a secure environment are worth it—or if there are options within public cloud that could help reduce their risk and liability.
This is where we encourage Partners to consider the shared responsibility model on public cloud, which shifts some liabilities to the cloud provider and reduces your direct risk.
As you consider what the future looks like for your on-premises managed services, SPLA-based business model, or hardware-focussed practice, ask yourself:
What is your exit strategy for moving away from the costly hardware refresh cycle?
My daily conversations with Partners highlight that it still takes a precipitating event—such as a hard drive failing or a customer being unable to operate—to motivate Partners to make a change. The events are different for every single Partner, but it is the Partners taking the time now to seriously evaluate how they’re going to operate in the future who are really coming into their own in this new landscape.
Business model reviews and customer migrations are serious undertakings, but you don’t have to do them alone. The team at Dicker Data supports Partners in achieving their full potential through empowering them to offer industry-leading public cloud services.
“Leveraging Dicker Data’s pre-sales team was a gem,” explains Geoff Smith, a former MSP owner and current Microsoft Solutions Development Specialist at Dicker Data. “It’s something that’s really needed for Partners that have a smaller base of loyal clients to add that added value to the relationship when Azure deals pop up.”
To explore how cloud could work for you, visit our resource hub or register for a personalised Partner immersion session.
It may not be the most recognizable deal on this list, but Dell's acquisition of enterprise storage company EMC for $67 billion in cash and stock stands as the tech industry's largest buyout. The deal took place in 2015.
Nvidia's acquisition of Arm, a transaction worth $40 billion, is #2 on this list, marking one of the biggest semiconductor purchases in the multi-decade history of the chip business. The amazing irony is that this $40 billion deal doesn't involve any chips at all. Four years prior, SoftBank had purchased Arm for nearly $32 billion.
Third place goes to renowned chipmaker Broadcom, which was scooped up by rival Avago Technologies in a cash and stock deal valued at $37 billion. In 2020, scrappy chipmaker AMD reached an agreement to purchase FPGA specialist Xilinx in an all-stock deal valued at $35 billion. That's the fourth entry in our top 5 list.
To round out the top 5 deals in tech history, IBM acquired open-source software developer Red Hat for $34 billion. The deal took place in 2018, after IBM and Red Hat had been partners on enterprise-grade Linux for over 20 years.
Other high-profile acquisitions that have fallen off the rankings in recent years include Microsoft's purchase of LinkedIn for $26.2 billion in 2016, and HP's takeover of Compaq in 2002, which was virtually tied with Facebook's buyout of WhatsApp at roughly $19 billion apiece. And for completion's sake, Google paid $12.5 billion for Motorola and sold it to Lenovo two years later for $2.9 billion, keeping the mobile patents it actually wanted for its Android ambitions.
You may be wondering why AOL's buyout of Time Warner isn't among the answers, considering it was a larger deal than any of those above. AOL paid $106 billion for Time Warner in 2000 -- the deal was originally valued at $165 billion but was later reduced -- however that acquisition is typically excluded from this sort of list because AOL and Time Warner are considered media companies more than tech companies.
“Communications can be an albatross with all the different channels and all the technology out there today. This drops everything into one bucket … This will be the further differentiator between [Nextiva] and RingCentral and 8x8 because they have pieces and parts, but they don’t have anything all together like this,” said one longtime Nextiva partner about the company’s new Workhub platform.
Nextiva, a provider of cloud-based unified communications, has unveiled a collaboration “hub” that brings together multiple channels to give context to communications.
Nextiva Workhub is a software platform for team collaboration that lets users manage their conversations with internal teams and outside customers from a single place, eliminating the need for countless, separate communication tools, according to Chethan Visweswar, Nextiva’s global vice president of product development.
The problem with collaboration products today isn’t the lack of tools, Visweswar said. It’s the context that is lost when employees use multiple communication tools.
“You might have access to some very powerful tools, but the context is lost because nothing is harnessing that conversation,” he said. “What we are really trying to focus on is bringing together those communications into a contextual conversation.”
Nextiva’s new Workhub app breaks down walls between UC and collaboration applications to deliver users a single conversation view, or threaded conversations, Visweswar said. The platform can pull in communications and meeting information from a variety of voice, text, video, and file sharing applications, as well as email applications such as Microsoft Outlook and Gmail. These threaded conversations allow users to view data such as notes on calls, recordings, customer surveys and calendars in one place so they won’t have to dig around for the information they’re looking for, he added.
Businesses using Workhub will benefit from improved productivity because their users won’t have to switch or toggle between communication channels, reduced application costs and improved customer engagement, the company said.
“Communications can be an albatross with all the different channels and all the technology out there today. This drops everything into one bucket,” said Matthew Brewer, vice president and sales lead for Brewer Communications, a longtime Nextiva partner.
Brewer Communications, an Oilville, Va.-based telecom solution provider, leads with Nextiva voice products for its base of midmarket and enterprise customers in the medical and manufacturing verticals. Workhub opens a lot of doors for the firm because it gives customers “one windowpane” into all their communications and interactions with their own clients, Brewer said. It can also help businesses cut out other software they may be buying for collaboration, he added.
“The way they have [communications] threaded -- it just simplifies how you communicate and how you keep track of how you communicate as well,” he said. “I think it will eliminate some other applications [customers] are paying for that they might not need by having it all in this one bucket.”
Businesses will be able to track customer conversations that mention shipping issues, for example, to uncover and address trends. For partners, Workhub can help automate tasks, such as automatically sending out a customer survey to clients that reached out for assistance, Visweswar said.
Nextiva competes with the likes of RingCentral and 8x8. The latest platform will advance the company beyond the bounds of business communications by giving users a customer management and customer engagement offering, too, said Chris Reaburn, chief marketing officer for Nextiva.
“Taking those communications and threading them together into a single conversation is unique -- I can pick up any channel and advance the conversation. What we’re doing that maybe puts us in the realm of a Zoom or a Slack, is that not only have we developed the team collaboration capabilities for teams to collaborate within the four walls, but also, draw outside parties into those collaborations,” Reaburn said.
Brewer looks forward to future iterations of Workhub that will include analytics for partners and end users, he said. “What will be nice is to be able to look historically, from a management perspective, to understand customer sentiment and engagement,” Brewer said. “There are more things that are coming in the product, but they’ve hit the ground running … This will be the further differentiator between [Nextiva] and RingCentral and 8x8 because they have pieces and parts, but they don’t have anything all together like this.”
Workhub has been in beta with more than 3,000 customers and is now generally available to customers and through the channel, Reaburn said. Workhub can be easily accessed through a browser.
The Scottsdale, Ariz.-based company in April launched its new NextivaOne partner program as a ‘springboard’ to recurring revenue growth through the channel. The company is doing just over 50 percent of its business through the channel today, and the channel is the company’s biggest route to market.