Free 920-803 cheat sheet with Free PDF and PDF Braindumps
killexams.com provides the most current 2022 Pass4sure Technology Standards and Protocol for IP Telephony Solutions exam questions, with PDF braindumps and real questions for the latest syllabus of the Nortel 920-803 exam. Practice our real 920-803 dumps to boost your knowledge and pass your 920-803 test with good marks. We guarantee your success at the test centre, covering every one of the topics of the test and enhancing your knowledge of the 920-803 exam. Pass with full surety with these accurate questions.
Exam Code: 920-803
Practice test 2022 by Killexams.com team
Technology Standards and Protocol for IP Telephony Solutions
Killexams : Nortel Technology questions - BingNews
Get plugged in
The 404 802: Where we're breaking bread with the Gut Man (podcast)
The Audiophiliac Steve Guttenberg makes an unusual midweek appearance on today's 404 episode and as usual, the discussion subjects cover Steve's audiocentric stream of consciousness with a touch of disillusionment, like a Web site that takes the creativity out of naming your band, a mask that enhances audio, the unnatural origins of 3D, and the degrading quality of CD-Rs over time.
Fri, 24 Jun 2022 01:01:00 -0500 | en | text/html | https://www.cnet.com/culture/4968/

“Inquisitiveness To Explore Makes A Big Difference…”
There are many who aspire to hold prestigious positions at big corporate houses, but very few are willing to work hard enough for it. Such positions can be achieved only by those who love what they do. Dr Aloknath De is one of them. He calls himself a ‘missionpreneur.’ He refuses to box himself and loves to think out of the box. He does not become comfortable in a setting as he likes to do things outside his comfort zone. He was Samsung India’s first CTO, among many other achievements to his credit. This is Dr Aloknath De’s story, as told to EFY’s Siddha Dhar.
Born in the spiritual town of Nabadwip Dham in West Bengal, Aloknath had the influence of holy men on him from a very young age. Even though his family migrated to Kolkata soon after he turned a year old, Aloknath credits his birthplace for teaching him the lessons of renunciation, finding fulfillment in life, and cherishing it.
Aloknath’s father picked up his LLB degree out of passion while working as a gazetted officer at the Food Corporation of India. Under his tutelage, Aloknath was home-schooled till the age of six. A rigorous man, he was the one who built Aloknath’s strong foundation in science and mathematics.
Aloknath recalls, “He was a very disciplined person, and he influenced my love for science. So, taking up science after 10th standard was a very natural choice.” Aloknath went to a proper school only when he was in the 2nd standard. While home-schooling provided him with the basic skills, he gained ‘interactive intelligence’ only after he went to school.
Aloknath says, “I realised that natural intelligence, or the ability to react in a situation, comes only when you interact and talk with more and more people. The more you converse, the more you grow.”
Aloknath’s mother, a homemaker, has always been soft-natured. She is an epitome of love and affection. This fusion of hard and soft behaviour of parents is what Aloknath tries to maintain in his own life as well, even today. In fact, his leadership mantra is: “Be Hard on Goal, Be Soft on Soul.”
Diligent from childhood, Aloknath was quite naturally a topper in his school. But by the time he was in the 6th standard, a sense of restlessness loomed over him. “It gave me a feeling of accomplishment in the beginning, but then it made me feel as if I was not being challenged enough.” This lesson of never getting too comfortable in something has driven Aloknath all through his life.
On the lookout for more competition to raise his accomplishment bar, Aloknath decided to appear for a Central government merit scholarship programme, which allowed him to study in a residential school of his choice. And he got admission in Ramakrishna Mission (RKM) Residential School in Purulia—a district 250km away from Kolkata.
Though he had the option to choose a school in Kolkata and stay near his family, Aloknath decided not to base his choice on emotions, and rather grab the opportunity to live and operate independently. The pangs of separation from his parents and siblings struck him in the initial phase. But Aloknath knew that he was accountable for the decision that he had taken and that he had to stick to it—a lesson that he would recall again, years later.
Infuse lateral thinking, when needed, before making decisions rationally
Keep your mind open to learning new things; continuous learning is of essence
Get out of your comfort zone; perturb a bit when too comfortable. Bring stability from comfort zone and growth from outside the comfort zone
Building zero-to-one poses a different challenge than scaling one-to-infinity
What is interesting about life is that sometimes taking an apparent step back can help you leap forward by ten steps
Your unique way of doing and supporting will always be valued by the community
Although his father was a strong influence in his life, it was his monk headmaster at RKM whom Aloknath regards as his role model. “He was a man of such purity. He had mastery over multiple languages. Once I was writing an essay about the river Ganges. I showed him the first draft and he helped me make it better—his knowledge was profound. I would say it was the best piece in Bengali that I have ever written.”
It is not surprising that science and math were his favourite subjects. But Aloknath did not excel in academics alone; he also took an active part in school debates, elocutions, recitation competitions, and so on. He even appeared on numerous TV debate shows broadcast by Doordarshan.
A rational thinker since childhood, Aloknath quickly figured out that his heart lay in engineering. When he was in the 11th standard, he decided to share this with his father, who was then disappointed with his decision. “My father wanted to be a doctor, but it was not possible due to financial constraints. He was hoping that I would become a physician, but I chose engineering. I remember it was a long-drawn evening discussion with my father and uncle. They tried very hard to change my mind.”
Aloknath did not budge from his goal and readied himself to prepare for the next chapter in his life.
IIT and finding his calling
A large chunk of Aloknath’s future plans began taking shape in his undergraduate years at IIT Kharagpur. While studying Electronics and Communication Engineering (ECE), he remained a high-achiever here too. He had done his secondary education in a Bengali-medium school, and the fluency with which some students at IIT spoke English alarmed him at first. But he did not let it intimidate him; he remained undeterred and mastered the language soon enough.
At an institute of IIT’s stature—where the brightest minds of the country came to make their dreams come true—Aloknath found there were many who were just as meritorious as him. But he took it as a blessing in disguise, as he was always looking for a more competitive space where his limits could be tested, and his mind could expand.
Aloknath says, “It’s important to accept that some people are better than you at something and it’s absolutely okay to be not as good as them. But what is important is to bring yourself to a minimum threshold level on attributes that matter in life and excel in a few dimensions.”
As his world-view expanded and he met more people, Aloknath found his calling at IIT Kharagpur—telecommunications, a fascinating field indeed! “I saw the power of communications and how it helped people stay connected remotely. I saw how it could compete with transportation and thought it was revolutionary.”
The realisation didn’t occur overnight, though. He spent months and months reading books and magazines, listening to the radio, and talking to people to soak in as much information as possible before coming to this conclusion. “Inquisitiveness to explore makes a big difference. When you keep your queries alive, one fine morning you’ll come across some information that makes you pause. And you take a moment to think about it and ask further questions. It is in this process of asking and answering these questions that you realise what you are drawn to.”
A three-step process Aloknath likes to follow for his goal is ABC of Achievement: Align (your thoughts), Beam (your energy), and Collaborate (your ecosystem). As he began the seeding for his future roles, he realised how telecommunication systems could be built with hardware and then-new software elements. He also started appreciating how engineering deployment could complement technological innovations.
Dream job and memories at IISc
The unpredictability of life is enthralling. It makes you do things you never intended to, and gives you results you never expected. Although Aloknath had planned to pursue his master’s degree after IIT and even got admission to a US university, a financial crisis forced him to start working right after his graduation in 1985. Yet, he did not let this setback derail him from his plans.
Thankfully for Aloknath, he found his way to Bharat Electronics Ltd (BEL). In an era when MNCs did not rule the subcontinent, PSUs were the dream job of many, and Aloknath was no different.
But, while joining BEL, he knew what his ultimate goal was—to learn telecommunication systems as much as he could.
At BEL, he was part of a team that built a low-flying target detection radar—INDRA 1, the first indigenously built radar system. To date, radars of the INDRA series are deployed on the borders of India.
His role in BEL, Sahibabad needed him to travel a lot to the headquarters in Bangalore (now Bengaluru). He made it a rule to take a different route during each of his trips. This was his way of having fun and exploring the country while working. As he travelled the length and breadth of India, Aloknath’s mantra to always go out of his comfort zone found more strength. “Even today, whenever I see that I’m getting too comfortable, I try to find new things to do that make me uncomfortable for a while.”
But going out of your comfort zone can be scary besides being uncomfortable. How did he tackle that? He says, “The trick is to keep yourself anchored 50% in comfort and unsettle the rest. Going completely out of your comfort zone is probably very risky. Bring stability from the comfort zone, and growth from outside the comfort zone.”
After a two-year stint at BEL, Aloknath went on to pursue his master’s degree from another premier institute of India—IISc Bangalore. A fond memory of IISc that he holds close to his heart, beyond studies, is how he would save some money from his scholarship grant to buy cassettes of classical music. An avid lover of classical music since a young age, Aloknath had even tried learning the stringed musical instrument sitar. Although he could not sing well, his affinity for classical music drove him to find solace in it, irrespective of his prevailing mood.
At IISc, Aloknath further broadened his knowledge about telecommunications and realised how every aspect uniquely complemented each other. It was this fascination that took him from India to Canada, where he pursued his PhD and also started another phase of his life.
Flight to Canada and international exposure
Like many middle-class Indians at that time, Aloknath’s flight to Canada was the first one he had ever taken in his life. While he was working hard to revolutionise telecommunications, calling from a foreign country was a luxury that he could not afford then. And that would also make his parents anxious at times.
“When I first landed there, I sent them a telegram but it took ten days to reach. In the meantime, my father would go to the Air India office every now and then to enquire if my flight had landed. The people there assured him that many flights had gone and come back since my flight, so I was probably already well-settled in Canada,” he laughs.
Aloknath did his PhD in ‘communication with signal processing’ from McGill University on a full scholarship. After ‘communication with control’ in IIT and ‘communication with computing’ in IISc, he wanted to fathom this combination as well. Aloknath knew that it would eventually bring him to a more holistic view of telecommunications.
Surviving in Canada was no easy feat. Aloknath had to adjust to the limited food choices and the cold climate. However, he liked the cosmopolitan culture of Montreal and saw himself living there for a long time, and hence took up learning the French language.
Aloknath’s PhD thesis was supported by Canadian Institute for Telecommunication Research; he also bagged the Alexander Graham Bell prize in Canada for his doctoral research. His PhD mentor was a wonderful man who made the whole learning process at McGill very enjoyable for Aloknath. “He was an immigrant himself; so he knew what challenges I was facing and he helped me navigate through them.”
It was through this mentor that Aloknath got his first break in Canada. He bagged an opportunity to work in Nortel Networks, which was then called Bell Northern Research. His first assignment had ample scope for innovation. He just needed to find the meaningful business gap. This, he credits to being curious and looking for means to go above and beyond to find a pain point he ought to fix.
But as he kept getting better at making these discoveries, he missed the excitement of implementing them and seeing them in their realised form. He knew it was time to get out of his comfort zone and soak deeply into practical implementations and field issues to grow in industry. “I told my manager that I wanted to do the dirtiest work. The most mundane things that nobody wanted to do.”
In the summer of 1999, Aloknath made a decision that would change the trajectory of his life—leaving Canada after a decade and coming back to India. “It was one of the toughest decisions I had to make in my life,” Aloknath remarks. He was eligible for a Canadian PR. His son was three years old and had access to better education there; and he himself had a thriving career.
In his true spirit of thinking things through before making a call, Aloknath mulled over the situation for an entire year. He thought rationally yet infused elements of lateral thinking before taking the decision. After all, he couldn’t look back and regret it. “I decided to come back to India. Although it seems unbelievable now, I did envision something along the lines of what India would look like in 2020, and wanted to see if I could be part of that vision coming true.”
His reasoning to come back was based on two things. First, he had built a broad international perspective that expanded his mind and made him understand what makes a country developed. Second, he understood deeply what was needed to be a subject matter expert on communication with controls, computing, and signal processing elements, and build global innovative products.
He recalls, “With all that knowledge in my bag, I felt complete and thought it was the right time to come back. Many around me felt that I was taking a step backwards. But what is interesting about life is that sometimes taking an apparent step back can help you leap forward by ten steps.”
Return to India and re-building from zero
Back in India, Aloknath had uprooted his life in Canada by taking the risk of applying to only two companies. Fortunately, he received offers from both the companies. “The same postman brought the two offer letters on the same day. I found the phrase so true: God helps those who help themselves. The power of hope and ambition beats everything,” he remembers.
He went with one of these companies—Hughes Software Systems in Gurgaon. His interview with the Director of Hughes India had happened while he was still in Canada. Incidentally, the Director had made it a point to highlight how different his life would be once Aloknath was back in India.
In such a case, one might take a moment to reconsider the decision. But Aloknath knew exactly what he wanted. “I told the executive that I am cognizant of all of that, and I was making the decision while being fully aware. I am ready to face challenges as they come.”
This director, who also became his manager and is now a good friend, was a source of inspiration for Aloknath, as he made his switch seamless; he never felt like he had left home. At Hughes too, Aloknath’s vision for joining a company did not alter—he wanted to do something that nobody else was doing or was willing to do.
He recalls, “During that time, digital signal processing in the base layer was effectively not there in Hughes communication stack. Many prospective clients said that they could have a service agreement with Hughes if that solution could be included in service delivery. That was truly a defining statement for me.”
He was entrusted with building a team for the same. Here came a moment to take something from zero to one, an opportunity of intrapreneurship. “Taking something from level one and scaling it to infinity is relatively easy because you have done it once. Building something from zero to one is the real challenge.”
Aloknath had to build a team of twenty people with strong DSP expertise and also start securing the DSP business beyond Hughes. It gave him joy that, due to this twenty-member team, Hughes could garner indirectly about eighty other people in higher-level stack areas. And this continued scaling up.
After a four-year stint at Hughes, he had a new itch. Even though the chip was never his core area, he realised that semiconductors and VLSI design were a part of telecommunications and had tremendous potential. So, he decided to join STMicroelectronics, which was looking to start its telecom business ab initio in 2003. He joined them as the head of this division.
In 2008, when a joint venture between STMicroelectronics and Ericsson was signed, Aloknath was appointed as the MD of the group in India. He was handling the teams of both Noida and Bengaluru. He built an enviable team by internal transfer, external hiring as well as company acquisitions. By the time he left, he had 1000 team members.
To understand the full stack better, he became a ‘student’ of chip design. Even though he was the MD, he would often join the freshers during their training sessions to learn more about frontend and backend chip design, the processes and nuances. “In order to make decisions as an MD and interact as a global leader, I needed to know the subject to a great extent. I thought it was an integral part of decision making and I had to learn it to do my job well.”
After eight good years at ST-Ericsson, a world giant was waiting for Aloknath to knock on their door.
Samsung and a missed opportunity
Aloknath had had a good taste of administrative life during his long stint at ST-Ericsson, where he was also a Board Director of the company, and he now craved a more technical role. He wanted to lead with technology. But before that, he wanted to pause and take time to contemplate—much like he used to do in his childhood days.
He took a sabbatical of about six months and decided to experiment with a few things during the period. He taught at IIT Delhi as an Adjunct Professor, served as a consultant to Accenture, and mentored a location-technology based startup during this phase. While mentoring the startup, he got down to interacting with the ground-level customers to understand things from their perspective. “If you can take a month or even a few days to interact with the people on the ground, there is no better learning than this. I would not trade this for anything else.”
But as they say, you cannot hide exceptional talent anywhere. This time, Aloknath’s talent was found by South Korean electronics giant Samsung. Although the folks at Samsung wanted Aloknath to come over to Korea and work with them from their headquarters, Aloknath’s priority was now sorted out; he could not leave his family in India and the family was not in a position to relocate at that juncture.
So, Aloknath was almost ready to decline the offer, but destiny had other plans. The very next day after this interview, he received a call from Samsung HQ. “They were willing to interview me again and create a position especially for me if I wanted to work only in India.”
This is how Aloknath became the first CTO of the Indian unit of the conglomerate with his base in Bangalore. His mantra, again, was simple—he didn’t want to work on something that was going well. He wanted something that was aspirational, but they couldn’t seem to make it work.
At Samsung, he had to create an innovation layer onto the flagship products by building new intellectual properties (IPs) without jeopardising base engineering deliveries. In the first fifteen years of Samsung’s journey in India, it had about 2000 patents, which increased to 7500+ patents under his leadership by their silver jubilee celebrations time.
He became the Corporate VP for Samsung Electronics, Korea and contributed to the internet of things (IoT) on the SmartThings platform. Despite these accomplishments, he felt like something was missing. The earlier missed opportunity to join the team in Korea kept gnawing at him. He asked if he could join them at HQ, and the team in Korea happily agreed. He got an IoT lab in Korea, where he spent 7–8 days every month for thirty months or so. This made him a very effective global leader.
The data platform that Aloknath built in Korea is now connecting 200 million home appliances and mobility devices globally. With his global role and expertise in AI and Blockchain, he has built an impactful AIoT Centre-of-Excellence in India. Now, with this background and experience behind him, he has set a personal mission of researching and nurturing activities around Cyber-Physical Systems.
While Aloknath’s accomplishments are innumerable, his story of finding his love is like a romantic film. Aloknath and his wife had known each other for a long time, but never knew that they would be conjoined by destiny in an everlasting bond.
They got married towards the end of Aloknath’s PhD, that is, six months before he would graduate. His wife had to stay back in Kolkata for some time. “She had to go to a neighbour’s place to call me and had to request them to go out,” he laughs. She soon joined him in Canada, where she also pursued a postgraduate degree at Concordia University.
Upon returning to India, she has been working as a Math teacher. She is a big fitness junkie.
“She is also my personal trainer. I was never into fitness that much, but since the pandemic, I started taking care of my physique. And would it not be a waste if I didn’t utilise my opportunity despite having a trainer at home!” he says with a smile.
The couple has a son who was born in Canada, studied in India, and went on to pursue Economics and Physics at the University of Chicago. After his stint at an investment bank, he currently works with a global new-gen ride-sharing company.
Aloknath’s love for music is still intact, thanks to his mother who continues to practice music till date. She is the inspiration behind his love for classical music.
In December 2021, Aloknath took superannuation from Samsung formally. However, he is still associated with them as an Executive Consulting Director and wishes to continue as long as it makes sense for both. He calls it his Corporate Home.
Aloknath has increased his efforts to mentor deep-tech startups and has also become an angel investor. He is now an Adjunct Professor with IISc (ECE) and IIT Jodhpur (CS). Aloknath also has a venture of his own in the works.
Talking of his future plans, Aloknath says, “During India’s 75th Independence Day celebration, it is a solemn moment for me to rededicate myself to the service of the nation. R&D serves as an impetus for business growth. If we can hone India’s talent for R&D and can excite businesses to spend more on R&D and innovation, we can do wonders as a country.”
In fact, he revealed that the unveiling of his singular mission could happen soon; but he is in no hurry. Because if there’s anything he has learnt, it is to make ‘the rest of his life, the best of his life!’
Sun, 07 Aug 2022 23:34:00 -0500 | en-GB | text/html | https://www.electronicsforu.com/special/my-story/inquisitiveness-explore-makes-big-difference
Securing the Vault
By Russ Banham
Pity the poor information technology security officer, confronted with a fusillade of cyber threats. No sooner has one chink in the armor been mended than another crack appears elsewhere. When you consider the increasing depth of expertise of organized criminals, it is a serious challenge.
The insidious combination of spyware and phishing (one of several social engineering frauds deceiving people into thinking a scam perpetrated by a hacker is a bona fide request for information) joins many other cyber risks that have been identified, such as distributed denial-of-service (DDOS) attacks, portable electronic device data leakage, desktop utility application attacks, mashup threats, radio frequency identification (RFID) attacks, mobile and wireless device endpoint attacks, application security threats, botnets and dynamic random access memory (DRAM) attacks, among a host of others, according to Gartner. Hackers deploy diverse tools in perpetrating the attacks, such as viruses, worms and Trojans, exploiting corporate security weaknesses in their firewalls, routers, servers and other systems that allow for unauthorized penetration of networks. Fortunately, an entire industry is on tap to provide services and tools to stay one step ahead of the next scourge.
The wide range of cyber threats across the globe today requires companies to implement a holistic risk management process, in which technology security is achieved through systematic risk identification, assessment, quantification and mitigation. Threats must be evaluated for their potential likelihood and financial severity, and then prioritized for corporate resource allocation purposes. While it is impossible for any organization to fully protect itself from cyber risks — the cost would be overwhelming — companies must put their dollars where they will most benefit. As Steven Bandrowczak, chief information officer at Toronto-based communications provider Nortel, puts it, “I can make this place 100 percent bunkerproof and safe, but then we wouldn’t have money left over.”
Given the perilous risk of a security breach adversely affecting corporate reputation, technology security is a strategic concern these days. It is common now for CIOs to report on a scheduled basis to their companies’ boards of directors about technology risks and respective mitigation tactics. When a cyber threat rears, such reports are immediate, in terms of the necessary action steps. “Boards have become educated about information security because of the many rules, from Sarbanes-Oxley to Gramm-Leach- Bliley to HIPAA (Health Insurance Portability and Accountability Act) that have heightened their responsibility for security failures,” explains Pamella Easley, lead practice director in enterprise risk services at consultancy RSM McGladrey. “Directors want to be sure that any and all vulnerabilities compromising company data have been identified and the controls tested.”
“Customers trust us with their vital information, as do government agencies,” Bandrowczak says. “We have to demonstrate that we can secure this data.” Hence the latest interest in an enterprise risk management approach to technology security. The process begins with risk identification—a basic understanding of current and emerging threats. “You have to assume that company data will be stolen,” says Avivah Litan, vice president and security analyst at Gartner, a technology risk management consultancy. “Consequently, your job is twofold: protecting data from being stolen, and then preventing stolen data from being used.”
Once inside, hackers have access not only to sensitive employee and customer data, but also proprietary corporate and competitive information, all of it now for sale in the underground market. What makes getting one’s arms around corporate information so difficult from a security standpoint is its ephemeral nature and movement. “You need to think in terms of your vital data and where it resides — not just on the computer in an office but how it flows throughout your organization,” says Vincent Weafer, vice president of Symantec Security Response, a provider of technology security tools and services. “You have to ask, ‘Is the data in transit, at rest, on a USB key or in a laptop?’ Once you understand where the data is and its importance, you can assess the risks and develop strategies to protect it.”
According to SunGard Availability Services, technology risks can be broken down into three categories: natural threats like a hurricane that shut down systems or prevent access to data to continue business; accidental threats, in which an employee unintentionally obtains improper access to verboten data; and intentional threats, the wide array of illegal scans, probes and attempted accesses via malicious code. The latter is the most threatening. “We live in a digital world via the Internet, and its access channels have also evolved to where the Web is a platform for services like social networking and sophisticated information sharing. This drives a higher degree of interaction,” explains Deepak Batheja, chief technology officer and SVP-Products at SunGard Availability Services, which ensures that 10,000 customers in North America and Europe have access to their business-critical information systems. “More applications are being deployed on laptops and more silent communication is going on between applications on the laptop and data center. That, in turn, means increased opportunities for interception. So, while the majority of incidents today are accidental threats, enterprises must be vigilant against all risks and SunGard solutions can help ensure end-to-end information availability and protection.”
After threats are identified, they must be assessed and measured for their potential frequency and financial impact. “Once we know a threat exists we look at its likelihood to affect us,” says Anne Wilms, executive vice president and CIO of Rohm and Haas, a Philadelphia-based manufacturer of specialty chemical materials. “Something may be highly unlikely, but the cost could be devastating. Conversely, something may be more likely, but the cost more manageable. With these in mind, we examine our vulnerabilities and current mitigation practices, and then, based on our resources, make determinations where people, processes and tools must be deployed.”
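The likelihood-versus-impact weighing Wilms describes is, at its simplest, an expected-loss ranking. The sketch below illustrates the idea; the threat names and figures are invented for illustration and do not come from any company mentioned in this article:

```python
# Rank threats by annualized expected loss:
# expected_loss = annual likelihood of occurrence * estimated financial impact.
# All threat names and values below are hypothetical.

threats = [
    # (name, annual likelihood, estimated impact in dollars)
    ("phishing campaign",     0.60,   250_000),
    ("DDoS attack",           0.30,   400_000),
    ("laptop data leakage",   0.20,   150_000),
    ("data-center intrusion", 0.05, 5_000_000),
]

def expected_loss(threat):
    _name, likelihood, impact = threat
    return likelihood * impact

# Highest expected loss first: these threats get mitigation budget priority.
for name, likelihood, impact in sorted(threats, key=expected_loss, reverse=True):
    print(f"{name:24s} likelihood={likelihood:.2f} "
          f"impact=${impact:,} expected_loss=${likelihood * impact:,.0f}")
```

Note how a rare-but-devastating threat (the hypothetical data-center intrusion) can outrank a frequent-but-cheaper one — exactly the unlikely-but-devastating versus likely-but-manageable trade-off Wilms describes.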
After assessing and quantifying cyber risks in terms of their likelihood and financial impact, companies must marshal corporate resources to do battle against them. With security spending rising within most corporate information technology (IT) departments, war is, indeed, being waged.
Among newer strategies is so-called security-as-a-service, which comprises a wide range of cyber security software that is owned, delivered and managed remotely by one or more providers. Instead of deploying all the security troops at the corporate gate to battle cyber criminals, they’re assembled outside the organization at the vendor’s site. “Enterprises have a wide range of options today when procuring and implementing information security technology, from the more traditional approach of buying, installing and managing security software to newly emerging security-as-a-service offerings,” says John Pescatore, vice president, Internet security, at technology consultancy Gartner. “In this one-to-many model, customers subscribe and pay for some or all of the services provided.”
A similar approach is cloud-based security — also provided by third-party security specialist organizations to customers as an on-demand service (as opposed to a more traditional software product). Cloud-computing security, a term that did not exist two years ago (the word “cloud” refers to the ephemeral world of the Internet), is tightly coupled to the external bandwidth services sold by Internet service providers. “Many mobile users are accessing business data and business services like SalesForce.com or various Google applications that don’t traverse the corporate network, thereby creating new cyber risks,” Pescatore explains.
Cloud-based computing security tools are essentially placed between mobile users of technology and cyber criminals, although enterprises still must maintain adequate perimeter security controls to protect risks to non-mobile assets like the data center and desktop PCs.
Other, less novel approaches include external hosting, in which the enterprise owns and manages the security function, but instead of it being located physically at the enterprise, it is located at an external hosting provider site; and insourcing, in which the security controls are owned and managed by the enterprise, but external service personnel are hired on a contract basis to monitor, operate and manage the security controls. Companies also are buying more cost-effective, unified threat management security products from vendors, as opposed to myriad point solutions, giving them financial recourse to deploy the remainder of their budgets toward emerging threats or those risks perceived to have the most adverse impact on the enterprise. “It frees up money and personnel to deploy against more contemporary challenges like highly targeted attacks, social engineering and identity theft, not to mention tomorrow’s threats,” says Pescatore. “The goal is to constantly optimize the technology security budget to address the biggest threats — the ones that can affect your brand reputation.”
Many companies are collapsing the number of security tools from vendors into fewer, more unified solutions. “The tools we use are best of breed across the board, rather than many point solutions,” says Baljit Dail, global CIO at Aon Corporation. “We’ve prioritized our risks, and have identified as among our greater concerns the stealthy, highly targeted attacks that are becoming more common, as opposed to the DDOS attacks of the past that went after thousands of computers. Consequently, we’re standardizing and centralizing our IT platform, looking at the right tools to deploy our resources more effectively. We’re also implementing global solutions and standards, whether it is data encryption services, e-mail encryption or even our hard drives. What we’re trying to do, first and foremost, is protect our reputation.”
Craig Shumard, chief information security officer at the Philadelphia-based health and employee services provider, CIGNA Corp., agrees that unified threat management and other new security products have great appeal in the cost-strapped economy. “Unified threat management, in which many products are coming together in one solution, complements other new practices like in-the-cloud security, where spam and virus filtering, for example, is provided by third parties outside the perimeter of a company’s IT infrastructure,” he says. CIGNA subscribes to services provided by Symantec for this purpose.
Other CIOs like David Vordick at USEC Inc., a Bethesda, Maryland-based provider of enriched uranium to the nuclear power industry, use tools from multiple vendors and maintain an internal infrastructure of systems and services to mitigate cyber risks.
“Occasionally we’ll engage a third party to help us do a targeted risk assessment to get an independent view of our security, but for the most part we handle things in-house, doing, for example, ongoing vulnerability assessments and system scans to determine if we need new patches or new products installed from vendors,” Vordick explains. “We’re also using more tools these days, in terms of defense and strategy, and our security budget has increased. We certainly have more people here now whose specific job title is information security.”
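An in-house check of the kind Vordick describes can be as simple as comparing the software versions a company has installed against an internally maintained advisory list of minimum safe versions. The Python sketch below shows one way such a scan can work; the package names, versions and advisory data are all hypothetical, and a real program would pull both lists from an asset inventory and from vendor feeds rather than hard-coding them.

```python
# Minimal sketch of an in-house vulnerability check: compare installed
# software versions against an internally maintained advisory list and
# report anything that needs patching. All names and versions here are
# hypothetical, for illustration only.

def parse_version(text):
    """'2.4.10' -> (2, 4, 10), so versions compare numerically."""
    return tuple(int(part) for part in text.split("."))

def find_unpatched(installed, advisories):
    """installed maps package -> current version; advisories maps
    package -> minimum safe version. Returns packages needing a patch."""
    findings = []
    for package, version in installed.items():
        safe = advisories.get(package)
        if safe and parse_version(version) < parse_version(safe):
            findings.append((package, version, safe))
    return findings

installed = {"webserver": "2.4.10", "mailrelay": "8.15.2"}
advisories = {"webserver": "2.4.12"}
print(find_unpatched(installed, advisories))
# [('webserver', '2.4.10', '2.4.12')]
```

Tuple comparison keeps the version logic simple; anything with more complex version strings (release candidates, build metadata) would need a proper version-parsing library.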
Dimension Data, a global IT solutions provider with $3.8 billion in 2007 revenues, takes a different tack altogether. “We leverage outside vendors on a subscription basis to provide the most accurate signatures for our e-mail and virus filtering,” explains Mark Slaga, CIO and chief technology officer of Dimension Data Americas. “For me, it’s a no-brainer — they have teams of experts to create the rules and signatures, but we do keep the equipment on our premises and manage the equipment ourselves. As for our firewalls and security policy work, they’re part of our core competency so they’re handled internally, as are data encryption and management of the hard drives.”
People Are Key
While hardware and software tools can be wielded to combat many cyber threats, the most uncertain element in the chain of management is people. Employees have been known to leave laptops on airplanes and trains. Laid-off workers have taken vital corporate data with them out the door. Others have sold sensitive data for personal gain. Education and training are the best line of defense in these cases, provided either by certified third parties or internally by the chief security officer and staff. “We’re trying to create a risk-aware culture here, which is not the easiest thing when you have tens of thousands of employees,” acknowledges Aon’s Dail. “We’re constantly sending out communication from our executive team about security and the need to protect confidential information. We have instituted an Awareness Program that provides security training online, and have made taking this training compulsory in the U.S. We will soon require it elsewhere in the enterprise.”
Security training and education also are de rigueur at TCS, a provider of information technology services and outsourcing solutions. “We offer a range of online education courses, including a 20-question quiz that every employee must complete on security awareness,” says Ananth Krishnan, the firm’s chief technology officer. Vordick says USEC is doing the same — “raising awareness in our end user community about what they are permitted to do and not do, and when to let us know when something doesn’t look right.” CIGNA, notes Shumard, “focuses a lot of energy around training and education. We have several people on staff right now directly focused on training, education and information protection.”
All these efforts are making cyber crimes more difficult to perpetrate. But, as Shumard concedes, “At the end of the day there will always be a percentage of security risks, as much as 20 percent, that not only our company but no company has a solution for,” he says. “That’s why training and awareness are so important.”
Mon, 03 Jan 2022 19:37:00 -0600 | https://www.wsj.com/ad/article/zurich_securing_vault.html

Killexams : Universal Wireless Communication Consortium meets
Madhuri Velegar K in Bangalore
The conference was closely aligned to technical specifications. A select group of about 200 invitees, including vendors and operators of wireless technology products and services, attended the CII-Universal Wireless Communication Consortium microconference at the Oberoi Hotel here.
The task at hand was to focus on the nomadic customer: one who wants to communicate with friends and family and for business purposes, and who requires an instrument that provides him the broadest service area, whose service is simple and cost-effective, and which works the same everywhere, every time.
The subjects debated included relevance of adopting wireless technology, specifically 'wireless in local loop'. WLL allows communications connection to the home from the network, plus mobility.
The issue of government monopolies and the importance of making the right choice of technology suitable to the unique needs of the country were also discussed at the one-day seminar this week.
Interrupted by caffeine and tobacco breaks, the seminar was snowed under with questions from the participants by mid morning.
After a short lunch by the poolside, the conference highlighted the salient features of the technologies available in the market today.
The emphasis was on TDMA IS-136, a technology that, according to most businessmen in the room, is the most versatile, cost-effective and flexible for adoption in the Indian environment.
Universal Wireless Communication Consortium was launched in 1995 as a collaborative effort among leading vendors and operators of wireless products and services. These included AT&T Wireless, BellSouth Cellular, Ericsson, Hughes Network Systems, Lucent Technologies, Motorola, Nokia Mobile Phones, Nortel, Philips Consumer Communications, Sun Microsystems, Industar Digital PCS and Southwest Bell Systems.
The UWCC initiative came at such a time that it helped deliver an enhanced portfolio of global mobility services across all spectral, market and subscriber bands.
The platform for developing and delivering such an enhanced personal communications features package to subscribers worldwide comprises the ANSI TIA IS-136 standard version of 'time division multiple access' (TDMA) radio frequency features, in combination with the enhanced internetworking and mobility of the IS-41 'wireless intelligent network', or WIN, standard capabilities.
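For readers unfamiliar with the underlying idea, TDMA shares a single radio channel by giving each subscriber a repeating time slot within a frame, so transmissions never overlap. The toy Python sketch below illustrates only that slot discipline; the frame size and user names are illustrative and are not taken from the IS-136 standard.

```python
# Toy illustration of time-division multiple access (TDMA): several users
# share one channel by transmitting only in their assigned slot of a
# repeating frame. Frame sizes here are illustrative, not from IS-136.

def build_frame_schedule(users, slots_per_frame):
    """Assign each user a fixed, repeating slot index within the frame."""
    if len(users) > slots_per_frame:
        raise ValueError("more users than slots in a frame")
    return {user: slot for slot, user in enumerate(users)}

def transmitter_for_slot(schedule, slot_index, slots_per_frame):
    """Return which user may transmit during a given absolute slot."""
    position = slot_index % slots_per_frame
    for user, slot in schedule.items():
        if slot == position:
            return user
    return None  # idle (unassigned) slot

schedule = build_frame_schedule(["A", "B", "C"], slots_per_frame=6)
sequence = [transmitter_for_slot(schedule, i, 6) for i in range(12)]
print(sequence)  # the three users repeat every 6 slots; the rest are idle
```

Idle slots are what let an operator later admit more subscribers, or carry data bursts, without changing anyone else's assignment.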
Of course, the issue of which technology is better, TDMA or CDMA ('code division multiple access', which also offers the latest in wireless communications, integrating voice, data and video on terrestrial, semi-terrestrial and cellular networks), raised a few sparks with UWCC members. They felt that "the CDMA is not fully capable of giving enhanced features while the TDMA is also programmed for offering third-generation services which call for huge speeds".
Where does India stand today? The Department of Telecommunications is readying itself to accept bids for basic telephony in various states.
N K Sinha, member, technology, Telecom Commission, spoke in his inaugural address of an unprecedented advancement in technology, specifically in telecommunications: "We're on the threshold of developing a customer access network that uses wireless technology. It won't be a replacement for POTS (plain old telephone services) but a cost-effective, leased-line digital service with Internet access."
Though Sinha spoke of a wireless nirvana, the ground realities were felt sharply by a few members in the auditorium. "Unless we strengthen the basic telephone infrastructure and provide a telephone for every man, woman and child, speaking of high-density, sophisticated features and services is irrelevant," says Kumar S Dasgupta, chief executive, telecom projects, RPG Enterprises.
The pathway for India to take would be, according to Larry Wood, senior network architecture specialist, Asia Pacific, mobile systems, American standards, Ericsson, "to move beyond the pilot and case studies conducted to educate the DoT, and send the decision makers to countries all around the globe where this technology has been commercially viable."
Ericsson Communications Private Limited, in conjunction with the DoT, has undertaken a pilot project by setting up a five-line TDMA-based telephone exchange in Bangalore, wherein subscribers can call and use voice transfer, data transfer and fax facilities, all of them connected to the network using WLL technology.
In Maharashtra, Hughes Network Systems, with Ispat Limited, has another pilot project underway that offers basic service licence.
Keary Cannon from Hughes Network Systems states: "We're planning 110,000 lines of WLL by the end of the year 2000, in addition to fibre and wired services."
HNS has deployed initial systems in central Bombay and in Pune with a V5.2 interface to NEC switches. The TDMA WLL system will provide voice, data, fax, wireless and Internet access.
The Indian speakers from the government were not so forthcoming about details of how and when the infrastructure, first, and the technology, second, would be in place; in fact, they asked for proposals from the participants of this conference. Our counterparts from the US and Europe, meanwhile, had sleeves full of ideas to share.
Fredric J Hibbler, vice-president, operations, UWCC, says: "The driving forces behind the growth and adoption of wireless technology in the US and in India will be very different. While convenience has been the motivating factor in North America and Canada, it is necessity that will push South America and India towards going wireless."
The initial confusion over which technology to adopt, which will have long-term benefits, which will be cost-effective and, most important, which will be simple to deploy, will remain in India for a while, as it did in the US for a few years.
And whenever there is a monopoly player, says Larry Wood, the situation is not very good because they do not understand technology, which should be left to businessmen to decide. All that is needed is a free and open gateway which allows private users to enter the field and gauge the operation themselves.
The market forces should be allowed to ultimately decide on which technology should be adopted and how, says Wood.
In the US, the market is moving towards third-generation deployment of technology services. Kailas Rao, an independent entrepreneur and chairman and president of Industar Digital PCS, shared his experiences in reaching where he is today: he has established a highway, ensures a lot of traffic (information) passes over it, and collects a lot of dimes as a fee for the same. "There are innumerable areas and business models in wireless technology. It's not just wireless voice transfer but wireless data transmission, wireless inventory control, wireless scanners, wireless remote-meter reading, wireless remote-time machines: a variety of applications that will benefit the customer and make his life more productive, creative and prosperous."
Larry Wood says almost the same thing: why can't the Indian government realise the importance of merging with other wireless service providers around the globe and earn some good money? "For instance, the phone I use in Hong Kong has GSM technology installed, so wherever I move around the globe, I can access the same GSM base and make and receive calls in that country. But though the Indian government has mandated this technology, it does not have any mergers or partnerships with them anywhere around the world.
"There are many visitors and travelers like me who will pay (the Indian government) to make and receive calls or use any of the value-added services while I am staying in this country."
The driving forces for the adoption of wireless technology, both in the West and in India, would be three-fold, according to Umesh Amin, director, Technology and Service Planning, Wireless Strategy Group, AT&T.
One, the market: what is it that people want? Two, money: how long do I wait before I get a return on my investment? And three, competition: a company that enters late adopts a different technology and asks subscribers to come to it because it has something different.
In the US, wireless technology is extremely advanced and providers are dealing with a mature market today, say the experts. Except in the data field, offers Wood: the Internet and IP markets are growing at a phenomenal rate, and this is where third-generation services will become important.
Hibbler also asserted that the two biggest services that people ask for in the US are caller ID and voice mail. "We're becoming more nomadic, and roaming is the hot seat of all activity today. We're trying to get roaming agreements with all important carriers around the world."
It's also a step recommended for India: mergers and tie-ups with service carriers around the world. As Wood says, "there's a convergence of technologies taking place around the world. You have cable companies offering telephone and Internet services, telephone companies offering cable services and vice-versa, and long distance players jumping to niche segments and local markets going global, so mergers and MoUs are an important step forward in this industry."
The bottom line, say the experts, is to make agreements where systems are already in place and hit the roaming end, because it is only when you enter that you will find markets opening up and demand increasing.
Another cautionary word is: Don't worry about technology, businessmen will do that. When the government steps in to set a mandate, the process is long, and in many cases it ends up mandating something which may not be the best business model.
So whether you are in India or the US, only two things matter, to provide a service and to make money. Governments are not professionals. They must back out.
Learn from the experiences not just of the US, but of markets where wireless technology using TDMA has successfully worked: Indonesia, São Paulo in Brazil, Russia. And do not make the same mistakes that they did. Reinventing technology is a costly affair; instead, reinvent the market forces: find what your customer needs and provide him that.
For once he is hooked, he will pay the price to have that piece of value in his daily life. As Anil Raj, chairman and president, Ericsson Communications Private Limited, put it: "You can do almost anything with just one phone and one telephone number. Use it at home; when you go to the office, use it from there; and use it in practically any country you roam."
As the evening drew to a close, the mobile phones stopped playing the Sare jahan se achha tune, and vendors, operators and manufacturers stepped outside for some hors d'oeuvres and drinks, with fresh ideas, new models and a clearer understanding of how wireless can change our lives.
Sun, 04 Jul 2021 14:34:00 -0500 | https://www.rediff.com/computer/1998/sep/10uwcc.htm

Killexams : Should You Buy Shopify (TSX:SHOP) Stock Before Earnings?
Shopify (TSX:SHOP)(NYSE:SHOP) is an Ottawa-based company that provides a commerce platform to North American and international customers. This e-commerce stock proved to be one of the most explosive tech stocks in North America in the latter half of the 2010s. However, it has suffered a major downturn in the first half of this new decade.
Indeed, the company has drawn comparisons to another Canadian cautionary tale: Nortel. Nortel was a telecommunications and data networking equipment giant that saw its stock erupt during the dot-com bubble in the late 1990s. At its height, Nortel stock accounted for more than a third of the total valuation of all companies on the TSX. Its price would crater in response to falling revenues, taking many investors down with it.
Can Shopify stage a comeback after this brutal stretch? Let’s dive in.
Why this top tech stock has been throttled over the past year
Shares of this tech stock have plunged 69% in 2022 as of close on July 25. The stock is down 76% in the year-over-year period. Shopify does deserve some leeway as the TSX Index has suffered from broader volatility since the beginning of the spring season.
Fortunately, the oil and gas price boom saw the energy-heavy TSX Index avoid a more significant downturn. That phenomenon reared its head in the most recent trading session. The S&P/TSX Composite Index increased 121 points on the day, with the energy sector delivering a 3.5% bump. However, the S&P/TSX Capped Information Technology Index suffered a 1.27% decline.
Beyond market volatility, Shopify has also been the target of short-sellers. Short-selling involves borrowing a security and selling it with the intent of buying it back at a lower price at a later date. When it works, the short-seller can collect profits after repaying the loan. Prominent short-seller Andrew Left of Citron Research called the company "dirty" and a "get-rich-quick scheme." There have been looming questions over its financial reporting regarding its merchants in recent years. Shrinking profit margins after the COVID-19 pandemic bump have also drawn the ire of analysts.
Is there reason for optimism for Shopify going forward?
In the middle of July, the company announced that it had cancelled fall internships and put a hold on recruiting. Shopify is not alone in the tech sector. Many companies have scaled back on hiring in the face of soaring inflation, rising interest rates, geopolitical tensions, and economic uncertainty.
Shopify is set to release its second-quarter 2022 results on July 27. This will be a big report for the company as it battles a brutal pullback. Consumers are feeling the squeeze of inflation, which is bad news for e-commerce companies that rely on an active client base. A recession, even a mild one, will be very poor timing for this company in the first half of this decade.
In Q1 2022, Shopify delivered total revenue growth of 22% to $1.2 billion. Meanwhile, adjusted gross profit jumped 14% to $646 million. Its adjusted net income was reported at $25.1 million, or $0.20 per diluted share — down from $254 million, or $2.01 per diluted share, in the previous year. Overall, the company experienced a dramatic decline in revenue and earnings-growth rates compared to past years.
Shopify: Should you buy today?
Shares of this tech stock are trading in solid value territory compared to its industry peers. Meanwhile, it is still on track for strong earnings growth. That said, Shopify is facing mounting competition in the e-commerce space. Investors should keep a close eye on its upcoming earnings report that could make or break its stock performance for the rest of the summer. I’m staying on the sidelines when it comes to this tech stock in the near term.
Tue, 26 Jul 2022 04:24:00 -0500 | Ambrose O'Callaghan | https://www.fool.ca/2022/07/26/should-you-buy-shopify-tsxshop-stock-before-earnings/

Killexams : 0-In Announces New Products Based on Breakthrough Formal Verification Algorithms
Enhanced Assertion-Based Verification Suite Enables Systematic Method for Eliminating Bugs in IC Designs
SAN JOSE, Calif. – January 27, 2003 – Today 0-In Design Automation, the Assertion-Based Verification Company, announced a suite of new products based on powerful new formal verification technologies that increase performance by more than 100X over the previous version. V2.0 of the 0-In Assertion-Based Verification (ABV) Suite combines simulation with static formal and dynamic formal verification to provide a broad solution for fast and thorough functional verification of complex ASICs and System-on-Chip (SoC) devices.
Advanced Technology Finds Tough Bugs Before Silicon
The 0-In ABV Suite provides development teams the power to answer two key questions about the verification process:
Does the design meet the target specification?
Have all the bugs been eliminated before tape-out?
0-In's ABV products answer these questions by enabling assertions to check all aspects of the design's behavior and to catch bugs at the earliest possible point in the verification process. 0-In's V2.0 ABV Suite includes two new formal verification products as well as enhancements to 0-In's dynamic formal verification technology that finds tough, corner-case bugs usually not found until chips are in the lab. The combined power of 0-In's products enables development teams to find bugs missed by every other verification method.
"Finding all bugs prior to tape-out is critical for competing in today's aggressive market environment," said Emil Girczyc, 0-In President and CEO. "V2.0 brings to market breakthrough formal verification algorithms that find the toughest bugs in complex designs before tape out. The power of these new algorithms leaves no place for bugs to hide, allowing design teams to meet aggressive time-to-market requirements."
New Products and Technology
0-In's V2.0 ABV Suite includes two new products, 0-In Checklist and 0-In Confirm, as well as new features and technology for all existing 0-In products.
0-In Checklist uses static netlist-analysis technology to rapidly and automatically find many common syntactic and semantic RTL coding errors, including simulation-to-synthesis mismatch errors, clock domain crossing errors, and others. 0-In Checklist is fast and easy to use, requiring no simulation and producing essentially no false error reports. Indeterminate assertions may be promoted to simulation and formal verification.
0-In Confirm finds deep RTL design bugs that are missed by all other verification methods. In particular, 0-In Confirm targets corner-case or worry-case assertions with deep counterexample (DCE) technology, a breakthrough exhaustive formal verification algorithm that is capable of finding bugs hundreds of cycles away from any selected simulation state. 0-In Confirm also can be used to verify that late-stage bug fixes are correct.
In V2.0, 0-In Search incorporates new algorithms that intelligently analyze and prioritize simulation cycles, increasing speed by 100X over previous releases and enabling users to apply dynamic formal verification across their entire regression suite. Simulation tests guide the formal algorithms to deep states, avoiding computational limitations. Dynamic formal verification technology then uses exhaustive formal algorithms to find bugs that simulation misses.
0-In Check includes the CheckerWare library, a rich library of over 70 Verilog assertion checkers that work in both simulation and formal verification and are testbench- and simulator-independent. V2.0 adds support for Accellera assertion standards, improves simulation performance and incorporates new coverage metrics for assertions.
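CheckerWare's checkers are Verilog modules that monitor a design in both simulation and formal verification. As a language-neutral illustration of what such an assertion checker does, the Python sketch below applies a "one-hot" check (a common hardware invariant: exactly one bit of a signal set at a time) to a recorded signal trace. This is an analogy to show the checking logic only, not 0-In code, and real checkers run cycle-by-cycle inside the simulator rather than over a recorded list.

```python
# Illustrative sketch (not 0-In code): an assertion checker in the spirit
# of a hardware "one-hot" monitor, applied to a recorded simulation trace.
# In real ABV flows these checkers are HDL modules bound to design signals.

def is_one_hot(value, width):
    """True when exactly one of the low `width` bits of value is set."""
    masked = value & ((1 << width) - 1)
    return masked != 0 and (masked & (masked - 1)) == 0

def check_one_hot_trace(trace, width):
    """Return the cycle numbers where the one-hot assertion fails."""
    return [cycle for cycle, value in enumerate(trace)
            if not is_one_hot(value, width)]

# A 4-bit state register that should always be one-hot; cycle 2 is a bug
# (two bits set at once), the kind of corner case assertions catch early.
trace = [0b0001, 0b0010, 0b0110, 0b1000]
print(check_one_hot_trace(trace, 4))  # [2]
```

The value of the approach is that the same invariant, once written, can be exercised by simulation, by dynamic formal search, and by static proof engines.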
The 0-In V2.0 ABV Suite provides value to designers and verification engineers throughout the entire development cycle, delivering a comprehensive assertion-based verification methodology that works from block-level through system-level verification, including regression testing, simulation acceleration, and hardware emulation. 0-In products are all simulator- and testbench-independent, making them applicable throughout the verification process. "Our new products are designed to address the customer's specific verification goals at each phase of the design cycle, no matter which vendor's tools are being used at each phase," noted Dr. Girczyc.
0-In supports Accellera assertion standards and 0-In products are interoperable with a wide range of tools from other EDA vendors.
Customers ABV products from 0-In have been in production usage for nearly three years, which has generated feedback from numerous tape-outs and provided the impetus for V2 enhancements. 0-In's ABV tools are used today by leading design teams at AMD, Cisco Systems, Fujitsu, Hewlett Packard, LSI Logic, National Semiconductor, Nortel Networks, Sun Microsystems, and many other system and semiconductor suppliers.
"We've used 0-In products to catch design issues early and to Strengthen the quality of our RTL," said Jonathan Sun, EDA Technologies Manager at Sun Microsystems, Inc., "V2.0 will enable us to further leverage the benefits of static and dynamic formal verification tools in our assertion-based verification flow."
"0-In tools are effective at finding tough, corner-case bugs that otherwise would go undetected," said Gordon Mortensen, Director of Engineering for the Internet Appliance Group at National Semiconductor. "On a recent SoC project, 0-In Search identified bugs that had a high probability of otherwise making it into silicon. We definitely had increased confidence after using the 0-In tools. That confidence was confirmed when the chip was fabricated and tested in the lab – we have not found any bugs in modules verified with 0-In."
Packaging, pricing, and availability
V2.0 of the 0-In ABV Suite products is available now. North American list prices for one-year time-based licenses are:
About 0-In
0-In Design Automation, Inc. (pronounced "zero-in") develops and supports functional verification products that help verify multi-million gate application-specific integrated circuit (ASIC) and system-on-chip (SoC) designs. The company delivers a comprehensive assertion-based verification (ABV) solution that provides value throughout the design and verification cycle – from the block level to the chip and system level. Twelve of the 15 largest electronics companies have adopted 0-In tools and methodologies in their integrated circuit (IC) design verification flows. 0-In was founded in 1996 and is based in San Jose, Calif. For more information, see http://www.0-in.com.
Sun, 26 Jun 2022 12:00:00 -0500 | https://www.design-reuse.com/news/4743/0-products-breakthrough-formal-verification-algorithms.html

Killexams : A Good-Looking Couple of Canadians
I didn't think so. After all, Gretzky is "The Great One," Bret Hart is "The Hit Man," Pamela Anderson is, quite simply, feminine pulchritude, and Nortel has been and remains bullish. I thought it ...
Mon, 01 Aug 2022 12:00:00 -0500 | https://www.thestreet.com/opinion/a-good-looking-couple-of-canadians-844576

Killexams : Troubling Natural Categories: Engaging the Medical Anthropology of Margaret Lock
Fri, 23 Mar 2018 16:22:00 -0500 | https://www.jstor.org/stable/j.ctt32b73f

Killexams : SaaS DR/BC: If You Think Cloud Data is Forever, Think Again
SaaS is quickly becoming the default tool for how we build and scale businesses. It’s cheaper and faster than ever before. However, this reliance on SaaS comes with one glaring risk that’s rarely discussed.
The “Shared Responsibility Model” doesn’t just govern your relationship with AWS, it actually impacts all of cloud computing. Even for SaaS, users are on the hook for protecting their own data.
Human error, cyber threats and integrations that have gone wrong are the main causes of data loss in SaaS. And it's not uncommon: in one study, about 40% of users said they had lost data in SaaS applications.
It’s possible to create your own in-house solution to help automate some of the manual work around backing-up SaaS data. However, there are limitations to this approach and none of them will help you restore data back to its original state.
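A homegrown backup job along those lines typically pages through a SaaS export API and writes snapshots to storage the business controls. The Python sketch below shows the general shape of such a job; the endpoint, record fields and pagination scheme are hypothetical (`fetch_page` stands in for a real HTTP call to a provider's API), and, as noted, a script like this captures data but does nothing to restore it back into the application.

```python
# Hedged sketch of an in-house SaaS backup job: page through a
# (hypothetical) export API and write a timestamped JSON snapshot to
# local disk. Real SaaS APIs differ in auth, pagination and rate limits.

import json
import time
from pathlib import Path

def fetch_page(page):
    """Stand-in for an HTTP call to a SaaS export endpoint."""
    sample = {1: [{"id": 1, "name": "Alice"}], 2: [{"id": 2, "name": "Bob"}]}
    return sample.get(page, [])

def backup_records(dest_dir):
    """Pull every page of records and snapshot them all to one JSON file."""
    records, page = [], 1
    while True:
        batch = fetch_page(page)
        if not batch:          # empty page signals the end of the export
            break
        records.extend(batch)
        page += 1
    dest = Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)
    path = dest / f"backup-{int(time.time())}.json"
    path.write_text(json.dumps(records, indent=2))
    return path, len(records)

path, count = backup_records("saas-backups")
print(count)  # 2
```

Even this simple shape surfaces the limitations mentioned above: the snapshot is only as fresh as the last run, it misses anything the API does not export, and turning a JSON dump back into live application state is a separate, much harder problem.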
A data continuity strategy is essential in SaaS; otherwise, you may be scrambling to restore all the information you rely on each and every day.
The Cloud is Not Forever and Neither is Your Data
When I began my career in technical operations (mostly what we call DevOps today) the world was dramatically different. This was before the dawn of the new millennium. When the world’s biggest and most well-known SaaS company, Salesforce, was operating out of an apartment in San Francisco.
Back then, on-premise ruled the roost. Rows of towers filled countless rooms. These systems were expensive to set up and maintain, from both a labour and parts perspective. Building a business using only SaaS applications was technically possible back then but logistically a nightmare. On-prem would continue to be the default way for running software for years to come.
But technology always progresses at lightspeed. So just three years after Salesforce began preaching the “end of software”, Amazon Web Services came online and changed the game completely.
Today a new SaaS tool can be built and deployed across the world in mere days. Businesses are now embracing SaaS solutions at a record pace. The average small to medium-sized business can easily have over 100 SaaS applications in their technology stack. Twenty years ago, having this many applications to run a business was unthinkable and would have cost millions of dollars in operational resources. However, at Rewind, where I oversee technical operations, I looked after our software needs with a modem and a laptop.
SaaS has created a completely different reality for modern businesses. We can build and grow businesses cheaper and faster than ever before. Like most “too good to be true” things, there’s a catch. All this convenience comes with one inherent risk. It’s a risk that was rarely discussed in my early days in DevOps and is still rarely talked about. Yet this risk is important to understand; otherwise, all the vital SaaS data you rely on each and every day could disappear in the blink of an eye.
And it could be gone for good.
The Shared Responsibility of SaaS
This likely goes without saying, but you rent SaaS applications; you don’t own them. The giant server rooms companies housed on-prem years ago now rest with the SaaS provider. You simply access their servers (and your data) through an operating system or API. Now you are probably thinking, “Dave, I know all this. So what?”
Well, this is where the conundrum lies.
If you look at the terms of service for SaaS companies, they do their best to ensure their applications are up and running at all times. It doesn’t matter if servers are compromised by fire, meteor strike, or just human error, SaaS companies strive to ensure that every time a user logs in, the software is available. The bad news is this is where their responsibility ends.
You, the user, are on the hook for backing up and restoring whatever data you’ve entered and stored in their services. Hence the term “Shared Responsibility Model”. This term is most associated with AWS but this model actually governs all of cloud computing.
The above chart breaks down the various scenarios for protecting elements of the cloud computing relationship. You can see that with the SaaS model, the largest onus is on the software provider. Yet there are still things a user is responsible for: user access and data.
I’ve talked to other folks in DevOps, site reliability, or IT roles in recent years, and I can tell you that the level of skepticism is high. They often don’t believe that their data isn’t backed up by the SaaS provider in real time. I empathize with them, though, because I was once in their shoes. So when I meet this resistance, I just point people to the various terms of service laid out by each SaaS provider. Here is GitHub’s, here is Shopify’s, and the one for Office 365. It’s all there in black and white.
The reason the Shared Responsibility Model exists in the first place essentially comes down to the architecture of each application. A SaaS provider has built its software to maximize the use of its operating system, not to continually snapshot and store the millions or billions of data points created by users. Now, this is not a “one-size-fits-all” scenario. Some SaaS providers may be able to restore lost data. However, if they do, in my experience, it’s often an old snapshot, it’s incomplete, and the process to get everything back can take days, if not weeks.
Again, it’s simply because SaaS providers are lumping all user data together, in a way that makes sense for the provider. Trying to find it again, once it’s deleted or compromised, is like looking for a needle in a haystack, within a field of haystacks.
How Data Loss Happens in SaaS
The likelihood of losing data from a SaaS tool is the next question that inevitably comes up. One study conducted by Oracle and KPMG found that 49% of SaaS users have previously lost data. Our own research found that 40% of users have previously lost data. There are really three ways this happens, risks that you may already be very aware of: human error, cyberthreats, and third-party app integrations.
Humans and technology have always had co-dependent challenges. Let’s face it, it’s one of the main reasons my career exists! So it stands to reason that human interference, whether deliberate or not, is a common reason for losing information. This can be as innocuous as uploading a CSV file that corrupts data sets, accidentally deleting product listings, or overwriting code repositories with a forced push.
There’s also intentional human interference. This means someone who has authorized access, nuking a bunch of stuff. It may sound far-fetched but we have seen terminated employees or third-party contractors cause major issues. It’s not very common, but it happens.
Cyberthreats are next on the list, which are all issues that most technical operations teams are used to. Most of my peers are aware that the level of attacks increased during the global pandemic, but the rate of attacks had already been increasing prior to COVID-19. Ransomware, phishing, DDoS, and more are all being used to target and disrupt business operations. If this happens, data can be compromised or completely wiped out.
Finally, third-party app integrations can be a source of frustration when it comes to data loss. Go back and read the terms of service for apps connected to your favourite SaaS tool. They may save a ton of time, but they may also have a lot of control over all the data you create and store in these tools. We’ve seen apps override and permanently delete reams of data. By the time teams catch it, the damage is already done.
There are some other ways data can be lost, but these are the most common. The good news is that you can take steps to mitigate downtime. I’ll outline a common one, which is writing your own backup script for a Git repository.
One approach to writing a GitHub backup script
There are a lot of ways to approach this. Simply Google “git backup script” and lots of options pop up. All of them have their quirks and limitations. Here is a quick rundown of some of them.
Creating a local backup with cron scripts
Essentially, you write a script to clone a repo at various intervals using cron jobs (the cron tooling will depend on your OS). This method takes snapshots over time; to restore a lost repo, you just pick the snapshot you want to bring back. For a complete copy, use git clone --mirror to mirror your repositories. This ensures all remote and local branches, tags, and refs get included.
The pros of using this method: no reliance on external backup tools, and the only cost is your time.
The cons are a few. You won’t actually have a full backup: the clone won’t include hooks, reflogs, configuration, description files, and other metadata. It’s also a lot of manual work, and it becomes more complex if you try to add error monitoring, logging, and error notification. And finally, as the snapshots pile up, you’ll need to account for cleanup and archiving.
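To make the approach concrete, here is a minimal sketch of such a snapshot script. It is an illustration, not a hardened tool: the repo URL, backup root, and retention count are placeholder values, and the error monitoring and notification mentioned above are deliberately left out.

```python
#!/usr/bin/env python3
"""Minimal sketch of a cron-driven git snapshot backup.

REPO_URL, BACKUP_ROOT, and KEEP_SNAPSHOTS are placeholders --
adjust them for your environment.
"""
import datetime
import pathlib
import shutil
import subprocess

REPO_URL = "git@github.com:example-org/example-repo.git"     # placeholder
BACKUP_ROOT = pathlib.Path("/var/backups/git/example-repo")  # placeholder
KEEP_SNAPSHOTS = 14  # e.g. two weeks of daily snapshots


def take_snapshot(now=None, run=subprocess.run):
    """Mirror-clone the repo into a timestamped folder and return its path."""
    now = now or datetime.datetime.utcnow()
    dest = BACKUP_ROOT / now.strftime("%Y%m%d-%H%M%S")
    # --mirror copies all refs (branches, tags) but, as noted above,
    # not hooks, reflogs, config, description files, or other metadata.
    run(["git", "clone", "--mirror", REPO_URL, str(dest)], check=True)
    return dest


def prune_old_snapshots():
    """Delete the oldest snapshots beyond the retention count."""
    snapshots = sorted(p for p in BACKUP_ROOT.iterdir() if p.is_dir())
    for old in snapshots[:-KEEP_SNAPSHOTS]:
        shutil.rmtree(old)
```

A small `__main__` wrapper calling `take_snapshot()` and then `prune_old_snapshots()` could be scheduled with a crontab entry such as `0 2 * * * python3 /opt/backup/git_snapshot.py` (path hypothetical).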
Using Syncthing
Syncthing is a GUI/CLI application that allows for file syncing across many devices. All the devices need to have Syncthing installed on them and be configured to connect with one another. Keep in mind that syncing and backing up are different: you are not creating a copy, but rather ensuring a file is identical across multiple devices.
The pros are that it is free and one of the more intuitive methods for a DIY “backup”, since it provides a GUI. The cons are several. Syncthing only works between individual devices, so you can’t directly back up your repository from a code hosting provider. Manual fixes are needed when errors occur. Also, syncing a git repo could lead to corruption and conflicts in a repository, especially if people work on different branches. Syncthing also sucks up a lot of resources with its continuous scanning, hashing, and encryption. Lastly, it only maintains one version, not multiple snapshots.
Using SCM Backup
SCM Backup creates an offline clone of a GitHub or BitBucket repository. It makes a significant difference if you are trying to back up many repos at once. After the initial configuration, it grabs a list of all the repositories through an API. You can also exclude certain repos if need be.
SCM lets you specify backup folder location, authentication credentials, email settings, and more.
Here’s the drawback, though: the copied repositories do not contain hooks, reflogs, or configuration files, nor metadata such as issues, pull requests, or releases. Configuration settings can also change across different code hosting providers. Finally, in order to run it, you need to have .NET Core installed on your machine.
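For comparison, the repository-discovery step that SCM Backup automates can be approximated against GitHub’s public REST API (`GET /users/{user}/repos`). This is a rough sketch, not a full tool: it only covers public repos, ignores pagination beyond the first 100 results, and skips authentication entirely.

```python
import json
import urllib.request


def list_public_repos(user, opener=urllib.request.urlopen):
    """Return clone URLs for a user's public GitHub repos.

    Note: private repos require an auth token, and accounts with
    more than 100 repos need to follow the API's pagination links.
    """
    url = f"https://api.github.com/users/{user}/repos?per_page=100"
    with opener(url) as resp:
        repos = json.load(resp)
    return [repo["clone_url"] for repo in repos]
```

Each returned URL could then be fed to a `git clone --mirror` loop; excluding certain repos, as SCM Backup allows, becomes a simple filter on this list.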
Now that’s just three ways to backup a git repository. As I mentioned before, just type a few words into Google and a litany of options comes up. But before you get the dev team to build a homegrown solution, keep these two things in mind.
First, any DIY solution will still require a significant amount of manual work, because these scripts only clone and/or back up; they can’t restore data. In fact, that’s actually the case with most SaaS tools, not just in-house backup solutions. So although you may have some snapshots or cloned files, they will likely be in a format that needs to be re-uploaded into the SaaS tool. One way around this is to build a backup-as-a-service program, but that will likely eat up a ton of developer time.
That brings us to the second thing to keep in mind, the constantly changing states of APIs. Let’s say you build a rigorous in-house tool: you’ll need a team to be constantly checking for API updates, and then making the necessary changes to this in-house tool so it’s always working. I can only speak for myself, but I’m constantly trying to help dev teams avoid repetitive menial tasks. So although creating a DIY backup script can work, you need to decide where you want development teams to spend their time.
Data Continuity Strategies for SaaS
So what’s the way forward in all of this? There are a few things to consider. And these steps won’t be uncommon to most technical operations teams. First, figure out whether you want to DIY or outsource your backup needs. We already covered the in-house options and the challenges it presents. So if you decide to look for a backup and recovery service, just remember to do your homework. There are a lot of choices, so as you go through due diligence, look at reviews, talk to peers, read technical documentation and honestly, figure out if company X seems trustworthy. They will have access to your data after all.
Next, audit all your third-party applications. I won’t sugarcoat it, this can be a lot of work. But remember the “terms of service” agreements? There are always a few surprises to be found. And you may not like what you see. I recommend you do this about once a year and make a pro/cons list. Is the value you get from this app worth the trade-off of access the app has? If it’s not, you may want to look for another tool. Fun fact: Compliance standards like SOC2 require a “vendor assessment” for a reason. External vendors or apps are a common culprit when it comes to accidental data loss.
And finally, limit who has access to each and every SaaS application. Most people acknowledge the benefits of a least-privilege approach, but it isn’t always put into practice. So make sure the right people have the right access, ensure all users have unique login credentials (use a password manager to manage the multiple-login hellscape), and enable MFA.
It’s not a laundry list of things nor is it incredibly complex. I truly believe that SaaS is the best way to build and run organizations. But I hope now it’s glaringly obvious to any DevOps, SRE or IT professional that you need to safeguard all the information that you are entrusting to these tools. There is an old saying I learned in those early days of my career, “There are two types of people in this world – those who have lost data and those who are about to lose data”.
You don’t want to be the person who has to inform your CIO that you are now one of those people. Of course, if that happens, feel free to send them my way. I’m certain I’ll be explaining the Shared Responsibility Model of SaaS until my career is over!
About the Author
Dave North has been a versatile member of the Ottawa technology sector for more than 25 years. Dave is currently working at Rewind leading 3 teams (devops, trust, IT) as the director of technical operations. Prior to Rewind, Dave was a long time member of Signiant, holding many roles in the organization including sales engineer, pro services, technical support manager, product owner and devops director. A proven leader and innovator, Dave holds 5 US patents and helped drive Signiant’s move to a cloud SAAS business model with the award winning Media Shuttle product. Prior to Signiant, Dave held several roles at Nortel, Bay Networks and ISOTRO Network Management working on the NetID product suite. Dave is fanatical about cloud computing, automation, gadgets and Formula 1 racing.