Hitoshi Kume, a recipient of the 1989 Deming Prize for use of quality principles, defines problems as "undesirable results of a job." Quality improvement efforts work best when problems are addressed systematically using a consistent and analytic approach; the methodology shouldn't change just because the problem changes. Keeping the steps to problem-solving simple allows workers to learn the process and how to use the tools effectively.
The most commonly used and well-known quality process is the plan/do/check/act (PDCA) cycle (Figure 1), which is easy to implement and follow up. Other processes are variations on this method, much as today's computers are descendants of the original IBM system. The PDCA cycle promotes continuous improvement and should thus be visualized as a spiral instead of a closed circle.
Another popular quality improvement process is the six-step PROFIT model in which the acronym stands for:
P = Problem definition.
R = Root cause identification and analysis.
O = Optimal solution based on root cause(s).
F = Finalize how the corrective action will be implemented.
I = Implement the plan.
T = Track the effectiveness of the implementation and verify that the desired results are met.
If the desired results are not met, the cycle is repeated. Both the PDCA and the PROFIT models can be used for problem solving as well as for continuous quality improvement. In companies that follow total quality principles, whichever model is chosen should be used consistently in every department or function in which quality improvement teams are working.
Figure 1. The most common process for quality improvement is the plan/do/check/act cycle outlined above. The cycle promotes continuous improvement and should be thought of as a spiral, not a circle.
Once the basic problem-solving or quality improvement process is understood, the addition of quality tools can make the process proceed more quickly and systematically. Seven simple tools can be used by any professional to ease the quality improvement process: flowcharts, check sheets, Pareto diagrams, cause and effect diagrams, histograms, scatter diagrams, and control charts. (Some books describe a graph instead of a flowchart as one of the seven tools.)
The concept behind the seven basic tools came from Kaoru Ishikawa, a renowned quality expert from Japan. According to Ishikawa, 95% of quality-related problems can be resolved with these basic tools. The key to successful problem resolution is the ability to identify the problem, use the appropriate tools based on the nature of the problem, and communicate the solution quickly to others. Inexperienced personnel might do best by starting with the Pareto chart and the cause and effect diagram before tackling the use of the other tools. Those two tools are used most widely by quality improvement teams.
Flowcharts describe a process in as much detail as possible by graphically displaying the steps in proper sequence. A good flowchart should show all process steps under analysis by the quality improvement team, identify critical process points for control, suggest areas for further improvement, and help explain and solve a problem.
The flowchart in Figure 2 illustrates a simple production process in which parts are received, inspected, and sent to subassembly operations and painting. After completing this loop, the parts can be shipped as subassemblies after passing a final test or they can complete a second cycle consisting of final assembly, inspection and testing, painting, final testing, and shipping.
Figure 2. A basic production process flowchart displays several paths a part can travel from the time it hits the receiving dock to final shipping.
Flowcharts can be simple, such as the one featured in Figure 2, or they can be made up of numerous boxes, symbols, and if/then directional steps. In more complex versions, flowcharts indicate the process steps in the appropriate sequence, the conditions in those steps, and the related constraints by using elements such as arrows, yes/no choices, or if/then statements.
Check sheets help organize data by category. They show how many times each particular value occurs, and their information becomes more useful as more data are collected. At least 50 observations should be available for this tool to be really useful. Check sheets minimize clerical work, since the operator merely adds a mark to the tally on the prepared sheet rather than writing out a figure (Figure 3). By showing the frequency of a particular defect (e.g., in a molded part) and how often it occurs in a specific location, check sheets help operators spot problems. The check sheet example shows a list of molded part defects on a production line over one week. One can easily see where to set priorities based on the results shown on this check sheet. Assuming the production flow is the same each day, the part with the largest number of defects carries the highest priority for correction.
Figure 3. Because it clearly organizes data, a check sheet is the easiest way to track information.
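The tallying a check sheet performs can be sketched in a few lines of code; the defect names and observation counts below are hypothetical, not taken from the figure:

```python
from collections import Counter

# Hypothetical defect observations recorded over one week; each entry
# is the defect type an operator observed on a molded part.
observations = [
    "flash", "short shot", "flash", "sink mark", "flash",
    "short shot", "flash", "warp", "flash", "sink mark",
]

# A check sheet is, at its core, a frequency count per category;
# each mark the operator adds increments a tally.
check_sheet = Counter(observations)

# Print a tally-style sheet, most frequent defect first.
for defect, count in check_sheet.most_common():
    print(f"{defect:<12} {'|' * count}  ({count})")
```

The most frequent category surfaces at the top of the sheet, which is exactly where the correction priority lands.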
The Pareto diagram is named after Vilfredo Pareto, a 19th-century Italian economist who postulated that a large share of wealth is owned by a small percentage of the population. This basic principle translates well into quality problems—most quality problems result from a small number of causes. Quality experts often refer to the principle as the 80-20 rule; that is, 80% of problems are caused by 20% of the potential sources.
A Pareto diagram puts data in a hierarchical order (Figure 4), which allows the most significant problems to be corrected first. The Pareto analysis technique is used primarily to identify and evaluate nonconformities, although it can summarize all types of data. It is perhaps the diagram most often used in management presentations.
Figure 4. By rearranging random data, a Pareto diagram identifies and ranks nonconformities in the quality process in descending order.
To create a Pareto diagram, the operator collects random data, regroups the categories in order of frequency, and creates a bar graph based on the results.
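The steps above can be sketched in code; the defect categories and counts are hypothetical:

```python
# Hypothetical defect counts taken from a week's check sheet.
defects = {"flash": 41, "short shot": 17, "sink mark": 9, "warp": 6, "other": 2}

# Step 1: regroup the categories in descending order of frequency.
ordered = sorted(defects.items(), key=lambda kv: kv[1], reverse=True)

# Step 2: compute the cumulative percentage, the line usually drawn
# above the bars of a Pareto diagram.
total = sum(defects.values())
cumulative = 0
for category, count in ordered:
    cumulative += count
    print(f"{category:<12} {count:>3}  {100 * cumulative / total:5.1f}%")
```

With these illustrative numbers, the top two of five categories account for well over half the defects, the pattern the 80-20 rule describes.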
Cause and effect diagrams
The cause and effect diagram is sometimes called an Ishikawa diagram after its inventor. It is also known as a fish bone diagram because of its shape. A cause and effect diagram describes a relationship between variables: the undesirable outcome is shown as the effect, and related causes are shown as leading to, or potentially leading to, that effect. This popular tool has one severe limitation, however, in that users can overlook important, complex interactions between causes. Thus, if a problem is caused by a combination of factors, it is difficult to use this tool to depict and solve it.
A fish bone diagram displays all contributing factors and their relationships to the outcome to identify areas where data should be collected and analyzed. The major areas of potential causes are shown as the main bones, e.g., materials, methods, people, measurement, machines, and design (Figure 5). Later, the subareas are depicted. Thorough analysis of each cause can eliminate causes one by one, and the most probable root cause can be selected for corrective action. Quantitative information can also be used to prioritize means for improvement, whether it be to machine, design, or operator.
Figure 5. Fish bone diagrams display the various possible causes of the final effect. Further analysis can prioritize them.
The histogram plots data in a frequency distribution table. What distinguishes the histogram from a check sheet is that its data are grouped into classes so that the identity of individual values is lost. Commonly used to present quality improvement data, histograms work best with small amounts of data that vary considerably. When used in process capability studies, histograms can display specification limits to show what portion of the data does not meet the specifications.
After the raw data are collected, they are grouped in value and frequency and plotted in a graphical form (Figure 6). A histogram's shape shows the nature of the distribution of the data, as well as central tendency (average) and variability. Specification limits can be used to display the capability of the process.
Figure 6. A histogram is an easy way to see the distribution of the data, its average, and variability.
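The grouping step can be sketched as follows; the measurements, bin range, and bin count are illustrative assumptions:

```python
# Hypothetical rod-length measurements in millimeters.
data = [9.8, 10.1, 10.0, 9.9, 10.2, 10.4, 9.7, 10.0, 10.1, 9.9,
        10.3, 10.0, 9.8, 10.2, 10.1, 9.9, 10.0, 10.1, 9.6, 10.0]

# Group the raw values into equal-width bins; once binned, the
# identity of the individual values is lost.
low, high, nbins = 9.5, 10.5, 5
width = (high - low) / nbins
counts = [0] * nbins
for x in data:
    i = min(int((x - low) / width), nbins - 1)
    counts[i] += 1

mean = sum(data) / len(data)

# Render a text histogram, one row per bin.
for i, c in enumerate(counts):
    left = low + i * width
    print(f"{left:4.1f}-{left + width:4.1f}  {'#' * c}")
print(f"mean = {mean:.3f}")
```

The shape of the bar lengths shows the distribution, and the mean gives the central tendency the text mentions.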
A scatter diagram shows how two variables are related and is thus used to test for cause and effect relationships. It cannot prove that one variable causes the change in the other, only that a relationship exists and how strong it is. In a scatter diagram, the horizontal (x) axis represents the measurement values of one variable, and the vertical (y) axis represents the measurements of the second variable. Figure 7 shows part clearance values on the x-axis and the corresponding quantitative measurement values on the y-axis.
Figure 7. The plotted data points in a scatter diagram show the relationship between two variables.
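The strength of the relationship a scatter diagram displays is commonly quantified with Pearson's correlation coefficient; a minimal sketch, using hypothetical clearance and measurement pairs:

```python
import math

# Hypothetical paired measurements: part clearance (x-axis) and the
# corresponding quantitative measurement (y-axis).
x = [0.5, 0.8, 1.0, 1.2, 1.5, 1.8, 2.0, 2.3]
y = [2.1, 2.6, 2.8, 3.2, 3.9, 4.1, 4.6, 5.0]

# Pearson's r measures the strength of the linear relationship the
# scatter diagram displays; |r| near 1 is strong, near 0 is weak.
n = len(x)
mx, my = sum(x) / n, sum(y) / n
cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
sx = math.sqrt(sum((a - mx) ** 2 for a in x))
sy = math.sqrt(sum((b - my) ** 2 for b in y))
r = cov / (sx * sy)
print(f"r = {r:.3f}")
```

As the text cautions, a strong r shows only that a relationship exists; it does not prove that one variable causes the change in the other.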
A control chart displays statistically determined upper and lower limits drawn on either side of a process average. This chart shows if the collected data are within upper and lower limits previously determined through statistical calculations of raw data from earlier trials.
The construction of a control chart is based on statistical principles and statistical distributions, particularly the normal distribution. When used in conjunction with a manufacturing process, such charts can indicate trends and signal when a process is out of control. The center line of a control chart represents an estimate of the process mean; the upper and lower critical limits are also indicated. The process results are monitored over time and should remain within the control limits; if they do not, an investigation is conducted for the causes and corrective action taken. A control chart helps determine variability so it can be reduced as much as is economically justifiable.
In preparing a control chart, the mean, upper control limit (UCL), and lower control limit (LCL) of an approved process are calculated from its data. A blank control chart showing the mean, UCL, and LCL, but no data points, is created; data points are added as they are calculated from the raw data.
Figure 8. Data points that fall outside the upper and lower control limits lead to investigation and correction of the process.
Figure 8 is based on 25 samples or subgroups. For each sample, which in this case consists of five rods, measurements are taken of a quality characteristic (in this example, length). These data are then grouped in table form (as shown in the figure), and the average and range of each subgroup are calculated, as are the grand average and the average of all ranges. These figures are used to calculate the UCL and LCL. For the control chart in the example, the limits are the grand average plus or minus A2 times the average range, where A2 is a constant taken from the table of constants for variable control charts. The constant depends on the subgroup size, which is five in this example.
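The calculation described above can be sketched as follows, using the commonly tabulated value A2 = 0.577 for subgroups of five; the measurements are hypothetical, and only four subgroups are shown rather than the 25 in the figure:

```python
# Hypothetical rod-length subgroups of five measurements each (the
# figure uses 25 subgroups; four are shown to keep the sketch short).
subgroups = [
    [10.1, 9.9, 10.0, 10.2, 9.8],
    [10.0, 10.1, 9.9, 10.0, 10.1],
    [9.8, 10.2, 10.0, 9.9, 10.1],
    [10.2, 10.0, 10.1, 9.9, 10.0],
]

A2 = 0.577  # tabulated constant for variable control charts, subgroup size 5

# Average and range of each subgroup, then the grand average and the
# average range across all subgroups.
xbars = [sum(s) / len(s) for s in subgroups]
ranges = [max(s) - min(s) for s in subgroups]
grand_avg = sum(xbars) / len(xbars)
rbar = sum(ranges) / len(ranges)

# Control limits for the averages chart: grand average +/- A2 * R-bar.
ucl = grand_avg + A2 * rbar
lcl = grand_avg - A2 * rbar
print(f"center = {grand_avg:.4f}  UCL = {ucl:.4f}  LCL = {lcl:.4f}")
```

Subgroup averages plotted over time should stay between the LCL and UCL; any point outside them triggers the investigation the text describes.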
Many people in the medical device manufacturing industry are undoubtedly familiar with many of these tools and know their application, advantages, and limitations. However, manufacturers must ensure that these tools are in place and being used to their full advantage as part of their quality system procedures. Flowcharts and check sheets are most valuable in identifying problems, whereas cause and effect diagrams, histograms, scatter diagrams, and control charts are used for problem analysis. Pareto diagrams are effective for both areas. By properly using these tools, the problem-solving process can be more efficient and more effective.
Those manufacturers who have mastered the seven basic tools described here may wish to further refine their quality improvement processes. A future article will discuss seven new tools: relations diagrams, affinity diagrams (K-J method), systematic diagrams, matrix diagrams, matrix data diagrams, process decision programs, and arrow diagrams. These seven tools are used less frequently and are more complicated.
Ashweni Sahni is director of quality and regulatory affairs at Minnetronix, Inc. (St. Paul, MN), and a member of MD&DI's editorial advisory board.
Today’s skilled nursing facilities face many challenges in the delivery of high-quality and safe care for vulnerable patient populations.
Significantly, organizations' resources are severely limited, the result of inadequate funding streams combined with growing needs. Workforce shortages existed before the pandemic and have been compounded by it. Staff members typically serve multiple roles, including efforts dedicated to meeting quality regulatory requirements.
As acuity of care increases, clinical needs grow, and regulatory demands change and escalate. The stage is too often set for an already busy and burdened staff to address quality and safety functions with a compliance focus, merely collecting and reporting data. It may seem like the best that can be done.
It’s not, and the golden opportunity of actually improving patient safety and quality and lowering overall costs is missed.
Is your team ready to improve quality and safety?
It’s one thing to comply with extensive regulatory requirements for data collection and reporting. It’s quite another to take the data and use it to improve what pains you in a skilled nursing facility.
The good news is that the data collected for regulatory and compliance purposes can be parlayed into insights that drive improved quality and safety for your patients and residents while also driving down costs, making your business more sustainable.
Once you have the data, the next step is data analysis — understanding the why behind it. It’s a necessary step (and skill) to be learned and activated. When the why or upstream cause is understood, it’s more likely that your team can focus on innovation and improvements.
Not only that, but the rapid tests of change and improvement that the data informs can generate excitement and momentum toward positive outcomes. It can be the motivator your team needs to feel even better about the work they are doing for your patients and residents.
So, what comes next?
Identify your quality champions and find ways to enhance their knowledge and skills in quality and safety. These are the people on your team who are consummate problem-solvers and who live to make things better. Support their spirit of improvement and back it up with an investment in skilling or upskilling. Energy met with knowledge, skills and tools to do the work produces outstanding outcomes.
Most employers are surprised to learn that for minimal investment, they can unleash the potential of their teams. The result is a more skilled and a more engaged workforce that feels valued by their employer. This improves retention — a challenge that everyone is facing.
Some leaders may think that this type of investment, though small, is just not feasible. Our response to that is: is the status quo feasible? Why not try to support and leverage the largest expenditure in your budget: your staff?
Others may say, “The workforce is too transient. As soon as I train them, they will leave.” Hopefully not, but perhaps. If the workforce really is that transient, though, shouldn’t the industry commit to a more skilled workforce overall, so that a baseline of skills can be expected no matter where staff move?
The time to address the toughest challenges is now. Waiting for a silver bullet to fix your situation or accepting a slow, painful decline and demise of your business is not the answer. Your business can thrive, your staff will thrive and the community needs you.
Stephanie Mercado, CAE, CPHQ, is CEO and executive director of the National Association for Healthcare Quality, the leader in industry-standard healthcare quality and safety competencies, training, and certification for individuals working in healthcare quality.
Lenard Parisi, RN, MA, CPHQ, FNAHQ, is senior director of nursing, quality and magnet recognition at Mount Sinai Hospital in New York City. He is a Certified Professional in Healthcare Quality (CPHQ) and Fellow of the National Association for Healthcare Quality (FNAHQ) and has held administrative and clinical leadership positions in many of New York’s renowned academic medical centers and health systems including NewYork-Presbyterian, Memorial Sloan Kettering Cancer Center and Metropolitan Jewish Health System.
The opinions expressed in McKnight’s Long-Term Care News guest submissions are the author’s and are not necessarily those of McKnight’s Long-Term Care News or its editors.
In the pharmaceutical industry, a vendor refers to an external entity or supplier that provides goods, materials, services, or specialized expertise to pharmaceutical companies. These vendors play a crucial role in supporting various aspects of the pharmaceutical business, ranging from research and development to manufacturing and distribution of medicinal products. They are an integral part of the pharmaceutical supply chain and contribute to the overall success of pharmaceutical operations. Vendors can encompass a diverse range of entities, each with its own specific function and contribution within the industry.
Raw Material and Ingredient Suppliers:
Vendors in this category supply the essential components required for pharmaceutical manufacturing. These can include Active Pharmaceutical Ingredients (APIs), excipients, solvents, chemicals, and other raw materials necessary for drug manufacturing and formulation.
Contract Manufacturing Organizations (CMOs):
CMOs provide manufacturing and production services on a contract basis. Pharmaceutical companies may partner with CMOs for various stages of drug production, from formulation development and clinical trial material manufacturing to commercial-scale production activities.
Testing and Quality Control Laboratories:
These vendors specialize in conducting rigorous testing and analysis of pharmaceutical products to ensure their quality, safety, and efficacy. They assist in confirming that products meet regulatory requirements and established specifications.
Packaging and Labeling Suppliers:
Packaging vendors provide materials and expertise for designing and producing pharmaceutical labeling and packaging materials. These include primary packaging materials such as blister packs, vials, and bottles, and secondary packaging components such as cartons, labels, and inserts.
Technology and Equipment Suppliers:
Vendors in this category offer advanced machinery, equipment, and technology solutions used in pharmaceutical manufacturing, research, and quality control. Offerings range from off-the-shelf equipment for process automation and laboratory instruments to custom-made, specialized production equipment and utilities.
Logistics and Distribution Partners:
These vendors handle the transportation, storage, and distribution of pharmaceutical products. They ensure that drugs are safely delivered to their intended destinations, while adhering to regulatory guidelines, following Good Distribution Practices (GDP).
Pharmaceutical Consultants:
Pharmaceutical consultants provide a broad range of specialized services, helping companies navigate the complex landscape of pharmaceutical regulations or supplying expertise in specific areas.
Contract Research Organizations (CROs):
CROs offer research and development services to pharmaceutical companies, ranging from pre-clinical studies and clinical trial management to data analysis and regulatory submissions.
Specialized Service Providers:
This diverse category includes vendors offering specialized services such as clinical trial services, data management, statistical analysis, pharmacovigilance, market research, medical writing, and more.
Ensuring Quality Across the Vendor Spectrum: Accountability, Qualification, and Oversight
Every organization bears the duty of ensuring consistent quality in the services and materials provided by its vendors. This quality assurance spans the entire product lifecycle, from the developmental stages of production through commercial distribution. Vendor management is a pivotal process for ensuring the quality, safety, and regulatory compliance of a company's products. It involves collaborating with vendors to optimize cost control and enhance value while upholding quality standards, diligently mitigating potential risks, and optimizing the company's operations and value chain.
Addressing the accountability aspect, Chapter 7 of the EU-GMP guideline, "Outsourced Activities," specifically assigns the Manufacturing Authorization Holder (MAH) the responsibility for vendor qualification. When outsourcing activities, there must be a contract in place that covers which activities are outsourced to the vendor and clearly establishes the duties of both the contract giver and the contract acceptor. It must be well understood that the contract giver remains ultimately responsible for ensuring that processes are in place to assure the control of outsourced activities. This includes that the quality management system of the contract giver must clearly state the way in which the Qualified Person certifying each batch of product for release exercises their full responsibility.
Correspondingly, Annex 16 of the EU-GMP Guidelines, "Certification by a Qualified Person and Batch Release," Section 1.5.4, makes the Qualified Person (QP) representing the MAH responsible for certifying drug products for the market. It is imperative for the QP to ensure comprehensive documentation across the entirety of the active substance and medicinal product's supply chain, up to the point of certification.
Chapter 2 of the EU-GMP Guidelines, "Personnel," states that the heads of Quality Control, Production, and Quality Assurance jointly bear the responsibility for endorsing and overseeing vendor activities. Their role encompasses the approval and ongoing monitoring of vendors, contributing to the maintenance of robust quality standards.
Fostering Excellence: A Comprehensive Approach to Vendor Selection, Evaluation, and Management
Optimal vendor selection, qualification, and ongoing evaluation are pivotal stages within the product lifecycle, safeguarding the delivery of high-quality medicinal products in alignment with rigorous regulatory standards. Both industry expectations and regulatory bodies stress the importance of robust vendor selection built on a risk-based approach and transparent communication, unveiling potential gaps and assessing vendor competency.
Vendor management finds its comprehensive definition in a well-designed Vendor Management Program, a domain where PharmaLex thrives in collaborating with organizations to establish a value-driven Vendor Management Program. This program encompasses essential components crucial to its efficacy:
Vendor Categorization: Vendor classification must be based on the principles of Quality Risk Management and on each vendor's impact on product quality, efficacy, and safety. Rigorous criteria and qualifications are applied to high-risk vendors, ensuring a thorough process that aligns with stringent standards. PharmaLex's expertise guides organizations in assessing priority levels through rigorous vendor regulatory analysis, risk evaluation, and suitable qualification strategies.
Informed Vendor Selection: The foundation of vendor selection lies in defining user requirements for the materials in question. The selection criteria vary between critical and non-critical vendors, mirroring the weight of their contribution and the associated risks. An all-encompassing vendor management program goes further, accounting for factors such as capacity, contingency plans, and vendor capabilities. In the case of critical vendors, a multidisciplinary team carefully assesses the data, culminating in a select pool of potential candidates.
Thorough Evaluation: A well-structured program mandates a thorough evaluation of potential vendors, anchored in a deep understanding of their impact. Critical vendors undergo a more rigorous assessment process than non-critical ones before being approved, ensuring every aspect aligns with stringent standards. This process extends to the holistic vendor ecosystem, as PharmaLex ensures that each vendor aligns harmoniously with the organization's objectives and regulatory requirements.
Risk-Driven Approach: A pivotal component of an effective vendor management program is the integration of a risk-driven lens. Through a detailed analysis of risk factors, PharmaLex empowers organizations to navigate the complexities of vendor management, forging alliances with vendors that align not only with quality parameters, but also with the inherent risks of the product and the broader regulatory landscape.
The critical juncture of vendor selection and sustained evaluation is a cornerstone of the product lifecycle. The collaboration between PharmaLex and organizations ensures the development of a Vendor Management Program that navigates the complex interplay of risk assessment, vendor selection, and ongoing evaluation, creating a harmonious union between organizational excellence and regulatory compliance.
The Importance of Risk Assessment
Regulatory bodies have emphasized risk assessment as a cornerstone of ensuring quality.
Extending Risk Assessment:
The principles of Quality Risk Management extend beyond excipients to encompass APIs, service providers, and equipment. A risk-based approach enables a comprehensive evaluation of potential hazards, allowing organizations to allocate appropriate resources to mitigate risks. Collaborative risk assessment efforts ensure a holistic perspective, identifying risks associated with each vendor and associated processes.
Vendor Qualification and Risk Assessment:
Vendor qualification involves systematically and proactively assessing potential risks associated with APIs, excipients, service providers, and equipment. Adhering to GMP guidelines and leveraging established standards, such as the IPEC PQG GMP Guide and USP General Chapter 1078, aids in defining appropriate risk assessment criteria. ICH Q9, "Quality Risk Management," is a document recognized by many international regulatory bodies that provides principles and examples of tools for quality risk management applicable to different aspects of pharmaceutical quality, including vendor management. Cross-functional teams ensure a well-rounded evaluation, bringing diverse expertise to the process.
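Among the example tools ICH Q9 describes is failure mode and effects analysis (FMEA), which scores each risk by severity, occurrence, and detectability. A minimal sketch of FMEA-style risk ranking; the vendor risk items and the 1-5 scoring scales are illustrative assumptions, not taken from any guideline:

```python
# Hypothetical vendor-related risk items scored on 1-5 scales
# (descriptions and scores are illustrative only).
risks = [
    # (description, severity, occurrence, detectability)
    ("API vendor ships an out-of-spec lot", 5, 2, 2),
    ("Excipient certificate of analysis missing", 3, 3, 1),
    ("Courier temperature excursion in transit", 4, 3, 4),
]

def rpn(item):
    """Risk priority number: severity x occurrence x detectability."""
    _, s, o, d = item
    return s * o * d

# Rank the risks; the highest RPN is mitigated first.
ranked = sorted(risks, key=rpn, reverse=True)
for item in ranked:
    print(f"RPN {rpn(item):>3}  {item[0]}")
```

Ranking by a single number is only a starting point; the cross-functional review the text calls for decides the actual mitigation priorities.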
Harmonizing Excellence and Efficiency: The Dance of Quality and Cost in Selecting Alternative Vendors
The journey of vendor selection and qualification marks a pivotal chapter. However, this phase is not merely a concluding note; it is the overture to an ongoing symphony known as Vendor Management. This symphony is a harmonious blend of vigilance, collaboration, and continuous improvement, where the interplay of quality and efficiency remains a constant melody.
Once a vendor is selected and qualified, the real orchestration begins. It is an intricate dance that requires continuous attention. The collaboration between pharmaceutical companies and their vendors must not wane after qualification; rather, it must crescendo. This symphony is composed of various movements, each playing a critical role in maintaining the rhythm of quality while dancing to the tune of cost-effectiveness in constant motion.
In the ever-evolving landscape of the pharmaceutical industry, the quest for innovation, efficiency, and quality is unending. One critical crossroad on this journey is the selection of alternative vendors. As pharmaceutical companies seek to optimize their supply chains, reduce costs, and maintain the highest standards of quality, the challenge lies in striking the delicate balance between these often opposing forces. The decision to shift to an alternative vendor requires a meticulous dance between the pursuit of excellence and the necessity of cost management, ensuring that neither compromises the integrity of the end product. In this intricate choreography, every step matters, as the industry strives to harmonize quality and cost.
Quality and reliability are the cornerstones upon which the pharmaceutical industry is built. They are the embodiment of patient safety, regulatory compliance, and the reputation of pharmaceutical companies. Every substance, every process, every partner contributes to the grand tapestry of quality. When considering an alternative vendor, maintaining this high standard becomes of paramount importance. The assurance that the vendor adheres to Good Manufacturing Practices (GMP), has a robust quality management system, and can seamlessly integrate into the existing supply chain is non-negotiable. As the industry is beholden to regulatory bodies that demand uncompromising quality, the choice of an alternative vendor must be underpinned by thorough due diligence.
Cost, however, remains a constant factor in the equation. The competitive nature of the pharmaceutical industry requires companies to explore avenues for cost reduction without jeopardizing product quality. Alternative vendors can potentially offer more favorable pricing structures, streamlined processes, or access to economies of scale. Yet the pursuit of cost savings should never be put above the resolute commitment to quality. Choosing a vendor that compromises quality to lower costs is a mistake that no pharmaceutical company can afford to make.
The intricate tango between quality and cost involves multifaceted considerations. The transition to an alternative vendor requires a comprehensive risk assessment that evaluates potential impacts on quality, regulatory compliance, business continuity and overall business operations. This risk assessment is a prelude to a deeper dive into the vendor's capabilities, track record, and alignment with the company's values and objectives.
Collaboration emerges as a key partner in this dance. Cross-functional teams, comprising experts from Quality Assurance, Supply Chain Management, Regulatory Affairs, and Procurement, must work in harmony to make an informed decision. Their collective expertise ensures that the potential vendor's offerings align with the company's quality standards, regulatory requirements, and operational needs. This collaboration becomes the choreography that guides the company towards the most suitable alternative vendor.
A critical tool in this dance is the concept of Total Cost of Ownership (TCO). Beyond the immediate costs, TCO considers factors such as transportation, storage, regulatory compliance, supply capability, and potential risks associated with the vendor. By quantifying these factors, pharmaceutical companies gain a holistic perspective on the financial impact of their decision. This informed view prevents missteps in which a seemingly cost-effective alternative vendor becomes a long-term financial burden due to unforeseen complications.
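A simplified sketch of a TCO comparison; the vendors, cost categories, and annual figures are illustrative assumptions, not industry data:

```python
# Two hypothetical vendors; all figures are illustrative assumptions.
vendors = {
    "Vendor A": {"unit price": 100_000, "transport": 8_000, "storage": 5_000,
                 "compliance": 12_000, "risk reserve": 4_000},
    "Vendor B": {"unit price": 92_000, "transport": 15_000, "storage": 9_000,
                 "compliance": 18_000, "risk reserve": 10_000},
}

# TCO sums the quoted price with the downstream costs that a
# price-only comparison would miss.
tco = {name: sum(costs.values()) for name, costs in vendors.items()}
for name, total in tco.items():
    print(f"{name}: TCO = {total:,}")
```

In this sketch the vendor with the lower quoted price carries the higher total cost, exactly the misstep the TCO view is meant to prevent.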
Due diligence becomes the overture to the symphony of change. Rigorous audits, site visits, and thorough documentation reviews offer insights into the potential vendor's operations. Every note in this symphony contributes to the narrative of quality and cost, ensuring that the transition is smooth, and the harmonization of these factors is seamless.
The journey to alternative vendors is always a high-stakes performance where quality and cost take center stage. The pharmaceutical industry, a guardian of public health, cannot afford to falter in either domain. The delicate dance between these two factors requires a commitment to due diligence, collaboration, and the pursuit of excellence. It is a performance where every step matters, every note resonates, and the final act results in a harmonious blend of quality, cost-effectiveness, and uncompromised patient safety. As the industry continues this dance, it navigates a path toward innovation, efficiency, and a brighter future for healthcare.
Elevating Quality Assurance: Tailored Evaluations and Rigorous Audits for Vendor Excellence
After identifying a shortlist of potential primary or alternative vendors based on market reputation and cost, an evaluation of the vendor's quality system becomes imperative. This step serves to ascertain whether the vendor possesses the requisite control over their operational processes. Conducting an evaluation provides a comprehensive understanding of the vendor's performance and competency, effectively highlighting areas that require improvement. The depth of this evaluation should align with the risk level associated with the vendor, and PharmaLex stands ready to collaborate with organizations in the creation of tailored vendor assessments that accurately reflect these risk profiles. Initiating this assessment early in the vendor selection process is paramount, and its integration into the overall risk assessment for the final product is vital.
For vendors classified as non-critical or low-risk, the quality assessment may encompass specific sections of the vendor questionnaire or the submission of an ISO 9001 or equivalent certificate. In contrast, high-risk vendors, particularly those providing critical materials or services, necessitate on-site audits. These audits are carefully tailored to the nature of the vendor's service and their existing quality management system, be it GMP, GLP, GCP, or GDP. Notably, when auditing finished products, future potential markets must be factored into the equation, ensuring alignment with diverse regulatory requirements as applicable per jurisdiction. Leveraging our team of seasoned auditors, PharmaLex is primed to execute audits to the relevant standards demanded by organizations, delivering customized audit reports that comprehensively outline all pertinent information. These reports are designed to facilitate the Declaration of GMP Compliance for API vendors. The next crucial step is to perform a Risk Assessment to evaluate the potential impact of a vendor on the Quality Management System.
Sustaining Risk Assessment in Vendor Management:
As mentioned before, integrating risk assessments into daily operations and quality management systems is crucial for sustainable vendor management. Periodic reviews and real-time updates to risk assessment profiles ensure ongoing risk mitigation. The risk assessment process should be seen as complementary to regular audits, providing an additional layer of assurance.
Crucial to effective vendor management is the development of contractual Quality and Technical Agreements. Risk management should be integrated as a comprehensive control strategy within Quality and Technical Agreements between parties. These agreements function as a delineation of responsibilities, expectations, and quality processes, guaranteeing the consistent supply of products and services that meet established standards. In close adherence to section 7.14 of the EU-GMP guideline, Chapter 7 "Outsourced Activities", PharmaLex collaborates with vendors to draft agreements that clearly outline roles and communication processes for outsourced activities. This proactive approach ensures that vendor comprehension of organizational expectations is precise, with exact details encompassing requirements such as Key Performance Indicators (KPIs). This, in turn, ensures an unwavering alignment of performance with expectations, continuous risk control and the seamless supply of quality products or services.
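The KPI mechanism described above can be illustrated with a minimal sketch of a periodic performance review. The KPI names and threshold values below are illustrative assumptions for demonstration, not figures prescribed by any guideline or agreement.

```python
# Illustrative vendor KPI review sketch. KPI names and thresholds are
# hypothetical assumptions, not prescribed values.
KPI_THRESHOLDS = {
    "on_time_delivery_pct": 95.0,   # minimum acceptable
    "batch_rejection_pct": 2.0,     # maximum acceptable
    "audit_findings_open": 0,       # maximum acceptable
}

def review_vendor(kpis):
    """Return the list of KPIs that breach their agreed thresholds."""
    breaches = []
    if kpis["on_time_delivery_pct"] < KPI_THRESHOLDS["on_time_delivery_pct"]:
        breaches.append("on_time_delivery_pct")
    if kpis["batch_rejection_pct"] > KPI_THRESHOLDS["batch_rejection_pct"]:
        breaches.append("batch_rejection_pct")
    if kpis["audit_findings_open"] > KPI_THRESHOLDS["audit_findings_open"]:
        breaches.append("audit_findings_open")
    return breaches

# A review period with late deliveries and open audit findings:
print(review_vendor({"on_time_delivery_pct": 92.5,
                     "batch_rejection_pct": 1.1,
                     "audit_findings_open": 3}))
```

In practice the agreed KPIs and their thresholds live in the Quality and Technical Agreement itself; the sketch simply shows how breaches can be surfaced mechanically so that corrective actions are triggered consistently.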
Sustained monitoring and control post-Agreement signing are indispensable. PharmaLex excels in establishing value-added processes that comprehensively manage and evaluate vendor performance. Our collaboration extends to rectifying actions and optimizing vendor processes, bolstering a culture of open communication and continuous improvement.
Continuous monitoring for approved vendors is a crucial aspect of maintaining a robust supply chain in the pharmaceutical industry. Once a vendor has been qualified and approved, the journey does not end there. It is essential to establish a systematic and ongoing monitoring and communication process to ensure the vendor consistently meets the required quality standards and continues to operate in compliance with regulations.
This involves regular assessments, audits, and periodic performance reviews to track the vendor's performance over time. It is a proactive approach to identify any deviations, potential risks, or opportunities for improvement. By continuously monitoring approved vendors and maintaining an open dialogue, pharmaceutical companies can mitigate the risks associated with changes in vendor operations, quality issues, or regulatory non-compliance.
Ultimately, continuous monitoring enhances the company's ability to deliver high-quality products to the market consistently, while minimizing disruptions and maintaining patient safety. It is a dynamic and integral part of vendor management that contributes to the overall success and reputation of the pharmaceutical business.
In the ever-evolving pharmaceutical landscape, where external trustful partnerships are the cornerstone of progress, the expertise of PharmaLex shines as a beacon. A partner dedicated to delivering beyond compliance, PharmaLex invites you to embark on a journey of excellence.
Whether your endeavor involves setting up a vendor management program, fortifying an existing vendor management process or outsourcing the oversight element within your organization, PharmaLex's seasoned experts stand ready. With a strong record of delivering beyond mere compliance, we invite you to connect with us and explore the array of services we offer. Discover the confidence that comes with PharmaLex's expertise; a partner dedicated to your success.
Harmonize with Confidence: Partner with PharmaLex, and let excellence be the legacy of your pharmaceutical journey.
In mid-August 2023, privacy communities around the Internet were abuzz with criticism of a terms of service (ToS) update made by Zoom in March regarding artificial intelligence (AI). Zoom, a video communication platform, updated their terms of service to indicate:
10.4 Customer License Grant. You agree to grant and hereby grant Zoom a perpetual, worldwide, non-exclusive, royalty-free, sublicensable, and transferable license and all other rights required or necessary to redistribute, publish, import, access, use, store, transmit, review, disclose, preserve, extract, modify, reproduce, share, use, display, copy, distribute, translate, transcribe, create derivative works, and process Customer Content and to perform all acts with respect to the Customer Content: (i) as may be necessary for Zoom to provide the Services to you, including to support the Services; (ii) for the purpose of product and service development, marketing, analytics, quality assurance, machine learning, artificial intelligence, training, testing, improvement of the Services, Software, or Zoom’s other products, services, and software, or any combination thereof; and (iii) for any other purpose relating to any use or other act permitted in accordance with Section 10.3. If you have any Proprietary Rights in or to Service Generated Data or Aggregated Anonymous Data, you hereby grant Zoom a perpetual, irrevocable, worldwide, non-exclusive, royalty-free, sublicensable, and transferable license and all other rights required or necessary to enable Zoom to exercise its rights pertaining to Service Generated Data and Aggregated Anonymous Data, as the case may be, in accordance with this Agreement.
Notwithstanding the above, Zoom will not use audio, video or chat Customer Content to train our artificial intelligence models without your consent.1
Based on the criticism Zoom received, it then updated its ToS several days later to indicate that “Zoom does not use any of your audio, video, chat, screen sharing, attachments or other communications-like Customer Content (such as poll results, whiteboard and reactions) to train Zoom or third-party artificial intelligence models.”2 Seeing an enterprise respond to consumers’ privacy concerns is refreshing. But there are some lessons that can be learned from the initial change to the ToS.
Zoom is not alone in training AI with consumer content or ineffectively communicating how personal data may be used. Microsoft’s privacy statement lists the following broad uses of personal data:
It is unclear what products Microsoft uses personal data to improve and develop. Its privacy statement also specifies, “Our automated methods often are related to and supported by our manual methods. For example, to build, train, and improve the accuracy of our automated methods of processing (including artificial intelligence or AI), we manually review some of the predictions and inferences produced by the automated methods against the underlying data from which the predictions and inferences were made.”4
It is unclear why Zoom was the focus of so much scrutiny recently for its privacy practices but other enterprises doing something similar have not received this criticism. Although Zoom is being used in this example, many enterprises would benefit from exploring the issues with Zoom’s previous ToS and implementing the lessons learned.
There were numerous privacy-related concerns associated with Zoom’s initial ToS update. One of the main concerns is that the purpose for collecting customer content data was incredibly broad. Per Zoom’s previous ToS, customer content could be used for machine learning or AI purposes. It indicated that audio, video and chat would not be used to train AI models without consent, but what is consent in this context? Is consent an active and explicit opt-in or is consent granted by accepting Zoom’s long and broad ToS? Forcing users to accept a long, all-or-nothing ToS diminishes trust and does not provide consumers with any meaningful way to express privacy preferences.5
Another consent-related concern is that privacy preferences may only be shown to account owners, not all meeting participants. Attending a Zoom meeting does not require having a Zoom account. And even if all participants saw the ToS before a meeting, they may not have had the ability to decline Zoom’s excessive data collection. Those who are required to use Zoom for work meetings may have felt pressured to use the platform despite their concerns—and this issue is not specific to Zoom.
In addition, some medical providers leverage Zoom for telehealth services. Zoom has distinct business associate agreements (BAA) with healthcare providers, but Zoom indicated that—for healthcare and education customers—it “will not use customer content, including education records or protected health information, to train our artificial intelligence models without your consent.”6 This consent would have been provided to Zoom by the healthcare provider, not the patient. Training AI on data from healthcare accounts with BAAs should not be possible. Healthcare providers may not be savvy from a technical privacy perspective and might inadvertently share their patients’ protected health information (PHI).
Communication around Zoom’s update was unclear. It took many users months to notice the change to the ToS. To be fair, no enterprise can force consumers to read a ToS, but when Zoom received criticism about the update, it published a blog post that did not clarify how customer data could be used for AI training purposes. It emphasized, “For AI, we do not use audio, video, or chat content for training our models without customer consent,”7 but it did not address what customer consent looks like. It was not until a few days later that Zoom revised its ToS (and the accompanying blog post) to indicate that the organization would not use consumer content to train its AI models. Zoom should be commended for listening to consumer concerns and revising its data processing practices in accordance with those concerns.
The blog post initially had a screenshot of how account owners/administrators could opt in to Zoom’s generative AI features, but opting in to generative AI functionality is entirely different than consenting to data collection for the purposes of training AI. The blog post conflated agreeing to use generative AI tools with consenting to data collection, and the two are not the same. And while all meeting participants would be notified if Zoom’s AI features are running, their only options would be to leave the meeting or accept it,8 which does not actually offer attendees a meaningful choice.
Enterprises looking to change the way they process data or the scope of data collected must notify consumers. To ensure that data subjects are protected, any changes made should be in alignment with privacy by design (PbD) principles. The principles of “visibility and transparency: keep it open,” and “respect for user privacy: keep it user-centric”9 are especially important. Enterprises should state exactly what consent looks like and how consumers can exercise their privacy-related rights. Note that enterprises may have specific consent-related obligations depending on applicable privacy laws and regulations (e.g., under the EU General Data Protection Regulation [GDPR], data subjects have the right to withdraw their consent).10
When new, large-scale changes are made to a ToS or privacy notice, enterprises should ensure that messaging is consistent across the organization. Product managers, marketing teams and social media teams should be informed of the changes being made and how to accurately and clearly speak to them. Zoom received questions about what providing consent looked like, but the questions focused on how to opt out of Zoom’s generative AI features, not how to opt out of sharing data for AI training purposes. It is possible that those who wrote blog posts and answered questions about this issue did not have the technical know-how to understand the difference between collecting data to train AI and using an AI tool. A brief frequently asked questions (FAQ) document for staff can help ensure that all enterprise representatives are speaking accurately and consistently. Although blog posts and supplementary materials may be useful, the ToS is ultimately what consumers are agreeing to, so it is imperative that the content of those documents is in alignment.
A ToS document or privacy notice should explain what data are collected and why they are collected. Enterprises must know exactly how data will be used and be able to relay that information. Enterprises must avoid collecting data if they cannot explain to consumers exactly how those data will be used.
In addition, some information, if improperly disclosed, can cause more harm than other information. For example, credit card numbers and health information may be more detrimental if breached than email addresses. Zoom previously would have allowed healthcare providers to consent to sharing data for the purposes of training AI, but these providers deal with health-related information. This is highly sensitive and, depending on the jurisdiction, may be subject to stringent regulatory requirements. Enterprises must understand the data they collect and if they must prohibit data sharing practices for certain types of data or consumers/accounts.
There are numerous privacy-related concerns associated with AI.11 Enterprises desiring to use consumer data to train AI algorithms need to consider the implications of the data that may train AI. For example, transcripts of a conversation between a therapist and patient might result in biases and prejudice against certain mental health conditions if those conversations are used to train AI.
Consumers may not read every ToS for every service provider they interact with, but enterprises are still obligated to have clear and up-to-date ToSs in place. By practicing PbD, clearly communicating with data subjects, knowing the purposes for which data were collected and being able to clearly communicate that information, enterprises can empower and build trust with their consumers.
1 Zoom, “Zoom Terms of Service,” Internet Archive Wayback Machine, 7 August 2023
2 Zoom, “Zoom Terms of Service,” 11 August 2023
3 Microsoft, “Microsoft Privacy Statement,” August 2023
5 Kazi, S.; “Educated and Empowered Data Subjects: A Privacy Prerequisite,” ISACA Industry News, 9 March 2022
6 Hashim, S.; “How Zoom’s Terms of Service and Practices Apply to AI Features,” Zoom Blog, Internet Archive Wayback Machine, 7 August 2023
9 Cavoukian, A.; Operationalizing Privacy by Design: A Guide to Implementing Strong Privacy Practices, Information and Privacy Commissioner, Canada, December 2012
10 Wolford, B.; “What Are the GDPR Consent Requirements?” GDPR EU
11 Tang, A.; “Making AI GDPR Compliant,” ISACA® Journal, vol. 5, 2019
Kazi is a privacy professional practices principal at ISACA®. In this role, she focuses on the development of ISACA’s privacy-related resources, including books, white papers and review manuals. Kazi has worked at ISACA for 9 years, previously working on the ISACA® Journal and developing the award-winning ISACA Podcast. In 2021, she was a recipient of the AM&P Network’s Emerging Leader award, which recognizes innovative association publishing professionals under the age of 35.
Over its 25-year history, Prosemi has become a trusted source for testing electronic components among some of the world's largest CEMs and OEMs. As the largest test house in Southeast Asia, Prosemi is equipped to address counterfeit and substandard electronic components, which pose significant risks to the electronics industry, as they may compromise the functionality, reliability, and safety of devices. To go beyond baseline testing and assessment, Prosemi invests in the latest quality solutions and technology, as well as various certifications, to set the highest standards. This is where the C-mode scanning acoustic microscopy (C-SAM) tool is largely beneficial for customers concerned with defective, cloned, renewed, or recycled parts.
Deploying the C-SAM Tool
To address the risks in the market, the industry has developed several standardized testing methods, including destructive analysis, which is particularly effective for detecting recycled and remarked parts. C-SAM offers a complementary, non-destructive way to conduct this analysis, in which parts are submerged in deionized water and imaged acoustically.
How C-SAM Works
C-SAM utilizes high-frequency ultrasound waves to inspect and create images of the internal structures of samples, providing valuable insights into their properties. This technique is commonly used for quality control and failure analysis in numerous industries, including electronics, semiconductor manufacturing, aerospace, and materials science. It is particularly valuable for counterfeit analysis due to its ability to identify anomalies and inconsistencies in the packaging and construction of semiconductor components.
By using C-SAM, the internal structure of the packaged semiconductor can be visualized without physically opening them. The tool tests for internal voids or delamination between molding to die and molding to lead frames. By comparing these images with known authentic components, any deviations or irregularities can be identified, indicating potential counterfeiting.
Delamination occurs when different layers within a semiconductor package separate. Through C-SAM, cracks, voids, or delamination between the die and die paddle can be detected, indicating potential repackaging, improper assembly, or handling and storage issues. These delamination problems are often found in counterfeit devices that have undergone improper handling or rework, particularly with recycled parts from printed circuit boards (PCBs).
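The comparison against known authentic components described above can be illustrated with a toy image check. This is a simplified sketch, assuming grayscale scan images held as NumPy arrays and an arbitrary 10% mean-difference threshold; real C-SAM analysis relies on calibrated equipment and expert interpretation, not a single global statistic.

```python
import numpy as np

# Illustrative sketch: flag a scan whose mean pixel difference from a
# known-good reference exceeds a threshold. Arrays and the 10% threshold
# are assumptions for demonstration only.
def flag_anomalies(reference, sample, threshold=0.10):
    """Return True if the normalized mean absolute difference exceeds threshold."""
    diff = np.abs(reference.astype(float) - sample.astype(float))
    return diff.mean() / 255.0 > threshold

reference = np.full((64, 64), 200, dtype=np.uint8)  # uniform authentic scan
voided = reference.copy()
voided[10:50, 10:50] = 40                           # dark region: internal void

print(flag_anomalies(reference, reference))  # → False
print(flag_anomalies(reference, voided))     # → True
```

The sketch captures only the principle: a scan of a suspect part is compared against a golden reference, and large deviations (voids, delamination shadows) trigger closer inspection.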
Prosemi's Commitment to Authentication and Quality Assurance
Prosemi's acquisition of a high-end C-SAM machine, together with its extensive expertise, represents a significant advancement in authenticating electronic components and ensuring top-quality assurance. The company's commitment to unique value propositions and innovative approaches has been recognized with the prestigious Manufacturing Technology award at the SBR (Singapore Business Review) National Business Awards for its exceptional in-house electrical testing solutions.
By combining the C-SAM tool with IDEA-STD-1010 and leveraging other cutting-edge equipment, such as scanning electron microscopy, Fourier transform infrared spectroscopy, electrical functional testing, and JTAG boundary scan solutions, Prosemi positions itself as a leader in driving advancements, shaping the future of testing methodologies, and solidifying its prominent standing within the industry.
Are you thinking about the authenticity of electronic components? Get in touch with Prosemi today.
Prosemi has built a reputation among the world's largest CMs and OEMs for testing electronic components and providing baking, tape and reel, and IC programming services. Prosemi's world-class services are driven by their commitment to quality, extensive industry experience and exceptional technical capabilities.
Learn more about Prosemi's quality program and view current certifications here.
Speaking with our correspondent at her office at the Secretariat, Alausa, on Thursday, the Director General of OEQA, Mrs. Abiola Seriki-Ayeni, said the certifications, received on Wednesday, showed the important strides the agency had taken in revolutionising the education sector in Africa.
“The certifications (ISO 9001:2015 and ISO 21001:2018) came as a recognition of our achievements in regulating pre-tertiary education in Lagos State through inspection, monitoring, assessment, standardisation and providing assurance of good quality educational services related to teaching and learning,” Seriki-Ayeni said.
She added that the feat also makes OEQA the first state educational agency in Nigeria to receive an ISO certification.
The DG noted earlier in a statement that the OEQA has “curated learning outcomes to inspire intellectual growth for our students and our quest for excellence,” which has made the agency the first government educational agency in Africa “to receive both prestigious ISO certifications. It is a truly historic achievement for us.”
She further stated that the certifications “mark extraordinary firsts for Nigeria and Africa,” noting that the agency is committed to bolstering educational standards and transforming educational landscapes.
While stressing that the Mr. Babajide Sanwo-Olu-led Lagos State Government places a high premium on quality education, Seriki-Ayeni appealed for the support of stakeholders in the education sector, saying, “With your support, we will empower our students with the best education and equip them with the necessary skills to excel.”
All rights reserved. This material, and other digital content on this website, may not be reproduced, published, broadcast, rewritten or redistributed in whole or in part without prior express written permission from PUNCH.
Blue-light blocking glasses, widely used to alleviate eye strain and enhance sleep, may not provide the expected benefits, according to a recent comprehensive analysis of 17 studies.
The analysis found that these blue-light blocking glasses, designed to shield eyes from the potentially harmful blue light emitted by screens, are unlikely to significantly diminish digital eye strain or improve sleep quality.
The investigation, led by Laura Downie, an associate professor in optometry and vision sciences at The University of Melbourne, examined randomised controlled trials exploring the effects of blue-light glasses on vision, eye health, and sleep quality. It revealed that only a handful of studies demonstrated a marginal reduction in eye strain, and these were conducted over brief periods.
Notably, blue-light filtering lenses generally block only a small portion of blue light, ranging from 10% to 25%, while screens emit relatively limited amounts. Mark Rosenfield, a professor of biological and vision sciences, pointed out that the primary source of blue light exposure is the sun, not screens.
Regarding sleep, the six studies included in the analysis showed mixed results. While some studies indicated positive effects on sleep, these benefits were predominantly observed in specific groups, and there is inadequate evidence to support their general applicability.
The association between blue light and sleep disruption is derived from studies on circadian rhythms, which are more extensively explored in animals than in humans. However, experts suggest that other factors, such as the content viewed on screens and bedtime routines, play a more substantial role in sleep quality.
In light of these findings, experts recommend prioritising other strategies to mitigate eye strain and sleep problems. Maintaining consistent sleep schedules, avoiding caffeine close to bedtime, and refraining from screen usage before sleep are more likely to positively influence sleep quality.
While blue-light blocking glasses may not be harmful, they may not provide the anticipated benefits either.
Hazy skies are expected in the Chicago area starting Friday and continuing into the weekend, but should you be concerned?
As of 1 p.m. Friday, air quality levels remained in the "moderate" category in Chicago, with the city's air quality index for PM2.5 -- an air pollutant also known as fine particulate matter -- sitting at around 67, according to AirNow. An "unhealthy for sensitive groups" level is reached when the index rises above 100.
"If you are unusually sensitive to particle pollution, consider reducing your activity level or shorten the amount of time you are active outdoors," the government website states at the city's current level.
Hazy conditions were reported in and around the city, raising concern among residents who may recall a series of alerts for unhealthy conditions this summer due to Canadian wildfires. And the worst fire season on record in Canada shows no signs of easing.
According to NBC 5 Storm Team Meteorologist Kevin Jeanes, the hazy conditions are still "due to the return of wildfire smoke and haze."
"The smoke isn’t too dense and most of it is expected to stay up in the atmosphere rather than getting mixed down to the ground," Jeanes said, noting, however, that unhealthy levels could still be reported during the afternoon hours.
Nearby, all of Wisconsin is under an air quality advisory.
"Canadian wildfire smoke, although less dense and delayed in arrival, is moving into the state from the northwest and will travel south southeast [Thursday night] into Friday morning," the advisory stated.
The National Weather Service predicts levels could become "unhealthy for sensitive groups" over the weekend "due to favorable weather conditions alongside the presence of wildfire smoke."
The haze could linger in the Chicago area into Saturday as well, but is expected to clear up Saturday evening and into the overnight hours, Jeanes said.
Meanwhile, the city will usher in a potentially dangerous heat wave, which could lead to different air quality concerns.
"While the smoke may no longer be an issue, higher levels of ozone could," Jeanes said. "I wouldn’t be surprised if we get some air quality alerts for ozone Sunday and again around Wednesday and Thursday next week."
Canada has seen a record number of wildfires this year — contributing to choking smoke in parts of the U.S. — with more than 5,700 fires burning more than 53,000 square miles from one end of Canada to the other, according to the Canadian Interagency Forest Fire Centre.
As of Friday morning, more than 1,000 wildfires were burning across the country, over half of them out of control.
AirNow said its air quality index determines the level of air pollution and the correlating health concerns.
"When AQI values are above 100, air quality is unhealthy: at first for certain sensitive groups of people, then for everyone as AQI values get higher," the website states.
Once levels reach above 300, the area enters the highest level of concern known as "hazardous."
In total, there are six categories: green, or good; yellow, or moderate; orange, or unhealthy for sensitive groups; red, or unhealthy; purple, or very unhealthy; and maroon, or hazardous.
| Daily AQI Color | Levels of Concern | Values of Index | Description of Air Quality |
| --- | --- | --- | --- |
| Green | Good | 0 to 50 | Air quality is satisfactory, and air pollution poses little or no risk. |
| Yellow | Moderate | 51 to 100 | Air quality is acceptable. However, there may be a risk for some people, particularly those who are unusually sensitive to air pollution. |
| Orange | Unhealthy for Sensitive Groups | 101 to 150 | Members of sensitive groups may experience health effects. The general public is less likely to be affected. |
| Red | Unhealthy | 151 to 200 | Some members of the general public may experience health effects; members of sensitive groups may experience more serious health effects. |
| Purple | Very Unhealthy | 201 to 300 | Health alert: The risk of health effects is increased for everyone. |
| Maroon | Hazardous | 301 and higher | Health warning of emergency conditions: everyone is more likely to be affected. |
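The breakpoints in the table above translate directly into a simple lookup, sketched here for illustration:

```python
# AQI category breakpoints, as published by AirNow (see table above).
AQI_CATEGORIES = [
    (50, "Good"),
    (100, "Moderate"),
    (150, "Unhealthy for Sensitive Groups"),
    (200, "Unhealthy"),
    (300, "Very Unhealthy"),
]

def aqi_category(aqi):
    """Map an AQI index value to its level of concern."""
    for upper, name in AQI_CATEGORIES:
        if aqi <= upper:
            return name
    return "Hazardous"  # 301 and higher

print(aqi_category(67))   # Chicago's reading in this article: Moderate
print(aqi_category(120))  # Unhealthy for Sensitive Groups
```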
According to the Environmental Protection Agency, "PM stands for particulate matter (also called particle pollution): the term for a mixture of solid particles and liquid droplets found in the air."
"Some particles, such as dust, dirt, soot, or smoke, are large or dark enough to be seen with the naked eye. Others are so small they can only be detected using an electron microscope," the EPA states.
PM2.5 in particular involves "fine inhalable particles, with diameters that are generally 2.5 micrometers and smaller." By comparison, the average human hair strand is about 70 micrometers in diameter, or 30 times larger than these particles.
PM2.5 is one of five major air pollutants regulated by the Clean Air Act, which also includes ground-level ozone, carbon monoxide, sulfur dioxide and nitrogen dioxide.