936 results for Process analysis
Abstract:
With the publication of the quality guideline ICH Q9 "Quality Risk Management" by the International Conference on Harmonisation, risk management has already become a standard requirement during the life cycle of a pharmaceutical product. Failure mode and effect analysis (FMEA) is a powerful risk analysis tool that has been used for decades in mechanical and electrical industries. However, the adaptation of the FMEA methodology to biopharmaceutical processes brings about some difficulties. The proposal presented here is intended to serve as a brief but nevertheless comprehensive and detailed guideline on how to conduct a biopharmaceutical process FMEA. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been especially designed for typical biopharmaceutical processes. The applications of such a biopharmaceutical process FMEA are widespread. It can be useful whenever a biopharmaceutical manufacturing process is developed or scaled up, or when it is transferred to a different manufacturing site. It may also be conducted during substantial optimization of an existing process or the development of a second-generation process. According to their resulting risk ratings, process parameters can be ranked by importance, and important variables for process development, characterization, or validation can be identified. LAY ABSTRACT: Health authorities around the world ask pharmaceutical companies to manage risk during development and manufacturing of pharmaceuticals. The so-called failure mode and effect analysis (FMEA) is an established risk analysis tool that has been used for decades in mechanical and electrical industries. However, the adaptation of the FMEA methodology to pharmaceutical processes that use modern biotechnology (biopharmaceutical processes) brings about some difficulties, because those biopharmaceutical processes differ from processes in mechanical and electrical industries. The proposal presented here explains how a biopharmaceutical process FMEA can be conducted. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been especially designed for typical biopharmaceutical processes. With the help of this guideline, different details of the manufacturing process can be ranked according to their potential risks, and this can help pharmaceutical companies to identify aspects with high potential risks and to react accordingly to improve the safety of medicines.
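For illustration, the following sketch ranks a few hypothetical biopharmaceutical process failure modes by the standard FMEA risk priority number (RPN = severity × occurrence × detectability); the parameter names and 1-to-10 ratings are made-up placeholders, not values from the proposed rating table.

```python
# Minimal sketch of FMEA risk ranking for hypothetical biopharmaceutical
# process failure modes. The 1-10 ratings are illustrative placeholders.

# Each failure mode: (description, severity S, occurrence O, detectability D),
# all rated on 1-10 scales where higher means worse (harder to detect).
failure_modes = [
    ("Bioreactor pH excursion",        8, 3, 4),
    ("Column load density too high",   6, 5, 3),
    ("Buffer conductivity deviation",  4, 4, 6),
    ("Viral filter integrity failure", 9, 2, 2),
]

# Risk priority number (RPN) = S * O * D, the usual FMEA ranking metric.
ranked = sorted(
    ((name, s * o * d, s, o, d) for name, s, o, d in failure_modes),
    key=lambda row: row[1],
    reverse=True,
)

for name, rpn, s, o, d in ranked:
    print(f"{name:32s} RPN={rpn:3d}  (S={s}, O={o}, D={d})")
```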
Abstract:
As the demand for miniature products and components continues to increase, the need for manufacturing processes to provide these products and components has also increased. To meet this need, successful macroscale processes are being scaled down and applied at the microscale. Unfortunately, many challenges have been experienced when directly scaling down macro processes. Initially, frictional effects were believed to be the largest challenge encountered. However, in recent studies it has been found that the greatest challenge encountered has been with size effects. Size effect is a broad term that largely refers to the thickness of the material being formed and how this thickness directly affects the product dimensions and manufacturability. At the microscale, the thickness becomes critical due to the reduced number of grains. When surface contact between the forming tools and the material blanks occurs at the macroscale, there is enough material (hundreds of layers of material grains) across the blank thickness to compensate for material flow and the effect of grain orientation. At the microscale, there may be fewer than 10 grains across the blank thickness. With a decreased number of grains across the thickness, the influence of the grain size, shape and orientation is significant. Any material defects (either naturally occurring or introduced during material preparation) play a significant role in altering the forming potential. To date, various micro metal forming and micro materials testing equipment setups have been constructed at the Michigan Tech lab. Initially, the research focus was to create a micro deep drawing setup to potentially build micro sensor encapsulation housings. The research focus shifted to micro metal materials testing equipment setups. These include the construction and testing of the following setups: a micro mechanical bulge test, a micro sheet tension test (testing micro tensile bars), a micro strain analysis (with the use of optical lithography and chemical etching) and a micro sheet hydroforming bulge test. Recently, the focus has shifted to study a micro tube hydroforming process. The intent is to target fuel cells, medical, and sensor encapsulation applications. While the tube hydroforming process is widely understood at the macroscale, the microscale process also offers some significant challenges in terms of size effects. Current work is being conducted in applying direct current to enhance micro tube hydroforming formability. Initial trials of adding direct current to various metal forming operations have shown some phenomenal results. The focus of current research is to determine the validity of this process.
Abstract:
Balancing the frequently conflicting priorities of conservation and economic development poses a challenge to management of the Swiss Alps Jungfrau-Aletsch World Heritage Site (WHS). This is a complex societal problem that calls for a knowledge-based solution. This in turn requires a transdisciplinary research framework in which problems are defined and solved cooperatively by actors from the scientific community and the life-world. In this article we re-examine studies carried out in the region of the Swiss Alps Jungfrau-Aletsch WHS, covering three key issues prevalent in transdisciplinary settings: integration of stakeholders into participatory processes; perceptions and positions; and negotiability and implementation. In the case of the Swiss Alps Jungfrau-Aletsch WHS the transdisciplinary setting created a situation of mutual learning among stakeholders from different levels and backgrounds. However, the studies showed that the benefits of such processes of mutual learning are continuously at risk of being diminished by the power play inherent in participatory approaches.
Abstract:
With a steady increase in regulatory requirements for business processes, automated support for compliance management is a field garnering increasing attention in Information Systems research. Several approaches have been developed to support compliance checking of process models. One major challenge for such approaches is their ability to handle different modeling techniques and compliance rules in order to enable widespread adoption and application. Applying a structured literature search strategy, we survey and discuss compliance-checking approaches in order to provide insight into their generalizability and evaluation. The results imply that current approaches mainly focus on specific modeling techniques and/or a restricted set of compliance rule types. Most approaches abstain from real-world evaluation, which raises questions about their practical applicability. Based on the search results, we propose a roadmap for further research in model-based business process compliance checking.
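As a toy illustration of what a compliance rule expresses, the sketch below checks a simple precedence rule over hypothetical execution traces; the surveyed approaches operate on process models (e.g., via temporal logic or pattern matching), so this trace-level check is only a simplified stand-in.

```python
# Toy compliance rule check over process execution traces. Activity names and
# traces are made up for illustration; this is not one of the surveyed
# model-based approaches.
from typing import List

def precedes(antecedent: str, consequent: str, trace: List[str]) -> bool:
    """Rule: every occurrence of `consequent` must be preceded by at least
    one earlier occurrence of `antecedent` in the same trace."""
    seen_antecedent = False
    for activity in trace:
        if activity == antecedent:
            seen_antecedent = True
        elif activity == consequent and not seen_antecedent:
            return False
    return True

traces = [
    ["receive_invoice", "review", "approve", "pay"],
    ["receive_invoice", "approve", "pay"],          # violates the rule
]
for t in traces:
    status = "compliant" if precedes("review", "approve", t) else "violation"
    print(t, "->", status)
```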
Abstract:
BACKGROUND AND PURPOSE We report on workflow and process-based performance measures and their effect on clinical outcome in Solitaire FR Thrombectomy for Acute Revascularization (STAR), a multicenter, prospective, single-arm study of Solitaire FR thrombectomy in large vessel anterior circulation stroke patients. METHODS Two hundred two patients were enrolled across 14 centers in Europe, Canada, and Australia. The following time intervals were measured: stroke onset to hospital arrival, hospital arrival to baseline imaging, baseline imaging to groin puncture, groin puncture to first stent deployment, and first stent deployment to reperfusion. Effects of time of day, general anesthesia use, and multimodal imaging on workflow were evaluated. Patient characteristics and workflow processes associated with prolonged interval times and good clinical outcome (90-day modified Rankin score, 0-2) were analyzed. RESULTS Median times were onset of stroke to hospital arrival, 123 minutes (interquartile range, 163 minutes); hospital arrival to thrombolysis in cerebral infarction (TICI) 2b/3 or final digital subtraction angiography, 133 minutes (interquartile range, 99 minutes); and baseline imaging to groin puncture, 86 minutes (interquartile range, 24 minutes). Time from baseline imaging to puncture was prolonged in patients receiving intravenous tissue-type plasminogen activator (32-minute mean delay) and when magnetic resonance-based imaging at baseline was used (18-minute mean delay). Extracranial carotid disease delayed puncture to first stent deployment time on average by 25 minutes. For each 1-hour increase in stroke onset to final digital subtraction angiography (or TICI 2b/3) time, odds of good clinical outcome decreased by 38%. CONCLUSIONS Interval times in the STAR study reflect current intra-arterial therapy for patients with acute ischemic stroke. Improving workflow metrics can further improve clinical outcome. CLINICAL TRIAL REGISTRATION: URL http://www.clinicaltrials.gov. Unique identifier: NCT01327989.
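As a worked reading of the reported time effect, on the assumption that the 38% figure corresponds to an odds ratio per additional hour from a logistic model (the abstract does not state the model explicitly):

```latex
% Assumed interpretation: "odds decreased by 38% per hour" corresponds to an
% odds ratio of 1 - 0.38 = 0.62 per additional hour of delay.
\[
  \mathrm{OR}_{\text{per hour}} = 1 - 0.38 = 0.62,
  \qquad
  \frac{\mathrm{odds}(t_{0} + k)}{\mathrm{odds}(t_{0})} = 0.62^{\,k}
\]
% Example: an additional 2-hour delay multiplies the odds of a good outcome
% (mRS 0-2) by 0.62^2, approximately 0.38, i.e. roughly a 62% reduction in odds.
```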
Abstract:
Objective: Processes occurring in the course of psychotherapy are characterized by the simple fact that they unfold in time and that the multiple factors engaged in change processes vary considerably between individuals (idiographic phenomena). Previous research, however, has neglected the temporal perspective through its traditional focus on static phenomena, which were mainly assessed at the group level (nomothetic phenomena). To support a temporal approach, the authors introduce time-series panel analysis (TSPA), a statistical methodology explicitly focusing on the quantification of temporal, session-to-session aspects of change in psychotherapy. TSPA models are initially built at the level of individuals and are subsequently aggregated at the group level, thus allowing the exploration of prototypical models. Method: TSPA is based on vector autoregression (VAR), an extension of univariate autoregression models to multivariate time-series data. The application of TSPA is demonstrated in a sample of 87 outpatient psychotherapy patients who were monitored by postsession questionnaires. Prototypical mechanisms of change were derived from the aggregation of individual multivariate models of the psychotherapy process. In a second step, the associations between mechanisms of change (TSPA) and pre- to post-treatment symptom change were explored. Results: TSPA allowed a prototypical process pattern to be identified, in which patients' alliance and self-efficacy were linked by a temporal feedback loop. Furthermore, therapists' stability over time in both mastery and clarification interventions was positively associated with better outcomes. Conclusions: TSPA is a statistical tool that sheds new light on temporal mechanisms of change. Through this approach, clinicians may gain insight into prototypical patterns of change in psychotherapy.
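A minimal sketch of the TSPA idea, assuming two illustrative process variables and simulated data in place of the post-session questionnaires; this is not the authors' implementation, only the fit-per-patient-then-aggregate pattern described above, using the VAR implementation in statsmodels.

```python
# Fit a VAR(1) model per patient on session-to-session process ratings, then
# aggregate coefficients across patients into a "prototypical" model.
# Variable names and simulated data are illustrative only.
import numpy as np
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
n_patients, n_sessions = 87, 20
A_true = np.array([[0.5, 0.2],
                   [0.3, 0.4]])   # illustrative "true" dynamics for simulation

coef_matrices = []
for _ in range(n_patients):
    # Simulated stand-in for one patient's post-session ratings of two
    # process variables (e.g., alliance and self-efficacy; assumed names).
    data = np.zeros((n_sessions, 2))
    for t in range(1, n_sessions):
        data[t] = A_true @ data[t - 1] + rng.normal(scale=0.5, size=2)
    res = VAR(data).fit(maxlags=1, trend="c")
    coef_matrices.append(res.coefs[0])        # lag-1 coefficient matrix (2 x 2)

# Prototypical session-to-session dynamics: average over individual models.
prototype = np.mean(coef_matrices, axis=0)
print("Average lag-1 coefficients (rows: equations, columns: predictors):")
print(prototype)
```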
Abstract:
Analysis of recurrent events has been widely discussed in medical, health services, insurance, and engineering areas in recent years. This research proposes to use a nonhomogeneous Yule process with the proportional intensity assumption to model the hazard function for recurrent events data and the associated risk factors. The method assumes that repeated events occur for each individual, with given covariates, according to a nonhomogeneous Yule process with intensity function λ_x(t) = λ_0(t) · exp(x′β). One of the advantages of using a nonhomogeneous Yule process for recurrent events is that it assumes the recurrence rate is proportional to the number of events that have occurred up to time t. Maximum likelihood estimation is used to provide estimates of the parameters in the model, and a generalized scoring iterative procedure is applied in the numerical computation. Model comparisons between the proposed method and other existing recurrent-event models are addressed by simulation. One example, comparing recurrent myocardial infarction events between two distinct populations, Mexican-Americans and non-Hispanic Whites in the Corpus Christi Heart Project, is examined.
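A simulation sketch of recurrent events under a Yule-type (linear birth) intensity with a proportional covariate effect; for simplicity the baseline intensity λ_0(t) is taken as constant, so this is only an illustration of the model structure, not the estimation procedure described above.

```python
# Simulate recurrent events whose intensity after n prior events is
# (n + 1) * lambda0 * exp(x' beta). A constant baseline lambda0 is an
# assumption made for this sketch; parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(1)

def simulate_recurrent_events(x, beta, lambda0=0.05, follow_up=10.0):
    """Return event times in (0, follow_up] for one individual."""
    times, t, n = [], 0.0, 0
    rate_scale = lambda0 * np.exp(np.dot(x, beta))
    while True:
        # Waiting time to the next event is exponential at the current rate,
        # which grows with the number of events already observed.
        t += rng.exponential(1.0 / ((n + 1) * rate_scale))
        if t > follow_up:
            return np.array(times)
        times.append(t)
        n += 1

# Illustrative covariate effect: one binary covariate (e.g., group indicator).
beta = np.array([0.7])
events_group0 = simulate_recurrent_events(x=np.array([0.0]), beta=beta)
events_group1 = simulate_recurrent_events(x=np.array([1.0]), beta=beta)
print(len(events_group0), "events in group 0;", len(events_group1), "in group 1")
```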
Abstract:
Colorectal cancer is the fourth most commonly diagnosed cancer in the United States. Each year about 147,000 people are diagnosed with colorectal cancer and 56,000 people lose their lives to this disease. Most hereditary nonpolyposis colorectal cancers (HNPCC) and 12% of sporadic colorectal cancers show microsatellite instability. Colorectal cancer is a multistep progressive disease. It starts from a mutation in a normal colorectal cell and grows into a clone of cells that further accumulates mutations and finally develops into a malignant tumor. In terms of molecular evolution, the process of colorectal tumor progression represents the acquisition of sequential mutations. Clinical studies use biomarkers such as microsatellites or single nucleotide polymorphisms (SNPs) to study mutation frequencies in colorectal cancer. Microsatellite data obtained from single-genome-equivalent PCR or small-pool PCR can be used to infer tumor progression. Since tumor progression is similar to population evolution, we used the coalescent approach, which is well established in population genetics, to analyze this type of data. Coalescent theory can be used to infer a sample's evolutionary history from the analysis of microsatellite data. The simulation results indicate that the constant-population-size pattern and the rapid-tumor-growth pattern produce different genetic polymorphism patterns. The simulation results were compared with experimental data collected from HNPCC patients. The preliminary results show that the mutation rate in 6 HNPCC patients ranges from 0.001 to 0.01. The patients' polymorphism patterns are similar to the constant-population-size pattern, which implies that tumor progression proceeds through multilineage persistence rather than clonal sequential evolution. These results should be further verified using a larger dataset.
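A toy coalescent simulation with a stepwise mutation model for microsatellite repeat lengths, of the kind used to generate polymorphism patterns under a constant population size; all parameter values and the mutation-rate scaling are illustrative assumptions, not estimates from the patient data.

```python
# Kingman coalescent (constant population size) with stepwise microsatellite
# mutations. theta is a scaled mutation rate in coalescent time units (an
# assumed scaling); sample size and repeat lengths are illustrative.
import numpy as np

rng = np.random.default_rng(2)

def simulate_microsatellites(n_samples=10, theta=1.0, ancestral_length=20):
    # Active lineages: (set of descendant sample indices, branch time so far).
    active = [({i}, 0.0) for i in range(n_samples)]
    branches = []          # completed branches: (descendant set, branch length)
    k = n_samples
    while k > 1:
        t = rng.exponential(2.0 / (k * (k - 1)))   # time to next coalescence
        active = [(s, bl + t) for s, bl in active]
        i, j = rng.choice(k, size=2, replace=False)
        (si, bi), (sj, bj) = active[i], active[j]
        branches += [(si, bi), (sj, bj)]
        active = [a for idx, a in enumerate(active) if idx not in (i, j)]
        active.append((si | sj, 0.0))
        k -= 1
    # Drop stepwise (+1 / -1) mutations onto each branch and apply them to
    # every sample descending from that branch.
    lengths = np.full(n_samples, ancestral_length)
    for descendants, bl in branches:
        for _ in range(rng.poisson(theta / 2.0 * bl)):
            step = rng.choice([-1, 1])
            for d in descendants:
                lengths[d] += step
    return lengths

print(simulate_microsatellites())
```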
Abstract:
The purpose of this dissertation was to develop a conceptual framework which can be used to account for policy decisions made by the House Ways and Means Committee (HW&MC) of the Texas House of Representatives. This analysis will examine the actions of the committee over a ten-year period with the goal of explaining and predicting the success or failure of certain efforts to raise revenue. The basic framework for modelling the revenue decision-making process includes three major components: the decision alternatives, the external factors, and two competing contingency theories. The decision alternatives encompass the particular options available to increase tax revenue. The options were classified as non-innovative or innovative. The non-innovative options included the sales, franchise, property and severance taxes. The innovative options were principally the personal and corporate income taxes. The external factors included political and economic constraints that affected the actions of the HW&MC. Several key political constraints on committee decision-making were addressed, including public attitudes, interest groups, political party strength, and tradition and precedents. The economic constraints that affected revenue decisions included court mandates, federal mandates, and the fiscal condition of the nation and the state. The third component of the revenue decision-making framework included two alternative contingency theories. The first theory postulated that the committee structure, including the individual member roles and the overall committee style, resulted in distinctive revenue decisions. This theory will be favored if evidence points to the committee acting autonomously with less concern for the policies of the Speaker of the House. The second theory, the Speaker assignment theory, postulated that the assignment of committee members shaped or changed the course of committee decision-making. This theory will be favored if there is evidence that the committee was strictly a vehicle for the Speaker to institute his preferred tax policies. The ultimate goal of this analysis is to develop an explanation for legislative decision-making about tax policy. This explanation will be based on the linkages across various tax options, political and economic constraints, member roles and committee style, and the patterns of committee assignment.
Abstract:
A protocol for the selection, training and validation of the members of a bread sensory analysis panel is proposed to assess the influence of wheat cultivar on the sensory quality of bread. Three cultivars of bread wheat and two cultivars of spelt wheat, organically grown under the same edaphoclimatic conditions, were milled and baked using the same milling and baking procedure. Through the use of triangle tests, differences were identified between the five breads. Significant differences were found between the spelt breads and those made with bread wheat for the attributes "crumb cell homogeneity" and "crumb elasticity". Significant differences were also found for the odor and flavor attributes, with the bread made with "Espelta Navarra" being the most complex from a sensory point of view. Based on the results of this study, we propose that sensory properties should be considered as breeding criteria for future work on genetic improvement.
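As a small worked example of how a triangle test is evaluated: under the null hypothesis of no perceivable difference, an assessor picks the odd sample by chance with probability 1/3, and significance can be assessed with a one-sided binomial test. The counts below are hypothetical, not the study's data.

```python
# Evaluate a triangle test with a one-sided binomial test against the chance
# probability of 1/3. The counts are illustrative placeholders.
from scipy.stats import binomtest

n_assessments = 30   # hypothetical number of triangle-test judgements
n_correct = 16       # hypothetical number of correct identifications

result = binomtest(n_correct, n_assessments, p=1/3, alternative="greater")
print(f"Observed {n_correct}/{n_assessments} correct under H0: p = 1/3; "
      f"one-sided p-value = {result.pvalue:.4f}")
# A small p-value indicates the two breads are perceivably different.
```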