935 results for Electronic data processing -- Quality control
Abstract:
Increased awareness and evolving consumer habits have set more demanding standards for the quality and safety control of food products. The production of foodstuffs which fulfill these standards can be hampered by various low-molecular-weight contaminants. Such compounds include, for example, residues of antibiotics used in animals, or mycotoxins. The extremely small size of these compounds has hindered the development of analytical methods suitable for routine use, and the methods currently in use require expensive instrumentation and qualified personnel to operate them. There is a need for new, cost-efficient and simple assay concepts which can be used for field testing and are capable of processing large sample quantities rapidly. Immunoassays have been considered the gold standard for such rapid on-site screening methods. The introduction of directed antibody engineering and in vitro display technologies has facilitated the development of novel antibody-based methods for the detection of low-molecular-weight food contaminants. The primary aim of this study was to generate and engineer antibodies against low-molecular-weight compounds found in various foodstuffs. The three antigen groups selected as targets of antibody development cause food safety and quality defects in a wide range of products: 1) fluoroquinolones, a family of synthetic broad-spectrum antibacterial drugs used to treat a wide range of human and animal infections; 2) deoxynivalenol, a type B trichothecene mycotoxin and a widely recognized problem for crops and animal feeds globally; and 3) skatole (3-methylindole), one of the two compounds responsible for boar taint, found in the meat of monogastric animals. This study describes the generation and engineering of antibodies with versatile binding properties against low-molecular-weight food contaminants, and the subsequent development of immunoassays for the detection of the respective compounds.
Abstract:
Owing to their growing importance in muscle degeneration, mitochondria are increasingly studied in relation to various myopathies. Their quality control mechanisms are recognized for their important role in mitochondrial health. In this study, we sought to determine whether the mitophagy deficit in Parkin-deficient mice exacerbates the mitochondrial dysfunction normally induced by doxorubicin. We analyzed the impact of Parkin ablation in response to doxorubicin treatment on cardiac function, mitochondrial function and mitochondrial enzymology. Our results show that, at baseline, the absence of Parkin does not induce cardiac pathology but is associated with multiple mitochondrial dysfunctions. Doxorubicin induces respiratory dysfunction, mitochondrial oxidative stress and susceptibility to opening of the permeability transition pore (PTP). Finally, contrary to our hypothesis, the absence of Parkin does not accentuate doxorubicin-induced mitochondrial dysfunction and even appears to exert a protective effect.
Abstract:
The application of computer vision based quality control has been slowly but steadily gaining importance, mainly due to its speed in achieving results and also greatly due to its non-destructive nature of testing. Besides, in food applications it also does not contribute to contamination. However, computer vision applications in quality control need appropriate software for image analysis. Even though computer vision based quality control has several advantages, its application has limitations as to the type of work to be done, particularly so in the food industries. Selective applications, however, can be highly advantageous and very accurate. Computer vision based image analysis could be used in morphometric measurements of fish with the same accuracy as the existing conventional method. The method is non-destructive and non-contaminating, thus providing an advantage in seafood processing. The images could be stored in archives and retrieved at any time to carry out morphometric studies for biologists. Computer vision and subsequent image analysis could be used in measurements of various food products to assess uniformity of size. One product, namely cutlet, and product ingredients, namely coating materials such as bread crumbs and rava, were selected for the study. Computer vision based image analysis was used in the measurements of length, width and area of cutlets. The width of coating materials like bread crumbs was also measured. Computer imaging and subsequent image analysis can be very effectively used in quality evaluations of product ingredients in food processing. Measurement of the width of coating materials could establish uniformity of particles or the lack of it. Image analysis was also applied to bacteriological work.
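As an illustration of the measurement step described in this abstract, here is a minimal sketch using OpenCV, assuming a high-contrast image in which the product (e.g. a cutlet) lies on a plain background; the file name, thresholding choice and calibration factor are placeholder assumptions, not details from the study.

```python
import cv2

# Load the product image in grayscale; "cutlet.png" is a placeholder path.
image = cv2.imread("cutlet.png", cv2.IMREAD_GRAYSCALE)

# Separate the object from the plain background; Otsu's method picks
# the threshold automatically instead of using a hand-tuned cutoff.
_, mask = cv2.threshold(image, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Take the outline of the largest connected object as the product.
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
product = max(contours, key=cv2.contourArea)

# Area in pixels, plus length/width from the minimum-area bounding box.
area_px = cv2.contourArea(product)
(_, _), (w_px, h_px), _ = cv2.minAreaRect(product)
length_px, width_px = max(w_px, h_px), min(w_px, h_px)

# A calibration factor (pixels per mm, from a reference object in the
# scene) converts pixels to physical units; 4.0 is a made-up value.
PX_PER_MM = 4.0
print(f"length {length_px / PX_PER_MM:.1f} mm, "
      f"width {width_px / PX_PER_MM:.1f} mm, "
      f"area {area_px / PX_PER_MM ** 2:.1f} mm^2")
```

The same contour statistics, computed over many particles instead of one, would yield the uniformity measures mentioned for coating materials such as bread crumbs.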
Abstract:
In this paper, various types of fault detection methods for fuel cells are compared, for example those that use a model-based approach, a data-driven approach, or a combination of the two. The potential advantages and drawbacks of each method are discussed and comparisons between methods are made. In particular, classification algorithms are investigated, which separate a data set into classes or clusters based on some prior knowledge or measure of similarity. Specifically, the application of classification methods to vectors of currents reconstructed by magnetic tomography, or directly to vectors of magnetic field measurements, is explored. Bases are simulated using the finite integration technique (FIT) and regularization techniques are employed to overcome ill-posedness. Fisher's linear discriminant is used to illustrate these concepts. Numerical experiments show that the ill-posedness of the magnetic tomography problem is part of the classification problem on magnetic field measurements as well. This is independent of the particular working mode of the cell but is influenced by the type of faulty behavior that is studied. The numerical results demonstrate the ill-posedness through the exponential decay of the singular values for three examples of fault classes.
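To make the classification step concrete, the following is a minimal sketch of Fisher's linear discriminant on synthetic two-class data; the arrays stand in for vectors of magnetic field measurements and are not the FIT-simulated bases used in the paper, and the small regularization term is a generic illustration of the kind needed for ill-conditioned scatter matrices.

```python
import numpy as np

def fisher_direction(X0, X1):
    """Fisher's linear discriminant for two classes.

    X0, X1: (n_samples, n_features) arrays of measurement vectors.
    Returns the unit direction w maximizing between-class separation
    relative to the pooled within-class scatter.
    """
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    # Pooled within-class scatter matrix.
    Sw = np.cov(X0, rowvar=False) * (len(X0) - 1) \
       + np.cov(X1, rowvar=False) * (len(X1) - 1)
    # Regularize: scatter estimated from few, high-dimensional samples
    # is ill-conditioned (cf. the ill-posedness discussed above).
    Sw += 1e-6 * np.eye(Sw.shape[0])
    w = np.linalg.solve(Sw, m1 - m0)
    return w / np.linalg.norm(w)

# Synthetic stand-ins for "normal" and "faulty" field measurements.
rng = np.random.default_rng(0)
normal = rng.normal(0.0, 1.0, size=(50, 8))
faulty = rng.normal(0.5, 1.0, size=(50, 8))

w = fisher_direction(normal, faulty)
threshold = 0.5 * (normal @ w).mean() + 0.5 * (faulty @ w).mean()
new_sample = rng.normal(0.5, 1.0, size=8)
print("faulty" if new_sample @ w > threshold else "normal")
```

Projecting measurements onto w reduces the fault decision to a one-dimensional threshold test, which is why the conditioning of the scatter matrix, and hence the ill-posedness discussed in the abstract, matters for classification.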
Abstract:
The real-time quality control (RTQC) methods applied to Argo profiling float data by the United Kingdom (UK) Met Office, the United States (US) Fleet Numerical Meteorology and Oceanography Centre, the Australian Bureau of Meteorology and the Coriolis Centre are compared and contrasted. Data are taken from the period 2007 to 2011 inclusive, and RTQC performance is assessed with respect to Argo delayed-mode quality control (DMQC). An intercomparison of RTQC techniques is performed using a common data set of profiles from 2010 and 2011. The RTQC systems are found to have similar power in identifying faulty Argo profiles but to vary widely in the number of good profiles incorrectly rejected. The efficacy of individual QC tests is inferred from the results of the intercomparison. Techniques to increase QC performance are discussed.
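For a concrete picture of what one such individual QC test looks like, here is a hedged sketch of a spike test in the form commonly quoted for Argo temperature profiles; the depth-dependent thresholds and the toy profile are illustrative assumptions, not the operational configuration of any centre named above.

```python
import numpy as np

def spike_test(values, pressures, shallow=6.0, deep=2.0):
    """Argo-style spike test for a temperature profile.

    For each interior point V2 with neighbours V1 and V3, the test value
        |V2 - (V3 + V1)/2| - |(V3 - V1)/2|
    flags the point when it exceeds a depth-dependent threshold
    (`shallow` above 500 dbar, `deep` below, as commonly quoted for
    temperature; treat these limits as illustrative).
    """
    flags = np.zeros(len(values), dtype=bool)
    for i in range(1, len(values) - 1):
        v1, v2, v3 = values[i - 1], values[i], values[i + 1]
        test = abs(v2 - (v3 + v1) / 2.0) - abs((v3 - v1) / 2.0)
        limit = shallow if pressures[i] < 500.0 else deep
        flags[i] = test > limit
    return flags

# A toy profile with one obvious spike at 300 dbar.
temp = np.array([20.0, 18.5, 27.0, 16.0, 14.8])
pres = np.array([100.0, 200.0, 300.0, 400.0, 500.0])
print(spike_test(temp, pres))  # -> [False False  True False False]
```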
Abstract:
The business world has changed the way people think and act regarding products and services. In this context, the most recent change in retail operations has been the use of technology in sales and distribution. The internet has revolutionized the way people communicate and, moreover, the way they purchase goods and services. Thus e-commerce, specifically the business-to-customer relation, or simply B2C, has acted convincingly in this paradigm shift, namely from purchases at a physical location to purchases at a virtual site. Online quotes, ease of payment, price and speed of delivery have become real order winners for companies that compete in this segment. With a focus on the quality of services in e-commerce, this research examines the dimensions related to service quality and asks which of these factors are order winners. © 2010 IFIP International Federation for Information Processing.
Abstract:
Nowadays, L1 SBAS signals can be used in combined GPS+SBAS data processing. However, such a situation restricts studies over short baselines. Besides increasing satellite availability, the SBAS satellite orbit configuration is different from that of GPS. In order to analyze how these characteristics can impact GPS positioning in the southeast area of Brazil, experiments involving GPS-only and combined GPS+SBAS data were performed. Solutions using single point and relative positioning were computed to show the impact on satellite geometry, positioning accuracy and short-baseline ambiguity resolution. Results showed that the inclusion of SBAS satellites can improve positioning accuracy. Nevertheless, the poor quality of the data broadcast by these satellites limits their usage. © Springer-Verlag Berlin Heidelberg 2012.
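The effect of adding SBAS satellites to the observed constellation is conventionally summarized by dilution of precision (DOP). The sketch below shows that computation; the line-of-sight unit vectors are made up for illustration and the function is not taken from the processing software used in the study.

```python
import numpy as np

def dops(unit_vectors):
    """Compute GDOP/PDOP from receiver-to-satellite unit vectors.

    unit_vectors: (n_sats, 3) array of line-of-sight unit vectors in a
    local frame. Adding satellites (e.g. SBAS) appends rows, which can
    only decrease (i.e. improve) the DOP values.
    """
    n = len(unit_vectors)
    # Design matrix of the single-point positioning problem:
    # three geometry columns plus one receiver-clock column.
    A = np.hstack([unit_vectors, np.ones((n, 1))])
    Q = np.linalg.inv(A.T @ A)        # cofactor matrix
    gdop = np.sqrt(np.trace(Q))
    pdop = np.sqrt(np.trace(Q[:3, :3]))
    return gdop, pdop

# Hypothetical geometry: four GPS satellites, then one extra SBAS slot.
gps = np.array([[0.3, 0.2, 0.93], [-0.5, 0.4, 0.77],
                [0.6, -0.6, 0.53], [-0.2, -0.7, 0.69]])
sbas = np.array([[0.9, 0.1, 0.42]])
print("GPS only:", dops(gps))
print("GPS+SBAS:", dops(np.vstack([gps, sbas])))
```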
Abstract:
Every port is unique. Although all ports exist for the same basic purpose (to act as an interface in the transfer from one mode of transport to another), no two are ever organized in the same way. Ports may be classified according to:
- Physical conditions: location (geographical position, man-made or natural harbour, estuary location, difficult weather conditions, tides, etc.) and size (large, small or medium-sized).
- Use: commercial (general cargo, bulk solids, bulk liquids, oil, break bulk, mixed), passenger, sport and leisure, fishing, mixed, etc.
- Ownership: private, municipal, regional or State-owned.
- The Port Authority's role in management of the port: overall control, i.e. the Port Authority plans, sets up and operates the whole range of services; facilitator, i.e. the Port Authority plans and sets up the infrastructure and the superstructure, but services are provided by private companies; or landlord, i.e. the Port Authority allows private companies to be responsible for the superstructure and provide port services.
Different combinations of port types will therefore give rise to different kinds of organization and different information flows, which means that the associated information systems may differ significantly from port to port. Since this paper relates to the port of Barcelona, with its own specific characteristics, the contents may not always be applicable to other ports.
Abstract:
Myocardial remodeling and heart failure (HF) are common sequelae of many forms of cardiovascular disease and a leading cause of mortality worldwide. Accumulation of damaged cardiac proteins in heart failure has been described. However, how protein quality control (PQC) is regulated and its contribution to HF development are not known. Here, we describe a novel role for activated protein kinase C isoform beta II (PKC beta II) in disrupting PQC. We show that active PKC beta II directly phosphorylated the proteasome and inhibited proteasomal activity in vitro and in cultured neonatal cardiomyocytes. Importantly, inhibition of PKC beta II, using a selective PKC beta II peptide inhibitor (beta IIV5-3), improved proteasomal activity and conferred protection in cultured neonatal cardiomyocytes. We also show that sustained inhibition of PKC beta II increased proteasomal activity, decreased accumulation of damaged and misfolded proteins and increased animal survival in two rat models of HF. Interestingly, beta IIV5-3-mediated protection was blunted by sustained proteasomal inhibition in HF. Finally, increased cardiac PKC beta II activity and accumulation of misfolded proteins associated with decreased proteasomal function were also found in remodeled and failing human hearts, indicating a potential clinical relevance of our findings. Together, our data highlight PKC beta II as a novel inhibitor of proteasomal function. PQC disruption by increased PKC beta II activity in vivo appears to contribute to the pathophysiology of heart failure, suggesting that PKC beta II inhibition may benefit patients with heart failure.
Abstract:
Exercise training is a well-known coadjuvant in heart failure treatment; however, the molecular mechanisms underlying its beneficial effects remain elusive. Regardless of the primary cause, heart failure is often preceded by two distinct phenomena: mitochondrial dysfunction and disruption of cytosolic protein quality control. The objective of this study was to determine the contribution of exercise training to regulating cardiac mitochondrial metabolism and cytosolic protein quality control in a post-myocardial-infarction-induced heart failure (MI-HF) animal model. Our data demonstrated that isolated cardiac mitochondria from MI-HF rats displayed decreased oxygen consumption, reduced maximum calcium uptake and elevated H2O2 release. These changes were accompanied by exacerbated cardiac oxidative stress and proteasomal insufficiency. This decline in proteasomal activity contributes to cardiac protein quality control disruption in our MI-HF model. Using cultured neonatal cardiomyocytes, we showed that either antimycin A or H2O2 resulted in inactivation of proteasomal peptidase activity, accumulation of oxidized proteins and cell death, recapitulating our in vivo model. Of interest, eight weeks of exercise training improved cardiac function, peak oxygen uptake and exercise tolerance in MI-HF rats. Moreover, exercise training restored mitochondrial oxygen consumption, increased Ca2+-induced permeability transition and reduced H2O2 release in MI-HF rats. These changes were followed by reduced oxidative stress and better cardiac protein quality control. Taken together, our findings uncover the potential contribution of mitochondrial dysfunction and cytosolic protein quality control disruption to heart failure and highlight the positive effects of exercise training in re-establishing cardiac mitochondrial physiology and protein quality control, reinforcing the importance of this intervention as a nonpharmacological tool for heart failure therapy.
Abstract:
QUESTIONS UNDER STUDY / PRINCIPLES: Interest groups advocate centre-specific outcome data as a useful tool for patients in choosing a hospital for their treatment and for decision-making by politicians and the insurance industry. Haematopoietic stem cell transplantation (HSCT) requires significant infrastructure and represents a cost-intensive procedure. It therefore qualifies as a prime target for such a policy. METHODS: We made use of the comprehensive database of the Swiss Blood Stem Cells Transplant Group (SBST) to evaluate the potential use of mortality rates. Nine institutions reported a total of 4717 HSCT - 1427 allogeneic (30.3%), 3290 autologous (69.7%) - in 3808 patients between the years 1997 and 2008. Data were analysed for survival and transplantation-related mortality (TRM) at day 100 and at 5 years. RESULTS: The data showed marked and significant differences between centres in unadjusted analyses. These differences were absent or marginal when the results were adjusted for disease, year of transplant and the EBMT risk score (a score incorporating patient age, disease stage, time interval between diagnosis and transplantation, and, for allogeneic transplants, donor type and donor-recipient gender combination) in a multivariable analysis. CONCLUSIONS: These data indicate comparable quality among centres in Switzerland. They show that comparison of crude centre-specific outcome data without adjustment for the patient mix may be misleading. Mandatory data collection and systematic review of all cases within a comprehensive quality management system might, in contrast, serve as a model to ascertain the quality of other cost-intensive therapies in Switzerland.
Abstract:
For virtually all hospitals, utilization rates are a critical managerial indicator of efficiency and are determined in part by turnover time. Turnover time is defined as the time elapsed between surgeries, during which the operating room is cleaned and prepared for the next surgery. Lengthier turnover times result in lower utilization rates, thereby hindering hospitals' ability to maximize the number of patients that can be attended to. In this thesis, we analyze operating room data from a two-year period provided by Evangelical Community Hospital in Lewisburg, Pennsylvania, to understand the variability of the turnover process. From the recorded data provided, we derive our best estimation of turnover time. Recognizing the importance of being able to properly model turnover times in order to improve the accuracy of scheduling, we seek to fit distributions to the set of turnover times. We find that log-normal and log-logistic distributions are well suited to turnover times, although further research must validate this finding. We propose that the choice of distribution depends on the hospital and, as a result, a hospital must choose whether to use the log-normal or the log-logistic distribution. Next, we use statistical tests to identify variables that may potentially influence turnover time. We find that there does not appear to be a correlation between surgery time and turnover time across doctors. However, there are statistically significant differences between the mean turnover times across doctors. The final component of our research entails analyzing and explaining the benefits of introducing control charts as a quality control mechanism for monitoring turnover times in hospitals. Although widely instituted in other industries, control charts are not widely adopted in healthcare environments, despite their potential benefits. A major component of our work is the development of control charts to monitor the stability of turnover times. These charts can be easily instituted in hospitals to reduce the variability of turnover times. Overall, our analysis uses operations research techniques to analyze turnover times and identify manners for improvement in lowering the mean turnover time and the variability in turnover times. We provide valuable insight into a component of the surgery process that has received little attention but can significantly affect utilization rates in hospitals. Most critically, an ability to more accurately predict turnover times and a better understanding of the sources of variability can result in improved scheduling and heightened hospital staff and patient satisfaction. We hope that our findings can apply to many other hospital settings.
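As a sketch of the distribution-fitting step described above, the following assumes SciPy and synthetic turnover times in minutes (the hospital's data are not reproduced here); note that the log-logistic distribution appears in scipy.stats under the name fisk.

```python
import numpy as np
from scipy import stats

# Synthetic turnover times (minutes) standing in for the hospital data.
rng = np.random.default_rng(1)
turnover = rng.lognormal(mean=3.4, sigma=0.35, size=500)

# Fit the two candidate distributions; floc=0 keeps the support at t > 0.
lognorm_params = stats.lognorm.fit(turnover, floc=0)
fisk_params = stats.fisk.fit(turnover, floc=0)  # fisk = log-logistic

# Compare fits with the Kolmogorov-Smirnov statistic (lower is better).
for name, dist, params in [("log-normal", stats.lognorm, lognorm_params),
                           ("log-logistic", stats.fisk, fisk_params)]:
    ks = stats.kstest(turnover, dist.cdf, args=params)
    print(f"{name}: KS statistic = {ks.statistic:.4f}")
```

Run on a hospital's own records, a comparison of this kind would support the per-hospital choice between the two distributions that the thesis proposes.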
Abstract:
Bovine spongiform encephalopathy (BSE) rapid tests and routine BSE-testing laboratories are subject to strict regulations for approval. Due to the lack of BSE-positive control samples, however, full assay validation at the level of individual test runs and continuous monitoring of test performance on-site is difficult. Most rapid tests use synthetic prion protein peptides, but it is not known to what extent they reflect the assay performance on field samples, and whether they are sufficient to indicate on-site assay quality problems. To address this question we compared the test scores of the provided kit peptide controls to those of standardized weak BSE-positive tissue samples in individual test runs, as well as continuously over time by quality control charts, in two widely used BSE rapid tests. Our results reveal only a weak correlation between the weak positive tissue control and the peptide control scores. We identified kit-lot-related shifts in assay performance that were not reflected by the peptide control scores. Vice versa, not all shifts indicated by the peptide control scores indeed reflected a shift in assay performance. In conclusion, these data highlight that the use of the kit peptide controls for continuous quality control purposes may result in unjustified rejection or acceptance of test runs. However, standardized weak positive tissue controls in combination with Shewhart-CUSUM control charts appear to be reliable in continuously monitoring assay performance on-site to identify undesired deviations.
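A minimal sketch of Shewhart-CUSUM monitoring of the kind described above, assuming a stream of standardized scores from a weak positive tissue control; the target mean, standard deviation and the k = 0.5, h = 4 design constants (in sigma units) are illustrative assumptions, not values from the study.

```python
import numpy as np

def shewhart_cusum(scores, mu, sigma, k=0.5, h=4.0, shewhart=3.0):
    """Combined Shewhart-CUSUM chart on standardized control scores.

    mu, sigma: target mean and standard deviation of the weak positive
    control, estimated from an in-control baseline period.
    k, h: CUSUM reference value and decision limit in sigma units;
    shewhart: classic 3-sigma limit. All values here are illustrative.
    """
    hi = lo = 0.0
    for i, x in enumerate(scores):
        z = (x - mu) / sigma
        # One-sided CUSUMs accumulate deviations beyond +/- k sigma.
        hi = max(0.0, hi + z - k)
        lo = min(0.0, lo + z + k)
        if abs(z) > shewhart:
            print(f"run {i}: Shewhart signal (z = {z:+.2f})")
        if hi > h or lo < -h:
            print(f"run {i}: CUSUM signal (hi = {hi:.2f}, lo = {lo:.2f})")
            hi = lo = 0.0  # reset after the shift is investigated

# Simulated test scores with a kit-lot shift after run 19.
rng = np.random.default_rng(2)
scores = np.concatenate([rng.normal(1.0, 0.1, 20),    # in control
                         rng.normal(0.85, 0.1, 20)])  # shifted lot
shewhart_cusum(scores, mu=1.0, sigma=0.1)
```

The CUSUM sums react to small sustained shifts, such as the kit-lot effects reported above, that individual-run Shewhart limits would miss; resetting them after a signal mimics investigating and correcting the assay before monitoring resumes.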
Abstract:
Introduction
Since the quality of patient portrayal by standardized patients (SPs) during an Objective Structured Clinical Exam (OSCE) has a major impact on the reliability and validity of the exam, quality control should be initiated. Literature about quality control of SPs' performance focuses on feedback [1, 2] or completion of checklists [3, 4]. Since we did not find a published instrument meeting our needs for the assessment of patient portrayal, we developed such an instrument, after being inspired by others [5], and used it in our high-stakes exam.
Methods
SP trainers from all five Swiss medical faculties collected and prioritized quality criteria for patient portrayal. Items were revised with the partners twice, based on experiences during OSCEs. The final instrument contains 14 criteria for acting (i.e. adequate verbal and non-verbal expression) and standardization (i.e. verbatim delivery of the first sentence). All partners used the instrument during a high-stakes OSCE. Both SPs and trainers were introduced to the instrument. The tool was used in training (more than 100 observations) and during the exam (more than 250 observations).
FAIR_OSCE
The list of items to assess the quality of the simulation by SPs was primarily developed and used to provide formative feedback to the SPs in order to help them improve their performance. It was therefore named "Feedback structure for the Assessment of Interactive Role play in Objective Structured Clinical Exams" (FAIR_OSCE). It was also used to assess the quality of patient portrayal during the exam. The results were calculated for each of the five faculties individually. Formative evaluation was given to the five faculties, with individual feedback, without revealing the results of other faculties beyond the overall results.
Results
High quality of patient portrayal during the exam was documented. More than 90% of SP performances were rated as completely correct or sufficient. An increase in quality of performance between training and exam was noted. For example, the rate of completely correct reactions in medical tests increased from 88% to 95%; these 95% completely correct reactions, together with 4% sufficient reactions, add up to 99% of reactions meeting the requirements of the exam. SP educators using the instrument reported an improvement in SPs' performance induced by the use of the instrument. Disadvantages mentioned were the high concentration needed to explicitly observe all criteria and the cumbersome handling of the paper-based forms.
Conclusion
We were able to document a very high quality of SP performance in our exam. The data also indicate that our training is effective. We believe that the high concentration needed to use the instrument is well invested, considering the observed improvement in performance. The development of an iPad-based application for the form is planned to address the cumbersome handling of the paper forms.