873 results for Medical care Quality control Statistical methods


Relevance: 100.00%

Publisher:

Abstract:

Background: No studies have attempted to determine whether nodal surgery utilization, time to initiation and completion of chemotherapy, or surveillance mammography affect breast cancer survival.

Objectives and Methods: To determine whether receipt of nodal surgery, initiation and completion of chemotherapy, and surveillance mammography affect racial disparities in survival among breast cancer patients in SEER areas, 1992-2005.

Results: Adjusting for nodal surgery did not reduce racial disparities in survival. Patients who initiated chemotherapy more than three months after surgery were 1.8 times more likely to die of breast cancer (95% CI 1.3-2.5) than those who initiated chemotherapy less than a month after surgery, even after controlling for known confounders or for race. Despite correcting for chemotherapy initiation and completion and for known predictors of outcome, African American women still had worse disease-specific survival than their Caucasian counterparts. We found that non-whites underwent surveillance mammography less frequently than whites, and that mammography use during a one- or two-year interval was associated with a small reduction in breast-cancer-specific and all-cause mortality. Women who received a mammogram during a two-year interval could expect the same disease-specific and overall survival benefit as women who received a mammogram during a one-year interval. While adjustment for surveillance mammography receipt and physician visits reduced differences in mortality between blacks and whites, these survival disparities were eliminated after additionally adjusting for the number of surveillance mammograms received.

Conclusions: The disparities in survival among African American and Hispanic women with breast cancer are not explained by nodal surgery utilization or by chemotherapy initiation and completion. Surveillance mammograms, physician visits, and the number of mammograms received may play a major role in achieving equal breast-cancer-specific mortality outcomes for women diagnosed with primary breast cancer. Racial disparities in all-cause mortality were explained to a certain degree by racial differences in surveillance mammography, and were no longer significant after controlling for differences in comorbidity. Focusing on access to quality care and post-treatment surveillance might help achieve national goals to eliminate racial disparities in healthcare and outcomes.
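The hazard ratio quoted above (1.8, 95% CI 1.3-2.5) is the kind of quantity a Cox proportional-hazards model reports. As a minimal sketch of how such an interval is formed on the log scale, with a hypothetical standard error chosen for illustration (not a value from the study):

```python
import math

def hazard_ratio_ci(log_hr: float, se: float, z: float = 1.96):
    """Hazard ratio and 95% CI from a Cox-model log-coefficient and its SE."""
    hr = math.exp(log_hr)
    lower = math.exp(log_hr - z * se)
    upper = math.exp(log_hr + z * se)
    return hr, lower, upper

# Illustrative inputs chosen to roughly reproduce HR 1.8 (1.3-2.5);
# they are NOT the study's fitted values.
hr, lcl, ucl = hazard_ratio_ci(log_hr=math.log(1.8), se=0.165)
print(f"HR = {hr:.2f}, 95% CI ({lcl:.2f}, {ucl:.2f})")
```

The interval is symmetric on the log scale, which is why the published bounds are asymmetric around 1.8.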

Relevance: 100.00%

Publisher:

Abstract:

Complex diseases such as cancer result from multiple genetic changes and environmental exposures. Owing to the rapid development of genotyping and sequencing technologies, we are now able to assess the causal effects of many genetic and environmental factors more accurately. Genome-wide association studies have localized many causal genetic variants predisposing to certain diseases, but these studies explain only a small portion of the heritability of disease. More advanced statistical models are needed to identify and characterize additional genetic and environmental factors and their interactions, enabling us to better understand the causes of complex diseases. In the past decade, thanks to increasing computational capability and novel statistical developments, Bayesian methods have been widely applied in genetics/genomics research and have demonstrated superiority over standard approaches in certain research areas. Gene-environment and gene-gene interaction studies are among the areas where Bayesian methods can fully exert their advantages. This dissertation focuses on developing new Bayesian statistical methods for data analysis with complex gene-environment and gene-gene interactions, and on extending some existing methods for gene-environment interactions to related areas. It comprises three parts: (1) deriving a Bayesian variable selection framework for hierarchical gene-environment and gene-gene interactions; (2) developing Bayesian Natural and Orthogonal Interaction (NOIA) models for gene-environment interactions; and (3) extending two Bayesian statistical methods developed for gene-environment interaction studies to related problems such as adaptively borrowing historical data.
We propose a Bayesian hierarchical mixture model framework that allows us to investigate genetic and environmental effects, gene-gene interactions (epistasis), and gene-environment interactions in the same model. In many practical situations there is a natural hierarchical structure between the main effects and interactions in a linear model. We propose a model that incorporates this hierarchical structure into the Bayesian mixture model, so that irrelevant interaction effects can be removed more efficiently, resulting in more robust, parsimonious, and powerful models. We evaluate both the 'strong hierarchical' and 'weak hierarchical' models, which require that both, or at least one, respectively, of the main effects of the interacting factors be present for the interaction to enter the model. Extensive simulation results show that the proposed strong and weak hierarchical mixture models control the proportion of false positive discoveries and provide a powerful approach to identifying predisposing main effects and interactions in studies with complex gene-environment and gene-gene interactions. We also compare these two models with the 'independent' model, which does not impose the hierarchical constraint, and observe the superior performance of the hierarchical models in most of the situations considered. The proposed models are applied to real data analyses of gene-environment interactions in lung cancer and cutaneous melanoma case-control studies. Bayesian statistical models have the advantage of allowing useful prior information to be incorporated into the modeling process, and the Bayesian mixture model outperforms the multivariate logistic model in parameter estimation and variable selection in most cases.
Our proposed models impose hierarchical constraints that further improve the Bayesian mixture model by reducing the proportion of false positive findings among the identified interactions while successfully recovering reported associations. This is practically appealing for investigating causal factors among a moderate number of candidate genetic and environmental factors together with a relatively large number of interactions. The Natural and Orthogonal Interaction (NOIA) models of genetic effects were previously developed to provide an analysis framework in which the estimated effects for a quantitative trait are statistically orthogonal regardless of whether Hardy-Weinberg Equilibrium (HWE) holds within loci. Ma et al. (2012) recently developed a NOIA model for gene-environment interaction studies and showed its advantages over the usual functional model for detecting true main effects and interactions. In this project, we propose a novel Bayesian statistical model that combines the Bayesian hierarchical mixture model with the NOIA statistical model and the usual functional model. The proposed Bayesian NOIA model is more powerful at detecting non-null effects, with higher marginal posterior probabilities. We also review two Bayesian statistical models (a Bayesian empirical shrinkage-type estimator and Bayesian model averaging) that were developed for gene-environment interaction studies. Inspired by these models, we develop two novel statistical methods that handle related problems such as borrowing data from historical studies. The proposed methods are analogous to the gene-environment interaction methods in the way they balance statistical efficiency and bias within a unified model.
Through extensive simulation studies, we compare the operating characteristics of the proposed models with existing models, including the hierarchical meta-analysis model. The results show that the proposed approaches adaptively borrow historical data in a data-driven way. These novel models may have a broad range of statistical applications in both genetic/genomic and clinical studies.
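The strong and weak heredity rules described above can be stated as a simple predicate on which interaction terms are eligible for selection. A toy sketch, with hypothetical variable names:

```python
def interaction_allowed(pair, selected_main, rule="strong"):
    """Heredity constraint on a candidate interaction term.

    'strong': both parent main effects must already be selected.
    'weak':   at least one parent main effect must be selected.
    Any other rule corresponds to the unconstrained 'independent' model.
    """
    a, b = pair
    if rule == "strong":
        return a in selected_main and b in selected_main
    if rule == "weak":
        return a in selected_main or b in selected_main
    return True  # 'independent' model: no constraint

# Hypothetical factors, purely for illustration
selected = {"smoking", "geneA"}
print(interaction_allowed(("smoking", "geneA"), selected, "strong"))  # True
print(interaction_allowed(("smoking", "geneB"), selected, "strong"))  # False
print(interaction_allowed(("smoking", "geneB"), selected, "weak"))    # True
```

In the Bayesian mixture model this constraint is enforced through the prior on the interaction indicators rather than as a hard filter, but the eligibility logic is the same.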

Relevance: 100.00%

Publisher:

Abstract:

On cover: QC manual.

Relevance: 100.00%

Publisher:

Abstract:

"Compiled by a task group under the chairmanship of E. J. Delate."

Relevance: 100.00%

Publisher:

Abstract:

"OTA-H-386."--P. [4] of cover.

Relevance: 100.00%

Publisher:

Abstract:

Purpose: To develop an effective method for evaluating the quality of Cortex berberidis from different geographical origins. Methods: A simple, precise and accurate high performance liquid chromatography (HPLC) method was developed for simultaneous quantification of four active alkaloids (magnoflorine, jatrorrhizine, palmatine, and berberine) in Cortex berberidis obtained from Qinghai, Tibet and Sichuan Provinces of China. Method validation was performed in terms of precision, repeatability, stability, accuracy, and linearity. In addition, partial least squares discriminant analysis (PLS-DA) and one-way analysis of variance (ANOVA) were applied to study quality variations of Cortex berberidis from the various geographical origins. Results: The proposed HPLC method showed good linearity, precision, repeatability, and accuracy. The four alkaloids were detected in all samples of Cortex berberidis. Among them, magnoflorine (36.46 - 87.30 mg/g) consistently showed the highest content in all samples, followed by berberine (16.00 - 37.50 mg/g); the content ranged from 0.66 - 4.57 mg/g for palmatine and from 1.53 - 16.26 mg/g for jatrorrhizine. The total content of the four alkaloids ranged from 67.62 to 114.79 mg/g. Moreover, the PLS-DA and ANOVA results showed that the magnoflorine level and the total content of the four alkaloids in Qinghai and Tibet samples were significantly higher (p < 0.01) than those in Sichuan samples. Conclusion: Quantification of multiple ingredients by HPLC combined with statistical methods provides an effective approach to origin discrimination and quality evaluation of Cortex berberidis. The quality of Cortex berberidis correlates closely with the geographical origin of the samples, with samples from Qinghai and Tibet exhibiting superior quality to those from Sichuan.
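The one-way ANOVA used above compares mean alkaloid content across origin groups via an F statistic. A self-contained sketch of that computation, using hypothetical magnoflorine values rather than the paper's data:

```python
def one_way_anova_f(groups):
    """F statistic and degrees of freedom for a one-way ANOVA
    over a list of sample groups (lists of measurements)."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    # Between-group and within-group sums of squares
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    df_b, df_w = k - 1, n - k
    return (ss_between / df_b) / (ss_within / df_w), df_b, df_w

# Hypothetical magnoflorine contents (mg/g) by origin -- NOT the paper's data
qinghai = [80.1, 85.3, 83.2]
tibet   = [78.5, 82.0, 84.1]
sichuan = [40.2, 45.7, 43.1]
f, df_b, df_w = one_way_anova_f([qinghai, tibet, sichuan])
print(f"F({df_b},{df_w}) = {f:.1f}")
```

A large F relative to the F(df_b, df_w) reference distribution is what underlies the p < 0.01 finding for the origin differences.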

Relevance: 100.00%

Publisher:

Abstract:

Under the concept of Total Quality Control (TQC), and based on their experience, the authors discuss customers' quality demands on immunization services and possible ways to meet them. Abstract (translated from Chinese): Total Quality Control (TQC) is a quality management concept introduced in the 1960s by the Americans V. Feigenbaum and J. Juran. The well-known ISO9000 family of standards is built on the TQC concept and has become the most authoritative and globally consistent international rule for quality management and quality assurance [1-2]. The 21st century is the century of quality: implementing TQC and continuously improving product and service quality has become an important means for every industry in China to improve itself and secure its survival and development amid intensifying market competition. Immunization is an important measure for preventing and controlling infectious diseases and protecting population health. In immunization work, the product is the immunization service itself, and the demand side (the customers) are the people who receive it, i.e., the consumers of the product. With rapid social development and rising health expectations, the public places ever higher quality demands on immunization services. This paper comprehensively analyzes customers' quality requirements for immunization services under the TQC model and offers a preliminary discussion of how to improve service quality.

Relevance: 100.00%

Publisher:

Abstract:

A modeling framework is presented in this paper, integrating hydrologic scenarios projected from a General Circulation Model (GCM) with a water quality simulation model to quantify future expected risk. Statistical downscaling with Canonical Correlation Analysis (CCA) is carried out to develop future scenarios of hydro-climate variables, starting from simulations provided by a GCM. Multiple Logistic Regression (MLR) is used to quantify the risk of Low Water Quality (LWQ) relative to a threshold quality level, with streamflow and water temperature as explanatory variables. An Imprecise Fuzzy Waste Load Allocation Model (IFWLAM), presented in an earlier study, is then used to develop adaptive policies to address the projected water quality risks. Application of the proposed methodology is demonstrated with a case study of the Tunga-Bhadra river in India. The results show that the projected changes in the hydro-climate variables tend to diminish dissolved oxygen (DO) levels, thus increasing the future risk of LWQ.
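A multiple logistic regression of this kind maps the two explanatory variables to a probability of low water quality through the logistic function. A minimal sketch with hypothetical, unfitted coefficients (signs chosen so that lower flow and warmer water raise the risk, consistent with the DO finding above):

```python
import math

def low_water_quality_risk(flow, temp, b0, b_flow, b_temp):
    """Logistic-regression probability of Low Water Quality (LWQ)
    given streamflow and water temperature."""
    z = b0 + b_flow * flow + b_temp * temp
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical coefficients for illustration only, not fitted values
risk_now    = low_water_quality_risk(flow=120.0, temp=24.0,
                                     b0=-2.0, b_flow=-0.01, b_temp=0.12)
risk_future = low_water_quality_risk(flow=90.0, temp=27.0,
                                     b0=-2.0, b_flow=-0.01, b_temp=0.12)
print(f"risk now: {risk_now:.2f}, projected: {risk_future:.2f}")
```

With these illustrative coefficients, the projected reduction in flow and rise in temperature increase the modeled LWQ probability, mirroring the qualitative result of the study.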

Relevance: 100.00%

Publisher:

Abstract:

In this paper, the background to the development of an analytical quality control procedure for the Trophic Diatom Index (TDI) is explained, highlighting some of the statistical and taxonomic problems encountered, and going on to demonstrate how the system works in practice. Most diatom-based pollution indices, including the TDI, use changes in the relative proportions of different taxa to indicate changing environmental conditions. The techniques involved are therefore much simpler than those involved in many studies of phytoplankton, for example, where absolute numbers are required.
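Indices of this family are typically computed as an abundance-weighted mean of per-taxon scores, so only relative proportions matter. A sketch of that style of calculation, using made-up counts and scores (the published TDI assigns each taxon a sensitivity and an indicator value; the 0-100 rescaling below follows the commonly cited form of the index, stated here as an assumption):

```python
def weighted_mean_sensitivity(taxa):
    """Abundance-weighted mean sensitivity over (abundance,
    sensitivity, indicator_value) triples; absolute counts cancel,
    so only relative proportions of taxa affect the result."""
    num = sum(a * s * v for a, s, v in taxa)
    den = sum(a * v for a, s, v in taxa)
    return num / den

# Hypothetical diatom counts: (abundance, sensitivity 1-5, indicator value 1-3)
sample = [(120, 1, 3), (60, 3, 2), (20, 5, 1)]
wms = weighted_mean_sensitivity(sample)
tdi = 25 * wms - 25  # assumed rescaling of WMS onto a 0-100 scale
print(f"WMS = {wms:.2f}, TDI = {tdi:.1f}")
```

Because the index depends only on proportions, counting a fixed number of valves per slide suffices, which is the simplification the abstract contrasts with absolute phytoplankton counts.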

Relevance: 100.00%

Publisher:

Abstract:

Purpose – The purpose of this paper is to develop a quality control tool based on rheological test methods for solder paste and flux media. Design/methodology/approach – The rheological characterisation of solder pastes and flux media was carried out through creep-recovery, thixotropy and viscosity test methods. A rheometer with a parallel-plate measuring geometry of 40 mm diameter and a gap height of 1 mm was used to characterise the pastes and associated flux media. Findings – The results show that the creep-recovery test can be used to study the deformation and recovery of the pastes, and hence to understand slump behaviour in solder pastes. In contrast, the thixotropy and viscosity tests were unsuccessful in distinguishing the rheological flow behaviour of the solder paste and flux medium samples. Research limitations/implications – More extensive rheological and printing testing is needed to correlate the findings of this study with the printing performance of the pastes. Practical implications – The rheological test methods presented in the paper will provide important information for research and development, quality control and production staff, facilitating the manufacture of solder pastes and flux media. Originality/value – The paper explains how rheological tests can be used as a quality control tool to assess the suitability of a developmental solder paste and flux media for the printing process.
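One common way to summarise a creep-recovery curve is the percentage of peak strain recovered after the stress is removed; a higher recovery suggests a more elastic paste that slumps less. A sketch with hypothetical strain readings (not data from the paper):

```python
def percent_recovery(peak_strain, residual_strain):
    """Percentage of creep strain recovered after stress removal,
    from the peak strain and the final (residual) strain."""
    return 100.0 * (peak_strain - residual_strain) / peak_strain

# Hypothetical dimensionless strain readings for two pastes
paste_a = percent_recovery(peak_strain=0.080, residual_strain=0.020)
paste_b = percent_recovery(peak_strain=0.080, residual_strain=0.055)
print(f"paste A: {paste_a:.0f}% recovered, paste B: {paste_b:.0f}% recovered")
```

Comparing such recovery figures across formulations is one way the creep-recovery test could serve as the quality control discriminator the paper describes.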

Relevance: 100.00%

Publisher:

Abstract:

Chemical Imaging (CI) is an emerging platform technology that integrates conventional imaging and spectroscopy to obtain both spatial and spectral information from an object. Vibrational spectroscopic methods, such as Near Infrared (NIR) and Raman spectroscopy, combined with imaging are particularly useful for the analysis of biological/pharmaceutical forms. The rapid, non-destructive and non-invasive nature of CI marks its potential suitability as a process analytical tool for the pharmaceutical industry, for both process monitoring and quality control in the many stages of drug production. This paper provides an overview of CI principles, instrumentation and analysis. Recent applications of Raman and NIR-CI to pharmaceutical quality and process control are presented; challenges facing CI implementation and likely future developments in the technology are also discussed.
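The data object behind a chemical image is a hypercube: two spatial axes plus one spectral axis, so every pixel carries a full spectrum. A minimal sketch of one routine summary, the mean spectrum over all pixels (toy data, plain nested lists):

```python
def mean_spectrum(hypercube):
    """Average spectrum over all pixels of an x * y * wavelength
    hypercube stored as nested lists."""
    nx, ny = len(hypercube), len(hypercube[0])
    n_bands = len(hypercube[0][0])
    totals = [0.0] * n_bands
    for row in hypercube:
        for pixel in row:
            for k, intensity in enumerate(pixel):
                totals[k] += intensity
    return [t / (nx * ny) for t in totals]

# Toy 2x2 image with 3 spectral bands (hypothetical intensities)
cube = [[[1.0, 2.0, 3.0], [1.0, 2.0, 3.0]],
        [[3.0, 2.0, 1.0], [3.0, 2.0, 1.0]]]
print(mean_spectrum(cube))  # [2.0, 2.0, 2.0]
```

Per-band images (one spatial slice per wavelength) and per-pixel spectra are extracted from the same structure, which is what lets CI report both spatial and spectral information at once.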

Relevance: 100.00%

Publisher:

Abstract:

Polymer extrusion, in which a polymer is melted and conveyed to a mould or die, forms the basis of most polymer processing techniques. Extruders frequently run at non-optimised conditions and can account for 15–20% of overall process energy losses. At a time of increasing emphasis on energy efficiency, such losses are a major concern for the industry. Product quality, which depends on the homogeneity and stability of the melt flow, which in turn depend on melt temperature and screw speed, is also a concern for processors. Gear pumps can be used to improve the stability of the production line, but their cost is usually high. Likewise, energy meters can be installed, but they too add to the capital cost of the machine. Advanced control incorporating soft-sensing capabilities offers this industry opportunities to improve both quality and energy efficiency. Because of strong correlations between critical variables such as melt temperature and melt pressure, traditional decentralised PID (Proportional–Integral–Derivative) control is incapable of handling such processes when stricter product specifications are imposed or the material is changed from one batch to another. In this paper, new real-time energy monitoring methods are introduced that require neither power meters nor data-driven models. The effects of process settings on energy efficiency and melt quality are then studied using the developed monitoring methods. The process variables include barrel heating temperature, water cooling temperature, and screw speed. Finally, a fuzzy logic controller is developed for a single screw extruder to achieve high melt quality. The performance of the developed controller shows it to be a satisfactory alternative to the expensive gear pump, and the energy efficiency of the extruder can be further improved by optimising the temperature settings.
Experimental results from open-loop control and fuzzy control on a Killion 25 mm single screw extruder are presented to confirm the efficacy of the proposed approach.
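A fuzzy logic controller of the general kind described maps an error signal through overlapping membership functions and a small rule base to a control correction. The toy sketch below is illustrative only (membership ranges, rules and outputs are invented, not the paper's controller): it adjusts screw speed from the melt-temperature error.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_screw_speed_adjust(temp_error):
    """Screw-speed correction (rpm) from melt-temperature error
    (measured minus setpoint, degC). Rules: too hot -> slow the screw,
    too cold -> speed it up, on target -> no change."""
    too_cold  = tri(temp_error, -20.0, -10.0, 0.0)
    on_target = tri(temp_error, -5.0, 0.0, 5.0)
    too_hot   = tri(temp_error, 0.0, 10.0, 20.0)
    weights = too_cold + on_target + too_hot
    if weights == 0.0:
        return 0.0
    # Weighted average of singleton rule outputs: +5, 0 and -5 rpm
    return (5.0 * too_cold + 0.0 * on_target - 5.0 * too_hot) / weights

print(fuzzy_screw_speed_adjust(0.0))   # on target: no speed change
print(fuzzy_screw_speed_adjust(12.0))  # melt too hot: slow the screw
```

Because the rule base encodes qualitative knowledge rather than a fixed linear law, such a controller can tolerate the cross-coupling between melt temperature and pressure that defeats decentralised PID loops.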

Relevance: 100.00%

Publisher:

Abstract:

Health regulatory colleges promote quality practice and continued competence through Quality Assurance (QA) programs. For many colleges, a QA program includes the use of portfolios that incorporate self-directed learning. The purpose of this study was to identify issues surrounding the effectiveness of QA portfolio programs. The literature review revealed that portfolios are valuable tools, but gaps in knowledge include a comparative analysis of QA programs and the perspective of regulatory college administrators. Data were collected through interviews with 6 administrators and a review of 14 portfolio models described on college websites. The results from the two data sources were applied to Robert Stake's responsive evaluation framework to identify issues related to the portfolios' effectiveness (Stake, 1967). The learning components of the portfolios were analyzed through humanist and constructivist lenses. All 14 portfolio models were found to have 3 main components: self-diagnosis, a learning plan and activities, and self-evaluation. However, differences were uncovered in learners' autonomy in selecting learning activities, in methods of portfolio evaluation, and in the relationship between the portfolio and other QA components. The results revealed a dual philosophy of learning in the portfolio models and an apparent contradiction between the needs of the individual learner and those of the organization. Paths for future research include the tenuous relationship between competence and learning, and the impact of technical approaches on self-directed learning initiatives. A key recommendation is to acknowledge the unique identity of each profession so that health regulatory colleges can address both legislative demands and learner needs.