937 results for Process control -- Statistical methods


Relevance:

100.00%

Publisher:

Abstract:

Graduate Program in Mechanical Engineering - FEIS

Relevance:

100.00%

Publisher:

Abstract:

Background: COX-2, a key enzyme in prostaglandin synthesis, is involved in urologic cancer and seems to be associated with tumor progression, invasion, and metastasis. In addition, several effects have been reported for VEGF, including inducing angiogenesis, promoting cell migration, and inhibiting apoptosis. COX-2 and VEGF up-regulation have been reported in human prostate cancer. Due to the importance of the canine natural model for prostate cancer, the aim of this study was to evaluate COX-2 and VEGF protein expression in the canine prostatic carcinogenic process. Material and Methods: Seventy-four prostatic tissues from dogs were selected for evaluation of protein expression by immunohistochemistry (IHC), including: 10 normal prostatic tissues, 20 benign prostatic hyperplasias (BPH), 25 proliferative inflammatory atrophies (PIA), and 20 prostatic carcinomas (PCa). COX-2 and VEGF were detected using the monoclonal antibodies CX-294 (1:50 dilution, Dako Cytomation) and sc-53463 (1:100 dilution, Santa Cruz), respectively. The immunolabelling was performed by a polymer method (Histofine, Nichirei Biosciences). All reactions included negative controls obtained by omitting the primary antibody. The percentage of COX-2- and VEGF-positive cells per lesion was evaluated according to Prowatke et al. (2007). The samples were scored separately according to staining intensity and graded semi-quantitatively as negative (0), weakly positive (1), moderately positive (2), and strongly positive (3). Scoring was done in a single 400× magnification field, considering only the lesion, since this was done on a 1-mm TMA core. For statistical analyses, the immunostaining classifications were reduced to two categories, negative and positive; the negative category included negative and weakly positive staining. The chi-square or Fisher exact test was used to determine the association between the categorical variables. Results: COX-2 protein expression was elevated in the cytoplasm of the canine PCa and PIA compared with normal prostate (p = 0.002). VEGF protein expression was increased in 94.75% of the PCa and 100% of the PIA compared with normal prostate (p = 0.001). No difference was found when comparing normal prostate with BPH. Conclusions: This study demonstrated that carcinogenesis in canine prostatic tissue may be related to a gain of COX-2 and VEGF protein expression.
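
As a hedged illustration of the association test described above, the following sketch runs both a chi-square and a Fisher exact test on a 2x2 table with scipy; the counts are hypothetical, not the study's data.

```python
# Hypothetical 2x2 contingency table: rows = lesion type (PCa vs. normal),
# columns = COX-2 staining (negative vs. positive). Counts are illustrative.
import numpy as np
from scipy import stats

table = np.array([[3, 17],   # PCa:    3 negative, 17 positive
                  [8, 2]])   # normal: 8 negative,  2 positive

chi2, p_chi2, dof, expected = stats.chi2_contingency(table)

# Fisher's exact test is preferred when expected cell counts are small.
odds_ratio, p_fisher = stats.fisher_exact(table)

print(f"chi-square p = {p_chi2:.4f}, Fisher exact p = {p_fisher:.4f}")
```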

Relevance:

100.00%

Publisher:

Abstract:

Introduction: The aim of this study was to assess the effect of nitrogen ion implantation on the flexibility of rotary nickel-titanium (NiTi) instruments, as measured by the load required to bend implanted and nonimplanted instruments to a 30-degree angle. Methods: Thirty K3 files, size #40, 0.02 taper, and 25-mm length, were allocated into 2 groups as follows: group A, 15 files exposed to nitrogen ion implantation at a dose of 2.5 × 10^17 ions/cm^2, energy of 200 keV, current density of 1 μA/cm^2, temperature of 130°C, and vacuum of 10 × 10^-6 mm Hg for 6 hours; and group B, 15 nonimplanted files. One extra file was used for process control. All instruments were subjected to bend testing on a modified troptometer, with measurement of the load required for flexure to an angle of 30 degrees. The Mann-Whitney U test was used for statistical analysis; findings with P < .05 were considered significant. Results: The mean load required to bend instruments to a 30-degree angle was 376.26 g for implanted instruments and 383.78 g for nonimplanted instruments. The difference was not statistically significant. Conclusions: Our findings show that nitrogen ion implantation has no appreciable effect on the flexibility of NiTi instruments. (J Endod 2012;38:673-675)
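
For illustration, a minimal scipy sketch of the Mann-Whitney U comparison described above; the bending loads are hypothetical values, not the study's measurements.

```python
# Hypothetical bending loads (g) for implanted vs. nonimplanted files.
from scipy import stats

implanted    = [370.1, 382.4, 375.0, 379.8, 368.9, 381.2]
nonimplanted = [385.6, 379.2, 390.1, 382.7, 377.5, 388.3]

u_stat, p_value = stats.mannwhitneyu(implanted, nonimplanted,
                                     alternative="two-sided")
print(f"U = {u_stat}, p = {p_value:.3f}")  # significant if p < .05
```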

Relevance:

100.00%

Publisher:

Abstract:

Quality concepts are an important factor in the success of organizations, and among these concepts the stabilization of the production process contributes to improvement, waste reduction, and increased competitiveness. This study therefore evaluated the predictability and capability of a solid wood flooring production process based on its critical points. The research was divided into three stages. The first was the mapping of the company's processes and the elaboration of flowcharts for the activities. The second was the identification and evaluation of the critical points using an adapted FMEA (Failure Mode and Effect Analysis) methodology. The third was the evaluation of the critical points by applying statistical process control and determining process capability through the Cpk index. The results showed the existence of six processes, two of which are critical. Within those two, fifteen points were considered critical, and two of them, related to the dimensions of the pieces and to defects caused by sandpaper, were selected for evaluation. The company's production process is unstable and not capable of producing wood flooring according to the specifications; these specifications should therefore be reevaluated.
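
A minimal sketch of the capability calculation for one critical point, assuming hypothetical specification limits and measurements; the Cpk index compares the distance from the process mean to the nearest specification limit with the process spread.

```python
# Minimal Cpk sketch for one critical point (piece width, mm). The
# specification limits and measurements are hypothetical, not the mill's data.
import numpy as np

LSL, USL = 89.5, 90.5            # lower/upper specification limits (mm)
widths = np.array([89.7, 90.1, 90.4, 89.9, 90.6, 90.0, 89.8, 90.3])

mean, sigma = widths.mean(), widths.std(ddof=1)
cpk = min(USL - mean, mean - LSL) / (3 * sigma)

# A common rule of thumb: the process is considered capable when Cpk >= 1.33.
print(f"Cpk = {cpk:.2f} -> {'capable' if cpk >= 1.33 else 'not capable'}")
```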

Relevance:

100.00%

Publisher:

Abstract:

Hybrid vehicles represent the future for automakers, since they improve fuel economy and reduce pollutant emissions. A key component of the hybrid powertrain is the Energy Storage System, which determines the vehicle's ability to store and reuse energy. Although electrified Energy Storage Systems (ESS), based on batteries and ultracapacitors, are a proven technology, Alternative Energy Storage Systems (AESS), based on mechanical, hydraulic, and pneumatic devices, are gaining interest because they make low-cost mild-hybrid vehicles feasible. Most design methodologies in the literature focus on electric ESS and are not suitable for AESS design. In this context, The Ohio State University has developed an AESS design methodology. This work focuses on the development of a driving cycle analysis methodology, a key component of the AESS design procedure. The proposed methodology is based on a statistical approach to analyzing driving schedules that represent the vehicle's typical use. Driving data are broken up into a sequence of power events, namely traction and braking events, and for each event energy-related and dynamic metrics are calculated. By means of a clustering process and statistical synthesis methods, statistically relevant metrics are determined; these metrics define cycle-representative braking events. Using these events as inputs to the AESS design methodology, different system designs are obtained, each characterized by attributes such as system volume and weight. In the last part of the work, the designs are evaluated in simulation by introducing and calculating a metric related to energy conversion efficiency, and the designs are then compared accounting for attributes and efficiency values. To automate the driving data extraction and synthesis process, a MATLAB script was developed. Results show that the driving cycle analysis methodology, based on the statistical approach, extracts and synthesizes cycle-representative data. The designs based on statistically relevant cycle metrics are properly sized and have satisfactory efficiency values relative to expectations. An exception is the design based on the cycle worst-case scenario, corresponding to the approach adopted by conventional electric ESS design methodologies; in this case, a heavy system with poor efficiency is produced. The proposed methodology thus appears to be a valid and consistent support for AESS design.
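
The clustering step might look like the following sketch, which groups hypothetical braking events by their metrics and takes a representative event per cluster; the feature values and the choice of k are illustrative, not the methodology's actual settings.

```python
# Sketch of the clustering step: group braking events by energy, peak power,
# and duration, then take cluster medians as statistically relevant metrics.
import numpy as np
from sklearn.cluster import KMeans

# Each row is one braking event: [energy (kJ), peak power (kW), duration (s)]
events = np.array([[12.0, 25.0, 4.1], [48.0, 80.0, 6.3], [10.5, 22.0, 3.8],
                   [45.0, 75.0, 6.0], [30.0, 50.0, 5.2], [28.0, 48.0, 5.0]])

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(events)

# One representative braking event per cluster (component-wise median).
for label in range(kmeans.n_clusters):
    rep = np.median(events[kmeans.labels_ == label], axis=0)
    print(f"cluster {label}: energy={rep[0]:.1f} kJ, "
          f"peak={rep[1]:.1f} kW, duration={rep[2]:.1f} s")
```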

Relevance:

100.00%

Publisher:

Abstract:

The objective of this study was to develop a criteria catalogue serving as a guideline for authors to improve the quality of reporting of experiments in basic research in homeopathy. A Delphi process was initiated, including three rounds of adjusting and phrasing plus two consensus conferences. European researchers who had published experimental work within the last 5 years were involved. The resulting checklist for authors provides a catalogue of 23 criteria. The “Introduction” should focus on the underlying hypotheses and the homeopathic principle investigated, and state whether the experiments are exploratory or confirmatory. “Materials and methods” should comprise information on the object of investigation, experimental setup, parameters, intervention, and statistical methods. A more detailed description of the homeopathic substances is required, for example, manufacture, dilution method, and starting point of dilution. A further aim of the Delphi process is to raise scientists' awareness of reporting blinding, allocation, replication, quality control, and system performance controls. The “Results” section should provide the exact number of treated units per setting included in each analysis and state missing samples and dropouts. Results presented in tables and figures are as important as appropriate measures of effect size, uncertainty, and probability. The “Discussion” should offer not only a general interpretation of results in the context of current evidence but also limitations and an appraisal of the aptitude of the chosen experimental model. Authors of homeopathic basic research publications are encouraged to apply our checklist when preparing their manuscripts. Feedback on the applicability, strengths, and limitations of the list is encouraged to enable future revisions.

Relevance:

100.00%

Publisher:

Abstract:

The AEGISS (Ascertainment and Enhancement of Gastrointestinal Infection Surveillance and Statistics) project aims to use spatio-temporal statistical methods to identify anomalies in the space-time distribution of non-specific, gastrointestinal infections in the UK, using the Southampton area in southern England as a test case. In this paper, we use the AEGISS project to illustrate how spatio-temporal point process methodology can be used in the development of a rapid-response, spatial surveillance system. Current surveillance of gastroenteric disease in the UK relies on general practitioners reporting cases of suspected food-poisoning through a statutory notification scheme, voluntary laboratory reports of the isolation of gastrointestinal pathogens and standard reports of general outbreaks of infectious intestinal disease by public health and environmental health authorities. However, most statutory notifications are made only after a laboratory reports the isolation of a gastrointestinal pathogen. As a result, detection is delayed and the ability to react to an emerging outbreak is reduced. For more detailed discussion, see Diggle et al. (2003). A new and potentially valuable source of data on the incidence of non-specific gastroenteric infections in the UK is NHS Direct, a 24-hour phone-in clinical advice service. NHS Direct data are less likely than reports by general practitioners to suffer from spatially and temporally localized inconsistencies in reporting rates. Also, reporting delays by patients are likely to be reduced, as no appointments are needed. Against this, NHS Direct data sacrifice specificity. Each call to NHS Direct is classified only according to the general pattern of reported symptoms (Cooper et al., 2003). The current paper focuses on the use of spatio-temporal statistical analysis for early detection of unexplained variation in the spatio-temporal incidence of non-specific gastroenteric symptoms, as reported to NHS Direct. Section 2 describes our statistical formulation of this problem, the nature of the available data and our approach to predictive inference. Section 3 describes the stochastic model. Section 4 gives the results of fitting the model to NHS Direct data. Section 5 shows how the model is used for spatio-temporal prediction. The paper concludes with a short discussion.
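
As a much simpler stand-in for the paper's spatio-temporal point-process model, the sketch below flags days whose call counts exceed a Poisson upper bound; the baseline rate and counts are hypothetical.

```python
# Far simpler than the paper's model: flag days whose NHS Direct call count
# exceeds the upper tail of a Poisson baseline. Values are hypothetical.
from scipy import stats

baseline_rate = 20.0                      # expected calls/day in the region
counts = [18, 22, 19, 25, 21, 41, 23]     # observed daily counts

threshold = stats.poisson.ppf(0.999, baseline_rate)  # 99.9% upper bound
for day, n in enumerate(counts):
    if n > threshold:
        print(f"day {day}: {n} calls exceeds threshold {threshold:.0f} -> flag")
```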

Relevance:

100.00%

Publisher:

Abstract:

Detector uniformity is a fundamental performance characteristic of all modern gamma camera systems, and ensuring a stable, uniform detector response is critical for maintaining clinical images that are free of artifact. For these reasons, the assessment of detector uniformity is one of the most common activities associated with a successful clinical quality assurance program in gamma camera imaging. The evaluation of this parameter, however, is often unclear because it is highly dependent upon acquisition conditions, reviewer expertise, and the application of somewhat arbitrary limits that do not characterize the spatial location of the non-uniformities. Furthermore, as the goal of any robust quality control program is the determination of significant deviations from standard or baseline conditions, clinicians and vendors often neglect the temporal nature of detector degradation (1). This thesis describes the development and testing of new methods for monitoring detector uniformity. These techniques provide more quantitative, sensitive, and specific feedback to the reviewer so that he or she may be better equipped to identify performance degradation prior to its manifestation in clinical images. The methods exploit the temporal nature of detector degradation and spatially segment distinct regions-of-non-uniformity using multi-resolution decomposition. These techniques were tested on synthetic phantom data using different degradation functions, as well as on experimentally acquired time series floods with induced, progressively worsening defects present within the field-of-view. The sensitivity of conventional, global figures-of-merit for detecting changes in uniformity was evaluated and compared to these new image-space techniques. The image-space algorithms provide a reproducible means of detecting regions-of-non-uniformity prior to any single flood image’s having a NEMA uniformity value in excess of 5%. The sensitivity of these image-space algorithms was found to depend on the size and magnitude of the non-uniformities, as well as on the nature of the cause of the non-uniform region. A trend analysis of the conventional figures-of-merit demonstrated their sensitivity to shifts in detector uniformity. The image-space algorithms are computationally efficient. Therefore, the image-space algorithms should be used concomitantly with the trending of the global figures-of-merit in order to provide the reviewer with a richer assessment of gamma camera detector uniformity characteristics.
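
For reference, a minimal sketch of the conventional NEMA integral-uniformity figure of merit mentioned above, computed on a synthetic flood image; the standard's pixel binning and usable-field-of-view masking are omitted here.

```python
# Minimal sketch of the NEMA integral-uniformity figure of merit on a flood
# image: smooth with the standard 9-point kernel, then compute
# 100*(max-min)/(max+min). The flood array here is synthetic.
import numpy as np
from scipy.ndimage import convolve

rng = np.random.default_rng(0)
flood = rng.poisson(10000, size=(64, 64)).astype(float)  # synthetic flood

kernel = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], dtype=float)
smoothed = convolve(flood, kernel / kernel.sum(), mode="nearest")

iu = 100.0 * (smoothed.max() - smoothed.min()) / (smoothed.max() + smoothed.min())
print(f"integral uniformity = {iu:.2f}% (flag if > 5%)")
```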

Relevance:

100.00%

Publisher:

Abstract:

Coalescent theory represents the most significant progress in theoretical population genetics in the past three decades. It states that all genes or alleles in a given population are ultimately inherited from a single ancestor shared by all members of the population, known as the most recent common ancestor. It is now widely recognized as a cornerstone for rigorous statistical analyses of molecular data from populations [1]. Scientists have developed a large number of coalescent models and methods [2,3,4,5,6], which are applied not only in coalescent analysis but also in population genetics, genome studies, and even public health. This thesis aims to complete a computer-based statistical framework for coalescent analysis. The framework provides a large number of coalescent models and statistical methods to assist students and researchers in coalescent analysis, with results presented in various formats: text, graphics, and printed pages. It also supports the creation of new coalescent models and statistical methods.
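
A minimal sketch of the kind of core computation such a framework would provide: simulating the time to the most recent common ancestor under the Kingman coalescent, where k lineages coalesce at rate k(k-1)/2 in coalescent time units; the sample size and replicate count are illustrative.

```python
# Kingman-coalescent sketch: with k lineages, the waiting time to the next
# coalescence is exponential with rate k(k-1)/2 (coalescent time units).
import numpy as np

rng = np.random.default_rng(1)

def tmrca(sample_size: int) -> float:
    """Simulate the time to the most recent common ancestor."""
    t = 0.0
    k = sample_size
    while k > 1:
        t += rng.exponential(2.0 / (k * (k - 1)))  # mean = 1/rate
        k -= 1                                     # two lineages merge
    return t

times = [tmrca(10) for _ in range(10_000)]
# Theory: E[TMRCA] = 2 * (1 - 1/n); for n = 10 that is 1.8.
print(f"mean TMRCA = {np.mean(times):.3f} (theory: {2 * (1 - 1/10):.1f})")
```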

Relevance:

100.00%

Publisher:

Abstract:

Current statistical methods for estimation of parametric effect sizes from a series of experiments are generally restricted to univariate comparisons of standardized mean differences between two treatments. Multivariate methods are presented for the case in which effect size is a vector of standardized multivariate mean differences and the number of treatment groups is two or more. The proposed methods employ a vector of independent sample means for each response variable that leads to a covariance structure which depends only on correlations among the $p$ responses on each subject. Using weighted least squares theory and the assumption that the observations are from normally distributed populations, multivariate hypotheses analogous to common hypotheses used for testing effect sizes were formulated and tested for treatment effects which are correlated through a common control group, through multiple response variables observed on each subject, or both conditions. The asymptotic multivariate distribution for correlated effect sizes is obtained by extending univariate methods for estimating effect sizes which are correlated through common control groups. The joint distribution of vectors of effect sizes (from $p$ responses on each subject) from one treatment and one control group and from several treatment groups sharing a common control group are derived. Methods are given for estimation of linear combinations of effect sizes when certain homogeneity conditions are met, and for estimation of vectors of effect sizes and confidence intervals from $p$ responses on each subject. Computational illustrations are provided using data from studies of effects of electric field exposure on small laboratory animals.
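
As a hedged illustration of the basic quantity involved, the sketch below computes a vector of standardized mean differences for $p = 2$ responses from simulated treatment and control groups; it does not implement the paper's weighted least-squares machinery.

```python
# Vector of standardized mean differences (one per response) for treatment
# vs. control, using a pooled standard deviation per response; data simulated.
import numpy as np

rng = np.random.default_rng(2)
treat = rng.normal([1.0, 0.5], 1.0, size=(30, 2))   # 30 subjects, p=2 responses
ctrl  = rng.normal([0.0, 0.0], 1.0, size=(30, 2))

n1, n2 = len(treat), len(ctrl)
pooled_var = ((n1 - 1) * treat.var(axis=0, ddof=1)
              + (n2 - 1) * ctrl.var(axis=0, ddof=1)) / (n1 + n2 - 2)

d = (treat.mean(axis=0) - ctrl.mean(axis=0)) / np.sqrt(pooled_var)
print("effect-size vector d =", np.round(d, 2))
```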

Relevance:

100.00%

Publisher:

Abstract:

This volume contains the Proceedings of the Twenty-Sixth Annual Biochemical Engineering Symposium held at Kansas State University on September 21, 1996. The program included 10 oral presentations and 14 posters. Some of the papers describe the progress of ongoing projects, and others contain the results of completed projects. Only brief summaries are given of some of the papers; many of the papers will be published in full elsewhere. A listing of those who attended is given below.

Contents
Foreign Protein Production from SV40 Early Promoter in Continuous Cultures of Recombinant CHO Cells - Gautam Banik, Paul Todd, and Dhinakar Kampala
Enhanced Cell Recruitment Due to Cell-Cell Interactions - Brad Farlow and Matthias Nollert
The Recirculation of Hybridoma Suspension Cultures: Effects on Cell Death, Metabolism and Mab Productivity - Peng Jin and Carole A. Heath
The Importance of Enzyme Inactivation and Self-Recovery in Cometabolic Biodegradation of Chlorinated Solvents - Xi-Hui Zhang, Shanka Banerji, and Rakesh Bajpai
Phytoremediation of VOC-Contaminated Groundwater Using Poplar Trees - Melissa Miller, Jason Dana, L.C. Davis, Murlidharan Narayanan, and L.E. Erickson
Biological Treatment of Off-Gases from Aluminum Can Production: Experimental Results and Mathematical Modeling - Adeyma Y. Arroyo, Julio Zimbron, and Kenneth F. Reardon
Inertial Migration Based Separation of Chlorella Microalgae in Branched Tubes - N.M. Poflee, A.L. Rakow, D.S. Dandy, M.L. Chappell, and M.N. Pons
Contribution of Electrochemical Charge to Protein Partitioning in Aqueous Two-Phase Systems - Weiyu Fan and Charles E. Glatz
Biodegradation of Some Commercial Surfactants Used in Bioremediation - Jun Gu, G.W. Preckshot, S.K. Banerji, and Rakesh Bajpai
Modeling the Role of Biomass in Heavy Metal Transport in Vadose Zone - K.V. Nedunuri, L.E. Erickson, and R.S. Govindaraju
Multivariable Statistical Methods for Monitoring Process Quality: Application to Bioinsecticide Production by Bacillus thuringiensis - C. Puente and M.N. Karim
The Use of Polymeric Flocculants in Bacterial Lysate Streams - H. Graham, A.S. Cibulskas, and E.H. Dunlop
Effect of Water Content on Transport of Trichloroethylene in a Chamber with Alfalfa Plants - Muralidharan Narayanan, Jiang Hu, Lawrence C. Davis, and Larry E. Erickson
Detection of Specific Microorganisms Using the Arbitrary Primed PCR in the Bacterial Community of Vegetated Soil - X. Wu and L.C. Davis
Flux Enhancement Using Backpulsing - V.T. Kuberkar and R.H. Davis
Chromatographic Purification of Oligonucleotides: Comparison with Electrophoresis - Stephen P. Cape, Ching-Yuan Lee, Kevin Petrini, Sean Foree, Michael G. Sportiello, and Paul Todd
Determining Singular Arc Control Policies for Bioreactor Systems Using a Modified Iterative Dynamic Programming Algorithm - Arun Tholudur and W. Fred Ramirez
Pressure Effect on Subtilisins Measured via FTIR, EPR and Activity Assays, and Its Impact on Crystallizations - J.N. Webb, R.Y. Waghmare, M.G. Bindewald, T.W. Randolph, J.F. Carpenter, and C.E. Glatz
Intercellular Calcium Changes in Endothelial Cells Exposed to Flow - Laura Worthen and Matthias Nollert
Application of Liquid-Liquid Extraction in Propionic Acid Fermentation - Zhong Gu, Bonita A. Glatz, and Charles E. Glatz
Purification of Recombinant T4 Lysozyme from E. coli: Ion-Exchange Chromatography - Weiyu Fan, Matt L. Thatcher, and Charles E. Glatz
Recovery and Purification of Recombinant Beta-Glucuronidase from Transgenic Corn - Ann R. Kusnadi, Roque Evangelista, Zivko L. Nikolov, and John Howard
Effects of Auxins and Cytokinins on Formation of Catharanthus roseus G. Don Multiple Shoots - Ying-Jin Yuan, Yu-Min Yang, Tsung-Ting Hu, and Jiang Hu
Fate and Effect of Trichloroethylene as Nonaqueous Phase Liquid in Chambers with Alfalfa - Qizhi Zhang, Brent Goplen, Sara Vanderhoof, Lawrence C. Davis, and Larry E. Erickson
Oxygen Transport and Mixing Considerations for Microcarrier Culture of Mammalian Cells in an Airlift Reactor - Sridhar Sunderam, Frederick R. Souder, and Marylee Southard
Effects of Cyclic Shear Stress on Mammalian Cells under Laminar Flow Conditions: Apparatus and Methods - M.L. Rigney, M.H. Liew, and M.Z. Southard

Relevance:

100.00%

Publisher:

Abstract:

Increasing economic competition drives industry to implement tools that improve process efficiency. Process automation is one of these tools, and Real Time Optimization (RTO) is an automation methodology that considers economic aspects to update the process control in accordance with market prices and disturbances. Basically, RTO uses a steady-state phenomenological model to predict the process behavior and then optimizes an economic objective function subject to this model. Although largely implemented in industry, there is no general agreement about the benefits of implementing RTO, due to some limitations discussed in the present work: structural plant/model mismatch, identifiability issues, and low frequency of set-point updates. Some alternative RTO approaches have been proposed in the literature to handle the problem of structural plant/model mismatch; however, there is no systematic comparison evaluating the scope and limitations of these approaches under different aspects. For this reason, the classical two-step method is compared with more recent derivative-based methods (Modifier Adaptation; Integrated System Optimization and Parameter Estimation; and Sufficient Conditions of Feasibility and Optimality) using a Monte Carlo methodology. The results of this comparison show that the classical RTO method is consistent provided that the model is flexible enough to represent the process topology, the parameter estimation method is appropriate for the measurement noise characteristics, and a method is used to improve the quality of sample information. At each iteration, the RTO methodology updates key parameters of the model; identifiability issues caused by lack of measurements and by measurement noise can arise at this step, resulting in poor prediction ability. Therefore, four different parameter estimation approaches (Rotational Discrimination; Automatic Selection and Parameter Estimation; Reparametrization via Differential Geometry; and classical nonlinear Least Squares) are evaluated with respect to their prediction accuracy, robustness, and speed. The results show that the Rotational Discrimination method is the most suitable for implementation in an RTO framework, since it requires less a priori information, is simple to implement, and avoids the overfitting caused by the Least Squares method. The third RTO drawback discussed in the present thesis is the low frequency of set-point updates, which increases the period in which the process operates at suboptimal conditions. An alternative is proposed in this thesis: integrating classic RTO and Self-Optimizing Control (SOC) using a new Model Predictive Control strategy. The new approach demonstrates that it is possible to reduce the problem of infrequent set-point updates and improve economic performance. Finally, the practical aspects of RTO implementation are examined in an industrial case study, a Vapor Recompression Distillation (VRD) process located in the Paulínia refinery of Petrobras. The conclusions of this study suggest that the model parameters are successfully estimated by the Rotational Discrimination method; that RTO is able to improve the process profit by about 3%, equivalent to 2 million dollars per year; and that the integration of SOC and RTO may be an interesting control alternative for the VRD process.
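
A toy sketch of the classical two-step RTO iteration discussed above, with a deliberately mismatched model so that the iteration settles away from the true plant optimum; the quadratic plant, model, and prices are hypothetical stand-ins, not the thesis's case study.

```python
# Two-step RTO toy: (1) fit the model parameter to a plant measurement,
# (2) optimize an economic objective subject to the fitted model.
from scipy.optimize import minimize_scalar

def plant(u):                        # "true" plant response, unknown to RTO
    return 2.0 * u - 0.15 * u**2

def model(u, theta):                 # model with mismatched curvature
    return theta * u - 0.10 * u**2

def profit(y, u, price=5.0, cost=1.0):
    return price * y - cost * u

u = 3.0                              # initial set point
for it in range(5):
    y_meas = plant(u)                            # step 1: measure the plant
    theta = (y_meas + 0.10 * u**2) / u           # fit theta exactly at u
    res = minimize_scalar(lambda v: -profit(model(v, theta), v),
                          bounds=(0.1, 10.0), method="bounded")
    u = res.x                                    # step 2: re-optimize
    print(f"iter {it}: theta = {theta:.3f}, new set point u = {u:.3f}")
# Structural mismatch makes the iteration settle near u = 7.2, away from the
# true plant optimum u = 6, illustrating the plant/model mismatch problem.
```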

Relevance:

100.00%

Publisher:

Abstract:

Background: There is strong evidence of the efficacy of family psychosocial interventions for schizophrenia, but evidence of the role played by the attitudes of relatives in the therapeutic process is lacking. Method: To study the effect of a family intervention on family attitudes and to analyse their mediating role in the therapeutic process, 50 patients with schizophrenia and their key relatives undergoing a trial of the efficacy of a family psychosocial intervention were studied by means of the Affective Style Coding System, the Scale of Empathy, and the Relational Control Coding System. Specific statistical methods were used to determine the nature of the relationship between the relatives' attitudes and the outcome of family intervention. Results: Family psychosocial intervention was associated with a reduction in relatives' guilt induction and dominance and an improvement in empathy. Empathy and lack of dominance were identified as independent mediators of the effect of family psychosocial intervention. The change in empathy and dominance during the first 9 months of the intervention predicted the outcome over the following 15 months. Conclusion: Relatives' empathy and lack of dominance are mediators of the beneficial effect of family psychosocial intervention on patients' outcomes.
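
As a hedged illustration of a mediation check of this kind (a simple Baron-and-Kenny-style regression decomposition, not necessarily the trial's actual method), using simulated data:

```python
# Sketch: "intervention -> relative's empathy -> patient outcome".
# The simulated data and effect sizes are illustrative, not the trial's.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 50
intervention = rng.integers(0, 2, n).astype(float)   # 0 = control, 1 = family intervention
empathy = 0.8 * intervention + rng.normal(0, 1, n)   # candidate mediator
outcome = 0.7 * empathy + 0.1 * intervention + rng.normal(0, 1, n)

# Total effect (c), effect on mediator (a), direct effect (c') and b with mediator.
c = sm.OLS(outcome, sm.add_constant(intervention)).fit().params[1]
a = sm.OLS(empathy, sm.add_constant(intervention)).fit().params[1]
X = sm.add_constant(np.column_stack([intervention, empathy]))
fit = sm.OLS(outcome, X).fit()
c_prime, b = fit.params[1], fit.params[2]

print(f"total={c:.2f}, a={a:.2f}, b={b:.2f}, direct={c_prime:.2f}, "
      f"indirect (a*b)={a*b:.2f}")
```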

Relevance:

100.00%

Publisher:

Abstract:

Hypertrophic cardiomyopathy (HCM) is a cardiovascular disease in which the heart muscle is partially thickened and blood flow is obstructed, potentially fatally. It is one of the leading causes of sudden cardiac death in young people. Electrocardiography (ECG) and echocardiography (Echo) are the standard tests for identifying HCM and other cardiac abnormalities. The American Heart Association has recommended using a pre-participation questionnaire for young athletes instead of ECG or Echo tests because of the cost and time involved in having an expert cardiologist interpret the results of these tests. Initially we set out to develop a classifier for automated prediction of young athletes' heart conditions based on the answers to the questionnaire. Classification results and further in-depth analysis using computational and statistical methods indicated significant shortcomings of the questionnaire in predicting cardiac abnormalities. Automated methods for analyzing ECG signals can help reduce cost and save time in the pre-participation screening process by detecting HCM and other cardiac abnormalities. Therefore, the main goal of this dissertation is to identify HCM through computational analysis of 12-lead ECG. ECG signals recorded on one or two leads have been analyzed in the past to classify individual heartbeats into different types of arrhythmia, as annotated primarily in the MIT-BIH database. In contrast, we classify complete sequences of 12-lead ECGs to assign patients to two groups: HCM vs. non-HCM. The challenges we address include missing ECG waves in one or more leads and the dimensionality of a large feature set; we address these by proposing imputation and feature-selection methods. We develop heartbeat classifiers using Random Forests and Support Vector Machines, and propose a method to classify full 12-lead ECGs based on the proportion of heartbeats classified as HCM. The results of our experiments show that the classifiers developed with our methods perform well in identifying HCM. The two contributions of this thesis are thus the use of computational and statistical methods to discover shortcomings in a current screening procedure and the development of methods to identify HCM through computational analysis of 12-lead ECG signals.
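
A sketch of the two-stage scheme under stated assumptions: a Random Forest labels individual heartbeats, and the patient is labeled HCM when the proportion of HCM beats crosses a threshold; the features, training data, and threshold are hypothetical placeholders for the ECG-derived features in the thesis.

```python
# Stage 1: per-beat classification with a Random Forest.
# Stage 2: patient-level label from the fraction of HCM-classified beats.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(4)

# Placeholder training set: rows = heartbeats, columns = extracted features.
X_train = rng.normal(size=(1000, 20))
y_train = rng.integers(0, 2, 1000)        # 1 = HCM beat, 0 = non-HCM beat

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

def classify_patient(beats: np.ndarray, threshold: float = 0.5) -> str:
    """Label a full recording from its per-beat predictions."""
    hcm_fraction = clf.predict(beats).mean()
    return "HCM" if hcm_fraction >= threshold else "non-HCM"

print(classify_patient(rng.normal(size=(120, 20))))
```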