926 results for Multivariate Statistical Process Monitoring
Abstract:
In this paper we propose the Double Sampling X̄ control chart for monitoring processes in which the observations follow a first-order autoregressive model. We consider sampling intervals that are sufficiently long to meet the rational-subgroup concept. The Double Sampling X̄ chart is substantially more efficient than the Shewhart chart and the Variable Sample Size chart. To study the properties of these charts, we derived closed-form expressions for the average run length (ARL) that take the within-subgroup correlation into account. Numerical results show that this correlation has a significant impact on the chart properties.
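As a rough illustration of how within-subgroup correlation enters an ARL calculation, the sketch below evaluates a classical Shewhart X̄ chart whose limits are designed under the independence assumption while the subgroup observations actually follow an AR(1) model. (The Double Sampling chart itself requires a more involved two-stage probability calculation; all numerical values here are hypothetical.)

```python
from math import erf, sqrt

def norm_cdf(z):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def xbar_var_ar1(n, phi, sigma2=1.0):
    # Var(X-bar) for n consecutive AR(1) observations with parameter phi:
    # (sigma2/n) * [1 + (2/n) * sum_{k=1}^{n-1} (n-k) * phi^k]
    s = sum((n - k) * phi ** k for k in range(1, n))
    return sigma2 / n * (1.0 + 2.0 * s / n)

def shewhart_arl(shift, n, phi, L=3.0, sigma2=1.0):
    # Limits placed at +-L*sigma/sqrt(n), i.e. designed assuming
    # independence, but the chart statistic has the AR(1) variance.
    se = sqrt(xbar_var_ar1(n, phi, sigma2))
    lim = L * sqrt(sigma2 / n)
    p = norm_cdf((-lim - shift) / se) + 1.0 - norm_cdf((lim - shift) / se)
    return 1.0 / p  # ARL of a memoryless chart is 1 / P(signal)
```

With phi = 0 and L = 3 this recovers the textbook in-control ARL of about 370; positive within-subgroup correlation inflates Var(X̄) and shortens the in-control ARL, which is the kind of effect the closed-form expressions in the paper quantify.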
Abstract:
This paper presents a new method to estimate hole diameters and surface roughness in precision drilling processes, using coupons taken from a sandwich plate composed of a titanium alloy plate (Ti6Al4V) glued onto an aluminum alloy plate (AA 2024T3). The proposed method uses signals acquired during the cutting process by a multisensor system installed on the machine tool. These signals are mathematically treated and then used as input for an artificial neural network. After training, the neural network system is able to estimate the surface roughness and hole diameter based on the signals and cutting process parameters. To evaluate the system, the estimated data were compared with experimental measurements and the errors were calculated. The results demonstrated the effectiveness of the proposed method, which yielded errors that were very low or even negligible relative to the tolerances used in most industrial drilling processes. This pioneering method opens up a new field of research, showing promising potential for development and application as an alternative monitoring method for drilling processes. © 2012 Springer-Verlag London Limited.
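The core regression step can be sketched with a small feed-forward network trained by gradient descent. Everything below is invented for illustration (the feature vectors stand in for processed multisensor signals plus cutting parameters, and the targets for scaled roughness/diameter measurements); the paper's actual network architecture and training data are not specified here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training set: six features per hole (e.g. statistics of the
# treated sensor signals plus cutting parameters); two scaled targets
# (surface roughness, hole diameter) generated from an invented mapping.
X = rng.normal(size=(200, 6))
true_W = rng.normal(size=(6, 2))
Y = np.tanh(X @ true_W) + 0.01 * rng.normal(size=(200, 2))

# One hidden layer (tanh), trained by full-batch gradient descent
W1 = 0.1 * rng.normal(size=(6, 8)); b1 = np.zeros(8)
W2 = 0.1 * rng.normal(size=(8, 2)); b2 = np.zeros(2)
lr = 0.1
for _ in range(3000):
    H = np.tanh(X @ W1 + b1)        # hidden activations
    P = H @ W2 + b2                 # predicted [roughness, diameter]
    E = P - Y                       # residuals
    gW2 = H.T @ E / len(X); gb2 = E.mean(axis=0)
    dH = (E @ W2.T) * (1.0 - H**2)  # backprop through tanh
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

P = np.tanh(X @ W1 + b1) @ W2 + b2
mse = float(((P - Y) ** 2).mean())  # training error after fitting
```

In the paper's setting the trained network would then be evaluated against independent experimental measurements rather than the training set shown here.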
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
The present work uses multivariate statistical analysis to establish the main sources of error in Quantitative Phase Analysis (QPA) using the Rietveld method. The quantitative determination of crystalline phases by X-ray powder diffraction is a complex measurement process whose results are influenced by several factors. Ternary mixtures of Al2O3, MgO and NiO were prepared under controlled conditions and the diffraction patterns were obtained using the Bragg-Brentano geometry. Four critical sources of variation were established: the experimental absorption and the scale factor of NiO, the phase with the greatest linear absorption coefficient in the ternary mixture; the instrumental characteristics, represented by mechanical errors of the goniometer and sample displacement; the other two phases (Al2O3 and MgO); and the temperature and relative humidity of the air in the laboratory. These error sources can severely impair QPA with the Rietveld method, so they must be controlled during the measurement procedure.
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
Graduate Program in Agronomy (Soil Science) - FCAV
Abstract:
The polychaete assemblage structure was used to investigate taxonomic sufficiency in a heavily polluted tropical bay. Species abundances were aggregated into progressively higher-taxon matrices (genus, family, order) and analyzed using univariate and multivariate techniques. Polychaete distribution in Guanabara Bay (GB) followed a pollution gradient, probably governed by organic enrichment and the consequent effects of hypoxia and altered redox conditions, coupled with the prevailing circulation patterns. Within the sectors of GB, an increasing gradient in species richness and occurrence was observed, ranging from azoic and impoverished stations in the inner sector to a community well structured in terms of species composition and abundance in the outer sector. Multivariate statistical analysis showed similar results when species were aggregated into genera and families, while greater differences occurred at the coarsest taxonomic level (order). The literature on taxonomic sufficiency has shown that faunal patterns at different taxonomic levels tend to become more similar with increasing pollution. In GB, an analysis carried out solely at the family level is perfectly adequate to describe the environmental gradient, making it a useful tool for rapid environmental assessment. (C) 2011 Elsevier Ltd. All rights reserved.
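The aggregation-and-compare step can be sketched as follows. The abundance matrix and the species-to-family mapping are invented, and the comparison uses Bray-Curtis dissimilarities, a common choice for abundance data in this literature (the abstract does not name the paper's exact resemblance measure).

```python
import numpy as np

# Hypothetical abundances: 4 stations (inner -> outer) x 6 species,
# with an invented species-to-family mapping
abund = np.array([
    [0,  0, 1, 0, 0, 0],   # azoic/impoverished inner station
    [2,  1, 1, 0, 0, 0],
    [5,  4, 3, 2, 2, 1],
    [10, 8, 6, 5, 4, 3],   # well-structured outer station
], dtype=float)
family = ["FamA", "FamA", "FamB", "FamB", "FamC", "FamC"]

def aggregate(mat, groups):
    # Sum abundance columns that share the same higher taxon
    keys = sorted(set(groups))
    cols = [[i for i, g in enumerate(groups) if g == k] for k in keys]
    return np.stack([mat[:, c].sum(axis=1) for c in cols], axis=1)

def bray_curtis_upper(mat):
    # Upper-triangle vector of pairwise Bray-Curtis dissimilarities
    n = len(mat)
    return np.array([np.abs(mat[i] - mat[j]).sum() / (mat[i] + mat[j]).sum()
                     for i in range(n) for j in range(i + 1, n)])

d_species = bray_curtis_upper(abund)
d_family = bray_curtis_upper(aggregate(abund, family))
# Mantel-style agreement between species- and family-level patterns
agreement = float(np.corrcoef(d_species, d_family)[0, 1])
```

A high correlation between the two distance structures is what "similar results when species were aggregated into families" means operationally: the among-station pattern survives the loss of taxonomic resolution.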
Abstract:
Most butterfly monitoring protocols rely on counts along transects (Pollard walks) to generate species abundance indices and track population trends. It is still too often ignored that a population count results from two processes: the biological process (true abundance) and the statistical process (our ability to properly quantify abundance). Because individual detectability tends to vary in space (e.g., among sites) and time (e.g., among years), it remains unclear whether index counts truly reflect population sizes and trends. This study compares capture-mark-recapture (absolute abundance) and count-index (relative abundance) monitoring methods in three species (Maculinea nausithous and Iolana iolas: Lycaenidae; Minois dryas: Satyridae) in contrasting habitat types. We demonstrate that intraspecific variability in individual detectability under standard monitoring conditions is probably the rule rather than the exception, which calls into question the reliability of count-based indices for estimating and comparing population abundances. Our results suggest that the accuracy of count-based methods depends heavily on the ecology and behavior of the target species, as well as on the type of habitat in which surveys take place. Monitoring programs designed to assess the abundance and trends of butterfly populations should incorporate a measure of detectability. We discuss the relative advantages and drawbacks of current monitoring methods and analytical approaches with respect to the characteristics of the species under scrutiny and resource availability.
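As a concrete example of separating the biological from the statistical process, the snippet below uses the classic two-sample capture-mark-recapture estimator (Chapman's bias-corrected Lincoln-Petersen form) to turn a raw count into an absolute abundance estimate and a detection probability. The counts are invented, not taken from the study.

```python
def chapman_estimate(n1, n2, m2):
    """Chapman's bias-corrected Lincoln-Petersen abundance estimate.

    n1: individuals captured and marked on the first occasion
    n2: individuals captured on the second occasion
    m2: marked individuals among the n2 second-occasion captures
    """
    return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

# Hypothetical survey: 50 marked, 40 captured later, 20 of them marked
n_hat = chapman_estimate(50, 40, 20)
# Detectability implied by a single index count of 40 individuals
p_hat = 40 / n_hat
```

Here an index count of 40 corresponds to roughly 41% detectability. If p varies among sites, years, or species, equal counts no longer imply equal abundances, which is the paper's central caution about count-based indices.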
Abstract:
Detector uniformity is a fundamental performance characteristic of all modern gamma camera systems, and ensuring a stable, uniform detector response is critical for maintaining clinical images that are free of artifacts. For these reasons, the assessment of detector uniformity is one of the most common activities in a successful clinical quality assurance program for gamma camera imaging. The evaluation of this parameter, however, is often unclear because it is highly dependent on acquisition conditions, reviewer expertise, and the application of somewhat arbitrary limits that do not characterize the spatial location of the non-uniformities. Furthermore, as the goal of any robust quality control program is the determination of significant deviations from standard or baseline conditions, clinicians and vendors often neglect the temporal nature of detector degradation (1). This thesis describes the development and testing of new methods for monitoring detector uniformity. These techniques provide more quantitative, sensitive, and specific feedback to reviewers so that they may be better equipped to identify performance degradation before it manifests in clinical images. The methods exploit the temporal nature of detector degradation and spatially segment distinct regions of non-uniformity using multi-resolution decomposition. These techniques were tested on synthetic phantom data using different degradation functions, as well as on experimentally acquired time series of flood images with induced, progressively worsening defects present within the field of view. The sensitivity of conventional, global figures of merit for detecting changes in uniformity was evaluated and compared to that of these new image-space techniques. The image-space algorithms provide a reproducible means of detecting regions of non-uniformity before any single flood image has a NEMA uniformity value in excess of 5%.
The sensitivity of these image-space algorithms was found to depend on the size and magnitude of the non-uniformities, as well as on the cause of the non-uniform region. A trend analysis of the conventional figures of merit demonstrated their sensitivity to shifts in detector uniformity. Because the image-space algorithms are computationally efficient, they should be used alongside the trending of the global figures of merit to give the reviewer a richer assessment of gamma camera detector uniformity characteristics.
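For reference, the conventional global figure of merit discussed above can be computed as follows: NEMA integral uniformity is the max-min contrast of the flood image after the standard 9-point smoothing. This is a simplified sketch (it ignores the NEMA field-of-view masking and pixel-count requirements), and the flood images are synthetic.

```python
import numpy as np

def nema_integral_uniformity(flood):
    # NEMA NU-1 9-point smoothing kernel
    k = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], dtype=float) / 16.0
    p = np.pad(flood.astype(float), 1, mode="edge")
    rows, cols = flood.shape
    # Smooth by summing kernel-weighted shifted copies of the image
    sm = sum(k[i, j] * p[i:i + rows, j:j + cols]
             for i in range(3) for j in range(3))
    # Integral uniformity: percent max-min contrast of the smoothed flood
    return 100.0 * (sm.max() - sm.min()) / (sm.max() + sm.min())

flood = np.full((32, 32), 1000.0)   # ideal, perfectly uniform flood
defect = flood.copy()
defect[16, 16] = 800.0              # single induced cold pixel
```

A global number like this says nothing about *where* the defect sits or how it evolves over a time series, which is exactly the gap the thesis's image-space, multi-resolution methods are meant to fill.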
Abstract:
Deep hole drilling is one of the most complicated metal cutting processes and one of the most difficult to perform on CNC machine tools or machining centres under conditions of limited manpower or unmanned operation. This research work investigates aspects of the deep hole drilling process with small-diameter twist drills and presents a prototype system for real-time process monitoring and adaptive control; two main research objectives are fulfilled. The first is the experimental investigation of the mechanics of the deep hole drilling process, using twist drills without internal coolant supply, in the range of diameters Ø2.4 to Ø4.5 mm and working lengths up to 40 diameters. This includes the definition of the problems associated with the low strength of these tools and the study of the mechanisms of catastrophic failure, which manifest themselves well before and alongside the classic mechanism of tool wear. The relationships of drilling thrust and torque with the depth of penetration and the various machining conditions are also investigated, and the experimental evidence suggests that the process is inherently unstable at depths beyond a few diameters. The second objective is the design and implementation of a system for intelligent CNC deep hole drilling, whose main task is to ensure the integrity of the process and the safety of the tool and the workpiece. This task is achieved by interfacing the CNC system of the machine tool to an external computer which performs the following functions: on-line monitoring of the drilling thrust and torque; adaptive control of feed rate, spindle speed and tool penetration (Z-axis); indirect monitoring of tool wear by pattern recognition of variations of the drilling thrust with cumulative cutting time and drilled depth; operation as a database for tools and workpieces; and, finally, issuing of alarms and diagnostic messages.
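A highly simplified sketch of one cycle of such supervisory logic is shown below. The thresholds, units, scaling factor, and action names are all invented for illustration; the actual system runs against live thrust and torque signals through the machine's CNC interface.

```python
def supervise(thrust_N, torque_Nm, feed_mm_rev,
              thrust_limit=120.0, torque_limit=2.5, min_feed=0.01):
    """One cycle of a hypothetical adaptive-control loop.

    Returns (new_feed, action). A torque spike triggers a retract
    (a pecking cycle to clear chips before the drill snaps); excessive
    thrust reduces the feed rate; otherwise the current feed is held.
    """
    if torque_Nm > torque_limit:
        return 0.0, "retract"
    if thrust_N > thrust_limit:
        return max(min_feed, feed_mm_rev * 0.8), "reduce"
    return feed_mm_rev, "hold"
```

In a real implementation this function would be polled at the sensor sampling rate, and the thrust history would also feed the tool-wear pattern-recognition function mentioned in the abstract.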
Abstract:
Water quality management programs are a necessary and inevitable means of preserving and sustainably using water resources. One of the important issues in determining the quality of river water is the design of effective monitoring networks, so that the quality variables measured at the stations are, as far as possible, indicative of overall changes in water quality. One way to achieve this is to increase the number of monitoring stations and sampling instances; since this dramatically increases the annual cost of monitoring, deciding which stations and parameters are most important, and which sampling frequencies capture the greatest change in the system under study, can inform future decisions on optimizing the existing monitoring network, adding or removing stations or parameters, and increasing or decreasing sampling frequency. To this end, this thesis studied the efficiency of multivariate statistical procedures. Given their features, multivariate statistical procedures can serve as a practical and useful method for recognizing and analyzing river pollution and, consequently, for understanding, reasoning about, controlling, and making correct decisions in water quality management. This research applied multivariate statistical techniques to analyze water quality and monitor the variables affecting it in the Gharasou river, in Ardabil province in northwest Iran. Over one year, 28 physical and chemical parameters were sampled at 11 stations. The results of these measurements were analyzed by multivariate procedures such as Cluster Analysis (CA), Principal Component Analysis (PCA), Factor Analysis (FA), and Discriminant Analysis (DA).
Based on the findings from cluster analysis, principal component analysis, and factor analysis, the stations were divided into three groups: highly polluted (HP), moderately polluted (MP), and less polluted (LP). This study thus illustrates the usefulness of multivariate statistical techniques for the analysis and interpretation of complex data sets in water quality assessment, the identification of pollution sources and factors, and the understanding of spatial variations in water quality for effective river water quality management. It also shows the effectiveness of these techniques for obtaining better information about water quality and for designing monitoring networks for the effective management of water resources. Based on these results, a water quality monitoring program for the Gharasou river was developed and presented.
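A minimal numpy sketch of the PCA step and the three-way station grouping is given below. The station and parameter counts follow the thesis (11 stations, 28 parameters), but the data are simulated with a built-in pollution gradient; the real analysis would start from the measured, standardized parameter matrix.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated data: 11 stations x 28 quality parameters, driven mostly by
# a single pollution gradient plus noise, then standardized per column
gradient = np.linspace(-2.0, 2.0, 11)[:, None]
loadings = rng.normal(size=(1, 28))
X = gradient * loadings + 0.3 * rng.normal(size=(11, 28))
X = (X - X.mean(axis=0)) / X.std(axis=0)

# PCA via SVD of the centered data matrix
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s                       # station scores on each PC
explained = s**2 / (s**2).sum()      # proportion of variance per PC

# Rank stations along PC1 and split into three pollution classes
# (the sign of a PC is arbitrary, so the LP/HP labels may need flipping)
order = np.argsort(scores[:, 0])
groups = {"LP": order[:4], "MP": order[4:7], "HP": order[7:]}
```

When one gradient dominates, PC1 absorbs most of the variance and ordering stations by their PC1 score recovers the pollution ranking, which is essentially what the HP/MP/LP grouping in the thesis expresses.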
Abstract:
The antioxidant activity of natural and synthetic compounds was evaluated using five in vitro methods: ferric reducing/antioxidant power (FRAP), 2,2-diphenyl-1-picrylhydrazyl (DPPH), oxygen radical absorbance capacity (ORAC), oxidation of an aqueous dispersion of linoleic acid accelerated by azo-initiators (LAOX), and oxidation of a meat homogenate submitted to a thermal treatment (TBARS). All results were expressed as Trolox equivalents. The application of multivariate statistical techniques suggested that the phenolic compounds (caffeic acid, carnosic acid, genistein and resveratrol), in addition to their high antioxidant activity measured by the DPPH, FRAP and TBARS methods, showed the highest ability to react with the radicals in the ORAC methodology, compared to the other compounds evaluated in this study (ascorbic acid, erythorbate, tocopherol, BHT, Trolox, tryptophan, citric acid, EDTA, glutathione, lecithin, methionine and tyrosine). This property was significantly correlated with the number of phenolic rings and the catecholic structure present in the molecule. Based on the multivariate analysis, it is possible to select compounds from different clusters and explore the interactions of their antioxidant activities in food products.
Abstract:
Objective: The aim of this article is to propose an integrated framework for extracting and describing patterns of disorders from medical images using a combination of linear discriminant analysis and active contour models. Methods: A multivariate statistical methodology was first used to identify the most discriminating hyperplane separating the two groups of images (healthy controls and patients with schizophrenia) contained in the input data. The present work then makes the differences found by the multivariate statistical method explicit by subtracting the discriminant models of controls and patients, weighted by the pooled variance between the two groups. A variational level-set technique was used to segment clusters of these differences, and each anatomical change was labelled using the Talairach atlas. Results: In this work all the data were analysed simultaneously rather than assuming a priori regions of interest. As a consequence, by using active contour models, we were able to obtain regions of interest that were emergent from the data. The results were evaluated using, as a gold standard, well-known facts about the neuroanatomical changes related to schizophrenia. Most of the items in the gold standard were covered by our result set. Conclusions: We argue that this investigation provides a suitable framework for characterising the high complexity of magnetic resonance images in schizophrenia, as the results obtained indicate a high sensitivity rate with respect to the gold standard. (C) 2010 Elsevier B.V. All rights reserved.
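The discriminant step can be sketched with Fisher's classical two-class formulation: the most discriminating direction is the pooled-covariance-whitened difference of group means, and the variance-weighted difference map corresponds to the subtraction step described above. The five-dimensional synthetic features below stand in for image data; real inputs would be registered voxel data after dimensionality reduction.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic stand-ins for per-subject image feature vectors
controls = rng.normal(loc=0.0, scale=1.0, size=(40, 5))
patients = rng.normal(loc=0.8, scale=1.0, size=(40, 5))

mu_c, mu_p = controls.mean(axis=0), patients.mean(axis=0)
S_pooled = 0.5 * (np.cov(controls, rowvar=False)
                  + np.cov(patients, rowvar=False))

# Fisher's most discriminating direction (normal of the LDA hyperplane)
w = np.linalg.solve(S_pooled, mu_p - mu_c)

# Group-difference map weighted by the pooled variance; for images this
# would be a voxel-wise map, here it is just a 5-vector
diff_map = (mu_p - mu_c) / np.sqrt(np.diag(S_pooled))
```

In the paper's pipeline, the analogue of `diff_map` (computed over voxels) is then handed to the level-set segmentation, so the regions of interest emerge from the data instead of being fixed a priori.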
Abstract:
Innovation has an undeniable positive effect on both firms and the economy, particularly on the financial performance of firms. However, the decision-making process for allocating resources to finance innovation plays an important role. The aim of this paper is to understand what factors explain the decision-making process in the innovation activities of Portuguese firms. This is an empirical study, based on modern theoretical approaches, which relies on five key aspects of innovation: barriers, sources, cooperation, funding, and the decision-making process. Primary data were collected through surveys of firms that had applied for innovation programmes within the Portuguese innovation agency. Univariate and multivariate statistical techniques were used. Our results suggest that the factors that most influence Portuguese firms' innovation decision-making processes are economic and financial (namely, those related to profit increase and labour cost reduction).
Abstract:
Dissertation submitted for the degree of Doctor in Chemical Engineering, specialty in Biochemical Engineering