887 results for Analysis and statistical methods
Abstract:
Assemblages of organic-walled dinoflagellate cysts (dinocysts) from 116 marine surface samples have been analysed to assess the relationship between the spatial distribution of dinocysts and modern local environmental conditions [e.g. sea surface temperature (SST), sea surface salinity (SSS), productivity] in the eastern Indian Ocean. Results from percentage analysis and statistical methods, such as multivariate ordination analysis and end-member modelling, indicate the existence of three distinct environmental and oceanographic regions in the study area. Region 1 is located in western and eastern Indonesia and is controlled by high SSTs and a low nutrient content of the surface waters. The Indonesian Throughflow (ITF) region (Region 2) is dominated by heterotrophic dinocyst species, reflecting the region's high productivity. Region 3 encompasses the area offshore north-west and west Australia, which is characterised by the water masses of the Leeuwin Current, a saline, nutrient-depleted southward current featuring energetic eddies.
Abstract:
Accurate quantitative estimation of exposure using retrospective data has been one of the most challenging tasks in the exposure assessment field. To improve these estimates, some models have been developed using published exposure databases with their corresponding exposure determinants. These models are designed to be applied to reported exposure determinants obtained from study subjects or to exposure levels assigned by an industrial hygienist, so that quantitative exposure estimates can be obtained. In an effort to improve the prediction accuracy and generalizability of these models, and considering that the limitations encountered in previous studies might be due to limitations in the applicability of traditional statistical methods and concepts, the use of computer-science-derived data analysis methods, predominantly machine learning approaches, was proposed and explored in this study. The goal of this study was to develop a set of models using decision tree/ensemble and neural network methods to predict occupational exposure outcomes based on literature-derived databases, and to compare, using cross-validation and data-splitting techniques, their prediction capacity to that of traditional regression models. Two cases were addressed: the categorical case, where the exposure level was measured as an exposure rating following the American Industrial Hygiene Association guidelines, and the continuous case, where the result of the exposure is expressed as a concentration value. Previously developed literature-based exposure databases for 1,1,1-trichloroethane, methylene dichloride, and trichloroethylene were used. When compared to regression estimations, the results showed better accuracy of decision tree/ensemble techniques for the categorical case, while neural networks were better for the estimation of continuous exposure values. Overrepresentation of classes and overfitting were the main causes of poor neural network performance and accuracy. Estimations based on literature-based databases using machine learning techniques might provide an advantage when applied to other methodologies that combine 'expert inputs' with current exposure measurements, like the Bayesian Decision Analysis tool. The use of machine learning techniques to more accurately estimate exposures from literature-based exposure databases might represent a starting point for independence from expert judgment.
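A minimal sketch of the kind of comparison described above: an ensemble classifier versus a regression-style baseline for categorical exposure ratings, evaluated by cross-validation. The data, feature count, and model settings are illustrative placeholders, not the study's actual databases or configurations.

```python
# Compare an ensemble classifier with a regression baseline for
# categorical exposure ratings, using 10-fold cross-validation.
# X stands in for literature-derived exposure determinants and
# y for AIHA-style exposure rating categories (both synthetic here).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 8))        # exposure determinants (synthetic)
y = rng.integers(0, 4, size=300)     # exposure rating categories 0-3

models = [("ensemble", RandomForestClassifier(n_estimators=200, random_state=0)),
          ("regression", LogisticRegression(max_iter=1000))]
for name, model in models:
    scores = cross_val_score(model, X, y, cv=10)
    print(f"{name}: mean CV accuracy = {scores.mean():.3f}")
```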
Abstract:
Preparative liquid chromatography is one of the most selective separation techniques in the fine chemical, pharmaceutical, and food industries. Several process concepts have been developed and applied to improve the performance of classical batch chromatography. The most powerful approaches include various single-column recycling schemes, counter-current and cross-current multi-column setups, and hybrid processes where chromatography is coupled with other unit operations such as crystallization, a chemical reactor, and/or a solvent removal unit. To fully utilize the potential of stand-alone and integrated chromatographic processes, efficient methods for selecting the best process alternative as well as optimal operating conditions are needed. In this thesis, a unified method is developed for the analysis and design of the following single-column fixed-bed processes and corresponding cross-current schemes: (1) batch chromatography, (2) batch chromatography with an integrated solvent removal unit, (3) mixed-recycle steady state recycling chromatography (SSR), and (4) mixed-recycle steady state recycling chromatography with solvent removal from the fresh feed, recycle fraction, or column feed (SSR–SR). The method is based on the equilibrium theory of chromatography with the assumption of negligible mass transfer resistance and axial dispersion. The design criteria are given in a general, dimensionless form that is formally analogous to the one applied widely in the so-called triangle theory of counter-current multi-column chromatography. Analytical design equations are derived for binary systems that follow the competitive Langmuir adsorption isotherm model. For this purpose, the existing analytic solution of the ideal model of chromatography for binary Langmuir mixtures is completed by deriving the missing explicit equations for the height and location of the pure first-component shock in the case of a small feed pulse. It is thus shown that the entire chromatographic cycle at the column outlet can be expressed in closed form. The developed design method allows prediction of the feasible range of operating parameters that lead to the desired product purities. It can be applied to calculate first estimates of optimal operating conditions, to analyse process robustness, and to evaluate different process alternatives at an early stage. The design method is used to analyse the possibility of enhancing the performance of conventional SSR chromatography by integrating it with a solvent removal unit. It is shown that the amount of fresh feed processed during a chromatographic cycle, and thus the productivity of the SSR process, can be improved by removing solvent. The maximum solvent removal capacity depends on the location of the solvent removal unit and on the physical solvent removal constraints, such as solubility, viscosity, and/or osmotic pressure limits. Usually, the most flexible option is to remove solvent from the column feed. The applicability of the equilibrium design to real, non-ideal separation problems is evaluated by means of numerical simulations. Due to the assumption of infinite column efficiency, the developed design method is most applicable to high-performance systems where thermodynamic effects are predominant, while significant deviations are observed under highly non-ideal conditions. The findings based on the equilibrium theory are applied to develop a shortcut approach for the design of chromatographic separation processes under strongly non-ideal conditions with significant dispersive effects.
The method is based on a simple procedure applied to a single conventional chromatogram. The applicability of the approach to the design of batch and counter-current simulated moving bed processes is evaluated with case studies. It is shown that the shortcut approach performs better the higher the column efficiency and the less stringent the purity constraints.
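For reference, the competitive Langmuir adsorption isotherm mentioned above has the following standard form for a binary mixture (conventional textbook notation; the symbol names are not taken from the thesis):

```latex
% Competitive Langmuir isotherm for a binary mixture (standard form):
% q_i - adsorbed-phase concentration of component i
% c_i - fluid-phase concentration, K_i - equilibrium constant
% N   - saturation capacity
q_i = \frac{N\,K_i\,c_i}{1 + K_1 c_1 + K_2 c_2}, \qquad i = 1,2
```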
Abstract:
A fast, sensitive and cost-effective multiplex-PCR assay for the identification of the Mycobacterium tuberculosis complex (MTC) and Mycobacterium avium (M. avium) in routine diagnosis was evaluated. A total of 158 mycobacterial isolates from 448 clinical specimens from patients with symptoms of mycobacterial disease were analyzed. By conventional biochemical methods, 151 isolates were identified as M. tuberculosis, five as M. avium and two as Mycobacterium chelonae (M. chelonae). Mycolic acid patterns confirmed these results. Multiplex-PCR detected IS6110 only in the isolates identified as MTC, and IS1245 was found only in the M. avium isolates. When applied to isolates from two patients that conventional methods and mycolic acid analysis had identified, one as M. avium and the other as M. chelonae, the multiplex-PCR was also positive for IS6110, suggesting co-infection with M. tuberculosis. Both patients were successfully treated for tuberculosis. The multiplex-PCR method may offer expeditious identification of MTC and M. avium, which may minimize the risk of active transmission of these organisms and provide useful treatment information.
Abstract:
The main objective of this paper is to present the results obtained from the application of artificial neural networks and statistical tools to the automatic identification and classification of faults in electric power distribution systems. The techniques developed to treat the proposed problem integrate several approaches that contribute to reliable and safe fault detection. The results compiled from practical experiments carried out on a pilot distribution feeder demonstrate that the developed techniques provide accurate results, efficiently identifying and classifying the various fault occurrences observed in the feeder. © 2006 IEEE.
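An illustrative sketch, not the authors' implementation (the abstract gives no details): a small feed-forward neural network classifying fault types from per-phase measurements. The feature choice, class labels and data are assumptions made purely for the example.

```python
# Classify distribution-feeder fault types with a small neural network.
# Features might be per-phase RMS currents and voltages; here they are
# synthetic stand-ins, as are the four hypothetical fault classes.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 6))      # e.g. 3 phase currents + 3 voltages
y = rng.integers(0, 4, size=500)   # hypothetical fault classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)
clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=1)
clf.fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```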
Abstract:
This work deals with the car sequencing (CS) problem, a combinatorial optimization problem for sequencing mixed-model assembly lines. The aim is to find a production sequence for different variants of a common base product such that work overload of the respective line operators is avoided or minimized. The variants are distinguished by certain options (e.g., sun roof yes/no) and therefore require different processing times at the stations of the line. CS introduces a so-called sequencing rule H:N for each option, which restricts the occurrence of this option to at most H in any N consecutive variants, and seeks a sequence that leads to no, or a minimum number of, sequencing rule violations (see the sketch below). In this work, the suitability of CS for workload-oriented sequencing is analyzed by comparing its solution quality experimentally with that of the related mixed-model sequencing problem. A new sequencing rule generation approach as well as a new lower bound for the problem are presented. Different exact and heuristic solution methods for CS are developed and their efficiency is shown in experiments. Furthermore, CS is adjusted and applied to a resequencing problem with pull-off tables.
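A minimal sketch of the H:N rule check described above: count the windows of N consecutive variants in which an option occurs more than H times. The boolean encoding and the per-window counting convention are assumptions for illustration; the paper's exact violation measure may differ.

```python
# Count H:N sequencing-rule violations in a production sequence.
# sequence: list of bools, True if the variant carries the option.
def count_violations(sequence, h, n):
    violations = 0
    for start in range(len(sequence) - n + 1):
        window = sequence[start:start + n]
        if sum(window) > h:           # option appears more than H times
            violations += 1
    return violations

# Example: rule 1:2 (at most 1 in any 2 consecutive cars) for a sun roof.
seq = [True, True, False, True, False, True]
print(count_violations(seq, h=1, n=2))   # -> 1 (the first window T,T)
```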
Abstract:
This paper describes a CL-SR system that employs two different techniques: the first is based on NLP rules and consists of applying logic forms to topic processing, while the second consists of applying the IR-n statistical search engine to the spoken document collection. The application of logic forms to the topics makes it possible to increase the weight of topic terms according to a set of syntactic rules. The weights of the topic terms are then used by the IR-n system in the information retrieval process.
Abstract:
We show the equivalence between the use of correspondence analysis (CA) of concatenated tables and the application of a particular version of conjoint analysis called categorical conjoint measurement (CCM). The connection is established using canonical correlation (CC). The second part introduces interaction effects in all three variants of the analysis and shows how to pass between the results of each analysis.
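A sketch of the kind of computation involved, assumed for illustration rather than taken from the paper: canonical correlation between two indicator-coded (dummy) categorical tables, the coding under which CA of concatenated tables and CCM can be related.

```python
# Canonical correlation between two dummy-coded categorical factors.
# The factors and levels are invented; drop_first avoids collinear columns.
import numpy as np
import pandas as pd
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(2)
a = pd.get_dummies(pd.Series(rng.integers(0, 3, 100)), prefix="A", drop_first=True)
b = pd.get_dummies(pd.Series(rng.integers(0, 4, 100)), prefix="B", drop_first=True)

cca = CCA(n_components=2)
xa, xb = cca.fit_transform(a.to_numpy(dtype=float), b.to_numpy(dtype=float))
print(np.corrcoef(xa[:, 0], xb[:, 0])[0, 1])   # first canonical correlation
```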
Abstract:
The strongest wish of the customer concerning chemical pulp features is consistent, uniform quality. Variation may be controlled and reduced by using statistical methods. However, studies addressing the application and benefits of statistical methods in the forest products sector are scarce. This customer wish is thus the root motivation behind this dissertation. The research problem addressed is that companies in the chemical forest products sector require new knowledge to improve their utilization of statistical methods. To gain this new knowledge, the research problem is studied from five complementary viewpoints: challenges and success factors, organizational learning, problem solving, economic benefit, and statistical methods as management tools. The five research questions generated on the basis of these viewpoints are answered in four research papers, which are case studies based on empirical data collection. This research as a whole complements the literature dealing with the use of statistical methods in the forest products industry. Practical examples of the application of statistical process control, case-based reasoning, the cross-industry standard process for data mining, and performance measurement methods in the context of chemical forest products manufacturing are brought to the attention of the scientific community, and the benefit of applying these methods is estimated or demonstrated. The purpose of this dissertation is to find pragmatic ideas that help companies in the chemical forest products sector improve their utilization of statistical methods. The main practical implications can be summarized in four points: (1) it is beneficial to reduce variation in chemical forest product manufacturing processes; (2) statistical tools can be used to reduce this variation; (3) problem solving in these processes can be intensified through the use of statistical methods; and (4) certain success factors and challenges need to be addressed when implementing statistical methods.
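A minimal sketch of the statistical process control idea mentioned above: a Shewhart individuals chart with 3-sigma limits estimated from the moving range. The quality variable and numbers are invented for illustration.

```python
# Shewhart individuals chart: flag measurements outside 3-sigma limits.
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(loc=88.0, scale=0.5, size=50)   # e.g. pulp brightness values

center = x.mean()
sigma = np.abs(np.diff(x)).mean() / 1.128      # moving-range estimate (d2 = 1.128)
ucl, lcl = center + 3 * sigma, center - 3 * sigma
out = np.where((x > ucl) | (x < lcl))[0]
print(f"CL={center:.2f}, UCL={ucl:.2f}, LCL={lcl:.2f}, out-of-control: {out}")
```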
Abstract:
We discuss statistical inference problems associated with identification and testability in econometrics, and we emphasize the common nature of the two issues. After reviewing the relevant statistical notions, we consider in turn inference in nonparametric models and recent developments on weakly identified models (or weak instruments). We point out that many hypotheses, for which test procedures are commonly proposed, are not testable at all, while some frequently used econometric methods are fundamentally inappropriate for the models considered. Such situations lead to ill-defined statistical problems and are often associated with a misguided use of asymptotic distributional results. Concerning nonparametric hypotheses, we discuss three basic problems for which such difficulties occur: (1) testing a mean (or a moment) under (too) weak distributional assumptions; (2) inference under heteroskedasticity of unknown form; (3) inference in dynamic models with an unlimited number of parameters. Concerning weakly identified models, we stress that valid inference should be based on proper pivotal functions (a condition not satisfied by standard Wald-type methods based on standard errors), and we discuss recent developments in this field, mainly from the viewpoint of building valid tests and confidence sets. The techniques discussed include alternative proposed statistics, bounds, projection, split-sampling, conditioning, and Monte Carlo tests. The possibility of deriving a finite-sample distributional theory, robustness to the presence of weak instruments, and robustness to the specification of a model for endogenous explanatory variables are stressed as important criteria for assessing alternative procedures.
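A minimal sketch of a Monte Carlo test, the last technique in the list above, in its simplest setting (a fully specified null distribution, which is an assumption of this example): the p-value is the rank of the observed statistic among statistics simulated under the null, rather than an asymptotic approximation.

```python
# Monte Carlo test: p-value from the rank of the observed statistic
# among statistics simulated under the null hypothesis.
import numpy as np

def t_stat(sample):
    """Absolute t-type statistic for H0: mean = 0."""
    return abs(sample.mean()) / (sample.std(ddof=1) / np.sqrt(len(sample)))

def monte_carlo_pvalue(stat_obs, simulate_stat, n_rep=999, seed=0):
    rng = np.random.default_rng(seed)
    sims = np.array([simulate_stat(rng) for _ in range(n_rep)])
    return (1 + np.sum(sims >= stat_obs)) / (n_rep + 1)

x = np.random.default_rng(4).normal(0.4, 1.0, 20)      # observed data
p = monte_carlo_pvalue(t_stat(x), lambda rng: t_stat(rng.normal(0.0, 1.0, 20)))
print(f"Monte Carlo p-value: {p:.3f}")
```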
Abstract:
This paper presents the development and evaluation of a method for enabling quantitative and automatic scoring of alternating tapping performance of patients with Parkinson's disease (PD). Ten healthy elderly subjects and 95 patients in different clinical stages of PD used a touch-pad handheld computer to perform alternate tapping tests in their home environments. First, a neurologist used a web-based system to visually assess impairments in four tapping dimensions ('speed', 'accuracy', 'fatigue' and 'arrhythmia') and a global tapping severity (GTS). Second, tapping signals were processed with time series analysis and statistical methods to derive 24 quantitative parameters. Third, principal component analysis was used to reduce the dimensions of these parameters and to obtain scores for the four dimensions. Finally, a logistic regression classifier was trained using 10-fold stratified cross-validation to map the reduced parameters to the corresponding visually assessed GTS scores. Results showed that the computed scores correlated well with the visually assessed scores and were significantly different across Unified Parkinson's Disease Rating Scale scores of upper limb motor performance. In addition, they had good internal consistency, good ability to discriminate between healthy elderly subjects and patients in different disease stages, good sensitivity to treatment interventions, and could reflect the natural disease progression over time. In conclusion, the automatic method can be useful for objectively assessing the tapping performance of PD patients and can be included in telemedicine tools for remote monitoring of tapping.
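A minimal sketch of the pipeline described above: PCA for dimension reduction followed by a logistic regression classifier under 10-fold stratified cross-validation. The synthetic data, component count and scaling step are assumptions for illustration, not the paper's actual parameters.

```python
# PCA + logistic regression pipeline with stratified 10-fold CV.
# X stands in for the 24 quantitative tapping parameters, y for the
# visually assessed GTS scores (both synthetic here).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
X = rng.normal(size=(105, 24))      # 24 tapping parameters per test
y = rng.integers(0, 3, size=105)    # GTS scores (illustrative classes)

pipe = make_pipeline(StandardScaler(), PCA(n_components=4),
                     LogisticRegression(max_iter=1000))
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=5)
print("CV accuracy:", cross_val_score(pipe, X, y, cv=cv).mean())
```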
Abstract:
In this article, we will link neuroimaging, data analysis, and intervention methods in an important psychiatric condition: auditory verbal hallucinations (AVH). The clinical and phenomenological background as well as neurophysiological findings will be covered and discussed with respect to noninvasive brain stimulation. Additionally, methods of noninvasive brain stimulation will be presented as ways to intervene with AVH. Finally, preliminary conclusions and possible future perspectives will be proposed.
Abstract:
The Compact Muon Solenoid (CMS) detector is described. The detector operates at the Large Hadron Collider (LHC) at CERN. It was conceived to study proton-proton (and lead-lead) collisions at a centre-of-mass energy of 14 TeV (5.5 TeV nucleon-nucleon) and at luminosities up to 10^34 cm^-2 s^-1 (10^27 cm^-2 s^-1). At the core of the CMS detector sits a high-magnetic-field and large-bore superconducting solenoid surrounding an all-silicon pixel and strip tracker, a lead-tungstate scintillating-crystals electromagnetic calorimeter, and a brass-scintillator sampling hadron calorimeter. The iron yoke of the flux return is instrumented with four stations of muon detectors covering most of the 4π solid angle. Forward sampling calorimeters extend the pseudorapidity coverage to high values (|η| ≤ 5), assuring very good hermeticity. The overall dimensions of the CMS detector are a length of 21.6 m, a diameter of 14.6 m and a total weight of 12500 t.
Abstract:
The accumulation of solid material in reservoirs, river channels and coastal areas makes the mechanical extraction of these materials by suction increasingly common, so it is important to study the performance of the suction of these materials by analysing the shape of the nozzles and the flow parameters, including the pump. This thesis studies, using experimental equipment, the effectiveness of different solids-extraction devices (nozzles of various shapes combined with a variable-speed pump). The experimental rig was developed at the Hydraulics Laboratory of the E.T.S.I. de Caminos, C. y P. of the Universidad Politécnica de Madrid. It includes a submerged bed of different types of sediment, solids-extraction nozzles and a variable-speed pump, as well as an element for separating the water from the extracted solids. The basic parameters analysed are the total liquid flow pumped, the solid flow extracted, the diameter of the suction pipe, the shape and cross-section of the extraction nozzle, and the speed, efficiency and energy consumption of the variable-speed pump. The measurements obtained on the experimental device have been studied by means of dimensional analysis and statistical methods. From this study a new formulation has been developed, which relates the solid flow extracted to the pipe and nozzle diameters, the pumped liquid flow and the rotational speed of the pump.
Also, from a practical point of view, the influence of the nozzle shape on the solids-extraction capacity was analysed with all other parameters held equal, so that the most appropriate nozzle shape can be recommended.
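Purely as an illustration of the dimensional-analysis approach (the actual formulation is given in the thesis and is not reproduced in the abstract), the variables named above can be grouped into dimensionless numbers of the form:

```latex
% Hypothetical dimensionless grouping of the abstract's variables:
% Q_s - extracted solid flow, Q_l - pumped liquid flow,
% D_n - nozzle diameter, D_p - suction-pipe diameter, n - pump speed.
% Q_l / (n D_p^3) is the usual pump flow coefficient.
\frac{Q_s}{Q_l} = f\!\left(\frac{D_n}{D_p},\; \frac{Q_l}{n\,D_p^{3}}\right)
```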