914 results for Dynamic data analysis


Relevance: 90.00%

Abstract:

The paper presents contemporary approaches to spatial environmental data analysis. The main topics concentrate on decision-oriented problems of environmental spatial data mining and modeling: valorization and representativity of data with the help of exploratory data analysis, spatial predictions, probabilistic and risk mapping, and the development and application of conditional stochastic simulation models. The innovative part of the paper presents an integrated/hybrid model: machine learning (ML) residuals sequential simulations (MLRSS). The models are based on multilayer perceptron and support vector regression ML algorithms used for modeling long-range spatial trends, combined with sequential simulations of the residuals. ML algorithms deliver non-linear solutions for spatially non-stationary problems, which are difficult for the geostatistical approach. Geostatistical tools (variography) are used to characterize the performance of the ML algorithms by analyzing the quality and quantity of the spatially structured information they extract from the data. Sequential simulations provide an efficient assessment of uncertainty and spatial variability. A case study of the Chernobyl fallout illustrates the performance of the proposed model. It is shown that probability mapping, provided by the combination of ML data-driven and geostatistical model-based approaches, can be used efficiently in the decision-making process. (C) 2003 Elsevier Ltd. All rights reserved.
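The hybrid MLRSS idea can be reduced to a minimal sketch: fit a trend model, compute its residuals, and simulate those residuals to produce an ensemble of equally plausible realizations. In the sketch below (all values synthetic) a least-squares line stands in for the paper's MLP/SVR trend learners, and simple bootstrap resampling stands in for sequential simulation:

```python
import random
import statistics

random.seed(0)

# Synthetic 1-D "spatial" data: a smooth long-range trend plus noise.
xs = [i / 99 for i in range(100)]
data = [2.0 + 3.0 * x + random.gauss(0, 0.5) for x in xs]

# Step 1: model the long-range trend (the paper uses MLP/SVR;
# a least-squares line is the stand-in here).
n = len(xs)
mx, my = sum(xs) / n, sum(data) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, data)) / sum(
    (x - mx) ** 2 for x in xs
)
intercept = my - slope * mx
trend = [intercept + slope * x for x in xs]

# Step 2: the residuals carry the remaining short-range structure.
residuals = [y - t for y, t in zip(data, trend)]

# Step 3: simulate residual fields (bootstrap resampling as a stand-in
# for sequential simulation) and add them back onto the trend.
realizations = [[t + random.choice(residuals) for t in trend] for _ in range(200)]

# Step 4: per-location uncertainty from the ensemble (5th/95th percentile).
loc = 50
vals = sorted(r[loc] for r in realizations)
p05, p95 = vals[10], vals[189]
print(round(p05, 2), round(p95, 2))
```

The per-location spread of the ensemble is what drives the probabilistic and risk maps the abstract describes.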

Relevance: 90.00%

Abstract:

Linezolid is used off-label to treat multidrug-resistant tuberculosis (MDR-TB) in the absence of systematic evidence. We performed a systematic review and meta-analysis of the efficacy, safety and tolerability of linezolid-containing regimens based on individual data analysis. 12 studies (11 countries from three continents) reporting complete information on the safety, tolerability and efficacy of linezolid-containing regimens in treating MDR-TB cases were identified following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. Meta-analysis was performed using the individual data of 121 patients with a definite treatment outcome (cure, completion, death or failure). Most MDR-TB cases achieved sputum smear (86 (92.5%) out of 93) and culture (100 (93.5%) out of 107) conversion after treatment with individualised regimens containing linezolid (median (interquartile range) times to smear and culture conversion were 43.5 (21-90) and 61 (29-119) days, respectively), and 99 (81.8%) out of 121 patients were successfully treated. No significant differences were detected in the subgroup efficacy analysis (daily linezolid dosage ≤600 mg versus >600 mg). Adverse events were observed in 63 (58.9%) out of 107 patients, of which 54 (68.4%) out of 79 were major adverse events, including anaemia (38.1%), peripheral neuropathy (47.1%), gastro-intestinal disorders (16.7%), optic neuritis (13.2%) and thrombocytopenia (11.8%). The proportion of adverse events was significantly higher when the linezolid daily dosage exceeded 600 mg. The results suggest excellent efficacy but also the need for caution in prescribing linezolid.
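For a rough sense of the headline figure, the pooled treatment-success proportion and a simple Wald 95% confidence interval can be computed directly from the counts in the abstract (the paper's actual meta-analytic pooling is more sophisticated than this sketch):

```python
import math

# Figures from the abstract: 99 of 121 patients successfully treated.
successes, total = 99, 121
p = successes / total

# Wald 95% confidence interval for a proportion: p +/- 1.96 * SE.
se = math.sqrt(p * (1 - p) / total)
lo, hi = p - 1.96 * se, p + 1.96 * se
print(round(p, 3), round(lo, 3), round(hi, 3))
```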

Relevance: 90.00%

Abstract:

This paper analyses the financial impact of the enlargement of the European Union (EU) to include 10 new Central and Eastern European Nations (CEEN) on firms' business and financial structures. To this end, we employ quantitative analytic techniques and financial ratios. In this context, we hope to discover whether firms in the new EU member states tend to converge with businesses in the Europe of the 15 in terms of the structure of their financial statements, and we examine the extent to which increasing integration may foster the convergence of productive structures. The methodology consists of an analysis of the evolution of 12 financial ratios in a sample of firms obtained from the AMADEUS database. We perform a Dynamic Factor Analysis that identifies the determining factors of the joint evolution of deviations in the financial ratios with respect to the average value of firms in the EU-15. This analysis allows us to assess the convergence of each CEEN nation with respect to the EU-15.
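The raw material of such an analysis is the deviation of each ratio from the EU-15 average. A crude sigma-convergence check on deviation series of that kind (country names and values below are illustrative, not the paper's data) looks like this:

```python
import statistics

# Illustrative deviation series: one financial ratio, deviations from the
# EU-15 average for three hypothetical CEEN countries over five years.
deviations = {
    "country_A": [0.40, 0.33, 0.27, 0.20, 0.15],
    "country_B": [0.55, 0.48, 0.41, 0.35, 0.30],
    "country_C": [0.25, 0.21, 0.16, 0.12, 0.09],
}

# Cross-country dispersion of the deviations per year; the paper extracts
# common factors from such series, and declining dispersion is consistent
# with convergence toward the EU-15 benchmark.
dispersion = [
    statistics.pstdev([series[t] for series in deviations.values()])
    for t in range(5)
]
print([round(d, 3) for d in dispersion])
```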

Relevance: 90.00%

Abstract:

Dual-trap optical tweezers are often used in high-resolution measurements in single-molecule biophysics. Such measurements can be hindered by extraneous noise sources, the most prominent of which is the coupling of fluctuations along different spatial directions, which may affect any optical-tweezers setup. In this article, we analyze, both theoretically and experimentally, the most common source of these couplings in dual-trap optical-tweezers setups: the misalignment of traps and tether. We give criteria to distinguish different kinds of misalignment, to estimate their quantitative relevance and to include them in the data analysis. The experimental data are obtained in a novel (to our knowledge) dual-trap optical-tweezers setup that directly measures forces. In the case in which misalignment is negligible, we provide a method to measure the stiffness of traps and tether based on variance analysis. This method can be seen as a calibration technique valid beyond the linear trap region. Our analysis is then employed to measure the persistence length of dsDNA tethers of three different lengths spanning two orders of magnitude. The effective persistence length of such tethers is shown to decrease with the contour length, in accordance with previous studies.
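Variance-based stiffness calibration rests on the equipartition theorem: a bead in a harmonic trap of stiffness k has positional variance kB*T/k. A minimal simulated check (the trap stiffness value is illustrative, and this sketch covers only a single trap, not the tether):

```python
import random
import statistics

random.seed(1)

kB = 1.380649e-23   # Boltzmann constant, J/K
T = 298.0           # temperature, K
k_true = 5e-5       # trap stiffness, N/m (illustrative value)

# Bead positions in a harmonic trap are Gaussian with variance kB*T/k.
sigma = (kB * T / k_true) ** 0.5
x = [random.gauss(0.0, sigma) for _ in range(100_000)]

# Equipartition: recover the stiffness from the position variance alone,
# the idea behind the variance-based calibration described in the abstract.
k_est = kB * T / statistics.pvariance(x)
print(f"{k_est:.3e}")
```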

Relevance: 90.00%

Abstract:

Sport betting is a lucrative business for bookmakers and for the lucky (or wise) punters, but also for governments and for sport. While not new or even recent, the deviances linked to sport betting, primarily match-fixing, have gained increased media exposure in the past decade. This exploratory study is a qualitative content analysis of the press coverage of sport betting-related deviances in football in two countries (the UK and France), using in each case two leading national publications over a period of five years. Data analysis indicates mounting coverage of sport betting scandals, with teams, players and criminals increasingly framed as culprits, while authorities and federations primarily assume a positive role. As for the origin of sport betting deviances, French newspapers tend to blame the system (in an abstract way); British newspapers, in contrast, focus more on individual weaknesses, notably greed. This article contributes to the growing body of literature on the importance of these deviances and on the way they are perceived by sport organizations, legislators and the public at large.

Relevance: 90.00%

Abstract:

The agricultural sector has always been characterized by a predominance of small firms. International competition and the consequent need to restrain costs are permanent challenges for farms. This paper performs an empirical investigation of cost behavior in agriculture using panel data analysis. Our results show that transactions caused by complexity influence farm costs, with opposite effects on specific and indirect costs: while transactions allow economies of scale in specific costs, they significantly increase indirect costs. However, the main driver of farm costs is volume. In addition, important differences exist between small and big farms, since transactional variables significantly influence the former but not the latter. Sophisticated management tools, such as activity-based costing (ABC), provide only limited complementary information and no essential allocation bases for farms, and thus seem inappropriate for small farms.


Relevance: 90.00%

Abstract:

We present a participant study that compares biological data exploration tasks using volume renderings of laser confocal microscopy data across three environments that vary in level of immersion: a desktop, a fishtank, and a cave system. For the tasks, data, and visualization approach used in our study, we found that subjects qualitatively preferred and quantitatively performed better in the cave compared with the fishtank and desktop. Subjects performed real-world biological data analysis tasks that emphasized understanding spatial relationships, including characterizing the general features in a volume, identifying colocated features, and reporting geometric relationships such as whether clusters of cells were coplanar. After analyzing data in each environment, subjects were asked to choose which environment they wanted to use to analyze additional data sets; subjects uniformly selected the cave environment.

Relevance: 90.00%

Abstract:

Recent years have produced great advances in instrumentation technology. The amount of available data has been increasing due to the simplicity, speed and accuracy of current spectroscopic instruments. Most of these data are, however, meaningless without proper analysis, which has been one of the reasons for the growing success of multivariate handling of such data. Industrial data are commonly not designed data; in other words, there is no exact experimental design, but rather the data have been collected as a routine procedure during an industrial process. This places certain demands on the multivariate modeling, as the selection of samples and variables can have an enormous effect. Common approaches in the modeling of industrial data are PCA (principal component analysis) and PLS (projection to latent structures, or partial least squares), but there are also other methods that should be considered, including multi-block modeling and nonlinear modeling. This thesis shows that the results of data analysis vary according to the modeling approach used, making the selection of the modeling approach dependent on the purpose of the model. If the model is intended to provide accurate predictions, the approach should differ from the case where the purpose of modeling is mostly to obtain information about the variables and the process. For industrial applicability it is essential that the methods are robust and sufficiently simple to apply; in this way the methods and the results can be compared and an approach selected that is suitable for the intended purpose. Differences between data analysis methods are compared using data from different fields of industry. In the first two papers, the multi-block method is considered for data originating from the oil and fertilizer industries, and the results are compared to those from PLS and priority PLS. The third paper considers the applicability of multivariate models to process control for a reactive crystallization process. In the fourth paper, nonlinear modeling is examined with a data set from the oil industry: the response has a nonlinear relation to the descriptor matrix, and the results are compared between linear modeling, polynomial PLS and nonlinear modeling using nonlinear score vectors.
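At the core of both PCA and PLS is the extraction of a dominant latent direction, typically via the NIPALS algorithm. A minimal power-iteration sketch on synthetic two-variable data recovers the first principal component:

```python
import random

random.seed(2)

# Synthetic two-variable process data with a dominant common direction,
# mimicking correlated industrial measurements.
n = 500
t = [random.gauss(0, 2) for _ in range(n)]
X = [[ti + random.gauss(0, 0.3), 0.5 * ti + random.gauss(0, 0.3)] for ti in t]

# Mean-centering: the starting point of both PCA and PLS.
means = [sum(row[j] for row in X) / n for j in range(2)]
Xc = [[row[0] - means[0], row[1] - means[1]] for row in X]

# First principal component by power iteration on X^T X, the core step
# of the NIPALS algorithm used in most PCA/PLS implementations.
w = [1.0, 0.0]
for _ in range(50):
    scores = [row[0] * w[0] + row[1] * w[1] for row in Xc]
    w = [sum(s * row[j] for s, row in zip(scores, Xc)) for j in range(2)]
    norm = (w[0] ** 2 + w[1] ** 2) ** 0.5
    w = [w[0] / norm, w[1] / norm]

# The loading vector should point roughly along (1, 0.5), normalized.
print([round(v, 3) for v in w])
```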

Relevance: 90.00%

Abstract:

A new analytical method was developed to non-destructively determine the pH and degree of polymerisation (DP) of cellulose in fibres in 19th- and 20th-century painting canvases, and to identify the fibre type: cotton, linen, hemp, ramie or jute. The method is based on NIR spectroscopy and multivariate data analysis, while for calibration and validation a reference collection of 199 historical canvas samples was used. The reference collection was analysed destructively using microscopy and chemical analytical methods. Partial least squares regression was used to build quantitative methods to determine pH and DP, and linear discriminant analysis was used to determine the fibre type. To interpret the obtained chemical information, an expert assessment panel developed a categorisation system to discriminate canvases that may not be fit to withstand excessive mechanical stress, e.g. transportation. The limiting DP for this category was found to be 600. With the new method and categorisation system, canvases of 12 Dalí paintings from the Fundació Gala-Salvador Dalí (Figueres, Spain) were non-destructively analysed for pH, DP and fibre type, and their fitness determined, which informs conservation recommendations. The study demonstrates that collection-wide canvas condition surveys can be performed efficiently and non-destructively, which could significantly improve collection management.
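The panel's categorisation rule can be expressed directly: canvases whose cellulose DP falls below the limiting value of 600 are flagged as potentially unfit for mechanical stress such as transportation. The canvas names and DP values below are hypothetical:

```python
# Limiting DP of 600, from the abstract; canvas names and DP values
# are illustrative, not measurements from the study.
DP_LIMIT = 600
canvases = {"canvas_1": 850, "canvas_2": 540, "canvas_3": 610}

fitness = {
    name: "fit" if dp >= DP_LIMIT else "at risk" for name, dp in canvases.items()
}
print(fitness)
```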

Relevance: 90.00%

Abstract:

Raw measurement data do not always immediately convey useful information, but applying statistical analysis tools to the data can improve the situation. Data analysis offers benefits such as acquiring meaningful insight from the dataset, basing critical decisions on the findings, and ruling out human bias through proper statistical treatment. In this thesis we analyze data from an industrial mineral processing plant with the aim of studying the possibility of forecasting the quality of the final product, given by one variable, with a model based on the other variables. For the study, mathematical tools such as Qlucore Omics Explorer (QOE) and Sparse Bayesian regression (SB) are used. Linear regression is then used to build a model based on a subset of variables that have the most significant weights in the SB model. The results obtained from QOE show that the variable representing the desired final product does not correlate with the other variables. For SB and linear regression, the results show that both models built on 1-day averaged data seriously underestimate the variance of the true data, whereas the two models built on 1-month averaged data are reliable and able to explain a larger proportion of the variability in the available data, making them suitable for prediction purposes. However, no single model fits the whole available dataset well; for future work it is therefore proposed either to build piecewise nonlinear regression models on the same dataset, or for the plant to provide another dataset collected in a more systematic fashion than the present data.
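The "underestimated variance" finding is easy to reproduce in miniature: when a predictor explains little of the response, the fitted values of a least-squares model have far less variance than the data themselves. A synthetic sketch (the weak 0.3 coefficient is illustrative, mirroring the thesis finding that the quality variable correlates poorly with the rest):

```python
import random
import statistics

random.seed(3)

# Synthetic plant data: the response is only weakly explained by the
# predictor; most of its variance is noise.
n = 300
x = [random.gauss(0, 1) for _ in range(n)]
y = [0.3 * xi + random.gauss(0, 1.0) for xi in x]

# Ordinary least-squares fit.
mx, my = statistics.mean(x), statistics.mean(y)
b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum(
    (xi - mx) ** 2 for xi in x
)
a = my - b * mx
pred = [a + b * xi for xi in x]

# The fitted values' variance is b^2 * var(x): far below var(y) when the
# predictor explains little of the response.
print(round(statistics.pvariance(pred), 3), round(statistics.pvariance(y), 3))
```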

Relevance: 90.00%

Abstract:

Workshop at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance: 90.00%

Abstract:

Results of subgroup analyses (SA) reported in randomized clinical trials (RCT) cannot be adequately interpreted without information about the methods used in the study design and the data analysis. Our aim was to show how often inaccurate or incomplete reports occur. First, we selected eight methodological aspects of SA on the basis of their importance to a reader in determining the confidence that should be placed in the author's conclusions regarding such analyses. Then, we reviewed the current practice of reporting these methodological aspects of SA in clinical trials in four leading journals: the New England Journal of Medicine (NEJM), the Journal of the American Medical Association (JAMA), the Lancet, and the American Journal of Public Health (AJPH). Eight consecutive reports from each journal published after July 1, 1998 were included. Of the 32 trials surveyed, 17 (53%) had at least one SA. Overall, the proportion of RCT reporting a particular methodological aspect ranged from 23 to 94%. Information on whether the SA preceded or followed the analysis was reported in only 7 (41%) of the studies. Of the total possible number of items to be reported, NEJM, JAMA, the Lancet and AJPH clearly mentioned 59, 67, 58 and 72%, respectively. We conclude that current reporting of SA in RCT is incomplete and inaccurate. The results of such SA may have harmful effects on treatment recommendations if accepted without judicious scrutiny. We recommend that editors improve the reporting of SA in RCT by giving authors a list of the important items to be reported.