896 results for High-dimensional data visualization


Relevance: 100.00%

Abstract:

The paper presents studies of Bose--Einstein Correlations (BEC) for pairs of like-sign charged particles measured in the kinematic range pT > 100 MeV and |η| < 2.5 in proton--proton collisions at centre-of-mass energies of 0.9 and 7 TeV with the ATLAS detector at the CERN Large Hadron Collider. The integrated luminosities are approximately 7 μb−1, 190 μb−1 and 12.4 nb−1 for the 0.9 TeV, 7 TeV minimum-bias and 7 TeV high-multiplicity data samples, respectively. The multiplicity dependence of the BEC parameters characterizing the correlation strength and the correlation source size is investigated for charged-particle multiplicities of up to 240. A saturation effect in the multiplicity dependence of the correlation source size is observed using the high-multiplicity 7 TeV data sample. The dependence of the BEC parameters on the average transverse momentum of the particle pair is also investigated.
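For reference, the two BEC parameters referred to above are commonly extracted by fitting the two-particle correlation function with an exponential (Goldhaber-type) form; the expression below is a generic parameterization, not necessarily the exact fit function used in the paper:

C_2(Q) = C_0 \left[ 1 + \lambda \, e^{-R Q} \right],

where Q is the invariant four-momentum difference of the pair, \lambda quantifies the correlation strength and R the size of the correlation source.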

Relevance: 100.00%

Abstract:

In the last few years a new type of instrument, the Terrestrial Laser Scanner (TLS), has entered the commercial market. These devices make it possible to obtain a completely new type of spatial, three-dimensional data describing the object of interest, and the data they generate require special treatment. The appearance of this technique has made it possible to monitor deformations of very large objects, such as the landslides investigated here, at a new level of quality, especially with respect to the size and number of details that can be observed. In this context, the present work focuses on the recognition and characterization of the raw data delivered by TLS instruments, as well as on the processing phases and the tools and techniques used to carry them out. The main objectives are to define and identify the problems related to the use of TLS data, to characterize the quality of a single point generated by TLS, to describe and investigate a TLS processing approach for landslide deformation measurements that yields a 3D deformation characteristic, and finally to validate the obtained results. These objectives are pursued through bibliographic study and research work, followed by several experiments that support the conclusions.

Relevance: 100.00%

Abstract:

Fractal geometry is a fundamental approach for describing the complex irregularities of the spatial structure of point patterns. The present research characterizes the spatial structure of the Swiss population distribution in the three Swiss geographical regions (Alps, Plateau and Jura) and at the entire-country level. These analyses were carried out using fractal and multifractal measures for point patterns, which enable the estimation of the spatial degree of clustering of a distribution at different scales. The Swiss population dataset is given on a grid of points and can thus be modelled as a "point process" where each point is characterized by its spatial location (geometrical support) and a number of inhabitants (measured variable). The fractal characterization was performed by means of the box-counting dimension, and the multifractal analysis was conducted through Rényi's generalized dimensions and the multifractal spectrum. Results showed that the four population patterns are all multifractal and present different clustering behaviours. Applying multifractal and fractal methods to different geographical regions and at different scales allowed us to quantify and describe the dissimilarities between the four structures and their underlying processes. This paper is the first Swiss geodemographic study applying multifractal methods to high-resolution data.
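As an illustration of the box-counting dimension mentioned above (a minimal sketch, not the authors' code; the grid scales, the random test pattern and the function name box_counting_dimension are assumptions made here), the dimension of a 2-D point set can be estimated by counting occupied boxes at several box sizes and fitting a line in log-log space:

import numpy as np

def box_counting_dimension(points, n_scales=10):
    # Estimate the box-counting (fractal) dimension of a 2-D point set.
    points = np.asarray(points, dtype=float)
    mins, maxs = points.min(axis=0), points.max(axis=0)
    span = (maxs - mins).max()
    sizes = span / np.logspace(0.5, 3, n_scales, base=2.0)   # box sizes, coarse to fine
    counts = []
    for eps in sizes:
        # Assign each point to a box of side eps and count the distinct occupied boxes.
        idx = np.floor((points - mins) / eps).astype(int)
        counts.append(len({tuple(i) for i in idx}))
    # Slope of log N(eps) versus log(1/eps) approximates the box-counting dimension.
    slope, _ = np.polyfit(np.log(1.0 / sizes), np.log(counts), 1)
    return slope

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pts = rng.random((20000, 2))     # a filled square should give a dimension close to 2
    print(round(box_counting_dimension(pts), 2))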

Relevance: 100.00%

Abstract:

We develop methods for Bayesian inference in vector error correction models which are subject to a variety of switches in regime (e.g. Markov switches in regime or structural breaks). An important aspect of our approach is that we allow both the cointegrating vectors and the number of cointegrating relationships to change when the regime changes. We show how Bayesian model averaging or model selection methods can be used to deal with the high-dimensional model space that results. Our methods are used in an empirical study of the Fisher effect.
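Schematically, and with notation introduced here rather than taken from the paper, such a regime-switching vector error correction model can be written as

\Delta y_t = \alpha(s_t)\, \beta(s_t)' \, y_{t-1} + \sum_{i=1}^{p-1} \Gamma_i(s_t)\, \Delta y_{t-i} + \varepsilon_t,

where s_t indexes the regime (a Markov chain or a structural-break indicator), and both the cointegrating vectors \beta(s_t) and their number (the rank of \alpha(s_t)\beta(s_t)') are allowed to change with the regime.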

Relevance: 100.00%

Abstract:

Somatic copy number aberrations (CNA) represent a mutation type encountered in the majority of cancer genomes. Here, we present the 2014 edition of arrayMap (http://www.arraymap.org), a publicly accessible collection of pre-processed oncogenomic array data sets and CNA profiles representing a vast range of human malignancies. Since the initial release, we have enhanced this resource both in content and, especially, in data mining support. The 2014 release of arrayMap contains more than 64,000 genomic array data sets, representing about 250 tumor diagnoses. Data sets included in arrayMap have been assembled from public repositories as well as additional resources, and integrated by applying custom processing pipelines. Online tools have been upgraded for more flexible array data visualization, including options for processing user-provided, non-public data sets. Data integration has been improved by mapping to multiple editions of the human reference genome, with the majority of the data now being available for the UCSC hg18 as well as the GRCh37 versions. The large amount of tumor CNA data in arrayMap can be freely downloaded by users to promote data mining projects and to explore special events such as chromothripsis-like genome patterns.

Relevance: 100.00%

Abstract:

Neurocritical care depends, in part, on careful patient monitoring, but as yet there are few data on which processes are the most important to monitor, how these should be monitored, and whether monitoring these processes is cost-effective and impacts outcome. At the same time, bioinformatics is a rapidly emerging field in critical care, but as yet there is little agreement or standardization on what information is important and how it should be displayed and analyzed. The Neurocritical Care Society, in collaboration with the European Society of Intensive Care Medicine, the Society of Critical Care Medicine, and the Latin America Brain Injury Consortium, organized an international, multidisciplinary consensus conference to begin to address these needs. International experts from neurosurgery, neurocritical care, neurology, critical care, neuroanesthesiology, nursing, pharmacy, and informatics were recruited on the basis of their research, publication record, and expertise. They undertook a systematic literature review to develop recommendations about specific topics on physiologic processes important to the care of patients with disorders that require neurocritical care. This review does not make recommendations about treatment, imaging, and intraoperative monitoring. A multidisciplinary jury, selected for their expertise in clinical investigation and development of practice guidelines, guided this process. The GRADE system was used to develop recommendations based on literature review, discussion, integration of the literature with the participants' collective experience, and critical review by an impartial jury. Emphasis was placed on the principle that recommendations should be based both on data quality and on trade-offs and translation into clinical practice. Strong consideration was given to providing pragmatic guidance and recommendations for bedside neuromonitoring, even in the absence of high-quality data.

Relevance: 100.00%

Abstract:

To compare the effect of hyperthermia on maximal oxygen uptake (VO2max) in men and women, VO2max was measured in 11 male and 11 female runners under seven conditions involving various ambient temperatures (Ta, at 50% RH) and preheating designed to manipulate the esophageal (Tes) and mean skin (Tsk) temperatures at VO2max. The conditions were: 25°C, no preheating (control); 25, 35, 40, and 45°C, with exercise-induced preheating by a 20-min walk at approximately 33% of control VO2max; 45°C, no preheating; and 45°C, with passive preheating during which Tes and Tsk were increased to the same degree as at the end of the 20-min walk at 45°C. Compared to VO2max (L·min−1) in the control condition (4.52 ± 0.46 in men, 3.01 ± 0.45 in women), VO2max in men and women was reduced with exercise-induced or passive preheating and increased Ta, by approximately 4% at 35°C, approximately 9% at 40°C and approximately 18% at 45°C. Percentage reductions (7-36%) in physical performance (treadmill test time to exhaustion) were strongly related to reductions in VO2max (r = 0.82-0.84). The effects of hyperthermia on VO2max and physical performance in men and women were almost identical. We conclude that men and women do not differ in their thermal responses to maximal exercise, or in the relationship of hyperthermia to reductions in VO2max and physical performance at high temperature. Data are reported as mean (SD) unless otherwise stated.

Relevance: 100.00%

Abstract:

The paper proposes an approach aimed at detecting optimal model parameter combinations to achieve the most representative description of uncertainty in the model performance. A classification problem is posed to find the regions of good-fitting models according to the values of a cost function. Support Vector Machine (SVM) classification in the parameter space is applied to decide whether a forward model simulation should be computed for a particular generated model. SVMs are particularly suited to tackling classification problems in high-dimensional spaces in a non-parametric and non-linear way. The SVM decision boundaries determine the regions that are subject to the largest uncertainty in the cost function classification and therefore provide guidelines for further iterative exploration of the model space. The proposed approach is illustrated by a synthetic example of fluid flow through porous media, which features a highly variable response due to the combination of parameter values.
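A minimal sketch of this idea (not the authors' implementation; the toy cost function, the 30% "good fit" quantile and the use of scikit-learn's SVC are assumptions made here for illustration): label sampled parameter vectors as good or bad fits from the cost function, train an SVM in parameter space, and use its decision function to pick the candidates with the largest classification uncertainty for the next forward simulations.

import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)

def cost(theta):
    # Toy stand-in for the misfit of a forward model run at parameters theta.
    return np.sum((theta - 0.3) ** 2, axis=1)

# Initial exploration: evaluate the forward model on random parameter combinations.
theta = rng.random((500, 4))                        # 4-dimensional parameter space
c = cost(theta)
labels = (c <= np.quantile(c, 0.3)).astype(int)     # lowest-cost 30% labelled "good fitting"

# Non-linear SVM classifier separating good from bad regions of the parameter space.
clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(theta, labels)

# Candidates closest to the decision boundary carry the largest classification
# uncertainty; these are the parameter combinations to simulate next.
candidates = rng.random((5000, 4))
margin = np.abs(clf.decision_function(candidates))
next_runs = candidates[np.argsort(margin)[:20]]
print(next_runs.shape)                              # (20, 4)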

Relevance: 100.00%

Abstract:

The mission of the Encyclopedia of DNA Elements (ENCODE) Project is to enable the scientific and medical communities to interpret the human genome sequence and apply it to understand human biology and improve health. The ENCODE Consortium is integrating multiple technologies and approaches in a collective effort to discover and define the functional elements encoded in the human genome, including genes, transcripts, and transcriptional regulatory regions, together with their attendant chromatin states and DNA methylation patterns. In the process, standards to ensure high-quality data have been implemented, and novel algorithms have been developed to facilitate analysis. Data and derived results are made available through a freely accessible database. Here we provide an overview of the project and the resources it is generating and illustrate the application of ENCODE data to interpret the human genome.

Relevance: 100.00%

Abstract:

The paper proposes a numerical solution method for general equilibrium models with a continuum of heterogeneous agents, which combines elements of projection and of perturbation methods. The basic idea is to solve first for the stationary solution of the model, without aggregate shocks but with fully specified idiosyncratic shocks, and afterwards to compute a first-order perturbation of the solution in the aggregate shocks. This approach makes it possible to include a high-dimensional representation of the cross-sectional distribution in the state vector. The method is applied to a model of household saving with uninsurable income risk and liquidity constraints. The model includes not only productivity shocks, but also shocks to redistributive taxation, which cause substantial short-run variation in the cross-sectional distribution of wealth. If those shocks are operative, it is shown that a solution method based on very few statistics of the distribution is not suitable, while the proposed method can solve the model with high accuracy, at least for the case of small aggregate shocks. Techniques are discussed to reduce the dimension of the state space such that higher-order perturbations are feasible. Matlab programs to solve the model can be downloaded.
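In schematic form (notation introduced here, not taken from the paper): let X_t collect the discretized cross-sectional distribution together with the coefficients of the individual policy functions, and let z_t be the aggregate shock, so that the equilibrium conditions stack into

\mathbb{E}_t\, F(X_t, X_{t+1}, z_t, z_{t+1}) = 0 .

Step one solves F(X^{*}, X^{*}, 0, 0) = 0 for the stationary solution without aggregate shocks; step two linearizes F around X^{*} and solves the resulting linear rational-expectations system, yielding a law of motion of the form X_t - X^{*} \approx A (X_{t-1} - X^{*}) + B z_t.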

Relevance: 100.00%

Abstract:

This work proposes novel network analysis techniques for multivariate time series. We define the network of a multivariate time series as a graph where vertices denote the components of the process and edges denote non-zero long-run partial correlations. We then introduce a two-step LASSO procedure, called NETS, to estimate high-dimensional sparse long-run partial correlation networks. This approach is based on a VAR approximation of the process and allows us to decompose the long-run linkages into the contributions of the dynamic and contemporaneous dependence relations of the system. The large-sample properties of the estimator are analysed and we establish conditions for consistent selection and estimation of the non-zero long-run partial correlations. The methodology is illustrated with an application to a panel of U.S. blue chips.
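A hedged two-step sketch in the same spirit (this is not the NETS estimator; the simulated data, the penalty values and the use of scikit-learn's Lasso and GraphicalLasso are illustrative assumptions): fit a sparse VAR by Lasso regressions on lagged values to capture the dynamic dependence, then estimate a sparse precision matrix, and hence partial correlations, from the residuals to capture the contemporaneous dependence.

import numpy as np
from sklearn.linear_model import Lasso
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(2)
T, N = 400, 8
Y = rng.standard_normal((T, N))                      # placeholder multivariate time series

# Step 1: sparse VAR(1) approximation -- one Lasso regression per component.
X_lag, X_now = Y[:-1], Y[1:]
A = np.zeros((N, N))
resid = np.empty_like(X_now)
for i in range(N):
    fit = Lasso(alpha=0.05).fit(X_lag, X_now[:, i])
    A[i] = fit.coef_                                  # dynamic (lagged) dependence
    resid[:, i] = X_now[:, i] - fit.predict(X_lag)

# Step 2: sparse precision matrix of the residuals -> contemporaneous dependence.
K = GraphicalLasso(alpha=0.05).fit(resid).precision_
d = np.sqrt(np.diag(K))
partial_corr = -K / np.outer(d, d)                    # partial correlations
np.fill_diagonal(partial_corr, 1.0)

# Network edges: variable pairs with a non-negligible partial correlation.
edges = np.argwhere(np.abs(np.triu(partial_corr, k=1)) > 1e-3)
print(len(edges), "contemporaneous edges")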

Relevance: 100.00%

Abstract:

Since the end of the last millennium, focused ion beam scanning electron microscopy (FIB-SEM) has progressively found use in biological research. The instrument is a scanning electron microscope (SEM) with an attached gallium ion column, and the two beams, electrons and ions (FIB), are focused on one coincident point. The main application is the acquisition of three-dimensional data, FIB-SEM tomography: with the ion beam, a few nanometres of the surface are removed, and the remaining block-face is imaged with the electron beam in a repetitive manner. The instrument can also be used to cut open biological structures to gain access to internal features, or to prepare thin lamellae for imaging by (cryo-)transmission electron microscopy. Here, we present an overview of the development of FIB-SEM and discuss a few points about sample preparation and imaging.

Relevance: 100.00%

Abstract:

New precise zircon U-Pb ages are proposed for the Triassic-Jurassic (Rhaetian-Hettangian) and the Hettangian-Sinemurian boundaries. The ages were obtained by ID-TIMS dating of single chemically abraded zircons from volcanic ash layers within the Pucara Group, Aramachay Formation, in the Utcubamba valley, northern Peru. Ash layers situated between the last and first occurrences of boundary-defining ammonites yielded Pb-206/U-238 ages of 201.58 +/- 0.17/0.28 Ma (95% c.l., uncertainties without/with decay constant errors, respectively) for the Triassic-Jurassic boundary and of 199.53 +/- 0.19/0.29 Ma for the Hettangian-Sinemurian boundary. The former is established on a tuff located 1 m above the last local occurrence of the topmost Triassic genus Choristoceras, and 5 m below the Hettangian genus Psiloceras. The latter sample was obtained from a tuff collected within the Badouxia canadensis beds. Our new ages document a total duration of the Hettangian of no more than c. 2 m.y., which has fundamental implications for the interpretation and significance of the ammonite recovery after the topmost Triassic extinction. The U-Pb age is about 0.8 +/- 0.5% older than Ar-40-Ar-39 dates determined on flood basalts of the Central Atlantic Magmatic Province (CAMP). Given the widely accepted hypothesis that inaccuracies in the K-40 decay constants or physical constants create a similar bias between the two dating methods, our new U-Pb zircon age determination for the T/J boundary corroborates the hypothesis that the CAMP was emplaced at the same time and may be responsible for a major climatic turnover and mass extinction. The zircon Pb-206/U-238 age for the T/J boundary is marginally older than that of the North Mountain Basalt (Newark Supergroup, Nova Scotia, Canada), which has been dated at 201.27 +/- 0.06 Ma [Schoene et al., 2006. Geochim. Cosmochim. Acta 70, 426-445]. It will be important to look for older eruptions of the CAMP and date them precisely by U-Pb techniques, while addressing all sources of systematic uncertainty, to further test the hypothesis of volcanically induced climate change leading to extinction. Such high-precision, high-accuracy data will be instrumental for constraining the contemporaneity of geological events at the 100 kyr level.

Relevance: 100.00%

Abstract:

The present paper studies the probability of ruin of an insurer if excess of loss reinsurance with reinstatements is applied. In the setting of the classical Cramér-Lundberg risk model, piecewise deterministic Markov processes are used to describe the free surplus process in this more general situation. It is shown that the finite-time ruin probability is both the solution of a partial integro-differential equation and the fixed point of a contractive integral operator. We exploit the latter representation to develop and implement a recursive algorithm for numerical approximation of the ruin probability that involves high-dimensional integration. Furthermore, we study the behavior of the finite-time ruin probability under various levels of initial surplus and security loadings and compare the efficiency of the numerical algorithm with the computational alternative of stochastic simulation of the risk process.
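For comparison with the simulation alternative mentioned above, here is a minimal Monte Carlo sketch for the classical Cramér-Lundberg surplus process only (it omits the excess of loss reinsurance and reinstatements treated in the paper, and all parameter values are made up):

import numpy as np

def finite_time_ruin_prob(u, c, lam, claim_mean, horizon, n_paths=20_000, seed=0):
    # Monte Carlo estimate of P(ruin before `horizon`) for initial surplus u,
    # premium rate c, Poisson claim intensity lam and exponential claim sizes.
    rng = np.random.default_rng(seed)
    ruined = 0
    for _ in range(n_paths):
        t, surplus = 0.0, u
        while True:
            w = rng.exponential(1.0 / lam)             # waiting time to the next claim
            if t + w > horizon:
                break                                  # no ruin before the horizon
            t += w
            surplus += c * w                           # premium income since the last claim
            surplus -= rng.exponential(claim_mean)     # claim payment
            if surplus < 0:
                ruined += 1
                break
    return ruined / n_paths

# Safety loading of 20%: premium rate c = 1.2 * lam * claim_mean.
print(finite_time_ruin_prob(u=10.0, c=1.2, lam=1.0, claim_mean=1.0, horizon=50.0))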