850 results for High-dimensional data visualization
Abstract:
The identification of chemical mechanisms that can exhibit oscillatory phenomena in reaction networks is currently of intense interest. In particular, the parametric question of the existence of Hopf bifurcations has gained increasing attention due to its relation to oscillatory behavior around fixed points. However, detecting oscillations in high-dimensional systems and systems with constraints using the available symbolic methods has proven difficult. The development of new, efficient methods is therefore required to tackle the complexity caused by the high dimensionality and non-linearity of these systems. In this thesis, we present efficient algorithmic methods to detect Hopf bifurcation fixed points in (bio)chemical reaction networks with symbolic rate constants, thereby yielding information about the oscillatory behavior of the networks. The methods use representations of the systems in convex coordinates that arise from stoichiometric network analysis. The first method, called HoCoQ, reduces the problem of determining the existence of Hopf bifurcation fixed points to a first-order formula over the ordered field of the reals, which can then be solved using computational-logic packages. The second method, called HoCaT, uses ideas from tropical geometry to formulate a more efficient method that is incomplete in theory but has worked very well for the attempted high-dimensional models involving more than 20 chemical species. The instability of reaction networks may lead to oscillatory behavior. Therefore, we investigate some criteria for their stability using convex coordinates and quantifier-elimination techniques.
We also study Muldowney's extension of the classical Bendixson-Dulac criterion for excluding periodic orbits to higher dimensions for polynomial vector fields, and we discuss the use of simple conservation constraints and of parametric constraints for describing simple convex polytopes on which periodic orbits can be excluded by Muldowney's criteria. All developed algorithms have been integrated into a common software framework called PoCaB (platform to explore biochemical reaction networks by algebraic methods), allowing for automated computation workflows from the problem descriptions. PoCaB also contains a database for the algebraic entities computed from the models of chemical reaction networks.
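As a small illustration of the kind of first-order reduction described above (a hedged sketch, not the thesis's HoCoQ implementation; the Brusselator network below is a standard textbook example, not a model from the thesis), a Hopf condition for a two-species network can be derived symbolically: at a fixed point of a planar system, a Hopf bifurcation requires the Jacobian to have zero trace and positive determinant.

```python
import sympy as sp

a, b, x, y = sp.symbols('a b x y', positive=True)

# Brusselator, a classic two-species oscillating reaction network
# (illustrative example; not a model from the thesis)
f = a - (b + 1)*x + x**2*y
g = b*x - x**2*y

fixed_point = sp.solve([f, g], [x, y], dict=True)[0]   # x = a, y = b/a

J = sp.Matrix([f, g]).jacobian([x, y]).subs(fixed_point)

trace = sp.simplify(J.trace())   # b - a**2 - 1
det = sp.simplify(J.det())       # a**2

# Planar Hopf condition: trace = 0 with det > 0 (here det = a**2 > 0 always)
hopf_curve = sp.solve(sp.Eq(trace, 0), b)[0]   # Hopf points lie on b = a**2 + 1
```

For higher-dimensional networks the analogous conditions come from Routh-Hurwitz-type determinants, which is where quantifier elimination over the reals enters.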
Abstract:
This report explores how recurrent neural networks can be exploited for learning high-dimensional mappings. Since recurrent networks are as powerful as Turing machines, an interesting question is how they can be used to simplify the problem of learning from examples. The main difficulty with learning high-dimensional functions is the curse of dimensionality, which roughly states that the number of examples needed to learn a function increases exponentially with the input dimension. This thesis proposes a way of avoiding this problem by using a recurrent network to decompose a high-dimensional function into many lower-dimensional functions connected in a feedback loop.
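The exponential growth the curse of dimensionality refers to can be made concrete with a back-of-the-envelope count: covering the unit cube at a fixed per-axis resolution needs a number of grid samples exponential in the dimension (a minimal sketch; the 0.1 resolution is an arbitrary choice for illustration).

```python
def grid_cover_count(dim: int, eps: float) -> int:
    """Number of grid points needed to sample [0, 1]^dim at spacing eps."""
    per_axis = round(1.0 / eps)
    return per_axis ** dim

# At resolution 0.1 per axis, the sample count explodes with dimension:
counts = {d: grid_cover_count(d, 0.1) for d in (1, 2, 5, 10)}
# counts[1] = 10, counts[2] = 100, counts[10] = 10**10
```

Decomposing a 10-dimensional function into coupled low-dimensional pieces attacks exactly this exponent.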
Abstract:
We have developed a technique called RISE (Random Image Structure Evolution), by which one may systematically sample continuous paths in a high-dimensional image space. A basic RISE sequence depicts the evolution of an object's image from a random field, along with the reverse sequence which depicts the transformation of this image back into randomness. The processing steps are designed to ensure that important low-level image attributes such as the frequency spectrum and luminance are held constant throughout a RISE sequence. Experiments based on the RISE paradigm can be used to address some key open issues in object perception. These include determining the neural substrates underlying object perception, the role of prior knowledge and expectation in object perception, and the developmental changes in object perception skills from infancy to adulthood.
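One standard way to realize such spectrum-preserving paths (an assumption on my part; the paper's exact processing steps may differ) is to interpolate the Fourier phase of the object image toward random phase while keeping the amplitude spectrum, and hence the frequency content, fixed. Pinning the DC term preserves mean luminance.

```python
import numpy as np

def phase_morph(img, alpha, rng):
    """Move an image toward a random field: keep the Fourier amplitude
    spectrum, blend the phase with random phase (alpha=0 original,
    alpha=1 fully random). Taking the real part makes the result only
    approximately amplitude-preserving."""
    F = np.fft.fft2(img)
    amp, phase = np.abs(F), np.angle(F)
    rand_phase = rng.uniform(-np.pi, np.pi, size=img.shape)
    mixed = (1 - alpha) * phase + alpha * rand_phase
    mixed[0, 0] = phase[0, 0]          # pin DC term: mean luminance fixed
    return np.real(np.fft.ifft2(amp * np.exp(1j * mixed)))

rng = np.random.default_rng(0)
img = rng.random((32, 32))             # stand-in for an object image
sequence = [phase_morph(img, a, rng) for a in np.linspace(0.0, 1.0, 8)]
```

Playing `sequence` forward gives the image-to-randomness half of a RISE-like sequence; reversing it gives the emergence-from-noise half.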
Abstract:
MapFish is an open-source development framework for building web-mapping applications. MapFish is based on the OpenLayers API and the Geo extension of the Ext library, and extends the Pylons general-purpose web development framework with geo-specific functionality. This presentation first describes what the MapFish development framework provides and how it can help developers implement rich web-mapping applications. It then demonstrates, through real web-mapping projects, what can be achieved using MapFish: Geo Business Intelligence applications, 2D/3D data visualization, online/offline data editing, advanced vector printing functionality, an advanced administration suite to build WebGIS applications from scratch, etc. In particular, the web-mapping application for the UN Refugee Agency (UNHCR) and a Regional Spatial Data Infrastructure will be demonstrated.
Abstract:
This is a collection of 12 micro-lectures, to be used by students in advance of practical sessions. Durations range from 3 to 10 min. Topics include: 1. Introduction; 2. Data classes; 3. Matrices; 4. Getting help; 5. Index notation; 6. 1- and 2-dimensional data; 7. 3-dimensional data; 8. Booleans (true/false); 9. Designing a programme (algorithms); 10. Flow control: if-then statements; 11. Flow control: for-do loops; 12. Making nicer figures.
Abstract:
In this work, a methodology is implemented for including higher-order moments in portfolio selection, using the Generalized Hyperbolic Distribution, followed by a comparative analysis against the Markowitz model.
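To make the role of higher-order moments concrete, here is a minimal numpy sketch (the return series, weights, and seed are illustrative inventions, and fitting the Generalized Hyperbolic Distribution itself is omitted) computing the portfolio skewness and kurtosis that a plain mean-variance (Markowitz) model ignores.

```python
import numpy as np

rng = np.random.default_rng(3)
returns = rng.standard_normal((500, 3)) * 0.01   # hypothetical daily returns, 3 assets
w = np.array([0.5, 0.3, 0.2])                    # illustrative portfolio weights

port = returns @ w
mu, sigma = port.mean(), port.std()

# Standardized higher-order moments, invisible to mean-variance selection
skewness = np.mean(((port - mu) / sigma)**3)
kurtosis = np.mean(((port - mu) / sigma)**4)
```

A higher-moment methodology would trade these quantities off against mean and variance instead of optimizing over the first two moments alone.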
Abstract:
In this paper we review models of volatility for a group of five Latin American countries, mainly motivated by the recent periods of financial turbulence. Our results, based on high-frequency data, suggest that dynamic multivariate models are more powerful than Constant Conditional Correlation models for studying the volatilities of asset returns. For the group of countries included, we find that domestic asset-market volatilities have been increasing, but the co-volatility of the region is still moderate.
Abstract:
It is generally agreed that changing climate variability, and the associated change in climate extremes, may have a greater impact on environmentally vulnerable regions than a changing mean. This research investigates rainfall variability, rainfall extremes, and their associations with atmospheric and oceanic circulations over southern Africa, a region that is considered particularly vulnerable to extreme events because of numerous environmental, social, and economic pressures. Because rainfall variability is a function of scale, high-resolution data are needed to identify extreme events. Thus, this research uses remotely sensed rainfall data and climate model experiments at high spatial and temporal resolution, with the overall aim being to investigate the ways in which sea surface temperature (SST) anomalies influence rainfall extremes over southern Africa. Extreme rainfall identification is achieved by the high-resolution microwave/infrared rainfall algorithm dataset. This comprises satellite-derived daily rainfall from 1993 to 2002 and covers southern Africa at a spatial resolution of 0.1° latitude–longitude. Extremes are extracted and used with reanalysis data to study possible circulation anomalies associated with extreme rainfall. Anomalously cold SSTs in the central South Atlantic and warm SSTs off the coast of southwestern Africa seem to be statistically related to rainfall extremes. Further, through a number of idealized climate model experiments, it would appear that both decreasing SSTs in the central South Atlantic and increasing SSTs off the coast of southwestern Africa lead to a demonstrable increase in daily rainfall and rainfall extremes over southern Africa, via local effects such as increased convection and remote effects such as an adjustment of the Walker-type circulation.
Abstract:
The application of particle filters in geophysical systems is reviewed. Some background on Bayesian filtering is provided, and the existing methods are discussed. The emphasis is on the methodology, and not so much on the applications themselves. It is shown that direct application of the basic particle filter (i.e., importance sampling using the prior as the importance density) does not work in high-dimensional systems, but several variants are shown to have potential. Approximations to the full problem that try to keep some aspects of the particle filter beyond the Gaussian approximation are also presented and discussed.
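The weight collapse that makes the basic particle filter fail in high dimension can be seen in a few lines: sample particles from the prior, weight them by the likelihood, and watch the effective sample size collapse as the state dimension grows (a minimal sketch with an invented Gaussian prior and likelihood, not a geophysical model).

```python
import numpy as np

def effective_sample_size(log_w):
    """ESS = 1 / sum of squared normalized weights (log-space for stability)."""
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    return 1.0 / np.sum(w**2)

rng = np.random.default_rng(1)
n_particles = 1000
ess = {}
for dim in (1, 10, 100):
    particles = rng.standard_normal((n_particles, dim))  # draws from prior N(0, I)
    obs = np.zeros(dim)                                  # observation at the origin
    log_w = -0.5 * np.sum((particles - obs)**2, axis=1)  # unit obs-error Gaussian
    ess[dim] = effective_sample_size(log_w)
# ess[dim] shrinks rapidly with dim: nearly all weight lands on one particle
```

The variants discussed in the review (better proposal densities, localization-style approximations) are different ways of fighting exactly this collapse.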
Abstract:
The skill of numerical Lagrangian drifter trajectories in three numerical models is assessed by comparing these numerically obtained paths to the trajectories of drifting buoys in the real ocean. The skill assessment is performed using the two-sample Kolmogorov–Smirnov statistical test. To demonstrate the assessment procedure, it is applied to three different models of the Agulhas region. The test can be performed either using crossing positions of one-dimensional sections, in order to test model performance in specific locations, or using the total two-dimensional data set of trajectories. The test yields four quantities: a binary decision of model skill, a confidence level which can be used as a measure of goodness-of-fit of the model, a test statistic which can be used to determine the sensitivity of the confidence level, and cumulative distribution functions that aid in the qualitative analysis. The ordering of models by their confidence levels is the same as the ordering based on the qualitative analysis, which suggests that the method is suited for model validation. Only one of the three models, a 1/10° two-way nested regional ocean model, might have skill in the Agulhas region. The other two models, a 1/2° global model and a 1/8° assimilative model, might have skill only on some sections in the region.
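A hedged sketch of the core statistical step (the sample values below are synthetic stand-ins for crossing positions, not the paper's data), using scipy's two-sample Kolmogorov–Smirnov test to get the test statistic, the p-value that serves as the confidence measure, and a binary skill decision:

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(42)
# Synthetic crossing latitudes on a one-dimensional section
drifters = rng.normal(loc=-34.0, scale=1.0, size=200)   # observed buoys
model = rng.normal(loc=-34.5, scale=1.2, size=500)      # numerical trajectories

statistic, p_value = ks_2samp(drifters, model)
has_skill = p_value > 0.05    # binary decision at the 5% significance level
```

The statistic is the maximum distance between the two empirical cumulative distribution functions, which is why the CDFs themselves are useful for the qualitative analysis.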
Abstract:
The microwave spectrum of 1-pyrazoline has been observed from 18 to 40 GHz in the six lowest states of the ring-puckering vibration. It is an a-type spectrum of a near oblate asymmetric top. Each vibrational state has been fitted to a separate effective Hamiltonian, and the vibrational dependence of both the rotational constants and the quartic centrifugal distortion constants has been observed and analyzed. The v = 0 and 1 states have also been analyzed using a coupled Hamiltonian; this gives consistent results, with an improved fit to the high J data. The preferred choice of Durig et al. [J. Chem. Phys. 52, 6096 (1970)] for the ring-puckering potential is confirmed as essentially correct, but the A and B inertial axes are shown to be interchanged from those assumed by Durig et al. in their analysis of the mid-infrared spectrum.
Abstract:
Matrix-assisted laser desorption/ionization (MALDI) is a key technique in mass spectrometry (MS)-based proteomics. MALDI MS is extremely sensitive, easy to apply, and relatively tolerant of contaminants. Its high-speed data acquisition and large-scale, off-line sample preparation have made it once again the focus for high-throughput proteomic analyses. These and other unique properties of MALDI offer new possibilities in applications such as rapid molecular profiling and imaging by MS. Proteomics and its employment in Systems Biology and other areas that require sensitive and high-throughput bioanalytical techniques greatly depend on these methodologies. This chapter provides a basic introduction to the MALDI methodology and its general application in proteomic research. It describes the basic MALDI sample-preparation steps and two easy-to-follow examples for protein identification, including extensive notes on these topics with practical tips that are often not available in Subheadings 2 and 3 of research articles.
Abstract:
Cardiovascular diseases are the chief causes of death in the UK, and are associated with high circulating levels of total cholesterol in the plasma. Artichoke leaf extracts (ALEs) have been reported to reduce plasma lipid levels, including total cholesterol, although high-quality data are lacking. The objective of this trial was to assess the effect of ALE on plasma lipid levels and general well-being in otherwise healthy adults with mild to moderate hypercholesterolemia. 131 adults were screened for total plasma cholesterol in the range 6.0-8.0 mmol/l, with 75 suitable volunteers randomised onto the trial. Volunteers consumed 1280 mg of a standardised ALE, or matched placebo, daily for 12 weeks. Plasma total cholesterol decreased in the treatment group by an average of 4.2% (from 7.16 (SD 0.62) mmol/l to 6.86 (SD 0.68) mmol/l) and increased in the control group by an average of 1.9% (from 6.90 (SD 0.49) mmol/l to 7.03 (SD 0.61) mmol/l), the difference between groups being statistically significant (p = 0.025). No significant differences between groups were observed for LDL cholesterol, HDL cholesterol or triglyceride levels. General well-being improved significantly in both the treatment (11%) and control groups (9%), with no significant differences between groups. In conclusion, ALE consumption resulted in a modest but favourable, statistically significant difference in total cholesterol after 12 weeks. In comparison with a previous trial, it is suggested that the apparent positive health status of the study population may have contributed to the modesty of the observed response. (C) 2008 Elsevier GmbH. All rights reserved.
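The reported percentage changes follow directly from the quoted group means; a quick arithmetic check (using only the figures stated above):

```python
def pct_change(before: float, after: float) -> float:
    """Percentage change from a baseline value."""
    return (after - before) / before * 100

treatment = pct_change(7.16, 6.86)   # roughly -4.2%, the reported fall
control = pct_change(6.90, 7.03)     # roughly +1.9%, the reported rise
```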
Abstract:
The Self-Organizing Map (SOM) is a popular unsupervised neural network that provides effective clustering and data visualization for data represented in multidimensional input spaces. In this paper, we describe the Fast Learning SOM (FLSOM), which adopts a learning algorithm that improves on the standard SOM with respect to convergence time in the training phase. We show that FLSOM also improves the quality of the map, providing better clustering quality and topology preservation of multidimensional input data. Several tests carried out on different multidimensional datasets demonstrate the better performance of the algorithm in comparison with the original SOM.
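For readers unfamiliar with the baseline being improved on, a minimal standard SOM looks roughly like this (this is the classic algorithm, not FLSOM; the grid size, decay schedules, and seed are arbitrary choices). The convergence-time cost that FLSOM targets sits in the repeated per-sample best-matching-unit search and neighbourhood update.

```python
import numpy as np

def train_som(data, grid=(8, 8), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
    """Minimal SOM: prototype vectors on a 2-D grid are pulled toward each
    sample, with neighbours updated via a shrinking Gaussian kernel."""
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.random((h, w, data.shape[1]))
    coords = np.stack(np.meshgrid(np.arange(h), np.arange(w),
                                  indexing='ij'), axis=-1)
    n_steps, t = epochs * len(data), 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            lr = lr0 * (1 - t / n_steps)               # decaying learning rate
            sigma = sigma0 * (1 - t / n_steps) + 0.5   # shrinking neighbourhood
            # best-matching unit: grid cell whose prototype is closest to x
            d = np.linalg.norm(weights - x, axis=-1)
            bmu = np.unravel_index(np.argmin(d), d.shape)
            # Gaussian neighbourhood on the grid around the BMU
            g = np.exp(-np.sum((coords - np.array(bmu))**2, axis=-1)
                       / (2 * sigma**2))
            weights += lr * g[..., None] * (x - weights)
            t += 1
    return weights
```

FLSOM's contribution, per the abstract, is an adapted learning schedule that reaches a good map in fewer such passes.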
Abstract:
A 24-member ensemble of 1-h high-resolution forecasts over the southern United Kingdom is used to study short-range forecast-error statistics. The initial conditions are found from perturbations generated by an ensemble transform Kalman filter. Forecasts from this system are assumed to lie within the bounds of forecast error of an operational forecast system. Although noisy, the system is capable of producing physically reasonable statistics, which are analysed and compared to statistics implied by a variational assimilation system. The variances of temperature errors, for instance, show structures that reflect convective activity. Some variables, notably potential-temperature and specific-humidity perturbations, have autocorrelation functions that deviate from 3-D isotropy at the convective scale (horizontal scales less than 10 km). Other variables, notably the velocity potential for horizontal-divergence perturbations, maintain 3-D isotropy at all scales. Geostrophic and hydrostatic balance are studied by examining correlations between terms in the divergence and vertical-momentum equations, respectively. Both balances are found to decay as the horizontal scale decreases. It is estimated that geostrophic balance becomes less important at scales smaller than 75 km, and hydrostatic balance at scales smaller than 35 km, although more work is required to validate these findings. The implications of these results for high-resolution data assimilation are discussed.
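A hedged sketch of how such error statistics are estimated from an ensemble (the sizes and the white-noise perturbations are invented stand-ins; the real system uses model fields): per-point variances come from squared member deviations, and correlation functions from sample correlations between points.

```python
import numpy as np

rng = np.random.default_rng(7)
n_members, n_points = 24, 50                    # 24-member ensemble, 50 grid points
members = rng.standard_normal((n_members, n_points))

# Deviations from the ensemble mean give sample error statistics
dev = members - members.mean(axis=0)
variance = (dev**2).sum(axis=0) / (n_members - 1)   # per-point error variance
correlation = np.corrcoef(dev.T)                    # point-to-point correlations
```

With only 24 members such estimates are noisy, which is why the abstract stresses that the statistics, while physically reasonable, require care to interpret.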