44 results for discovery driven analysis


Relevance: 30.00%

Abstract:

Atmosphere–ocean general circulation models (AOGCMs) predict a weakening of the Atlantic meridional overturning circulation (AMOC) in response to anthropogenic forcing of climate, but there is a large model uncertainty in the magnitude of the predicted change. The weakening of the AMOC is generally understood to be the result of increased buoyancy input to the North Atlantic in a warmer climate, leading to reduced convection and deep water formation. Consistent with this idea, model analyses have shown empirical relationships between the AMOC and the meridional density gradient, but this link is not direct because the large-scale ocean circulation is essentially geostrophic, making currents and pressure gradients orthogonal. Analysis of the budget of kinetic energy (KE) instead of momentum has the advantage of excluding the dominant geostrophic balance. Diagnosis of the KE balance of the HadCM3 AOGCM and its low-resolution version FAMOUS shows that, in the global mean of the steady-state control climate, KE is supplied to the ocean by the wind and dissipated by viscous forces, and the circulation does work against the pressure-gradient force, mainly in the Southern Ocean. In the Atlantic Ocean, however, the pressure-gradient force does work on the circulation, especially in the high-latitude regions of deep water formation. During CO2-forced climate change, we demonstrate a very good temporal correlation between the AMOC strength and the rate of KE generation by the pressure-gradient force in the 50–70°N band of the Atlantic Ocean in each of nine contemporary AOGCMs, supporting a buoyancy-driven interpretation of AMOC changes. To account for this, we describe a conceptual model, which offers an explanation of why AOGCMs with stronger overturning in the control climate tend to have a larger weakening under CO2 increase.
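
A minimal numerical sketch of the diagnostic just described. The AMOC and KE-generation series are synthetic stand-ins for AOGCM output, and the proportionality between them is assumed purely for illustration:

```python
# Sketch: temporal correlation between AMOC strength and the rate of KE
# generation by the pressure-gradient force, as diagnosed in the abstract.
# In practice both series would be computed from AOGCM fields over the
# 50-70N Atlantic; here they are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)
nt = 140                                                      # years of a forced run
amoc = 18.0 - 0.05 * np.arange(nt) + rng.normal(0, 0.5, nt)   # Sv, weakening trend
ke_gen = 0.4 * amoc + rng.normal(0, 1.0, nt)                  # assumed proportional

r = np.corrcoef(amoc, ke_gen)[0, 1]                           # temporal correlation
print(f"corr(AMOC, KE generation) = {r:.2f}")
```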

Relevance: 30.00%

Abstract:

Automatic keyword or keyphrase extraction is concerned with assigning keyphrases to documents based on words from within the document. Previous studies have shown that in a significant number of cases author-supplied keywords are not appropriate for the document to which they are attached. This can be either because they represent what the author believes a paper is about rather than what it actually is, or because they include keyphrases that are more classificatory than explanatory, e.g., “University of Poppleton” instead of “Knowledge Discovery in Databases”. Thus, there is a need for a system that can generate an appropriate and diverse range of keyphrases that reflect the document. This paper proposes two possible solutions that examine the synonyms of words and phrases in the document to find the underlying themes, and presents these as appropriate keyphrases. Using three different freely available thesauri, the work examines two different methods of producing keywords and compares their outcomes. The primary method takes n-grams of the source document phrases and examines their synonyms, while the secondary groups outputs by their synonyms. The experiments undertaken show that the primary method produces good results and that the secondary method produces both good results and potential for future work. In addition, the different qualities of the thesauri are examined, and it is concluded that the more entries a thesaurus has, the better it is likely to perform; neither the age of the thesaurus nor the size of each entry correlates with performance.
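
As an illustration of the primary method (n-grams expanded through a thesaurus), here is a minimal sketch using WordNet via NLTK as a stand-in for the three thesauri used in the paper; the `synonym_key` grouping rule is invented for the example, not taken from the paper:

```python
# Sketch: extract n-grams from a document and group them under a canonical
# synonym so thematically related phrases are counted together.
from collections import Counter
from nltk.corpus import wordnet  # requires nltk.download('wordnet') once

def ngrams(tokens, n):
    return [" ".join(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def synonym_key(phrase):
    # Map a phrase to the first lemma of its first synset, if any; this is
    # an illustrative grouping rule, not the paper's exact procedure.
    syns = wordnet.synsets(phrase.replace(" ", "_"))
    return syns[0].lemmas()[0].name().replace("_", " ") if syns else phrase

def candidate_keyphrases(text, n=2, top=5):
    tokens = [t.lower() for t in text.split() if t.isalpha()]
    counts = Counter(synonym_key(g) for g in ngrams(tokens, n))
    return counts.most_common(top)

print(candidate_keyphrases("knowledge discovery in databases uses data mining "
                           "and knowledge discovery techniques"))
```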

Relevance: 30.00%

Abstract:

Current methods for estimating event-related potentials (ERPs) assume stationarity of the signal. Empirical Mode Decomposition (EMD) is a data-driven decomposition technique that does not assume stationarity. We evaluated an EMD-based method for estimating the ERP. On simulated data, EMD substantially reduced background EEG while retaining the ERP. EMD-denoised single trials also estimated the shape, amplitude, and latency of the ERP better than raw single trials. On experimental data, EMD-denoised trials revealed event-related differences between two conditions (conditions A and B) more effectively than trials low-pass filtered at 40 Hz. EMD also revealed event-related differences in both conditions that were clearer and of longer duration than those revealed by low-pass filtering at 40 Hz. Thus, EMD-based denoising is a promising data-driven, nonstationary method for estimating ERPs and should be investigated further.
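
A minimal sketch of EMD-based denoising in the spirit of the abstract, using the third-party PyEMD package rather than the authors' implementation; the choice of which IMFs to discard is an assumption for illustration:

```python
# Sketch: decompose a noisy single trial into intrinsic mode functions (IMFs)
# and rebuild it from the slower IMFs assumed to carry the ERP.
import numpy as np
from PyEMD import EMD  # pip install EMD-signal

fs = 250                                    # sampling rate in Hz (assumed)
t = np.arange(0, 1, 1 / fs)
rng = np.random.default_rng(1)
erp = np.exp(-((t - 0.3) ** 2) / 0.005)     # toy ERP-like component
trial = erp + rng.normal(0, 0.8, t.size)    # single trial = ERP + background EEG

imfs = EMD()(trial)                         # data-driven, no stationarity assumed
denoised = imfs[2:].sum(axis=0)             # drop fastest IMFs (assumed noise)
print(imfs.shape, np.corrcoef(denoised, erp)[0, 1])
```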

Relevance: 30.00%

Abstract:

Natural ventilation relies on less controllable natural driving forces and therefore needs more artificial control, so its prediction, design and analysis become more important. This paper presents both theoretical analysis and numerical simulations for predicting the natural ventilation flow in a two-zone building with multiple openings subject to combined natural driving forces. To our knowledge, these are the first analytical solutions obtained for a building with more than one zone in which each zone may have more than two openings. The analytical solutions offer a means of validating multi-zone airflow programs. The computer program MIX is employed to conduct the numerical simulations, and good agreement between the two approaches is achieved. Different airflow modes are identified and some design recommendations are provided.
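
The paper's analytical solutions are not reproduced here, but the underlying multi-zone pressure-network idea can be sketched: each opening obeys the orifice equation Q = Cd·A·sqrt(2|ΔP|/ρ), and zone pressures are solved so that flow is conserved. All geometry, coefficients and driving pressures below are invented, with buoyancy and wind effects folded into fixed outdoor pressures:

```python
# Sketch: solve the internal pressures of a two-zone building so that the
# volume flow into each zone balances the flow out.
import numpy as np
from scipy.optimize import fsolve

RHO, CD = 1.2, 0.61  # air density (kg/m^3), discharge coefficient

def flow(dp, area):
    # signed volume flow through one opening (m^3/s), positive into the zone
    return np.sign(dp) * CD * area * np.sqrt(2 * abs(dp) / RHO)

def residual(p):
    p1, p2 = p
    # zone 1: outdoor opening (driving pressure 5 Pa) plus interzonal opening
    q1 = flow(5.0 - p1, 0.5) + flow(p2 - p1, 0.8)
    # zone 2: interzonal opening plus outdoor opening at 0 Pa
    q2 = flow(p1 - p2, 0.8) + flow(0.0 - p2, 0.5)
    return [q1, q2]

p1, p2 = fsolve(residual, [1.0, 0.5])
print(f"zone pressures: {p1:.2f} Pa, {p2:.2f} Pa")
```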

Relevance: 30.00%

Abstract:

Our aim is to reconstruct the brain-body loop of stroke patients via an EEG-driven robotic system. After detecting motor command generation, the robotic arm should assist the patient’s movement at the correct moment and in a natural way. In this study we performed EEG measurements on healthy subjects performing discrete spontaneous motions. An EEG analysis based on the temporal correlation of brain activity was employed to determine the onset of motor command generation for single motions.
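
The abstract does not detail its correlation analysis, so the following is only a generic illustration of correlation-based onset detection: a movement-related template is slid over the EEG and the first window whose correlation exceeds a threshold is flagged. The template, threshold and data are all invented:

```python
# Sketch: sliding-window correlation of EEG against a movement-related
# template, flagging the first window that exceeds a threshold.
import numpy as np

def detect_onset(eeg, template, threshold=0.6):
    w = len(template)
    for i in range(len(eeg) - w):
        if np.corrcoef(eeg[i:i + w], template)[0, 1] > threshold:
            return i
    return None

rng = np.random.default_rng(2)
template = np.linspace(0.0, -1.0, 100)   # toy readiness-potential-like ramp
eeg = rng.normal(0.0, 0.2, 1000)
eeg[600:700] += template                 # embed a "motor command" at sample 600
print(detect_onset(eeg, template))       # detected onset near sample 600
```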

Relevance: 30.00%

Abstract:

The two-way relationship between Rossby wave breaking (RWB) and the intensification of extratropical cyclones is analysed over the Euro-Atlantic sector. In particular, the timing, intensity and location of cyclone development are related to RWB occurrences. For this purpose, two potential-temperature-based indices are used to detect and classify anticyclonic and cyclonic RWB episodes from ERA-40 reanalysis data. Results show that explosive cyclogenesis over the North Atlantic (NA) is fostered by enhanced occurrence of RWB on days prior to the cyclone’s maximum intensification. Under such conditions, the eddy-driven jet stream is accelerated over the NA, thus enhancing conditions for cyclogenesis. For explosive cyclogenesis over the eastern NA, enhanced cyclonic RWB over eastern Greenland and anticyclonic RWB over the subtropical NA are observed. Typically only one of these is present in any given case, with the RWB over eastern Greenland being more frequent than its southern counterpart. This leads to an intensification of the jet over the eastern NA and an enhanced probability of windstorms reaching Western Europe. Explosive cyclones evolving under simultaneous RWB on both sides of the jet feature higher mean intensity and deepening rates than cyclones preceded by a single RWB event. Explosive developments over the western NA are typically linked to a single area of enhanced cyclonic RWB over western Greenland; here, the eddy-driven jet is accelerated over the western NA. Enhanced occurrence of cyclonic RWB over southern Greenland and anticyclonic RWB over Europe is also observed after explosive cyclogenesis, potentially leading to the onset of Scandinavian blocking. However, only very intense developments have a considerable influence on the large-scale atmospheric flow. Non-explosive cyclones show no sign of enhanced RWB over the whole NA area. We conclude that the links between RWB and cyclogenesis over the Euro-Atlantic sector are sensitive to the cyclone’s maximum intensity, deepening rate and location.

Relevance: 30.00%

Abstract:

This study focuses on the analysis of winter (October-November-December-January-February-March; ONDJFM) storm events and their changes under increased anthropogenic greenhouse gas concentrations over Europe. In order to assess uncertainties due to model formulation, 4 regional climate models (RCMs) with 5 high-resolution experiments and 4 global general circulation models (GCMs) are considered. Firstly, cyclone systems, as synoptic-scale processes in winter, are investigated, since they are a principal cause of extreme, damage-causing wind speeds. This is achieved by applying an objective cyclone identification and tracking algorithm to the GCMs. Secondly, changes in extreme near-surface wind speeds are analysed. Based on percentile thresholds, the studied extreme wind speed indices allow a consistent analysis over Europe that takes systematic deviations of the models into account. Relative changes in both intensity and frequency of extreme winds and their related uncertainties are assessed and related to changing patterns of extreme cyclones. A common feature of all investigated GCMs is a reduced track density over central Europe under climate change conditions when all systems are considered. If only extreme (i.e. the strongest 5%) cyclones are taken into account, an increased cyclone activity for western parts of central Europe is apparent; however, the climate change signal shows reduced spatial coherency when compared to all systems, with partially contrary results. With respect to extreme wind speeds, significant positive changes in intensity and frequency are obtained over at least 3% and 20% of the European domain under study (35–72°N and 15°W–43°E), respectively. The location and extension of the affected areas (up to 60% and 50% of the domain for intensity and frequency, respectively), as well as the magnitude of the changes (up to +15% and +200% for intensity and frequency, respectively), are shown to be highly dependent on the driving GCM, whereas differences between RCMs driven by the same GCM are relatively small.
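
A sketch of a percentile-threshold extreme wind index of the kind described, where each model's own percentile serves as the threshold so that systematic model biases cancel; the 98th percentile, the synthetic data and the uniform 3% scaling are illustrative assumptions:

```python
# Sketch: relative change in the frequency of extreme daily wind speeds,
# defined by exceedance of the control run's local 98th percentile.
import numpy as np

rng = np.random.default_rng(3)
# synthetic daily wind speeds: (time, lat, lon) for control and scenario runs
ctrl = rng.weibull(2.0, (3600, 10, 12)) * 8.0
scen = ctrl * 1.03                                  # crude stand-in for change

thr = np.percentile(ctrl, 98, axis=0)               # model-specific threshold
freq_ctrl = (ctrl > thr).mean(axis=0)               # ~2% by construction
freq_scen = (scen > thr).mean(axis=0)
change = 100 * (freq_scen - freq_ctrl) / freq_ctrl  # relative change in %
print(f"mean frequency change: {change.mean():+.1f}%")
```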

Relevance: 30.00%

Abstract:

In the past decade, the analysis of data has faced the challenge of dealing with very large and complex datasets and with the real-time generation of data. Technologies to store and access these complex and large datasets are in place. However, robust and scalable analysis technologies are needed to extract meaningful information from them. The research field of Information Visualization and Visual Data Analytics addresses this need. Information visualization and data mining are often used to complement each other. Their common goal is the extraction of meaningful information from complex and possibly large data. However, whereas data mining focuses on the use of silicon hardware, visualization techniques also aim to harness the powerful image-processing capabilities of the human brain. This article highlights research on data visualization and visual analytics techniques. Furthermore, we highlight existing visual analytics techniques, systems, and applications, including a perspective on the field from the chemical process industry.

Relevance: 30.00%

Abstract:

In this paper we investigate the price discovery process in single-name credit spreads obtained from bond, credit default swap (CDS), equity and equity option prices. We analyse short-term price discovery by modelling daily changes in credit spreads in the four markets with a vector autoregressive model (VAR). We also look at price discovery in the long run with a vector error correction model (VECM). We find that in the short term the option market clearly leads the other markets during the subprime crisis (2007-2009). During the less severe sovereign debt crisis (2009-2012) and the pre-crisis period, options are still important but CDSs become more prominent. In the long run, deviations from the equilibrium relationship with the option market still lead to adjustments in the credit spreads observed or implied from the other markets. However, options no longer dominate price discovery in any of the periods considered. Our findings have implications for traders, credit risk managers and financial regulators.
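
A sketch of the two-step design using statsmodels, with simulated stand-ins for the four credit-spread series; the lag orders and cointegration rank are illustrative choices, not the paper's:

```python
# Sketch: VAR on daily spread changes (short-run price discovery) and VECM
# on spread levels (long-run equilibrium adjustment).
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR
from statsmodels.tsa.vector_ar.vecm import VECM

rng = np.random.default_rng(4)
common = rng.normal(0, 1, 500).cumsum()          # shared credit-risk factor
levels = pd.DataFrame({m: common + rng.normal(0, 0.5, 500)
                       for m in ["bond", "cds", "equity", "option"]})

var_res = VAR(levels.diff().dropna()).fit(maxlags=5, ic="aic")  # short run
vecm_res = VECM(levels, k_ar_diff=2, coint_rank=1).fit()        # long run
print("VAR lag order:", var_res.k_ar)
print("VECM loadings (alpha):\n", vecm_res.alpha.round(3))
```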

Relevance: 30.00%

Abstract:

The INSIG2 rs7566605 polymorphism was identified for obesity (BMI ≥ 30 kg/m²) in one of the first genome-wide association studies, but replications were inconsistent. We collected statistics from 34 studies (n = 74,345), including general population (GP) studies, population-based studies with subjects selected for conditions related to a better health status ('healthy population', HP), and obesity studies (OB). We tested five hypotheses to explore potential sources of heterogeneity. The meta-analysis of 27 studies on Caucasian adults (n = 66,213) combining the different study designs did not support overall association of the CC-genotype with obesity, yielding an odds ratio (OR) of 1.05 (p-value = 0.27). The I² measure of 41% (p-value = 0.015) indicated between-study heterogeneity. Restricting to GP studies resulted in a declined I² measure of 11% (p-value = 0.33) and an OR of 1.10 (p-value = 0.015). Regarding the five hypotheses, our data showed (a) some difference between GP and HP studies (p-value = 0.012) and (b) an association in extreme comparisons (BMI ≥ 32.5, 35.0, 37.5, or 40.0 kg/m² versus BMI < 25 kg/m²) yielding ORs of 1.16, 1.18, 1.22, or 1.27 (p-values 0.001 to 0.003), which was also underscored by significantly increased CC-genotype frequencies across BMI categories (10.4% to 12.5%, p-value for trend = 0.0002). We did not find evidence for differential ORs (c) among studies with higher than average obesity prevalence compared to lower, (d) among studies with BMI assessment after the year 2000 compared to those before, or (e) among studies from older populations compared to younger. Analysis of non-Caucasian adults (n = 4889) or children (n = 3243) yielded ORs of 1.01 (p-value = 0.94) or 1.15 (p-value = 0.22), respectively. There was no evidence for overall association of the rs7566605 polymorphism with obesity. Our data suggested an association with extreme degrees of obesity, and consequently heterogeneous effects from different study designs may mask an underlying association when unaccounted for. The importance of study design might be under-recognized in gene discovery and association replication so far.
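
The I² heterogeneity measure quoted above can be computed from per-study log odds ratios via Cochran's Q; the ORs and standard errors below are invented for illustration:

```python
# Sketch: Cochran's Q and I^2 from per-study log odds ratios with
# inverse-variance weights.
import numpy as np

log_or = np.log([1.02, 1.10, 0.95, 1.25, 1.08])   # per-study odds ratios
se = np.array([0.05, 0.08, 0.06, 0.10, 0.07])     # their standard errors

w = 1.0 / se**2                                   # inverse-variance weights
pooled = (w * log_or).sum() / w.sum()             # fixed-effect pooled log OR
Q = (w * (log_or - pooled) ** 2).sum()            # Cochran's Q
df = len(log_or) - 1
I2 = max(0.0, (Q - df) / Q) * 100                 # I^2 = (Q - df)/Q, in %
print(f"pooled OR = {np.exp(pooled):.2f}, I^2 = {I2:.0f}%")
```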

Relevance: 30.00%

Abstract:

We present an efficient method of combining wide-angle neutron scattering data with detailed atomistic models, allowing us to perform a quantitative and qualitative mapping of the organisation of the chain conformation in both glass and liquid phases. The structural refinement method presented in this work is based on the exploitation of the intrachain features of the diffraction pattern and its intimate linkage with atomistic models through the use of internal coordinates for bond lengths, valence angles and torsion rotations. Atomic connectivity is defined through these coordinates, which are in turn assigned by pre-defined probability distributions, allowing the models in question to be built stochastically. Incremental variation of these coordinates allows for the construction of models that minimise the differences between the observed and calculated structure factors. We present a series of neutron scattering data of 1,2-polybutadiene over the range 120-400 K. Analysis of the experimental data yields bond lengths for C-C and C=C of 1.54 Å and 1.35 Å respectively. Backbone valence angles were found to be 112°, and the torsion distributions are characterised by five rotational states: a three-fold trans-skew± for the backbone and gauche± for the vinyl group. Rotational states of the vinyl group were found to be equally populated, indicating a largely atactic chain. The two backbone torsion angles exhibit different temperature dependence of their trans populations, with one of them adopting an almost all-trans sequence. Consequently the resulting configuration leads to a rather persistent chain, as indicated by the value of the characteristic ratio extrapolated from the model. We compare our results with theoretical predictions, computer simulations, RIS models and previously reported experimental results.
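
A sketch of the stochastic chain-building step: each new backbone atom is placed from internal coordinates using the standard internal-to-Cartesian construction, with the bond length and valence angle taken from the abstract and a simplified three-state torsion distribution; vinyl side groups and the C=C bonds are omitted:

```python
# Sketch: build a backbone stochastically from internal coordinates
# (bond length, valence angle, torsion) drawn from simple distributions.
import numpy as np

def place_atom(a, b, c, bond, angle, torsion):
    """Position a new atom from three predecessors and internal coordinates."""
    bc = (c - b) / np.linalg.norm(c - b)
    n = np.cross(b - a, bc); n /= np.linalg.norm(n)
    m = np.cross(n, bc)
    d = bond * np.array([-np.cos(angle),
                         np.sin(angle) * np.cos(torsion),
                         np.sin(angle) * np.sin(torsion)])
    return c + d[0] * bc + d[1] * m + d[2] * n

rng = np.random.default_rng(5)
chain = [np.zeros(3), np.array([1.54, 0, 0]), np.array([2.05, 1.44, 0])]
states = np.deg2rad([180.0, 60.0, -60.0])   # trans / skew+- states (assumed)
for _ in range(50):
    chain.append(place_atom(*chain[-3:], bond=1.54,
                            angle=np.deg2rad(112.0),
                            torsion=rng.choice(states)))
print(len(chain), np.linalg.norm(chain[-1] - chain[0]))  # end-to-end distance
```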

Relevance: 30.00%

Abstract:

The global vegetation response to climate and atmospheric CO2 changes between the last glacial maximum and recent times is examined using an equilibrium vegetation model (BIOME4), driven by output from 17 climate simulations from the Palaeoclimate Modelling Intercomparison Project. Features common to all of the simulations include expansion of treeless vegetation in high northern latitudes; southward displacement and fragmentation of boreal and temperate forests; and expansion of drought-tolerant biomes in the tropics. These features are broadly consistent with pollen-based reconstructions of vegetation distribution at the last glacial maximum. Glacial vegetation in high latitudes reflects cold and dry conditions due to the low CO2 concentration and the presence of large continental ice sheets. The extent of drought-tolerant vegetation in tropical and subtropical latitudes reflects a generally drier low-latitude climate. Comparisons of the observations with BIOME4 simulations, with and without consideration of the direct physiological effect of CO2 concentration on C3 photosynthesis, suggest an important additional role of low CO2 concentration in restricting the extent of forests, especially in the tropics. Global forest cover was overestimated by all models when climate change alone was used to drive BIOME4, and estimated more accurately when physiological effects of CO2 concentration were included. This result suggests that both CO2 effects and climate effects were important in determining glacial-interglacial changes in vegetation. More realistic simulations of glacial vegetation and climate will need to take into account the feedback effects of these structural and physiological changes on the climate.

Relevance: 30.00%

Abstract:

Matrix-assisted laser desorption/ionisation (MALDI) mass spectrometry (MS) is a highly versatile and sensitive analytical technique, known for its soft ionisation of biomolecules such as peptides and proteins. Generally, MALDI MS analysis requires little sample preparation, and in some cases, such as MS profiling, it can be automated through the use of robotic liquid-handling systems. For more than a decade now, MALDI MS has been extensively utilised in the search for biomarkers that could aid clinicians in diagnosis, prognosis, and treatment decision making. This review examines the various MALDI-based MS techniques, such as MS imaging, MS profiling and in-depth proteomics analysis in which MALDI MS follows fractionation and separation methods such as gel electrophoresis, and how these have contributed to prostate cancer biomarker research. This article is part of a Special Issue entitled: Biomarkers: A Proteomic Challenge.

Relevance: 30.00%

Abstract:

Seamless phase II/III clinical trials are conducted in two stages, with treatment selection at the first stage. In the first stage, patients are randomized to a control or one of k > 1 experimental treatments. At the end of this stage, interim data are analysed and a decision is made concerning which experimental treatment should continue to the second stage. If the primary endpoint is observable only after some period of follow-up, data may be available at the interim analysis on some early outcome for a larger number of patients than those for whom the primary endpoint is available. These early endpoint data can thus be used for treatment selection. For two previously proposed approaches, the power has been shown to be greater for one or the other method depending on the true treatment effects and correlations. We propose a new approach that builds on the previously proposed approaches and uses the data available at the interim analysis to estimate these parameters and then, on the basis of these estimates, chooses the treatment selection method with the highest probability of correctly selecting the most effective treatment. This method is shown to perform well compared with the two previously described methods for a wide range of true parameter values. In most cases the performance of the new method is similar to, and in some cases better than, that of either of the two previously proposed methods.
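
A simulation sketch of the adaptive idea: at the interim, estimated effects and correlation are plugged into simulations of two candidate selection rules, and the rule with the higher estimated probability of correct selection is used. The two rules, sample sizes and effect values are simplifications, not the paper's exact methods:

```python
# Sketch: compare two interim treatment-selection rules by their simulated
# probability of picking the truly best experimental treatment.
import numpy as np

rng = np.random.default_rng(6)

def simulate_interim(mu_e, mu_f, rho, n_e=100, n_f=30):
    """One interim data set: early endpoint on n_e patients per arm,
    correlated final endpoint on only n_f of them."""
    k = len(mu_e)
    z1 = rng.normal(size=(n_f, k))
    z2 = rho * z1 + np.sqrt(1 - rho**2) * rng.normal(size=(n_f, k))
    extra = rng.normal(size=(n_e - n_f, k))          # early-only patients
    early = mu_e + np.vstack([z1, extra]).mean(axis=0)
    final = mu_f + z2.mean(axis=0)
    return early, final

def prob_correct(rule, mu_e, mu_f, rho, sims=2000):
    best = int(np.argmax(mu_f))
    return np.mean([rule(*simulate_interim(mu_e, mu_f, rho)) == best
                    for _ in range(sims)])

rule_early = lambda e, f: int(np.argmax(e))   # select on the early endpoint
rule_final = lambda e, f: int(np.argmax(f))   # select on sparse final data

# interim estimates of effects and correlation (invented numbers)
mu_e, mu_f, rho = np.array([0.1, 0.3]), np.array([0.05, 0.25]), 0.6
p_e = prob_correct(rule_early, mu_e, mu_f, rho)
p_f = prob_correct(rule_final, mu_e, mu_f, rho)
print(f"P(correct): early rule {p_e:.2f}, final rule {p_f:.2f}")
```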

Relevance: 30.00%

Abstract:

The concept of being ‘patient-centric’ is a challenge to many existing healthcare service provision practices. This paper focuses on the issue of referrals, where multiple stakeholders, i.e. general practitioners and patients, are encouraged to make a consensual decision based on patient needs. In this paper, we present an ontology-enabled healthcare service provision model, which facilitates both patients and GPs in jointly deciding upon the referral decision. In the healthcare service provision model, we define three types of profile, which represent different stakeholders’ requirements. The model also comprises a set of healthcare service discovery processes: articulating a service need, matching the need with the healthcare service offerings, and deciding on a best-fit service for acceptance. As a result, the healthcare service provision can carry out coherent analysis using personalised information and iterative processes that deal with requirements change over time.
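
A toy sketch of the three-step discovery process (articulate a need, match against offerings, choose a best fit), with profiles reduced to simple attribute sets; real ontology-based matching would involve semantic reasoning rather than plain set overlap:

```python
# Sketch: score healthcare service offerings against a patient's stated need
# and pick the best-fit service.
def match_score(need, offering):
    """Fraction of the patient's required attributes an offering satisfies."""
    return len(need & offering) / len(need)

need = {"cardiology", "weekend_hours", "nearby"}           # patient profile
offerings = {
    "clinic_a": {"cardiology", "nearby"},
    "clinic_b": {"cardiology", "weekend_hours", "nearby"},
    "clinic_c": {"dermatology", "nearby"},
}
best = max(offerings, key=lambda s: match_score(need, offerings[s]))
print(best)   # clinic_b
```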