103 results for Gradient-based approaches

at Université de Lausanne, Switzerland


Relevance:

100.00%

Publisher:

Abstract:

High-resolution tomographic imaging of the shallow subsurface is becoming increasingly important for a wide range of environmental, hydrological and engineering applications. Because of their superior resolution power, their sensitivity to pertinent petrophysical parameters, and their far-reaching complementarities, both seismic and georadar crosshole imaging are of particular importance. To date, corresponding approaches have largely relied on asymptotic, ray-based methods, which only account for a very small part of the observed wavefields, inherently suffer from limited resolution, and in complex environments may prove to be inadequate. These problems can potentially be alleviated through waveform inversion. We have developed an acoustic waveform inversion approach for crosshole seismic data whose kernel is based on a finite-difference time-domain (FDTD) solution of the 2-D acoustic wave equations. This algorithm is tested on and applied to synthetic data from seismic velocity models of increasing complexity and realism, and the results are compared to those obtained using state-of-the-art ray-based traveltime tomography. Regardless of the heterogeneity of the underlying models, the waveform inversion approach has the potential to reliably resolve both the geometry and the acoustic properties of features smaller than half a dominant wavelength. Our results do, however, also indicate that, within their inherent resolution limits, ray-based approaches provide an effective and efficient means to obtain satisfactory tomographic reconstructions of the seismic velocity structure in the presence of mild to moderate heterogeneity and in the absence of strong scattering. Conversely, the excess effort of waveform inversion provides the greatest benefits for the most heterogeneous, and arguably most realistic, environments where multiple scattering effects tend to be prevalent and ray-based methods lose most of their effectiveness.
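The FDTD kernel mentioned in this abstract can be sketched as follows. This is a minimal, illustrative second-order explicit scheme for the 2-D acoustic wave equation, not the authors' actual code; the velocity model, grid parameters and impulsive source are invented for the example.

```python
import numpy as np

def fdtd_acoustic_2d(c, nt, dt, dx, src_pos, src):
    """Second-order FDTD time stepping of the 2-D acoustic wave equation.

    c        : 2-D array of acoustic velocities (m/s)
    nt       : number of time steps
    dt, dx   : time step (s) and grid spacing (m); stability requires
               c.max() * dt / dx <= 1 / sqrt(2)
    src_pos  : (i, j) grid index of the point source
    src      : source time function of length nt
    """
    nz, nx = c.shape
    p_prev = np.zeros((nz, nx))          # pressure at t - dt
    p_curr = np.zeros((nz, nx))          # pressure at t
    courant2 = (c * dt / dx) ** 2        # squared Courant number per cell
    for it in range(nt):
        lap = np.zeros((nz, nx))         # five-point Laplacian (interior only)
        lap[1:-1, 1:-1] = (p_curr[2:, 1:-1] + p_curr[:-2, 1:-1]
                           + p_curr[1:-1, 2:] + p_curr[1:-1, :-2]
                           - 4.0 * p_curr[1:-1, 1:-1])
        p_next = 2.0 * p_curr - p_prev + courant2 * lap
        p_next[src_pos] += src[it] * dt ** 2   # inject the source term
        p_prev, p_curr = p_curr, p_next
    return p_curr

# minimal demo: impulse source in a homogeneous 1500 m/s medium
c = np.full((60, 60), 1500.0)
src = np.zeros(80)
src[0] = 1.0
snapshot = fdtd_acoustic_2d(c, 80, 4e-4, 1.0, (30, 30), src)
```

A full waveform inversion would wrap such a forward solver in an iterative gradient-based update of the velocity model; the sketch above only shows the forward-modelling kernel.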

Relevance:

90.00%

Publisher:

Abstract:

The amount of sequence data available today greatly facilitates access to genes from many gene families. Primers amplifying the desired genes over a range of species are readily obtained by aligning conserved gene regions, and laborious gene isolation procedures can often be replaced by quicker PCR-based approaches. However, in the case of multigene families, PCR-based approaches bear the often ignored risk of incomplete isolation of family members. This problem is most prominent in gene families with a highly variable, and thus unpredictable, number of gene copies among species, such as the major histocompatibility complex (MHC). In this study, we (i) report new primers for the isolation of the MHC class IIB (MHCIIB) gene family in birds and (ii) share our experience with isolating MHCIIB genes from an unprecedented number of avian species from all over the avian phylogeny. We report important and usually underappreciated problems encountered during PCR-based multigene family isolation and provide a collection of measures to help significantly improve the chance of successfully isolating complete multigene families using PCR-based approaches.

Relevance:

90.00%

Publisher:

Abstract:

Aim: Recently developed parametric methods in historical biogeography allow researchers to integrate temporal and palaeogeographical information into the reconstruction of biogeographical scenarios, thus overcoming a known bias of parsimony-based approaches. Here, we compare a parametric method, dispersal-extinction-cladogenesis (DEC), against a parsimony-based method, dispersal-vicariance analysis (DIVA), which does not incorporate branch lengths but accounts for phylogenetic uncertainty through a Bayesian empirical approach (Bayes-DIVA). We analyse the benefits and limitations of each method using the cosmopolitan plant family Sapindaceae as a case study.

Location: World-wide.

Methods: Phylogenetic relationships were estimated by Bayesian inference on a large dataset representing generic diversity within Sapindaceae. Lineage divergence times were estimated by penalized likelihood over a sample of trees from the posterior distribution of the phylogeny to account for dating uncertainty in biogeographical reconstructions. We compared biogeographical scenarios between Bayes-DIVA and two different DEC models: one with no geological constraints and another that employed a stratified palaeogeographical model in which dispersal rates were scaled according to area connectivity across four time slices, reflecting the changing continental configuration over the last 110 million years.

Results: Despite differences in the underlying biogeographical model, Bayes-DIVA and DEC inferred similar biogeographical scenarios. The main differences were: (1) in the timing of dispersal events, which in Bayes-DIVA sometimes conflicts with palaeogeographical information; and (2) in the lower frequency of terminal dispersal events inferred by DEC. Uncertainty in divergence time estimations influenced both the inference of ancestral ranges and the decisiveness with which an area can be assigned to a node.

Main conclusions: By considering lineage divergence times, the DEC method gives more accurate reconstructions that are in agreement with palaeogeographical evidence. In contrast, Bayes-DIVA showed the highest decisiveness in unequivocally reconstructing ancestral ranges, probably reflecting its ability to integrate phylogenetic uncertainty. Care should be taken in defining the palaeogeographical model in DEC because of the possibility of overestimating the frequency of extinction events, or of inferring ancestral ranges that are outside the extant species ranges, owing to dispersal constraints enforced by the model. The wide-spanning spatial and temporal model proposed here could prove useful for testing large-scale biogeographical patterns in plants.

Relevance:

90.00%

Publisher:

Abstract:

Machine learning has been widely applied to analyze data in various domains, but it is still new to personalized medicine, especially dose individualization. In this paper, we focus on the prediction of drug concentrations using Support Vector Machines (SVM) and the analysis of the influence of each feature on the prediction results. Our study shows that SVM-based approaches achieve prediction results similar to those of a pharmacokinetic model. The two proposed example-based SVM methods demonstrate that the individual features help to increase the accuracy of drug concentration predictions with a reduced library of training data.
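As a rough illustration of this kind of SVM-based concentration prediction, the sketch below trains a support vector regressor on a synthetic training library. The features (dose, body weight, time since dose), the toy concentration formula, and all parameter values are invented for the example and are not taken from the paper.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# synthetic training library: dose (mg), body weight (kg), hours since dose
X = rng.uniform([100, 40, 1], [400, 100, 24], size=(200, 3))

# toy one-compartment-like concentration plus noise (illustrative only)
y = X[:, 0] / X[:, 1] * np.exp(-0.1 * X[:, 2]) + rng.normal(0.0, 0.1, 200)

# RBF-kernel SVR on standardized features
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.05))
model.fit(X, y)

# predict the concentration for a new patient/time point
pred = model.predict([[250.0, 70.0, 12.0]])[0]
```

In the individualization setting described in the abstract, patient-specific features would be appended to the feature vector and the library restricted to similar examples; the pipeline itself stays the same.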

Relevance:

90.00%

Publisher:

Abstract:

Depth-averaged velocities and unit discharges within a 30 km reach of one of the world's largest rivers, the Río Paraná, Argentina, were simulated using three hydrodynamic models with different process representations: a reduced complexity (RC) model that neglects most of the physics governing fluid flow, a two-dimensional model based on the shallow water equations, and a three-dimensional model based on the Reynolds-averaged Navier-Stokes equations. Flow characteristics simulated using all three models were compared with data obtained by acoustic Doppler current profiler surveys at four cross sections within the study reach. This analysis demonstrates that, surprisingly, the performance of the RC model is generally equal to, and in some instances better than, that of the physics-based models in terms of the statistical agreement between simulated and measured flow properties. In addition, in contrast to previous applications of RC models, the present study demonstrates that the RC model can successfully predict measured flow velocities. The strong performance of the RC model reflects, in part, the simplicity of the depth-averaged mean flow patterns within the study reach and the dominant role of channel-scale topographic features in controlling the flow dynamics. Moreover, the very low water surface slopes that typify large sand-bed rivers enable flow depths to be estimated reliably in the RC model using a simple fixed-lid planar water surface approximation. This approach overcomes a major problem encountered in the application of RC models in environments characterised by shallow flows and steep bed gradients. The RC model is four orders of magnitude faster than the physics-based models when performing steady-state hydrodynamic calculations. However, the iterative nature of the RC model calculations implies a reduction in computational efficiency relative to some other RC models.

A further implication of this is that, if used to simulate channel morphodynamics, the present RC model may offer only a marginal advantage in terms of computational efficiency over approaches based on the shallow water equations. These observations illustrate the trade-off between model realism and efficiency that is a key consideration in RC modelling. Moreover, this outcome highlights a need to rethink the use of RC morphodynamic models in fluvial geomorphology and to move away from existing grid-based approaches, such as the popular cellular automata (CA) models, that remain essentially reductionist in nature. In the case of the world's largest sand-bed rivers, this might be achieved by implementing the RC model outlined here as one element within a hierarchical modelling framework that would enable computationally efficient simulation of the morphodynamics of large rivers over millennial time scales. (C) 2012 Elsevier B.V. All rights reserved.

Relevance:

90.00%

Publisher:

Abstract:

The paper presents some contemporary approaches to spatial environmental data analysis. The main topics concentrate on the decision-oriented problems of environmental spatial data mining and modeling: valorization and representativity of data with the help of exploratory data analysis, spatial predictions, probabilistic and risk mapping, and the development and application of conditional stochastic simulation models. The innovative part of the paper presents an integrated/hybrid model: machine learning (ML) residuals sequential simulations (MLRSS). The models are based on multilayer perceptron and support vector regression ML algorithms, used for modeling long-range spatial trends, and sequential simulations of the residuals. ML algorithms deliver non-linear solutions for spatially non-stationary problems, which are difficult for the geostatistical approach. Geostatistical tools (variography) are used to characterize the performance of ML algorithms, by analyzing the quality and quantity of the spatially structured information extracted from data with ML algorithms. Sequential simulations provide efficient assessment of uncertainty and spatial variability. A case study from the Chernobyl fallouts illustrates the performance of the proposed model. It is shown that probability mapping, provided by the combination of ML data-driven and geostatistical model-based approaches, can be efficiently used in the decision-making process. (C) 2003 Elsevier Ltd. All rights reserved.
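The trend-plus-residual idea behind MLRSS can be sketched in a few lines. In this hedged toy version, a polynomial least-squares fit stands in for the paper's multilayer perceptron / support vector regression trend model, and unconditional Gaussian simulation via a Cholesky factor stands in for the sequential simulation of residuals; all data and covariance parameters are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)

# synthetic 1-D transect: long-range trend plus noise
x = np.linspace(0.0, 10.0, 120)
obs = np.sin(0.6 * x) + 0.3 * rng.standard_normal(120)

# step 1: model the long-range spatial trend with a simple regressor
# (stand-in for the MLP / SVR trend model of the paper)
trend_coef = np.polyfit(x, obs, deg=5)
trend = np.polyval(trend_coef, x)
residuals = obs - trend

# step 2: stochastic simulation of the residuals with an exponential
# covariance model (stand-in for sequential simulation), via Cholesky
range_par, sill = 1.0, residuals.var()
cov = sill * np.exp(-np.abs(x[:, None] - x[None, :]) / range_par)
L = np.linalg.cholesky(cov + 1e-10 * np.eye(120))
sims = trend + (L @ rng.standard_normal((120, 50))).T   # 50 realizations

# step 3: the ensemble of realizations quantifies spatial uncertainty
sim_mean = sims.mean(axis=0)
sim_std = sims.std(axis=0)
```

In the real MLRSS workflow the residual covariance would be estimated from the experimental variogram and the simulation would be conditioned on the data; the three-step structure is the same.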

Relevance:

80.00%

Publisher:

Abstract:

Insects are an important and probably the most challenging pest to control in agriculture, in particular when they feed on belowground parts of plants. The application of synthetic pesticides is problematic owing to side effects on the environment, concerns for public health and the rapid development of resistance. Entomopathogenic bacteria, notably Bacillus thuringiensis and Photorhabdus/Xenorhabdus species, are promising alternatives to chemical insecticides, for they are able to efficiently kill insects and are considered to be environmentally sound and harmless to mammals. However, they have the handicap of showing limited environmental persistence or of depending on a nematode vector for insect infection. Intriguingly, certain strains of plant root-colonizing Pseudomonas bacteria display insect pathogenicity and thus could be formulated to extend the present range of bioinsecticides for protection of plants against root-feeding insects. These entomopathogenic pseudomonads belong to a group of plant-beneficial rhizobacteria that have the remarkable ability to suppress soil-borne plant pathogens, promote plant growth, and induce systemic plant defenses. Here we review for the first time the current knowledge about the occurrence and the molecular basis of insecticidal activity in pseudomonads with an emphasis on plant-beneficial and prominent pathogenic species. We discuss how this fascinating Pseudomonas trait may be exploited for novel root-based approaches to insect control in an integrated pest management framework.

Relevance:

80.00%

Publisher:

Abstract:

1. Model-based approaches have been used increasingly in conservation biology over recent years. Species presence data used for predictive species distribution modelling are abundant in natural history collections, whereas reliable absence data are sparse, most notably for vagrant species such as butterflies and snakes. As predictive methods such as generalized linear models (GLM) require absence data, various strategies have been proposed to select pseudo-absence data. However, only a few studies exist that compare different approaches to generating these pseudo-absence data.

2. Natural history collection data are usually available for long periods of time (decades or even centuries), thus allowing historical considerations. However, this historical dimension has rarely been assessed in studies of species distribution, although there is great potential for understanding current patterns, i.e. the past is the key to the present.

3. We used GLM to model the distributions of three 'target' butterfly species, Melitaea didyma, Coenonympha tullia and Maculinea teleius, in Switzerland. We developed and compared four strategies for defining pools of pseudo-absence data and applied them to natural history collection data from the last 10, 30 and 100 years. Pools included: (i) sites without target species records; (ii) sites where butterfly species other than the target species were present; (iii) sites without butterfly species but with habitat characteristics similar to those required by the target species; and (iv) a combination of the second and third strategies. Models were evaluated and compared by the total deviance explained, the maximized Kappa and the area under the curve (AUC).

4. Among the four strategies, model performance was best for strategy 3. Contrary to expectations, strategy 2 resulted in even lower model performance compared with models with pseudo-absence data simulated totally at random (strategy 1).

5. Independent of the strategy, model performance was enhanced when sites with historical species presence data were not considered as pseudo-absence data. Therefore, the combination of strategy 3 with species records from the last 100 years achieved the highest model performance.

6. Synthesis and applications. The protection of suitable habitat for species survival or reintroduction in rapidly changing landscapes is a high priority among conservationists. Model-based approaches offer planning authorities the possibility of delimiting priority areas for species detection or habitat protection. The performance of these models can be enhanced by fitting them with pseudo-absence data relying on large archives of natural history collection species presence data rather than using randomly sampled pseudo-absence data.
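A minimal, self-contained illustration of fitting a presence/pseudo-absence GLM of the kind used in this study: the sketch below draws random pseudo-absences (akin to strategy (i)) and fits a logistic regression. The environmental predictor, thresholds and sample sizes are invented for the example and do not come from the study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)

# hypothetical environmental predictor (e.g. a wetness index) at 1000 sites
env = rng.uniform(0.0, 1.0, 1000)

# presences from collection records: the toy species favours wet sites
presence_idx = rng.choice(np.where(env > 0.6)[0], size=60, replace=False)

# strategy (i): pseudo-absences sampled at random among sites without records
candidates = np.setdiff1d(np.arange(1000), presence_idx)
absence_idx = rng.choice(candidates, size=60, replace=False)

X = env[np.concatenate([presence_idx, absence_idx])].reshape(-1, 1)
y = np.concatenate([np.ones(60), np.zeros(60)])

glm = LogisticRegression().fit(X, y)       # GLM with a logit link
p_wet = glm.predict_proba([[0.9]])[0, 1]   # predicted suitability, wet site
p_dry = glm.predict_proba([[0.1]])[0, 1]   # predicted suitability, dry site
```

Strategies (ii)-(iv) would only change how `candidates` is filtered (by records of other species or by habitat similarity); the model-fitting step is unchanged.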

Relevance:

80.00%

Publisher:

Abstract:

The emergence of omics technologies, allowing the global analysis of a given biological or molecular system rather than the study of its individual components, has revolutionized biomedical research, including cardiovascular medicine research, in the past decade. These developments raised the prospect that classical, hypothesis-driven, single-gene-based approaches may soon become obsolete. The experience accumulated so far, however, indicates that omics technologies only represent tools, similar to those classically used by scientists, past and present, to make hypotheses and build models, with the main difference that they generate large amounts of unbiased information. Thus, omics and classical hypothesis-driven research are rather complementary approaches, with the potential to effectively synergize to boost research in many fields, including cardiovascular medicine. In this article we discuss some general aspects of omics approaches, and review contributions in three areas of vascular biology in which omics approaches have already been applied (vasculomics): thrombosis and haemostasis, atherosclerosis, and angiogenesis.

Relevance:

80.00%

Publisher:

Abstract:

Species range shifts in response to climate and land use change are commonly forecasted with species distribution models based on species occurrence or abundance data. Although appealing, these models ignore the genetic structure of species, and the fact that different populations might respond in different ways because of adaptation to their environment. Here, we introduce ancestry distribution models, that is, statistical models of the spatial distribution of ancestry proportions, for forecasting intra-specific changes based on genetic admixture instead of species occurrence data. Using multi-locus genotypes and extensive geographic coverage of distribution data across the European Alps, we applied this approach to 20 alpine plant species, considering a global increase in temperature from 0.25 to 4 °C. We forecasted the magnitudes of displacement of contact zones between plant populations potentially adapted to warmer environments and other populations. While a global trend of movement in a north-east direction was predicted, the magnitude of displacement was species-specific. For a temperature increase of 2 °C, contact zones were predicted to move by 92 km on average (minimum of 5 km, maximum of 212 km), and by 188 km for an increase of 4 °C (minimum of 11 km, maximum of 393 km). Intra-specific turnover, which measures the extent of change in global population genetic structure, was generally found to be moderate for 2 °C of temperature warming. For 4 °C of warming, however, the models indicated substantial intra-specific turnover for ten species. These results illustrate that, in spite of unavoidable simplifications, ancestry distribution models open new perspectives for forecasting population genetic changes within species and complement more traditional distribution-based approaches.

Relevance:

80.00%

Publisher:

Abstract:

Bioactive small molecules, such as drugs or metabolites, bind to proteins or other macromolecular targets to modulate their activity, which in turn results in the observed phenotypic effects. For this reason, mapping the targets of bioactive small molecules is a key step toward unraveling the molecular mechanisms underlying their bioactivity and predicting potential side effects or cross-reactivity. Recently, large datasets of protein-small molecule interactions have become available, providing a unique source of information for the development of knowledge-based approaches to computationally identify new targets for uncharacterized molecules or secondary targets for known molecules. Here, we introduce SwissTargetPrediction, a web server to accurately predict the targets of bioactive molecules based on a combination of 2D and 3D similarity measures with known ligands. Predictions can be carried out in five different organisms, and mapping predictions by homology within and between different species is enabled for close paralogs and orthologs. SwissTargetPrediction is accessible free of charge and without login requirement at http://www.swisstargetprediction.ch.
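A core ingredient of ligand-based target prediction of this kind is a 2D similarity measure between molecular fingerprints. The sketch below computes the standard Tanimoto coefficient on toy bit sets; the fingerprints and target names are invented, and a real application would derive the fingerprints with a cheminformatics toolkit rather than write them by hand.

```python
def tanimoto(fp_a, fp_b):
    """Tanimoto coefficient between two fingerprints given as sets of on-bits."""
    a, b = set(fp_a), set(fp_b)
    union = len(a | b)
    return len(a & b) / union if union else 0.0

# toy fingerprints: indices of the set bits for a query molecule and the
# known ligands of two hypothetical candidate targets
query = {1, 4, 7, 9, 15, 22}
ligand_a = {1, 4, 7, 9, 15, 30}   # shares 5 bits with the query
ligand_b = {2, 5, 11, 40}         # shares none

# rank candidate targets by the similarity of their known ligands
scores = {"target_A": tanimoto(query, ligand_a),
          "target_B": tanimoto(query, ligand_b)}
best = max(scores, key=scores.get)   # -> "target_A"
```

Combining such 2D scores with 3D shape-based similarity, as the abstract describes, then amounts to aggregating two ranked lists per candidate target.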

Relevance:

80.00%

Publisher:

Abstract:

For a wide range of environmental, hydrological, and engineering applications there is a fast-growing need for high-resolution imaging. In this context, waveform tomographic imaging of crosshole georadar data is a powerful method able to provide images of pertinent electrical properties in near-surface environments with unprecedented spatial resolution. In contrast, conventional ray-based tomographic methods, which consider only a very limited part of the recorded signal (first-arrival traveltimes and maximum first-cycle amplitudes), suffer from inherent limitations in resolution and may prove to be inadequate in complex environments. For a typical crosshole georadar survey, the potential improvement in resolution when using waveform-based approaches instead of ray-based approaches is in the range of one order of magnitude. Moreover, the spatial resolution of waveform-based inversions is comparable to that of common logging methods. While in exploration seismology waveform tomographic imaging has become well established over the past two decades, it is still comparatively underdeveloped in the georadar domain despite corresponding needs. Recently, different groups have presented finite-difference time-domain waveform inversion schemes for crosshole georadar data, which are adaptations and extensions of Tarantola's seminal nonlinear generalized least-squares approach developed for the seismic case. First applications of these new crosshole georadar waveform inversion schemes on synthetic and field data have shown promising results. However, little is known about the limits and performance of such schemes in complex environments.
To this end, the general motivation of my thesis is the evaluation of the robustness and limitations of waveform inversion algorithms for crosshole georadar data, in order to apply such schemes to a wide range of real-world problems.

One crucial issue in making any waveform scheme applicable and effective for real-world crosshole georadar problems is the accurate estimation of the source wavelet, which is unknown in reality. Waveform inversion schemes for crosshole georadar data require forward simulations of the wavefield in order to iteratively solve the inverse problem. Therefore, accurate knowledge of the source wavelet is critically important for successful application of such schemes. Relatively small differences in the estimated source wavelet shape can lead to large differences in the resulting tomograms. In the first part of my thesis, I explore the viability and robustness of a relatively simple iterative deconvolution technique that incorporates the estimation of the source wavelet into the waveform inversion procedure rather than adding additional model parameters to the inversion problem. Extensive tests indicate that this source wavelet estimation technique is simple yet effective, and is able to provide remarkably accurate and robust estimates of the source wavelet in the presence of strong heterogeneity in both the dielectric permittivity and electrical conductivity, as well as significant ambient noise in the recorded data. Furthermore, our tests also indicate that the approach is insensitive to the phase characteristics of the starting wavelet, which is not the case when directly incorporating the wavelet estimation into the inverse problem.

Another critical issue with crosshole georadar waveform inversion schemes that clearly needs to be investigated is the consequence of the common assumption of frequency-independent electromagnetic constitutive parameters.
This is crucial since, in reality, these parameters are known to be frequency-dependent and complex, and thus recorded georadar data may show significant dispersive behaviour. In particular, in the presence of water, there is a wide body of evidence showing that the dielectric permittivity can be significantly frequency-dependent over the GPR frequency range, due to a variety of relaxation processes. The second part of my thesis is therefore dedicated to the evaluation of the reconstruction limits of a non-dispersive crosshole georadar waveform inversion scheme in the presence of varying degrees of dielectric dispersion. I show that the inversion algorithm, combined with the iterative deconvolution-based source wavelet estimation procedure, which is partially able to account for the frequency-dependent effects through an "effective" wavelet, performs remarkably well in weakly to moderately dispersive environments and has the ability to provide adequate tomographic reconstructions.

Relevance:

80.00%

Publisher:

Abstract:

We have recently shown that at isotopic steady state (13)C NMR can provide a direct measurement of glycogen concentration changes, but that the turnover of glycogen was not accessible with this protocol. The aim of the present study was to design, implement and apply a novel dual-tracer infusion protocol to simultaneously measure glycogen concentration and turnover. After reaching isotopic steady state for glycogen C1 using [1-(13)C] glucose administration, [1,6-(13)C(2)] glucose was infused such that isotopic steady state was maintained at the C1 position, while the C6 position reflected (13)C label incorporation. To overcome the large chemical shift displacement error between the C1 and C6 resonances of glycogen, we implemented 2D gradient-based localization using the Fourier series window approach, in conjunction with time-domain analysis of the resulting FIDs using jMRUI. The glycogen concentration of 5.1 +/- 1.6 mM measured from the C1 position was in excellent agreement with concomitant biochemical determinations. Glycogen turnover measured from the rate of label incorporation into the C6 position of glycogen in the alpha-chloralose anesthetized rat was 0.7 micromol/g/h.

Relevance:

80.00%

Publisher:

Abstract:

DNA-binding proteins mediate a variety of crucial molecular functions, such as transcriptional regulation and chromosome maintenance, replication and repair, which in turn control cell division and differentiation. The roles of these proteins in disease are currently being investigated using microarray-based approaches. However, these assays can be difficult to adapt to routine diagnosis of complex diseases such as cancer. Here, we review promising alternative approaches involving protein-binding microarrays (PBMs) that probe the interaction of proteins from crude cell or tissue extracts with large collections of synthetic or natural DNA sequences. Recent studies have demonstrated the use of these novel PBM approaches to provide rapid and unbiased characterization of DNA-binding proteins as molecular markers of disease, for example cancer progression or infectious diseases.

Relevance:

80.00%

Publisher:

Abstract:

This paper shows how to calculate recursively the moments of the accumulated and discounted value of cash flows when the instantaneous rates of return follow a conditional ARMA process with normally distributed innovations. We investigate various moment-based approaches to approximate the distribution of the accumulated value of cash flows, and we assess their performance through stochastic Monte Carlo simulations. We discuss the potential use in insurance, and especially in the context of asset-liability management of pension funds.
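The Monte Carlo side of such a comparison can be sketched as follows: simulate AR(1) instantaneous rates of return (a special case of the ARMA processes considered), accumulate a stream of unit annual cash flows to the horizon, and estimate the first two moments empirically. All parameter values are illustrative, not from the paper; discounting works analogously with the sign of the cumulative rate reversed.

```python
import numpy as np

rng = np.random.default_rng(3)

# instantaneous rates of return: AR(1) around a long-run mean mu
phi, mu, sigma = 0.6, 0.04, 0.02
n_years, n_sims = 10, 20000

delta = np.empty((n_sims, n_years))
delta[:, 0] = mu + sigma * rng.standard_normal(n_sims)
for t in range(1, n_years):
    delta[:, t] = (mu + phi * (delta[:, t - 1] - mu)
                   + sigma * rng.standard_normal(n_sims))

# accumulated value at the horizon: a unit cash flow paid at the end of
# year t earns the stochastic rates from its payment date to year n_years
cum = np.cumsum(delta, axis=1)                   # integrated rate up to t
accum = np.exp(cum[:, -1:] - cum).sum(axis=1)    # sum of growth factors

# empirical first two moments, to be matched against the recursive
# moment-based approximations of the paper
m1, m2 = accum.mean(), (accum ** 2).mean()
```

With a mean rate of 4% over 10 years, the first moment lands near the deterministic value of roughly 12 units, while the second moment captures the dispersion induced by the autocorrelated rates.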