940 results for 010300 NUMERICAL AND COMPUTATIONAL MATHEMATICS


Relevance:

100.00%

Publisher:

Abstract:

We study the families of periodic orbits of the spatial isosceles 3-body problem (for small enough values of the mass lying on the symmetry axis) coming via the analytic continuation method from periodic orbits of the circular Sitnikov problem. Using the first integral of the angular momentum, we reduce the dimension of the phase space of the problem by two units. Since periodic orbits of the reduced isosceles problem generate invariant two-dimensional tori of the nonreduced problem, the analytic continuation of periodic orbits of the (reduced) circular Sitnikov problem at this level becomes the continuation of invariant two-dimensional tori from the circular Sitnikov problem to the nonreduced isosceles problem, each one filled with periodic or quasi-periodic orbits. These tori are not KAM tori but merely isotropic, since we are dealing with a three-degrees-of-freedom system. The continuation of periodic orbits is done in two different ways: the first goes directly from the reduced circular Sitnikov problem to the reduced isosceles problem; the second uses two steps, first continuing the periodic orbits from the reduced circular Sitnikov problem to the reduced elliptic Sitnikov problem, and then continuing those periodic orbits of the reduced elliptic Sitnikov problem to the reduced isosceles problem. The continuation in one or two steps produces different results. This work is purely analytic and uses the variational equations in order to apply Poincaré's continuation method.
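
As a purely numerical complement to the analytic treatment above, the Python sketch below (with an illustrative normalization: primaries of unit separation, so each sits at distance 0.5 from the symmetry axis) integrates the circular Sitnikov equation and recovers the period of the unperturbed periodic orbit from which the continuation would start; the continuation itself is not reproduced.

import numpy as np
from scipy.integrate import solve_ivp

def sitnikov(t, y):
    # circular Sitnikov problem: motion of the massless body on the symmetry axis
    z, vz = y
    return [vz, -z / (z**2 + 0.25)**1.5]

def half_period(z0):
    # start at the top of the oscillation (z = z0, vz = 0) and stop when vz = 0
    # again at the bottom turning point; the full period is twice that time
    def bottom(t, y):
        return y[1]
    bottom.terminal = True
    bottom.direction = 1  # vz increases through zero only at the bottom
    sol = solve_ivp(sitnikov, (0.0, 100.0), [z0, 0.0], events=bottom,
                    rtol=1e-10, atol=1e-12)
    return sol.t_events[0][0]

for z0 in (0.2, 0.5, 1.0):
    print(f"amplitude {z0:.1f}: period approx. {2 * half_period(z0):.4f}")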

Relevance:

100.00%

Publisher:

Abstract:

Simulated-annealing-based conditional simulations provide a flexible means of quantitatively integrating diverse types of subsurface data. Although such techniques are being increasingly used in hydrocarbon reservoir characterization studies, their potential in environmental, engineering and hydrological investigations is still largely unexploited. Here, we introduce a novel simulated annealing (SA) algorithm geared towards the integration of high-resolution geophysical and hydrological data which, compared to more conventional approaches, provides significant advancements in the way that large-scale structural information in the geophysical data is accounted for. Model perturbations in the annealing procedure are made by drawing from a probability distribution for the target parameter conditioned to the geophysical data. This is the only place where geophysical information is utilized in our algorithm, which is in marked contrast to other approaches where model perturbations are made through the swapping of values in the simulation grid and agreement with soft data is enforced through a correlation coefficient constraint. Another major feature of our algorithm is the way in which available geostatistical information is utilized. Instead of constraining realizations to match a parametric target covariance model over a wide range of spatial lags, we constrain the realizations only at smaller lags where the available geophysical data cannot provide enough information. Thus we allow the larger-scale subsurface features resolved by the geophysical data to exert much greater control over the output realizations. Further, since the only component of the SA objective function required in our approach is a covariance constraint at small lags, our method has improved convergence and computational efficiency over more traditional methods. Here, we present the results of applying our algorithm to the integration of porosity log and tomographic crosshole georadar data to generate stochastic realizations of the local-scale porosity structure. Our procedure is first tested on a synthetic data set, and then applied to data collected at the Boise Hydrogeophysical Research Site.
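
A minimal one-dimensional Python sketch of the idea follows; the geophysical trend, conditional spread, small-lag variogram target and cooling schedule are illustrative assumptions rather than the published algorithm. Perturbations are draws from the geophysics-conditioned distribution, and the objective penalizes only the experimental variogram at small lags.

import numpy as np

rng = np.random.default_rng(0)

n, max_lag = 200, 5
geophys_mean = 0.25 + 0.05 * np.sin(np.linspace(0, 4 * np.pi, n))  # assumed large-scale trend
cond_sd = 0.02                                                     # spread of the conditional draw
lags = np.arange(1, max_lag + 1)
target_gamma = cond_sd**2 * (1 - np.exp(-lags / 2.0))              # assumed small-lag variogram model

def variogram(x):
    return np.array([0.5 * np.mean((x[h:] - x[:-h])**2) for h in lags])

def objective(x):
    return np.sum((variogram(x) - target_gamma)**2)

x = rng.normal(geophys_mean, cond_sd)       # initial realization honours the geophysical trend
obj = objective(x)
T = obj                                     # simple choice of starting temperature
for _ in range(20000):
    i = rng.integers(n)
    trial = x.copy()
    trial[i] = rng.normal(geophys_mean[i], cond_sd)   # perturbation = conditional draw
    new_obj = objective(trial)
    if new_obj < obj or rng.random() < np.exp((obj - new_obj) / T):
        x, obj = trial, new_obj
    T *= 0.9995                                       # geometric cooling (assumed)
print(f"final small-lag variogram misfit: {obj:.3e}")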

Relevance:

100.00%

Publisher:

Abstract:

Low-copy-number molecules are involved in many functions in cells. The intrinsic fluctuations of these numbers can enable stochastic switching between multiple steady states, inducing phenotypic variability. Herein we present a theoretical and computational study, based on Master equations and Fokker-Planck and Langevin descriptions, of stochastic switching for a genetic circuit of autoactivation. We show that in this circuit the intrinsic fluctuations arising from low copy numbers, which are inherently state-dependent, drive asymmetric switching. These theoretical results are consistent with experimental data that have been reported for the bistable system of the galactose signaling network in yeast. Our study reveals that intrinsic fluctuations, while not required to describe bistability, are fundamental to understanding stochastic switching and the relative dynamical stability of the multiple states.
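
As an illustration of the mechanism (not the specific circuit or parameters of the study), the Python sketch below samples the Master equation of an autoactivating gene with the Gillespie algorithm: basal plus Hill-type positive-feedback production and linear degradation. The assumed rate constants place the two macroscopic states a few tens of molecules apart, so the state-dependent intrinsic fluctuations occasionally switch the system between them, and switching out of the shallower state is visibly more frequent.

import numpy as np

rng = np.random.default_rng(1)

a0, a1, K, h, d = 4.0, 16.0, 20.0, 4.0, 0.5   # assumed parameters (molecules, 1/time)
x, t, t_end = 5, 0.0, 2000.0
counts = [x]
while t < t_end:
    prod = a0 + a1 * x**h / (K**h + x**h)     # autoactivation (Hill feedback)
    deg = d * x                               # first-order degradation
    total = prod + deg
    t += rng.exponential(1.0 / total)         # waiting time to the next reaction
    x += 1 if rng.random() < prod / total else -1
    counts.append(x)
counts = np.array(counts)
print(f"fraction of sampled states above {int(K)} molecules: {np.mean(counts > K):.2f}")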

Relevance:

100.00%

Publisher:

Abstract:

Recent single-cell studies in monkeys (Romo et al., 2004) show that the activity of neurons in the ventral premotor cortex covaries with the animal's decisions in a perceptual comparison task regarding the frequency of vibrotactile events. The firing rate response of these neurons was dependent only on the frequency differences between the two applied vibrations, the sign of that difference being the determining factor for correct task performance. We present a biophysically realistic neurodynamical model that can account for the most relevant characteristics of this decision-making-related neural activity. One of the nontrivial predictions of this model is that Weber's law will underlie the perceptual discrimination behavior. We confirmed this prediction in behavioral tests of vibrotactile discrimination in humans and propose a computational explanation of perceptual discrimination that accounts naturally for the emergence of Weber's law. We conclude that the neurodynamical mechanisms and computational principles underlying the decision-making processes in this perceptual discrimination task are consistent with a fluctuation-driven scenario in a multistable regime.
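
The following Python toy model (a deliberate caricature, not the biophysical attractor network of the paper) illustrates how Weber's law can emerge when fluctuations scale with the signal: two noisy accumulators receive drives proportional to the two frequencies, with multiplicative noise, so discrimination accuracy for a fixed Weber fraction stays roughly constant across base frequencies.

import numpy as np

rng = np.random.default_rng(2)

def accuracy(f1, f2, cv=0.2, n_trials=4000, n_steps=200, dt=0.01):
    # two accumulators driven by f1 and f2; the noise standard deviation is
    # proportional to the drive (multiplicative fluctuations)
    correct = 0
    for _ in range(n_trials):
        x1 = np.sum(f1 * dt + cv * f1 * np.sqrt(dt) * rng.standard_normal(n_steps))
        x2 = np.sum(f2 * dt + cv * f2 * np.sqrt(dt) * rng.standard_normal(n_steps))
        correct += (x1 > x2) == (f1 > f2)
    return correct / n_trials

for base in (10.0, 20.0, 40.0):               # comparison frequencies in Hz
    delta = 0.1 * base                        # fixed Weber fraction of 10 percent
    print(f"base {base:4.0f} Hz, delta {delta:4.1f} Hz: "
          f"accuracy approx. {accuracy(base + delta, base):.2f}")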

Relevance:

100.00%

Publisher:

Abstract:

The study of human motion, which relies on mathematical and computational models in general, and multibody dynamic biomechanical models in particular, has become the subject of much recent research. A human body model can be applied to different physical exercises, and many important results, such as muscle forces that are difficult to measure in practical experiments, can be obtained easily. In this work, a human skeletal lower-limb model consisting of three bodies is built using the flexible multibody dynamics simulation approach. The floating frame of reference formulation is used to account for the flexibility of the bones in the human lower-limb model. The main reason for considering the flexibility of the human bones is to measure the strains in the bones resulting from different physical exercises. It has been observed that bone under strain becomes stronger in order to cope with the exercise; on the other hand, bone strength is considered an important factor in reducing bone fractures. The simulation approach and model developed in this work are used to measure the bone strains resulting from an exercise of raising the sole of the foot. The simulation results are compared to results available in the literature, and the comparison shows good agreement. This study sheds light on the importance of using the flexible multibody dynamics simulation approach to build human biomechanical models, which can be used in developing exercises to achieve optimal bone strength.

Relevance:

100.00%

Publisher:

Abstract:

Electrocaloric cooling, based on the ability of a material to change its temperature when an electric field is applied under adiabatic conditions, is a relatively new and challenging direction of ferroelectrics research. In this work we report analytical, simulation and experimental data for BaSrTiO3 thin-film and bulk ceramic samples. A detailed discussion of the theoretical basis of the electrocaloric effect is included. The experimental and computational results presented exemplify a rational approach to the problem of solid-state cooler construction.
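
The abstract does not quote the underlying relation; as an illustration (a standard indirect estimate, not necessarily the authors' exact expression), the adiabatic electrocaloric temperature change is commonly obtained from the Maxwell relation (\partial S/\partial E)_T = (\partial P/\partial T)_E:

\Delta T = -\int_{E_1}^{E_2} \frac{T}{\rho\, c_E} \left( \frac{\partial P}{\partial T} \right)_{E} dE

where P is the polarization, \rho the density and c_E the specific heat at constant field.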

Relevance:

100.00%

Publisher:

Abstract:

Integration of biological data of various types and the development of adapted bioinformatics tools represent critical objectives to enable research at the systems level. The European Network of Excellence ENFIN is engaged in developing an adapted infrastructure to connect databases and platforms, to enable both the generation of new bioinformatics tools and the experimental validation of computational predictions. With the aim of bridging the gap between standard wet laboratories and bioinformatics, the ENFIN Network runs integrative research projects to bring the latest computational techniques to bear directly on questions dedicated to systems biology in the wet laboratory environment. The Network maintains close internal collaboration between experimental and computational research, enabling a permanent cycle of experimental validation and improvement of computational prediction methods. The computational work includes the development of a database infrastructure (EnCORE), bioinformatics analysis methods and a novel platform for protein function analysis, FuncNet.

Relevance:

100.00%

Publisher:

Abstract:

Background: Current advances in genomics, proteomics and other areas of molecular biology make the identification and reconstruction of novel pathways an emerging area of great interest. One such class of pathways is involved in the biogenesis of iron-sulfur clusters (ISC). Results: Our goal is the development of a new approach based on the use and combination of mathematical, theoretical and computational methods to identify the topology of a target network. In this approach, mathematical models play a central role in the evaluation of the alternative network structures that arise from literature data-mining, phylogenetic profiling, structural methods, and human curation. As a test case, we reconstruct the topology of the reaction and regulatory network for the mitochondrial ISC biogenesis pathway in S. cerevisiae. Predictions regarding how proteins act in ISC biogenesis are validated by comparison with published experimental results. For example, the predicted roles of Arh1 and Yah1, and some of the interactions we predict for Grx5, match the experimental evidence. A putative role for frataxin in directly regulating mitochondrial iron import is discarded from our analysis, which also agrees with published experimental results. Additionally, we propose a number of experiments for testing other predictions and further improving the identification of the network structure. Conclusion: We propose and apply an iterative in silico procedure for the predictive reconstruction of the network topology of metabolic pathways. The procedure combines structural bioinformatics tools and mathematical modeling techniques that allow the reconstruction of biochemical networks. Using ISC biogenesis in S. cerevisiae as a test case, we indicate how this procedure can be used to analyze and validate the network model against experimental results. Critical evaluation of the results obtained through this procedure allows devising new wet-lab experiments to confirm its predictions or provide alternative explanations for further improving the models.

Relevance:

100.00%

Publisher:

Abstract:

Tight regulation of the MAP kinase Hog1 is crucial for survival under changing osmotic conditions. Interestingly, we found that Hog1 phosphorylates multiple upstream components, implying feedback regulation within the signaling cascade. Taking advantage of an unexpected link between glucose availability and Hog1 activity, we used quantitative single cell measurements and computational modeling to unravel feedback regulation operating in addition to the well-known adaptation feedback triggered by glycerol accumulation. Indeed, we found that Hog1 phosphorylates its activating kinase Ssk2 on several sites, and cells expressing a non-phosphorylatable Ssk2 mutant are partially defective for feedback regulation and proper control of basal Hog1 activity. Together, our data suggest that Hog1 activity is controlled by intertwined regulatory mechanisms operating with varying kinetics, which together tune the Hog1 response to balance basal Hog1 activity and its steady-state level after adaptation to high osmolarity.

Relevance:

100.00%

Publisher:

Abstract:

Individuals with an inherited deficiency in gonadotropin-releasing hormone (GnRH) have impaired sexual reproduction. Previous genetic linkage studies and sequencing of plausible gene candidates have identified mutations associated with inherited GnRH deficiency, but the small number of affected families and limited success in validating candidates have impeded genetic diagnoses for most patients. Using a combination of exome sequencing and computational modeling, we have identified a shared point mutation in semaphorin 3E (SEMA3E) in 2 brothers with Kallmann syndrome (KS), which causes inherited GnRH deficiency. Recombinant wild-type SEMA3E protected maturing GnRH neurons from cell death by triggering a plexin D1-dependent (PLXND1-dependent) activation of PI3K-mediated survival signaling. In contrast, recombinant SEMA3E carrying the KS-associated mutation did not protect GnRH neurons from death. In murine models, lack of either SEMA3E or PLXND1 increased apoptosis of GnRH neurons in the developing brain, reducing innervation of the adult median eminence by GnRH-positive neurites. GnRH neuron deficiency in male mice was accompanied by impaired testes growth, a characteristic feature of KS. Together, these results identify SEMA3E as an essential gene for GnRH neuron development, uncover a neurotrophic function for SEMA3E in the developing brain, and elucidate SEMA3E/PLXND1/PI3K signaling as a mechanism that prevents GnRH neuron deficiency.

Relevance:

100.00%

Publisher:

Abstract:

The aim of this Master's thesis was to study noise removal from spectral images using soft morphological filters. The emphasis was on filtering impulsive noise. The performance of the filters was evaluated numerically, by means of the mean absolute error, the mean squared error and the signal-to-noise ratio, and visually, by inspecting the filtered images and their individual spectral bands. The filtering methods used were pixel-wise filtering along the spectral dimension, filtering over the whole spectrum, the cube method, and component-wise filtering. The images to be filtered contained either salt-and-pepper noise or bit-error noise. The best filtering results, based on both the numerical error criteria and visual inspection, were obtained with the component-wise and pixel-wise methods. The methods used in the thesis are presented in algorithmic form. The filter algorithms were implemented and the filtering experiments carried out in Matlab.
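
As an illustration of the pixel-wise spectral filtering described above (a minimal Python sketch; the thesis itself used Matlab, and its exact structuring elements, repetition parameters and the cube/component-wise variants are not reproduced here), the code below applies a soft morphological opening followed by a closing along the spectral axis of an image cube, with the centre sample as the hard core repeated k times.

import numpy as np

def soft_morph_1d(x, half, k, op):
    # soft erosion/dilation of a 1-D spectrum: within each window the centre
    # sample (hard core) is repeated k times, the remaining samples form the
    # soft boundary, and the output is the k-th smallest ('erode') or k-th
    # largest ('dilate') value of that multiset
    n = len(x)
    out = np.empty(n)
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        window = np.delete(x[lo:hi], i - lo)
        multiset = np.sort(np.concatenate([np.repeat(x[i], k), window]))
        out[i] = multiset[-k] if op == 'dilate' else multiset[k - 1]
    return out

def soft_open_close_spectral(cube, half=2, k=2):
    # apply a soft opening then a soft closing along the spectral axis of a
    # (rows, cols, bands) cube to suppress impulsive (salt-and-pepper) noise
    out = np.empty(cube.shape, dtype=float)
    for r in range(cube.shape[0]):
        for c in range(cube.shape[1]):
            s = cube[r, c, :].astype(float)
            opened = soft_morph_1d(soft_morph_1d(s, half, k, 'erode'), half, k, 'dilate')
            out[r, c, :] = soft_morph_1d(soft_morph_1d(opened, half, k, 'dilate'), half, k, 'erode')
    return out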

Relevance:

100.00%

Publisher:

Abstract:

Electrical resistivity tomography (ERT) is a well-established method for geophysical characterization and has shown potential for monitoring geologic CO2 sequestration, due to its sensitivity to electrical resistivity contrasts generated by liquid/gas saturation variability. In contrast to deterministic inversion approaches, probabilistic inversion provides the full posterior probability density function of the saturation field and accounts for the uncertainties inherent in the petrophysical parameters relating the resistivity to saturation. In this study, the data are from benchtop ERT experiments conducted during gas injection into a quasi-2D brine-saturated sand chamber with a packing that mimics a simple anticlinal geological reservoir. The saturation fields are estimated by Markov chain Monte Carlo inversion of the measured data and compared to independent saturation measurements from light transmission through the chamber. Different model parameterizations are evaluated in terms of the recovered saturation and petrophysical parameter values. The saturation field is parameterized (1) in Cartesian coordinates, (2) by means of its discrete cosine transform coefficients, and (3) by fixed saturation values in structural elements whose shape and location are either assumed known or represented by an arbitrary Gaussian bell structure. Results show that the estimated saturation fields are in overall agreement with saturations measured by light transmission, but differ strongly in terms of parameter estimates, parameter uncertainties and computational intensity. Discretization in the frequency domain (as in the discrete cosine transform parameterization) provides more accurate models at a lower computational cost compared to spatially discretized (Cartesian) models. A priori knowledge about the expected geologic structures allows for non-discretized model descriptions with markedly reduced degrees of freedom. Constraining the solutions to the known injected gas volume improved estimates of saturation and of the parameter values of the petrophysical relationship.
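
A minimal Python sketch of the discrete cosine transform parameterization follows (illustrative only: the truncation level, the logistic mapping to saturations and the grid size are assumptions, and the ERT forward model and MCMC sampler used in the study are not reproduced). A reduced set of low-order DCT coefficients defines a smooth 2-D field that a sampler would perturb instead of the cell-by-cell (Cartesian) values.

import numpy as np
from scipy.fft import idctn

def saturation_from_dct(coeffs, shape=(40, 60), n_keep=6):
    # place the n_keep x n_keep low-order DCT coefficients in an otherwise
    # empty spectrum, invert the transform, and squash to [0, 1] saturations
    C = np.zeros(shape)
    C[:n_keep, :n_keep] = np.asarray(coeffs).reshape(n_keep, n_keep)
    field = idctn(C, norm='ortho')
    return 1.0 / (1.0 + np.exp(-field))   # logistic mapping (an assumption)

rng = np.random.default_rng(3)
sat = saturation_from_dct(rng.normal(0.0, 2.0, size=36))
print(sat.shape, float(sat.min()), float(sat.max()))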

Relevance:

100.00%

Publisher:

Abstract:

The variability observed in drug exposure has a direct impact on the overall response to a drug. The largest part of the variability between dose and drug response resides in the pharmacokinetic phase, i.e. in the dose-concentration relationship. Among the possibilities offered to clinicians, Therapeutic Drug Monitoring (TDM; monitoring of drug concentration measurements) is one of the useful tools to guide pharmacotherapy. TDM aims at optimizing treatments by individualizing dosage regimens based on blood drug concentration measurements. Bayesian calculations, relying on the population pharmacokinetic approach, currently represent the gold-standard TDM strategy. However, this requires expertise and computational assistance, thus limiting its wide implementation in routine patient care. The overall objective of this thesis was to implement robust tools to provide Bayesian TDM to clinicians in modern routine patient care. To that end, the aims were (i) to develop an efficient and ergonomic computer tool for Bayesian TDM, EzeCHieL; (ii) to provide algorithms for Bayesian forecasting of drug concentrations and software validation, relying on population pharmacokinetics; and (iii) to address some relevant issues encountered in clinical practice, with a focus on neonates and drug adherence. First, the existing software was reviewed, which allowed specifications to be established for the development of EzeCHieL. Then, in close collaboration with software engineers, a fully integrated software tool, EzeCHieL, was developed. EzeCHieL provides population-based predictions, Bayesian forecasting and an easy-to-use interface. It makes it possible to assess how expected an observed concentration is in a patient compared to the whole population (via percentiles), to assess the suitability of the predicted concentration relative to the target concentration, and to provide dosing adjustment. It thus allows a priori and a posteriori Bayesian individualization of drug dosing. Implementation of Bayesian methods requires characterization of drug disposition and quantification of variability through a population approach. Population pharmacokinetic analyses were performed and Bayesian estimators were provided for candidate drugs in the populations of interest: anti-infective drugs administered to neonates (gentamicin and imipenem). The developed models were implemented in EzeCHieL and also served as a validation tool, comparing EzeCHieL concentration predictions against predictions from the reference software (NONMEM®). The models used need to be adequate and reliable; for instance, extrapolation from adults or children to neonates is not possible. Therefore, this work proposes models for neonates based on the concept of developmental pharmacokinetics. Patient adherence is also an important concern for drug model development and for a successful outcome of pharmacotherapy. A final study assesses the impact of routine patient adherence measurement on model definition and TDM interpretation. In conclusion, our results offer solutions to assist clinicians in interpreting blood drug concentrations and to improve the appropriateness of drug dosing in routine clinical practice.
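
The Python sketch below illustrates the Bayesian (maximum a posteriori) individualization step on which such a tool relies, for a hypothetical one-compartment intravenous bolus drug with lognormal population priors and a proportional error model; the drug, parameter values and error magnitudes are assumptions for illustration, not the gentamicin or imipenem models of the thesis nor the EzeCHieL implementation.

import numpy as np
from scipy.optimize import minimize

def conc(dose, t, cl, v):
    # one-compartment IV bolus model: C(t) = (dose / V) * exp(-(CL / V) * t)
    return dose / v * np.exp(-(cl / v) * t)

def neg_log_posterior(log_params, dose, times, obs, prior_mean, prior_sd, cv):
    # lognormal priors on CL and V (population model) + proportional residual error
    cl, v = np.exp(log_params)
    pred = conc(dose, times, cl, v)
    loglik = np.sum(((obs - pred) / (cv * pred))**2)
    logpri = np.sum(((log_params - np.log(prior_mean)) / prior_sd)**2)
    return 0.5 * (loglik + logpri)

# one observed concentration 6 h after a 100 mg dose (hypothetical numbers)
res = minimize(neg_log_posterior, x0=np.log([5.0, 30.0]),
               args=(100.0, np.array([6.0]), np.array([1.1]),
                     np.array([5.0, 30.0]), np.array([0.3, 0.2]), 0.2))
cl_map, v_map = np.exp(res.x)
print(f"MAP estimates: CL = {cl_map:.2f} L/h, V = {v_map:.2f} L")
# the individualized CL and V can then be used to simulate candidate dosing regimens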

Relevance:

100.00%

Publisher:

Abstract:

Managing the thermal conditions of rooms is an important part of building services engineering design. Room thermal conditions are usually modelled with methods in which the thermal dynamics are computed at a single calculation point in the room air and wall by wall in the structures, and usually only the room air temperature is examined. The aim of this Master's thesis was to develop a simulation model for room thermal conditions in which the thermal dynamics of the structures are computed transiently with an energy analysis calculation, and the flow field of the room air is modelled at a selected instant in time as steady-state with computational fluid dynamics. This yields, for the flow field, distributions of the quantities relevant to design, typically air temperature and velocity. The results of the simulation model were compared with measurements made in test rooms and proved accurate enough for building services design. Two rooms requiring more detailed modelling than usual were simulated with the model, and comparison calculations were carried out with different turbulence models, discretization accuracies and grid densities. To illustrate the simulation results, a customer report presenting the aspects essential to design was devised. The simulation model provided additional information especially on temperature stratification, which has typically been estimated on the basis of experience. As background to the development of the simulation model, the indoor climate of buildings, thermal conditions, calculation methods and commercial programs suitable for the modelling are reviewed. The simulation model provides more accurate and more detailed information for designing the management of thermal conditions. Remaining problems in using the model are the long computation time of the flow calculation, turbulence modelling, the accurate specification of the boundary conditions of supply air devices, and the convergence of the calculation. The developed simulation model offers a good basis for developing and combining flow calculation and energy analysis programs into a user-friendly design tool for building services engineering.