961 results for Time-dependent Analysis


Relevance:

100.00%

Abstract:

In this paper we detail some results advanced in a recent letter [Prado et al., Phys. Rev. Lett. 102, 073008 (2009)], showing how to engineer reservoirs for two-level systems at absolute zero by means of a time-dependent master equation leading to a nonstationary superposition equilibrium state. We also present a general recipe showing how to build nonadiabatic coherent evolutions of a fermionic system interacting with a bosonic mode, and investigate the influence of thermal reservoirs at finite temperature on the fidelity of the protected superposition state. Our analytical results are supported by numerical analysis of the full Hamiltonian model.
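
The basic machinery can be illustrated numerically with a generic Lindblad sketch (not the authors' engineered-reservoir protocol): a driven two-level system with a single decay channel, integrated with an ODE solver while tracking the fidelity of a target superposition. The Hamiltonian, decay rate and target state below are illustrative assumptions.

```python
# Minimal Lindblad sketch: a driven two-level system with one decay channel,
# tracking the fidelity of a target superposition state.  The Hamiltonian,
# decay rate and target state are illustrative assumptions, not the
# engineered reservoir of the paper.
import numpy as np
from scipy.integrate import solve_ivp

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sm = np.array([[0, 1], [0, 0]], dtype=complex)           # lowering operator |g><e|, basis [|g>, |e>]

Omega, gamma = 1.0, 0.1                                  # drive strength, decay rate (assumed)
H = 0.5 * Omega * sx                                     # assumed time-independent Hamiltonian
target = np.array([1, 1], dtype=complex) / np.sqrt(2)    # superposition to protect

def lindblad_rhs(t, y):
    rho = y.reshape(2, 2)
    drho = -1j * (H @ rho - rho @ H)
    drho += gamma * (sm @ rho @ sm.conj().T
                     - 0.5 * (sm.conj().T @ sm @ rho + rho @ sm.conj().T @ sm))
    return drho.ravel()

rho0 = np.outer(target, target.conj())                   # start in the target state
sol = solve_ivp(lindblad_rhs, (0, 50), rho0.ravel(), t_eval=np.linspace(0, 50, 200))

for t, y in zip(sol.t[::50], sol.y.T[::50]):
    rho = y.reshape(2, 2)
    fidelity = np.real(target.conj() @ rho @ target)     # F = <psi|rho|psi> for a pure target
    print(f"t = {t:5.1f}   fidelity = {fidelity:.3f}")
```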

Relevance:

100.00%

Abstract:

Secondary neurodegeneration takes place in the tissue surrounding a spinal cord trauma and substantially modifies the prognosis, given the small diameter of the cord's transverse axis. We analyzed neuronal and glial responses in the rat spinal cord after different degrees of contusion produced by the NYU Impactor. Rats were submitted to vertebral laminectomy and received moderate or severe contusions; control animals were sham operated. At 7 and 30 days post surgery, stereological analysis of Nissl-stained cellular profiles showed a progression of the lesion volume over time after moderate injury, but not after severe injury. The number of neurons was not altered cranial to the injury; however, the same degree of diminution was seen in the caudal cord 30 days after both severe and moderate injuries. Microdensitometric image analysis demonstrated a microglial reaction in the white matter 30 days after a moderate contusion and a widespread astroglial reaction in the white and gray matter 7 days after both severities. Astroglial activation persisted close to the lesion and in areas related to Wallerian degeneration. The data showed a more protracted secondary degeneration in the rat spinal cord after the milder contusion, which offers an opportunity for neuroprotective approaches. The temporal and regional glial responses are consistent with diverse glial cell functions in the lesioned spinal cord. (C) 2007 Elsevier Ltd. All rights reserved.

Relevance:

100.00%

Abstract:

Computational simulations of the title reaction are presented, covering a temperature range from 300 to 2000 K. At lower temperatures we find that initial formation of the cyclopropene complex by addition of methylene to acetylene is irreversible, as is the stabilisation process via collisional energy transfer. Product branching between propargyl and the stable isomers is predicted at 300 K as a function of pressure for the first time. At intermediate temperatures (1200 K), complex temporal evolution involving multiple steady states begins to emerge. At high temperatures (2000 K) the timescale for subsequent unimolecular decay of thermalized intermediates begins to impinge on the timescale for reaction of methylene, such that the rate of formation of propargyl product does not admit a simple analysis in terms of a single time-independent rate constant until the methylene supply becomes depleted. Likewise, at the elevated temperatures the thermalized intermediates cannot be regarded as irreversible product channels. Our solution algorithm involves spectral propagation of a symmetrised version of the discretized master equation matrix, and is implemented in a high precision environment which makes hitherto unachievable low-temperature modelling a reality.
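
The spectral-propagation step can be sketched on a toy energy-grained master equation: symmetrise the rate matrix with the equilibrium (Boltzmann) populations, diagonalise it, and propagate the grain populations analytically in the eigenbasis. The grain energies, temperature and collisional-transfer model below are arbitrary assumptions, and ordinary double precision is used rather than the high-precision arithmetic the abstract describes.

```python
# Toy energy-grained master equation dp/dt = M p, solved by symmetrising M with
# the Boltzmann populations and propagating in its eigenbasis.  Grain energies,
# temperature and the rate model are arbitrary assumptions.
import numpy as np

kT = 1.0
E = np.linspace(0.0, 5.0, 40)                     # grain energies (arbitrary units)
f = np.exp(-E / kT); f /= f.sum()                 # Boltzmann equilibrium populations

n = len(E)
M = np.zeros((n, n))                              # M[i, j] = rate of grain j -> grain i
k_dn = 0.5                                        # downward collisional rate (assumed)
for i in range(n - 1):
    k_up = k_dn * f[i + 1] / f[i]                 # detailed balance fixes the upward rate
    M[i + 1, i] += k_up                           # grain i -> i+1
    M[i, i + 1] += k_dn                           # grain i+1 -> i
M -= np.diag(M.sum(axis=0))                       # conservation: columns sum to zero

# Symmetrise with the equilibrium populations, then diagonalise.
s = np.sqrt(f)
S = (M * s[np.newaxis, :]) / s[:, np.newaxis]     # S = F^{-1/2} M F^{1/2}, symmetric
lam, U = np.linalg.eigh(S)

p0 = np.zeros(n); p0[-1] = 1.0                    # all population starts in the top grain
def propagate(t):
    return s * (U @ (np.exp(lam * t) * (U.T @ (p0 / s))))

for t in (0.1, 1.0, 10.0, 100.0):
    p = propagate(t)
    print(f"t = {t:6.1f}   <E> = {np.dot(E, p):.3f}")
```

The zero eigenvalue of S corresponds to the equilibrium distribution, and the slowest nonzero eigenvalue sets the relaxation timescale that the full calculation resolves at low temperature.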

Relevance:

100.00%

Abstract:

Many large-scale stochastic systems, such as telecommunications networks, can be modelled using a continuous-time Markov chain. However, it is frequently the case that a satisfactory analysis of their time-dependent, or even equilibrium, behaviour is impossible. In this paper, we propose a new method of analysing Markovian models, whereby the existing transition structure is replaced by a more amenable one. Using rates of transition given by the equilibrium expected rates of the corresponding transitions of the original chain, we are able to approximate its behaviour. We present two formulations of the idea of expected rates. The first provides a method for analysing time-dependent behaviour, while the second provides a highly accurate means of analysing equilibrium behaviour. We shall illustrate our approach with reference to a variety of models, giving particular attention to queueing and loss networks. (C) 2003 Elsevier Ltd. All rights reserved.
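
A minimal numerical sketch of the ingredients (not the paper's full expected-rates construction): compute the equilibrium distribution of a small continuous-time Markov chain from its generator, then form the equilibrium expected rate of a class of transitions. The example chain, rates and truncation level are illustrative assumptions.

```python
# Sketch: equilibrium distribution of a small continuous-time Markov chain and the
# equilibrium expected rate of a transition class.  The example chain is an
# M/M/1 queue truncated at K customers (an illustrative assumption).
import numpy as np

lam, mu, K = 0.8, 1.0, 20                      # arrival rate, service rate, truncation
n = K + 1
Q = np.zeros((n, n))                           # Q[i, j] = rate of state i -> state j
for i in range(K):
    Q[i, i + 1] = lam                          # arrival: i -> i+1
    Q[i + 1, i] = mu                           # departure: i+1 -> i
Q -= np.diag(Q.sum(axis=1))                    # generator rows sum to zero

# Solve pi Q = 0 with sum(pi) = 1.
A = np.vstack([Q.T, np.ones(n)])
b = np.zeros(n + 1); b[-1] = 1.0
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

# Equilibrium expected rate of "departure" transitions: sum_i pi_i * mu * 1{i > 0}.
expected_departure_rate = mu * pi[1:].sum()
print("P(empty) =", round(pi[0], 4))
print("equilibrium expected departure rate =", round(expected_departure_rate, 4))
```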

Relevance:

100.00%

Abstract:

The receiver operating characteristic (ROC) curve is the most widely used measure for evaluating the performance of a diagnostic biomarker when predicting a binary disease outcome. The ROC curve displays the true positive rate (or sensitivity) and the false positive rate (or 1-specificity) for the different cut-off values used to classify an individual as healthy or diseased. In time-to-event studies, however, the disease status of an individual (e.g. dead or alive) is not a fixed characteristic; it varies over the course of the study. In such cases, when evaluating the performance of the biomarker, several issues should be taken into account: first, the time-dependent nature of the disease status; and second, the presence of incomplete data (e.g. the censored data typically present in survival studies). Accordingly, to assess the discrimination power of continuous biomarkers for time-dependent disease outcomes, time-dependent extensions of the true positive rate, the false positive rate, and the ROC curve have recently been proposed. In this work, we present new nonparametric estimators of the cumulative/dynamic time-dependent ROC curve that allow accounting for the possible modifying effect of current or past covariate measures on the discriminatory power of the biomarker. The proposed estimators can accommodate right-censored data, as well as covariate-dependent censoring. The behavior of the estimators proposed in this study will be explored through simulations and illustrated using data from a cohort of patients who suffered from acute coronary syndrome.
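
For orientation, here is a sketch of the basic unadjusted cumulative/dynamic time-dependent ROC at a horizon t, using inverse probability of censoring weights (IPCW). It is not the covariate-adjusted estimators proposed in the paper; the simulated data, horizon and cut-off grid are illustrative assumptions.

```python
# Cumulative/dynamic time-dependent ROC at horizon t with IPCW weights.
# Basic unadjusted construction on simulated data -- not the covariate-adjusted
# estimators proposed in the paper.
import numpy as np

rng = np.random.default_rng(0)
n = 2000
marker = rng.normal(size=n)
event_time = rng.exponential(scale=np.exp(-marker))   # higher marker -> earlier event
cens_time = rng.exponential(scale=2.0, size=n)
time = np.minimum(event_time, cens_time)
delta = (event_time <= cens_time).astype(float)       # 1 = event observed

def km_censoring_survival(time, delta, t):
    """Kaplan-Meier estimate of the censoring survival G(t) = P(C > t)."""
    order = np.argsort(time)
    t_sorted, d_sorted = time[order], delta[order]
    G = 1.0
    for i, ti in enumerate(t_sorted):
        if ti > t:
            break
        if d_sorted[i] == 0:                          # a censoring "event"
            at_risk = len(t_sorted) - i
            G *= 1.0 - 1.0 / at_risk
    return G

def cumulative_dynamic_roc(marker, time, delta, t, cutoffs):
    G = np.array([km_censoring_survival(time, delta, min(ti, t)) for ti in time])
    case = (time <= t) & (delta == 1)                 # event by time t
    control = time > t                                # still event-free at time t
    w_case = case / np.maximum(G, 1e-12)              # IPCW weights for cases
    tpr, fpr = [], []
    for c in cutoffs:
        pos = marker > c
        tpr.append((w_case * pos).sum() / w_case.sum())
        fpr.append((control & pos).sum() / control.sum())
    return np.array(fpr), np.array(tpr)

cutoffs = np.quantile(marker, np.linspace(0.01, 0.99, 50))
fpr, tpr = cumulative_dynamic_roc(marker, time, delta, t=1.0, cutoffs=cutoffs)
order = np.argsort(fpr)
auc = np.sum(np.diff(fpr[order]) * 0.5 * (tpr[order][1:] + tpr[order][:-1]))
print("AUC(t=1) approx.", round(auc, 3))
```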

Relevance:

100.00%

Abstract:

In this paper the two main drawbacks of the heat balance integral methods are examined. Firstly we investigate the choice of approximating function. For a standard polynomial form it is shown that combining the Heat Balance and Refined Integral methods to determine the power of the highest-order term leads either to the same or, more often, to greatly improved accuracy compared with standard methods. Secondly we examine thermal problems with a time-dependent boundary condition. In doing so we develop a logarithmic approximating function. This new function allows us to model moving peaks in the temperature profile, a feature that previous heat balance methods cannot capture. If the boundary temperature varies so that at some time t > 0 it equals the far-field temperature, then standard methods predict that the temperature is everywhere at this constant value; the new method predicts the correct behaviour. It is also shown that this function provides even more accurate results, when coupled with the new CIM, than the polynomial profile. The analysis primarily focuses on a specified constant boundary temperature and is then extended to constant-flux, Newton cooling, and time-dependent boundary conditions.
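
For the classical benchmark (semi-infinite solid, constant surface temperature), the heat balance integral with the polynomial profile u = (1 - x/delta)^n gives the closed-form penetration depth delta(t) = sqrt(2 n (n+1) alpha t), which can be checked against the exact similarity solution. The sketch below uses that textbook case with an assumed exponent n = 2; it is not the logarithmic profile or the CIM developed in the paper.

```python
# Heat balance integral sketch for the textbook benchmark: semi-infinite solid,
# constant surface temperature.  With u = (1 - x/delta)^n the heat balance
# integral gives delta(t) = sqrt(2*n*(n+1)*alpha*t); compare with the exact
# similarity solution u = erfc(x / (2*sqrt(alpha*t))).  Classical case only,
# not the logarithmic profile / CIM of the paper.
import numpy as np
from scipy.special import erfc

alpha, t, n = 1.0, 1.0, 2          # diffusivity, time, profile exponent (assumed)
delta = np.sqrt(2 * n * (n + 1) * alpha * t)

x = np.linspace(0, 1.5 * delta, 300)
u_hbim = np.where(x < delta, (1 - x / delta) ** n, 0.0)
u_exact = erfc(x / (2 * np.sqrt(alpha * t)))

err = np.max(np.abs(u_hbim - u_exact))
print(f"penetration depth delta = {delta:.3f}")
print(f"max absolute error of the quadratic HBIM profile = {err:.3f}")
```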

Relevance:

100.00%

Abstract:

Paracoccidioides brasiliensis causes infection when the host inhales airborne propagules from the mycelial phase of the fungus. The fungus reaches the lungs, differentiates into the yeast form, and is then disseminated to virtually all parts of the body. Here we review the identification of differentially expressed genes in host-interaction conditions. These genes were identified by analyzing expressed sequence tags (ESTs) from P. brasiliensis cDNA libraries. P. brasiliensis was recovered from infected mouse liver, as well as from fungal yeast cells incubated in human blood and plasma, mimicking fungal dissemination to organs and tissues and sites of infection with inflammation, respectively. In addition, ESTs from a cDNA library of P. brasiliensis mycelium undergoing the transition to yeast were previously analyzed. Together, these studies reveal significant changes in the expression of a number of genes of potential importance in the host-fungus interaction. In addition, the unique and divergent representation of transcripts when the cDNA libraries are compared suggests differential gene expression in response to specific niches in the host. This analysis of gene expression patterns provides details about host-pathogen interactions and the peculiarities of sites within the host.

Relevance:

100.00%

Abstract:

We show that time-dependent couplings may lead to nontrivial scaling properties of the surface fluctuations of the asymptotic regime in nonequilibrium kinetic roughening models. Three typical situations are studied. In the case of a crossover between two different rough regimes, the time-dependent coupling may result in anomalous scaling for scales above the crossover length. In a different setting, for a crossover from a rough to either a flat or a damping regime, the time-dependent crossover length may conspire to produce a rough surface, although the most relevant term tends to flatten the surface. In addition, our analysis sheds light on an existing debate in the problem of spontaneous imbibition, where time-dependent couplings naturally arise in theoretical models and experiments.
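
As a stand-in toy model (not the models analysed in the paper), one can simulate 1D Edwards-Wilkinson growth with a time-dependent smoothing coefficient and watch how the interface width responds to the crossover. The equation, crossover schedule and parameters below are illustrative assumptions.

```python
# Toy kinetic-roughening sketch: 1D Edwards-Wilkinson growth with a time-dependent
# smoothing coefficient nu(t), tracking the interface width W(t).  The equation and
# the crossover schedule are illustrative stand-ins, not the models of the paper.
import numpy as np

rng = np.random.default_rng(1)
L, dt, steps = 512, 0.02, 20000
D = 1.0                                                 # noise strength (assumed)

def nu(t):
    return 1.0 if t < 200 else 10.0                     # assumed crossover in the coupling

h = np.zeros(L)
for step in range(1, steps + 1):
    t = step * dt
    lap = np.roll(h, 1) - 2 * h + np.roll(h, -1)        # periodic discrete Laplacian
    h += dt * nu(t) * lap + np.sqrt(2 * D * dt) * rng.normal(size=L)
    if step % 4000 == 0:
        W = np.sqrt(np.mean((h - h.mean()) ** 2))       # interface width
        print(f"t = {t:7.1f}   W = {W:.3f}")
```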

Relevance:

100.00%

Abstract:

Background. Case-control studies are widely used by epidemiologists to evaluate the impact of certain exposures on a particular disease. These exposures may be represented by several time-dependent variables, and new methods are needed to estimate their effects accurately. Indeed, logistic regression, the conventional method for analysing case-control data, does not directly account for changes in covariate values over time. By contrast, survival analysis methods such as the Cox proportional hazards model can directly incorporate time-dependent covariates representing individual exposure histories. However, this requires careful handling of the risk sets because of the over-sampling of cases, relative to controls, in case-control studies. As shown in a previous simulation study, the optimal definition of risk sets for the analysis of case-control data remains to be elucidated, and to be investigated in the case of time-dependent variables. Objective. The general objective is to propose and study new versions of the Cox model for estimating the impact of time-varying exposures in case-control studies, and to apply them to real case-control data on lung cancer and smoking. Methods. I identified new, potentially optimal risk-set definitions (the Weighted Cox model and the Simple weighted Cox model), in which different weights were assigned to cases and controls in order to reflect the proportions of cases and non-cases in the source population. The properties of the estimators of the exposure effects were studied by simulation. Different aspects of exposure were generated (intensity, duration, cumulative exposure value). The generated case-control data were then analysed with different versions of the Cox model, including the old and new risk-set definitions, as well as with conventional logistic regression, for comparison. The different regression models were then applied to real case-control data on lung cancer. The estimates of the effects of different smoking variables obtained with the different methods were compared with each other and with the simulation results. Results. The simulation results show that the estimates from the proposed new weighted Cox models, especially those from the Weighted Cox model, are much less biased than the estimates from existing Cox models that simply include or exclude the future cases from each risk set. Moreover, the Weighted Cox model estimates were slightly, but systematically, less biased than those from logistic regression. The application to the real data shows larger differences between the estimates from logistic regression and from the weighted Cox models for some time-dependent smoking variables. Conclusions. The results suggest that the proposed new weighted Cox model could be an interesting alternative to logistic regression for estimating the effects of time-dependent exposures in case-control studies.
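
To show where subject-specific weights enter a Cox partial likelihood, here is a self-contained sketch with a single time-fixed covariate: cases and controls receive different weights in both the event terms and the risk-set sums, and the weighted negative log partial likelihood is maximised numerically. This only illustrates the mechanics; it is not the Weighted Cox model developed in the thesis, and the data, weights and covariate are simulated assumptions.

```python
# Sketch of a weighted Cox partial likelihood: subject weights w_i enter both the
# event terms and the risk-set sums.  One time-fixed covariate, simulated data and
# arbitrary case/control weights -- not the Weighted Cox model of the thesis.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(2)
n = 500
x = rng.normal(size=n)                               # exposure covariate
true_beta = 0.7
t_event = rng.exponential(scale=np.exp(-true_beta * x))
t_cens = rng.exponential(scale=2.0, size=n)
time = np.minimum(t_event, t_cens)
event = (t_event <= t_cens).astype(float)

# Arbitrary example weights: controls up-weighted relative to cases.
w = np.where(event == 1, 1.0, 2.0)

def neg_weighted_log_partial_likelihood(beta):
    nll = 0.0
    for i in np.where(event == 1)[0]:
        at_risk = time >= time[i]                    # risk set at this event time
        denom = np.sum(w[at_risk] * np.exp(beta * x[at_risk]))
        nll -= w[i] * (beta * x[i] - np.log(denom))
    return nll

fit = minimize_scalar(neg_weighted_log_partial_likelihood, bounds=(-5, 5), method="bounded")
print("weighted partial-likelihood estimate of beta:", round(fit.x, 3))
```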

Relevance:

100.00%

Abstract:

The objective of this thesis is to study the time-dependent behaviour of some complex queueing and inventory models. It contains a detailed analysis of the basic stochastic processes underlying these models. In the theory of queues, the analysis of time-dependent behaviour is an area very little developed compared to steady-state theory. Time dependence certainly seems worth studying from an application point of view but, unfortunately, the analytic difficulties are considerable. Closed-form solutions are complicated even for such simple models as M/M/1. Outside M/M/1, time-dependent solutions have been found only in special cases and most often involve double transforms, which provide very little insight into the behaviour of the queueing systems themselves. In inventory theory, too, few results are available giving the time-dependent solution of the system size probabilities. Our emphasis is on explicit results free from all types of transforms, and the method used may be of special interest to a wide variety of problems having regenerative structure.
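
Although closed-form transient solutions are awkward, the time-dependent state probabilities are easy to compute numerically for a truncated state space. The sketch below evaluates the transient queue-length distribution of an M/M/1 queue via the matrix exponential of a truncated generator; the rates and truncation level are illustrative assumptions, and this is a numerical check rather than the transform-free analytic results of the thesis.

```python
# Transient (time-dependent) state probabilities of an M/M/1 queue, computed by
# truncating the state space at N customers and exponentiating the generator.
# Rates and truncation level are illustrative assumptions.
import numpy as np
from scipy.linalg import expm

lam, mu, N = 0.7, 1.0, 60                    # arrival rate, service rate, truncation
n = N + 1
Q = np.zeros((n, n))
for i in range(N):
    Q[i, i + 1] = lam                        # arrival
    Q[i + 1, i] = mu                         # service completion
Q -= np.diag(Q.sum(axis=1))                  # generator rows sum to zero

p0 = np.zeros(n); p0[0] = 1.0                # start empty
for t in (1.0, 5.0, 20.0, 100.0):
    p_t = p0 @ expm(Q * t)
    mean_q = p_t @ np.arange(n)
    print(f"t = {t:6.1f}   P(empty) = {p_t[0]:.4f}   mean queue length = {mean_q:.3f}")

# As t grows, p_t approaches the geometric steady-state distribution with
# parameter rho = lam / mu = 0.7.
```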

Relevance:

100.00%

Abstract:

Soil fertility constraints to crop production have been recognized widely as a major obstacle to food security and agro-ecosystem sustainability in sub-Saharan West Africa. As such, they have led to a multitude of research projects and policy debates on how best they should be overcome. Conclusions, based on long-term multi-site experiments, are lacking with respect to a regional assessment of phosphorus and nitrogen fertilizer effects, surface mulched crop residues, and legume rotations on total dry matter of cereals in this region. A mixed model time-trend analysis was used to investigate the effects of four nitrogen and phosphorus rates, annually applied crop residue dry matter at 500 and 2000 kg ha^-1, and cereal-legume rotation versus continuous cereal cropping on the total dry matter of cereals and legumes. The multi-factorial experiment was conducted over four years at eight locations, with annual rainfall ranging from 510 to 1300 mm, in Niger, Burkina Faso, and Togo. With the exception of phosphorus, treatment effects on legume growth were marginal. At most locations, except for typical Sudanian sites with very low base saturation and high rainfall, phosphorus effects on cereal total dry matter were much lower with rock phosphate than with soluble phosphorus, unless the rock phosphate was combined with an annual seed-placement of 4 kg ha^-1 phosphorus. Across all other treatments, nitrogen effects were negligible at 500 mm annual rainfall but at 900 mm, the highest nitrogen rate led to total dry matter increases of up to 77% and, at 1300 mm, to 183%. Mulch-induced increases in cereal total dry matter were larger with lower base saturation, reaching 45% on typical acid sandy Sahelian soils. Legume rotation effects tended to increase over time but were strongly species-dependent.
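
The kind of mixed-model time-trend analysis described can be sketched as follows: fixed effects for fertiliser rate, mulch and year (with a rate-by-year interaction for the time trend) and a random intercept per site. The column names, factor levels and simulated data are hypothetical; this is not the authors' model specification.

```python
# Sketch of a mixed-model time-trend analysis: fixed effects for nitrogen rate,
# mulch and year (nitrogen-by-year interaction = time trend), random intercept per
# site.  Column names and the simulated data are hypothetical assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
rows = []
for s in range(8):                                   # eight locations, as in the study design
    site_effect = rng.normal(scale=300)
    for year in (1, 2, 3, 4):
        for n_rate in (0, 30, 60, 120):              # kg N ha^-1 (assumed levels)
            for mulch in (500, 2000):                # kg residue ha^-1
                tdm = (2000 + site_effect + 8 * n_rate * (1 + 0.1 * year)
                       + 0.2 * mulch + rng.normal(scale=200))
                rows.append(dict(site=f"site{s}", year=year, n_rate=n_rate,
                                 mulch=mulch, tdm=tdm))
df = pd.DataFrame(rows)

model = smf.mixedlm("tdm ~ n_rate * year + mulch", data=df, groups=df["site"])
result = model.fit()
print(result.summary())
```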

Relevance:

100.00%

Abstract:

We present the extension of a methodology to solve moving boundary value problems from the second-order case to the case of the third-order linear evolution PDE qt + qxxx = 0. This extension is the crucial step needed to generalize this methodology to PDEs of arbitrary order. The methodology is based on the derivation of inversion formulae for a class of integral transforms that generalize the Fourier transform and on the analysis of the global relation associated with the PDE. The study of this relation and its inversion using the appropriate generalized transform are the main elements of the proof of our results.
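
For orientation, the dispersion relation underlying qt + qxxx = 0 and the resulting whole-line Fourier representation are elementary; this is the classical special case that the generalized transforms of the paper extend to moving-boundary domains.

```latex
% Whole-line Fourier solution of q_t + q_{xxx} = 0 (classical special case; the
% paper's generalized transforms extend this setting to moving-boundary domains).
% Substituting q = e^{ikx - i\omega t} gives -i\omega + (ik)^3 = 0, i.e. \omega(k) = -k^3, so
\[
  q(x,t) \;=\; \frac{1}{2\pi} \int_{-\infty}^{\infty}
      \hat{q}_0(k)\, e^{\,i k x + i k^{3} t}\, \mathrm{d}k,
  \qquad
  \hat{q}_0(k) \;=\; \int_{-\infty}^{\infty} q(x,0)\, e^{-i k x}\, \mathrm{d}x .
\]
```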

Relevance:

100.00%

Abstract:

This paper demonstrates by means of joint time-frequency analysis that the acoustic noise produced by the breaking of biscuits depends on relative humidity and water activity. It also shows that the time-frequency coefficients calculated using the adaptive Gabor transformation algorithm depend on the length of time a biscuit is exposed to humidity. This is a new methodology that can be used to assess the crispness of crisp foods. (c) 2007 Elsevier Ltd. All rights reserved.
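
A minimal time-frequency sketch: a short-time Fourier transform (spectrogram) of a synthetic "snap" embedded in noise. This stands in for, and is not, the adaptive Gabor transform used in the paper; the signal, sampling rate and window parameters are assumptions.

```python
# Joint time-frequency sketch: spectrogram (short-time Fourier transform) of a
# synthetic "snap" embedded in noise.  Stands in for -- and is not -- the adaptive
# Gabor transform of the paper; the signal and parameters are simulated assumptions.
import numpy as np
from scipy.signal import spectrogram

fs = 22050                                        # sampling rate in Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)
signal = 0.02 * np.random.default_rng(4).normal(size=t.size)
snap = (t > 0.40) & (t < 0.45)                    # a brief 3 kHz burst around t = 0.42 s
signal[snap] += np.sin(2 * np.pi * 3000 * t[snap]) * np.hanning(snap.sum())

f, tt, Sxx = spectrogram(signal, fs=fs, nperseg=512, noverlap=384)
peak = np.unravel_index(np.argmax(Sxx), Sxx.shape)
print(f"dominant component near {f[peak[0]]:.0f} Hz at t = {tt[peak[1]]:.2f} s")
```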

Relevance:

100.00%

Abstract:

The real effects of an imperfectly credible disinflation depend critically on the extent of price rigidity. Therefore, the study of how policymakers’ credibility affects the outcome of an announced disinflation should not be dissociated from the analysis of the determinants of the frequency of price adjustments. In this paper we examine how the policymaker’s credibility affects the outcome of an announced disinflation in a model with endogenous time-dependent pricing rules. Both the initial degree of price rigidity, calculated optimally, and, more notably, the changes in contract length during disinflation play an important role in explaining the effects of imperfect credibility. We initially evaluate the costs of disinflation in a setup where credibility is exogenous, and then allow agents to update beliefs about the “type” of monetary authority that they face. We show that, in both cases, the interaction between the endogeneity of time-dependent rules and imperfect credibility increases the output costs of disinflation.
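
The belief-updating ingredient can be sketched on its own: agents hold a prior probability that the monetary authority is the committed (disinflating) type and update it by Bayes' rule after each observed inflation outcome. The prior, the two type-specific likelihoods and the simulated path below are illustrative assumptions, detached from the paper's full pricing model.

```python
# Bayesian updating of credibility: the probability that the monetary authority is
# the "committed" (disinflating) type, updated after each observed inflation draw.
# Prior, likelihoods and the simulated path are illustrative assumptions only.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)
pi_committed, pi_opportunistic, sigma = 0.02, 0.06, 0.01   # assumed inflation means / noise
belief = 0.5                                               # prior P(committed)

# Suppose the authority actually is committed; agents observe noisy inflation.
observed = rng.normal(loc=pi_committed, scale=sigma, size=8)

for quarter, infl in enumerate(observed, start=1):
    like_c = norm.pdf(infl, loc=pi_committed, scale=sigma)
    like_o = norm.pdf(infl, loc=pi_opportunistic, scale=sigma)
    belief = belief * like_c / (belief * like_c + (1 - belief) * like_o)
    print(f"quarter {quarter}: inflation = {infl:.3f}, P(committed) = {belief:.3f}")
```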