957 results for Dynamic Emission Models


Relevance: 30.00%

Abstract:

A class of multi-process models is developed for collections of time-indexed count data. Autocorrelation in counts is achieved with dynamic models for the natural parameter of the binomial distribution. In addition to modeling binomial time series, the framework includes dynamic models for multinomial and Poisson time series. Markov chain Monte Carlo (MCMC) and Pólya-Gamma data augmentation (Polson et al., 2013) are critical for fitting multi-process models of counts. To facilitate computation when the counts are high, a Gaussian approximation to the Pólya-Gamma random variable is developed.
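The Pólya-Gamma construction can be sketched directly from its infinite-sum-of-gammas representation. The following is a minimal illustration, not the dissertation's sampler: a truncated-series PG(b, z) draw together with the closed-form mean that Gaussian approximations match; the truncation level and the numbers in the check are illustrative.

```python
import numpy as np

def sample_pg(b, z, n_samples, n_terms=200, rng=None):
    """Draw approximate PG(b, z) samples via the truncated sum-of-gammas
    representation (Polson et al., 2013):
        omega = (1 / (2 pi^2)) * sum_k g_k / ((k - 1/2)^2 + z^2 / (4 pi^2)),
    with g_k ~ Gamma(b, 1). Truncating at n_terms introduces a small
    downward bias of order 1 / n_terms."""
    rng = np.random.default_rng(rng)
    k = np.arange(1, n_terms + 1)
    denom = (k - 0.5) ** 2 + z ** 2 / (4.0 * np.pi ** 2)
    g = rng.gamma(shape=b, scale=1.0, size=(n_samples, n_terms))
    return (g / denom).sum(axis=1) / (2.0 * np.pi ** 2)

def pg_mean(b, z):
    """Closed-form mean: E[PG(b, z)] = b/(2z) * tanh(z/2), with limit b/4 at z = 0."""
    if abs(z) < 1e-8:
        return b / 4.0
    return b / (2.0 * z) * np.tanh(z / 2.0)

# Monte Carlo check of the sampler against the closed-form mean.
draws = sample_pg(b=1.0, z=1.0, n_samples=20000, rng=0)
print(draws.mean(), pg_mean(1.0, 1.0))
```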

Three applied analyses are presented to explore the utility and versatility of the framework. The first analysis develops a model for complex dynamic behavior of themes in collections of text documents. Documents are modeled as a “bag of words”, and the multinomial distribution is used to characterize uncertainty in the vocabulary terms appearing in each document. State-space models for the natural parameters of the multinomial distribution induce autocorrelation in themes and their proportional representation in the corpus over time.

The second analysis develops a dynamic mixed membership model for Poisson counts. The model is applied to a collection of time series which record neuron level firing patterns in rhesus monkeys. The monkey is exposed to two sounds simultaneously, and Gaussian processes are used to smoothly model the time-varying rate at which the neuron’s firing pattern fluctuates between features associated with each sound in isolation.
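As a toy illustration of this kind of model, a neuron's firing rate can be written as a time-varying convex combination of the rates associated with each sound alone. Everything below is invented for illustration (the rates, the binning, and a fixed smooth curve standing in for an actual Gaussian-process draw):

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 200)      # trial time (s), illustrative grid
lam_a, lam_b = 40.0, 10.0           # firing rates (Hz) for each sound in isolation

# Stand-in for a smooth Gaussian-process draw: a fixed smooth curve on the
# logit scale; a real analysis would place a GP prior on f.
f = 2.0 * np.sin(2.0 * np.pi * t)
alpha = 1.0 / (1.0 + np.exp(-f))    # time-varying mixture weight in (0, 1)

rate = alpha * lam_a + (1.0 - alpha) * lam_b   # instantaneous Poisson rate
dt = t[1] - t[0]
counts = rng.poisson(rate * dt)     # binned spike counts for one trial

print(counts.sum(), rate.mean())
```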

The third analysis presents a switching dynamic generalized linear model for the time-varying home run totals of professional baseball players. The model endows each player with an age specific latent natural ability class and a performance enhancing drug (PED) use indicator. As players age, they randomly transition through a sequence of ability classes in a manner consistent with traditional aging patterns. When the performance of the player significantly deviates from the expected aging pattern, he is identified as a player whose performance is consistent with PED use.

All three models provide a mechanism for sharing information across related series locally in time. The models are fit with variations on the Pólya-Gamma Gibbs sampler, MCMC convergence diagnostics are developed, and reproducible inference is emphasized throughout the dissertation.

Relevance: 30.00%

Abstract:

Urban problems have several features that make them inherently dynamic. Large transaction costs all but guarantee that homeowners will do their best to consider how a neighborhood might change before buying a house. Similarly, stores face large sunk costs when opening, and want to be sure that their investment will pay off in the long run. In line with those concerns, different areas of Economics have made recent advances in modeling those questions within a dynamic framework. This dissertation contributes to those efforts.

Chapter 2 discusses how to model an agent’s location decision when the agent must learn about an exogenous amenity that may be changing over time. The model is applied to estimating the marginal willingness to pay to avoid crime, in which agents are learning about the crime rate in a neighborhood, and the crime rate can change in predictable (Markovian) ways.

Chapters 3 and 4 concentrate on location decision problems when there are externalities between decision makers. Chapter 3 focuses on the decision of business owners to open a store, when its demand is a function of other nearby stores, either through competition, or through spillovers on foot traffic. It uses a dynamic model in continuous time to model agents’ decisions. A particular challenge is isolating the contribution of spillovers from the contribution of other unobserved neighborhood attributes that could also lead to agglomeration. A key contribution of this chapter is showing how we can use information on storefront ownership to help separately identify spillovers.

Finally, Chapter 4 focuses on a class of models in which families prefer to live close to similar neighbors. This chapter provides the first simulation of such a model in which agents are forward looking, and shows that this leads to more segregation than would be observed with myopic agents, which are the standard in this literature. The chapter also discusses several extensions of the model that can be used to investigate relevant questions such as the arrival of a large contingent of high-skilled tech workers in San Francisco, the immigration of Hispanic families to several southern American cities, large changes in local amenities, such as the construction of magnet schools or metro stations, and the flight of wealthy residents from cities in the Rust Belt, such as Detroit.

Relevance: 30.00%

Abstract:

RNA viruses are an important cause of global morbidity and mortality. The rapid evolutionary rates of RNA virus pathogens, caused by high replication rates and error-prone polymerases, can make the pathogens difficult to control. RNA viruses can undergo immune escape within their hosts and develop resistance to the treatment and vaccines we design to fight them. Understanding the spread and evolution of RNA pathogens is essential for reducing human suffering. In this dissertation, I make use of the rapid evolutionary rate of viral pathogens to answer several questions about how RNA viruses spread and evolve. To address each of the questions, I link mathematical techniques for modeling viral population dynamics with phylogenetic and coalescent techniques for analyzing and modeling viral genetic sequences and evolution. The first project uses multi-scale mechanistic modeling to show that decreases in viral substitution rates over the course of an acute infection, combined with the timing of infectious hosts transmitting new infections to susceptible individuals, can account for discrepancies in viral substitution rates in different host populations. The second project combines coalescent models with within-host mathematical models to identify driving evolutionary forces in chronic hepatitis C virus infection. The third project compares the effects of intrinsic and extrinsic viral transmission rate variation on viral phylogenies.

Relevance: 30.00%

Abstract:

To provide biological insights into transcriptional regulation, a couple of groups have recently presented models relating promoter DNA-bound transcription factors (TFs) to a downstream gene's mean transcript level or transcript production rate over time. However, transcript production is dynamic in response to changes in TF concentrations over time. Also, TFs are not the only factors binding to promoters; other DNA-binding factors (DBFs) bind as well, especially nucleosomes, resulting in competition between DBFs for binding at the same genomic location. Additionally, not only TFs but also other elements regulate transcription. Within the core promoter, various regulatory elements influence RNAPII recruitment, PIC formation, RNAPII searching for the TSS, and RNAPII initiating transcription. Moreover, it has been proposed that downstream of the TSS, nucleosomes resist RNAPII elongation.

Here, we provide a machine learning framework to predict transcript production rates from DNA sequences. We applied this framework in the yeast S. cerevisiae in two scenarios: a) to predict the dynamic transcript production rate during the cell cycle for native promoters; and b) to predict the mean transcript production rate over time for synthetic promoters. As far as we know, our framework is the first successful attempt at a model that can predict dynamic transcript production rates from DNA sequences alone: on the cell cycle data set, we obtained a Pearson correlation coefficient Cp = 0.751 and a coefficient of determination r2 = 0.564 on the test set for predicting the dynamic transcript production rate over time. Also, in the DREAM6 Gene Promoter Expression Prediction challenge, our fitted model outperformed all participating teams, as well as a model combining the best team's k-mer based sequence features with another paper's biologically mechanistic features, in terms of all scoring metrics.
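The general recipe, k-mer counts as sequence features feeding a penalized linear model, can be sketched in a few lines. The example below is a hedged toy, not the paper's actual feature set or fitting procedure: the sequences and the response are invented, and ridge regression stands in for whatever sparse learner was actually used.

```python
import itertools
import numpy as np

def kmer_features(seqs, k=2):
    """Count occurrences of each DNA k-mer in each sequence."""
    kmers = ["".join(p) for p in itertools.product("ACGT", repeat=k)]
    index = {km: i for i, km in enumerate(kmers)}
    X = np.zeros((len(seqs), len(kmers)))
    for r, s in enumerate(seqs):
        for i in range(len(s) - k + 1):
            X[r, index[s[i:i + k]]] += 1
    return X

def ridge_fit(X, y, lam=1.0):
    """Closed-form ridge regression: w = (X'X + lam I)^-1 X'y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Toy promoter sequences and a synthetic response built from one k-mer count;
# real inputs would pair promoter sequences with measured production rates.
seqs = ["ACGTACGT", "TTTTACGT", "ACACACAC", "GGGGCCCC", "ATATATAT", "CGCGCGCG"]
X = kmer_features(seqs, k=2)
y = X[:, 1] * 0.5 + 0.1            # synthetic target for illustration
w = ridge_fit(X, y, lam=1e-6)      # near-unpenalized fit on this toy data
pred = X @ w
r = np.corrcoef(y, pred)[0, 1]     # Pearson correlation on the training data
print(round(r, 3))
```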

Moreover, our framework shows its capability of identifying generalizable features by interpreting the highly predictive models, and thereby provides support for associated hypothesized mechanisms of transcriptional regulation. With the learned sparse linear models, we obtained results supporting the following biological insights: a) TFs govern the probability of RNAPII recruitment and initiation, possibly through interactions with PIC components and transcription cofactors; b) the core promoter amplifies transcript production, probably by influencing PIC formation, RNAPII recruitment, DNA melting, RNAPII searching for and selecting the TSS, the release of RNAPII from general transcription factors, and thereby initiation; c) there is strong transcriptional synergy between TFs and core promoter elements; d) the regulatory elements within the core promoter region are more than the TATA box and the nucleosome-free region, suggesting the existence of still-unidentified TAF-dependent and cofactor-dependent core promoter elements in the yeast S. cerevisiae; e) nucleosome occupancy is helpful for representing the regulatory roles of the +1 and -1 nucleosomes on transcription.

Relevance: 30.00%

Abstract:

Terrestrial ecosystems, occupying more than 25% of the Earth's surface, can serve as 'biological valves' in regulating the anthropogenic emissions of atmospheric aerosol particles and greenhouse gases (GHGs) as responses to their surrounding environments. While the significance of quantifying the exchange rates of GHGs and atmospheric aerosol particles between the terrestrial biosphere and the atmosphere is hardly questioned in many scientific fields, progress in improving model predictability, data interpretation, or the combination of the two remains impeded by the lack of a precise framework elucidating their dynamic transport processes over a wide range of spatiotemporal scales. The difficulty in developing prognostic modeling tools to quantify the source or sink strength of these atmospheric substances is further magnified by the fact that the climate system is also sensitive to feedback from terrestrial ecosystems, forming the so-called 'feedback cycle'. Hence, there is an emergent need to reduce uncertainties when assessing this complex and dynamic feedback cycle, which is necessary to support decisions on mitigation and adaptation policies associated with human activities (e.g., anthropogenic emission controls and land use management) under current and future climate regimes.

With the goal of improving predictions of the biosphere-atmosphere exchange of biologically active gases and atmospheric aerosol particles, the main focus of this dissertation is on revising and up-scaling the biotic and abiotic transport processes from leaf to canopy scales. The validity of previous modeling studies in determining the exchange rates of gases and particles is evaluated with detailed descriptions of their limitations. Mechanistic modeling approaches along with empirical studies across different scales are employed to refine the mathematical descriptions of the surface conductance responsible for gas and particle exchanges as commonly adopted by all operational models. Specifically, how variation in horizontal leaf area density within the vegetated medium, leaf size, and leaf microroughness impacts the aerodynamic attributes, and thereby the ultrafine particle collection efficiency at the leaf/branch scale, is explored using wind tunnel experiments with interpretations by a porous media model and a scaling analysis. A multi-layered and size-resolved second-order closure model, combined with particle flux and concentration measurements within and above a forest, is used to explore the particle transport processes within the canopy sub-layer and the partitioning of particle deposition onto the canopy medium and the forest floor. For gases, a modeling framework accounting for leaf-level boundary layer effects on the stomatal pathway for gas exchange is proposed and combined with sap flux measurements in a wind tunnel to assess how leaf-level transpiration varies with increasing wind speed. How exogenous environmental conditions and endogenous soil-root-stem-leaf hydraulic and eco-physiological properties impact the above- and below-ground water dynamics in the soil-plant system and shape plant responses to droughts is assessed with a porous media model that accommodates transient water flow within the plant vascular system and is coupled with the aforementioned leaf-level gas exchange model and a soil-root interaction model. It should be noted that tackling all aspects of the potential issues causing uncertainties in forecasting the feedback cycle between terrestrial ecosystems and the climate is unrealistic in a single dissertation, but further research questions and opportunities building on the foundation derived from this dissertation are also briefly discussed.
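For context on the surface-conductance formulations that operational models commonly adopt, particle dry deposition is often reduced to a textbook resistance analogy. The sketch below is that generic form only, not the dissertation's second-order closure model, and the resistance values in the example are illustrative:

```python
def deposition_velocity(r_a, r_b, v_s=0.0):
    """Bulk particle deposition velocity from the standard resistance analogy,
        V_d = v_s + 1 / (r_a + r_b),
    with aerodynamic resistance r_a and quasi-laminar boundary-layer
    resistance r_b in series (s m^-1), plus gravitational settling v_s
    (m s^-1), which is negligible for ultrafine particles."""
    return v_s + 1.0 / (r_a + r_b)

# Illustrative resistances: r_a = 50 s/m, r_b = 150 s/m -> V_d = 0.005 m/s.
v_d = deposition_velocity(50.0, 150.0)
print(v_d)
```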

Relevance: 30.00%

Abstract:

Dynamic positron emission tomography (PET) imaging can be used to track the distribution of injected radio-labelled molecules over time in vivo. This is a powerful technique, which provides researchers and clinicians the opportunity to study the status of healthy and pathological tissue by examining how it processes substances of interest. Widely used tracers include 18F-fluorodeoxyglucose, an analog of glucose, which is used as the radiotracer in over ninety percent of PET scans. This radiotracer provides a way of quantifying the distribution of glucose utilisation in vivo. The interpretation of PET time-course data is complicated because the measured signal is a combination of vascular delivery and tissue retention effects. If the arterial time-course is known, the tissue time-course can typically be expressed in terms of a linear convolution between the arterial time-course and the tissue residue function. As the residue represents the amount of tracer remaining in the tissue, it can be thought of as a survival function; such functions have been examined in great detail by the statistics community. Kinetic analysis of PET data is concerned with estimation of the residue and associated functionals such as flow, flux and volume of distribution. This thesis presents a Markov chain formulation of blood-tissue exchange and explores how this relates to established compartmental forms. A nonparametric approach to the estimation of the residue is examined, and the improvement of this model relative to the compartmental model is evaluated using simulations and cross-validation techniques. The reference distribution of the test statistics generated in comparing the models is also studied. We explore these models further with simulation studies and an FDG-PET dataset from subjects with gliomas, which has previously been analysed with compartmental modelling. We also consider the performance of a recently proposed mixture modelling technique in this study.
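The convolution formulation described above can be sketched numerically. Assuming a one-compartment residue R(t) = K1·exp(-k2·t), which is the simplest parametric special case rather than the thesis's nonparametric estimate, and an invented arterial input:

```python
import numpy as np

def tissue_curve(t, c_a, K1, k2):
    """Tissue time-activity curve as the discrete convolution of the arterial
    input c_a(t) with a one-compartment residue (impulse response)
    R(t) = K1 * exp(-k2 * t), on a uniform time grid."""
    dt = t[1] - t[0]
    residue = K1 * np.exp(-k2 * t)
    return np.convolve(c_a, residue)[: len(t)] * dt

t = np.linspace(0.0, 60.0, 601)          # minutes, illustrative grid
c_a = t * np.exp(-t / 4.0)               # invented arterial input curve
c_t = tissue_curve(t, c_a, K1=0.1, k2=0.05)

# Associated functionals: volume of distribution V = K1 / k2 in this model.
V = 0.1 / 0.05
```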

Relevance: 30.00%

Abstract:

Brain injury due to lack of oxygen or impaired blood flow around the time of birth may cause long-term neurological dysfunction or death in severe cases. Treatments need to be initiated as soon as possible and tailored according to the nature of the injury to achieve the best outcomes. The electroencephalogram (EEG) currently provides the best insight into neurological activity. However, its interpretation presents a formidable challenge for neurophysiologists. Moreover, such expertise is not widely available, particularly around the clock in a typical busy Neonatal Intensive Care Unit (NICU). Therefore, an automated computerized system for detecting and grading the severity of brain injuries could be of great help to medical staff in diagnosing and initiating on-time treatments. In this study, automated systems for detecting neonatal seizures and grading the severity of Hypoxic-Ischemic Encephalopathy (HIE) using EEG and Heart Rate (HR) signals are presented. It is well known that there is a great deal of contextual and temporal information present in the EEG and HR signals if they are examined at a longer time scale. Systems developed in the past exploited this information either at a very early stage of the system, without any intelligent block, or at a very late stage, where the presence of such information is much reduced. This work has particularly focused on the development of a system that can incorporate the contextual information at the middle (classifier) level. This is achieved by using dynamic classifiers that are able to process sequences of feature vectors rather than only one feature vector at a time.
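A standard way to let a classifier exploit temporal context at this middle level is to run a hidden Markov model over per-frame outputs of a static classifier. The sketch below is a generic illustration of that idea, not the specific dynamic classifiers developed in the study; the states, transition probabilities, and per-frame likelihoods are all invented.

```python
import numpy as np

def forward_posteriors(log_lik, trans, init):
    """Filtered state probabilities p(state_t | obs_1..t) via the HMM forward
    recursion. log_lik: (T, S) per-frame class log-likelihoods (e.g. from a
    static classifier); trans: (S, S) transition matrix; init: (S,) prior."""
    T, S = log_lik.shape
    alpha = np.zeros((T, S))
    a = init * np.exp(log_lik[0])
    alpha[0] = a / a.sum()
    for t in range(1, T):
        a = (alpha[t - 1] @ trans) * np.exp(log_lik[t])
        alpha[t] = a / a.sum()          # normalize for numerical stability
    return alpha

# Two states (e.g. non-seizure / seizure) with sticky transitions encoding
# temporal context; the per-frame likelihoods below are invented.
trans = np.array([[0.95, 0.05], [0.05, 0.95]])
init = np.array([0.5, 0.5])
log_lik = np.log(np.array([[0.9, 0.1], [0.6, 0.4], [0.4, 0.6], [0.2, 0.8]]))
post = forward_posteriors(log_lik, trans, init)
```

Note how the sticky transitions smooth the noisy frame 3 (raw likelihood 0.4 vs 0.6): the filtered posterior still favors state 0 there because of the preceding context.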

Relevance: 30.00%

Abstract:

Changes in the emission, transport and deposition of aeolian dust have profound effects on regional climate, so characterizing the lifecycle of dust in observations and improving the representation of dust in global climate models are necessary. A fundamental aspect of characterizing the dust cycle is quantifying surface dust fluxes, yet no spatially explicit estimates of this flux exist for the world's major source regions. Here we present a novel technique for creating a map of the annual mean emitted dust flux for North Africa, based on retrievals of dust storm frequency from the Meteosat Second Generation Spinning Enhanced Visible and InfraRed Imager (SEVIRI) and the relationship between dust storm frequency and emitted mass flux derived from the output of five models that simulate dust. Our results suggest that 64 (±16)% of all dust emitted from North Africa comes from the Bodélé depression, and that 13 (±3)% of the North African dust flux comes from a depression lying in the lee of the Aïr and Hoggar Mountains, making this area the second most important region of emission within North Africa.

Relevance: 30.00%

Abstract:

Bridges are a critical part of North America's transportation network and need to be assessed frequently to inform bridge management decision making. Visual inspections are usually implemented for this purpose, during which inspectors must observe and report any excess displacements or vibrations. Unfortunately, these visual inspections are subjective and often highly variable, so a monitoring technology that can provide quantitative measurements to supplement inspections is needed. Digital Image Correlation (DIC) is a novel monitoring technology that uses digital images to measure displacement fields without any contact with the bridge. In this research, DIC and accelerometers were used to investigate the dynamic response of a railway bridge reported to experience large lateral displacements. Displacements were estimated using accelerometer measurements and were compared to DIC measurements. It was shown that accelerometers can provide reasonable estimates of displacement for zero-mean lateral displacements. By comparing measurements in the girder and in the piers, it was shown that, for the bridge monitored, the large lateral displacements originated in the steel casting bearings positioned above the piers, not in the piers themselves. The use of DIC for evaluating the effectiveness of the rehabilitation of the LaSalle Causeway lift bridge in Kingston, Ontario was also investigated. Vertical displacements were measured at midspan and at the lifting end of the bridge during a static test and under dynamic live loading. The bridge displacements were well within the operating limits; however, a gap at the lifting end of the bridge was identified. Rehabilitation of the bridge was conducted, and by comparing measurements before and after rehabilitation, it was shown that the gap was successfully closed. Finally, DIC was used to monitor midspan vertical and lateral displacements in a monitoring campaign of five steel rail bridges. DIC was also used to evaluate the effectiveness of structural rehabilitation of the lateral bracing of a bridge. Simple finite element models were developed using DIC measurements of displacement. Several lessons learned throughout this monitoring campaign are discussed in the hope of aiding future researchers.
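Accelerometer-based displacement estimates of the kind compared against DIC above typically come from double integration with drift removal, which is why they work best for zero-mean dynamic displacements. A minimal sketch, using cumulative-sum integration plus linear detrending and a synthetic sinusoidal test signal (all parameters illustrative):

```python
import numpy as np

def accel_to_disp(acc, fs):
    """Estimate zero-mean dynamic displacement by integrating acceleration
    twice (cumulative sum) and removing a linear trend after each
    integration to suppress low-frequency drift. Static offsets and slow
    drifts are lost, so only the dynamic component is recovered."""
    dt = 1.0 / fs
    def detrend(x):
        n = np.arange(len(x))
        a, b = np.polyfit(n, x, 1)
        return x - (a * n + b)
    vel = detrend(np.cumsum(acc) * dt)
    return detrend(np.cumsum(vel) * dt)

# Synthetic check: a 2 Hz sinusoidal displacement of 1 mm amplitude.
fs, f, A = 200.0, 2.0, 1e-3
t = np.arange(0.0, 5.0, 1.0 / fs)
disp_true = A * np.sin(2.0 * np.pi * f * t)
acc = -(2.0 * np.pi * f) ** 2 * disp_true   # analytic second derivative
disp_est = accel_to_disp(acc, fs)
err = np.abs(disp_est - disp_true).max()
```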

Relevance: 30.00%

Abstract:

Context: The stellar population of the 30 Doradus star-forming region in the Large Magellanic Cloud contains a subset of apparently single, rapidly rotating O-type stars. The physical processes leading to the formation of this cohort are currently uncertain. 

Aims: One member of this group, the late O-type star VFTS 399, is found to be unexpectedly X-ray bright for its bolometric luminosity. In this study we aim to determine its physical nature and the cause of this behaviour. 

Methods: To accomplish this we performed a time-resolved analysis of optical, infrared and X-ray observations. 

Results: We found VFTS 399 to be an aperiodic photometric variable with an apparent near-IR excess. Its optical spectrum demonstrates complex emission profiles in the lower Balmer series and select He i lines; taken together, these suggest an OeBe classification. The highly variable X-ray luminosity is too great to be produced by a single star, while its hard, non-thermal nature suggests the presence of an accreting relativistic companion. Finally, the detection of periodic modulation of the X-ray lightcurve is most naturally explained under the assumption that the accretor is a neutron star. 

Conclusions: VFTS 399 appears to be the first high-mass X-ray binary identified within 30 Dor, sharing many observational characteristics with classical Be X-ray binaries. Comparison of the current properties of VFTS 399 to binary-evolution models suggests a progenitor mass of 25 M⊙ for the putative neutron star, which may host a magnetic field comparable in strength to those of magnetars. VFTS 399 is now the second member of the cohort of rapidly rotating "single" O-type stars in 30 Dor to show evidence of binary interaction resulting in spin-up, suggesting that this may be a viable evolutionary pathway for the formation of a subset of this stellar population.

Relevance: 30.00%

Abstract:

BACKGROUND: Schistosomiasis remains a major public health issue, with an estimated 230 million people infected worldwide. Novel tools for early diagnosis and surveillance of schistosomiasis are currently needed. Elevated levels of circulating microRNAs (miRNAs) are commonly associated with the initiation and progression of human disease pathology. Hence, serum miRNAs are emerging as promising biomarkers for the diagnosis of a variety of human diseases. This study investigated circulating host miRNAs commonly associated with liver diseases and schistosome parasite-derived miRNAs during the progression of hepatic schistosomiasis japonica in two murine models.

METHODOLOGY/PRINCIPAL FINDINGS: Two mouse strains (C57BL/6 and BALB/c) were infected with a low dosage of Schistosoma japonicum cercariae. The dynamic patterns of hepatopathology, the serum levels of liver injury-related enzymes and the serum circulating miRNAs (both host and parasite-derived) levels were then assessed in the progression of schistosomiasis japonica. For the first time, an inverse correlation between the severity of hepatocyte necrosis and the level of liver fibrosis was revealed during S. japonicum infection in BALB/c, but not in C57BL/6 mice. The inconsistent levels of the host circulating miRNAs, miR-122, miR-21 and miR-34a in serum were confirmed in the two murine models during infection, which limits their potential value as individual diagnostic biomarkers for schistosomiasis. However, their serum levels in combination may serve as a novel biomarker to mirror the hepatic immune responses induced in the mammalian host during schistosome infection and the degree of hepatopathology. Further, two circulating parasite-specific miRNAs, sja-miR-277 and sja-miR-3479-3p, were shown to have potential as diagnostic markers for schistosomiasis japonica.

CONCLUSIONS/SIGNIFICANCE: We provide the first evidence for the potential of utilizing circulating host miRNAs to indicate different immune responses and the severity of hepatopathology outcomes induced in two murine strains infected with S. japonicum. This study also establishes a basis for the early and cell-free diagnosis of schistosomiasis by targeting circulating schistosome parasite-derived miRNAs.

Relevance: 30.00%

Abstract:

The observed line intensity ratios of the Si ii λ1263 and λ1307 multiplets to that of Si ii λ1814 in the broad-line region (BLR) of quasars are both an order of magnitude larger than the theoretical values. This was first pointed out by Baldwin et al., who termed it the "Si ii disaster," and it has remained unresolved. We investigate the problem in the light of newly published atomic data for Si ii. Specifically, we perform BLR calculations using several different atomic data sets within the CLOUDY modeling code under optically thick quasar cloud conditions. In addition, we test for selective pumping by the source photons or intrinsic galactic reddening as possible causes for the discrepancy, and we also consider blending with other species. However, we find that none of the options investigated resolve the Si ii disaster, with the potential exception of microturbulent velocity broadening and line blending. We find that a larger microturbulent velocity may solve the Si ii disaster through continuum pumping and other effects. The CLOUDY models indicate strong blending of the Si ii λ1307 multiplet with emission lines of O i, although the predicted degree of blending is incompatible with the observed λ1263/λ1307 intensity ratios. Clearly, more work is required on the quasar modeling of not just the Si ii lines but also nearby transitions (in particular those of O i) to fully investigate whether blending may be responsible for the Si ii disaster.

Relevance: 30.00%

Abstract:

Ground-source heat pump (GSHP) systems represent one of the most promising techniques for heating and cooling in buildings. These systems use the ground as a heat source/sink, allowing a better efficiency thanks to the low variations of the ground temperature along the seasons. The ground-source heat exchanger (GSHE) then becomes a key component for optimizing the overall performance of the system. Moreover, the short-term response related to the dynamic behaviour of the GSHE is a crucial aspect, especially from a regulation criteria perspective in on/off controlled GSHP systems. In this context, a novel numerical GSHE model has been developed at the Instituto de Ingeniería Energética, Universitat Politècnica de València. Based on the decoupling of the short-term and the long-term response of the GSHE, the novel model allows the use of faster and more precise models on both sides. In particular, the short-term model considered is the B2G model, developed and validated in previous research works conducted at the Instituto de Ingeniería Energética. For the long-term, the g-function model was selected, since it is a previously validated and widely used model, and presents some interesting features that are useful for its combination with the B2G model. The aim of the present paper is to describe the procedure of combining these two models in order to obtain a unique complete GSHE model for both short- and long-term simulation. The resulting model is then validated against experimental data from a real GSHP installation.
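For context on the long-term side, g-function models compute the borehole-wall temperature by temporal superposition of heat-load steps weighted by the g-function (a dimensionless step response). The sketch below illustrates only that generic superposition mechanism, not the B2G model or the validated combined model described above; a toy logarithmic curve stands in for a real tabulated g-function, and all numbers are invented.

```python
import numpy as np

def borehole_wall_temp(times, loads, g_func, T_ground, k_s):
    """Borehole-wall temperature by temporal superposition of load steps:
        T_b(t) = T_g + sum_i dq_i / (2 pi k_s) * g(t - t_i),
    where dq_i are step changes in heat rate per metre of borehole (W/m),
    k_s is the ground thermal conductivity (W/m/K), and g is the
    dimensionless step response (g-function)."""
    T = np.full(len(times), T_ground, dtype=float)
    dq = np.diff(loads, prepend=0.0)
    for t_i, dq_i in zip(times, dq):
        active = times >= t_i
        T[active] += dq_i / (2.0 * np.pi * k_s) * g_func(times[active] - t_i)
    return T

# Toy example: 30 W/m injected for two time units, then switched off.
times = np.array([0.0, 1.0, 2.0, 3.0])
loads = np.array([30.0, 30.0, 0.0, 0.0])
T = borehole_wall_temp(times, loads, lambda t: np.log1p(t),
                       T_ground=10.0, k_s=2.0)
```

Because the model is linear, switching the load off is handled by superposing an equal and opposite step, which is what makes combining a short-term model (B2G) with a long-term g-function response tractable.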

Relevance: 30.00%

Abstract:

The development of new learning models has been of great importance throughout recent years, with a focus on creating advances in the area of deep learning. Deep learning was first noted in 2006, and has since become a major area of research in a number of disciplines. This paper will delve into the area of deep learning to present its current limitations and provide a new idea for a fully integrated deep and dynamic probabilistic system. The new model will be applicable to a vast number of areas initially focusing on applications into medical image analysis with an overall goal of utilising this approach for prediction purposes in computer based medical systems.

Relevance: 30.00%

Abstract:

This study had three objectives: (1) to develop a comprehensive truck simulation that executes rapidly, has a modular program construction to allow variation of vehicle characteristics, and is able to realistically predict vehicle motion and the tire-road surface interaction forces; (2) to develop a model of doweled portland cement concrete pavement that can be used to determine slab deflection and stress at predetermined nodes, and that allows for the variation of traditional thickness design factors; and (3) to implement these two models on a work station with suitable menu driven modules so that both existing and proposed pavements can be evaluated with respect to design life, given specific characteristics of the heavy vehicles that will be using the facility. This report summarizes the work that has been performed during the first year of the study. Briefly, the following has been accomplished: A two dimensional model of a typical 3-S2 tractor-trailer combination was created. A finite element structural analysis program, ANSYS, was used to model the pavement. Computer runs have been performed varying the parameters defining both vehicle and road elements. The resulting time specific displacements for each node are plotted, and the displacement basin is generated for defined vehicles. Relative damage to the pavement can then be estimated. A damage function resulting from load replications must be assumed that will be reflected by further pavement deterioration. Comparison with actual damage on Interstate 80 will eventually allow verification of these procedures.