78 results for Background Traffic Modeling
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
Background: Few studies have used longitudinal ultrasound measurements to assess the effect of traffic-related air pollution on fetal growth. Objective: We examined the relationship between exposure to nitrogen dioxide (NO2) and aromatic hydrocarbons [benzene, toluene, ethylbenzene, m/p-xylene, and o-xylene (BTEX)] and fetal growth assessed by 1,692 ultrasound measurements among 562 pregnant women from the Sabadell cohort of the Spanish INMA (Environment and Childhood) study. Methods: We used temporally adjusted land-use regression models to estimate exposures to NO2 and BTEX. We fitted mixed-effects models to estimate longitudinal growth curves for femur length (FL), head circumference (HC), abdominal circumference (AC), biparietal diameter (BPD), and estimated fetal weight (EFW). Unconditional and conditional SD scores were calculated at 12, 20, and 32 weeks of gestation. Sensitivity analyses were performed considering time–activity patterns during pregnancy. Results: Exposure to BTEX from early pregnancy was negatively associated with growth in BPD during weeks 20–32. None of the other fetal growth parameters were associated with exposure to air pollution during pregnancy. When considering only women who spent 2 hr/day in nonresidential outdoor locations, effect estimates were stronger and statistically significant for the association between NO2 and growth in HC during weeks 12–20 and growth in AC, BPD, and EFW during weeks 20–32. Conclusions: Our results lend some support to an effect of exposure to traffic-related air pollutants from early pregnancy on fetal growth during mid-pregnancy.
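As a rough illustration of the longitudinal modelling described above, here is a minimal mixed-effects growth-curve sketch in Python (statsmodels). The column names (subject_id, ga_weeks, bpd_mm, no2), the file name, and the quadratic growth term are assumptions made for the sketch; the INMA analysis itself is not reproduced here.

# Sketch of a longitudinal growth-curve model with subject-level random effects.
# Column names and the quadratic-in-gestational-age form are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ultrasound_long.csv")   # hypothetical long-format ultrasound data

model = smf.mixedlm(
    "bpd_mm ~ ga_weeks + I(ga_weeks**2) + no2 + no2:ga_weeks",  # exposure shifts level and slope
    data=df,
    groups=df["subject_id"],               # one random intercept/slope per fetus
    re_formula="~ga_weeks",
)
print(model.fit().summary())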
Abstract:
Background: Germline genetic variation is associated with the differential expression of many human genes. The phenotypic effects of this type of variation may be important when considering susceptibility to common genetic diseases. Three regions at 8q24 have recently been identified to independently confer risk of prostate cancer. Variation at 8q24 has also recently been associated with risk of breast and colorectal cancer. However, none of the risk variants map at or relatively close to known genes, with c-MYC mapping a few hundred kilobases distally. Results: This study identifies cis-regulators of germline c-MYC expression in immortalized lymphocytes of HapMap individuals. Quantitative analysis of c-MYC expression in normal prostate tissues suggests an association between overexpression and variants in prostate cancer risk Region 1. Somatic c-MYC overexpression correlates with prostate cancer progression and more aggressive tumor forms, a pathological variable also associated with Region 1. Expression profiling analysis and modeling of transcriptional regulatory networks predicts a functional association between MYC and the prostate tumor suppressor KLF6. Analysis of MYC/Myc-driven cell transformation and tumorigenesis substantiates a model in which MYC overexpression promotes transformation by down-regulating KLF6. In this model, a feedback loop through E-cadherin down-regulation causes further transactivation of c-MYC. Conclusion: This study proposes that variation at putative 8q24 cis-regulator(s) of transcription can significantly alter germline c-MYC expression levels and, thus, contribute to prostate cancer susceptibility by down-regulating the prostate tumor suppressor KLF6 gene.
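As a toy illustration of the cis-regulation analysis described above (expression regressed on genotype), a minimal additive eQTL test is sketched below; the genotype coding, the sample values, and the use of a simple linear regression are assumptions of the sketch, not the paper's pipeline.

# Sketch: additive cis-eQTL test for one SNP against c-MYC expression.
# Arrays are illustrative; genotype is coded as 0/1/2 copies of the risk allele.
import numpy as np
from scipy import stats

genotype = np.array([0, 1, 2, 0, 1, 2, 1, 0, 2, 1])          # hypothetical dosages
expression = np.array([5.1, 5.6, 6.3, 4.9, 5.8, 6.1,
                       5.5, 5.0, 6.4, 5.7])                    # hypothetical log2 expression

# Simple linear regression: slope = change in expression per risk allele.
slope, intercept, r, p_value, se = stats.linregress(genotype, expression)
print(f"beta per allele = {slope:.3f}, p = {p_value:.3g}")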
Abstract:
Many European states apply scoring systems to evaluate the disability severity of non-fatal motor accident victims under the law of third-party liability. The score is a non-negative integer with an upper bound of 100 that increases with severity. It may be automatically converted into financial terms and thus also reflects the compensation cost for disability. In this paper, discrete regression models are applied to analyze the factors that influence the disability severity score of victims. Standard and zero-altered regression models are compared from two perspectives: the interpretation of the data-generating process and the level of statistical fit. The results have implications for traffic safety policy decisions aimed at reducing accident severity. An application using data from Spain is provided.
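For readers unfamiliar with zero-altered count models, the sketch below contrasts a standard Poisson regression with a zero-inflated Poisson from statsmodels as a readily available stand-in (zero-altered/hurdle and zero-inflated specifications treat the extra zeros differently, but the comparison logic is the same). File and column names are hypothetical.

# Sketch: standard vs. zero-inflated count model for a bounded severity score.
# Column names (severity_score, age, seatbelt, urban_road) are hypothetical.
import pandas as pd
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedPoisson

df = pd.read_csv("claims.csv")                  # hypothetical victim-level data
y = df["severity_score"]                        # non-negative integer score
X = sm.add_constant(df[["age", "seatbelt", "urban_road"]])

poisson_fit = sm.Poisson(y, X).fit()
zip_fit = ZeroInflatedPoisson(y, X, exog_infl=X, inflation="logit").fit()

# Compare fit, mirroring the paper's standard vs. zero-altered comparison.
print("Poisson AIC:", poisson_fit.aic)
print("ZIP AIC:    ", zip_fit.aic)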
Abstract:
Background: Epidemiological evidence of the effects of long-term exposure to air pollution on the chronic processes of atherogenesis is limited. Objective: We investigated the association of long-term exposure to traffic-related air pollution with subclinical atherosclerosis, measured by carotid intima media thickness (IMT) and ankle–brachial index (ABI). Methods: We performed a cross-sectional analysis using data collected during the reexamination (2007–2010) of 2,780 participants in the REGICOR (Registre Gironí del Cor: the Gerona Heart Register) study, a population-based prospective cohort in Girona, Spain. Long-term exposure across residences was calculated as the last 10 years’ time-weighted average of residential nitrogen dioxide (NO2) estimates (based on a local-scale land-use regression model), traffic intensity in the nearest street, and traffic intensity in a 100 m buffer. Associations with IMT and ABI were estimated using linear regression and multinomial logistic regression, respectively, controlling for sex, age, smoking status, education, marital status, and several other potential confounders or intermediates. Results: Exposure contrasts between the 5th and 95th percentiles for NO2 (25 μg/m³), traffic intensity in the nearest street (15,000 vehicles/day), and traffic load within 100 m (7,200,000 vehicle-m/day) were associated with differences in IMT of 0.56% (95% CI: –1.5, 2.6%), 2.32% (95% CI: 0.48, 4.17%), and 1.91% (95% CI: –0.24, 4.06%), respectively. Exposures were positively associated with an ABI of > 1.3, but not an ABI of < 0.9. Stronger associations were observed among those with a high level of education and in men ≥ 60 years of age. Conclusions: Long-term traffic-related exposures were associated with subclinical markers of atherosclerosis. Prospective studies are needed to confirm associations and further examine differences among population subgroups. Keywords: ankle–brachial index, average daily traffic, cardiovascular disease, exposure assessment, exposure to tailpipe emissions, intima media thickness, land use regression model, Mediterranean diet, nitrogen dioxide
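For orientation, a percent difference in IMT per exposure contrast of the kind reported above can be derived from a slope estimated on log-transformed IMT; the numbers in this sketch are made up and only illustrate the arithmetic.

# Sketch: percent difference in IMT per 5th-95th percentile NO2 contrast,
# from a hypothetical slope estimated on log(IMT).  Numbers are illustrative only.
import numpy as np

beta_per_ugm3 = 0.00023    # hypothetical slope of log(IMT) per 1 ug/m3 of NO2
contrast = 25.0            # 5th-95th percentile NO2 contrast (ug/m3)

pct_diff = (np.exp(beta_per_ugm3 * contrast) - 1.0) * 100.0
print(f"{pct_diff:.2f}% difference in IMT per {contrast:.0f} ug/m3 contrast")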
Abstract:
We explore the determinants of usage of six different types of health care services, using the Medical Expenditure Panel Survey data, years 1996-2000. We apply a number of models for univariate count data, including semiparametric, semi-nonparametric and finite mixture models. We find that the complexity of the model that is required to fit the data well depends upon the way in which the data is pooled across sexes and over time, and upon the characteristics of the usage measure. Pooling across time and sexes is almost always favored, but when more heterogeneous data is pooled it is often the case that a more complex statistical model is required.
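A minimal way to probe the "complexity depends on pooling" point is to compare information criteria for a simple and a more flexible count model on pooled versus sex-stratified data, as sketched below; the MEPS-style file and column names are hypothetical.

# Sketch: does pooling across sexes demand a more flexible count model?
# Compares Poisson and negative binomial BIC on pooled vs. stratified data.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("meps_visits.csv")        # hypothetical MEPS-style extract
covars = ["age", "income"]

def bic(frame, model_cls):
    y = frame["visits"]
    X = sm.add_constant(frame[covars])
    return model_cls(y, X).fit(disp=0).bic

for label, frame in [("pooled", df),
                     ("men", df[df["female"] == 0]),
                     ("women", df[df["female"] == 1])]:
    print(label, "Poisson BIC:", bic(frame, sm.Poisson),
          "NegBin BIC:", bic(frame, sm.NegativeBinomial))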
Abstract:
We review recent likelihood-based approaches to modeling demand for medical care. A semi-nonparametric model along the lines of Cameron and Johansson's Poisson polynomial model, but using a negative binomial baseline model, is introduced. We apply these models, as well as a semiparametric Poisson, a hurdle semiparametric Poisson, and finite mixtures of negative binomial models, to six measures of health care usage taken from the Medical Expenditure Panel Survey. We conclude that most of the models lead to statistically similar results, both in terms of information criteria and conditional and unconditional prediction. This suggests that applied researchers may not need to be overly concerned with the choice of which of these models they use to analyze data on health care demand.
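To make the finite-mixture idea concrete, the sketch below runs a small EM algorithm for a two-component count mixture; Poisson components and simulated data are used purely to keep the example short, whereas the paper works with negative binomial components and MEPS usage measures.

# Sketch: EM for a two-component Poisson mixture on a count outcome.
# The paper uses negative binomial components; Poisson keeps the sketch short.
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(0)
y = np.concatenate([rng.poisson(0.5, 700), rng.poisson(6.0, 300)])  # simulated counts

pi, lam = 0.5, np.array([1.0, 5.0])           # initial mixing weight and means
for _ in range(200):
    # E-step: posterior probability that each observation came from component 1
    p1 = pi * poisson.pmf(y, lam[0])
    p2 = (1 - pi) * poisson.pmf(y, lam[1])
    resp = p1 / (p1 + p2)
    # M-step: update mixing weight and component means
    pi = resp.mean()
    lam = np.array([np.average(y, weights=resp),
                    np.average(y, weights=1 - resp)])

print(f"weights: {pi:.2f}/{1 - pi:.2f}, means: {lam.round(2)}")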
Abstract:
We show how to calibrate CES production and utility functions when indirect taxation affecting inputs and consumption is present. These calibrated functions can then be used in computable general equilibrium models. Taxation modifies the standard calibration procedures since any taxed good has two associated prices and a choice of reference value units has to be made. We also provide an example of computer code to solve the calibration of CES utilities under two alternate normalizations. To our knowledge, this paper fills a methodological gap in the CGE literature.
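The paper's own computer code is not reproduced here; the following is only a minimal numerical sketch of calibrating CES utility share parameters when an ad valorem consumption tax drives a wedge between producer and consumer prices, under one possible normalization (shares summing to one). All numbers are illustrative.

# Sketch: calibrating CES utility share parameters from benchmark data when
# consumption taxes separate producer (net) and consumer (gross) prices.
import numpy as np

sigma = 0.8                        # assumed elasticity of substitution
rho = 1.0 - 1.0 / sigma
p_net = np.array([1.0, 1.0])       # benchmark producer prices
tax = np.array([0.10, 0.21])       # ad valorem consumption tax rates
x_bench = np.array([40.0, 60.0])   # benchmark consumed quantities

p_gross = p_net * (1.0 + tax)      # prices actually faced by the consumer

# From the CES first-order conditions, alpha_i is proportional to
# gross price times quantity^(1 - rho); normalize so the shares sum to one.
raw = p_gross * x_bench ** (1.0 - rho)
alpha = raw / raw.sum()
print("calibrated CES share parameters:", alpha.round(4))

# Consistency check: CES demands at benchmark income reproduce x_bench.
income = (p_gross * x_bench).sum()
denom = (alpha ** sigma * p_gross ** (1.0 - sigma)).sum()
x_check = alpha ** sigma * p_gross ** (-sigma) * income / denom
print("reproduced benchmark demands:", x_check.round(2))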
Abstract:
The project presented here aims to define and implement a simulation model based on the coordination and assignment of emergency services at traffic accidents. The model was defined using Colored Petri Nets and implemented with the Rockwell Arena 7.0 software. The first simulation shows a theoretical, queue-based model, while the second shows a more complete and realistic model thanks to a CORBA connection to a database containing geographic information on the fleets and routes. As a result of the study, and with the help of Google Earth, graphical simulations can be run to view the generated accidents, the service fleets, and the movement of vehicles from their bases to the accident sites.
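As a rough, language-neutral illustration of the dispatch logic described above, the sketch below uses SimPy instead of Colored Petri Nets or Arena; fleet size, arrival rate, and travel/service times are invented for the example.

# Sketch: queue-based dispatch of emergency vehicles to randomly generated
# traffic accidents.  All rates and the fleet size are illustrative.
import random
import simpy

def accident_generator(env, fleet):
    aid = 0
    while True:
        yield env.timeout(random.expovariate(1 / 20))    # roughly one accident every 20 min
        aid += 1
        env.process(attend_accident(env, fleet, aid))

def attend_accident(env, fleet, aid):
    reported = env.now
    with fleet.request() as req:                          # wait for a free vehicle
        yield req
        yield env.timeout(random.uniform(5, 15))          # travel to the scene
        print(f"t={env.now:6.1f}  accident {aid} reached "
              f"(waited {env.now - reported:.1f} min)")
        yield env.timeout(random.uniform(20, 40))         # on-scene service

env = simpy.Environment()
ambulances = simpy.Resource(env, capacity=3)              # hypothetical fleet size
env.process(accident_generator(env, ambulances))
env.run(until=8 * 60)                                     # simulate one 8-hour shift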
Abstract:
Traffic forecasts provide essential input for the appraisal of transport investment projects. However, according to recent empirical evidence, long-term predictions are subject to high levels of uncertainty. This paper quantifies uncertainty in traffic forecasts for the tolled motorway network in Spain. Uncertainty is quantified in the form of a confidence interval for the traffic forecast that includes both model uncertainty and input uncertainty. We apply a stochastic simulation process based on bootstrapping techniques. Furthermore, the paper proposes a new methodology to account for capacity constraints in long-term traffic forecasts. Specifically, we suggest a dynamic model in which the speed of adjustment is related to the ratio between the actual traffic flow and the maximum capacity of the motorway. This methodology is applied to a specific public policy that consists of suppressing the toll on a certain motorway section before the concession expires.
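The following sketch illustrates, with invented numbers, the two ideas combined in the paper: simulating many forecast paths to obtain an interval, and letting the yearly adjustment shrink as traffic approaches road capacity. The functional form and the Monte Carlo stand-in for the residual bootstrap are assumptions of the sketch.

# Sketch: interval for a long-term traffic forecast with growth that slows
# as flow approaches capacity.  Parameters are illustrative, not the paper's.
import numpy as np

rng = np.random.default_rng(42)
capacity = 80_000             # vehicles/day, hypothetical
base_flow = 30_000            # current traffic on the section
horizon = 20                  # forecast horizon in years

def forecast(growth, noise):
    flow = base_flow
    for eps in noise:
        # adjustment speed shrinks as flow approaches the maximum capacity
        flow *= 1 + growth * (1 - flow / capacity) + eps
        flow = min(flow, capacity)
    return flow

# Resample the growth rate and year-to-year shocks many times.
draws = [forecast(growth=rng.normal(0.03, 0.01),
                  noise=rng.normal(0, 0.02, horizon))
         for _ in range(5000)]

lo, med, hi = np.percentile(draws, [2.5, 50, 97.5])
print(f"{horizon}-year forecast: {med:,.0f} veh/day "
      f"(95% interval {lo:,.0f}-{hi:,.0f})")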
Abstract:
We study the properties of the well-known replicator dynamics when applied to a finitely repeated version of the Prisoners' Dilemma game. We characterize the behavior of such dynamics under strongly simplifying assumptions (i.e. only 3 strategies are available) and show that the basin of attraction of defection shrinks as the number of repetitions increases. After discussing the difficulties involved in trying to relax the 'strongly simplifying assumptions' above, we approach the same model by means of simulations based on genetic algorithms. The resulting simulations describe a behavior of the system very close to the one predicted by the replicator dynamics without imposing any of the assumptions of the mathematical model. Our main conclusion is that mathematical and computational models are good complements for research in social sciences. Indeed, while computational models are extremely useful to extend the scope of the analysis to complex scenarios hard to analyze mathematically, formal models can be useful to verify and to explain the outcomes of computational models.
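A minimal numerical version of the three-strategy replicator-dynamics exercise is sketched below; the stage-game payoffs and the particular strategy triple (Always Defect, Always Cooperate, Tit-for-Tat) are illustrative choices, not necessarily the ones used in the paper.

# Sketch: replicator dynamics for a finitely repeated Prisoners' Dilemma with
# three strategies.  Payoffs and the strategy set are illustrative.
import numpy as np

TEMPT, REWARD, PUNISH, SUCKER = 5, 3, 1, 0
ALLD, ALLC, TFT = 0, 1, 2

def repeated_payoffs(n):
    """Total payoff A[i, j] of strategy i against strategy j over n rounds."""
    A = np.zeros((3, 3))
    A[ALLD, ALLD] = n * PUNISH
    A[ALLD, ALLC] = n * TEMPT
    A[ALLD, TFT] = TEMPT + (n - 1) * PUNISH      # exploit once, then mutual defection
    A[ALLC, ALLD] = n * SUCKER
    A[ALLC, ALLC] = A[ALLC, TFT] = n * REWARD
    A[TFT, ALLD] = SUCKER + (n - 1) * PUNISH
    A[TFT, ALLC] = A[TFT, TFT] = n * REWARD
    return A

def evolve(n, x0=(0.4, 0.3, 0.3), steps=4000, dt=0.01):
    A, x = repeated_payoffs(n), np.array(x0, dtype=float)
    for _ in range(steps):
        f = A @ x                                 # fitness of each strategy
        x += dt * x * (f - x @ f)                 # replicator equation
        x = np.clip(x, 0.0, None)
        x /= x.sum()
    return x

for n in (1, 5, 20):
    print(f"{n:2d} repetitions -> shares (ALLD, ALLC, TFT):", evolve(n).round(3))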
Abstract:
This paper is concerned with the modeling and analysis of quantum dissipation phenomena in the Schrödinger picture. More precisely, we investigate in detail a dissipative, nonlinear Schrödinger equation that in some sense accounts for quantum Fokker–Planck effects, and how it is drastically reduced to a simpler logarithmic equation via a nonlinear gauge transformation in such a way that the physics underlying both problems remains unaltered. From a mathematical viewpoint, this allows for a more tractable analysis of the local well-posedness of the initial–boundary value problem. This simplification requires the polar (modulus–argument) decomposition of the wavefunction, which is rigorously attained (for the first time, to the best of our knowledge) under quite reasonable assumptions.
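For orientation only, the polar decomposition referred to above is the standard modulus–argument (Madelung) form; the specific nonlinear gauge transformation used in the paper is not reproduced here, so the second display is just the generic shape of such a transformation, with $\Theta$ a real function of the density.

\[
  \psi(x,t) \;=\; \sqrt{n(x,t)}\, e^{\frac{i}{\hbar} S(x,t)}, \qquad n = |\psi|^{2},
\]
\[
  \tilde\psi \;=\; \psi\, e^{\frac{i}{\hbar}\,\Theta(n)} .
\]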
Abstract:
Observations in daily practice are sometimes registered as positive values larger than a given threshold α. The sample space is in this case the interval (α, +∞), α ≥ 0, which can be structured as a real Euclidean space in different ways. This fact opens the door to alternative statistical models depending not only on the assumed distribution function, but also on the metric which is considered appropriate, i.e. the way differences are measured, and thus how variability is quantified.
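One standard way to give (α, +∞) a Euclidean structure, shown here only as an illustration since the abstract does not say which metric the authors adopt, is through the logarithmic transformation:

\[
  h(x) = \ln(x - \alpha), \qquad
  d(x, y) = \left| \ln \frac{x - \alpha}{y - \alpha} \right|, \qquad
  x, y \in (\alpha, +\infty),
\]

under which differences are measured relative to the threshold, and a normal model for the coordinate h(x) corresponds to a shifted lognormal model on the original scale.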
Abstract:
This paper is a first draft of the principle of statistical modelling on coordinates. Several causes, which would take too long to detail, have led to this situation close to the deadline for submitting papers to CODAWORK'03. The main one is the rapid development of the approach over the last few months, which has made previous drafts appear obsolete. The present paper contains the essential parts of the state of the art of this approach from my point of view. I would like to acknowledge many clarifying discussions with the group of people working in this field in Girona, Barcelona, Carrick Castle, Firenze, Berlin, Göttingen, and Freiberg. They have given a lot of suggestions and ideas. Nevertheless, there might still be errors or unclear aspects which are exclusively my fault. I hope this contribution serves as a basis for further discussions and new developments.
Abstract:
In this paper, we consider ATM networks in which the virtual path (VP) concept is implemented. The question of how to multiplex two or more diverse traffic classes while providing different quality of service (QOS) requirements is a very complicated open problem. Two distinct options are available: integration and segregation. In an integration approach, all the traffic from different connections is multiplexed onto one VP. This implies that the most restrictive QOS requirements must be applied to all services; link utilization is therefore decreased because an unnecessarily stringent QOS is provided to all connections. With the segregation approach, the problem can be much simplified if different types of traffic are separated by assigning each a VP with dedicated resources (buffers and links). In that case, resources may not be efficiently utilized because no sharing of bandwidth can take place across VPs. The probability that the bandwidth required by the accepted connections exceeds the capacity of the link is evaluated through the probability of congestion (PC). Since the PC can be expressed as the cell loss probability (CLP), we simply carry out bandwidth allocation using the PC. We first focus on the influence of some parameters (CLP, bit rate, and burstiness) on the capacity required by a VP supporting a single traffic class using the new convolution approach. Numerical results are presented both to compare the required capacity and to observe under which conditions each approach is preferred.
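A minimal numerical sketch of the convolution idea follows: the bandwidth demand of N homogeneous on-off sources multiplexed on one VP is obtained by convolving the per-source distribution, and the smallest capacity whose probability of congestion meets a target is read off. All parameters are illustrative, and the homogeneous on-off model is a simplification of the paper's setting.

# Sketch: probability of congestion (PC) for one VP carrying N homogeneous
# on-off sources, via convolution of the per-source bandwidth distribution.
import numpy as np

N = 50                  # multiplexed connections on the VP
peak = 2.0              # Mbit/s per source when active
activity = 0.3          # fraction of time a source transmits at its peak rate
target_pc = 1e-3        # required probability of congestion (stands in for the CLP)

# Distribution of one source's demand: 0 with prob 1-activity, peak otherwise.
dist = np.array([1 - activity, activity])          # indexed in multiples of `peak`
agg = np.array([1.0])
for _ in range(N):
    agg = np.convolve(agg, dist)                   # convolve the N sources together

demand = peak * np.arange(agg.size)                # aggregate demand grid
for capacity in demand:
    pc = agg[demand > capacity].sum()              # P(aggregate demand > capacity)
    if pc <= target_pc:
        print(f"required VP capacity: {capacity:.1f} Mbit/s "
              f"(peak-rate allocation would need {N * peak:.1f})")
        break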