927 results for Poisson model with common shocks
Abstract:
The effects of the nongray absorption (i.e., atmospheric opacity varying with wavelength) on the possible upper bound of the outgoing longwave radiation (OLR) emitted by a planetary atmosphere have been examined. This analysis is based on the semigray approach, which appears to be a reasonable compromise between the complexity of nongray models and the simplicity of the gray assumption (i.e., atmospheric absorption independent of wavelength). Atmospheric gases in semigray atmospheres make use of constant absorption coefficients in finite-width spectral bands. Here, such a semigray absorption is introduced in a one-dimensional (1D) radiative–convective model with a stratosphere in radiative equilibrium and a troposphere fully saturated with water vapor, which is the semigray gas. A single atmospheric window in the infrared spectrum has been assumed. In contrast to the single absolute limit of OLR found in gray atmospheres, semigray ones may also show a relative limit. This means that both finite and infinite runaway effects may arise in some semigray cases. Of particular importance is the finding of an entirely new branch of stable steady states that does not appear in gray atmospheres. This new multiple equilibrium is a consequence of the nongray absorption only. It is suspected that this new set of stable solutions has not been previously revealed in analyses of radiative–convective models since it does not appear for an atmosphere with nongray parameters similar to those for the Earth's current state.
Abstract:
BAFF is a B cell survival factor that binds to three receptors: BAFF-R, TACI and BCMA. BAFF-R is the receptor triggering naïve B cell survival and maturation while BCMA supports the survival of plasma cells in the bone marrow. Excessive BAFF production leads to autoimmunity, presumably as the consequence of inappropriate survival of self-reactive B cells. The function of TACI has been more elusive with TACI(-/-) mice revealing two sides of this receptor, a positive one driving T cell-independent immune responses and a negative one down-regulating B cell activation and expansion. Recent work has revealed that the regulation of TACI expression is intimately linked to the activation of innate receptors on B cells and that TACI signalling in response to multimeric BAFF and APRIL provides positive signals to plasmablasts. How TACI negatively regulates B cells remains elusive but may involve an indirect control of BAFF levels. The discovery of TACI mutations associated with common variable immunodeficiency (CVID) in humans not only reinforces its important role for humoral responses but also suggests a more complex role than first anticipated from knockout animals. TACI is emerging as an unusual TNF receptor-like molecule with a sophisticated mode of action.
Abstract:
Patterns of cigarette smoking in Switzerland were analyzed on the basis of sales data (available since 1924) and national health surveys conducted in the last decade. There was a steady and substantial increase in cigarette sales up to the early 1970s. Thereafter, the curve tended to level off around an average value of 3,000 cigarettes per adult per year. According to the 1981-1983 National Health Survey, 37% of Swiss men were current smokers, 25% were ex-smokers, and 39% were never smokers. Corresponding proportions in women were 22, 11, and 67%. Among men, smoking prevalence was higher in lower social classes, and some moderate decline was apparent from survey data over the period 1975-1981, mostly in later middle age. Trends in lung cancer death certification rates over the period 1950-1984 were analyzed using standard cross-sectional methods and a log-linear Poisson model to isolate the effects of age, birth cohort, and year of death. Mortality from lung cancer increased substantially among Swiss men between the early 1950s and the late 1970s, and levelled off (around a value of 70/100,000 men) thereafter. Among women, there has been a steady upward trend which started in the mid-1960s, and continues to climb steadily, although lung cancer mortality is still considerably lower in absolute terms (around 8/100,000 women) than in several North European countries or in North America. Cohort analyses indicate that the peak rates in men were reached by the generation born around 1910 and mortality stabilized for subsequent generations up to the 1930 birth cohort. Among females, marked increases were observed in each subsequent birth cohort. This pattern of trends is consistent with available information on smoking prevalence in successive generations, showing a peak among men for the 1910 cohort, but steady upward trends among females. Over the period 1980-1984, about 90% of lung cancer deaths among Swiss men and about 40% of those among women could be attributed to smoking (overall proportion, 85%).
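The log-linear Poisson modelling step mentioned above can be illustrated with a small sketch. The code below is not the authors' analysis: the death counts, person-years and age/cohort categories are made up, and only age and cohort effects are fitted (adding calendar period as well requires an identifiability constraint). It only shows how such a model is typically set up with an exposure offset.

```python
# Illustrative age-cohort log-linear Poisson regression for mortality trends.
# All numbers are made up; this is not the authors' analysis.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

ages = ["45-54", "55-64", "65-74", "75-84"]
cohorts = ["1890", "1900", "1910", "1920", "1930"]
rows = []
for i, age in enumerate(ages):
    for j, cohort in enumerate(cohorts):
        person_years = 50_000.0
        # Toy rates: risk rises with age, and cohort effects level off after 1910.
        rate = 20e-5 * (1.6 ** i) * (1.3 ** min(j, 2))
        rows.append({"age": age, "cohort": cohort, "py": person_years,
                     "deaths": rng.poisson(rate * person_years)})
df = pd.DataFrame(rows)

# log E[deaths] = log(person-years) + age effects + cohort effects.
# Adding period too needs an extra constraint because age, period and cohort
# are linearly dependent (the usual age-period-cohort identification issue).
fit = smf.glm("deaths ~ C(age) + C(cohort)", data=df,
              family=sm.families.Poisson(),
              offset=np.log(df["py"])).fit()
print(fit.summary())
```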
Abstract:
BACKGROUND: APOBEC3G (apolipoprotein B mRNA-editing enzyme, catalytic polypeptide-like 3G) has antiretroviral activity associated with the hypermutation of viral DNA through cytosine deamination. APOBEC3G has two cytosine deaminase (CDA) domains; the catalytically inactive amino-terminal domain of APOBEC3G (N-CDA) carries the Vif interaction domain. No 3-D structure of APOBEC3G has been solved by X-ray crystallography or nuclear magnetic resonance. METHODOLOGY/PRINCIPAL FINDINGS: We predicted the structure of human APOBEC3G based on the crystal structure of APOBEC2. To assess the model structure, we evaluated 48 mutants of APOBEC3G N-CDA that identify novel variants altering ΔVif HIV-1 infectivity and packaging of APOBEC3G. Results indicated that the key residue D128 is exposed at the surface of the model, with a negative local electrostatic potential. Mutation D128K changes the sign of that local potential. In addition, two novel functionally relevant residues that result in defective APOBEC3G encapsidation, R122 and W127, cluster at the surface. CONCLUSIONS/SIGNIFICANCE: The structure model identifies a cluster of residues important for packaging of APOBEC3G into virions, and may serve to guide functional analysis of APOBEC3G.
Abstract:
PURPOSE: Quantification of myocardial blood flow (MBF) with generator-produced (82)Rb is an attractive alternative for centres without an on-site cyclotron. Our aim was to validate (82)Rb-measured MBF in relation to that measured using (15)O-water, as a tracer 100% of which can be extracted from the circulation even at high flow rates, in healthy control subjects and patients with mild coronary artery disease (CAD). METHODS: MBF was measured at rest and during adenosine-induced hyperaemia with (82)Rb and (15)O-water PET in 33 participants (22 control subjects, aged 30 ± 13 years; 11 CAD patients without transmural infarction, aged 60 ± 13 years). A one-tissue compartment (82)Rb model with ventricular spillover correction was used. The (82)Rb flow-dependent extraction rate was derived from (15)O-water measurements in a subset of 11 control subjects. Myocardial flow reserve (MFR) was defined as the hyperaemic/rest MBF. Pearson's correlation r, Bland-Altman 95% limits of agreement (LoA), and Lin's concordance correlation ρ(c) (measuring both precision and accuracy) were used. RESULTS: Over the entire MBF range (0.66-4.7 ml/min/g), concordance was excellent for MBF (r = 0.90, [(82)Rb-(15)O-water] mean difference ± SD = 0.04 ± 0.66 ml/min/g, LoA = -1.26 to 1.33 ml/min/g, ρ(c) = 0.88) and MFR (range 1.79-5.81, r = 0.83, mean difference = 0.14 ± 0.58, LoA = -0.99 to 1.28, ρ(c) = 0.82). Hyperaemic MBF was reduced in CAD patients compared with the subset of 11 control subjects (2.53 ± 0.74 vs. 3.62 ± 0.68 ml/min/g, p = 0.002, for (15)O-water; 2.53 ± 1.01 vs. 3.82 ± 1.21 ml/min/g, p = 0.013, for (82)Rb) and this was paralleled by a lower MFR (2.65 ± 0.62 vs. 3.79 ± 0.98, p = 0.004, for (15)O-water; 2.85 ± 0.91 vs. 3.88 ± 0.91, p = 0.012, for (82)Rb). Myocardial perfusion was homogeneous in 1,114 of 1,122 segments (99.3%) and there were no differences in MBF among the coronary artery territories (p > 0.31). CONCLUSION: Quantification of MBF with (82)Rb using a newly derived correction for the nonlinear extraction function was validated against MBF measured using (15)O-water in control subjects and patients with mild CAD, where it was found to be accurate at high flow rates. (82)Rb-derived MBF estimates seem robust for clinical research, advancing a step further towards its implementation in clinical routine.
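The flow-dependent extraction correction referred to above can be sketched as follows. In a one-tissue-compartment model the fitted uptake rate K1 equals MBF multiplied by the first-pass extraction fraction E(MBF), so MBF is recovered by solving K1 = MBF · E(MBF). The generalized Renkin-Crone form and the coefficients used below are illustrative assumptions, not the calibration derived from the (15)O-water data in the study.

```python
# Recover MBF from a fitted K1 by inverting K1 = MBF * E(MBF), where
# E(MBF) = 1 - A * exp(-B / MBF) is a generalized Renkin-Crone extraction
# function. A and B are illustrative placeholders, not the coefficients
# calibrated against (15)O-water in the paper.
import math
from scipy.optimize import brentq

A, B = 0.77, 0.63   # assumed extraction parameters (B in ml/min/g)

def extraction(mbf: float) -> float:
    """Flow-dependent first-pass extraction fraction of 82Rb (assumed form)."""
    return 1.0 - A * math.exp(-B / mbf)

def mbf_from_k1(k1: float) -> float:
    """Solve K1 = MBF * E(MBF) for MBF on a physiologic range (ml/min/g)."""
    return brentq(lambda f: f * extraction(f) - k1, 0.05, 10.0)

for k1 in (0.5, 1.0, 1.5):
    print(f"K1 = {k1:.2f} ml/min/g  ->  MBF = {mbf_from_k1(k1):.2f} ml/min/g")
```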
Abstract:
OBJECTIVES: The reconstruction of the right ventricular outflow tract (RVOT) with valved conduits remains a challenge. The reoperation rate at 5 years can be as high as 25% and depends on age, type of conduit, conduit diameter and principal heart malformation. The aim of this study is to provide a bench model with computer fluid dynamics to analyse the haemodynamics of the RVOT, pulmonary artery, its bifurcation, and left and right pulmonary arteries that in the future may serve as a tool for analysis and prediction of outcome following RVOT reconstruction. METHODS: Pressure, flow and diameter at the RVOT, pulmonary artery, bifurcation of the pulmonary artery, and left and right pulmonary arteries were measured in five normal pigs with a mean weight of 24.6 ± 0.89 kg. Data obtained were used for a 3D computer fluid-dynamics simulation of flow conditions, focusing on the pressure, flow and shear stress profile of the pulmonary trunk to the level of the left and right pulmonary arteries. RESULTS: Three inlet steady flow profiles were obtained at 0.2, 0.29 and 0.36 m/s that correspond to the flow rates of 1.5, 2.0 and 2.5 l/min flow at the RVOT. The flow velocity profile was constant at the RVOT down to the bifurcation and decreased at the left and right pulmonary arteries. In all three inlet velocity profiles, low shear stress and low-velocity areas were detected along the left wall of the pulmonary artery, at the pulmonary artery bifurcation and at the ostia of both pulmonary arteries. CONCLUSIONS: This real-time computer fluid-dynamics model provides us with a realistic picture of fluid dynamics in the pulmonary tract area. Deep shear stress areas correspond to a turbulent flow profile that is a predictive factor for the development of vessel wall arteriosclerosis. We believe that this bench model may be a useful tool for further evaluation of RVOT pathology following surgical reconstructions.
Abstract:
OBJECTIVES: We developed a population model that describes the ocular penetration and pharmacokinetics of penciclovir in human aqueous humour and plasma after oral administration of famciclovir. METHODS: Fifty-three patients undergoing cataract surgery received a single oral dose of 500 mg of famciclovir prior to surgery. Concentrations of penciclovir in both plasma and aqueous humour were measured by HPLC with fluorescence detection. Concentrations in plasma and aqueous humour were fitted using a two-compartment model (NONMEM software). Inter-individual and intra-individual variabilities were quantified and the influence of demographic, physiopathological and environmental variables on penciclovir pharmacokinetics was explored. RESULTS: Drug concentrations were fitted using a two-compartment, open model with first-order transfer rates between plasma and aqueous humour compartments. Among tested covariates, creatinine clearance, co-intake of angiotensin-converting enzyme inhibitors and body weight significantly influenced penciclovir pharmacokinetics. Plasma clearance was 22.8 ± 9.1 L/h and clearance from the aqueous humour was 8.2 × 10(-5) L/h. AUCs were 25.4 ± 10.2 and 6.6 ± 1.8 μg · h/mL in plasma and aqueous humour, respectively, yielding a penetration ratio of 0.28 ± 0.06. Simulated concentrations in the aqueous humour after administration of 500 mg of famciclovir three times daily were in the range of values required for 50% growth inhibition of non-resistant strains of the herpes zoster virus family. CONCLUSIONS: Plasma and aqueous penciclovir concentrations showed significant variability that could only be partially explained by renal function, body weight and comedication. Concentrations in the aqueous humour were much lower than in plasma, suggesting that factors in the blood-aqueous humour barrier might prevent its ocular penetration or that redistribution occurs in other ocular compartments.
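For illustration only, the compartmental structure described above (first-order absorption into plasma and first-order transfer between plasma and aqueous humour) can be written as a small ODE system and simulated. All rate constants and volumes below are assumed placeholders except the typical plasma clearance quoted in the abstract; this is a sketch, not the published NONMEM model or its estimates.

```python
# Sketch of a two-compartment model with first-order absorption and
# first-order transfer between plasma and aqueous humour. Parameter values
# marked "assumed" are illustrative placeholders, not population estimates.
import numpy as np
from scipy.integrate import solve_ivp

DOSE_MG = 500.0    # oral famciclovir dose, treated as fully converted (simplification)
KA = 1.2           # absorption rate constant, 1/h (assumed)
CL = 22.8          # plasma clearance, L/h (typical value reported in the abstract)
V_PLASMA = 80.0    # plasma volume of distribution, L (assumed)
K_PA = 0.02        # plasma -> aqueous humour transfer rate, 1/h (assumed)
K_AP = 0.30        # aqueous humour -> plasma transfer rate, 1/h (assumed)

def rhs(t, y):
    """y = [amount in gut, amount in plasma, amount in aqueous humour] (mg)."""
    gut, plasma, aqueous = y
    d_gut = -KA * gut
    d_plasma = (KA * gut - (CL / V_PLASMA) * plasma
                - K_PA * plasma + K_AP * aqueous)
    d_aqueous = K_PA * plasma - K_AP * aqueous
    return [d_gut, d_plasma, d_aqueous]

t_eval = np.linspace(0.0, 12.0, 121)
sol = solve_ivp(rhs, (0.0, 12.0), [DOSE_MG, 0.0, 0.0], t_eval=t_eval)

plasma_conc = sol.y[1] / V_PLASMA   # mg/L, i.e. ug/mL
print(f"Simulated plasma Cmax ~ {plasma_conc.max():.2f} ug/mL")
```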
Abstract:
We analyze crash data collected by the Iowa Department of Transportation using Bayesian methods. The data set includes monthly crash numbers, estimated monthly traffic volumes, site length and other information collected at 30 paired sites in Iowa over more than 20 years during which an intervention experiment was set up. The intervention consisted of converting 15 undivided road segments from four lanes to three, while an additional 15 segments, thought to be comparable in terms of traffic safety-related characteristics, were not converted. The main objective of this work is to find out whether the intervention reduces the number of crashes and the crash rates at the treated sites. We fitted a hierarchical Poisson regression model with a change-point to the number of monthly crashes per mile at each of the sites. Explanatory variables in the model included estimated monthly traffic volume, time, an indicator for intervention reflecting whether the site was a “treatment” or a “control” site, and various interactions. We accounted for seasonal effects in the number of crashes at a site by including smooth trigonometric functions with three different periods to reflect the four seasons of the year. A change-point at the month and year in which the intervention was completed for treated sites was also included. The number of crashes at a site can be thought to follow a Poisson distribution. To estimate the association between crashes and the explanatory variables, we used a log link function and added a random effect to account for overdispersion and for autocorrelation among observations obtained at the same site. We used proper but non-informative priors for all parameters in the model, and carried out all calculations using Markov chain Monte Carlo methods implemented in WinBUGS. We evaluated the effect of the four- to three-lane conversion by comparing the expected number of crashes per year per mile during the years preceding the conversion and following the conversion for treatment and control sites. We estimated this difference using the observed traffic volumes at each site and also on a per-100,000,000-vehicles basis. We also conducted a prospective analysis to forecast the expected number of crashes per mile at each site in the study one year, three years and five years following the four- to three-lane conversion. Posterior predictive distributions of the number of crashes, the crash rate and the percent reduction in crashes per mile were obtained for each site for the months of January and June one, three and five years after completion of the intervention. The model appears to fit the data well. We found that at most sites, the intervention was effective and reduced the number of crashes. Overall, and for the observed traffic volumes, the reduction in the expected number of crashes per year and mile at converted sites was 32.3% (31.4% to 33.5% with 95% probability), while at the control sites, the reduction was estimated to be 7.1% (5.7% to 8.2% with 95% probability). When the reduction in the expected number of crashes per year, mile and 100,000,000 AADT was computed, the estimates were 44.3% (43.9% to 44.6%) and 25.5% (24.6% to 26.0%) for converted and control sites, respectively. In both cases, the percent reduction in the expected number of crashes during the years following the conversion was significantly larger at converted sites than at control sites, even though the number of crashes appears to decline over time at all sites.
Results indicate that the expected number of crashes per mile declines more steeply at converted than at control sites. Consistent with this, the forecasted reduction in the number of crashes per year and mile during the years after completion of the conversion at converted sites is more pronounced than at control sites. Seasonal effects on the number of crashes have been well-documented. In this dataset, we found that, as expected, the expected number of monthly crashes per mile tends to be higher during winter months than during the rest of the year. Perhaps more interestingly, we found that there is an interaction between the four- to three-lane conversion and season; the reduction in the number of crashes appears to be more pronounced during months when the weather is nice than during other times of the year, even though a reduction was estimated for the entire year. Thus, it appears that the four- to three-lane conversion, while effective year-round, is particularly effective in reducing the expected number of crashes in nice weather.
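A much-simplified version of the crash-frequency model described in this abstract can be sketched as a plain Poisson GLM with a log link, an exposure offset, a treatment-by-change-point interaction and trigonometric seasonal terms. The data, column names and coefficients below are simulated and hypothetical; the site-level random effect and the MCMC estimation in WinBUGS of the full hierarchical model are omitted.

```python
# Simplified sketch: monthly crashes per site as Poisson with a log link,
# an exposure offset (traffic volume x segment length), a treatment-by-
# post-conversion interaction (the change-point) and seasonal harmonics.
# All data are simulated; the published model is hierarchical with a
# site-level random effect and was fitted by MCMC in WinBUGS.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_sites, n_months = 30, 60
df = pd.DataFrame({
    "site": np.repeat(np.arange(n_sites), n_months),
    "month": np.tile(np.arange(n_months), n_sites),
})
df["treated"] = (df["site"] < 15).astype(int)              # converted segments
df["post"] = (df["month"] >= 36).astype(int)               # after the conversion date
df["exposure"] = rng.uniform(0.5e5, 2.0e5, size=len(df))   # toy monthly vehicle-miles
# Seasonal harmonics with three different periods, as in the abstract.
for p in (12, 6, 4):
    df[f"sin{p}"] = np.sin(2 * np.pi * df["month"] / p)
    df[f"cos{p}"] = np.cos(2 * np.pi * df["month"] / p)

# Simulate toy counts with a post-conversion reduction at treated sites only.
eta = (np.log(df["exposure"]) - 11.0
       + 0.25 * df["sin12"] + 0.10 * df["cos12"]
       - 0.45 * df["treated"] * df["post"])
df["crashes"] = rng.poisson(np.exp(eta))

fit = smf.glm("crashes ~ treated * post + sin12 + cos12 + sin6 + cos6 + sin4 + cos4",
              data=df, family=sm.families.Poisson(),
              offset=np.log(df["exposure"])).fit()
# exp(coefficient of treated:post) estimates the multiplicative change in the
# crash rate at converted sites after the change-point.
print(fit.params)
```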
Abstract:
A skill-biased change in technology can account at once for the changes observed in a number of important variables of the US labour market between 1970 and 1990. These include the increasing inequality in wages, both between and within education groups, and the increase in unemployment at all levels of education. In previous literature, by contrast, this type of technology shock could not account for all of these changes. The paper uses a matching model with a segmented labour market, an imperfect correlation between individual ability and education, and a fixed cost of setting up a job. The endogenous increase in overeducation is key to understanding the response of unemployment to the technology shock.
Abstract:
This paper investigates the properties of an international real business cycle model with household production. We show that a model with disturbances to both market and household technologies reproduces the main regularities of the data and improves on existing models in matching international consumption, investment and output correlations without unrealistic assumptions about the structure of international financial markets. Sensitivity analysis shows the robustness of the results to alternative specifications of the stochastic processes for the disturbances and to variations of unmeasured parameters within a reasonable range.
Abstract:
This paper presents a model of electoral competition focusing on the formation of the public agenda. An incumbent government and a challenger party in opposition compete in elections by choosing the issues on which they will center their campaigns. Giving salience to an issue implies proposing an innovative policy proposal, alternative to the status quo. Parties trade off the issues with high salience in voters' concerns and those with broad agreement on some alternative policy proposal. Each party expects a higher probability of victory if the issue it chooses becomes salient in the voters' decision. But remarkably, the issues which are considered the most important ones by a majority of voters may not be given salience during the electoral campaign. An incumbent government may survive in spite of its bad policy performance if there is no sufficiently broad agreement on a policy alternative. We illustrate the analytical potential of the model with the case of the United States presidential election in 2004.
Abstract:
According to Ljungqvist and Sargent (1998), high European unemployment since the 1980s can be explained by a rise in economic turbulence, leading to greater numbers of unemployed workers with obsolete skills. These workers refuse new jobs due to high unemployment benefits. In this paper we reassess the turbulence-unemployment relationship using a matching model with endogenous job destruction. In our model, higher turbulence reduces the incentives of employed workers to leave their jobs. If turbulence has only a tiny effect on the skills of workers experiencing endogenous separation, then the results of Ljungqvist and Sargent (1998, 2004) are reversed, and higher turbulence leads to a reduction in unemployment. Thus, changes in turbulence cannot provide an explanation for European unemployment that reconciles the incentives of both unemployed and employed workers.
Abstract:
In this paper we present an algorithm to assign proctors to exams. This NP-hard problem is related to the generalized assignment problem with multiple objectives. The problem consists of assigning teaching assistants to proctor final exams at a university. We formulate this problem as a multiobjective integer program (IP) with a preference function and a workload-fairness function. We then consider also a weighted objective that combines both functions. We develop a scatter search procedure and compare its outcome with solutions found by solving the IP model with CPLEX 6.5. Our test problems are real instances from a university in Spain.
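A minimal sketch of this kind of weighted-objective assignment IP, written with PuLP rather than CPLEX: the teaching assistants, exams, preference scores and fairness term below are hypothetical and do not reproduce the paper's formulation, instances or scatter search procedure.

```python
# Toy weighted-objective proctor assignment IP (a sketch, not the paper's model).
# Each exam needs a fixed number of proctors; each TA has a preference score per
# exam; fairness is approximated here by penalizing the maximum TA load.
import pulp

tas = ["ta1", "ta2", "ta3", "ta4"]
exams = {"exam1": 2, "exam2": 1, "exam3": 2}          # proctors required per exam
pref = {(t, e): p
        for t, row in zip(tas, [[3, 1, 2], [2, 2, 3], [1, 3, 1], [2, 2, 2]])
        for e, p in zip(exams, row)}                  # hypothetical preference scores
w_pref, w_fair = 1.0, 2.0                             # weights of the two objectives

prob = pulp.LpProblem("proctor_assignment", pulp.LpMaximize)
x = pulp.LpVariable.dicts("x", (tas, exams), cat="Binary")
max_load = pulp.LpVariable("max_load", lowBound=0)

# Weighted objective: total preference minus a fairness penalty on the max load.
prob += (w_pref * pulp.lpSum(pref[t, e] * x[t][e] for t in tas for e in exams)
         - w_fair * max_load)

for e, need in exams.items():                         # staffing requirement per exam
    prob += pulp.lpSum(x[t][e] for t in tas) == need
for t in tas:                                         # max_load bounds every TA's load
    prob += pulp.lpSum(x[t][e] for e in exams) <= max_load

prob.solve(pulp.PULP_CBC_CMD(msg=0))
for t in tas:
    print(t, "->", [e for e in exams if x[t][e].value() == 1])
```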
Abstract:
This paper analyzes a two-alternative voting model with the distinctive feature that voters have preferences over margins of victory. We study voting contests with a finite as well as an infinite number of voters, and with and without mandatory voting. The main result of the paper is the existence and characterization of a unique equilibrium outcome in all those situations. At equilibrium, voters who prefer a larger support for one of the alternatives vote for that alternative. The model also provides a formal argument for the conditional sincerity voting condition in Alesina and Rosenthal (1995) and the benefit of voting function in Llavador (2006). Finally, we offer new insights into why some citizens may vote strategically for an alternative different from the one they declare as most preferred.
Abstract:
We develop an equilibrium search-matching model with risk-neutral agents and two-sided ex-ante heterogeneity. Unemployment insurance has the standard effect of reducing employment, but also helps workers to get a suitable job. The predictions of our simple model are consistent with the contrasting performance of the labor market in Europe and the US in terms of unemployment, productivity growth and wage inequality. To show this, we construct two fictitious economies with calibrated parameters which only differ by the degree of unemployment insurance and assume that they are hit by a common technological shock which enhances the importance of mismatch. This shock reduces the proportion of jobs which workers regard as acceptable in the economy with unemployment insurance (Europe). As a result, unemployment doubles in this economy. In the laissez-faire economy (US), unemployment remains constant, but wage inequality increases more and productivity grows less due to larger mismatch. The model can be used to address a number of normative issues.