867 results for conditional expected utility


Relevance: 20.00%

Abstract:

Audit report on the Regional Utility Service Systems Commission for the year ended June 30, 2013

Relevance: 20.00%

Abstract:

Epithelial sodium channels (ENaC) are members of the degenerin/ENaC superfamily of non-voltage-gated, highly amiloride-sensitive cation channels and are composed of three subunits (alpha-, beta-, and gamma-ENaC). Since complete inactivation of the beta- and gamma-ENaC subunit genes (Scnn1b and Scnn1g) leads to early postnatal death, we generated conditional alleles and obtained mice harboring floxed and null alleles for both gene loci. Using quantitative RT-PCR analysis, we showed that the introduction of the loxP sites did not interfere with the mRNA expression levels of the Scnn1b and Scnn1g loci, respectively. On both a regular and a salt-deficient diet, beta- and gamma-ENaC floxed mice showed no differences in mRNA expression levels, plasma electrolytes, or aldosterone concentrations, nor in weight changes, compared with control animals. These mice can now be used to dissect the role of ENaC function in classical and nonclassical target organs and tissues.

Relevance: 20.00%

Abstract:

We propose new methods for evaluating predictive densities. The methods include Kolmogorov-Smirnov and Cramér-von Mises-type tests for the correct specification of predictive densities, robust to dynamic mis-specification. The novelty is that the tests can detect mis-specification in the predictive densities even if it appears only over a fraction of the sample, due to the presence of instabilities. Our results indicate that the tests are well sized and have good power in detecting mis-specification in predictive densities, even when it is time-varying. An application to density forecasts from the Survey of Professional Forecasters demonstrates the usefulness of the proposed methodologies.
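
The baseline idea behind such tests can be illustrated with the probability integral transform (PIT). The sketch below shows only the classical, non-robust version with a hypothetical Gaussian forecaster; the paper's contribution is making this kind of test robust to dynamic mis-specification and instabilities.

```python
# Minimal PIT-based density-forecast check (NOT the paper's robust test):
# under correct specification, the PITs z_t = F_t(y_t) are i.i.d.
# Uniform(0,1), which a Kolmogorov-Smirnov statistic can test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

T = 500
mu = rng.normal(0.0, 1.0, T)        # hypothetical forecast means
sigma = np.full(T, 1.0)             # hypothetical forecast std. deviations
y = mu + rng.normal(0.0, 1.2, T)    # truth more dispersed: mis-specified

z = stats.norm.cdf(y, loc=mu, scale=sigma)   # PITs

ks_stat, p_value = stats.kstest(z, "uniform")
print(f"KS statistic = {ks_stat:.3f}, p-value = {p_value:.4f}")
# A small p-value flags mis-specification of the predictive density;
# scipy.stats.cramervonmises(z, "uniform") gives the CvM analogue.
```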

Relevance: 20.00%

Abstract:

BACKGROUND: The 2013 AHA/ACC guidelines on the treatment of cholesterol advised prescribing high-intensity statins after acute coronary syndromes (ACS), while the previous ATP-III guidelines recommended titrating statins to reach low-density lipoprotein cholesterol (LDL-C) targets. We simulated the impact of this change of paradigm on the achievement of recommended targets. METHODS: Within a prospective cohort study of consecutive patients hospitalized for ACS from 2009 to 2012 at four Swiss university hospitals, we analyzed 1602 patients who survived one year after recruitment. Targets based on the previous guidelines were defined as (1) achievement of an LDL-C target < 1.8 mmol/l, (2) a reduction in LDL-C ≥ 50%, or (3) intensification of statin therapy in patients who did not reach LDL-C targets. Targets based on the 2013 AHA/ACC guidelines were defined as maximization of statin therapy at high intensity in patients aged ≤75 years and moderate- or high-intensity statin therapy in patients >75 years. RESULTS: 1578 (99%) patients were prescribed a statin at discharge, 1120 (70%) at high intensity. 1507 patients (94%) reported taking a statin at one year, 909 (57%) at high intensity. Among 482 patients discharged with sub-maximal statin therapy, intensification was observed in only 109 patients (23%). 773 (47%) patients reached the previous LDL-C targets, while 1014 (63%) reached the 2013 AHA/ACC guidelines targets one year after ACS (p value < 0.001). CONCLUSION: Applying the new 2013 AHA/ACC criteria would substantially increase the proportion of patients achieving recommended lipid targets one year after ACS. Clinical trial number: NCT01075868.
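
To make the two target definitions concrete, the following sketch encodes them as predicate functions. The field names and example values are illustrative only, not the study's analysis code; the thresholds are those stated in the abstract.

```python
# Sketch of the two target definitions compared in the study; the record
# fields and the example patient below are hypothetical illustrations.

def meets_previous_targets(ldl_1y, ldl_baseline, statin_intensified):
    """ATP-III-style approach: LDL-C < 1.8 mmol/l, OR >= 50% LDL-C
    reduction, OR statin intensification when targets were missed."""
    return (ldl_1y < 1.8
            or (ldl_baseline > 0
                and (ldl_baseline - ldl_1y) / ldl_baseline >= 0.50)
            or statin_intensified)

def meets_2013_ahaacc_targets(age, statin_intensity):
    """2013 AHA/ACC approach: high-intensity statin if <= 75 years,
    moderate- or high-intensity if older."""
    if age <= 75:
        return statin_intensity == "high"
    return statin_intensity in ("moderate", "high")

# Hypothetical patient one year after ACS:
print(meets_previous_targets(ldl_1y=2.1, ldl_baseline=3.9,
                             statin_intensified=False))        # False
print(meets_2013_ahaacc_targets(age=68, statin_intensity="high"))  # True
```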

Relevance: 20.00%

Abstract:

Sickness absence (SA) is an important social, economic, and public health issue. Identifying and understanding the determinants, whether biological, regulatory, or health-services-related, of variability in SA duration is essential for better management of SA. The conditional frailty model (CFM) is useful when repeated SA events occur within the same individual, as it allows simultaneous analysis of event dependence and of heterogeneity due to unknown, unmeasured, or unmeasurable factors. However, its use may encounter computational limitations when applied to very large data sets, as frequently occurs in the analysis of SA duration. To overcome this computational issue, we propose a Poisson-based conditional frailty model (CFPM) for repeated SA events that accounts for both event dependence and heterogeneity. To demonstrate the usefulness of the proposed model in the SA duration context, we used data from all non-work-related SA episodes that occurred in Catalonia (Spain) in 2007 and were initiated by a diagnosis of either neoplasms or mental and behavioral disorders. As expected, the CFPM results were very similar to those of the CFM for both diagnosis groups. The CPU time for the CFPM was substantially shorter than for the CFM. The CFPM is a suitable alternative to the CFM in survival analysis with recurrent events, especially with large databases.
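
The Poisson trick underlying a CFPM-style analysis can be sketched as follows: each gap time contributes a Poisson observation with a log-exposure offset, and event dependence enters through event-number strata. This is a hedged illustration on simulated data, not the authors' implementation; a full CFPM would also include a subject-level frailty (random intercept), which is omitted here for brevity.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
rows = []
for sid in range(300):
    frailty = rng.gamma(shape=2.0, scale=0.5)     # unobserved heterogeneity
    for ep in range(1, 4):                        # three episodes per subject
        rate = 0.02 * frailty * 1.3 ** (ep - 1)   # hazard rises with episode
        rows.append({"subject": sid, "episode": ep,
                     "exposure": rng.exponential(1.0 / rate), "event": 1})
df = pd.DataFrame(rows)

# Poisson GLM with log(exposure) offset: equivalent to an exponential
# gap-time model; episode-number dummies capture event dependence
# (the "conditional" part of the CFM).
X = pd.get_dummies(df["episode"], prefix="ep", drop_first=True).astype(float)
X = sm.add_constant(X)
fit = sm.GLM(df["event"], X, family=sm.families.Poisson(),
             offset=np.log(df["exposure"])).fit()
print(fit.summary())
```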

Relevance: 20.00%

Abstract:

Many governments in developing countries implement programs that aim to address nutritional failures in early childhood, yet evidence on the effectiveness of these interventions is scant. This paper evaluates the impact of a conditional food supplementation program on child mortality in Ecuador. The Programa de Alimentación y Nutrición Nacional (PANN) 2000 was implemented by regular staff at local public health posts and consisted of offering a free micronutrient-fortified food, Mi Papilla, for children aged 6 to 24 months in exchange for routine health check-ups for the children. Our regression discontinuity design exploits the fact that, at its inception, the PANN 2000 ran for about 8 months only in the poorest communities (parroquias) of certain provinces. Our main result is that the presence of the program reduced child mortality in cohorts with 8 months of differential exposure from a level of about 2.5 percent by 1 to 1.5 percentage points.
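
The design can be illustrated with a textbook sharp regression-discontinuity estimate: fit local linear regressions on either side of the eligibility cutoff and read the program effect off the jump. Everything below is simulated for illustration (the -1.2 point "effect" is planted to mimic the magnitude reported above); it is not the PANN 2000 data or the authors' specification.

```python
# Minimal sharp regression-discontinuity sketch with simulated data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 4000
poverty_index = rng.uniform(-1, 1, n)        # centered at the cutoff (0)
treated = (poverty_index > 0).astype(float)  # poorest communities eligible
# Hypothetical mortality (percent): smooth in the index, shifted down by
# the program by 1.2 points, mimicking the size reported above.
mortality = 2.5 + 0.8 * poverty_index - 1.2 * treated + rng.normal(0, 0.6, n)

# Local linear regression within a bandwidth h of the cutoff, with separate
# slopes on each side; the coefficient on `treated` is the RD estimate.
h = 0.25
w = np.abs(poverty_index) < h
X = sm.add_constant(np.column_stack(
    [treated, poverty_index, treated * poverty_index]))
fit = sm.OLS(mortality[w], X[w]).fit()
print(f"RD estimate at cutoff: {fit.params[1]:.2f} percentage points")
```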

Relevance: 20.00%

Abstract:

This policy covers initial placement, adjustment, relocation, and replacement of utility facilities in, on, above, or below all highway right-of-way over which the Iowa State Highway Commission exercises control of access. It embodies the basic specifications and standards needed to ensure the safety of the highway user and the integrity of the highway.

Relevance: 20.00%

Abstract:

This policy covers initial placement, adjustment, relocation, and replacement of utility facilities in, on, above, or below all highway right-of-way over which the Iowa State Highway Commission exercises control of access. It embodies the basic specifications and standards needed to ensure the safety of the highway user and the integrity of the highway. (1973 revision to the 1970 policy.)

Relevance: 20.00%

Abstract:

Introduction: This dissertation consists of three essays in equilibrium asset pricing. The first chapter studies the asset pricing implications of a general equilibrium model in which real investment is reversible at a cost. Firms face higher costs in contracting than in expanding their capital stock and decide to invest when their productive capital is scarce relative to the overall capital of the economy. Positive shocks to the capital of the firm increase the size of the firm and reduce the value of growth options. As a result, the firm is burdened with more unproductive capital and its value declines relative to its accumulated capital. The optimal consumption policy alters the optimal allocation of resources and affects the firm's value, generating mean-reverting dynamics for market-to-book (M/B) ratios. The model (1) captures the convergence of price-to-book ratios (negative for growth stocks and positive for value stocks), that is, firm migration, (2) generates deviations from the classic CAPM in line with the cross-sectional variation in expected stock returns, and (3) generates a non-monotone relationship between Tobin's q and conditional volatility, consistent with the empirical evidence. The second chapter studies a standard portfolio-choice problem with transaction costs and mean reversion in expected returns. In the presence of transaction costs, no matter how small, arbitrage activity does not necessarily render all riskless rates of return equal. When two such rates follow stochastic processes, it is not optimal to immediately arbitrage out any discrepancy that arises between them. The reason is that immediate arbitrage would incur a definite expenditure of transaction costs, whereas, without intervention, there is some, perhaps sufficient, probability that the two interest rates will come back together without any costs having been incurred. Hence, at equilibrium the financial market can permit the coexistence of two riskless rates that are not equal to each other. For analogous reasons, randomly fluctuating expected rates of return on risky assets are allowed to differ even after correction for risk, leading to important violations of the Capital Asset Pricing Model. The combination of randomness in expected rates of return and proportional transaction costs is a serious blow to existing frictionless pricing models. Finally, in the last chapter I propose a two-country, two-good general equilibrium economy with uncertainty about the fundamentals' growth rates to study the joint behavior of equity volatilities and correlations at the business cycle frequency. I assume that dividend growth rates jump from one state to another, while the countries' switches are possibly correlated. The model is solved in closed form and analytical expressions for stock prices are reported. When calibrated to empirical data for the United States and the United Kingdom, the results show that, given the existing degree of synchronization across these business cycles, the model captures the historical patterns of stock return volatilities quite well. Moreover, it can explain the time behavior of the correlation, but only under the assumption of a global business cycle.
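
The second-chapter intuition, that a mean-reverting spread between two riskless rates need not be arbitraged immediately when trading is costly, can be sketched numerically. The parameters below are hypothetical illustrations, not calibrated values from the dissertation.

```python
# Simulate a mean-reverting (Ornstein-Uhlenbeck) rate spread and count how
# often it closes on its own, at no cost, within a year.
import numpy as np

rng = np.random.default_rng(3)
kappa, sigma = 2.0, 0.01    # mean-reversion speed and volatility of spread
cost = 0.005                # proportional cost of arbitraging it away
dt, steps, n_paths = 1 / 252, 252, 10_000

for s0 in (0.002, 0.005, 0.02):
    s = np.full(n_paths, s0)
    closed = np.zeros(n_paths, dtype=bool)
    for _ in range(steps):  # Euler scheme for the OU spread
        s += -kappa * s * dt + sigma * np.sqrt(dt) * rng.normal(size=n_paths)
        closed |= s <= 0.0
    print(f"spread {s0:.3f}: closes by itself within a year in "
          f"{closed.mean():.0%} of paths (vs. paying cost {cost} up front)")
# Small discrepancies almost surely revert at no cost, so immediate
# arbitrage is suboptimal and a no-trade band around zero emerges.
```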

Relevance: 20.00%

Abstract:

Geophysical techniques can help to bridge the inherent gap, with regard to spatial resolution and range of coverage, that plagues classical hydrological methods. This has led to the emergence of the new and rapidly growing field of hydrogeophysics. Given the differing sensitivities of various geophysical techniques to hydrologically relevant parameters and their inherent trade-off between resolution and range, the fundamental usefulness of multi-method hydrogeophysical surveys for reducing uncertainties in data analysis and interpretation is widely accepted. A major challenge arising from such endeavors is the quantitative integration of the resulting vast and diverse database in order to obtain a unified model of the probed subsurface region that is internally consistent with all available data. To address this problem, we have developed a strategy for hydrogeophysical data integration based on Monte-Carlo-type conditional stochastic simulation that we consider particularly suitable for local-scale studies characterized by high-resolution, high-quality datasets. Monte-Carlo-based optimization techniques are flexible and versatile, can accommodate a wide variety of data and constraints of differing resolution and hardness, and thus have the potential to provide, in a geostatistical sense, highly detailed and realistic models of the pertinent target parameter distributions. Compared with more conventional approaches of this kind, our approach provides significant advancements in the way that the larger-scale deterministic information resolved by the hydrogeophysical data is accounted for, which represents an inherently problematic, and as yet unresolved, aspect of Monte-Carlo-type conditional simulation techniques. We present the results of applying our algorithm to the integration of porosity logs and tomographic crosshole georadar data to generate stochastic realizations of the local-scale porosity structure. Our procedure is first tested on pertinent synthetic data and then applied to corresponding field data collected at the Boise Hydrogeophysical Research Site near Boise, Idaho, USA.
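
As a stripped-down illustration of conditional stochastic simulation (plain conditional Gaussian simulation in one dimension, not the authors' algorithm), the sketch below draws correlated porosity realizations that honor a few hard log values exactly. The covariance model and all numbers are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(4)
x = np.linspace(0.0, 10.0, 200)          # depth grid (m)

def cov(a, b, var=0.002, L=1.5):         # exponential covariance model
    return var * np.exp(-np.abs(a[:, None] - b[None, :]) / L)

xd = np.array([1.0, 4.0, 7.5])           # hard data locations (porosity log)
yd = np.array([0.25, 0.32, 0.28])        # hard data values
mean = 0.30                              # prior mean porosity

# Conditional mean and covariance of the Gaussian field given the data.
Kdd = cov(xd, xd) + 1e-10 * np.eye(len(xd))   # jitter for stability
A = cov(x, xd) @ np.linalg.inv(Kdd)
cmean = mean + A @ (yd - mean)
ccov = cov(x, x) - A @ cov(x, xd).T

# Draw realizations via Cholesky of the conditional covariance; each draw
# reproduces the hard data and the prescribed covariance structure.
Lc = np.linalg.cholesky(ccov + 1e-8 * np.eye(len(x)))
realizations = cmean[:, None] + Lc @ rng.normal(size=(len(x), 5))
print(realizations.shape)                # (200, 5): five conditional draws
```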

Relevance: 20.00%

Abstract:

Abstract: Asthma prevalence among children and adolescents in Spain is 10-17%. It is the most common chronic illness during childhood. Prevalence has been increasing over the last 40 years, and there is considerable evidence that, among other factors, continued exposure to cigarette smoke results in asthma in children. No statistical or simulation model exists to forecast the evolution of childhood asthma in Europe. Such a model needs to incorporate the main risk factors that can be managed by medical authorities, such as tobacco (OR = 1.44), to establish how they affect the present generation of children. A simulation model for childhood asthma using conditional probability and discrete event simulation was developed and validated by simulating a realistic scenario. The parameters used for the model (input data) were those found in the literature, especially those related to the incidence of smoking in Spain. We also used data from a panel of experts at the Hospital del Mar (Barcelona) related to the actual evolution and phenotypes of asthma. The results obtained from the simulation established a threshold of a 15-20% smoking population for a reduction in the prevalence of asthma. This is still far from the current level in Spain, where 24% of people smoke. We conclude that more effort must be made to combat smoking and other childhood asthma risk factors in order to significantly reduce the number of cases. Once completed, this simulation methodology can realistically be used to forecast the evolution of childhood asthma as a function of variation in different risk factors.
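
The conditional-probability core of such a model can be sketched in a few lines: convert the reported exposure odds ratio (OR = 1.44) into an exposed-group risk, then Monte-Carlo the resulting prevalence under different smoking rates. The 10% baseline risk for unexposed children is a hypothetical placeholder, not a figure from the study.

```python
import numpy as np

rng = np.random.default_rng(5)
OR, p0 = 1.44, 0.10                   # odds ratio; unexposed baseline risk
odds0 = p0 / (1 - p0)
p1 = (OR * odds0) / (1 + OR * odds0)  # risk for smoke-exposed children

n = 200_000
for smoking_rate in (0.24, 0.20, 0.15):   # current Spanish rate and targets
    exposed = rng.random(n) < smoking_rate
    asthma = rng.random(n) < np.where(exposed, p1, p0)
    print(f"smoking {smoking_rate:.0%}: simulated asthma prevalence "
          f"{asthma.mean():.2%}")
# Lower smoking rates shift population prevalence downward, mirroring
# the 15-20% threshold discussed above.
```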

Relevance: 20.00%

Abstract:

Trench maintenance problems are caused by improper backfill placement and construction procedures. This report is part of a multiphase research project that aims to improve the long-term performance of utility cut restoration trenches. The goal of this research is to improve pavement patch life and reduce maintenance of the repaired areas. The objectives were to use field-testing data, laboratory-testing data, and long-term monitoring (elevation surveys and falling weight deflectometer testing) to suggest modifications to the recommendations from Phase I, and to identify the principles of trench subsurface settlement and load distribution in utility cut restoration areas by using instrumented trenches. The objectives were accomplished by monitoring local agency utility construction from Phase I, constructing and monitoring the trenches recommended in Phase I, and instrumenting trenches to monitor changes in temperature, pressure, moisture content, and settlement over time in order to determine the influence of seasonal changes on utility cut performance.

Relevance: 20.00%

Abstract:

Simulated-annealing-based conditional simulations provide a flexible means of quantitatively integrating diverse types of subsurface data. Although such techniques are being increasingly used in hydrocarbon reservoir characterization studies, their potential in environmental, engineering, and hydrological investigations is still largely unexploited. Here, we introduce a novel simulated annealing (SA) algorithm geared towards the integration of high-resolution geophysical and hydrological data which, compared to more conventional approaches, provides significant advancements in the way that large-scale structural information in the geophysical data is accounted for. Model perturbations in the annealing procedure are made by drawing from a probability distribution for the target parameter conditioned on the geophysical data. This is the only place where geophysical information is utilized in our algorithm, in marked contrast to other approaches where model perturbations are made by swapping values in the simulation grid and agreement with soft data is enforced through a correlation coefficient constraint. Another major feature of our algorithm is the way in which available geostatistical information is utilized. Instead of constraining realizations to match a parametric target covariance model over a wide range of spatial lags, we constrain the realizations only at the smaller lags where the available geophysical data cannot provide enough information. We thus allow the larger-scale subsurface features resolved by the geophysical data to exert much more direct control on the output realizations. Further, since the only component of the SA objective function required in our approach is a covariance constraint at small lags, our method has improved convergence and computational efficiency compared with more traditional methods. Here, we present the results of applying our algorithm to the integration of porosity logs and tomographic crosshole georadar data to generate stochastic realizations of the local-scale porosity structure. Our procedure is first tested on a synthetic data set and then applied to data collected at the Boise Hydrogeophysical Research Site.
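
A hedged one-dimensional sketch of this strategy (not the authors' code, and with hypothetical numbers throughout): perturb single cells by redrawing them from a distribution conditioned to the geophysical estimate, and accept or reject against an objective that penalizes covariance mismatch at small lags only.

```python
import numpy as np

rng = np.random.default_rng(6)
n, max_lag = 200, 5
# Target covariance model enforced at SMALL lags only (hypothetical values).
target_cov = 0.0008 * np.exp(-np.arange(1, max_lag + 1) / 3.0)

# Hypothetical geophysical estimate: a smooth large-scale porosity trend
# plus a local spread; perturbations are drawn from this conditional model.
geo_mean = 0.30 + 0.03 * np.sin(np.linspace(0, 3 * np.pi, n))
geo_std = 0.02

def small_lag_cov(m):
    d = m - m.mean()
    return np.array([np.mean(d[:-k] * d[k:]) for k in range(1, max_lag + 1)])

def objective(m):                        # covariance mismatch at small lags
    return np.sum((small_lag_cov(m) - target_cov) ** 2)

model = rng.normal(geo_mean, geo_std)    # initial realization
obj, temp = objective(model), 1e-7
for _ in range(20_000):
    i = rng.integers(n)
    trial = model.copy()
    trial[i] = rng.normal(geo_mean[i], geo_std)  # draw from conditional dist.
    trial_obj = objective(trial)
    # Metropolis rule: accept improvements always, worsenings occasionally.
    if trial_obj < obj or rng.random() < np.exp((obj - trial_obj) / temp):
        model, obj = trial, trial_obj
    temp *= 0.9997                       # geometric cooling schedule
print(f"final covariance mismatch: {obj:.3e}")
```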

Relevance: 20.00%

Abstract:

Audit report on the Regional Utility Service Systems Commission for the year ended June 30, 2014