869 results for Continuous contracts
Abstract:
BACKGROUND Continuous venovenous hemodialysis (CVVHD) may generate microemboli that cross the pulmonary circulation and reach the brain. The aim of the present study was to quantify (load per time interval) and qualify (gaseous vs. solid) cerebral microemboli (CME), detected as high-intensity transient signals, using transcranial Doppler ultrasound. MATERIALS AND METHODS Twenty intensive care unit (ICU group) patients requiring CVVHD were examined. CME were recorded in both middle cerebral arteries for 30 minutes during CVVHD and a CVVHD-free interval. Twenty additional patients, hospitalized for orthopedic surgery, served as a non-ICU control group. Statistical analyses were performed using the Mann-Whitney U test or the Wilcoxon matched-pairs signed-rank test, followed by Bonferroni corrections for multiple comparisons. RESULTS In the non-ICU group, 48 (14.5-169.5) (median [range]) gaseous CME were detected. In the ICU group, the 67.5 (14.5-588.5) gaseous CME detected during the CVVHD-free interval increased 5-fold to 344.5 (59-1019) during CVVHD (P<0.001). The number of solid CME was low in all groups (non-ICU group: 2 [0-5.5]; ICU group CVVHD-free interval: 1.5 [0-14.25]; ICU group during CVVHD: 7 [3-27.75]). CONCLUSIONS This observational pilot study shows that CVVHD was associated with a higher gaseous but not solid CME burden in critically ill patients. Although the differentiation between gaseous and solid CME remains challenging, our finding may support the hypothesis of microbubble generation in the CVVHD circuit and its transpulmonary translocation toward the intracranial circulation. Importantly, the impact of gaseous and solid CME generated during CVVHD on brain integrity of critically ill patients currently remains unknown and is highly debated.
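For illustration, a minimal sketch of the nonparametric comparisons described above (Wilcoxon matched-pairs signed-rank test within patients, Mann-Whitney U test between groups, Bonferroni correction). The counts below are hypothetical, not the study data.

```python
from scipy.stats import wilcoxon, mannwhitneyu

# Hypothetical gaseous-CME counts per 30-minute recording (10 ICU patients, 10 controls)
icu_cvvhd      = [344, 512, 120, 980, 60, 301, 450, 77, 690, 215]   # during CVVHD
icu_cvvhd_free = [ 68,  90,  15, 300, 14,  70, 110, 20, 250,  45]   # same patients, CVVHD-free
non_icu        = [ 48,  30,  15, 160, 22,  55,  90, 14, 120,  33]   # non-ICU control group

n_comparisons = 2  # number of tests being corrected for (assumed)

# Paired comparison within the ICU group: CVVHD vs. CVVHD-free interval
_, p_paired = wilcoxon(icu_cvvhd, icu_cvvhd_free)
# Unpaired comparison: ICU group during the CVVHD-free interval vs. non-ICU controls
_, p_unpaired = mannwhitneyu(icu_cvvhd_free, non_icu, alternative="two-sided")

print(f"Wilcoxon (paired), Bonferroni-corrected p: {min(p_paired * n_comparisons, 1.0):.4f}")
print(f"Mann-Whitney U, Bonferroni-corrected p:    {min(p_unpaired * n_comparisons, 1.0):.4f}")
```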
Abstract:
AIMS/HYPOTHESIS To investigate exercise-related fuel metabolism in intermittent high-intensity exercise (IHE) and continuous moderate-intensity exercise (CONT) in individuals with type 1 diabetes mellitus. METHODS In a prospective randomised open-label cross-over trial, twelve male individuals with well-controlled type 1 diabetes underwent a 90 min iso-energetic cycling session at 50% of maximal oxygen consumption, with (IHE) or without (CONT) interspersed 10 s sprints every 10 min, without insulin adaptation. Euglycaemia was maintained using oral ¹³C-labelled glucose. ¹³C magnetic resonance spectroscopy (MRS) served to quantify hepatocellular and intramyocellular glycogen. Measurements of glucose kinetics (stable isotopes), hormones and metabolites complemented the investigation. RESULTS Glucose and insulin levels were comparable between interventions. Exogenous glucose requirements during the last 30 min of exercise were significantly lower in IHE (p = 0.02). Hepatic glucose output did not differ significantly between interventions, but glucose disposal was significantly lower in IHE (p < 0.05). There was no significant difference in glycogen consumption. Growth hormone, catecholamine and lactate levels were significantly higher in IHE (p < 0.05). CONCLUSIONS/INTERPRETATION IHE in individuals with type 1 diabetes without insulin adaptation reduced exogenous glucose requirements compared with CONT. The difference was not related to increased hepatic glucose output, nor to enhanced muscle glycogen utilisation, but to decreased glucose uptake. The lower glucose disposal in IHE implies a shift towards consumption of alternative substrates. These findings indicate a high flexibility of exercise-related fuel metabolism in type 1 diabetes, and point towards a novel and potentially beneficial role of IHE in these individuals. TRIAL REGISTRATION ClinicalTrials.gov NCT02068638. FUNDING: Swiss National Science Foundation (grant number 320030_149321/) and R&A Scherbarth Foundation (Switzerland).
Abstract:
The purpose of this study is to investigate the effects of predictor-variable correlations and patterns of missingness with dichotomous and/or continuous data in small samples when missing data are multiply imputed. Missing data on the predictor variables are multiply imputed under three different multivariate models: the multivariate normal model for continuous data, the multinomial model for dichotomous data, and the general location model for mixed dichotomous and continuous data. Subsequent to the multiple imputation process, Type I error rates of the regression coefficients obtained with logistic regression analysis are estimated under various conditions of correlation structure, sample size, type of data and pattern of missing data. The distributional properties of the average mean, variance and correlations among the predictor variables are assessed after the multiple imputation process. For continuous predictor data under the multivariate normal model, Type I error rates are generally within the nominal values with samples of size n = 100. Smaller samples of size n = 50 result in more conservative estimates (i.e., lower than the nominal value). Correlation and variance estimates of the original data are retained after multiple imputation with less than 50% missing continuous predictor data. For dichotomous predictor data under the multinomial model, Type I error rates are generally conservative, which in part is due to the sparseness of the data. The correlation structure of the predictor variables is not well retained in multiply imputed data from small samples with more than 50% missing data under this model. For mixed continuous and dichotomous predictor data, the results are similar to those found under the multivariate normal model for continuous data and under the multinomial model for dichotomous data. With all data types, a fully observed variable included with the variables subject to missingness in the multiple imputation process and subsequent statistical analysis provided liberal (larger than nominal) Type I error rates under a specific pattern of missing data. It is suggested that future studies focus on the effects of multiple imputation in multivariate settings with more realistic data characteristics and a variety of multivariate analyses, assessing both Type I error and power.
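As a rough illustration of this kind of simulation, the sketch below draws continuous predictors from a multivariate normal distribution, imposes missingness completely at random, performs multiple imputation (here with scikit-learn's IterativeImputer as a stand-in for imputation under the multivariate normal model), fits a logistic regression, and estimates the Type I error rate of a null coefficient. Rubin's rules are applied in a simplified normal-approximation form; all settings are assumptions, not the study's design.

```python
import numpy as np
import statsmodels.api as sm
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(0)
n, rho, miss_rate, m_imputations, n_reps = 100, 0.4, 0.3, 5, 200
cov = np.array([[1.0, rho], [rho, 1.0]])
rejections = 0

for _ in range(n_reps):
    X = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    y = rng.binomial(1, 0.5, size=n)                    # outcome independent of X: true effects are 0
    X_miss = X.copy()
    X_miss[rng.random((n, 2)) < miss_rate] = np.nan     # missing completely at random

    betas, variances = [], []
    for _ in range(m_imputations):
        imp = IterativeImputer(sample_posterior=True, random_state=int(rng.integers(1_000_000)))
        X_imp = imp.fit_transform(X_miss)
        fit = sm.Logit(y, sm.add_constant(X_imp)).fit(disp=0)
        betas.append(fit.params[1])
        variances.append(fit.bse[1] ** 2)

    # Rubin's rules for the pooled estimate and its total variance
    beta_bar = np.mean(betas)
    within, between = np.mean(variances), np.var(betas, ddof=1)
    total_var = within + (1 + 1 / m_imputations) * between
    if abs(beta_bar / np.sqrt(total_var)) > 1.96:       # normal approximation to the reference dist.
        rejections += 1

print(f"Estimated Type I error rate: {rejections / n_reps:.3f}")
```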
Abstract:
Credit markets with asymmetric information often prefer credit rationing as a profit-maximizing device. This paper asks whether the presence of informal credit markets reduces the cost of credit rationing, that is, whether they can alleviate the impact of asymmetric information given the available information. We used a dynamic general equilibrium model with heterogeneous agents to assess this. Using Indian credit market data, our study shows that the presence of an informal credit market can reduce the cost of credit rationing by separating high-risk firms from low-risk firms in the informal market. But even after this improvement, steady-state capital accumulation is still much lower than under incentive-based market-clearing rates. Through self-revelation of each firm's type, based on the incentive mechanism, banks can diversify their risk by achieving a separating equilibrium in the loan market. The incentive mechanism helps banks increase capital accumulation in the long run by charging lower rates and lending relatively larger amounts to the less risky firms. Another important finding of this study is that self-revelation leads to a very significant welfare improvement, as measured by consumption equivalence.
Abstract:
The Everglades Depth Estimation Network (EDEN) is an integrated network of real-time water-level monitoring, ground-elevation modeling, and water-surface modeling that provides scientists and managers with current (2000-present), online water-stage and water-depth information for the entire freshwater portion of the Greater Everglades. Continuous daily spatial interpolations of the EDEN network stage data are presented on a grid with 400-square-meter spacing. EDEN offers a consistent and documented dataset that can be used by scientists and managers to: (1) guide large-scale field operations, (2) integrate hydrologic and ecological responses, and (3) support biological and ecological assessments that measure ecosystem responses to the implementation of the Comprehensive Everglades Restoration Plan (CERP) (U.S. Army Corps of Engineers, 1999). The target users are biologists and ecologists examining trophic-level responses to hydrodynamic changes in the Everglades. The first objective of this report is to validate the spatially continuous EDEN water-surface model for the Everglades, Florida, developed by Pearlstine et al. (2007), using an independent field-measured dataset. The second objective is to demonstrate two applications of the EDEN water-surface model: to estimate site-specific ground elevation by using the validated EDEN water surface and observed water-depth data, and to create water-depth hydrographs for tree islands. We found no statistically significant differences between model-predicted and field-observed water-stage data in either southern Water Conservation Area (WCA) 3A or WCA 3B. Tree island elevations were derived by subtracting field water-depth measurements from the predicted EDEN water surface. Water-depth hydrographs were then computed by subtracting tree island elevations from the EDEN water stage. Overall, the model is reliable, with a root mean square error (RMSE) of 3.31 cm. By region, the RMSE is 2.49 cm in WCA 3A and 7.77 cm in WCA 3B. This new landscape-scale hydrological model has wide applications for ongoing research and management efforts that are vital to restoration of the Florida Everglades. The accurate, high-resolution hydrological data generated over broad spatial and temporal scales by the EDEN model provide a previously missing key to understanding the habitat requirements and linkages among native and invasive populations, including fish, wildlife, wading birds, and plants. The EDEN model is a powerful tool that could be adapted for other ecosystem-scale restoration and management programs worldwide.
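The post-processing arithmetic described above is simple enough to sketch directly; the numbers below are made up and serve only to show the three calculations (validation RMSE, site elevation from one depth measurement, and the resulting depth hydrograph).

```python
import numpy as np

# Validation: modeled vs. field-observed water stage (cm above datum, hypothetical values)
stage_model    = np.array([212.0, 215.5, 219.0, 224.0])
stage_observed = np.array([210.5, 214.0, 221.0, 226.5])
rmse = np.sqrt(np.mean((stage_model - stage_observed) ** 2))
print(f"RMSE = {rmse:.2f} cm")

# Site-specific ground elevation from one field visit
water_surface_on_visit = 218.0   # EDEN modeled water surface at the site that day (cm)
observed_depth         = 35.0    # measured water depth at the tree island (cm)
ground_elevation = water_surface_on_visit - observed_depth

# Daily water-depth hydrograph for the tree island
daily_stage = np.array([212.0, 215.5, 219.0, 224.0])   # EDEN daily water stage (cm)
depth_hydrograph = daily_stage - ground_elevation
print(depth_hydrograph)
```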
Abstract:
The traditional law of leases imposed no duty on landlords to mitigate damages in the event of tenant breach, whereas the modern law of leases does. An economic model of leases, in which absentee tenants may or may not intend to breach, shows that the traditional rule promotes tenant investment in the property by discouraging landlord entry. In contrast, the modern rule prevents the property from being left idle by encouraging landlords to enter and re-let abandoned property. The model reflects the historic use of the traditional rule for agricultural leases, where absentee use was valuable, and the emergence of the modern rule for residential leases, where the primary use entails continuous occupation.
Abstract:
This paper considers the contracting approach to central banking in the context of a simple common agency model. The recent literature on optimal contracts suggests that the political principal of the central bank can design incentive schemes that remedy time-inconsistency problems in monetary policy. The effectiveness of such contracts, however, requires a central banker who attaches a positive weight to the incentive scheme. As a result, delegating monetary policy under such circumstances raises the possibility that the central banker may respond to incentive schemes offered by other potential principals. We introduce common agency considerations into the design of optimal central banker contracts. We consider two principals - society (the government) and an interest group whose objectives conflict with society's - and we examine under what circumstances the government-offered or the interest-group-offered contract dominates. Our results depend largely on the type of bias that the interest group contract incorporates. In particular, when the interest group contract incorporates an inflationary bias, the outcome depends on the principals' relative concern about the costs of the incentive schemes. When the interest group contract incorporates an expansionary bias, however, it always dominates the government contract. A corollary of our results is that central banker contracts aiming to remove the expansionary bias of policymakers should be written explicitly in terms of the perceived bias.
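For orientation, a minimal sketch of the standard linear inflation contract that this literature builds on (a textbook Walsh-type benchmark under the usual Barro-Gordon loss function, not the common agency model of this paper):

```latex
% Benchmark only; the notation is assumed, not taken from the paper.
\[
  L=\tfrac{1}{2}\bigl[\pi^{2}+\lambda\,(y-y^{*})^{2}\bigr],\qquad
  y=\bar{y}+(\pi-\pi^{e}),\qquad y^{*}>\bar{y}.
\]
% Under discretion the rational-expectations equilibrium carries an inflationary bias,
\[
  \pi^{\mathrm{disc}}=\lambda\,(y^{*}-\bar{y}),
\]
% whereas a transfer scheme linear in inflation,
\[
  T(\pi)=t_{0}-\lambda\,(y^{*}-\bar{y})\,\pi,
\]
% added to the central banker's objective removes the bias, yielding \(\pi=0\) in equilibrium.
```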
Abstract:
A single formula assigns a continuous utility function to every representable preference relation.
Abstract:
This study of the wholesale electricity market compares the cost-minimizing performance of the auction mechanism currently in place in U.S. markets with the performance of a proposed replacement. The current mechanism chooses an allocation of contracts that minimizes a fictional cost calculated using pay-as-offer pricing. Then suppliers are paid the market clearing price. The proposed mechanism uses the market clearing price in the allocation phase as well as in the payment phase. In concentrated markets, the proposed mechanism outperforms the current mechanism even when strategic behavior by suppliers is taken into account. The advantage of the proposed mechanism increases with increased price competition.
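A stylized sketch of the contrast between the two scoring rules is given below. It assumes indivisible block offers purely so that the two criteria visibly disagree; real ISO auctions allow partial acceptance and richer offer structures, and the numbers are invented.

```python
from itertools import combinations

offers = {"s1": (50, 10.0), "s2": (50, 80.0), "s3": (100, 50.0)}  # name: (MW, $/MWh offer)
demand = 100

candidates = []
for r in range(1, len(offers) + 1):
    for names in combinations(offers, r):
        if sum(offers[n][0] for n in names) != demand:
            continue  # keep only allocations that exactly cover demand
        as_offered_cost = sum(q * p for q, p in (offers[n] for n in names))  # "fictional" cost
        clearing_price = max(offers[n][1] for n in names)                    # uniform price paid
        payment_at_clearing = clearing_price * demand
        candidates.append((names, as_offered_cost, payment_at_clearing))

current = min(candidates, key=lambda c: c[1])   # allocation minimizing the pay-as-offer cost
proposed = min(candidates, key=lambda c: c[2])  # allocation minimizing payment at the clearing price
print("current mechanism selects :", current)
print("proposed mechanism selects:", proposed)
```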
Abstract:
Two forms of continuity are defined for Pareto representations of preferences. They are designated continuity and coordinate continuity. Characterizations are given of those Pareto representable preferences that are continuously representable and, in dimension two, of those that are coordinate-continuously representable.
Abstract:
Standard methods for testing safety data are needed to ensure the safe conduct of clinical trials. In particular, objective rules for reliably identifying unsafe treatments need to be put into place to help protect patients from unnecessary harm. Data monitoring committees (DMCs) are uniquely qualified to evaluate accumulating unblinded data and make recommendations about the continuing safe conduct of a trial. However, it is the trial leadership who must make the tough ethical decision about stopping a trial, and they could benefit from objective statistical rules that help them judge the strength of evidence contained in the blinded data. We design early stopping rules for harm that act as continuous safety screens for randomized controlled clinical trials with blinded treatment information, which could be used by anyone, including trial investigators (and trial leadership). A Bayesian framework, with emphasis on the likelihood function, is used to allow for continuous monitoring without adjusting for multiple comparisons. Close collaboration between the statistician and the clinical investigators will be needed in order to design safety screens with good operating characteristics. Though the math underlying this procedure may be computationally intensive, implementation of the statistical rules will be easy, and the continuous screening provided will give suitably early warning should real problems emerge. Trial investigators and trial leadership need these safety screens to help them effectively monitor the ongoing safe conduct of clinical trials with blinded data.
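As a generic illustration (not the authors' specific rule), the sketch below implements a simple Bayesian screen on pooled, blinded adverse-event counts: a Beta prior on the pooled event rate is updated after each patient, and a flag is raised when the posterior probability that the rate exceeds an acceptable threshold becomes large. The prior, threshold, and trigger level are all assumptions.

```python
from scipy.stats import beta

a, b = 1.0, 1.0            # Beta prior pseudo-counts (assumed flat prior)
rate_threshold = 0.10      # highest pooled event rate considered acceptable (assumption)
prob_trigger = 0.90        # posterior probability needed to flag a concern (assumption)

events, n = 0, 0
for had_event in [0, 0, 1, 0, 1, 1, 0, 1, 1, 0, 1]:   # hypothetical blinded accrual
    n += 1
    events += had_event
    # Posterior is Beta(a + events, b + n - events); sf gives P(rate > threshold)
    post_prob = beta.sf(rate_threshold, a + events, b + n - events)
    if post_prob > prob_trigger:
        print(f"after {n} patients ({events} events): "
              f"P(rate > {rate_threshold}) = {post_prob:.2f} -> flag for review")
```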
Abstract:
Interaction effects are an important scientific interest in many areas of research. A common approach for investigating the interaction effect of two continuous covariates on a response variable is through a cross-product term in multiple linear regression. In epidemiological studies, the two-way analysis of variance (ANOVA) type of method has also been utilized to examine the interaction effect by replacing the continuous covariates with their discretized levels. However, the implications of the model assumptions of either approach have not been examined, and the statistical validation has only focused on the general method, not specifically on the interaction effect. In this dissertation, we investigated the validity of both approaches based on the mathematical assumptions for non-skewed data. We showed that linear regression may not be an appropriate model when the interaction effect exists because it implies a highly skewed distribution for the response variable. We also showed that the normality and constant variance assumptions required by ANOVA are not satisfied in the model where the continuous covariates are replaced with their discretized levels. Therefore, naïve application of the ANOVA method may lead to an incorrect conclusion. Given the problems identified above, we proposed a novel method, modified from the traditional ANOVA approach, to rigorously evaluate the interaction effect. The analytical expression of the interaction effect was derived based on the conditional distribution of the response variable given the discretized continuous covariates. A testing procedure that combines the p-values from each level of the discretized covariates was developed to test the overall significance of the interaction effect. According to the simulation study, the proposed method is more powerful than least squares regression and the ANOVA method in detecting the interaction effect when the data come from a trivariate normal distribution. The proposed method was applied to a dataset from the National Institute of Neurological Disorders and Stroke (NINDS) tissue plasminogen activator (t-PA) stroke trial, and a baseline age-by-weight interaction effect was found to be significant in predicting the change from baseline in NIHSS at Month 3 among patients who received t-PA therapy.
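For concreteness, the two conventional approaches contrasted above can be sketched on simulated data (not the NINDS trial data): a cross-product term in a linear regression, and a two-way ANOVA after discretizing the continuous covariates into levels (tertiles here, an arbitrary choice).

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(1)
n = 300
x1 = rng.normal(size=n)                    # e.g. age (standardized)
x2 = rng.normal(size=n)                    # e.g. weight (standardized)
y = 1.0 + 0.5 * x1 + 0.3 * x2 + 0.4 * x1 * x2 + rng.normal(size=n)
df = pd.DataFrame({"y": y, "x1": x1, "x2": x2})

# (1) Cross-product term in multiple linear regression
reg = smf.ols("y ~ x1 * x2", data=df).fit()
print(reg.pvalues["x1:x2"])                # p-value for the interaction coefficient

# (2) Two-way ANOVA on discretized covariates
df["g1"] = pd.qcut(df["x1"], 3, labels=["low", "mid", "high"])
df["g2"] = pd.qcut(df["x2"], 3, labels=["low", "mid", "high"])
aov = anova_lm(smf.ols("y ~ C(g1) * C(g2)", data=df).fit(), typ=2)
print(aov.loc["C(g1):C(g2)", "PR(>F)"])    # interaction p-value from the ANOVA table
```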
Abstract:
Mixture modeling is commonly used to model categorical latent variables that represent subpopulations in which population membership is unknown but can be inferred from the data. In recent years, finite mixture models have been applied to time-to-event data. However, the commonly used survival mixture model assumes that the effects of the covariates involved in failure times differ across latent classes, while the covariate distribution is homogeneous. The aim of this dissertation is to develop a method to examine time-to-event data in the presence of unobserved heterogeneity within a mixture-modeling framework. A joint model is developed to incorporate the latent survival trajectory along with the observed information for the joint analysis of a time-to-event variable, its discrete and continuous covariates, and a latent class variable. It is assumed that the effects of covariates on survival times and the distribution of covariates vary across different latent classes. The unobservable survival trajectories are identified by estimating the probability that a subject belongs to a particular class based on the observed information. We applied this method to a Hodgkin lymphoma study with long-term follow-up and observed four distinct latent classes in terms of long-term survival and distributions of prognostic factors. Our results from simulation studies and from the Hodgkin lymphoma study demonstrate the superiority of our joint model over the conventional survival model. This flexible inference method provides more accurate estimation and accommodates unobservable heterogeneity among individuals while taking interactions between covariates into consideration.
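A stripped-down sketch of the latent-class idea is given below, assuming two classes, exponential survival, and a single normally distributed covariate (none of which reflects the authors' actual model): the posterior probability that a subject belongs to each class combines the class-specific survival-time density and covariate density via Bayes' rule.

```python
import numpy as np
from scipy.stats import norm, expon

# Assumed class-specific parameters (illustrative only)
class_weight = np.array([0.6, 0.4])        # prior class proportions
hazard       = np.array([0.05, 0.25])      # exponential event hazard in each class
cov_mean     = np.array([55.0, 70.0])      # covariate (e.g. age) mean in each class
cov_sd       = np.array([8.0, 8.0])

def class_posterior(time, covariate):
    """P(class | observed survival time and covariate) under the toy joint model."""
    lik = (class_weight
           * expon.pdf(time, scale=1.0 / hazard)        # class-specific time-to-event density
           * norm.pdf(covariate, cov_mean, cov_sd))     # class-specific covariate density
    return lik / lik.sum()

print(class_posterior(time=30.0, covariate=72.0))   # posterior over the two classes
print(class_posterior(time=30.0, covariate=50.0))
```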
Abstract:
This article presents a case study of a nonprofit child welfare agency that delivered family preservation services under three different purchase-of-service (POS) contracts. The research specifically focuses on how certain POS contract provisions and reimbursement rates influence the delivery of family preservation services. The three contracts examined differed on criteria such as reimbursement mechanism, service volume, definition of clientele, and reimbursement rate. The study found that as reimbursement rates declined and administrative costs increased, the service provider struggled with cash flow, staffing, fundraising, and service provision, among other things. It is concluded that contract-related resources, policies, and procedures affect provider agencies in multiple, significant ways that are critical to the provision of services and the accomplishment of positive client outcomes.
Abstract:
In this dissertation, we propose a continuous-time Markov chain model to examine longitudinal data with three categories in the outcome variable. The advantage of this model is that it permits a different number of measurements for each subject and allows the duration between two consecutive measurement time points to be irregular. Using the maximum likelihood principle, we can estimate the transition probability between two time points. By using the information provided by the independent variables, this model can also estimate the transition probability for each subject. The Monte Carlo simulation method will be used to investigate the goodness of the model fit compared with that obtained from other models. A public health example will be used to demonstrate the application of this method.
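The central computation in such a model can be sketched briefly: given a generator (intensity) matrix Q, the transition probability matrix over an arbitrary, possibly irregular interval t is the matrix exponential exp(Qt). The Q below is purely illustrative.

```python
import numpy as np
from scipy.linalg import expm

# Generator (intensity) matrix for three outcome categories:
# off-diagonal entries are transition rates, rows sum to zero.
Q = np.array([[-0.30,  0.20,  0.10],
              [ 0.05, -0.15,  0.10],
              [ 0.02,  0.08, -0.10]])

for t in (0.5, 2.0, 7.3):                     # irregular gaps between measurements
    P_t = expm(Q * t)                         # row i gives P(state j at time t | state i at time 0)
    print(f"t = {t}:\n{P_t.round(3)}\n")
```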