871 results for Optimal Redundancy
Abstract:
This paper clarifies the relationship between an injurer's wealth level and his care choice by highlighting the distinction between monetary and non-monetary care. When care is non-monetary, wealth-constrained injurers generally take less than optimal care, and care is increasing in their wealth level under both strict liability and negligence. In contrast, when care is monetary, injurers may take too much or too little care under strict liability, and care is not strictly increasing in injurer wealth. Under negligence, the relationship between injurer care and wealth is similar in the two formulations. However, when litigation costs are added to the model, the relationship between injurer care and wealth becomes non-monotonic under both liability rules.
Abstract:
This paper considers the contracting approach to central banking in the context of a simple common agency model. The recent literature on optimal contracts suggests that the political principal of the central bank can design incentive schemes that remedy time-inconsistency problems in monetary policy. The effectiveness of such contracts, however, requires a central banker who attaches a positive weight to the incentive scheme. Delegating monetary policy under such circumstances therefore raises the possibility that the central banker may respond to incentive schemes offered by other potential principals. We introduce common agency considerations into the design of optimal central banker contracts, with two principals: society (the government) and an interest group whose objectives conflict with society's. We examine under what circumstances the government-offered or the interest-group-offered contract dominates. Our results depend largely on the type of bias that the interest group contract incorporates. In particular, when the interest group contract incorporates an inflationary bias, the outcome depends on the principals' relative concern about the costs of the incentive schemes; when it incorporates an expansionary bias, however, it always dominates the government contract. A corollary of our results is that central banker contracts aiming to remove the expansionary bias of policymakers should be written explicitly in terms of the perceived bias.
Abstract:
No abstract available.
Abstract:
This paper proposes asymptotically optimal tests for an unstable parameter process under the realistic circumstance that the researcher has little information about the unstable parameter process and the error distribution, and gives conditions under which knowledge of those processes provides no asymptotic power gain. I first derive a test under a known error distribution that, under suitable conditions, is asymptotically equivalent to LR tests for correctly identified unstable parameter processes. The conditions are weak enough to cover a wide range of unstable processes, such as various types of structural breaks and time-varying parameter processes. The test is then extended to semiparametric models in which the underlying distribution is unknown and treated as an infinite-dimensional nuisance parameter. The semiparametric test is adaptive in the sense that its asymptotic power function is equivalent to the power envelope under a known error distribution.
Abstract:
This paper examines four equivalent methods of optimal monetary policymaking: committing to the social loss function, using discretion with the central bank's long-run loss function, using discretion with its short-run loss function, and following monetary policy rules. All lead to optimal economic performance. The same performance emerges from these different policymaking methods because the central bank in effect follows the same (or similar) policy rules. These objectives (the social loss function and the central bank's long-run and short-run loss functions) together with the monetary policy rules imply a complete regime for optimal policymaking. The central bank long-run and short-run loss functions that produce the optimal policy with discretion differ from the social loss function. Moreover, the optimal policy rule emerges from the optimization of these different central bank loss functions.
Abstract:
Various theories have been put forward to explain the fact that humans experience menopause while virtually no animals do. This paper investigates one such theory: children provide a savings technology into old age, but because human babies are usually large and have long gestation periods, the mother faces a substantial risk of death each time she bears a child. It therefore seems appropriate to impose a stopping rule on fertility. Given an objective (support for old age) and demographics (mortality of mother and children), an optimal age for menopause can be calculated. Using demographic data from populations that have seen little influence from modern medicine, this optimal age is compared to empirical evidence.
Abstract:
When conducting a randomized comparative clinical trial, ethical, scientific, or economic considerations often motivate the use of interim decision rules after successive groups of patients have been treated. These decisions may pertain to the comparative efficacy or safety of the treatments under study, cost considerations, the desire to accelerate the drug evaluation process, or the likelihood of therapeutic benefit for future patients. At the time of each interim decision, an important question is whether patient enrollment should continue or be terminated, either because of a high probability that one treatment is superior to the other or a low probability that the experimental treatment will ultimately prove superior. The use of frequentist group sequential decision rules has become routine in the conduct of phase III clinical trials. In this dissertation, we present a new Bayesian decision-theoretic approach to the problem of designing a randomized group sequential clinical trial, focusing on two-arm trials with time-to-failure outcomes. Forward simulation is used to obtain optimal decision boundaries for each of a set of possible models. At each interim analysis, we use Bayesian model selection to adaptively choose the model having the largest posterior probability of being correct, and we then make the interim decision based on the boundaries that are optimal under the chosen model. We provide a simulation study comparing this method, which we call Bayesian Doubly Optimal Group Sequential (BDOGS), to corresponding frequentist designs using either O'Brien-Fleming (OF) or Pocock boundaries, as obtained from EaSt 2000. Our simulation results show that, over a wide variety of cases, BDOGS either performs at least as well as both OF and Pocock or on average provides a much smaller trial.
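The model-selection step described in this abstract, choosing the candidate model with the largest posterior probability at each interim analysis, can be sketched in a few lines. The function name and the log marginal likelihood values below are illustrative assumptions, not the dissertation's actual implementation:

```python
import math

def posterior_model_probs(log_marginal_liks, priors=None):
    """Posterior probability of each candidate model, given its log
    marginal likelihood at the current interim analysis.
    Equal prior model probabilities are assumed by default."""
    n = len(log_marginal_liks)
    priors = priors or [1.0 / n] * n
    # Work in log space and use log-sum-exp for numerical stability.
    logs = [l + math.log(p) for l, p in zip(log_marginal_liks, priors)]
    m = max(logs)
    weights = [math.exp(x - m) for x in logs]
    total = sum(weights)
    return [w / total for w in weights]

# Hypothetical log marginal likelihoods for three candidate models;
# the interim decision then uses the boundaries that are optimal
# under the selected (highest-posterior) model.
probs = posterior_model_probs([-120.3, -118.7, -125.0])
best = max(range(len(probs)), key=probs.__getitem__)
```

In the actual BDOGS design the boundaries themselves come from forward simulation; this sketch covers only the adaptive model-choice step.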
Abstract:
Genital warts are a sexually transmitted disease with high prevalence in the U.S. Imiquimod 5% cream is a self-applied treatment, prescribed three times weekly, at bedtime, for 16 weeks. Post-marketing research addressed questions about imiquimod dosing frequency. MEDLINE, Embase, and the Cochrane Library were searched for randomized trials of the efficacy and safety of imiquimod 5% cream under either three-times-weekly or once-daily regimens, to systematically review the treatment options. Efficacy was evaluated by complete clearance of warts at the end of treatment, and safety by the frequency of adverse events and of at least one rest period taken from treatment. Six studies were selected for the analysis, including circumcised men, uncircumcised men, and women. Compared to the three-times-weekly regimen, once-daily dosing did not improve efficacy but resulted in an increased incidence of local skin reactions and of rest periods taken from treatment. The optimal regimen is three times weekly.
Abstract:
Radiomics is the high-throughput extraction and analysis of quantitative image features. For non-small cell lung cancer (NSCLC) patients, radiomics can be applied to standard-of-care computed tomography (CT) images to improve tumor diagnosis, staging, and response assessment. The first objective of this work was to show that CT image features extracted from pre-treatment NSCLC tumors could be used to predict tumor shrinkage in response to therapy. This is important because tumor shrinkage is a cancer treatment endpoint correlated with the probability of disease progression and with overall survival; accurate prediction of tumor shrinkage could also lead to individually customized treatment plans. To accomplish this objective, 64 stage NSCLC patients with similar treatments were all imaged using the same CT scanner and protocol. Quantitative image features were extracted, and principal component regression with simulated annealing subset selection was used to predict shrinkage. Cross-validation and permutation tests were used to validate the results. The optimal model gave a strong correlation between the observed and predicted shrinkages. The second objective of this work was to identify sets of NSCLC CT image features that are reproducible, non-redundant, and informative across multiple machines. Feature sets with these qualities are needed for NSCLC radiomics models to be robust to machine variation and spurious correlation. To accomplish this objective, test-retest CT image pairs were obtained from 56 NSCLC patients imaged on three CT machines at two institutions. For each machine, quantitative image features with concordance correlation coefficient values greater than 0.90 were considered reproducible. Multi-machine reproducible feature sets were created by taking the intersection of the individual machine reproducible feature sets. Redundant features were removed through hierarchical clustering.
The findings showed that image feature reproducibility and redundancy depended on both the CT machine and the CT image type (average cine 4D-CT imaging vs. end-exhale cine 4D-CT imaging vs. helical inspiratory breath-hold 3D CT). For each image type, a set of cross-machine reproducible, non-redundant, and informative image features was identified. Compared to end-exhale 4D-CT and breath-hold 3D-CT, average 4D-CT derived image features showed superior multi-machine reproducibility and are the best candidates for clinical correlation.
Abstract:
Background: For most cytotoxic and biologic anti-cancer agents, the response rate of the drug is commonly assumed to be non-decreasing with increasing dose. However, an increasing dose does not always result in an appreciable increase in the response rate; this may be especially true at high doses for a biologic agent. Therefore, in a phase II trial the investigators may be interested in testing the anti-tumor activity of a drug at more than one dose (often two), instead of only at the maximum tolerated dose (MTD). This way, when the lower dose appears equally effective, that dose can be recommended for further confirmatory testing in a phase III trial in view of potential long-term toxicity and cost considerations. A common approach to designing such a phase II trial has been to use an independent (e.g., Simon's two-stage) design at each dose, ignoring the prior knowledge about the ordering of the response probabilities at the different doses. However, failure to account for this ordering constraint in estimating the response probabilities may result in an inefficient design. In this dissertation, we developed extensions of Simon's optimal and minimax two-stage designs, including both frequentist and Bayesian methods, for two doses with ordered response rates.
Methods: Optimal and minimax two-stage designs are proposed for phase II clinical trials in settings where the true response rates at two dose levels are ordered. We borrow strength between doses using isotonic regression and control the joint and/or marginal error probabilities. Bayesian two-stage designs are also proposed under a stochastic ordering constraint.
Results: Compared to Simon's designs, when controlling the power and type I error at the same levels, the proposed frequentist and Bayesian designs reduce the maximum and expected sample sizes. Most of the proposed designs also increase the probability of early termination when the true response rates are poor.
Conclusion: The proposed frequentist and Bayesian designs are superior to Simon's designs in terms of operating characteristics (expected sample size and probability of early termination when the response rates are poor). Thus, the proposed designs lead to more cost-efficient and ethical trials and may consequently improve and expedite the drug discovery process. The proposed designs may be extended to multiple-group trials and drug combination trials.
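For two ordered doses, the isotonic-regression step used to borrow strength between doses reduces to the pooled-adjacent-violators rule: if the observed rates violate the ordering, both doses are pooled. This is a minimal sketch under that two-dose assumption; the function name and interface are illustrative, not the dissertation's code:

```python
def isotonic_two_doses(x_low, n_low, x_high, n_high):
    """Pooled-adjacent-violators estimate of the response rates at a
    low and a high dose under the constraint p_low <= p_high.
    x_*: observed responders, n_*: patients treated at that dose."""
    p_low, p_high = x_low / n_low, x_high / n_high
    if p_low <= p_high:
        # Ordering already satisfied: raw estimates are isotonic.
        return p_low, p_high
    # Violation: replace both estimates by the pooled rate, which is
    # the sample-size-weighted average of the two raw rates.
    pooled = (x_low + x_high) / (n_low + n_high)
    return pooled, pooled

# Example: 8/20 responders at the low dose but only 5/20 at the high
# dose violates the ordering, so both estimates pool to 13/40.
lo, hi = isotonic_two_doses(8, 20, 5, 20)
```

Using the pooled estimates rather than the raw rates is what lets the two-dose design share information and shrink the required sample sizes relative to running two independent Simon designs.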
Abstract:
We analyze the effect of environmental uncertainties on optimal fishery management in a bio-economic fishery model. Unlike most of the literature on resource economics, but in line with ecological models, we allow the different biological processes of survival and recruitment to be affected differently by environmental uncertainties. We show that the overall effect of uncertainty on the optimal size of a fish stock is ambiguous, depending on the prudence of the value function. For the case of a risk-neutral fishery manager, the overall effect depends on the relative magnitude of two opposing effects, the 'convex-cost effect' and the 'gambling effect'. We apply the analysis to the Baltic cod and the North Sea herring fisheries, concluding that for risk-neutral agents the net effect of environmental uncertainties on the optimal size of these fish stocks is negative, albeit small in absolute value. Under risk aversion, the effect on optimal stock size is positive for sufficiently high coefficients of constant relative risk aversion.
Abstract:
A recent study by Rozvany and Sokól discussed an important topic in structural design: the allowance for support costs in the optimization process. This paper examines a frequently used kind of support, a simple foundation whose horizontal reaction is provided by friction, which appears not to be covered by the authors' approach. A simple example is examined to illustrate the case and to apply both the authors' method and the standard design method.