Abstract:
Quantum states can be used to encode the information contained in a direction, i.e., in a unit vector. We present the best encoding procedure when the quantum state is made up of N spins (qubits). We find that the quality of this optimal procedure, which we quantify in terms of the fidelity, depends solely on the dimension of the encoding space. We also investigate the use of spatial rotations on a quantum state, which provide a natural and less demanding encoding. In this case we prove that the fidelity is directly related to the largest zeros of the Legendre and Jacobi polynomials. We also discuss our results in terms of the information gain.
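The closed-form link to orthogonal polynomials is easy to probe numerically. The sketch below assumes, purely for illustration, that the rotation-encoding fidelity takes the form F = (1 + x_max)/2 with x_max the largest zero of the Legendre polynomial of degree N + 1; the paper's exact expression may differ, but computing the largest zeros themselves is standard.

```python
# Largest zeros of Legendre polynomials, via scipy's Gauss-Legendre
# nodes (the nodes of P_n are exactly the zeros of P_n).
from scipy.special import roots_legendre

for n_spins in (1, 2, 4, 8, 16, 32):
    zeros, _ = roots_legendre(n_spins + 1)   # zeros of P_{N+1}
    x_max = zeros.max()
    fidelity = (1.0 + x_max) / 2.0           # assumed illustrative form
    print(f"N = {n_spins:2d}  largest zero = {x_max:.6f}  F = {fidelity:.6f}")
```

As N grows the largest zero approaches 1, so the fidelity in this illustrative form tends to its ideal value, consistent with the qualitative claim of the abstract.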
Abstract:
Executive Summary The unifying theme of this thesis is the pursuit of satisfactory ways to quantify the risk-reward trade-off in financial economics: first in the context of a general asset pricing model, then across models, and finally across country borders. The guiding principle in that pursuit was to seek innovative solutions by combining ideas from different fields in economics and broader scientific research. For example, in the first part of this thesis we sought a fruitful application of strong existence results in utility theory to topics in asset pricing. In the second part we apply an idea from the field of fuzzy set theory to the optimal portfolio selection problem, while the third part of this thesis is, to the best of our knowledge, the first empirical application of some general results on asset pricing in incomplete markets to the important topic of measuring financial integration. While the first two parts of this thesis effectively combine well-known ways to quantify risk-reward trade-offs, the third can be viewed as an empirical verification of the usefulness of the so-called "good deal bounds" theory in designing risk-sensitive pricing bounds. Chapter 1 develops a discrete-time asset pricing model based on a novel ordinally equivalent representation of recursive utility. To the best of our knowledge, we are the first to use a member of a novel class of recursive utility generators to construct a representative-agent model that addresses some long-standing issues in asset pricing. Applying strong representation results allows us to show that the model features countercyclical risk premia, for both consumption and financial risk, together with a low and procyclical risk-free rate. As the recursive utility used nests the well-known time-state separable utility as a special case, all results nest the corresponding ones from the standard model and thus shed light on its well-known shortcomings. The empirical investigation undertaken to support these theoretical results, however, showed that as long as one resorts to econometric methods based on approximating conditional moments with unconditional ones, it is not possible to distinguish the model we propose from the standard one. Chapter 2 is joint work with Sergei Sontchik. There we provide theoretical and empirical motivation for the aggregation of performance measures. The main idea is that, just as it makes sense to apply several performance measures ex post, it also makes sense to base optimal portfolio selection on ex-ante maximization of as many performance measures as desired. We thus offer a concrete algorithm for optimal portfolio selection via ex-ante optimization, over different horizons, of several risk-return trade-offs simultaneously. An empirical application of that algorithm, using seven popular performance measures, suggests that the realized returns feature better distributional characteristics than the realized returns from portfolio strategies that are optimal with respect to a single performance measure. When comparing the distributions of realized returns we used two partial risk-reward orderings: first- and second-order stochastic dominance.
We first used the Kolmogorov-Smirnov test to determine whether the two distributions are indeed different, which, combined with a visual inspection, allowed us to demonstrate that the way we propose to aggregate performance measures leads to portfolio realized returns that first-order stochastically dominate the ones that result from optimization with respect to a single measure such as the Treynor ratio or Jensen's alpha. We checked for second-order stochastic dominance via pointwise comparison of the so-called absolute Lorenz curve, i.e., the sequence of expected shortfalls over a range of quantiles. Since the plot of the absolute Lorenz curve for the aggregated performance measures was above the one corresponding to each individual measure, we were led to conclude that the algorithm we propose leads to a portfolio returns distribution that second-order stochastically dominates those obtained from virtually all individual performance measures considered. Chapter 3 proposes a measure of financial integration based on recent advances in asset pricing in incomplete markets. Given a base market (a set of traded assets) and an index of another market, we propose to measure financial integration through time by the size of the spread between the pricing bounds of the market index, relative to the base market. The bigger the spread around country index A, viewed from market B, the less integrated markets A and B are. We investigate the presence of structural breaks in the size of the spread for EMU member-country indices before and after the introduction of the Euro. We find evidence that both the level and the volatility of our financial integration measure increased after the introduction of the Euro. That counterintuitive result suggests the presence of an inherent weakness in any attempt to measure financial integration independently of economic fundamentals. Nevertheless, the results about the bounds on the risk-free rate appear plausible from the viewpoint of existing economic theory about the impact of integration on interest rates.
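A minimal sketch of the two dominance checks described above, using synthetic return series (the helper `absolute_lorenz` and all data are illustrative, not the thesis code): a two-sample Kolmogorov-Smirnov test to confirm the distributions differ, followed by a pointwise comparison of empirical absolute Lorenz curves, i.e., cumulative expected shortfalls over a grid of quantiles.

```python
import numpy as np
from scipy.stats import ks_2samp

def absolute_lorenz(returns, quantiles):
    """Empirical absolute Lorenz curve: L(q) = q * (mean of the worst
    q-fraction of returns), the cumulative expected shortfall."""
    x = np.sort(returns)
    n = len(x)
    return np.array([q * x[:max(1, int(np.ceil(q * n)))].mean()
                     for q in quantiles])

rng = np.random.default_rng(0)
aggregated = rng.normal(0.008, 0.04, 1000)  # stand-in: aggregated-measure strategy
single = rng.normal(0.005, 0.05, 1000)      # stand-in: single-measure strategy

# Step 1: are the two realized-return distributions different at all?
stat, pval = ks_2samp(aggregated, single)
print(f"KS statistic = {stat:.3f}, p-value = {pval:.4f}")

# Step 2: second-order stochastic dominance holds (empirically) if one
# strategy's absolute Lorenz curve lies above the other's at every quantile.
qs = np.linspace(0.01, 1.0, 100)
dominates = np.all(absolute_lorenz(aggregated, qs) >= absolute_lorenz(single, qs))
print("aggregated SSD-dominates single:", dominates)
```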
Abstract:
The correct use of closed field chambers to determine N2O emissions requires defining the time of day that best represents the daily mean N2O flux. A short-term field experiment was carried out on a Mollisol on which annual crops were grown under no-till management in the Pampa Ondulada of Argentina. N2O emission rates were measured every 3 h for three consecutive days. Fluxes ranged from 62.58 to 145.99 μg N-N2O m⁻² h⁻¹ (average of five field chambers) and were negatively related (R² = 0.34, p < 0.01) to topsoil temperature (14-20 °C). N2O emission rates measured between 9:00 and 12:00 were strongly related to the daily mean N2O flux (R² = 0.87, p < 0.01), showing that, in the study region, morning sampling is preferable for GHG measurement.
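The window-selection logic lends itself to a simple check: regress the flux measured in each candidate sampling window against the daily mean and keep the window with the highest R². The sketch below uses synthetic 3-hourly fluxes (the values, the sinusoidal diurnal pattern, and the use of ten days rather than three, for a stable fit, are all invented for illustration).

```python
import numpy as np
from scipy.stats import linregress

# Synthetic fluxes (ug N-N2O m-2 h-1): rows = days, columns = the eight
# 3-hourly sampling times 0:00, 3:00, ..., 21:00, with a diurnal cycle.
rng = np.random.default_rng(1)
hours = np.arange(0, 24, 3)
fluxes = (100 + 30 * np.sin(2 * np.pi * (hours - 6) / 24)
          + rng.normal(0, 8, size=(10, hours.size)))

daily_mean = fluxes.mean(axis=1)
for i, hour in enumerate(hours):
    # How well does the flux at this sampling time track the daily mean?
    fit = linregress(fluxes[:, i], daily_mean)
    print(f"{hour:02d}:00  R^2 = {fit.rvalue ** 2:.2f}")
```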
Abstract:
The problem of searchability in decentralized complex networks is of great importance in computer science, economics, and sociology. We present a formalism that is able to cope simultaneously with the problem of search and with the congestion effects that arise when parallel searches are performed, and we obtain expressions for the average search cost both in the presence and in the absence of congestion. This formalism is used to obtain optimal network structures for a system using a local search algorithm. It is found that only two classes of networks can be optimal: starlike configurations, when the number of parallel searches is small, and homogeneous-isotropic configurations, when it is large.
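To make the contrast concrete, the following sketch measures the average cost of a blind local search (a random walk that only sees its neighbours, standing in for the paper's local search algorithm, with congestion ignored) on the two candidate topologies: a star and a homogeneous random regular graph. Graph sizes and the search rule are illustrative choices, not the paper's formalism.

```python
import random
import networkx as nx

def avg_search_cost(G, trials=2000, seed=0):
    """Average number of hops for a blind local search (random walk)
    to reach a randomly chosen target node from a random source."""
    rng = random.Random(seed)
    nodes = list(G.nodes)
    total = 0
    for _ in range(trials):
        src, dst = rng.sample(nodes, 2)
        current, hops = src, 0
        while current != dst:
            current = rng.choice(list(G.neighbors(current)))
            hops += 1
        total += hops
    return total / trials

n = 50
star = nx.star_graph(n - 1)                      # central hub + n-1 leaves
regular = nx.random_regular_graph(4, n, seed=0)  # homogeneous, isotropic

print(f"star-like:   {avg_search_cost(star):.1f} hops on average")
print(f"homogeneous: {avg_search_cost(regular):.1f} hops on average")
```

The paper's result concerns which topology minimizes cost once congestion from parallel searches is included; this sketch only measures the congestion-free cost for one simple search rule.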
Abstract:
BACKGROUND AND OBJECTIVES: The SBP values to be achieved by antihypertensive therapy in order to maximize the reduction of cardiovascular outcomes are unknown; nor is it clear whether, in patients with a previous cardiovascular event, the optimal values are lower than in low-to-moderate-risk hypertensive patients, or whether a more cautious blood pressure (BP) reduction should be obtained. Because of the uncertainty over whether 'the lower the better' or the 'J-curve' hypothesis is correct, the European Society of Hypertension and the Chinese Hypertension League have promoted a randomized trial comparing antihypertensive treatment strategies aiming at three different SBP targets in hypertensive patients with a recent stroke or transient ischaemic attack. As the optimal level of low-density lipoprotein cholesterol (LDL-C) is also unknown in these patients, LDL-C lowering has been included in the design. PROTOCOL DESIGN: The European Society of Hypertension-Chinese Hypertension League Stroke in Hypertension Optimal Treatment trial is a prospective, multinational, randomized trial with a 3 × 2 factorial design comparing three different SBP targets (1, <145-135; 2, <135-125; 3, <125 mmHg) and two different LDL-C targets (target A, 2.8-1.8; target B, <1.8 mmol/l). The trial is to be conducted on 7500 patients aged at least 65 years (2500 in Europe, 5000 in China) with hypertension and a stroke or transient ischaemic attack 1-6 months before randomization. Antihypertensive and statin treatments will be initiated or modified using suitable registered agents chosen by the investigators, in order to maintain patients within the randomized SBP and LDL-C windows. All patients will be followed up every 3 months for BP and every 6 months for LDL-C. Ambulatory BP will be measured yearly. OUTCOMES: The primary outcome is time to stroke (fatal and non-fatal). Important secondary outcomes are time to first major cardiovascular event, cognitive decline (Montreal Cognitive Assessment), and dementia. All major outcomes will be adjudicated by committees blind to randomized allocation. A Data and Safety Monitoring Board has open access to the data and can recommend trial interruption for safety. SAMPLE SIZE CALCULATION: It has been calculated that 925 patients would reach the primary outcome after a mean 4-year follow-up, and this should provide at least 80% power to detect a 25% difference in stroke incidence between SBP targets and a 20% difference between LDL-C targets.
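For readers who want to reproduce the flavour of the sample-size statement, the sketch below applies Schoenfeld's classic approximation for the number of events needed by a two-arm log-rank comparison to the quoted effect sizes (risk reductions of 25% and 20%, taken here as hazard ratios of 0.75 and 0.80). This is a generic textbook formula, not necessarily the trial's actual multi-arm calculation.

```python
from math import ceil, log
from scipy.stats import norm

def schoenfeld_events(hazard_ratio, power=0.80, alpha=0.05, p=0.5):
    """Events required to detect `hazard_ratio` with a two-sided log-rank
    test (Schoenfeld's approximation); p = allocation fraction per arm."""
    z_alpha = norm.ppf(1 - alpha / 2)
    z_beta = norm.ppf(power)
    return ceil((z_alpha + z_beta) ** 2
                / (p * (1 - p) * log(hazard_ratio) ** 2))

# A 25% risk reduction ~ HR 0.75; a 20% reduction ~ HR 0.80.
print("events for SBP contrast:  ", schoenfeld_events(0.75))
print("events for LDL-C contrast:", schoenfeld_events(0.80))
```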
Abstract:
PURPOSE: Neurophysiological monitoring aims to improve the safety of pedicle screw placement, but few quantitative studies assess its specificity and sensitivity. In this study, screw placement within the pedicle was measured on postoperative CT scans (horizontal and vertical distance from the screw edge to the surface of the pedicle) and correlated with intraoperative neurophysiological stimulation thresholds. METHODS: A single surgeon placed 68 thoracic and 136 lumbar screws in 30 consecutive patients during instrumented fusion under EMG control. The female-to-male ratio was 1.6 and the average age was 61.3 years (SD 17.7). Radiological measurements, blinded to stimulation threshold, were done on reformatted CT reconstructions using OsiriX software. A standard deviation of the screw position of 2.8 mm was determined from pilot measurements, and a screw-pedicle edge distance of 1 mm was considered the difference of interest (standardised difference of 0.35), giving the study a power of 75% (significance level 0.05). RESULTS: Correct placement and stimulation thresholds above 10 mA were found in 71% of screws. Twenty-two percent of screws caused a cortical breach; 80% of these had stimulation thresholds above 10 mA (sensitivity 20%, specificity 90%). True prediction of the correct position of the screw was more frequent for lumbar than for thoracic screws. CONCLUSION: A screw stimulation threshold of >10 mA does not indicate correct pedicle screw placement. A hypothesised gradual decrease of screw stimulation thresholds as screw placement approached the nerve root was not observed. Aside from a robust threshold of 2 mA indicating direct contact with nervous tissue, a secondary threshold appears to depend on the patient's pathology and surgical conditions.
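The reported sensitivity and specificity follow directly from the quoted proportions. The sketch below reconstructs an illustrative 2 × 2 table (counts are back-calculated from the percentages above, treating a stimulation threshold ≤ 10 mA as the 'alarm' for breach) and recomputes both metrics.

```python
import numpy as np

def screening_performance(breach, alarm):
    """Sensitivity and specificity of an intraoperative alarm
    (stimulation threshold <= 10 mA) for detecting pedicle breach."""
    breach = np.asarray(breach, dtype=bool)
    alarm = np.asarray(alarm, dtype=bool)
    tp = np.sum(breach & alarm)    # breached screws that triggered the alarm
    fn = np.sum(breach & ~alarm)   # breached screws missed (threshold > 10 mA)
    tn = np.sum(~breach & ~alarm)
    fp = np.sum(~breach & alarm)
    return tp / (tp + fn), tn / (tn + fp)

# Back-calculated counts for the 204 screws: ~22% breached (45 screws),
# 80% of which still stimulated above 10 mA and so raised no alarm.
breach = [True] * 45 + [False] * 159
alarm = [True] * 9 + [False] * 36 + [True] * 16 + [False] * 143
sens, spec = screening_performance(breach, alarm)
print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")
```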
Abstract:
T-cell receptor (TCR) affinity for self-antigen has an important role in establishing self-tolerance. Three transgenic mouse strains expressing antigens of variable affinity for the OVA-specific OT-I T-cell receptor were generated to address how TCR affinity affects the efficiency of negative selection, the ability to prime an autoimmune response, and the elimination of the relevant target cell. Mice expressing antigens with an affinity just above the negative-selection threshold exhibited the highest risk of developing experimental autoimmune diabetes. The data demonstrate that, close to the affinity threshold for negative selection, sufficient numbers of self-reactive T cells escape deletion and create an increased risk for the development of autoimmunity.
Abstract:
BACKGROUND AND PURPOSE: This study aims to determine whether perfusion computed tomography (PCT) thresholds for delineating the ischemic core and penumbra are time dependent or time independent in patients presenting with symptoms of acute stroke. METHODS: Two hundred seventeen patients were evaluated in a retrospective, multicenter study. Patients were divided into those with persistent occlusion and those with recanalization. All patients received admission PCT and follow-up imaging to determine the final ischemic core, which was then retrospectively matched to the PCT images to identify optimal thresholds for the different PCT parameters. These thresholds were assessed for significant variation with time since symptom onset. RESULTS: In the persistent occlusion group, the PCT parameters whose optimal thresholds did not change significantly with time included absolute mean transit time, relative mean transit time, relative cerebral blood flow, and relative cerebral blood volume, when time was restricted to the first 15 hours after symptom onset. The recanalization group showed no significant time variation for any PCT parameter at any time interval. In the persistent occlusion group, the optimal threshold to delineate the total ischemic area was a relative mean transit time of 180%. In patients with recanalization, the optimal parameter to predict the ischemic core was a relative cerebral blood volume of 66%. CONCLUSIONS: Within the first 15 hours after symptom onset, time does not influence the optimal PCT thresholds for relative mean transit time and relative cerebral blood volume, the optimal parameters for delineating the penumbra and the ischemic core, respectively.
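Applied to parameter maps, the reported cutoffs reduce to simple voxelwise thresholding. The sketch below combines the two cutoffs on synthetic maps (random data, an assumed voxel volume, and the simplification of mixing thresholds from the two patient groups) purely to illustrate the delineation step.

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic perfusion maps, expressed as % of the contralateral-hemisphere mean.
rel_mtt = rng.uniform(80, 300, size=(64, 64))   # relative mean transit time
rel_cbv = rng.uniform(30, 130, size=(64, 64))   # relative cerebral blood volume

ischemic = rel_mtt >= 180.0           # total ischemic area (rMTT cutoff)
core = ischemic & (rel_cbv <= 66.0)   # ischemic core (rCBV cutoff)
penumbra = ischemic & ~core           # potentially salvageable tissue

voxel_ml = 0.008  # assumed voxel volume in ml, for illustration only
print(f"core:     {core.sum() * voxel_ml:6.1f} ml")
print(f"penumbra: {penumbra.sum() * voxel_ml:6.1f} ml")
```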
Abstract:
Introduction: Accurate and reproducible tibial tunnel placement that minimizes the risk of neurovascular damage is a crucial condition for successful arthroscopic reconstruction of the posterior cruciate ligament (PCL). This step is commonly performed under fluoroscopic control. Hypothesis: Drilling the tibial tunnel under exclusive arthroscopic control allows accurate and reliable tunnel placement according to the recommendations in the literature. Materials and Methods: Between February 2007 and December 2009, 108 arthroscopic single-bundle PCL reconstructions using the tibial tunnel technique were performed. The routine postoperative radiographs were screened according to previously defined quality criteria. After critical analysis, the radiographs of 48 patients (48 knees) were enrolled in the study; 10 patients had simultaneous ACL reconstruction and 7 had PCL revision surgery. The tibial tunnel was placed under direct arthroscopic control through a posteromedial portal using a standard tibial aiming device. Key anatomical landmarks were the exposed tibial insertion of the PCL and the posterior horn of the medial meniscus. First, the centre of the posterior tibial tunnel outlet on the AP view was determined by digital analysis of the postoperative radiographs. Its distance to the medial tibial spine was measured parallel to the tibial plateau. The mediolateral position was expressed as the ratio between the distance of the tunnel outlet to the medial border and the total width of the tibial plateau. On the lateral view, the vertical tunnel position was measured perpendicular to a tangent of the medial tibial plateau. All measurements were repeated at least twice and carried out by two examiners. Results: The mean mediolateral tunnel position was 49.3 ± 4.6% (ratio), 6.7 ± 3.6 mm lateral to the medial tibial spine. On the lateral view, the tunnel centre was 10.1 ± 4.5 mm distal to the bony surface of the medial tibial plateau. Neurovascular damage was observed in none of the patients. Conclusion: The results of this radiological study confirm that exclusive arthroscopic control of tibial tunnel placement in PCL reconstruction yields reproducible and accurate results in line with the literature. The technique avoids radiation, simplifies the operating-room setup, and enables the surgeon to visualize the key anatomical landmarks for tibial tunnel placement.
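The mediolateral ratio reported above is a one-line computation once the landmarks are digitised. The helper below (its name and the coordinates are hypothetical) spells out the definition: tunnel-outlet distance from the medial border divided by total plateau width on the AP view.

```python
def mediolateral_ratio(medial_x, lateral_x, tunnel_x):
    """Tunnel outlet position as a fraction of tibial plateau width,
    measured from the medial border on the AP radiograph."""
    return (tunnel_x - medial_x) / (lateral_x - medial_x)

# Hypothetical digitised x-coordinates in mm.
print(f"{mediolateral_ratio(0.0, 80.0, 39.4):.1%}")  # ~49.3% of plateau width
```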
Abstract:
This paper provides, from a theoretical and quantitative point of view, an explanation of why taxes on capital returns are high (around 35%) by analyzing the optimal fiscal policy in an economy with intergenerational redistribution. For this purpose, the government is modeled explicitly and can choose (and commit to) an optimal tax policy in order to maximize society's welfare. In an infinitely lived economy with heterogeneous agents, the long-run optimal capital tax is zero. If heterogeneity is due to the existence of overlapping generations, this result in general no longer holds. I provide sufficient conditions for zero capital and labor taxes, and show that a general class of preferences, commonly used in the macro and public finance literature, violates these conditions. For a version of the model calibrated to the US economy, the main results are: first, if the government is restricted to a limited set of instruments, the observed fiscal policy cannot be dismissed as suboptimal, and capital taxes are positive and quantitatively relevant. Second, if the government can use age-specific taxes for each generation, then the age profile of optimal capital taxes implies subsidizing the asset returns of the younger generations and taxing the asset returns of the older ones at higher rates.