127 results for: Optimal hedge ratio. Garch. Effectiveness
Abstract:
Objective: To determine the costs and benefits of interventions for maternal and newborn health, to assess the appropriateness of current strategies and guide future plans to attain the millennium development goals. Design: Cost effectiveness analysis. Setting: Two regions classified by the World Health Organization according to their epidemiological grouping: Afr-E, those countries in sub-Saharan Africa with very high adult and high child mortality, and Sear-D, comprising countries in South East Asia with high adult and high child mortality. Data sources: Effectiveness data from several sources, including trials, observational studies, and expert opinion. For resource inputs, quantities came from WHO guidelines, literature, and expert opinion, and prices from the WHO Choosing Interventions that are Cost Effective database. Main outcome measures: Cost per disability adjusted life year (DALY) averted in year 2000 international dollars. Results: The most cost effective mix of interventions was similar in Afr-E and Sear-D. These were the community based newborn care package, followed by antenatal care (tetanus toxoid, screening for pre-eclampsia, screening and treatment of asymptomatic bacteriuria and syphilis); skilled attendance at birth, offering first level maternal and neonatal care around childbirth; and emergency obstetric and neonatal care around and after birth. Screening and treatment of maternal syphilis, community based management of neonatal pneumonia, and steroids given during the antenatal period were relatively less cost effective in Sear-D. Scaling up all of the included interventions to 95% coverage would halve neonatal and maternal deaths. Conclusion: Preventive interventions at the community level for newborn babies and at the primary care level for mothers and newborn babies are extremely cost effective, but the millennium development goals for maternal and child health will not be achieved without universal access to clinical services as well.
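The study's headline metric, cost per DALY averted, is a simple ratio of programme cost to health gain. A minimal sketch of that computation, using purely illustrative figures that are NOT the study's estimates:

```python
def cost_per_daly_averted(total_cost, dalys_averted):
    """Cost-effectiveness ratio: cost units spent per disability
    adjusted life year (DALY) averted by an intervention."""
    if dalys_averted <= 0:
        raise ValueError("dalys_averted must be positive")
    return total_cost / dalys_averted

# Hypothetical numbers, for illustration only (not from the study):
# an intervention costing I$1,000,000 that averts 25,000 DALYs.
ratio = cost_per_daly_averted(1_000_000, 25_000)
```

Interventions are then ranked by this ratio: a lower cost per DALY averted means a more cost effective use of a fixed budget.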
Abstract:
We explore the task of optimal quantum channel identification and, in particular, the estimation of a general one-parameter quantum process. We derive new characterizations of optimality and apply the results to several examples, including the qubit depolarizing channel and the harmonic oscillator damping channel. We also discuss the geometry of the problem and illustrate the usefulness of entanglement in process estimation.
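The qubit depolarizing channel mentioned above is a standard one-parameter process. As a minimal sketch (not the paper's estimation procedure), the channel mixes the input state with the maximally mixed state, and the parameter p to be estimated is exactly the amount of mixing:

```python
import numpy as np

def depolarize(rho, p):
    """Qubit depolarizing channel: rho -> (1 - p) * rho + p * I/2.
    p in [0, 1] is the single parameter a process-estimation
    scheme would try to identify."""
    return (1 - p) * rho + p * np.eye(2) / 2

# Input state |0><0|
rho0 = np.array([[1.0, 0.0],
                 [0.0, 0.0]])
rho_out = depolarize(rho0, 0.5)
# The output's deviation from I/2 shrinks linearly in (1 - p),
# which is the signal any estimator of p must resolve.
```

Sending one half of an entangled pair through the channel (the strategy whose usefulness the abstract highlights) generally yields more information about p per channel use than product-state probes.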
Abstract:
Quantum information theory, applied to optical interferometry, yields a 1/n scaling of phase uncertainty Δφ independent of the applied phase shift φ, where n is the number of photons in the interferometer. This 1/n scaling is achieved provided that the output state is subjected to an optimal phase measurement. We establish this scaling law for both passive (linear) and active (nonlinear) interferometers and identify the coefficient of proportionality. Whereas a highly nonclassical state is required to achieve optimal scaling for passive interferometry, a classical input state yields a 1/n scaling of phase uncertainty for active interferometry.
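The practical import of the 1/n scaling is how quickly it outpaces the shot-noise (standard quantum) limit of 1/√n as photon number grows. A numeric sketch of the two scaling laws (with the proportionality coefficients set to 1 for illustration; the paper identifies the actual constants):

```python
import math

def shot_noise_limit(n):
    # Standard quantum limit: phase uncertainty ~ 1/sqrt(n),
    # typical of a classical input in a passive interferometer.
    return 1.0 / math.sqrt(n)

def heisenberg_limit(n):
    # Heisenberg-limited scaling: phase uncertainty ~ 1/n,
    # attainable with an optimal phase measurement.
    return 1.0 / n

for n in (10, 100, 10_000):
    print(n, shot_noise_limit(n), heisenberg_limit(n))
```

At n = 10,000 photons the 1/n law gives a hundredfold smaller phase uncertainty than shot noise, which is why the choice of input state and measurement matters so much at high photon number.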
Abstract:
Using the method of quantum trajectories we show that a known pure state can be optimally monitored through time when subject to a sequence of discrete measurements. By modifying the way that we extract information from the measurement apparatus we can minimize the average algorithmic information of the measurement record, without changing the unconditional evolution of the measured system. We define an optimal measurement scheme as one which has the lowest average algorithmic information allowed. We also show how it is possible to extract information about system operator averages from the measurement records and their probabilities. The optimal measurement scheme, in the limit of weak coupling, determines the statistics of the variance of the measured variable directly. We discuss the relevance of such measurements for recent experiments in quantum optics.
Abstract:
Feature selection is one of the most important and frequently used techniques in data preprocessing. It can improve both the efficiency and the effectiveness of data mining by reducing the dimensionality of the feature space and removing irrelevant and redundant information. Feature selection can be viewed as a global optimization problem: finding a minimum set of M relevant features that describes the dataset as well as the original N attributes do. In this paper, we apply the adaptive partitioned random search strategy to our feature selection algorithm. Under this search strategy, a partition structure and an evaluation function are proposed for the feature selection problem. The algorithm guarantees a globally optimal solution in theory and avoids complete randomness in the search direction. The good properties of our algorithm are shown through theoretical analysis.
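The abstract does not specify the adaptive partitioned random search itself; as a minimal sketch of the underlying idea, plain random search over feature subsets with a hypothetical evaluation function (a real implementation would score a model trained on each subset, and would partition the search space adaptively rather than sample uniformly):

```python
import random

def evaluate(features, relevant):
    # Hypothetical stand-in for the paper's evaluation function:
    # fraction of the truly relevant features covered by the subset.
    return len(set(features) & relevant) / len(relevant)

def random_search_select(n_features, relevant, n_iter=2000, seed=0):
    """Randomly sample feature subsets; keep the smallest subset whose
    evaluation score matches that of the full feature set, i.e. one that
    'describes the dataset as well as the original N attributes'."""
    rng = random.Random(seed)
    best = list(range(n_features))  # start from all N features
    for _ in range(n_iter):
        k = rng.randint(1, n_features)
        subset = rng.sample(range(n_features), k)
        if evaluate(subset, relevant) == 1.0 and len(subset) < len(best):
            best = subset
    return sorted(best)
```

Pure random search like this is what the partitioned strategy is designed to improve on: by partitioning the subset space and concentrating sampling in promising partitions, the search direction is no longer completely random.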