936 results for Vital statistics.
From Unsuitable Ally to Vital Partner: The Case of US-Korean Relations and the Vietnam War, 1954-1966
Abstract:
This is a history of the decade prior to the entry of Korean troops into the Vietnam War, roughly covering the years 1953-1965.
Abstract:
Locally affine (polyaffine) image registration methods capture intersubject non-linear deformations with a small number of parameters, while providing an intuitive interpretation for clinicians. Considering the mandible bone, anatomical shape differences can be found at different scales, e.g. the left or right side, teeth, etc. Classically, sequential coarse-to-fine registrations are used to handle multiscale deformations; instead, we propose a simultaneous optimization of all scales. To avoid local minima, we incorporate a prior on the polyaffine transformations. This kind of groupwise registration approach is natural in a polyaffine context if we assume one configuration of regions that describes an entire group of images, with varying transformations for each region. In this paper, we reformulate polyaffine deformations in a generative statistical model, which enables us to incorporate deformation statistics as a prior in a Bayesian setting. We find optimal transformations by maximum a posteriori (MAP) estimation. We assume that the polyaffine transformations follow a normal distribution with a mean and a concentration matrix. The parameters of the prior are estimated from an initial coarse-to-fine registration. Knowing the region structure, we develop a blockwise pseudoinverse to obtain the concentration matrix. To our knowledge, we are the first to introduce simultaneous multiscale optimization through groupwise polyaffine registration. We show results on 42 mandible CT images.
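As a rough illustration of the formulation sketched in this abstract, the MAP estimate with a Gaussian prior on the stacked polyaffine parameters can be written as follows; the notation (θ for the concatenated region-wise affine parameters across all scales, D for the image dissimilarity term, μ and Λ for the prior mean and concentration matrix) is assumed here, not taken from the paper.

```latex
% Sketch of the MAP objective (assumed notation): \theta stacks the affine
% parameters of every region at every scale and is optimized simultaneously.
\hat{\theta}_{\mathrm{MAP}}
  = \arg\max_{\theta}\; p(I_{\mathrm{moving}} \mid \theta)\, p(\theta)
  = \arg\min_{\theta}\;
      \mathcal{D}\!\left(I_{\mathrm{fixed}},\, I_{\mathrm{moving}} \circ T_{\theta}\right)
      + \tfrac{1}{2}\,(\theta - \mu)^{\top} \Lambda\,(\theta - \mu)
```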
Abstract:
Traditionally, the use of Bayes factors has required the specification of proper prior distributions on model parameters implicit to both null and alternative hypotheses. In this paper, I describe an approach to defining Bayes factors based on modeling test statistics. Because the distributions of test statistics do not depend on unknown model parameters, this approach eliminates the subjectivity normally associated with the definition of Bayes factors. For standard test statistics, including the χ², F, t and z statistics, the values of Bayes factors that result from this approach can be simply expressed in closed form.
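A minimal sketch of the general idea described above: a Bayes factor formed directly from a test statistic as the ratio of its sampling density under the alternative (a noncentral distribution) to its density under the null (the central distribution). The fixed noncentrality parameters below are illustrative assumptions only; the paper's actual closed-form expressions may differ.

```python
# Hypothetical sketch: Bayes factors computed from test statistics alone,
# as density ratios of noncentral (H1) to central (H0) distributions.
# Noncentrality values are assumptions for illustration, not the paper's choices.
from scipy import stats

def bf_from_t(t_obs, df, ncp):
    """Bayes factor in favour of H1 for an observed t statistic."""
    return stats.nct.pdf(t_obs, df, ncp) / stats.t.pdf(t_obs, df)

def bf_from_chi2(x_obs, df, ncp):
    """Bayes factor in favour of H1 for an observed chi-squared statistic."""
    return stats.ncx2.pdf(x_obs, df, ncp) / stats.chi2.pdf(x_obs, df)

if __name__ == "__main__":
    print(bf_from_t(t_obs=2.5, df=30, ncp=2.0))
    print(bf_from_chi2(x_obs=9.0, df=3, ncp=5.0))
```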
Abstract:
BACKGROUND AND OBJECTIVE: The decision to maintain intensive treatment in cardiac surgical patients with a poor initial outcome is mostly based on individual experience. The risk scoring systems used in cardiac surgery have no prognostic value for individuals. This study aims to assess (a) factors possibly related to poor survival and functional outcomes in cardiac surgery patients requiring prolonged (≥5 days) intensive care unit (ICU) treatment, (b) conditions in which treatment withdrawal might be justified, and (c) the patient's perception of the benefits and drawbacks of long intensive treatments. METHODS: The computerized data prospectively recorded for every patient in the intensive care unit over a 3-year period were reviewed and analyzed (n=1859). Survival and quality of life (QOL) outcomes were determined in all patients who had required ≥5 consecutive days of intensive treatment (n=194; 10.4%). Long-term survivors were interviewed at yearly intervals in a standardized manner, and quality of life was assessed using the Karnofsky dependency score. No interventions or treatments were given, withheld, or withdrawn as part of this study. RESULTS: In-hospital, 1-year, and 3-year cumulative survival rates reached 91.3%, 85.6%, and 75.1%, respectively. Quality of life assessed 1 year postoperatively by the Karnofsky score was good in 119/165 patients, fair in 32, and poor in 14. Multivariate logistic regression analysis of 19 potential predictors of poor outcome identified dialysis as the sole factor significantly (p=0.027), albeit moderately, reducing long-term survival, and sustained neurological deficit as an inconstant predictor of poor functional outcome (p=0.028). One year postoperatively, 0.63% of patients still recalled severe suffering in the intensive care unit and 20% recalled discomfort. Only 7.7% of patients would definitely refuse redo surgery. CONCLUSIONS: This study of cardiac surgical patients requiring ≥5 days of intensive treatment did not identify factors unequivocally justifying early treatment limitation in individuals. It found that 1-year mortality and disability rates can be maintained at a low level in this subset of patients, and that severe suffering in the ICU is infrequent.
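A hypothetical sketch of the kind of multivariate logistic regression described in the abstract, predicting poor outcome from candidate ICU factors; the file name and column names are illustrative placeholders, not the study's dataset.

```python
# Sketch only: logistic regression of poor outcome on candidate predictors.
# Column names (poor_outcome, dialysis, neuro_deficit, age) are assumed.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("icu_long_stay_patients.csv")  # assumed file layout
model = smf.logit("poor_outcome ~ dialysis + neuro_deficit + age", data=df)
result = model.fit()
print(result.summary())          # coefficients and p-values
print(np.exp(result.params))     # odds ratios
```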
Abstract:
Functional Magnetic Resonance Imaging (fMRI) is a non-invasive technique which is commonly used to quantify changes in blood oxygenation and flow coupled to neuronal activation. One of the primary goals of fMRI studies is to identify localized brain regions where neuronal activation levels vary between groups. Single-voxel t-tests have been commonly used to determine whether activation related to the protocol differs across groups. Due to the generally limited number of subjects within each study, accurate estimation of the variance at each voxel is difficult. Thus, combining information across voxels in the statistical analysis of fMRI data is desirable in order to improve efficiency. Here we construct a hierarchical model and apply an Empirical Bayes framework to the analysis of group fMRI data, employing techniques used in high-throughput genomic studies. The key idea is to shrink residual variances by combining information across voxels, and subsequently to construct an improved test statistic in lieu of the classical t-statistic. This hierarchical model results in a shrinkage of voxel-wise residual sample variances towards a common value. The shrunken estimator for voxel-specific variance components in the group analyses outperforms the classical residual error estimator in terms of mean squared error. Moreover, the shrunken test statistic decreases the false positive rate when testing differences in brain contrast maps across a wide range of simulation studies. This methodology was also applied to experimental data from a cognitive activation task.
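A minimal numerical sketch of the variance-shrinkage idea described above, in the style of the moderated t-statistic used for high-throughput genomic data; in practice the prior degrees of freedom d0 and prior variance s0² would be estimated empirically from all voxels, and the values below are assumptions for illustration.

```python
# Sketch of a shrunken (moderated) t-statistic for a two-group voxel comparison.
# d0 and s0_sq stand in for empirically estimated prior hyperparameters.
import numpy as np

def moderated_t(group_diff, s2_voxel, n_per_group, d0, s0_sq):
    """Shrink the voxel-wise residual variance toward a common prior value
    and form a moderated t-statistic for the group difference."""
    d = 2 * n_per_group - 2                              # residual df per voxel
    s2_shrunk = (d0 * s0_sq + d * s2_voxel) / (d0 + d)   # shrunken variance
    se = np.sqrt(s2_shrunk * (2.0 / n_per_group))        # SE of the mean difference
    return group_diff / se                               # compare to t with d0 + d df

# toy example: one voxel, 10 subjects per group
print(moderated_t(group_diff=0.8, s2_voxel=1.5, n_per_group=10, d0=4.0, s0_sq=1.0))
```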
Abstract:
Peer review procedures and citation statistics are important yet often neglected components of the scientific publication process. Here I discuss fundamental consequences of such quality measures for the scientific community and propose three remedial actions: (1) use of a "Combined Impact Estimate" as a measure of citation statistics, (2) adoption of an open reviewing policy, and (3) acceleration of the publication process in order to raise the reputation of the entire discipline (in our case: behavioural science). Authors, reviewers and editors are invited to contribute to the improvement of publication practice.
Abstract:
In Switzerland there is a shortage of population-based information on heart failure (HF) incidence and case fatalities (CF). The aim of this study was to estimate HF event rates and both in- and out-of-hospital CF rates.
Abstract:
Free space optical (FSO) communication links can experience extreme signal degradation due to atmospheric turbulence induced spatial and temporal irradiance fluctuations (scintillation) in the laser wavefront. In addition, turbulence can cause the laser beam centroid to wander, resulting in power fading and sometimes complete loss of the signal. Spreading of the laser beam and jitter are also artifacts of atmospheric turbulence. To accurately predict the signal fading that occurs in a laser communication system, and to get a true picture of how this affects crucial performance parameters like bit error rate (BER), it is important to analyze the probability density function (PDF) of the integrated irradiance fluctuations at the receiver. In addition, it is desirable to find a theoretical distribution that accurately models these fluctuations under all propagation conditions. The PDF of integrated irradiance fluctuations is calculated from numerical wave-optics simulations of a laser after propagating through atmospheric turbulence, to investigate the evolution of the distribution as the aperture diameter is increased. The simulation data distribution is compared to theoretical gamma-gamma and lognormal PDF models under a variety of scintillation regimes, from weak to very strong. Our results show that the gamma-gamma PDF provides a good fit to the simulated data distribution for all aperture sizes studied, from weak through moderate scintillation. In strong scintillation, the gamma-gamma PDF is a better fit to the distribution for point-like apertures, and the lognormal PDF is a better fit for apertures the size of the atmospheric spatial coherence radius ρ0 or larger. In addition, the PDF of received power from a Gaussian laser beam that has been adaptively compensated at the transmitter before propagation to the receiver of an FSO link in the moderate scintillation regime is investigated. The complexity of the adaptive optics (AO) system is increased in order to investigate the changes in the distribution of the received power and how this affects the BER. For the 10 km link, due to the non-reciprocal nature of the propagation path, the optimal beam to transmit is unknown. These results show that, for non-reciprocal paths, a low-order level of AO complexity provides a better estimate of the optimal beam to transmit than a higher order. For the 20 km link distance it was found that, although minimal, all AO complexity levels provided an equivalent improvement in BER, and that no AO complexity level provided the correction needed for the optimal beam to transmit. Finally, the temporal power spectral density of received power from an FSO communication link is investigated. Simulated and experimental results for the coherence time calculated from the temporal correlation function are presented. Results for both simulation and experimental data show that the coherence time increases as the receiving aperture diameter increases. For finite apertures the coherence time increases as the communication link distance is increased. We conjecture that this is due to the increasing speckle size within the pupil plane of the receiving aperture for an increasing link distance.
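A brief sketch of the two candidate irradiance models compared in this abstract: the standard gamma-gamma PDF (parameters α and β tied to the large- and small-scale scintillation) and a lognormal PDF normalized to unit mean irradiance, either of which could be compared against a histogram of simulated integrated irradiance. The parameter values below are placeholders, not the study's estimates.

```python
# Sketch: gamma-gamma vs lognormal irradiance PDFs (unit mean irradiance).
# alpha, beta, sigma_lnI values are illustrative assumptions only.
import numpy as np
from scipy.special import gamma as Gamma, kv

def gamma_gamma_pdf(I, alpha, beta):
    """Gamma-gamma irradiance PDF."""
    coeff = 2.0 * (alpha * beta) ** ((alpha + beta) / 2.0) / (Gamma(alpha) * Gamma(beta))
    return coeff * I ** ((alpha + beta) / 2.0 - 1.0) * kv(alpha - beta, 2.0 * np.sqrt(alpha * beta * I))

def lognormal_pdf(I, sigma_lnI):
    """Lognormal irradiance PDF with unit mean (mu = -sigma^2/2)."""
    mu = -0.5 * sigma_lnI ** 2
    return np.exp(-(np.log(I) - mu) ** 2 / (2.0 * sigma_lnI ** 2)) / (I * sigma_lnI * np.sqrt(2.0 * np.pi))

I = np.linspace(0.01, 5.0, 500)
print(gamma_gamma_pdf(I, alpha=4.0, beta=2.0)[:3])   # placeholder parameters
print(lognormal_pdf(I, sigma_lnI=0.5)[:3])
```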