Abstract:
Electrical Impedance Tomography (EIT) is an imaging method that enables a volume conductivity map of a subject to be produced from multiple impedance measurements. It has the potential to become a portable non-invasive imaging technique of particular use in imaging brain function. Accurate numerical forward models may be used to improve image reconstruction but, until now, have employed an assumption of isotropic tissue conductivity. This may be expected to introduce inaccuracy, as body tissues, especially those such as white matter and the skull in head imaging, are highly anisotropic. The purpose of this study was, for the first time, to develop a method for incorporating anisotropy in a forward numerical model for EIT of the head and to assess the resulting improvement in image quality for linear reconstruction in one example of the human head. A realistic Finite Element Model (FEM) of an adult human head with segments for the scalp, skull, CSF, and brain was produced from a structural MRI. Anisotropy of the brain was estimated from a diffusion tensor MRI of the same subject, and anisotropy of the skull was approximated from the structural information. A method for incorporating anisotropy in the forward model and using it in image reconstruction was produced. The improvement in reconstructed image quality was assessed in computer simulation by producing forward data and then performing linear reconstruction using a sensitivity matrix approach. The mean boundary data difference between anisotropic and isotropic forward models for a reference conductivity was 50%. Use of the correct anisotropic FEM in image reconstruction, as opposed to an isotropic one, corrected an error of 24 mm in imaging a 10% conductivity decrease located in the hippocampus, improved localisation of conductivity changes deep in the brain and of those due to epilepsy by 4-17 mm, and, overall, led to a substantial improvement in image quality. This suggests that incorporation of anisotropy in numerical models used for image reconstruction is likely to improve EIT image quality.
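The sensitivity-matrix step mentioned above can be illustrated with a minimal sketch of one-step linear EIT reconstruction. The Jacobian J, the regularization parameter lam, and the Tikhonov-regularized solver below are illustrative assumptions, not the paper's actual FEM-derived quantities or its specific regularization scheme:

```python
import numpy as np

def linear_eit_reconstruction(J, db, lam=1e-3):
    """One-step linear EIT reconstruction (Tikhonov-regularized sketch).

    J   : (n_measurements, n_elements) sensitivity (Jacobian) matrix
          relating element conductivity changes to boundary voltages
    db  : (n_measurements,) boundary data difference (perturbed - reference)
    lam : hypothetical regularization parameter (noise/resolution trade-off)
    """
    # Solve (J^T J + lam * I) dsigma = J^T db for the conductivity change.
    JtJ = J.T @ J
    rhs = J.T @ db
    dsigma = np.linalg.solve(JtJ + lam * np.eye(JtJ.shape[0]), rhs)
    return dsigma

# Toy example with random stand-ins for the FEM-derived quantities.
rng = np.random.default_rng(0)
J = rng.standard_normal((256, 1000))   # hypothetical sensitivity matrix
true_change = np.zeros(1000)
true_change[500] = -0.1                # a 10% conductivity decrease
db = J @ true_change                   # noiseless simulated forward data
print(linear_eit_reconstruction(J, db)[495:505])
```

In the setting the abstract describes, the anisotropic/isotropic choice enters through J: reconstructing with a Jacobian computed from the wrong forward model is what produces the localisation errors quoted above.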
Abstract:
This article first provides a selective overview of the literature on bureaucratic autonomy and identifies different approaches to this topic. The second section discusses three major sets of open questions, which will be tackled in the contributions to this special issue: the subjective, dynamic and relational nature of autonomy; the complex linkages between tasks, organizational forms, and national path dependencies on the one hand and autonomy and performance on the other hand; and the interplay between autonomy, accountability and democratic legitimacy.
Abstract:
MOTIVATION: Comparative analyses of gene expression data from different species have become an important component of the study of molecular evolution. Thus, methods are needed to estimate evolutionary distances between expression profiles, as well as a neutral reference to estimate selective pressure. Divergence between expression profiles of homologous genes is often calculated with Pearson's or Euclidean distance. Neutral divergence is usually inferred from randomized data. Despite being widely used, neither of these two steps has been well studied. Here, we analyze these methods formally and on real data, highlight their limitations, and propose improvements. RESULTS: It has been demonstrated that Pearson's distance, in contrast to Euclidean distance, leads to underestimation of the expression similarity between homologous genes with a conserved uniform pattern of expression. Here, we first extend this study to genes with conserved but specific patterns of expression. Surprisingly, we find that both Pearson's and Euclidean distances, used as measures of expression similarity between genes, depend on the expression specificity of those genes. We also show that the Euclidean distance depends strongly on data normalization. Next, we show that the randomization procedure that is widely used to estimate the rate of neutral evolution is biased when broadly expressed genes are abundant in the data. To overcome this problem, we propose a novel randomization procedure that is unbiased with respect to the expression profiles present in the datasets. Applying our method to mouse and human gene expression data suggests significant gene expression conservation between these species. CONTACT: marc.robinson-rechavi@unil.ch; sven.bergmann@unil.ch SUPPLEMENTARY INFORMATION: Supplementary data are available at Bioinformatics online.
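As a concrete illustration of the two distance measures and of a naive randomization of the kind the abstract criticizes, here is a minimal sketch. The six-tissue profiles, the permutation scheme, and all parameter values are hypothetical; the abstract's proposed unbiased procedure is not specified there and is not reproduced here:

```python
import numpy as np

def pearson_distance(x, y):
    """1 - Pearson correlation between two expression profiles."""
    return 1.0 - np.corrcoef(x, y)[0, 1]

def euclidean_distance(x, y):
    """Euclidean distance; note its strong dependence on normalization."""
    return np.linalg.norm(x - y)

def randomized_null(profiles, n_perm=1000, rng=None):
    """One common variant of the naive null distribution: distances
    between randomly paired, independently shuffled profiles. The
    abstract argues this kind of procedure is biased when broadly
    expressed genes dominate the data."""
    rng = np.random.default_rng(rng)
    null = []
    for _ in range(n_perm):
        i, j = rng.integers(len(profiles), size=2)
        x = rng.permutation(profiles[i])   # shuffle values across tissues
        y = rng.permutation(profiles[j])
        null.append(pearson_distance(x, y))
    return np.array(null)

# Two hypothetical ortholog profiles over six tissues (mouse vs human).
mouse = np.array([5.0, 1.2, 0.8, 7.5, 0.3, 2.1])
human = np.array([4.6, 1.0, 1.1, 8.0, 0.2, 2.4])
print(pearson_distance(mouse, human), euclidean_distance(mouse, human))
print(randomized_null([mouse, human], n_perm=200, rng=0).mean())
```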
Abstract:
INTRODUCTION: Very little surgical care is performed in low- and middle-income countries (LMICs). An estimated two billion people in the world have no access to essential surgical care, and non-surgeons perform much of the surgery in remote and rural areas. Surgical care is as yet not recognized as an integral aspect of primary health care despite its demonstrated cost-effectiveness. We aimed to define the parameters of a public health approach to providing surgical care to the areas in most need. METHODS: Consensus meetings were held, field experience was collected via targeted interviews, and a literature review on the current state of essential surgical care provision in Sub-Saharan Africa (SSA) was conducted. Comparisons were made across international recommendations for essential surgical interventions, and a consensus-driven list was drawn up according to the interventions' relative simplicity, resource requirements, and capacity to provide the highest impact in terms of averted mortality or disability. RESULTS: Essential Surgery consists of basic, low-cost surgical interventions which save lives, prevent life-long disability or life-threatening complications, and may be offered in any district hospital. Fifteen essential surgical interventions were deduced from the recommendations of various international surgical bodies. Training in Essential Surgery is narrow and structured enough to be feasible for non-physician clinicians (NPCs). This cadre is already active in many SSA countries, providing the bulk of surgical care. CONCLUSION: A basic package of essential surgical care interventions is imperative to provide structure for scaling up training and building essential health services in remote and rural areas of LMICs. NPCs, a health cadre predominant in SSA, require training, mentoring, and monitoring. Such training is vastly more cost-efficient than the expensive training of a few polyvalent or specialist surgeons, who will not be sufficient in numbers within the next few generations. Moreover, these practitioners are used to working in the districts and are much less likely to migrate elsewhere. The use of NPCs performing "Essential Surgery" is a feasible route to addressing the almost total lack of primary surgical care in LMICs.
Abstract:
Approximate models (proxies) can be employed to reduce the computational cost of estimating uncertainty. The price to pay is that the approximations introduced by the proxy model can lead to biased estimation. To avoid this problem and ensure reliable uncertainty quantification, we propose to combine functional data analysis and machine learning to build error models that allow us to obtain an accurate prediction of the exact response without solving the exact model for all realizations. We build the relationship between proxy and exact model on a learning set of geostatistical realizations for which both the exact and approximate solvers are run. Functional principal component analysis (FPCA) is used to investigate the variability in the two sets of curves and to reduce the dimensionality of the problem while maximizing the retained information. Once obtained, the error model can be used to predict the exact response of any realization on the basis of the proxy response alone. This methodology is purpose-oriented, as the error model is constructed directly for the quantity of interest rather than for the state of the system. Moreover, the dimensionality reduction performed by FPCA enables a diagnostic of the quality of the error model, assessing the informativeness of the learning set and the fidelity of the proxy to the exact model. The possibility of predicting the exact response for any newly generated realization suggests that the methodology can be used effectively beyond the context of uncertainty quantification, in particular for Bayesian inference and optimization.
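A minimal sketch of the proxy-to-exact error-model idea follows. For brevity, the functional PCA is replaced by ordinary PCA on discretized curves, the machine-learning step by linear least squares, and synthetic sine/cosine curves stand in for the geostatistical responses; none of these choices is taken from the paper:

```python
import numpy as np

# Learning set: n_learn realizations for which both solvers were run.
# Curves are discretized on n_t points; all arrays are hypothetical.
rng = np.random.default_rng(1)
n_learn, n_t, n_pc = 50, 100, 3
t = np.linspace(0.0, 1.0, n_t)
proxy_curves = np.array([np.sin(2*np.pi*(t + 0.05*rng.standard_normal()))
                         for _ in range(n_learn)])
exact_curves = (proxy_curves + 0.1*np.cos(2*np.pi*t)
                + 0.01*rng.standard_normal((n_learn, n_t)))

def pca_scores(curves, n_pc):
    """Discretized stand-in for FPCA: center, SVD, keep leading modes."""
    mean = curves.mean(axis=0)
    U, s, Vt = np.linalg.svd(curves - mean, full_matrices=False)
    basis = Vt[:n_pc]                   # principal component curves
    scores = (curves - mean) @ basis.T  # coordinates in the reduced space
    return mean, basis, scores

p_mean, p_basis, p_scores = pca_scores(proxy_curves, n_pc)
e_mean, e_basis, e_scores = pca_scores(exact_curves, n_pc)

# Error model: linear map from proxy scores to exact scores (least squares).
A = np.hstack([p_scores, np.ones((n_learn, 1))])
coef, *_ = np.linalg.lstsq(A, e_scores, rcond=None)

def predict_exact(proxy_curve):
    """Predict the exact response from the proxy response alone."""
    z = (proxy_curve - p_mean) @ p_basis.T
    z_exact = np.hstack([z, 1.0]) @ coef
    return e_mean + z_exact @ e_basis

new_proxy = np.sin(2*np.pi*(t + 0.02))
print(predict_exact(new_proxy)[:5])
```

In practice the regression from proxy scores to exact scores could be any supervised learner; the diagnostic mentioned in the abstract would compare retained variance and prediction residuals on the learning set.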
Abstract:
Given their high sensitivity and ability to limit the field of view (FOV), surface coils are often used in magnetic resonance spectroscopy (MRS) and imaging (MRI). A major downside of surface coils is their inherent radiofrequency (RF) B1 heterogeneity across the FOV: the transmit field decreases with increasing distance from the coil, giving rise to image distortions due to non-uniform spatial responses. A robust way to compensate for B1 inhomogeneities is to employ adiabatic inversion pulses, yet these are not well suited to all imaging sequences, including single-shot approaches like echo planar imaging (EPI). Hybrid spatiotemporal encoding (SPEN) sequences relying on frequency-swept pulses provide another ultrafast MRI alternative that could help solve this problem, thanks to their built-in heterogeneous spatial manipulations. This study explores how this intrinsic SPEN-based spatial discrimination could be used to compensate for the B1 inhomogeneities inherent to surface coils. Experiments carried out both in phantoms and in vivo in rat brains demonstrate that, by suitably modulating the amplitude of a SPEN chirp pulse that progressively excites the spins in a direction normal to the coil, it is possible to compensate for the RF transmit inhomogeneities and thus improve sensitivity and image fidelity.
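To make the amplitude-modulation idea concrete, here is a minimal sketch of a frequency-swept (chirp) pulse whose envelope is scaled inversely to an assumed B1 falloff along the direction normal to the coil. The pulse parameters, the linear time-to-position mapping, and the 1/(1+3z) decay profile are all hypothetical stand-ins; an actual implementation would use a measured B1 map and the sequence's real encoding gradient:

```python
import numpy as np

def amplitude_modulated_chirp(duration=5e-3, n=1000, bw=20e3,
                              b1_profile=lambda z: 1.0 / (1.0 + 3.0*z)):
    """Sketch of a SPEN chirp pulse with a compensating amplitude envelope.

    The chirp sweeps linearly over bandwidth bw (Hz) in time `duration`;
    under a SPEN encoding gradient each instant excites a different
    position z along the direction normal to the coil. Scaling the
    envelope by 1/B1(z) counteracts the coil's transmit falloff
    (b1_profile is a hypothetical monotonic decay, not a measured map).
    """
    t = np.linspace(0.0, duration, n)
    sweep = -bw/2 + bw * t / duration                 # instantaneous freq (Hz)
    phase = 2*np.pi*np.cumsum(sweep) * (duration/n)   # integrate frequency
    z = t / duration                                  # normalized position
    envelope = 1.0 / b1_profile(z)                    # boost where B1 is weak
    envelope /= envelope.max()                        # normalize peak amplitude
    return t, envelope * np.exp(1j * phase)

t, rf = amplitude_modulated_chirp()
print(abs(rf[0]), abs(rf[-1]))   # amplitude grows toward the far side of the coil
```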