884 results for Fully automated
Abstract:
Extraction of both pelvic and femoral surface models of a hip joint from CT data for computer-assisted pre-operative planning of hip arthroscopy is addressed. We present a method for fully automatic image segmentation of a hip joint. Our method works by combining fast random forest (RF) regression based landmark detection and atlas-based segmentation with articulated statistical shape model (aSSM) based hip joint reconstruction. The two fundamental contributions of our method are: (1) an improved fast Gaussian transform (IFGT) is used within the RF regression framework for fast and accurate landmark detection, which then allows for a fully automatic initialization of the atlas-based segmentation; and (2) aSSM based fitting is used to preserve hip joint structure and to avoid penetration between the pelvic and femoral models. Validation on 30 hip CT images shows that our method achieves high performance in segmenting the pelvis, left proximal femur, and right proximal femur surfaces with an average accuracy of 0.59 mm, 0.62 mm, and 0.58 mm, respectively.
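As a rough illustration only (not the authors' implementation), the sketch below shows how RF regression can vote for a landmark position from per-voxel offset predictions, assuming scikit-learn; the feature representation, tree settings, and the simple mean-vote aggregation are hypothetical, and the IFGT-accelerated density estimation described above is not reproduced.

```python
# Illustrative RF-regression landmark voting (sketch only, not the paper's method).
# Assumes precomputed per-voxel feature vectors (hypothetical placeholders) and
# training pairs (feature vector, offset from voxel to the true landmark).
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def train_landmark_regressor(X, offsets, n_trees=32):
    """X: (n_samples, n_features); offsets: (n_samples, 3) voxel-to-landmark offsets."""
    rf = RandomForestRegressor(n_estimators=n_trees, max_depth=12)
    rf.fit(X, offsets)
    return rf

def detect_landmark(rf, voxel_coords, voxel_features):
    """Each voxel casts a vote at (its position + predicted offset).
    A real system would aggregate votes with kernel density estimation;
    here a plain mean is used for brevity."""
    votes = voxel_coords + rf.predict(voxel_features)   # (n_voxels, 3)
    return votes.mean(axis=0)
```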
Abstract:
This paper proposes an automated 3D lumbar intervertebral disc (IVD) segmentation strategy for MRI data. Starting from two user-supplied landmarks, the geometrical parameters of all lumbar vertebral bodies and intervertebral discs are automatically extracted from a mid-sagittal slice using a graphical-model-based approach. After that, a three-dimensional (3D) variable-radius soft tube model of the lumbar spine column is built to guide the 3D disc segmentation. The disc segmentation is achieved as a multi-kernel diffeomorphic registration between a 3D template of the disc and the observed MRI data. Experiments on 15 patient data sets demonstrated the robustness and accuracy of the proposed algorithm.
Abstract:
In clinical practice, traditional X-ray radiography is widely used, and knowledge of landmarks and contours in anteroposterior (AP) pelvis X-rays is invaluable for computer-aided diagnosis, hip surgery planning, and image-guided interventions. This paper presents a fully automatic approach for landmark detection and shape segmentation of both the pelvis and the femur in conventional AP X-ray images. Our approach is based on the framework of landmark detection via Random Forest (RF) regression and shape regularization via hierarchical sparse shape composition. We propose a visual feature, FL-HoG (Flexible-Level Histogram of Oriented Gradients), and a feature selection algorithm based on trace ratio optimization to improve the robustness and efficacy of RF-based landmark detection. The landmark detection result is then used in a hierarchical sparse shape composition framework for shape regularization. Finally, the extracted shape contour is fine-tuned by a post-processing step based on low-level image features. The experimental results demonstrate that our feature selection algorithm reduces the feature dimension by a factor of 40 and improves both training and test efficiency. Further experiments conducted on 436 clinical AP pelvis X-rays show that our approach achieves an average point-to-curve error of around 1.2 mm for the femur and 1.9 mm for the pelvis.
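For illustration of trace ratio optimization in feature selection (not the authors' exact criterion or scatter definitions), here is a toy NumPy sketch that scores features by per-feature between- and within-class scatter and applies the usual iterative ratio update:

```python
# Toy sketch of trace-ratio feature selection (illustrative; diagonal scatter only).
import numpy as np

def trace_ratio_select(X, y, k, n_iter=20):
    """Pick k features approximately maximizing tr(S_b) / tr(S_w) over the
    selected subset, using the standard iterative update of the ratio lambda."""
    classes = np.unique(y)
    mean_all = X.mean(axis=0)
    s_b = np.zeros(X.shape[1])          # per-feature between-class scatter
    s_w = np.zeros(X.shape[1])          # per-feature within-class scatter
    for c in classes:
        Xc = X[y == c]
        s_b += len(Xc) * (Xc.mean(axis=0) - mean_all) ** 2
        s_w += ((Xc - Xc.mean(axis=0)) ** 2).sum(axis=0)
    lam = s_b.sum() / s_w.sum()
    for _ in range(n_iter):
        idx = np.argsort(s_b - lam * s_w)[-k:]     # top-k features by score s_b - lam*s_w
        lam = s_b[idx].sum() / s_w[idx].sum()      # update the ratio on the selected subset
    return idx
```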
Abstract:
Extraction of surface models of a hip joint from CT data is a prerequisite step for computer-assisted diagnosis and planning (CADP) of periacetabular osteotomy (PAO). Most existing CADP systems are based on manual segmentation, which is time-consuming and makes it hard to achieve reproducible results. In this paper, we present a Fully Automatic CT Segmentation (FACTS) approach to simultaneously extract both pelvic and femoral models. Our approach works by combining fast random forest (RF) regression based landmark detection and multi-atlas based segmentation with articulated statistical shape model (aSSM) based fitting. The two fundamental contributions of our approach are: (1) an improved fast Gaussian transform (IFGT) is used within the RF regression framework for fast and accurate landmark detection, which then allows for a fully automatic initialization of the multi-atlas based segmentation; and (2) aSSM based fitting is used to preserve hip joint structure and to avoid penetration between the pelvic and femoral models. Taking manual segmentation as the ground truth, we evaluated the present approach on 30 hip CT images (60 hips) with a 6-fold cross validation. When the present approach was compared to manual segmentation, a mean segmentation accuracy of 0.40, 0.36, and 0.36 mm was found for the pelvis, the left proximal femur, and the right proximal femur, respectively. When the models derived from both segmentations were used to compute the PAO diagnosis parameters, differences of 2.0 ± 1.5°, 2.1 ± 1.6°, and 3.5 ± 2.3% were found for anteversion, inclination, and acetabular coverage, respectively. The achieved accuracy is regarded as clinically accurate enough for our target applications.
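One common way to combine the warped atlas labels in multi-atlas segmentation is majority voting; the NumPy sketch below is purely illustrative and is not the fusion rule used in FACTS (the registration of each atlas to the target and the subsequent aSSM fitting are outside its scope).

```python
# Minimal sketch of majority-vote label fusion for multi-atlas segmentation.
# Assumes each atlas has already been registered to the target image, so
# `warped_labels` is a list of integer label volumes in the target space.
import numpy as np

def majority_vote(warped_labels):
    stack = np.stack(warped_labels, axis=0)          # (n_atlases, z, y, x)
    n_labels = stack.max() + 1
    counts = np.stack([(stack == l).sum(axis=0) for l in range(n_labels)], axis=0)
    return counts.argmax(axis=0)                     # per-voxel most frequent label
```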
Abstract:
A search for resonant WZ production in the ℓνℓ′ℓ′ (ℓ, ℓ′ = e, μ) decay channel using 20.3 fb−1 of √s = 8 TeV pp collision data collected by the ATLAS experiment at the LHC is presented. No significant deviation from the Standard Model prediction is observed, and upper limits on the production cross sections of WZ resonances from an extended gauge model W′ and from a simplified model of heavy vector triplets are derived. A corresponding observed (expected) lower mass limit of 1.52 (1.49) TeV is derived for the W′ at the 95% confidence level.
Abstract:
BACKGROUND Record linkage of existing individual health care data is an efficient way to answer important epidemiological research questions. Reuse of individual health-related data faces several problems: either a unique personal identifier, like a social security number, is not available, or non-unique person-identifiable information, like names, is privacy protected and cannot be accessed. A solution to protect privacy in probabilistic record linkage is to encrypt this sensitive information. Unfortunately, encrypted hash codes of two names differ completely even if the plain names differ only by a single character. Therefore, standard encryption methods cannot be applied. To overcome these challenges, we developed the Privacy Preserving Probabilistic Record Linkage (P3RL) method. METHODS In this Privacy Preserving Probabilistic Record Linkage method we apply a three-party protocol, with two sites collecting individual data and an independent trusted linkage center as the third partner. Our method consists of three main steps: pre-processing, encryption and probabilistic record linkage. Data pre-processing and encryption are done at the sites by local personnel. To guarantee similar quality and format of variables and an identical encryption procedure at each site, the linkage center generates semi-automated pre-processing and encryption templates. To retrieve the information (i.e. data structure) needed for the creation of templates without ever accessing plain person-identifiable information, we introduced a novel method of data masking. Sensitive string variables are encrypted using Bloom filters, which enables calculation of similarity coefficients. For date variables, we developed special encryption procedures to handle the most common date errors. The linkage center performs probabilistic record linkage with encrypted person-identifiable information and plain non-sensitive variables. RESULTS In this paper we describe step by step how to link existing health-related data using encryption methods to preserve the privacy of persons in the study. CONCLUSION Privacy Preserving Probabilistic Record Linkage expands record linkage facilities in settings where a unique identifier is unavailable and/or regulations restrict access to the non-unique person-identifiable information needed to link existing health-related data sets. Automated pre-processing and encryption fully protect sensitive information, ensuring participant confidentiality. This method is suitable not just for epidemiological research but also for any setting with similar challenges.
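In the spirit of the encryption step described above, the following minimal Python sketch encodes name bigrams into a Bloom filter and compares two filters with a Dice coefficient; the filter length, number of hash functions, and the keyed-hash construction are illustrative choices, not the P3RL specification.

```python
# Illustrative Bloom-filter encoding of a name and Dice similarity of two filters.
# Parameters (filter length, number of hashes, HMAC key) are example values only.
import hashlib, hmac

FILTER_BITS = 1000
NUM_HASHES = 20
SECRET_KEY = b"shared-site-key"   # hypothetical key agreed on by the two data sites

def bigrams(name):
    padded = f"_{name.lower()}_"
    return [padded[i:i + 2] for i in range(len(padded) - 1)]

def bloom_encode(name):
    bits = set()
    for gram in bigrams(name):
        for i in range(NUM_HASHES):
            digest = hmac.new(SECRET_KEY, f"{i}{gram}".encode(), hashlib.sha256).digest()
            bits.add(int.from_bytes(digest[:4], "big") % FILTER_BITS)
    return bits

def dice(a, b):
    return 2 * len(a & b) / (len(a) + len(b))

# Similar spellings still yield a high similarity despite the encoding:
print(dice(bloom_encode("Miller"), bloom_encode("Mueller")))
```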
Abstract:
In this paper we develop an adaptive procedure for the numerical solution of general, semilinear elliptic problems with possible singular perturbations. Our approach combines prediction-type adaptive Newton methods with a linear adaptive finite element discretization (based on a robust a posteriori error analysis), thereby leading to a fully adaptive Newton–Galerkin scheme. Numerical experiments underline the robustness and reliability of the proposed approach for various examples.
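As a small illustrative sketch only (not the adaptive Newton–Galerkin scheme itself), the code below runs a plain Newton iteration for a 1D finite-difference discretization of a semilinear model problem −ε u″ + f(u) = g with homogeneous Dirichlet conditions; the adaptivity in both the Newton loop and the mesh, which is the point of the paper, is deliberately omitted.

```python
# Plain Newton iteration for -eps*u'' + f(u) = g on (0,1), u(0)=u(1)=0,
# discretized with central finite differences (illustrative model problem).
import numpy as np

def newton_semilinear(f, df, g, eps=1e-2, n=200, tol=1e-10, max_iter=50):
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1 - h, n)
    # Discrete Dirichlet Laplacian as a dense matrix for simplicity.
    A = (np.diag(2 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1)) * eps / h**2
    u = np.zeros(n)
    for _ in range(max_iter):
        residual = A @ u + f(u) - g(x)
        if np.linalg.norm(residual) < tol:
            break
        jacobian = A + np.diag(df(u))       # Jacobian of the discrete nonlinear system
        u -= np.linalg.solve(jacobian, residual)
    return x, u

# Example: semilinear term f(u) = u^3 with constant right-hand side g = 1.
x, u = newton_semilinear(lambda u: u**3, lambda u: 3 * u**2, lambda x: np.ones_like(x))
```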
Abstract:
In electroweak-boson production processes with a jet veto, higher-order corrections are enhanced by logarithms of the veto scale over the invariant mass of the boson system. In this paper, we resum these Sudakov logarithms at next-to-next-to-leading logarithmic accuracy and match our predictions to next-to-leading-order (NLO) fixed-order results. We perform the calculation in an automated way, for arbitrary electroweak final states and in the presence of kinematic cuts on the leptons produced in the decays of the electroweak bosons. The resummation is based on a factorization theorem for the cross sections into hard functions, which encode the virtual corrections to the boson production process, and beam functions, which describe the low-pT emissions collinear to the beams. The one-loop hard functions for arbitrary processes are calculated using the MadGraph5_aMC@NLO framework, while the beam functions are process independent. We perform the resummation for a variety of processes, in particular for W+W− pair production followed by leptonic decays of the W bosons.
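Schematically (and only schematically: scale and rapidity-scale dependence, matching corrections, and channel details are suppressed, and this is not the paper's exact formula), the factorization into hard and beam functions referred to above has the form

```latex
% Schematic form only; scale dependence and power corrections suppressed.
\sigma\!\left(p_T^{\mathrm{veto}}\right) \;\simeq\;
  \sum_{i,j} \int \! d\xi_1\, d\xi_2 \;
  \mathcal{H}_{ij}(Q,\mu)\,
  B_i\!\left(\xi_1, p_T^{\mathrm{veto}}, \mu\right)
  B_j\!\left(\xi_2, p_T^{\mathrm{veto}}, \mu\right),
```

where the hard functions encode the virtual corrections to the boson production process and the beam functions describe the low-pT emissions collinear to the beams, as stated in the abstract.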
Abstract:
In this study two commonly used automated methods to detect atmospheric fronts in the lower troposphere are compared in various synoptic situations. The first method is a thermal approach relying on the gradient of equivalent potential temperature (TH), while the second is based on temporal changes in the 10 m wind (WND). For a comprehensive objective comparison of the outputs of these frontal identification methods, both schemes are first applied to an idealised strong baroclinic wave simulation in the absence of topography. Then, two case studies (one in the Northern Hemisphere (NH) and one in the Southern Hemisphere (SH)) are conducted to contrast fronts detected by the two methods. Finally, we obtain global winter and summer frontal occurrence climatologies (derived from ERA-Interim for 1979–2012) and compare their structure. TH is able to identify cold and warm fronts in strong baroclinic cases that are in good agreement with manual analyses. WND is particularly suited for the detection of strongly elongated, meridionally oriented moving fronts, but has very limited ability to identify zonally oriented warm fronts. We note that the areas of main TH frontal activity are shifted equatorwards compared to the WND patterns and are located upstream of the regions of main WND front activity. The number of WND fronts in the NH shows more interseasonal variation than that of TH fronts, decreasing by more than 50% from winter to summer. In the SH there is a weaker seasonal variation in the number of detected WND fronts, whereas TH front activity decreases from summer (DJF) to winter (JJA). The main motivation is to give an overview of the performance of these methods, so that researchers can choose the appropriate one for their particular interest.
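As a rough illustration of the thermal (TH) criterion only, the NumPy sketch below computes the horizontal gradient magnitude of equivalent potential temperature on a regular lat-lon grid and flags grid points where it exceeds a threshold; the threshold value, units, and grid-metric handling are example choices, not the settings used in the study.

```python
# Illustrative thermal front mask: gradient magnitude of equivalent potential
# temperature (theta_e) on a regular lat-lon grid, thresholded.
import numpy as np

def thermal_front_mask(theta_e, lat, lon, threshold_k_per_100km=4.0):
    """theta_e: (nlat, nlon) in K; lat, lon: 1-D coordinate arrays in degrees."""
    r_earth = 6.371e6
    dlat = np.deg2rad(np.gradient(lat))                       # radians per grid step
    dlon = np.deg2rad(np.gradient(lon))
    dy = r_earth * dlat[:, None]                               # metres per grid step (lat)
    dx = r_earth * np.cos(np.deg2rad(lat))[:, None] * dlon[None, :]
    dth_dy = np.gradient(theta_e, axis=0) / dy
    dth_dx = np.gradient(theta_e, axis=1) / dx
    grad = np.hypot(dth_dx, dth_dy) * 1e5                      # K per 100 km
    return grad > threshold_k_per_100km
```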
Abstract:
Rolling Circle Amplification (RCA) is an isothermal enzymatic method generating single-stranded DNA products consisting of concatemers containing multiple copies of the reverse complement of the circular template precursor. Little is known about the compatibility of modified nucleoside triphosphates (dN*TPs) with RCA, which would enable the synthesis of long, fully modified ssDNA sequences. Here, dNTPs modified at any position of the scaffold were shown to be compatible with rolling circle amplification, yielding long (>1 kb) and fully modified single-stranded DNA products. This methodology was applied to the generation of long, cytosine-rich synthetic mimics of telomeric DNA. The resulting modified oligonucleotides displayed improved resistance to fetal bovine serum.
Abstract:
AMS-14C applications often require the analysis of small samples. Such is the case for atmospheric aerosols, where frequently only a small amount of sample is available. The ion beam physics group at ETH Zurich has designed an Automated Graphitization Equipment (AGE III) for routine graphite production for AMS analysis from organic samples of approximately 1 mg. In this study, we explore the potential use of the AGE III for graphitization of particulate carbon collected on quartz filters. In order to test the methodology, samples of reference materials and blanks of different sizes were prepared in the AGE III and the graphite was analyzed in a MICADAS AMS (ETH) system. The graphite samples prepared in the AGE III showed recovery yields higher than 80% and reproducible 14C values for masses ranging from 50 to 300 µg. Also, reproducible radiocarbon values were obtained for aerosol filters of small sizes that had been graphitized in the AGE III. As a case study, the tested methodology was applied to PM10 samples collected in two urban cities in Mexico in order to compare the source apportionment of biomass and fossil fuel combustion. The obtained 14C data showed that carbonaceous aerosols from Mexico City have a much lower biogenic signature than those from the smaller city of Cuernavaca.
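For illustration of how a 14C measurement separates biogenic from fossil carbon, here is a minimal sketch of the usual two-source apportionment from a measured fraction modern (F14C); the reference value for purely biogenic carbon is an example, not the one adopted in the study above.

```python
# Two-source (fossil vs. biogenic) apportionment from a measured fraction modern.
# F14C_BIO is an illustrative reference value for contemporary biogenic carbon.
F14C_BIO = 1.05   # example reference; fossil carbon carries F14C = 0

def source_fractions(f14c_sample):
    f_bio = min(f14c_sample / F14C_BIO, 1.0)
    return {"biogenic": f_bio, "fossil": 1.0 - f_bio}

print(source_fractions(0.62))   # e.g. roughly 59% biogenic, 41% fossil carbon
```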