874 results for correlation-based feature selection


Relevance: 40.00%

Abstract:

The application of image-guided systems, with or without support by surgical robots, relies on the accuracy of the navigation process, including patient-to-image registration. The surgeon must carry out the procedure based on the information provided by the navigation system, usually without being able to verify its correctness beyond visual inspection. Misleading surrogate parameters such as the fiducial registration error are often used to describe the success of the registration process, while a lack of methods describing the effects of navigation errors, such as those caused by tracking or calibration, may prevent the application of image guidance in certain accuracy-critical interventions. During minimally invasive mastoidectomy for cochlear implantation, a direct tunnel is drilled from the outside of the mastoid to a target on the cochlea based on registration using landmarks solely on the surface of the skull. Using this methodology, it is impossible to detect whether the drill is advancing in the correct direction and whether injury of the facial nerve will be avoided. To overcome this problem, a tool localization method based on drilling process information is proposed. The algorithm estimates the pose of a robot-guided surgical tool during a drilling task based on the correlation between the observed axial drilling force and the heterogeneous bone density in the mastoid extracted from 3-D image data. We present one possible implementation of this method, tested on ten tunnels drilled into three human cadaver specimens, where an average tool localization accuracy of 0.29 mm was observed.
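A minimal sketch of the force-density correlation idea, assuming a hypothetical sample_density routine that reads bone density from the 3-D image data along a candidate pose; the names and interfaces are illustrative, not from the paper:

```python
import numpy as np

def localize_tool(force_profile, candidate_poses, sample_density):
    """Pick the candidate pose whose CT bone-density profile best matches
    the observed axial drilling force (Pearson correlation)."""
    best_pose, best_r = None, -np.inf
    for pose in candidate_poses:
        density = sample_density(pose)   # density along this candidate tunnel
        r = np.corrcoef(force_profile, density)[0, 1]
        if r > best_r:
            best_pose, best_r = pose, r
    return best_pose, best_r
```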

Relevance: 40.00%

Abstract:

OBJECTIVES Molecular subclassification of non-small-cell lung cancer (NSCLC) is essential to improve clinical outcome. This study assessed the prognostic and predictive value of circulating micro-RNA (miRNA) in patients with non-squamous NSCLC enrolled in the phase II SAKK (Swiss Group for Clinical Cancer Research) trial 19/05, receiving uniform treatment with first-line bevacizumab and erlotinib followed by platinum-based chemotherapy at progression. MATERIALS AND METHODS Fifty patients with baseline and 24 h blood samples were included from SAKK 19/05. The primary study endpoint was to identify miRNAs prognostic for overall survival (OS). Patient samples were analyzed with Agilent human miRNA 8x60K microarrays, each glass slide formatted with eight high-definition 60K arrays. Each array contained 40 probes targeting each of the 1347 miRNAs. Data preprocessing included quantile normalization using the robust multi-array average (RMA) algorithm. Prognostic and predictive miRNA expression profiles were identified by Spearman's rank correlation test (for percentage tumor shrinkage) or log-rank testing (for time-to-event endpoints). RESULTS Data preprocessing kept 49 patients and 424 miRNAs for further analysis. Ten miRNAs were significantly associated with OS, with hsa-miR-29a being the strongest prognostic marker (HR=6.44, 95%-CI 2.39-17.33). Patients with high hsa-miR-29a expression had significantly lower survival at 10 months compared to patients with low expression (54% versus 83%). Six of the 10 miRNAs (hsa-miR-29a, hsa-miR-542-5p, hsa-miR-502-3p, hsa-miR-376a, hsa-miR-500a, hsa-miR-424) were insensitive to perturbations according to jackknife cross-validation on their HR for OS. The respective principal component analysis (PCA) defined a meta-miRNA signature including the same 6 miRNAs, resulting in a HR of 0.66 (95%-CI 0.53-0.82). CONCLUSION Cell-free circulating miRNA profiling successfully identified a highly prognostic 6-gene signature in patients with advanced non-squamous NSCLC. Circulating miRNA profiling should be further validated in external cohorts for the selection and monitoring of systemic treatment in patients with advanced NSCLC.
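As an illustration of the log-rank screen for time-to-event endpoints, a minimal sketch using the lifelines library on synthetic data; the median split and all numbers are assumptions for illustration, not the trial's actual analysis:

```python
import numpy as np
from lifelines.statistics import logrank_test

rng = np.random.default_rng(0)
expression = rng.normal(size=49)            # one miRNA across 49 patients
os_months = rng.exponential(12, size=49)    # overall survival times
observed = rng.integers(0, 2, size=49)      # 1 = death observed, 0 = censored

high = expression > np.median(expression)   # dichotomize at the median
result = logrank_test(os_months[high], os_months[~high],
                      event_observed_A=observed[high],
                      event_observed_B=observed[~high])
print(f"log-rank p-value: {result.p_value:.3f}")
```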

Relevance: 40.00%

Abstract:

Mechanical thrombectomy provides higher recanalization rates than intravenous or intra-arterial thrombolysis. This has now been shown to translate into improved clinical outcome in six multicenter randomized controlled trials. However, clinical outcomes may vary within cohorts depending on the endovascular techniques applied. Systems aiming mainly at thrombus fragmentation and lacking protection against distal embolization have shown disappointing results when compared to recent stent-retriever studies, or even to historical data on local arterial fibrinolysis. Procedure-related embolic events are usually graded as adverse events in interventional neuroradiology. In stroke, however, the clinical consequences of secondary emboli have so far mostly been neglected and attributed to progression of the stroke itself. We summarize the evolution of instruments and techniques for endovascular, image-guided, microneurosurgical recanalization in acute stroke, and discuss how to avoid procedure-related embolic complications.

Relevance: 40.00%

Abstract:

Cancer of the oral cavity and pharynx remains one of the ten leading causes of cancer death in the United States (US). Besides smoking and alcohol consumption, there are no well-established risk factors. While poor dental care has been implicated, it is unknown whether the lack of dental care, implying poor dental hygiene, predisposes to oral cavity cancer. This study aimed to assess the relationship between dental care utilization during the past twelve months and the prevalence of oral cavity cancer. A cross-sectional design based on the National Health Interview Survey of adult, non-institutionalized US residents (n=30,475) was used to assess the association between dental care utilization and self-reported diagnosis of oral cavity cancer. The chi-square statistic was used to examine the crude association between the predictor variable (dental care utilization) and other covariates, while unconditional logistic regression was used to assess the relationship between oral cavity cancer and dental care utilization. There were statistically significant differences between those who utilized dental care during the past twelve months and those who did not with respect to education, income, age, marital status, and gender (p < 0.05), but not health insurance coverage (p = 0.53). Also, those who utilized dental care were 65% less likely to present with oral cavity cancer relative to those who did not: prevalence odds ratio (POR), 0.35; 95% confidence interval (CI), 0.12–0.98. Further, higher income, advanced age, African heritage, and unmarried status were statistically significantly associated with oral cavity cancer (p < 0.05), but health insurance coverage, alcohol use, and smoking were not (p > 0.05). However, after simultaneously controlling for the relevant covariates, the association between dental care and oral cavity cancer did not persist: compared with those who did not use dental care, those who did were 62% less likely to present with oral cavity cancer (adjusted POR, 0.38; 95% CI, 0.13-1.10), an estimate that was no longer statistically significant. Among US adults residing in community settings, use of dental care during the past twelve months did not significantly reduce the predisposition to oral cavity cancer. However, because the cross-sectional data used in this study preclude establishing temporal sequence, a large prospective study that can identify modifiable factors associated with oral cancer development, namely poor dental care, is needed.
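A minimal sketch of estimating an adjusted prevalence odds ratio by unconditional logistic regression, using statsmodels on synthetic data; the covariates and coefficients are illustrative assumptions, not the NHIS variables themselves:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 1000
dental_care = rng.integers(0, 2, n)            # 1 = used dental care
age = rng.normal(50, 12, n)
income = rng.normal(40, 10, n)                 # e.g., in $1000s
logit = -4.0 - 0.9 * dental_care + 0.02 * age  # synthetic true model
cancer = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([dental_care, age, income]))
fit = sm.Logit(cancer, X).fit(disp=0)
por = np.exp(fit.params[1])                    # adjusted POR for dental care
ci = np.exp(fit.conf_int()[1])                 # 95% CI on the odds-ratio scale
print(f"adjusted POR = {por:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```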

Relevance: 40.00%

Abstract:

Radiomics is the high-throughput extraction and analysis of quantitative image features. For non-small cell lung cancer (NSCLC) patients, radiomics can be applied to standard-of-care computed tomography (CT) images to improve tumor diagnosis, staging, and response assessment. The first objective of this work was to show that CT image features extracted from pre-treatment NSCLC tumors could be used to predict tumor shrinkage in response to therapy. This is important since tumor shrinkage is an important cancer treatment endpoint that is correlated with probability of disease progression and overall survival. Accurate prediction of tumor shrinkage could also lead to individually customized treatment plans. To accomplish this objective, 64 NSCLC patients with similar stage and treatment were all imaged using the same CT scanner and protocol. Quantitative image features were extracted, and principal component regression with simulated annealing subset selection was used to predict shrinkage. Cross-validation and permutation tests were used to validate the results. The optimal model gave a strong correlation between the observed and predicted shrinkages. The second objective of this work was to identify sets of NSCLC CT image features that are reproducible, non-redundant, and informative across multiple machines. Feature sets with these qualities are needed for NSCLC radiomics models to be robust to machine variation and spurious correlation. To accomplish this objective, test-retest CT image pairs were obtained from 56 NSCLC patients imaged on three CT machines from two institutions. For each machine, quantitative image features with concordance correlation coefficient values greater than 0.90 were considered reproducible. Multi-machine reproducible feature sets were created by taking the intersection of the individual machine reproducible feature sets. Redundant features were removed through hierarchical clustering. The findings showed that image feature reproducibility and redundancy depended on both the CT machine and the CT image type (average cine 4D-CT imaging vs. end-exhale cine 4D-CT imaging vs. helical inspiratory breath-hold 3D CT). For each image type, a set of cross-machine reproducible, non-redundant, and informative image features was identified. Compared to end-exhale 4D-CT and breath-hold 3D-CT, average 4D-CT derived image features showed superior multi-machine reproducibility and are the best candidates for clinical correlation.
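A minimal sketch of the test-retest reproducibility filter: compute Lin's concordance correlation coefficient (CCC) per feature across the two scans and keep those above 0.90; the feature matrices here are synthetic placeholders:

```python
import numpy as np

def ccc(x, y):
    """Lin's concordance correlation coefficient between two 1-D arrays."""
    mx, my = x.mean(), y.mean()
    cov = ((x - mx) * (y - my)).mean()
    return 2 * cov / (x.var() + y.var() + (mx - my) ** 2)

rng = np.random.default_rng(2)
test = rng.normal(size=(56, 200))                  # 56 patients x 200 features
retest = test + rng.normal(0, 0.2, size=test.shape)

reproducible = [j for j in range(test.shape[1])
                if ccc(test[:, j], retest[:, j]) > 0.90]
print(f"{len(reproducible)} of {test.shape[1]} features are reproducible")
```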

Relevance: 40.00%

Abstract:

The genomic era brought about by recent advances in next-generation sequencing technology makes genome-wide scans for natural selection a reality. Currently, almost all statistical tests and analytical methods for identifying genes under selection are performed on an individual-gene basis. Although these methods have the power to identify genes subject to strong selection, they have limited power for discovering genes targeted by moderate or weak selection forces, which are crucial for understanding the molecular mechanisms of complex phenotypes and diseases. The recent availability and rapid growth of gene network and protein-protein interaction databases accompanying the genomic era open avenues for enhancing the power of discovering genes under natural selection. The aim of this thesis is to explore and develop normal-mixture-model-based methods for leveraging gene network information to enhance the power of natural selection target gene discovery. The results show that the developed statistical method, which combines the posterior log odds of the standard normal mixture model and the Guilt-By-Association score of the gene network in a naive Bayes framework, has the power to discover genes under moderate or weak selection that bridge the genes under strong selection, aiding our understanding of the biology underlying complex diseases and related natural selection phenotypes.
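A minimal sketch of the scoring idea: posterior log odds from a two-component normal mixture fitted to a per-gene selection statistic, added to a Guilt-By-Association score from the network (a simple naive Bayes-style combination). The data, the random network, and the neighbor-averaging GBA score are all illustrative assumptions, not the thesis's actual method:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)
stat = np.concatenate([rng.normal(0, 1, 900),      # neutral genes
                       rng.normal(2.5, 1, 100)])   # genes under selection

# Posterior log odds of the "selected" component from a 2-component mixture.
gm = GaussianMixture(n_components=2, random_state=0).fit(stat.reshape(-1, 1))
post = gm.predict_proba(stat.reshape(-1, 1))
sel = int(np.argmax(gm.means_.ravel()))            # component with larger mean
log_odds = np.log(post[:, sel] + 1e-12) - np.log(post[:, 1 - sel] + 1e-12)

# Hypothetical Guilt-By-Association score: mean log odds of network neighbors.
adj = (rng.random((1000, 1000)) < 0.01).astype(float)  # placeholder network
np.fill_diagonal(adj, 0.0)
gba = adj @ log_odds / np.maximum(adj.sum(1), 1)

combined = log_odds + gba                          # naive Bayes combination
print("top candidates:", np.argsort(combined)[-5:])
```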

Relevance: 40.00%

Abstract:

One of the most widely used methods in rapid prototyping is Fused Deposition Modeling (FDM), which provides components with reasonable strength in plastic materials such as ABS and has a low environmental impact. However, the FDM process exhibits low levels of surface finish, difficulty in producing complex and/or small geometries, and low consistency in "slim" elements of the parts. Furthermore, "cantilever" elements need large supporting material structures. Overcoming these deficiencies requires a comprehensive review of three-dimensional part design to enhance the advantages and performance of FDM and reduce its constraints. As a key feature of this redesign, a novel method of construction by assembling parts with structural adhesive joints is proposed. These adhesive joints should be designed specifically to suit the plastic substrate and the FDM manufacturing technology. To achieve this, the most suitable structural adhesive must first be selected. Therefore, the present work analyzes five families of adhesives (cyanoacrylate, polyurethane, epoxy, acrylic, and silicone) and applies multi-criteria decision analysis based on the analytic hierarchy process (AHP) to select the structural adhesive that best combines mechanical performance with adaptation to the FDM manufacturing process.
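A minimal sketch of the AHP step at the heart of such a selection: criterion weights are derived from a pairwise comparison matrix via its principal eigenvector, followed by a consistency check; the matrix values and criteria are illustrative, not the study's actual judgments:

```python
import numpy as np

# Saaty-scale pairwise comparisons of three example criteria,
# e.g. mechanical strength vs. cure time vs. cost.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                # criterion priority vector

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)    # consistency index
cr = ci / 0.58                          # random index RI = 0.58 for n = 3
print(f"weights = {weights.round(3)}")
print(f"consistency ratio = {cr:.3f}")  # CR < 0.10 is conventionally acceptable
```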

Relevance: 40.00%

Abstract:

Mass spectrometry (MS) data provide a promising strategy for biomarker discovery. For this purpose, the detection of relevant peak bins in MS data is currently under intense research. Data from mass spectrometry are challenging to analyze because of their high dimensionality and the generally low number of samples available. To tackle this problem, the scientific community is becoming increasingly interested in applying feature subset selection techniques based on specialized machine learning algorithms. In this paper, we present a performance comparison of several metaheuristics: best first (BF), genetic algorithm (GA), scatter search (SS), and variable neighborhood search (VNS). To our knowledge, all of these algorithms except GA are applied here for the first time to detect relevant peak bins in MS data. All these metaheuristic searches are embedded in two different schemes, filter and wrapper, coupled with naive Bayes and SVM classifiers.
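A minimal sketch of a wrapper scheme in this spirit: a greedy forward search (a simple stand-in for the best-first strategy) over feature subsets, scored by cross-validated naive Bayes accuracy; the data are synthetic placeholders for peak-bin intensities:

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
X = rng.normal(size=(60, 100))               # 60 spectra, 100 peak bins
y = (X[:, 3] + X[:, 17] > 0).astype(int)     # labels depend on two bins

selected, remaining, best = [], set(range(X.shape[1])), 0.0
while remaining:
    scores = {j: cross_val_score(GaussianNB(), X[:, selected + [j]], y,
                                 cv=5).mean() for j in remaining}
    j, s = max(scores.items(), key=lambda kv: kv[1])
    if s <= best:                            # stop when accuracy stalls
        break
    selected.append(j); remaining.remove(j); best = s
print(f"selected peak bins: {selected}, CV accuracy: {best:.2f}")
```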

Relevance: 40.00%

Abstract:

The selection of predefined analytic grids (partitions of the numeric ranges) to represent input and output functions as histograms has been proposed as an approximation mechanism for controlling the tradeoff between accuracy and computation time in several areas ranging from simulation to constraint solving. In particular, the application of interval methods for probabilistic function characterization has been shown to have advantages over other methods based on the simulation of random samples. However, standard interval arithmetic has always been used for the computation steps. In this paper, we introduce an alternative approximate arithmetic aimed at controlling the cost of the interval operations. Its distinctive feature is that the operators take the grids into account. We apply the technique in the context of probability density functions in order to improve the accuracy of the probability estimates. Results show that this approach has advantages over existing approaches in some particular situations, although computation times tend to increase significantly when analyzing large functions.
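A minimal sketch of what a grid-aware interval operator could look like, under the assumption that each result is snapped outward to the predefined grid so that intervals always align with grid points; this illustrates the general idea only, not the paper's actual arithmetic:

```python
import bisect

def snap_outward(lo, hi, grid):
    """Widen [lo, hi] to the nearest enclosing grid points."""
    i = bisect.bisect_right(grid, lo) - 1        # largest grid point <= lo
    j = bisect.bisect_left(grid, hi)             # smallest grid point >= hi
    return grid[max(i, 0)], grid[min(j, len(grid) - 1)]

def grid_add(a, b, grid):
    """Interval addition followed by outward snapping to the grid."""
    return snap_outward(a[0] + b[0], a[1] + b[1], grid)

grid = [x / 4 for x in range(-40, 41)]           # uniform grid on [-10, 10]
print(grid_add((0.3, 0.7), (1.1, 1.4), grid))    # -> (1.25, 2.25)
```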