994 results for Correlation algorithm


Relevance: 30.00%

Abstract:

AIMS: c-Met is an emerging biomarker in pancreatic ductal adenocarcinoma (PDAC); there is no consensus regarding the immunostaining scoring method for this marker. We aimed to assess the prognostic value of c-Met overexpression in resected PDAC, and to develop a robust and reproducible scoring method for c-Met immunostaining in this setting. METHODS AND RESULTS: c-Met immunostaining was graded according to the validated MetMab score, a classic visual scale combining surface and intensity (SI score), or a simplified score (high c-Met: ≥20% of tumour cells with strong membranous staining), in stage I-II PDAC. A computer-assisted classification method (Aperio software) was developed. Clinicopathological parameters were correlated with disease-free survival (DFS) and overall survival (OS). One hundred and forty-nine patients were analysed retrospectively in a two-step process. Thirty-seven samples (whole slides) were analysed as a pre-run test. Reproducibility values were optimal with the simplified score (kappa = 0.773); high c-Met expression (7/37) was associated with shorter DFS [hazard ratio (HR) 3.456, P = 0.0036] and OS (HR 4.257, P = 0.0004). c-Met expression was concordant on whole slides and tissue microarrays in 87.9% of samples, and quantifiable with a specific computer-assisted algorithm. In the whole cohort (n = 131), patients with c-Met(high) tumours (36/131) had significantly shorter DFS (9.3 versus 20.0 months, HR 2.165, P = 0.0005) and OS (18.2 versus 35.0 months, HR 1.832, P = 0.0098) in univariate and multivariate analyses. CONCLUSIONS: The simplified c-Met score is an independent prognostic marker in stage I-II PDAC that may help to identify patients with a high risk of tumour relapse and poor survival.
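A minimal sketch of the simplified scoring rule, assuming per-cell staining calls are already available; the function name and example counts are hypothetical:

```python
# Simplified c-Met score: a sample is 'high' if >= 20% of tumour cells
# show strong membranous staining. Hypothetical illustration only.
def simplified_cmet_score(n_strong_membranous: int, n_tumour_cells: int,
                          threshold: float = 0.20) -> str:
    if n_tumour_cells == 0:
        raise ValueError("no tumour cells counted")
    fraction = n_strong_membranous / n_tumour_cells
    return "high" if fraction >= threshold else "low"

print(simplified_cmet_score(250, 1000))  # -> 'high' (25% strong staining)
```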

Relevance: 30.00%

Abstract:

This paper describes Question Waves, an algorithm that can be applied to social search protocols such as Asknext or Sixearch. In this model, queries are propagated through the social network, with faster propagation through more trusted acquaintances. Question Waves uses only local information to make decisions and obtain an answer ranking. With Question Waves, the answers that arrive first are the most likely to be relevant, and we computed the correlation of answer relevance with order of arrival to demonstrate this result. We obtained correlations equivalent to those of heuristics that use global knowledge, such as profile similarity among users or the expertise value of an agent. Because Question Waves is compatible with the social search protocol Asknext, it is possible to stop a search when enough relevant answers have been found; additionally, stopping the search early introduces only a minimal risk of missing the best possible answer. Furthermore, Question Waves does not require a re-ranking algorithm because the results arrive sorted.
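The propagation mechanism can be sketched as follows (an illustration, not the authors' implementation): model each hop's delay as inversely proportional to the trust placed in the acquaintance, so queries reach trusted contacts sooner and their answers arrive first. The graph and names are hypothetical:

```python
import heapq

# Propagate a query through a trust-weighted social graph; edge delay is
# modelled as 1/trust, so higher trust means a faster hop. 'graph' maps
# each user to (neighbour, trust) pairs.
def propagate(graph, source):
    arrival = {source: 0.0}
    queue = [(0.0, source)]
    while queue:
        t, user = heapq.heappop(queue)
        if t > arrival.get(user, float("inf")):
            continue
        for neighbour, trust in graph.get(user, []):
            t_new = t + 1.0 / trust  # higher trust -> earlier arrival
            if t_new < arrival.get(neighbour, float("inf")):
                arrival[neighbour] = t_new
                heapq.heappush(queue, (t_new, neighbour))
    return arrival  # earlier arrival ~ more trusted path, hence likely relevance

g = {"a": [("b", 0.9), ("c", 0.2)], "b": [("d", 0.8)], "c": [("d", 0.5)]}
print(sorted(propagate(g, "a").items(), key=lambda kv: kv[1]))
```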

Relevance: 30.00%

Abstract:

The application of automated correlation optimized warping (ACOW) to the correction of retention time shifts in the chromatographic fingerprints of Radix Puerariae thomsonii (RPT) was investigated. Twenty-seven samples were extracted from 9 batches of RPT products, and the fingerprints of the 27 samples were established by HPLC. Because of retention time shifts in the established fingerprints, the quality of these samples could not be correctly evaluated using similarity estimation and principal component analysis (PCA). Thus, the ACOW method was used to align the fingerprints. In the ACOW procedure, the warping parameters, which have a significant influence on the alignment result, were optimized by an automated algorithm. After correction of the retention time shifts, the quality of the RPT samples was correctly evaluated by similarity estimation and PCA. It is demonstrated that ACOW is a practical method for aligning the chromatographic fingerprints of RPT, and the combination of ACOW, similarity estimation, and PCA is shown to be a promising approach for evaluating the quality of Traditional Chinese Medicine.
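For illustration only, the sketch below corrects a single global retention time shift by maximising the correlation over candidate shifts; full (A)COW instead warps each fingerprint piecewise via dynamic programming, with the warping parameters chosen automatically:

```python
import numpy as np

# Simplified stand-in for correlation-based alignment: find the integer
# sample shift that maximises the correlation between two chromatograms.
def best_shift(reference, sample, max_shift=50):
    def corr_at(s):
        a, b = (reference[s:], sample[:len(sample) - s]) if s >= 0 else \
               (reference[:s], sample[-s:])
        return np.corrcoef(a, b)[0, 1]
    return max(range(-max_shift, max_shift + 1), key=corr_at)

t = np.linspace(0, 10, 500)
ref = np.exp(-((t - 4.0) ** 2) / 0.05)      # peak at t = 4.0
shifted = np.exp(-((t - 4.4) ** 2) / 0.05)  # same peak, shifted later
print(best_shift(ref, shifted))             # ~ -20 samples
```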

Relevance: 30.00%

Abstract:

In this thesis we analyze dictionary graphs and some other kinds of graphs using the PageRank algorithm. We calculated the correlation between the degree and the PageRank of all nodes for graphs obtained from the Merriam-Webster dictionary, a French dictionary, and the WordNet hypernym and synonym dictionaries. Our conclusion was that PageRank can be a good tool for comparing the quality of dictionaries. We also studied some artificial social and random graphs. We found that when we omitted some random nodes from each of the graphs, there were no significant changes in the ranking of the nodes according to their PageRank. We also discovered that some of the social graphs selected for our study were less resistant to such changes in PageRank.
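A minimal sketch of the degree/PageRank comparison, using a random directed graph as a stand-in for a dictionary graph:

```python
import networkx as nx
from scipy.stats import spearmanr

# Build a random directed graph, compute PageRank and in-degree, and
# measure how strongly the two rankings agree.
G = nx.gnp_random_graph(500, 0.02, seed=1, directed=True)
pagerank = nx.pagerank(G, alpha=0.85)
degree = dict(G.in_degree())

nodes = list(G.nodes())
rho, _ = spearmanr([degree[n] for n in nodes], [pagerank[n] for n in nodes])
print(f"degree/PageRank rank correlation: {rho:.3f}")
```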

Relevance: 30.00%

Abstract:

The interaction of short intense laser pulses with atoms/molecules produces a multitude of highly nonlinear processes requiring a non-perturbative treatment. Detailed study of these highly nonlinear processes by numerically solving the time-dependent Schrödinger equation becomes a daunting task when the number of degrees of freedom is large, and the coupling between the electronic and nuclear degrees of freedom further aggravates the computational problem. In the present work we show that the time-dependent Hartree (TDH) approximation, which neglects correlation effects, gives an unreliable description of the system dynamics both in the absence and in the presence of an external field. A theoretical framework is required that treats the electrons and nuclei on an equal footing and fully quantum mechanically. To address this issue we discuss two approaches, namely multicomponent density functional theory (MCDFT) and the multiconfiguration time-dependent Hartree (MCTDH) method, that go beyond the TDH approximation and describe the correlated electron-nuclear dynamics accurately. In the MCDFT framework, where the time-dependent electronic and nuclear densities are the basic variables, we discuss an algorithm to calculate the exact Kohn-Sham (KS) potentials for small model systems. By simulating the photodissociation process in a model hydrogen molecular ion, we show that the exact KS potentials contain all the many-body effects and give insight into the system dynamics. In the MCTDH approach, the wave function is expanded as a sum of products of single-particle functions (SPFs). The MCTDH method is able to describe electron-nuclear correlation effects, as the SPFs and the expansion coefficients evolve in time and give an accurate description of the system dynamics. We show that the MCTDH method is suitable for studying a variety of processes, such as the fragmentation of molecules, high-order harmonic generation, the two-center interference effect, and the lochfrass effect. We discuss these phenomena in a model hydrogen molecular ion and a model hydrogen molecule. The inclusion of absorbing boundaries in the mean-field approximation and its consequences are discussed using the model hydrogen molecular ion. To this end, two types of calculations are considered: (i) a variational approach with a complex absorbing potential included in the full many-particle Hamiltonian, and (ii) an approach in the spirit of time-dependent density functional theory (TDDFT), including complex absorbing potentials in the single-particle equations. It is elucidated that for small grids the TDDFT approach is superior to the variational approach.
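As a one-degree-of-freedom illustration of grid-based propagation (far simpler than the MCTDH/MCDFT machinery discussed above), a split-operator solver for a 1D model time-dependent Schrödinger equation might look like this, in atomic units and with a harmonic model potential as an assumption:

```python
import numpy as np

# Strang-split propagation: half-step in V, full kinetic step in k-space,
# half-step in V. Cost grows exponentially with degrees of freedom, which
# is why correlated multi-particle dynamics needs methods like MCTDH.
n, L = 512, 40.0
x = np.linspace(-L / 2, L / 2, n, endpoint=False)
dx = x[1] - x[0]
k = 2 * np.pi * np.fft.fftfreq(n, d=dx)
dt = 0.01

V = 0.5 * x**2                                 # harmonic model potential
psi = np.exp(-(x - 1.0) ** 2)                  # displaced Gaussian packet
psi /= np.sqrt(np.sum(np.abs(psi) ** 2) * dx)

expV = np.exp(-0.5j * V * dt)                  # exp(-i V dt/2)
expT = np.exp(-0.5j * k**2 * dt)               # exp(-i k^2/2 dt)

for _ in range(1000):
    psi = expV * np.fft.ifft(expT * np.fft.fft(expV * psi))

print("norm:", np.sum(np.abs(psi) ** 2) * dx)  # ~1: propagation is unitary
```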

Relevance: 30.00%

Abstract:

This paper proposes a parallel architecture for estimating the motion of an underwater robot. It is well known that image processing requires a huge amount of computation, mainly at the low level, where the algorithms deal with large amounts of data. In a motion estimation algorithm, correspondences between two images have to be solved at the low level. In underwater imaging, normalised correlation can be a solution in the presence of non-uniform illumination. Due to its regular processing scheme, a parallel implementation of the correspondence problem is an adequate approach to reduce the computation time. Taking into consideration the complexity of the normalised correlation criterion, a new approach using the parallel organisation of every processor in the architecture is proposed.
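A sketch of the normalised correlation criterion at the heart of the correspondence problem; mean subtraction and scaling by the standard deviation make the score insensitive to non-uniform illumination (the windows below are synthetic):

```python
import numpy as np

# Normalised correlation between two image windows: invariant to additive
# and multiplicative illumination changes. This per-window score is the
# inner loop a parallel architecture would accelerate.
def normalised_correlation(patch_a, patch_b):
    a = patch_a - patch_a.mean()
    b = patch_b - patch_b.mean()
    return float(np.sum(a * b) / (np.sqrt(np.sum(a**2) * np.sum(b**2)) + 1e-12))

rng = np.random.default_rng(0)
w = rng.random((16, 16))
print(normalised_correlation(w, 0.5 * w + 0.2))  # ~1.0 despite lighting change
```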

Relevance: 30.00%

Abstract:

This paper introduces a new neurofuzzy model construction and parameter estimation algorithm from observed finite data sets, based on a Takagi-Sugeno (T-S) inference mechanism and a new extended Gram-Schmidt orthogonal decomposition algorithm, for the modeling of a priori unknown dynamical systems in the form of a set of fuzzy rules. The first contribution of the paper is the introduction of a one-to-one mapping between a fuzzy rule-base and a model matrix feature subspace using the T-S inference mechanism. This link enables the numerical properties associated with a rule-based matrix subspace, the relationships amongst these matrix subspaces, and the correlation between the output vector and a rule-base matrix subspace to be investigated and extracted as rule-based knowledge to enhance model transparency. The matrix subspace spanned by a fuzzy rule is initially derived as the input regression matrix multiplied by a weighting matrix that consists of the corresponding fuzzy membership functions over the training data set. Model transparency is explored through the derivation of an equivalence between an A-optimality experimental design criterion of the weighting matrix and the average model output sensitivity to the fuzzy rule, so that rule-bases can be effectively measured by their identifiability via the A-optimality criterion. The A-optimality criterion of the weighting matrices of the fuzzy rules is used to construct an initial model rule-base. An extended Gram-Schmidt algorithm is then developed to estimate the parameter vector for each rule. This new algorithm decomposes the model rule-bases via an orthogonal subspace decomposition approach, so as to enhance model transparency with the capability of interpreting the derived rule-base energy level. The new approach is computationally simpler than the conventional Gram-Schmidt algorithm for resolving high-dimensional regression problems, where it is computationally desirable to decompose a complex model into a few submodels rather than a single model with a large number of input variables and the associated curse of dimensionality. Numerical examples are included to demonstrate the effectiveness of the proposed algorithm.
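For reference, a minimal modified Gram-Schmidt decomposition of a regression matrix, the classical building block that the paper extends to rule-base subspace level (this sketch is the conventional algorithm, not the paper's extension):

```python
import numpy as np

# Modified Gram-Schmidt: factor A = Q R with orthonormal columns of Q,
# subtracting each new direction from the working vector as we go.
def modified_gram_schmidt(A):
    A = A.astype(float).copy()
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j]
        for i in range(j):
            R[i, j] = Q[:, i] @ v
            v = v - R[i, j] * Q[:, i]
        R[j, j] = np.linalg.norm(v)
        Q[:, j] = v / R[j, j]
    return Q, R

A = np.random.default_rng(1).random((20, 4))
Q, R = modified_gram_schmidt(A)
print(np.allclose(Q @ R, A), np.allclose(Q.T @ Q, np.eye(4)))
```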

Relevance: 30.00%

Abstract:

The study of the association between two random variables that have a joint normal distribution is of interest in applied statistics, for example in statistical genetics. This article, targeted at applied statisticians, addresses inferences about the coefficient of correlation (ρ) in the bivariate normal and standard bivariate normal distributions from likelihood, frequentist, and Bayesian perspectives. Some results are surprising. For instance, the maximum likelihood estimator and the posterior distribution of ρ in the standard bivariate normal distribution do not follow directly from results for a general bivariate normal distribution. An example employing bootstrap and rejection sampling procedures is used to illustrate some of the peculiarities.
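A sketch of a nonparametric bootstrap for ρ from a bivariate normal sample (the rejection-sampling posterior computation is not reproduced here):

```python
import numpy as np

# Resample (x, y) pairs with replacement and recompute the sample
# correlation to obtain a bootstrap interval for rho.
rng = np.random.default_rng(42)
n, rho_true = 100, 0.6
cov = [[1.0, rho_true], [rho_true, 1.0]]
xy = rng.multivariate_normal([0.0, 0.0], cov, size=n)

boot = []
for _ in range(2000):
    idx = rng.integers(0, n, size=n)   # resample pairs with replacement
    sample = xy[idx]
    boot.append(np.corrcoef(sample[:, 0], sample[:, 1])[0, 1])

lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"bootstrap 95% CI for rho: ({lo:.3f}, {hi:.3f})")
```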

Relevance: 30.00%

Abstract:

Traditional Monte Carlo simulations of QCD in the presence of a baryon chemical potential are plagued by the complex phase problem, and new numerical approaches are necessary for studying the phase diagram of the theory. In this work we consider a ℤ3 Polyakov loop model for the deconfining phase transition in QCD and discuss how a flux representation of the model in terms of dimer and monomer variables solves the complex action problem. We present results of numerical simulations using a worm algorithm for the specific heat and the two-point correlation function of Polyakov loops. Evidence for a first-order deconfinement phase transition is discussed. © 2013 American Institute of Physics.
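A generic estimator sketch, not the worm algorithm itself: given sampled ℤ3-valued configurations on a chain (a purely illustrative setup), the two-point Polyakov loop correlation function can be estimated as follows:

```python
import numpy as np

# Estimate G(r) = <P(0) P*(r)> from an ensemble of Z_3 'Polyakov loop'
# configurations P(x) = exp(2*pi*i*n(x)/3) on a 1D chain, averaging over
# sites and configurations. The random configs here are uncorrelated.
rng = np.random.default_rng(0)
L_size, n_cfg = 64, 400
configs = np.exp(2j * np.pi * rng.integers(0, 3, size=(n_cfg, L_size)) / 3)

def two_point(configs, r):
    prod = configs * np.conj(np.roll(configs, -r, axis=1))
    return prod.mean().real

print([round(two_point(configs, r), 4) for r in range(4)])
# uncorrelated random configs: G(0) = 1, G(r > 0) ~ 0
```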

Relevance: 30.00%

Abstract:

The aim of this study was to estimate the genetic, environmental, and phenotypic correlations between birth weight (BW) and weight at 205 days of age (W205), between BW and weight at 365 days of age (W365), and between W205 and W365, using Bayesian inference. The Brazilian Program for Genetic Improvement of Buffaloes provided the data, which included 3,883 observations from Mediterranean breed buffaloes. To estimate the variances and covariances, bivariate analyses were performed using the Gibbs sampler included in the MTGSAM software. The model for BW, W205, and W365 included additive direct and maternal genetic random effects, a maternal environmental random effect, and contemporary group as a fixed effect. Convergence was diagnosed with the Geweke method, using an algorithm implemented in R through the Bayesian Output Analysis package. The estimated direct genetic correlations were 0.34 (BW-W205), 0.25 (BW-W365), and 0.74 (W205-W365). The environmental correlations were 0.12, 0.11, and 0.72 between BW-W205, BW-W365, and W205-W365, respectively. The phenotypic correlations were low for BW-W205 (0.01) and BW-W365 (0.04), unlike that obtained for W205-W365 (0.67). The results indicate that the BW trait has low genetic, environmental, and phenotypic associations with the two other traits. The genetic correlation between W205 and W365 was high, suggesting that selection for weight at around 205 days could be beneficial for accelerating genetic gain.
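A sketch of how a genetic correlation and its credible interval can be summarised from Gibbs-sampler draws of (co)variance components; the draws below are simulated stand-ins for MTGSAM output:

```python
import numpy as np

# One correlation per posterior draw: r_g = cov12 / sqrt(var1 * var2),
# then summarise the resulting posterior sample.
rng = np.random.default_rng(7)
n_draws = 5000
var1 = rng.gamma(50, 0.02, n_draws)    # additive variance, trait 1 (simulated)
var2 = rng.gamma(50, 0.03, n_draws)    # additive variance, trait 2 (simulated)
cov12 = 0.7 * np.sqrt(var1 * var2) + rng.normal(0, 0.01, n_draws)

r_g = cov12 / np.sqrt(var1 * var2)
print(f"posterior mean r_g = {r_g.mean():.2f}, "
      f"95% CI = {np.percentile(r_g, [2.5, 97.5]).round(2)}")
```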

Relevance: 30.00%

Abstract:

The Advanced Very High Resolution Radiometer (AVHRR) carried on board the National Oceanic and Atmospheric Administration (NOAA) and Meteorological Operational Satellite (MetOp) polar orbiting satellites is the only instrument offering more than 25 years of satellite data for analysing aerosols on a daily basis. The present study assessed a modified AVHRR aerosol optical depth (τa) retrieval over land for Europe. The algorithm could also be applied to other parts of the world with surface characteristics similar to Europe's; only the aerosol properties would have to be adapted to the new region. The initial approach used a relationship between Sun photometer measurements from the Aerosol Robotic Network (AERONET) and the satellite data to post-process the retrieved τa. Herein a quasi-stand-alone procedure, which is more suitable for the pre-AERONET era, is presented. In addition, the estimation of surface reflectance, the aerosol model, and other processing steps have been adapted. The method's cross-platform applicability was tested by validating τa from NOAA-17 and NOAA-18 AVHRR at 15 AERONET sites in Central Europe (40.5° N–50° N, 0° E–17° E) from August 2005 to December 2007. Furthermore, the accuracy of the AVHRR retrieval was compared with products from two newer instruments: the Medium Resolution Imaging Spectrometer (MERIS) on board the Environmental Satellite (ENVISAT) and the Moderate Resolution Imaging Spectroradiometer (MODIS) on board Aqua/Terra. Considering the linear correlation coefficient R, the AVHRR results were similar to those of MERIS, with an even lower root mean square error (RMSE). Not surprisingly, MODIS, with its high spectral coverage, gave the highest R and lowest RMSE. Regarding monthly averaged τa, the results were ambiguous. Focusing on small-scale structures, R was reduced for all sensors, whereas the RMSE increased substantially only for MERIS. For larger areas like Central Europe, the error statistics were similar to those of the individual match-ups; this was mainly explained by sampling issues. With the successful validation of AVHRR we are now able to concentrate on our large data archive dating back to 1985, a unique opportunity for both climate and air pollution studies over land surfaces.
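The match-up statistics used in such validations, the linear correlation coefficient R and the RMSE, can be computed as below; the arrays are placeholders for real AERONET/satellite match-up data:

```python
import numpy as np

# Compare satellite-retrieved tau_a against ground-truth AERONET tau_a.
def matchup_stats(tau_satellite, tau_aeronet):
    r = np.corrcoef(tau_satellite, tau_aeronet)[0, 1]
    rmse = np.sqrt(np.mean((tau_satellite - tau_aeronet) ** 2))
    return r, rmse

rng = np.random.default_rng(3)
truth = rng.uniform(0.05, 0.6, 200)           # placeholder AERONET tau_a
retrieved = truth + rng.normal(0, 0.05, 200)  # placeholder retrieval
r, rmse = matchup_stats(retrieved, truth)
print(f"R = {r:.2f}, RMSE = {rmse:.3f}")
```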

Relevance: 30.00%

Abstract:

We study a homogeneously driven granular fluid of hard spheres at intermediate volume fractions and focus on time-delayed correlation functions in the stationary state. Inelastic collisions are modeled by incomplete normal restitution, allowing for efficient simulations with an event-driven algorithm. The incoherent scattering function Fincoh(q,t) is seen to follow time-density superposition with a relaxation time that increases significantly as the volume fraction increases. The statistics of particle displacements is approximately Gaussian. For the coherent scattering function S(q,ω), we compare our results to the predictions of generalized fluctuating hydrodynamics, which takes into account that temperature fluctuations decay either diffusively or with a finite relaxation rate, depending on wave number and inelasticity. For sufficiently small wave number q we observe sound waves in the coherent scattering function S(q,ω) and the longitudinal current correlation function Cl(q,ω). We determine the speed of sound and the transport coefficients and compare them to the results of kinetic theory.
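A sketch of the standard estimator for the incoherent (self) intermediate scattering function from stored trajectories; the random-walk positions below stand in for event-driven simulation output:

```python
import numpy as np

# F_incoh(q, t) = < 1/N sum_j exp(i q . [r_j(t) - r_j(0)]) >:
# phase of each particle's displacement since t = 0, averaged over particles.
def f_incoh(positions, q):
    # positions: array of shape (n_times, n_particles, 3)
    dr = positions - positions[0]     # displacements relative to t = 0
    phase = np.exp(1j * dr @ q)       # per particle, per time step
    return phase.mean(axis=1).real    # average over particles

rng = np.random.default_rng(5)
steps = rng.normal(0, 0.05, (200, 1000, 3))
traj = np.cumsum(steps, axis=0)       # synthetic diffusive trajectories
print(f_incoh(traj, q=np.array([2 * np.pi, 0, 0]))[:5])  # decays from 1
```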

Relevance: 30.00%

Abstract:

In this article, the authors evaluate a merit function for 2D/3D registration called stochastic rank correlation (SRC). SRC is characterized by the fact that differences in image intensity do not influence the registration result; it therefore combines the numerical advantages of cross-correlation (CC)-type merit functions with the flexibility of mutual-information-type merit functions. The basic idea is that registration is achieved on a random subset of the image, which allows for efficient computation of Spearman's rank correlation coefficient. This measure is, by nature, invariant to monotonic intensity transforms in the images under comparison, which renders it an ideal solution for intramodal images acquired at different energy levels, as encountered in intrafractional kV imaging in image-guided radiotherapy (IGRT). Initial evaluation was undertaken using a 2D/3D registration reference image dataset of a cadaver spine. Even with no radiometric calibration, SRC showed a significant improvement in robustness and stability compared to CC. Pattern intensity, another merit function evaluated for comparison, gave rather poor results due to its limited convergence range. The time required for SRC with 5% image content compares well to the other merit functions; increasing the image content does not significantly influence the algorithm's accuracy. The authors conclude that SRC is a promising measure for 2D/3D registration in IGRT and image-guided therapy in general.
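The core of SRC can be sketched as Spearman's rank correlation evaluated on a random subset of pixels; being rank-based, the score is unchanged by any monotonic intensity transform (function and variable names here are illustrative):

```python
import numpy as np
from scipy.stats import spearmanr

# Rank correlation on a random 5% pixel subset: cheap to compute and
# invariant to monotonic intensity changes between the two images.
def src(drr, xray, fraction=0.05, rng=None):
    if rng is None:
        rng = np.random.default_rng()
    idx = rng.choice(drr.size, size=int(fraction * drr.size), replace=False)
    rho, _ = spearmanr(drr.ravel()[idx], xray.ravel()[idx])
    return rho

rng = np.random.default_rng(11)
image = rng.random((256, 256))
gamma_transformed = image ** 2.2               # monotonic intensity change
print(src(image, gamma_transformed, rng=rng))  # ~1.0 despite the transform
```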

Relevance: 30.00%

Abstract:

The application of image-guided systems, with or without support from surgical robots, relies on the accuracy of the navigation process, including patient-to-image registration. The surgeon must carry out the procedure based on the information provided by the navigation system, usually without being able to verify its correctness beyond visual inspection. Misleading surrogate parameters such as the fiducial registration error are often used to describe the success of the registration process, while a lack of methods describing the effects of navigation errors, such as those caused by tracking or calibration, may prevent the application of image guidance in certain accuracy-critical interventions. During minimally invasive mastoidectomy for cochlear implantation, a direct tunnel is drilled from the outside of the mastoid to a target on the cochlea, based on registration using landmarks solely on the surface of the skull. With this methodology, it is impossible to detect whether the drill is advancing in the correct direction and whether injury to the facial nerve will be avoided. To overcome this problem, a tool localization method based on drilling process information is proposed. The algorithm estimates the pose of a robot-guided surgical tool during a drilling task based on the correlation of the observed axial drilling force with the heterogeneous bone density in the mastoid extracted from 3-D image data. We present one possible implementation of this method, tested on ten tunnels drilled into three human cadaver specimens, where an average tool localization accuracy of 0.29 mm was observed.
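A sketch of the localisation idea: correlate the recorded axial force profile with bone-density profiles sampled along candidate poses and keep the best match. The profiles here are hypothetical stand-ins for values extracted from 3-D image data:

```python
import numpy as np

# Score each candidate pose by the correlation between the measured force
# profile and the density profile along that pose's trajectory.
def best_pose(force_profile, density_profiles):
    scores = [np.corrcoef(force_profile, d)[0, 1] for d in density_profiles]
    return int(np.argmax(scores)), max(scores)

rng = np.random.default_rng(2)
true_density = np.abs(np.sin(np.linspace(0, 3, 120))) + rng.normal(0, 0.02, 120)
candidates = [rng.random(120) for _ in range(9)] + [true_density]
force = 5.0 * true_density + rng.normal(0, 0.1, 120)  # force tracks density
print(best_pose(force, candidates))                   # -> (9, ~0.99)
```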

Relevance: 30.00%

Abstract:

Currently several thousand objects are being tracked in the MEO and GEO regions through optical means. The problem faced in this framework is that of Multiple Target Tracking (MTT): both the correct associations among the observations and the orbits of the objects have to be determined. The complexity of the MTT problem is defined by its dimension S, where S stands for the number of 'fences' used in the problem; each fence consists of a set of observations that all originate from different targets. For a dimension of S > 2 the MTT problem becomes NP-hard. As of now no algorithm exists that can solve an NP-hard problem optimally within a reasonable (polynomial) computation time. However, there are algorithms that can approximate the solution with a realistic computational effort. To this end an Elitist Genetic Algorithm is implemented to approximately solve the S > 2 MTT problem in an efficient manner. Its complexity is studied, and it is found that an approximate solution can be obtained in polynomial time. With the advent of improved sensors and a heightened interest in the problem of space debris, it is expected that the number of tracked objects will grow by an order of magnitude in the near future. This research aims to provide a method that can treat the correlation and orbit determination problems simultaneously and is able to efficiently process large data sets with minimal manual intervention.
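A minimal elitist genetic algorithm sketch on a toy association problem (maximise the number of correct matches in a permutation); the fitness function over observation-to-track associations in the actual MTT setting is far richer, and all names here are illustrative:

```python
import random

# Elitist GA: carry the best individuals unchanged into the next
# generation, fill the rest by crossover of fit parents plus mutation.
def elitist_ga(n, generations=200, pop_size=60, elite=4, p_mut=0.2):
    target = list(range(n))                       # the 'correct' association
    fitness = lambda perm: sum(p == t for p, t in zip(perm, target))
    pop = [random.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        next_pop = pop[:elite]                    # elitism: keep the best
        while len(next_pop) < pop_size:
            a, b = random.sample(pop[:pop_size // 2], 2)
            cut = random.randrange(1, n)          # order-preserving crossover
            child = a[:cut] + [g for g in b if g not in a[:cut]]
            if random.random() < p_mut:           # swap mutation
                i, j = random.sample(range(n), 2)
                child[i], child[j] = child[j], child[i]
            next_pop.append(child)
        pop = next_pop
    best = max(pop, key=fitness)
    return best, fitness(best)

print(elitist_ga(12))
```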