57 results for Automatic Query Refinement

in CentAUR: Central Archive University of Reading - UK


Relevance:

20.00%

Abstract:

A new algorithm is described for refining the pose of a model of a rigid object so that it conforms more accurately to the image structure. Elemental 3D forces are considered to act on the model; these are derived from directional derivatives of the image local to the projected model features. The convergence properties of the algorithm are investigated and compared with those of a previous technique. Its use on a video sequence of a cluttered outdoor traffic scene is also illustrated and assessed.

Relevance:

20.00%

Abstract:

Different optimization methods can be employed to optimize a numerical estimate for the match between an instantiated object model and an image. In order to take advantage of gradient-based optimization methods, perspective inversion must be used in this context. We show that convergence can be very fast by extrapolating to maximum goodness-of-fit with Newton's method. This approach is related to methods which either maximize a similar goodness-of-fit measure without use of gradient information, or else minimize distances between projected model lines and image features. Newton's method combines the accuracy of the former approach with the speed of convergence of the latter.
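As a toy illustration of the Newton extrapolation step, the sketch below maximises a one-dimensional goodness-of-fit function using finite-difference derivatives; the quadratic `fit` function, step size `h`, and tolerances are illustrative assumptions, not the goodness-of-fit measure used in the paper:

```python
def newton_maximise(f, x, h=1e-5, tol=1e-10, max_iter=50):
    """Iterate x <- x - f'(x)/f''(x), derivatives by central differences."""
    for _ in range(max_iter):
        d1 = (f(x + h) - f(x - h)) / (2 * h)          # first derivative
        d2 = (f(x + h) - 2 * f(x) + f(x - h)) / h**2  # second derivative
        step = d1 / d2
        x -= step
        if abs(step) < tol:
            break
    return x

def fit(x):
    # Toy goodness-of-fit measure with its maximum at x = 3.
    return -(x - 3.0) ** 2 + 1.0

print(round(newton_maximise(fit, 0.0), 6))  # -> 3.0
```

For an exactly quadratic measure the Newton step lands on the maximum almost immediately, which is the fast convergence the abstract exploits near a good fit.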

Relevance:

20.00%

Abstract:

Alternative meshes of the sphere and adaptive mesh refinement could be immensely beneficial for weather and climate forecasts, but it is not clear how mesh refinement should be achieved. A finite-volume model that solves the shallow-water equations on any mesh of the surface of the sphere is presented. The accuracy and cost effectiveness of four quasi-uniform meshes of the sphere are compared: a cubed sphere, reduced latitude–longitude, hexagonal–icosahedral, and triangular–icosahedral. On some standard shallow-water tests, the hexagonal–icosahedral mesh performs best and the reduced latitude–longitude mesh performs well only when the flow is aligned with the mesh. The inclusion of a refined mesh over a disc-shaped region is achieved using either gradual Delaunay, gradual Voronoi, or abrupt 2:1 block-structured refinement. These refined regions can actually degrade global accuracy, presumably because of changes in wave dispersion where the mesh is highly nonuniform. However, using gradual refinement to resolve a mountain in an otherwise coarse mesh can improve accuracy for the same cost. The model prognostic variables are height and momentum collocated at cell centers, and (to remove grid-scale oscillations of the A grid) the mass flux between cells is advanced from the old momentum using the momentum equation. Quadratic and upwind biased cubic differencing methods are used as explicit corrections to a fast implicit solution that uses linear differencing.

Relevance:

20.00%

Abstract:

Parameters to be determined in a least squares refinement calculation to fit a set of observed data may sometimes usefully be 'predicated' to values obtained from some independent source, such as a theoretical calculation. An algorithm for achieving this in a least squares refinement calculation is described; it leaves the operator in full control of the weight attached to the predicate values of the parameters.
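The idea can be sketched as weighted least squares in which the predicate enters as an extra pseudo-observation whose weight the operator chooses; the straight-line model, data, and `fit_line_with_predicate` helper below are hypothetical illustrations, not the paper's algorithm:

```python
# Sketch: fit y = a*x + b, "predicating" the slope a to slope_pred by
# adding the pseudo-observation row [1, 0] -> slope_pred with weight w_pred
# to the weighted normal equations.  All numbers are illustrative.

def fit_line_with_predicate(xs, ys, slope_pred, w_pred):
    S_xx = sum(x * x for x in xs) + w_pred           # predicate adds w to A^T W A
    S_x = sum(xs)
    S_1 = len(xs)
    S_xy = sum(x * y for x, y in zip(xs, ys)) + w_pred * slope_pred
    S_y = sum(ys)
    det = S_xx * S_1 - S_x * S_x
    a = (S_xy * S_1 - S_x * S_y) / det
    b = (S_xx * S_y - S_x * S_xy) / det
    return a, b

# Data lying exactly on y = 2x + 1; a heavy predicate weight pulls the
# fitted slope away from 2 and toward the predicate value 1.5.
a, b = fit_line_with_predicate([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0],
                               slope_pred=1.5, w_pred=1e6)
print(round(a, 3))  # close to 1.5
```

With `w_pred=0` the ordinary least squares solution (slope 2) is recovered, so the operator's weight interpolates continuously between the data-only fit and the predicate.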

Relevance:

20.00%

Abstract:

Accurately measured peptide masses can be used for large-scale protein identification from bacterial whole-cell digests as an alternative to tandem mass spectrometry (MS/MS), provided mass measurement errors of a few parts-per-million (ppm) are obtained. Fourier transform ion cyclotron resonance (FTICR) mass spectrometry (MS) routinely achieves such mass accuracy either with internal calibration or by regulating the charge in the analyzer cell. We have developed a novel and automated method for internal calibration of liquid chromatography (LC)/FTICR data from whole-cell digests, using peptides in the sample identified by concurrent MS/MS together with ambient polydimethylcyclosiloxanes as internal calibrants in the mass spectra. The method reduced mass measurement error from 4.3 +/- 3.7 ppm to 0.3 +/- 2.3 ppm in an E. coli LC/FTICR dataset of 1000 MS and MS/MS spectra and is applicable to all analyses of complex protein digests by FTICR MS.
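As a minimal illustration of ppm-scale internal calibration, the sketch below applies a one-point linear recalibration against a known calibrant mass; the masses and the `recalibrate` helper are made-up examples, not the paper's method:

```python
# Toy internal calibration: scale the mass axis so that a calibrant of
# known true mass reads exactly.  All m/z values here are invented.

def ppm_error(measured, true):
    """Mass measurement error in parts-per-million."""
    return (measured - true) / true * 1e6

def recalibrate(masses, calibrant_measured, calibrant_true):
    """One-point linear recalibration of a list of measured masses."""
    factor = calibrant_true / calibrant_measured
    return [m * factor for m in masses]

peaks = [500.0025, 1000.0050]   # hypothetical measured m/z, ~5 ppm high
corrected = recalibrate(peaks, calibrant_measured=800.0040,
                        calibrant_true=800.0000)
print(round(ppm_error(peaks[0], 500.0), 1))      # ~5.0 ppm before
print(round(ppm_error(corrected[0], 500.0), 1))  # ~0.0 ppm after
```

Real FTICR calibration functions are more elaborate than a single scale factor, but the ppm bookkeeping is the same.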

Relevance:

20.00%

Abstract:

The assessment of cellular effects by the aqueous phase of human feces (fecal water, FW) is a useful biomarker approach to study cancer risks and protective activities of food. In order to refine and develop the biomarker, different protocols of preparing FW were compared. Fecal waters were prepared by 3 methods: (A) direct centrifugation; (B) extraction of feces in PBS before centrifugation; and (C) centrifugation of lyophilized and reconstituted feces. Genotoxicity was determined in colon cells using the Comet assay. Selected samples were investigated for additional parameters related to carcinogenesis. Two of 7 FWs obtained by methods A and B were similarly genotoxic. Method B, however, yielded higher volumes of FW, allowing sterile filtration for long-term culture experiments. Four of 7 samples were non-genotoxic when prepared according to all 3 methods. FW from lyophilized feces and from fresh samples were equally genotoxic. FWs modulated cytotoxicity, paracellular permeability, and invasion, independent of their genotoxicity. All 3 methods of FW preparation can be used to assess genotoxicity. The higher volumes of FW obtained by preparation method B greatly enhance the prospects of measuring different types of biological parameters and using these to disclose activities related to cancer development.

Relevance:

20.00%

Abstract:

The compounds chlorothiazide and hydrochlorothiazide (crystalline form II) have been studied in their fully hydrogenous forms by powder neutron diffraction on the GEM diffractometer. The results of joint Rietveld refinement of the structures against multi-bank neutron and single-bank X-ray powder data are reported and show that accurate and precise structural information can be obtained from polycrystalline molecular organic materials by this route.

Relevance:

20.00%

Abstract:

The identification of criminal networks is not a routine exploratory process within the current practice of the law enforcement authorities; rather, it is triggered by specific evidence of criminal activity being investigated. A network is identified when a criminal comes to notice, and any associates who could also be potentially implicated need to be identified, if only to be eliminated from the enquiries as suspects or witnesses, as well as to prevent and/or detect crime. However, an identified network may not be the one causing most harm in a given area. This paper presents a methodology to identify all of the criminal networks present within a Law Enforcement Area and prioritises those that are causing most harm to the community. Each crime is allocated a score based on its crime type and how recently the crime was committed; the network score, which can be used as decision support to help prioritise networks for law enforcement purposes, is the sum of the individual crime scores.
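The scoring scheme can be sketched as follows; the crime-type weights and the exponential recency decay used here are illustrative assumptions, since the abstract does not give the actual scoring rules:

```python
# Toy version of the network-harm score: each crime's score is a type
# weight discounted by recency, and the network score is their sum.
# TYPE_WEIGHT and the half-life decay are hypothetical choices.

from datetime import date

TYPE_WEIGHT = {"burglary": 5, "assault": 8, "fraud": 3}

def crime_score(crime_type, committed, today, half_life_days=365):
    """Type weight, halved for every half_life_days of crime age."""
    age = (today - committed).days
    return TYPE_WEIGHT[crime_type] * 0.5 ** (age / half_life_days)

def network_score(crimes, today):
    """Network harm score = sum of its individual crime scores."""
    return sum(crime_score(t, d, today) for t, d in crimes)

today = date(2024, 1, 1)
net = [("burglary", date(2023, 1, 1)),   # a year old: weight halved
       ("assault", date(2023, 12, 1))]   # recent: nearly full weight
print(round(network_score(net, today), 2))
```

Ranking all identified networks by this score then gives the prioritised list the paper uses for decision support.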

Relevance:

20.00%

Abstract:

Finding the smallest eigenvalue of a given square matrix A of order n is a computationally very intensive problem. The most popular method for this problem is the Inverse Power Method, which uses LU-decomposition and forward and backward solving of the factored system at every iteration step. An alternative is the Resolvent Monte Carlo method, which represents the resolvent matrix [I - qA]^(-m) as a series and then performs Monte Carlo iterations (random walks) on the elements of the matrix. This leads to great savings in computation, but the method has many restrictions and very slow convergence. In this paper we propose a method that includes a fast Monte Carlo procedure for finding the inverse matrix, a refinement procedure to improve the approximation of the inverse if necessary, and Monte Carlo power iterations to compute the smallest eigenvalue. We provide not only theoretical estimates of accuracy and convergence but also results from numerical tests performed on a number of test matrices.
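For reference, the baseline Inverse Power Method mentioned above amounts to power iteration on A^(-1); in the sketch below a closed-form 2x2 inverse stands in for the LU factorisation, and the test matrix is an arbitrary example:

```python
# Inverse power iteration: the dominant eigenvector of A^{-1} is the
# eigenvector of A's smallest-magnitude eigenvalue.  The 2x2 closed-form
# inverse replaces the LU solve used for larger matrices.

def inverse_2x2(A):
    (a, b), (c, d) = A
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def smallest_eigenvalue(A, iters=100):
    Ainv = inverse_2x2(A)
    v = [1.0, 1.0]
    for _ in range(iters):
        w = [Ainv[0][0] * v[0] + Ainv[0][1] * v[1],
             Ainv[1][0] * v[0] + Ainv[1][1] * v[1]]
        norm = max(abs(w[0]), abs(w[1]))
        v = [w[0] / norm, w[1] / norm]
    # Rayleigh quotient v^T A v / v^T v recovers the eigenvalue of A.
    Av = [A[0][0] * v[0] + A[0][1] * v[1],
          A[1][0] * v[0] + A[1][1] * v[1]]
    return (v[0] * Av[0] + v[1] * Av[1]) / (v[0] ** 2 + v[1] ** 2)

A = [[4.0, 1.0], [1.0, 3.0]]        # eigenvalues (7 +/- sqrt(5)) / 2
print(round(smallest_eigenvalue(A), 4))  # -> 2.382
```

The paper's contribution replaces both the exact inversion and the deterministic iteration with Monte Carlo procedures; this deterministic sketch only shows the structure being accelerated.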

Relevance:

20.00%

Abstract:

There are still major challenges in the automatic indexing and retrieval of multimedia content for very large corpora. Current indexing and retrieval applications still use keywords to index multimedia content, and those keywords usually do not provide any knowledge about the semantic content of the data. With the increasing amount of multimedia content, it is inefficient to continue with this approach. In this paper, we describe the DREAM project, which addresses these challenges by proposing a new framework for semi-automatic annotation and retrieval of multimedia based on semantic content. The framework uses Topic Map technology as a tool to model the knowledge automatically extracted from the multimedia content using an Automatic Labelling Engine. We describe how we acquire knowledge from the content and represent it, with the support of NLP, to automatically generate Topic Maps. The framework is described in the context of film post-production.

Relevance:

20.00%

Abstract:

The 3D reconstruction of a Golgi-stained dendritic tree from a serial stack of images captured with a transmitted light bright-field microscope is investigated. Modifications to the bootstrap filter are discussed such that the tree structure may be estimated recursively as a series of connected segments. The tracking performance of the bootstrap particle filter is compared against Differential Evolution, an evolutionary global optimisation method, both in terms of robustness and accuracy. It is found that the particle filtering approach is significantly more robust and accurate for the data considered.
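A minimal bootstrap particle filter for a one-dimensional random-walk state gives the flavour of the recursive estimation described above; the dynamics, noise levels, and toy observations are illustrative assumptions, not the dendrite-tracking model:

```python
# Bootstrap particle filter sketch: propagate particles through the
# dynamics, weight by the observation likelihood, estimate, resample.
# The 1D random walk and Gaussian noises are toy modelling assumptions.

import math
import random

def bootstrap_filter(observations, n_particles=500, q=0.5, r=1.0, seed=1):
    rng = random.Random(seed)
    particles = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]
    estimates = []
    for z in observations:
        # Predict: push each particle through the random-walk dynamics.
        particles = [p + rng.gauss(0.0, q) for p in particles]
        # Update: weight by the Gaussian observation likelihood.
        weights = [math.exp(-0.5 * ((z - p) / r) ** 2) for p in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        # Estimate by the posterior mean, then resample multinomially.
        estimates.append(sum(w * p for w, p in zip(weights, particles)))
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return estimates

obs = [0.1, 0.4, 0.9, 1.6, 2.4]     # noisy readings of a drifting signal
est = bootstrap_filter(obs)
print([round(e, 2) for e in est])
```

In the paper each "observation" step extends a connected tree segment rather than a scalar state, but the predict-weight-resample loop is the same.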

Relevance:

20.00%

Abstract:

The externally recorded electroencephalogram (EEG) is contaminated with signals that do not originate from the brain, collectively known as artefacts. Thus, EEG signals must be cleaned prior to any further analysis. In particular, if the EEG is to be used in online applications such as Brain-Computer Interfaces (BCIs) the removal of artefacts must be performed in an automatic manner. This paper investigates the robustness of Mutual Information based features to inter-subject variability for use in an automatic artefact removal system. The system is based on the separation of EEG recordings into independent components using a temporal ICA method, RADICAL, and the utilisation of a Support Vector Machine for classification of the components into EEG and artefact signals. High accuracy and robustness to inter-subject variability is achieved.
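As a small illustration of a mutual-information feature, the sketch below estimates I(X;Y) from empirical frequencies of paired discrete samples; the toy symbol sequences stand in for discretised EEG components:

```python
# Mutual information between two discrete signals from a joint histogram;
# the two 4-sample sequences are toy data, not real EEG recordings.

import math
from collections import Counter

def mutual_information(xs, ys):
    """I(X;Y) in nats, estimated from paired discrete samples."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px = Counter(xs)
    py = Counter(ys)
    return sum((c / n) * math.log((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

# Perfectly dependent symbols share log(2) nats; independent ones share none.
print(round(mutual_information([0, 0, 1, 1], [0, 0, 1, 1]), 4))  # -> 0.6931
print(round(mutual_information([0, 0, 1, 1], [0, 1, 0, 1]), 4))  # -> 0.0
```

Features like this, computed between independent components and reference signals, can then feed the SVM classifier that separates brain activity from artefacts.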