982 results for Fast methods


Relevance:

30.00%

Publisher:

Abstract:

The rate constant of a very fast chemical reaction can generally be measured by electrochemical methods, but not by thin-layer electrochemical methods because of the influence of diffusion. A long optical path length thin-layer cell (LOPTLC) with a large ratio of electrode area to solution volume can be used to monitor the fast chemical reaction in situ with high sensitivity and accuracy. It enables the absorption spectra to be measured without the influence of diffusion. In the present paper, a fast chemical reaction of Alizarin Red S (ARS) with its oxidized form has been studied. The reaction equilibrium constant (K) under different potentials can be determined by single-step potential-absorption spectra in the LOPTLC. An equilibrium constant of 7.94 × 10⁵ L·mol⁻¹ for the chemical reaction has been obtained from the plot of lg K vs. (E − E₁⁰′). The rate constant (k) under different potentials can be measured by single-step potential chronoabsorptiometry. A rate constant of 426.6 L·mol⁻¹·s⁻¹ for the chemical reaction has been obtained from the plot of lg k vs. (E − E₁⁰′) at (E − E₁⁰′) = 0.
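For illustration, the extrapolation to (E − E₁⁰′) = 0 described above amounts to taking the intercept of a linear fit of lg k against the potential offset. A minimal sketch with hypothetical values (not data from the paper):

```python
import numpy as np

# Hypothetical measured values, for illustration only.
overpotential = np.array([0.02, 0.04, 0.06, 0.08, 0.10])   # (E - E1°') in V
lg_k = np.array([2.8, 3.0, 3.2, 3.4, 3.6])                  # measured lg k

# Linear fit; the intercept is lg k at (E - E1°') = 0.
slope, intercept = np.polyfit(overpotential, lg_k, 1)
k_at_zero = 10 ** intercept                                  # L mol^-1 s^-1
```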

Relevance:

30.00%

Publisher:

Abstract:

Example-based methods are effective for parameter estimation problems when the underlying system is simple or the dimensionality of the input is low. For complex and high-dimensional problems such as pose estimation, the number of required examples and the computational complexity rapidly become prohibitively high. We introduce a new algorithm that learns a set of hashing functions that efficiently index examples relevant to a particular estimation task. Our algorithm extends a recently developed method for locality-sensitive hashing, which finds approximate neighbors in time sublinear in the number of examples. This method depends critically on the choice of hash functions; we show how to find the set of hash functions that are optimally relevant to a particular estimation problem. Experiments demonstrate that the resulting algorithm, which we call Parameter-Sensitive Hashing, can rapidly and accurately estimate the articulated pose of human figures from a large database of example images.
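Parameter-Sensitive Hashing learns task-specific hash functions, which the abstract does not give; the sketch below uses generic random-hyperplane locality-sensitive hashing only to illustrate the sublinear-time indexing idea it builds on.

```python
import numpy as np
from collections import defaultdict

class LSHIndex:
    """Random-hyperplane LSH: a generic stand-in for the learned hashes."""

    def __init__(self, dim, n_bits=16, n_tables=8, seed=0):
        rng = np.random.default_rng(seed)
        # One set of random hyperplanes per hash table.
        self.planes = [rng.standard_normal((n_bits, dim)) for _ in range(n_tables)]
        self.tables = [defaultdict(list) for _ in range(n_tables)]

    def _key(self, planes, x):
        # Hash = sign pattern of the projections, packed into a tuple.
        return tuple((planes @ x > 0).astype(int))

    def add(self, i, x):
        for planes, table in zip(self.planes, self.tables):
            table[self._key(planes, x)].append(i)

    def query(self, x):
        # Union of colliding buckets: candidate approximate neighbours,
        # retrieved without scanning all stored examples.
        cands = set()
        for planes, table in zip(self.planes, self.tables):
            cands.update(table.get(self._key(planes, x), []))
        return cands

# Usage: index example feature vectors, then look up candidates for a query.
data = np.random.default_rng(1).standard_normal((1000, 64))
index = LSHIndex(dim=64)
for i, v in enumerate(data):
    index.add(i, v)
candidates = index.query(data[0])   # contains 0 plus near neighbours
```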

Relevance:

30.00%

Publisher:

Abstract:

We present new, simple, efficient data structures for approximate reconciliation of set differences, a useful standalone primitive for peer-to-peer networks and a natural subroutine in methods for exact reconciliation. In the approximate reconciliation problem, peers A and B respectively have subsets of elements S_A and S_B of a large universe U. Peer A wishes to send a short message M to peer B with the goal that B should use M to determine as many elements in the set S_B − S_A as possible. To avoid the expense of round-trip communication times, we focus on the situation where a single message M is sent. We motivate the performance tradeoffs between message size, accuracy and computation time for this problem with a straightforward approach using Bloom filters. We then introduce approximate reconciliation trees, a more computationally efficient solution that combines techniques from Patricia tries, Merkle trees, and Bloom filters. We present an analysis of approximate reconciliation trees and provide experimental results comparing the various methods proposed for approximate reconciliation.
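As an illustration of the baseline Bloom-filter approach mentioned above (not the approximate reconciliation trees themselves), peer A can send a Bloom filter of S_A as the single message M, and peer B then reports the elements of S_B that the filter does not contain; false positives mean some genuine differences are missed. A minimal sketch:

```python
import hashlib

class BloomFilter:
    def __init__(self, n_bits=1 << 16, n_hashes=4):
        self.n_bits, self.n_hashes = n_bits, n_hashes
        self.bits = bytearray(n_bits // 8)

    def _positions(self, item):
        # Derive k independent bit positions from salted SHA-256 digests.
        for k in range(self.n_hashes):
            h = hashlib.sha256(f"{k}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.n_bits

    def add(self, item):
        for p in self._positions(item):
            self.bits[p // 8] |= 1 << (p % 8)

    def __contains__(self, item):
        return all(self.bits[p // 8] & (1 << (p % 8)) for p in self._positions(item))

# Peer A builds the filter (the single short message M)...
S_A = {f"block-{i}" for i in range(0, 10000, 2)}
M = BloomFilter()
for x in S_A:
    M.add(x)

# ...and peer B uses it to estimate S_B - S_A without a round trip.
S_B = {f"block-{i}" for i in range(10000)}
estimated_difference = {x for x in S_B if x not in M}
```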

Relevance:

30.00%

Publisher:

Abstract:

We study the problem of preprocessing a large graph so that point-to-point shortest-path queries can be answered very fast. Computing shortest paths is a well-studied problem, but exact algorithms do not scale to the huge graphs encountered on the web, in social networks, and in other applications. In this paper we focus on approximate methods for distance estimation, in particular landmark-based distance indexing. This approach involves selecting a subset of nodes as landmarks and computing (offline) the distances from each node in the graph to those landmarks. At runtime, when the distance between a pair of nodes is needed, we can estimate it quickly by combining the precomputed distances of the two nodes to the landmarks. We prove that selecting the optimal set of landmarks is an NP-hard problem, and thus heuristic solutions need to be employed. Given a budget of memory for the index, which translates directly into a budget of landmarks, different landmark selection strategies can yield dramatically different results in terms of accuracy. We therefore develop and experimentally compare a number of simple methods that scale well to large graphs. The simplest methods choose central nodes of the graph, while the more elaborate ones select central nodes that are also far away from one another. The efficiency of the suggested techniques is tested experimentally on five different real-world graphs with millions of edges; for a given accuracy, they require up to 250 times less space than the approach in the literature that selects landmarks at random. Finally, we study applications of our method to two problems arising naturally in large-scale networks, namely social search and community detection.
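A minimal sketch of the landmark indexing scheme described above, using a simple high-degree ("central node") selection heuristic rather than the paper's selection strategies; by the triangle inequality, the minimum over landmarks of d(u, l) + d(l, v) is an upper bound on d(u, v).

```python
from collections import deque

def bfs_distances(graph, source):
    # Unweighted shortest-path distances from `source` (offline phase).
    dist, queue = {source: 0}, deque([source])
    while queue:
        u = queue.popleft()
        for v in graph[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def build_index(graph, n_landmarks=10):
    # Simple "central nodes" heuristic: highest-degree vertices.
    landmarks = sorted(graph, key=lambda u: len(graph[u]), reverse=True)[:n_landmarks]
    return {l: bfs_distances(graph, l) for l in landmarks}

def estimate_distance(index, u, v):
    # Query phase: combine precomputed distances to the landmarks.
    return min(d[u] + d[v] for d in index.values() if u in d and v in d)

# Usage with a toy adjacency-list graph.
graph = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3]}
index = build_index(graph, n_landmarks=2)
print(estimate_distance(index, 0, 4))   # upper bound on the true distance (3)
```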

Relevance:

30.00%

Publisher:

Abstract:

Accurate head tilt detection has great potential to aid people with disabilities in the use of human-computer interfaces and to provide universal access to communication software. We show how it can be used to tab through links on a web page or control a video game with head motions. It may also be useful as a correction method for currently available video-based assistive technology that requires upright facial poses. Few of the existing computer vision methods that detect head rotations in and out of the image plane with reasonable accuracy can operate within the context of a real-time communication interface, because the computational expense they incur is too great. Our method uses a variety of metrics to obtain a robust head tilt estimate without incurring the computational cost of previous methods. Our system runs in real time on a computer with a 2.53 GHz processor, 256 MB of RAM and an inexpensive webcam, using only 55% of the processor cycles.

Relevance:

30.00%

Publisher:

Abstract:

The research work in this thesis reports the rapid separation of biologically important low-molecular-weight compounds by microchip electrophoresis and ultrahigh performance liquid chromatography. Chapter 1 introduces the theory and principles behind capillary electrophoresis separation. An overview of the history, the different modes and the detection techniques coupled to CE is provided. The advantages of microchip electrophoresis are highlighted. Some aspects of metal complex analysis by capillary electrophoresis are described. Finally, the theory and different modes of liquid chromatography are presented. Chapter 2 outlines the development of a method for the capillary electrophoresis of (R,S)-Naproxen. The variable parameters of the separation were optimized (i.e. buffer concentration and pH, concentration of chiral selector additives, applied voltage and injection conditions). The method was validated in terms of linearity, precision and LOD. The optimized method was then transferred to a microchip electrophoresis system. Two different types of injection, i.e. gated and pinched, were investigated. This microchip method represents the fastest reported chiral separation of Naproxen to date. Chapter 3 reports the ultra-fast separation of aromatic amino acids by capillary electrophoresis using the short-end injection technique. The variable parameters of the separation were optimized and the method validated. The optimized method was then transferred to a microchip electrophoresis system, where the separation time was further reduced. Chapter 4 outlines the use of microchip electrophoresis as an efficient tool for the analysis of aluminium complexes. A 2.5 cm channel with linear imaging UV detection was used to separate and detect the aluminium-dopamine complex and free dopamine. For the first time, baseline separation of aluminium-dopamine was achieved on a 15-second timescale. Chapter 5 investigates a rapid, ultra-sensitive and highly efficient method for the quantification of histamine in human psoriatic plaques using microdialysis and ultrahigh performance liquid chromatography with fluorescence detection. The method utilized a sub-two-micron packed C18 stationary phase. A fluorescent reagent, 4-(1-pyrene)butyric acid N-hydroxysuccinimide ester, was conjugated to the primary and secondary amino moieties of histamine. The dipyrene-labeled histamine in human urine was also investigated by ultrahigh pressure liquid chromatography using a C18 column with 1.8 μm particle diameter. These methods represent some of the fastest reported separations of histamine using fluorescence detection to date.

Relevance:

30.00%

Publisher:

Abstract:

Colloidal photonic crystals have potential light-manipulation applications, including the fabrication of efficient lasers and LEDs, improved optical sensors and interconnects, and improving photovoltaic efficiencies. One roadblock of colloidal self-assembly is the inherent defects of the resulting films; however, compared with micro-fabrication methods, they can be manufactured cost-effectively into large-area films. This thesis investigates the production of ‘large-area’ colloidal photonic crystals by sonication, under-oil co-crystallisation and controlled evaporation, with a view to reducing cracking and other defects. A simple monotonic Stöber particle synthesis method was developed, producing silica particles in the range of 80 to 600 nm in a single step. An analytical method that assesses the quality of surface particle ordering in a semi-quantitative manner was developed: a grey-scale symmetry-area method based on fast Fourier transform (FFT) spot intensities is used to quantify the FFT profiles. Adding ultrasonic vibrations during film formation demonstrated that large areas could be assembled rapidly; however, film ordering suffered as a result. Under-oil co-crystallisation results in the particles being bound together during film formation. While it has the potential to form large areas, it requires further refinement to be established as a production technique. Achieving high-quality photonic crystals bonded with low concentrations (<5%) of polymeric adhesives while maintaining refractive index contrast proved difficult and degraded the films’ uniformity. A controlled evaporation method, using a mixed-solvent suspension, represents the most promising route to high-quality films over large areas (75 mm x 25 mm). During this mixed-solvent approach, the film is kept in the wet state longer, thus reducing cracks developing during the drying stage. These films are crack-free up to a critical thickness and show very large domains, which are visible in low-magnification SEM images as Moiré fringe patterns. Higher magnification reveals that the separations between alternate fringe patterns are domain boundaries between individual crystalline growth fronts.
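The thesis's grey-scale symmetry-area method is not reproduced here; the following sketch only illustrates the general idea of scoring surface ordering from FFT spot intensities, assuming a 2-D grey-scale SEM image supplied as a numpy array.

```python
import numpy as np

def fft_spot_fraction(image, r_inner=20, r_outer=40):
    # 2-D FFT magnitude of the (mean-subtracted) image.
    f = np.abs(np.fft.fftshift(np.fft.fft2(image - image.mean())))
    cy, cx = np.array(f.shape) // 2
    y, x = np.indices(f.shape)
    r = np.hypot(y - cy, x - cx)
    annulus = (r >= r_inner) & (r <= r_outer)
    # A well-ordered film concentrates spectral power into sharp first-order
    # spots inside the annulus; report their share of the total power.
    return f[annulus].sum() / f.sum()

# Usage: replace the placeholder with a real grey-scale SEM image array.
sem = np.random.rand(512, 512)
score = fft_spot_fraction(sem)
```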

Relevance:

30.00%

Publisher:

Abstract:

Image inpainting refers to restoring a damaged image with missing information. The total variation (TV) inpainting model is one such method: it simultaneously fills in the missing regions with available information from their surroundings and eliminates noise. The method works well for small, narrow inpainting domains. However, there remains an urgent need to develop fast iterative solvers, as the underlying problem sizes are large. In addition, one needs to tackle the imbalance of results between inpainting and denoising. When the inpainting regions are thick and large, the inpainting procedure is quite slow, usually requires a significant number of iterations, and inevitably leads to over-smoothing outside the inpainting domain. To overcome these difficulties, we propose a solution for the TV inpainting method based on a nonlinear multigrid algorithm.
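The abstract does not state the model explicitly; a common formulation of TV inpainting, with image domain Ω, inpainting region D, observed data z and fidelity weight λ, minimises

```latex
\min_{u} \int_{\Omega} |\nabla u|\,dx \;+\; \frac{\lambda}{2}\int_{\Omega\setminus D} (u - z)^2\,dx .
```

Setting the fidelity term to zero inside D lets the total-variation term propagate information from the surroundings into the missing region, while outside D the same term acts as a denoiser, which is the source of the inpainting/denoising imbalance noted above.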

Relevance:

30.00%

Publisher:

Abstract:

Measurements of electron capture and ionization of O₂ molecules in collisions with H⁺ and O⁺ ions have been made over the energy range 10-100 keV. Cross sections for dissociative and nondissociative interactions have been determined separately using coincidence techniques. Nondissociative channels leading to O₂⁺ product formation are shown to be dominant for both the H⁺ and the O⁺ projectiles in the capture collisions, and only for the H⁺ projectiles in the ionization collisions. Dissociative channels are dominant for ionizing collisions involving O⁺ projectiles. The energy distributions of the O⁺ fragment products from collisions involving H⁺ and O⁺ have also been measured for the first time using time-of-flight methods, and the results are compared with those from other related studies. These measurements have been used to describe the interaction of the energetic ions trapped in Jupiter's magnetosphere with the very thin oxygen atmosphere of the icy satellite Europa. It is shown that the ionization of oxygen molecules is dominated by charge exchange plus ion-impact ionization processes rather than photoionization. In addition, dissociation is predominantly induced through the excitation of electrons into high-lying repulsive energy states (electronically) rather than arising from momentum transfer in knock-on collisions between colliding nuclei, which are the only processes included in current models. Future modeling will need to include both these processes.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND:
Tissue microarrays (TMAs) are a valuable platform for tissue-based translational research and the discovery of tissue biomarkers. The digitised TMA slides, or TMA virtual slides, are ultra-large digital images and can contain several hundred samples. The processing of such slides is time-consuming, bottlenecking a potentially high-throughput platform.
METHODS:
A High Performance Computing (HPC) platform for the rapid analysis of TMA virtual slides is presented in this study. Using an HP high-performance cluster and a centralised dynamic load-balancing approach, the simultaneous analysis of multiple tissue cores was established. This was evaluated on Non-Small Cell Lung Cancer TMAs for complex analysis of tissue pattern and immunohistochemical positivity.
RESULTS:
The automated processing of a single TMA virtual slide containing 230 patient samples can be speeded up significantly, by a factor of roughly 22, bringing the analysis time down to one minute. Over 90 TMAs could also be analysed simultaneously, greatly speeding up multiplex biomarker experiments.
CONCLUSIONS:
The methodologies developed in this paper provide, for the first time, a genuine high-throughput analysis platform for TMA biomarker discovery that will significantly enhance the reliability and speed of biomarker research. This will have widespread implications for translational tissue-based research.
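The paper's cluster implementation is not shown; the following minimal sketch only illustrates the centralised dynamic load-balancing idea from the methods, with a hypothetical analyse_core function standing in for the per-core image analysis.

```python
from multiprocessing import Pool

def analyse_core(core_id):
    # ... tissue-pattern and immunopositivity analysis of one core ...
    return core_id, "result"

if __name__ == "__main__":
    core_ids = range(230)                     # e.g. 230 patient samples
    with Pool(processes=22) as pool:
        # imap_unordered dispatches the next un-analysed core to whichever
        # worker becomes idle, so slow cores do not stall the whole slide.
        for core_id, result in pool.imap_unordered(analyse_core, core_ids):
            pass  # collect/store per-core results here
```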

Relevance:

30.00%

Publisher:

Abstract:

The chemoselective acylation of primary aliphatic amines has been achieved in under ten minutes (and for aromatic amines in under 120 min) using vibration ball-milling, avoiding the undesirable solvents typically employed for such reactions (e.g. DMF). Under optimised conditions, the synthesis of amides in the presence of both primary and secondary alcohol functions was achieved in high to excellent yields (65-94%). Overall, the methods described have significant practical advantages over conventional approaches based upon bulk solvents, including greater yields, higher chemoselectivity and easier product separation.

Relevance:

30.00%

Publisher:

Abstract:

It is convenient and effective to solve nonlinear problems with a model that has a linear-in-the-parameters (LITP) structure. However, the nonlinear parameters of each model term (e.g. the width of a Gaussian function) need to be pre-determined, either from expert experience or through exhaustive search. An alternative approach is to optimize them by a gradient-based technique (e.g. Newton's method). Unfortunately, all of these methods still require a large amount of computation. Recently, the extreme learning machine (ELM) has shown its advantages in terms of fast learning from data, but the sparsity of the constructed model cannot be guaranteed. This paper proposes a novel algorithm for the automatic construction of a nonlinear system model based on the extreme learning machine. This is achieved by effectively integrating the ELM and leave-one-out (LOO) cross-validation with our two-stage stepwise construction procedure [1]. The main objective is to improve the compactness and generalization capability of the model constructed by the ELM method. Numerical analysis shows that the proposed algorithm involves only about half the computation of the orthogonal least squares (OLS) based method. Simulation examples are included to confirm the efficacy and superiority of the proposed technique.
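The two-stage stepwise construction procedure itself is not reproduced here; the sketch below only illustrates the ELM ingredients it builds on: a random (untrained) hidden layer, linear-in-the-parameters output weights, and a leave-one-out error obtained cheaply from the leverages (the PRESS statistic).

```python
import numpy as np

def elm_fit_loo(X, y, n_hidden=20, reg=1e-6, seed=0):
    rng = np.random.default_rng(seed)
    # Random input weights and biases, fixed rather than trained: the ELM idea.
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)                          # hidden-layer output matrix
    # Output weights by regularised least squares (linear in the parameters).
    A = H.T @ H + reg * np.eye(n_hidden)
    beta = np.linalg.solve(A, H.T @ y)
    resid = y - H @ beta
    # LOO residuals via PRESS: e_i / (1 - h_ii), h_ii = leverage of sample i.
    leverages = np.einsum('ij,jk,ik->i', H, np.linalg.inv(A), H)
    loo_mse = np.mean((resid / (1.0 - leverages)) ** 2)
    return beta, W, b, loo_mse

# Usage: pick the hidden-layer size with the smallest LOO error.
X = np.random.default_rng(1).uniform(-1, 1, (200, 2))
y = np.sin(3 * X[:, 0]) + 0.5 * X[:, 1] ** 2
best_size = min(range(5, 40, 5), key=lambda n: elm_fit_loo(X, y, n_hidden=n)[3])
```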

Relevance:

30.00%

Publisher:

Abstract:

Malachite Green (MG), Crystal Violet (CV) and Brilliant Green (BG) are antibacterial, antifungal and antiparasitic agents that have been used for the treatment and prevention of diseases in fish. These dyes are metabolized into reduced leuco forms (LMG, LCV, LBG) that can persist in fish muscle for a long period. Due to their carcinogenic properties, they are banned from use in fish destined for human consumption in many countries, including the European Union and the United States. HPLC and LC-MS techniques are generally used for the detection of these compounds and their metabolites in fish. This study presents the development of a fast enzyme-linked immunosorbent assay (ELISA) method as an alternative for screening purposes. A first monoclonal cell line producing antibodies to MG was generated using a hybridoma technique. The antibody had good cross-reactivity with related chromatic forms of triphenylmethane dyes such as CV, BG, Methyl Green, Methyl Violet and Victoria Blue R. The monoclonal antibody (mAb) was used to develop a fast (20 min) disequilibrium ELISA screening method for the detection of triphenylmethanes in fish. By introducing an oxidation step with 2,3-dichloro-5,6-dicyano-1,4-benzoquinone (DDQ) during sample extraction, the assay was also able to detect the reduced metabolites of the triphenylmethanes. The detection capability of the assay was 1 ng g⁻¹ for MG, LMG, CV, LCV and BG, which is below the minimum required performance limit (MRPL) of 2 ng g⁻¹ set for the detection of total MG (sum of MG and LMG) by Commission Decision 2004/25/EC. The mean recoveries for fish samples spiked with MG and LMG at 0.5 MRPL and MRPL levels were between 74.9% and 117.0%, with inter- and intra-assay coefficients of variation between 4.7% and 25.7%. The validated method allows the analysis of a batch of 20 samples in two to three hours. Additionally, this procedure is substantially faster than other ELISA methods developed for MG/LMG thus far. The stable and efficient monoclonal cell line obtained is an unlimited source of sensitive and specific antibody to MG and other triphenylmethanes. (C) 2011 Elsevier B.V. All rights reserved.