938 results for Point method
Abstract:
Complementary programs
Abstract:
Software for video-based multi-point frequency measuring and mapping: http://hdl.handle.net/10045/53429
Abstract:
Human behaviour recognition has been, and still remains, a challenging problem that involves different areas of computational intelligence. The automated understanding of people's activities from video sequences is an open research topic in which the computer vision and pattern recognition communities have invested considerable effort. In this paper, the problem is studied from a prediction point of view. We propose a novel method able to detect behaviour early using only a small portion of the input, in addition to its ability to predict behaviour from new inputs. Specifically, we propose a predictive method based on a simple representation of the trajectories of a person in the scene, which allows a high-level understanding of the global human behaviour. The trajectory representation is used as a descriptor of the activity of the individual, and the descriptors serve as the cue of a classification stage for pattern recognition purposes. Classifiers are trained using the trajectory representation of the complete sequence; however, partial sequences are processed to evaluate the early prediction capabilities given a specific observation time of the scene. The experiments have been carried out using three different datasets of the CAVIAR database, taking into account the behaviour of an individual. Additionally, several classic classifiers have been used in the experimentation in order to evaluate the robustness of the proposal. Results confirm the high accuracy of the proposal for the early recognition of people's behaviours.
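The abstract gives no implementation details, but the pipeline it describes (a fixed-length trajectory descriptor fed to a classic classifier, evaluated on truncated prefixes of each sequence) can be sketched roughly as below; the resampling length, the SVM choice, and the `trajectories` format are illustrative assumptions, not the authors' code.

```python
import numpy as np
from sklearn.svm import SVC

def trajectory_descriptor(points, n=16):
    """Resample an (x, y) trajectory to n points and flatten it
    into a fixed-length descriptor (an illustrative choice)."""
    points = np.asarray(points, dtype=float)
    t = np.linspace(0, 1, len(points))
    tn = np.linspace(0, 1, n)
    resampled = np.column_stack(
        [np.interp(tn, t, points[:, d]) for d in range(points.shape[1])]
    )
    return resampled.ravel()

def train(trajectories, labels):
    """Train on descriptors of the complete sequences.
    trajectories: list of (T_i, 2) arrays; labels: one class per sequence."""
    X = np.array([trajectory_descriptor(tr) for tr in trajectories])
    return SVC(kernel="rbf").fit(X, labels)

def predict_early(clf, trajectory, observed_fraction=0.3):
    """Probe early-prediction accuracy: classify from a partial prefix."""
    cut = max(2, int(len(trajectory) * observed_fraction))
    return clf.predict([trajectory_descriptor(trajectory[:cut])])[0]
```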
Abstract:
Since the beginning of 3D computer vision, techniques to reduce the data to a treatable size while preserving the important aspects of the scene have been necessary. Currently, with the new low-cost RGB-D sensors, which provide a stream of color and 3D data at approximately 30 frames per second, this has become even more relevant. Many applications make use of these sensors and need a preprocessing step to downsample the data, either to reduce the processing time or to improve the data (e.g., reducing noise or enhancing the important features). In this paper, we present a comparison of downsampling techniques based on different principles. Concretely, five downsampling methods are included: a bilinear-based method, a normal-based method, a color-based method, a combination of the normal- and color-based samplings, and a growing neural gas (GNG)-based approach. For the comparison, two different models acquired with the Blensor software have been used. Moreover, to evaluate the effect of the downsampling in a real application, a 3D non-rigid registration is performed with the sampled data. From the experimentation we can conclude that, depending on the purpose of the application, some kernels of the sampling methods can drastically improve the results. Bilinear- and GNG-based methods provide homogeneous point clouds, whereas the color-based and normal-based methods provide datasets with a higher density of points in areas with specific features. In the non-rigid application, if a color-based sampled point cloud is used, it is possible to properly register two datasets in cases where intensity data are relevant in the model, outperforming the results obtained when only a homogeneous sampling is used.
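As a rough illustration of the contrast drawn between homogeneous and feature-driven sampling, the sketch below keeps points with probability proportional to a local feature score (here, how much a point's color differs from its k nearest neighbours); the neighbourhood size, the score, and the sampling rate are assumptions for illustration, not the methods compared in the paper.

```python
import numpy as np
from scipy.spatial import cKDTree

def feature_weighted_downsample(xyz, rgb, keep=0.2, k=8, rng=None):
    """Downsample a colored point cloud, favoring points whose
    color differs most from their k nearest neighbours."""
    rng = rng if rng is not None else np.random.default_rng(0)
    tree = cKDTree(xyz)
    _, idx = tree.query(xyz, k=k + 1)          # idx[:, 0] is the point itself
    neigh_mean = rgb[idx[:, 1:]].mean(axis=1)  # mean neighbour color
    score = np.linalg.norm(rgb - neigh_mean, axis=1) + 1e-9
    p = score / score.sum()
    n_keep = int(len(xyz) * keep)
    chosen = rng.choice(len(xyz), size=n_keep, replace=False, p=p)
    return xyz[chosen], rgb[chosen]
```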
Abstract:
In recent times the Douglas–Rachford algorithm has been observed empirically to solve a variety of nonconvex feasibility problems, including those of a combinatorial nature. For many of these problems current theory is not sufficient to explain this observed success, being mainly concerned with questions of local convergence. In this paper we analyze the global behavior of the method for finding a point in the intersection of a half-space and a potentially nonconvex set which is assumed to satisfy a well-quasi-ordering property or a property weaker than compactness. In particular, the special case in which the second set is finite is covered by our framework and provides a prototypical setting for combinatorial optimization problems.
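For concreteness, the Douglas–Rachford iteration for two sets A and B with projectors P_A, P_B is x ← x + P_B(2 P_A(x) − x) − P_A(x). A minimal numpy sketch in the half-space/finite-set setting the abstract describes (the specific half-space and finite set are made up for illustration):

```python
import numpy as np

def proj_halfspace(x, a, b):
    """Project x onto the half-space {y : <a, y> <= b}."""
    slack = a @ x - b
    return x if slack <= 0 else x - (slack / (a @ a)) * a

def proj_finite(x, points):
    """Project x onto a finite set: the nearest of its points."""
    return points[np.argmin(np.linalg.norm(points - x, axis=1))]

def douglas_rachford(x, a, b, points, iters=200):
    for _ in range(iters):
        pa = proj_halfspace(x, a, b)
        pb = proj_finite(2 * pa - x, points)
        x = x + pb - pa
    return pb  # candidate solution; check a @ pb <= b for feasibility

# Toy instance: binary points intersected with the half-space x + y <= 1.
pts = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
print(douglas_rachford(np.array([0.7, 0.9]), np.array([1., 1.]), 1.0, pts))
```

On this toy instance the iterates reach the feasible point (0, 0) within a few steps.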
Abstract:
Mode of access: Internet.
Abstract:
Thesis (Master's)--University of Washington, 2016-06
Abstract:
A phantom that can be used for mapping geometric distortion in magnetic resonance imaging (MRI) is described. This phantom provides an array of densely distributed control points in three-dimensional (3D) space. These points form the basis of a comprehensive measurement method to correct for geometric distortion in MR images arising principally from gradient field non-linearity and magnet field inhomogeneity. The phantom was designed based on the concept that a point in space can be defined by three orthogonal planes. This novel design approach allows for as many control points as desired. Employing this design, a highly accurate method has been developed that enables the positions of the control points to be measured to sub-voxel accuracy. The phantom described in this paper was constructed to fit into a body coil of an MRI scanner (external dimensions of the phantom: 310 mm x 310 mm x 310 mm) and contained 10,830 control points. With this phantom, the mean errors in the measured coordinates of the control points were on the order of 0.1 mm or less, which is less than one tenth of the voxel dimensions of the phantom image. The calculated three-dimensional distortion map, i.e., the differences between the image positions and true positions of the control points, can then be used to compensate for geometric distortion for a full image restoration. It is anticipated that this novel method will have an impact on the applicability of MRI in both clinical and research settings, especially in areas where geometric accuracy is highly required, such as MR neuro-imaging. (C) 2004 Elsevier Inc. All rights reserved.
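The correction step the abstract describes, turning the discrete distortion map at the control points into a full image restoration, amounts to interpolating the measured-minus-true displacement field and subtracting it. A minimal sketch under that reading (the array shapes and the choice of linear interpolation are assumptions, not the paper's algorithm):

```python
import numpy as np
from scipy.interpolate import griddata

def correct_coordinates(true_pts, measured_pts, query_pts):
    """Estimate distortion at arbitrary image positions from the
    control-point distortion map and undo it.

    true_pts, measured_pts : (N, 3) control-point coordinates
    query_pts              : (M, 3) image positions to correct
    """
    distortion = measured_pts - true_pts   # 3D distortion map at control points
    # Interpolate each displacement component over the control grid
    # (queries outside the control-point hull yield NaN).
    shift = np.column_stack([
        griddata(measured_pts, distortion[:, d], query_pts, method="linear")
        for d in range(3)
    ])
    return query_pts - shift               # restored (undistorted) positions
```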
Abstract:
A comprehensive study has been conducted to compare the adsorption of alkali metals (Li, Na, and K) on the basal plane of graphite using molecular orbital theory calculations. All three metal atoms prefer to adsorb on the hollow site above the middle of a hexagonal aromatic ring. A novel phenomenon was observed: Na, rather than Li or K, is the most weakly adsorbed of the three metals. The reason is that the SOMO (singly occupied molecular orbital) of the Na atom lies in energy exactly at the midpoint between the HOMO and the LUMO of the graphite layer. As a result, the SOMO of Na cannot form a stable interaction with either the HOMO or the LUMO of the graphite, whereas the SOMO of Li and that of K can each form a relatively stable interaction with one of them. The reason Li adsorbs more strongly than K on graphite is also interpreted on the basis of their molecular-orbital energy levels.
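The orbital argument can be read through standard second-order perturbation theory, in which the stabilization from mixing two orbitals scales inversely with their energy gap; this is a schematic reading, not a formula quoted from the paper:

```latex
\Delta E_{\text{mix}} \;\approx\;
\frac{\lvert \langle \psi_{\mathrm{SOMO}} \mid \hat{H} \mid \psi_{\mathrm{MO}} \rangle \rvert^{2}}
     {\lvert \varepsilon_{\mathrm{SOMO}} - \varepsilon_{\mathrm{MO}} \rvert}
```

With the Na SOMO midway between the graphite HOMO and LUMO, both gaps equal half the HOMO-LUMO gap, so neither interaction dominates; shifting the SOMO toward either frontier orbital (as for Li and K) shrinks one gap and strengthens that interaction.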
Abstract:
Detection of point mutations or single nucleotide polymorphisms (SNPs) is important in relation to disease susceptibility and, in pathogens, to the detection of mutations determining drug resistance or host range. There is an emergent need for rapid detection methods amenable to point-of-care applications. The purpose of this study was to reduce to practice a novel method for SNP detection and to demonstrate that this technology can be used downstream of nucleic acid amplification. The authors used a model system to develop an oligonucleotide-based SNP detection system on nitrocellulose lateral flow strips. To optimize the assay, they used cloned sequences of the herpes simplex virus-1 (HSV-1) DNA polymerase gene into which they introduced a point mutation. The assay system uses chimeric polymerase chain reaction (PCR) primers that incorporate hexameric repeat tags ("hexapet tags"). The chimeric sequences allow capture of amplified products at predefined positions on a lateral flow strip. These hexapet sequences have minimal cross-reactivity and allow specific hybridization-based capture of the PCR products at room temperature onto lateral flow strips that have been striped with complementary hexapet tags. The allele-specific amplification was carried out with both mutant and wild-type primer sets present in the PCR mix ("competitive" format). The resulting PCR products carried a hexapet tag corresponding to either the wild-type or the mutant sequence. The lateral flow strips are dropped into the PCR reaction tube, and the mutant and wild-type sequences diffuse along the strip and are captured at the corresponding positions on the strip. A red line indicative of a positive reaction is visible after 1 minute. Unlike other systems that require separate reactions and strips for each target sequence, this system allows multiplex PCR reactions and multiplex detection on a single strip or other suitable substrate. Unambiguous visual discrimination of a point mutation under room-temperature hybridization conditions was achieved with this model system within 10 minutes after PCR. The authors have developed a capture-based hybridization method for the detection and discrimination of HSV-1 DNA polymerase genes that contain a single nucleotide change. It has been demonstrated that the hexapet oligonucleotides can be adapted for hybridization on the lateral flow strip platform for discrimination of SNPs. This is the first step in demonstrating SNP detection on lateral flow strips using the hexapet oligonucleotide capture system. It is anticipated that this novel system can be widely used in point-of-care settings.
Abstract:
This paper introduces a method for power system modeling during an earth fault. The possibility of using this method for the selection and adjustment of earth fault protection is pointed out. The paper also compares the simulation results with experimental measurements.
Abstract:
Most magnetic resonance imaging (MRI) spatial encoding techniques employ low-frequency pulsed magnetic field gradients that undesirably induce multiexponentially decaying eddy currents in nearby conducting structures of the MRI system. The eddy currents degrade the switching performance of the gradient system, distort the MRI image, and introduce thermal loads in the cryostat vessel and superconducting MRI components. Heating of superconducting magnets due to induced eddy currents is particularly problematic as it offsets the superconducting operating point, which can cause a system quench. A numerical characterization of transient eddy current effects is vital for their compensation/control and further advancement of the MRI technology as a whole. However, transient eddy current calculations are particularly computationally intensive. In large-scale problems, such as gradient switching in MRI, conventional finite-element method (FEM)-based routines impose very large computational loads during generation/solving of the system equations. Therefore, other computational alternatives need to be explored. This paper outlines a three-dimensional finite-difference time-domain (FDTD) method in cylindrical coordinates for the modeling of low-frequency transient eddy currents in MRI, as an extension to the recently proposed time-harmonic scheme. The weakly coupled Maxwell's equations are adapted to the low-frequency regime by downscaling the speed of light constant, which permits the use of larger FDTD time steps while maintaining the validity of the Courant-Friedrichs-Lewy stability condition. The principal hypothesis of this work is that the modified FDTD routine can be employed to analyze pulsed-gradient-induced, transient eddy currents in superconducting MRI system models. The hypothesis is supported through a verification of the numerical scheme on a canonical problem and by analyzing undesired temporal eddy current effects such as the B0-shift caused by actively shielded symmetric/asymmetric transverse x-gradient head and unshielded z-gradient whole-body coils operating in proximity to a superconducting MRI magnet.
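The speed-of-light downscaling trick can be made concrete via the CFL bound: the maximum stable FDTD time step is inversely proportional to the wave speed, so dividing c by a factor s multiplies the permissible time step by s. A small sketch on a uniform Cartesian grid (the paper's cylindrical-grid details are omitted; the grid spacing and scaling factor are illustrative):

```python
import numpy as np

C0 = 299_792_458.0  # speed of light (m/s)

def cfl_dt(dx, dy, dz, c=C0):
    """Largest time step satisfying the Courant-Friedrichs-Lewy
    condition for a 3D FDTD grid: dt <= 1 / (c * sqrt(sum(1/h^2)))."""
    return 1.0 / (c * np.sqrt(1/dx**2 + 1/dy**2 + 1/dz**2))

h = 1e-3                          # 1 mm grid spacing
scale = 1e6                       # downscaling factor for the low-frequency regime
print(cfl_dt(h, h, h))            # ~1.9e-12 s with the true c
print(cfl_dt(h, h, h, C0/scale))  # ~1.9e-6 s: a millionfold larger step
```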
Abstract:
BACKGROUND: Intervention time series analysis (ITSA) is an important method for analysing the effect of sudden events on time series data. ITSA methods are quasi-experimental in nature, and the validity of modelling with these methods depends upon assumptions about the timing of the intervention and the response of the process to it. METHOD: This paper describes how to apply ITSA to analyse the impact of unplanned events on time series when the timing of the event is not accurately known, so that the usual problems of ITSA methods are magnified by uncertainty in the point of onset of the unplanned intervention. RESULTS: The methods are illustrated using the example of the Australian Heroin Shortage of 2001, which provided an opportunity to study the health and social consequences of an abrupt change in heroin availability in an environment of widespread harm reduction measures. CONCLUSION: Application of these methods enables valuable insights into the consequences of unplanned and poorly identified interventions while minimising the risk of spurious results.
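A common way to operationalize the unknown-onset problem is to fit an intervention model at each candidate onset date and compare the fits by an information criterion. A minimal sketch with a step-shaped intervention and ARMA errors via statsmodels (the model order, the step response, and AIC-based selection are illustrative assumptions, not the paper's specification):

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

def best_onset(y, candidate_indices, order=(1, 0, 0)):
    """Fit a step-intervention model at each candidate onset index
    and return (best_index, fitted_results), chosen by lowest AIC."""
    fits = []
    for t0 in candidate_indices:
        step = (np.arange(len(y)) >= t0).astype(float)  # 0/1 step regressor
        res = SARIMAX(y, exog=step, order=order).fit(disp=False)
        fits.append((res.aic, t0, res))
    _, t0, res = min(fits, key=lambda f: f[0])
    return t0, res
```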
Abstract:
This article first summarizes some available experimental results on the frictional behaviour of contact interfaces and briefly recalls typical frictional experiments and relationships applicable to rock mechanics; a unified description of the entire frictional behaviour is then formulated. The description is based on the experimental results and is applied with a stick-slip decomposition algorithm to describe stick-slip instability phenomena; it can reproduce the effects observed in rock experiments without using the so-called state variable, thus avoiding the related numerical difficulties. The description has been implemented in our finite element code, which uses the node-to-point contact element strategy proposed by the authors to handle frictional contact between multiple finite-deformation bodies with stick and finite frictional slip, and it is applied here to simulate the frictional behaviour of rocks to demonstrate its usefulness and efficiency.
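The stick-slip instability itself can be illustrated with the classic single-degree-of-freedom spring-slider with Coulomb friction (static coefficient above kinetic), which is far simpler than the authors' unified description and stick/slip decomposition but reproduces the same cycling between sticking and slipping; all parameters below are illustrative:

```python
import numpy as np

def spring_slider(mu_s=0.6, mu_k=0.4, k=50.0, m=1.0, normal=10.0,
                  v_drive=0.1, dt=1e-3, steps=20_000):
    """Block of mass m on a surface, pulled through a spring of
    stiffness k whose far end moves at speed v_drive. Returns the
    slider position history, which exhibits stick-slip cycles."""
    x, v = 0.0, 0.0                              # slider position, velocity
    xs = np.empty(steps)
    for i in range(steps):
        pull = k * (v_drive * i * dt - x)        # spring force on the block
        if v == 0.0 and abs(pull) <= mu_s * normal:
            pass                                 # stick: static friction holds
        else:                                    # slip: kinetic friction acts
            sign = np.sign(v) if v != 0.0 else np.sign(pull)
            v += (pull - sign * mu_k * normal) / m * dt
            x += v * dt
            if v * sign < 0.0:                   # velocity reversed: re-stick
                v = 0.0
        xs[i] = x
    return xs
```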