967 results for Point interpolation method


Relevance:

30.00%

Publisher:

Abstract:

3D laser scanning is becoming a standard technology for generating building models of a facility's as-is condition. Since most constructions are built from planar surfaces, recognizing those surfaces paves the way for automating the generation of building models. This paper introduces a new logarithmically proportional objective function that can be used in both heuristic and metaheuristic (MH) algorithms to discover planar surfaces in a point cloud without exploiting any prior knowledge about those surfaces. It can also adapt itself to the structural density of a scanned construction. In this paper, a metaheuristic method, the genetic algorithm (GA), is used to test the introduced objective function on a synthetic point cloud. The results show that the proposed method is capable of finding all plane configurations of planar surfaces (with a wide variety of sizes) in the point cloud, with only a minor distance to the actual configurations. © 2014 IEEE.
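The abstract does not give the exact form of the objective function, but the idea of a logarithmically proportional inlier score for a candidate plane can be sketched as follows (the function name, the distance threshold `eps`, and the `log1p` form are illustrative assumptions, not the paper's formula):

```python
import numpy as np

def plane_fitness(params, points, eps=0.05):
    """Hypothetical logarithmic fitness for a candidate plane.

    params = (nx, ny, nz, d) encodes the plane n.x = d. The score grows
    logarithmically with the number of inliers, so large and small planar
    patches stay comparable (the paper's exact objective differs).
    """
    n = np.asarray(params[:3], dtype=float)
    n = n / np.linalg.norm(n)                 # unit normal
    dist = np.abs(points @ n - params[3])     # point-to-plane distances
    inliers = np.count_nonzero(dist < eps)
    return np.log1p(inliers)                  # logarithmic proportionality

# Synthetic cloud: points on the plane z = 1 plus uniformly scattered noise
rng = np.random.default_rng(0)
plane_pts = np.column_stack([rng.uniform(-1, 1, (200, 2)), np.ones(200)])
noise_pts = rng.uniform(-1, 1, (200, 3))
cloud = np.vstack([plane_pts, noise_pts])

good = plane_fitness((0.0, 0.0, 1.0, 1.0), cloud)  # the true plane
bad = plane_fitness((1.0, 0.0, 0.0, 0.9), cloud)   # an unrelated plane
```

In a GA each chromosome would encode one `params` tuple, with this score as its fitness.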

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this paper is to discuss the linear solution of equality-constrained problems using the frontal solution method without explicit assembling. Design/methodology/approach - A rewritten frontal solution method with a priori pivot and front sequences. OpenMP parallelization, nearly linear (in elimination and substitution) up to 40 threads. Constraints are enforced at the local assembling stage. Findings - Compared with both standard sparse solvers and classical frontal implementations, memory requirements and code size are significantly reduced. Research limitations/implications - Large non-linear problems with constraints typically make use of the Newton method with Lagrange multipliers. For problems with a large number of constraints, matrix transformation methods (MTM) are often more cost-effective. The paper presents a complete solution, with topological ordering, for this problem. Practical implications - A complete software package in Fortran 2003 is described. Examples of clique-based problems are shown, with large systems solved in core. Social implications - More realistic non-linear problems can be solved with this frontal code at the core of the Newton method. Originality/value - Use of topological ordering of constraints; a priori pivot and front sequences; no need for symbolic assembling; constraints treated at the core of the frontal solver; use of OpenMP in the main frontal loop, now quantified; availability of software.
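As a point of reference for the matrix transformation methods (MTM) mentioned above, a minimal dense null-space solve of an equality-constrained linear system can be sketched in a few lines. This illustrates only the MTM idea; the paper's frontal solver eliminates element by element without ever assembling K:

```python
import numpy as np

def solve_equality_constrained(K, f, C, g):
    """Null-space (matrix transformation) solve of K u = f subject to C u = g.

    A small dense illustration: split u into a particular solution of the
    constraints plus a correction in the null space of C, then solve the
    reduced unconstrained system on that null space.
    """
    u_p, *_ = np.linalg.lstsq(C, g, rcond=None)   # particular solution
    _, s, Vt = np.linalg.svd(C)
    rank = int(np.sum(s > 1e-12))
    Z = Vt[rank:].T                               # columns span null(C)
    y = np.linalg.solve(Z.T @ K @ Z, Z.T @ (f - K @ u_p))
    return u_p + Z @ y

K = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
f = np.array([1.0, 2.0, 3.0])
C = np.array([[1.0, 1.0, 1.0]])   # single constraint: sum(u) = 1
g = np.array([1.0])
u = solve_equality_constrained(K, f, C, g)
```

The returned `u` satisfies the constraint exactly, and the residual K u - f lies in the row space of C (i.e., it is absorbed by the Lagrange multipliers).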

Relevance:

30.00%

Publisher:

Abstract:

Hyperspectral imaging has become one of the main topics in remote sensing. Hyperspectral images comprise hundreds of spectral bands at different (almost contiguous) wavelength channels over the same area, generating large data volumes of several GBs per flight. This high spectral resolution can be used for object detection and for discriminating between different objects based on their spectral characteristics. One of the main problems in hyperspectral analysis is the presence of mixed pixels, which arise when the spatial resolution of the sensor is not able to separate spectrally distinct materials. Spectral unmixing is one of the most important tasks in hyperspectral data exploitation. However, unmixing algorithms can be computationally very expensive and power-consuming, which compromises their use in applications under on-board constraints. In recent years, graphics processing units (GPUs) have evolved into highly parallel and programmable systems. Several hyperspectral imaging algorithms have been shown to benefit from this hardware, taking advantage of the extremely high floating-point performance, compact size, huge memory bandwidth, and relatively low cost of these units, which make them appealing for on-board data processing. In this paper, we propose a parallel implementation of an augmented-Lagrangian-based method for unsupervised hyperspectral linear unmixing on GPUs using CUDA. The method, called simplex identification via split augmented Lagrangian (SISAL), identifies the endmembers of a scene; i.e., it is able to unmix hyperspectral data sets in which the pure-pixel assumption is violated. The efficient implementation of the SISAL method presented in this work exploits the GPU architecture at a low level, using shared memory and coalesced memory accesses.
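For readers unfamiliar with the linear mixing model that SISAL operates on, a toy abundance inversion can be sketched as follows. This is a hedged illustration only: SISAL itself estimates the endmember matrix via a simplex criterion under the split augmented Lagrangian, which this least-squares recovery does not attempt:

```python
import numpy as np

# Linear mixing model: each pixel spectrum is a convex combination of the
# endmember spectra (columns of M). Given known endmembers, abundances can
# be recovered by least squares; SISAL's harder job is finding M itself
# when no pure pixels exist in the scene.
rng = np.random.default_rng(1)
M = rng.uniform(0.0, 1.0, (50, 3))     # 50 spectral bands, 3 endmembers
a_true = np.array([0.6, 0.3, 0.1])     # abundances, summing to one
y = M @ a_true                         # noise-free mixed pixel

a_est, *_ = np.linalg.lstsq(M, y, rcond=None)
```

With noise-free data and a full-column-rank M, the abundances are recovered exactly; real pipelines add non-negativity and sum-to-one constraints.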

Relevance:

30.00%

Publisher:

Abstract:

A new iterative algorithm based on the inexact-restoration (IR) approach combined with the filter strategy for solving nonlinear constrained optimization problems is presented. The high-level algorithm was suggested by Gonzaga et al. (SIAM J. Optim. 14:646–669, 2003) but not yet implemented, as the internal algorithms were not proposed. The filter, a concept introduced by Fletcher and Leyffer (Math. Program. Ser. A 91:239–269, 2002), replaces the merit function, avoiding penalty parameter estimation and the difficulties related to nondifferentiability. In the IR approach, two independent phases are performed in each iteration: the feasibility phase and the optimality phase. The line-search filter is combined with the feasibility phase to generate a "more feasible" point, and is then used in the optimality phase to reach an "optimal" point. Numerical experiments with a collection of AMPL problems and a performance comparison with IPOPT are provided.
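The dominance test at the heart of a Fletcher–Leyffer filter can be sketched as follows (the margin `gamma` and the exact envelope rule are illustrative assumptions; implementations differ in how sufficient decrease is imposed):

```python
def acceptable(point, filter_set, gamma=1e-5):
    """Filter acceptance test, Fletcher–Leyffer style sketch.

    point = (f, h): objective value and constraint violation. A trial
    point is accepted when no stored pair dominates it; the small margin
    gamma enforces a slanted envelope around each filter entry.
    """
    f, h = point
    for f_k, h_k in filter_set:
        if f >= f_k - gamma * h_k and h >= h_k - gamma * h_k:
            return False          # dominated by entry (f_k, h_k)
    return True

# A filter with two entries: trade-offs between objective and violation
filt = [(1.0, 0.5), (0.5, 1.0)]
```

A point that improves on both measures, e.g. `(0.4, 0.4)`, is accepted; one dominated on both, e.g. `(2.0, 2.0)`, is rejected and the step is shortened or the restoration phase is invoked.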

Relevance:

30.00%

Publisher:

Abstract:

Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies.

Relevance:

30.00%

Publisher:

Abstract:

Dissertation submitted to obtain the degree of Master in Electrical and Computer Engineering

Relevance:

30.00%

Publisher:

Abstract:

A Work Project, presented as part of the requirements for the Award of a Master’s Double Degree in Finance from Maastricht University and NOVA – School of Business and Economics

Relevance:

30.00%

Publisher:

Abstract:

Diabetic neuropathy is an important complication of the disease, responsible for ulceration and amputation of the foot. Prevention of these problems is difficult, mainly because there is no method to correctly assess sensibility on the skin of the foot. The introduction of the Pressure-Specified Sensory Device (PSSD™) in the last decade made possible the measurement of pressure thresholds sensed by the patient, such as touch, both static and moving, on a continuous scale. This paper is the first in Brazil to report the use of this device to measure cutaneous sensibility in 3 areas of the foot: the hallux pulp, the calcaneus, and the dorsum, which are territories of the tibial and fibular nerves. METHOD: Non-diabetic patients were measured as controls, and 2 groups of diabetic patients, with and without ulcers, were compared. The PSSD™ was used to test the 3 areas described above. The following were evaluated: 1PS (1-point static), 1PD (1-point dynamic), 2PS (2-point static), and 2PD (2-point dynamic). RESULTS: The diabetic group had poorer sensibility compared with controls, and diabetics with ulcers had poorer sensibility compared with diabetics without ulcers. The differences were statistically significant (P < .001). CONCLUSION: Due to the small number of patients compared, the results should be taken as a preliminary report.

Relevance:

30.00%

Publisher:

Abstract:

A new very high-order finite volume method for solving problems with harmonic and biharmonic operators on one-dimensional geometries is proposed. The main ingredient is a polynomial reconstruction based on local interpolation of mean values, providing accurate approximations of the solution up to sixth-order accuracy. First developed for the harmonic operator, an extension to the biharmonic operator is obtained, which allows designing a very high-order finite volume scheme where the solution is obtained by solving a matrix-free problem. An application in elasticity coupling the two operators is presented: a beam subject to a combination of tensile and bending loads, where the main goal is determining the critical stress point for an intramedullary nail.
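The key ingredient, polynomial reconstruction from cell mean values, can be sketched at third order on a uniform 1D grid (a minimal illustration of the idea, not the paper's sixth-order operator):

```python
import numpy as np

# Reconstruct the quadratic whose average over each of three adjacent
# cells matches the given cell-mean values. The mean of x^2 over a cell
# of width h centred at xc is xc^2 + h^2/12, which is what distinguishes
# mean-value interpolation from plain point interpolation.
h = 0.1
centers = np.array([-h, 0.0, h])           # three adjacent cell centres

# Exact cell means of the test quadratic u(x) = 1 + 2x + 3x^2
means = 1.0 + 2.0 * centers + 3.0 * (centers**2 + h**2 / 12)

# Rows: cell averages of the monomials 1, x, x^2 over each cell
A = np.column_stack([np.ones(3), centers, centers**2 + h**2 / 12])
coeffs = np.linalg.solve(A, means)
```

Because the target is itself a quadratic, the reconstruction recovers the coefficients (1, 2, 3) exactly; for general data the same system yields a locally third-order-accurate approximation.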

Relevance:

30.00%

Publisher:

Abstract:

[Excerpt] Introduction: There has been a considerable amount of controversy about the use of manometric methods to measure catalase activity. As Maehly and Chance point out in their excellent review, the advantage of these methods is "... that they can be used for any kind of biological material, and purification of the enzyme is not required. The assay is independent of small amounts of peroxidase activity. It is fairly simple to perform, it is rapid and it can be adapted to continuous reading of the reaction". A variety of drawbacks are also listed by the same authors, viz. the inactivation of the enzyme under the experimental conditions and the time lag before a constant rate of oxygen evolution is reached. [...]

Relevance:

30.00%

Publisher:

Abstract:

Seismic analysis, horizon matching, fault tracking, marked point process, stochastic annealing

Relevance:

30.00%

Publisher:

Abstract:

For many drugs, finding the balance between efficacy and toxicity requires monitoring their concentrations in the patient's blood. Quantifying drug levels at the bedside or at home would have advantages in terms of therapeutic outcome and convenience, but current techniques require a diagnostic laboratory setting. We have developed semisynthetic bioluminescent sensors that permit precise measurements of drug concentrations in patient samples by spotting minimal volumes on paper and recording the signal with a simple point-and-shoot camera. Our sensors have a modular design consisting of a protein-based part and a synthetic part, and can be engineered to selectively recognize a wide range of drugs, including immunosuppressants, antiepileptics, anticancer agents and antiarrhythmics. This low-cost point-of-care method could make therapies safer, increase convenience for doctors and patients, and make therapeutic drug monitoring available in regions with poor infrastructure.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Iron deficiency is a common and undertreated problem in inflammatory bowel disease (IBD). AIM: To develop an online tool to support treatment choice at the patient-specific level. METHODS: Using the RAND/UCLA Appropriateness Method (RUAM), a European expert panel assessed the appropriateness of treatment regimens for a variety of clinical scenarios in patients with non-anaemic iron deficiency (NAID) and iron deficiency anaemia (IDA). Treatment options included adjustment of IBD medication only, oral iron supplementation, high-/low-dose intravenous (IV) regimens, IV iron plus erythropoietin-stimulating agent (ESA), and blood transfusion. The panel process consisted of two individual rating rounds (1148 treatment indications; 9-point scale) and three plenary discussion meetings. RESULTS: The panel reached agreement on 71% of treatment indications. 'No treatment' was never considered appropriate, and repeat treatment after previous failure was generally discouraged. For 98% of scenarios, at least one treatment was appropriate. Adjustment of IBD medication was deemed appropriate in all patients with active disease. Use of oral iron was mainly considered an option in NAID and mildly anaemic patients without disease activity. IV regimens were often judged appropriate, with high-dose IV iron being the preferred option in 77% of IDA scenarios. Blood transfusion and IV+ESA were indicated in exceptional cases only. CONCLUSIONS: The RUAM revealed high agreement amongst experts on the management of iron deficiency in patients with IBD. High-dose IV iron was more often considered appropriate than other options. To facilitate dissemination of the recommendations, panel outcomes were embedded in an online tool, accessible via http://ferroscope.com/.

Relevance:

30.00%

Publisher:

Abstract:

The use of Geographic Information Systems has revolutionized the handling and visualization of geo-referenced data and has underlined the critical role of spatial analysis. The usual tools for this purpose come from geostatistics, which is widely used in the Earth sciences. Geostatistics is based upon several hypotheses that are not always verified in practice. On the other hand, an Artificial Neural Network (ANN) can a priori be used without special assumptions and is known to be flexible. This paper discusses the application of ANNs to the interpolation of a geo-referenced variable.
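A minimal sketch of the ANN-as-interpolator idea: fit a one-hidden-layer network to scattered (x, y) samples of a field. The architecture, learning rate, and target field are illustrative assumptions, not the paper's setup:

```python
import numpy as np

# Scattered samples of a geo-referenced variable z(x, y)
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (200, 2))                        # sample coordinates
z = np.sin(np.pi * X[:, 0]) * np.cos(np.pi * X[:, 1])   # field to interpolate

# One hidden layer of 16 tanh units, trained by plain gradient descent
W1 = rng.normal(0, 0.5, (2, 16)); b1 = np.zeros(16)
w2 = rng.normal(0, 0.5, 16);      b2 = 0.0
lr = 0.05

def forward(X):
    H = np.tanh(X @ W1 + b1)
    return H, H @ w2 + b2

_, pred0 = forward(X)
loss0 = np.mean((pred0 - z) ** 2)        # mean-squared error at start

for _ in range(500):
    H, pred = forward(X)
    err = pred - z                        # gradient of MSE w.r.t. pred (up to 2/N)
    gw2 = H.T @ err / len(X); gb2 = err.mean()
    dH = np.outer(err, w2) * (1 - H**2)   # backprop through tanh
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    w2 -= lr * gw2; b2 -= lr * gb2

_, pred = forward(X)
loss = np.mean((pred - z) ** 2)
```

Unlike kriging, no variogram or stationarity hypothesis is needed; the trained network interpolates by evaluating `forward` at unsampled coordinates.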

Relevance:

30.00%

Publisher:

Abstract:

We present two new stabilized high-resolution numerical methods, for the convection–diffusion–reaction (CDR) equation and for the Helmholtz equation, respectively. The work embarks upon an a priori analysis of consistency-recovery procedures for stabilization methods belonging to the Petrov–Galerkin framework. It was found that the use of some standard practices (e.g. M-matrix theory) for the design of essentially non-oscillatory numerical methods is not feasible when consistency-recovery methods are employed. Hence, with respect to convective stabilization, such recovery methods are not preferred. Next, we present the design of a high-resolution Petrov–Galerkin (HRPG) method for the 1D CDR problem. The problem is studied from a fresh point of view, including practical implications for the formulation of the maximum principle, M-matrix theory, monotonicity, and total variation diminishing (TVD) finite volume schemes. The resulting method follows the line of earlier methods that may be viewed as an upwind operator plus a discontinuity-capturing operator. Some remarks are then made on the extension of the HRPG method to multiple dimensions. Next, we present a new numerical scheme for the Helmholtz equation that yields quasi-exact solutions. The focus is on approximating the solution to the Helmholtz equation in the interior of the domain using compact stencils. Piecewise linear/bilinear polynomial interpolation is considered on a structured mesh/grid. The only a priori requirement is a mesh/grid resolution of at least eight elements per wavelength. No stabilization parameters are involved in the definition of the scheme, which consists of taking the average of the equation stencils obtained by the standard Galerkin finite element method and the classical finite difference method. Dispersion analyses in 1D and 2D illustrate the quasi-exact properties of this scheme. Finally, some remarks are made on the extension of the scheme to unstructured meshes by designing a method within the Petrov–Galerkin framework.
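The averaged interior stencil for the Helmholtz equation can be sketched in 1D; the wavenumber, mesh, and boundary data below are illustrative:

```python
import numpy as np

# 1D Helmholtz -u'' - k^2 u = 0 on [0, 1] with u(0) = 0, u(1) = sin(k),
# whose exact solution is u(x) = sin(k x). Averaging the scaled FDM
# equation, (1/h)[-1, 2, -1] - k^2 h [0, 1, 0], with the linear-FEM
# equation, (1/h)[-1, 2, -1] - k^2 h [1/6, 4/6, 1/6], gives the mass
# weights [1/12, 10/12, 1/12] used below.
k, n = 10.0, 40                        # well above 8 points per wavelength
h = 1.0 / n
x = np.linspace(0.0, 1.0, n + 1)

lo = -1.0 / h - k**2 * h / 12.0        # off-diagonal stencil entry
di = 2.0 / h - k**2 * h * 10.0 / 12.0  # diagonal stencil entry

A = np.zeros((n - 1, n - 1))           # system for interior unknowns
np.fill_diagonal(A, di)
A[np.arange(n - 2), np.arange(1, n - 1)] = lo
A[np.arange(1, n - 1), np.arange(n - 2)] = lo

b = np.zeros(n - 1)
b[-1] = -lo * np.sin(k)                # Dirichlet datum u(1) = sin(k)
u = np.linalg.solve(A, b)
err = np.max(np.abs(u - np.sin(k * x[1:-1])))
```

A quick Taylor expansion of the resulting dispersion relation, cos(k̃h) = (1 - 5(kh)²/12)/(1 + (kh)²/12) = 1 - (kh)²/2 + (kh)⁴/24 + O((kh)⁶), matches cos(kh) through fourth order, which is why the error above is orders of magnitude below that of either parent scheme on the same mesh.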