881 results for mining algorithm
Abstract:
This paper compares two well-known scan matching algorithms: MbICP and pIC. As a result of the study, the MSISpIC, a probabilistic scan matching algorithm for the localization of an Autonomous Underwater Vehicle (AUV), is proposed. The technique uses range scans gathered with a Mechanical Scanning Imaging Sonar (MSIS) and the robot displacement estimated through dead reckoning with the help of a Doppler Velocity Log (DVL) and a Motion Reference Unit (MRU). The proposed method is an extension of the pIC algorithm. Its major contributions are: 1) using an EKF to estimate the local path traveled by the robot while gathering the scan, as well as its uncertainty, and 2) proposing a method to group all the data gathered along the robot's path into a single scan with a convenient uncertainty model. The algorithm has been tested on an AUV guided along a 600 m path within a marina environment with satisfactory results.
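The abstract does not give the filter equations; as a rough illustration of contribution (1), the sketch below shows a generic EKF-style dead-reckoning prediction step that propagates a planar pose estimate and its covariance from a body-frame velocity (as a DVL would provide) and a heading (as an MRU would provide). The constant-velocity motion model and all names are assumptions for illustration, not the MSISpIC implementation.

```python
import numpy as np

def dead_reckoning_predict(x, P, v_body, yaw, dt, Q):
    """Propagate a planar position estimate x = [x, y] and its covariance P
    from a body-frame velocity and a heading. Illustrative sketch only,
    not the MSISpIC equations."""
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, -s],
                  [s,  c]])          # body-to-world rotation
    x_new = x + dt * R @ v_body      # constant-velocity motion model
    F = np.eye(2)                    # Jacobian of the motion model w.r.t. x
    P_new = F @ P @ F.T + Q          # uncertainty grows with process noise Q
    return x_new, P_new

# Toy usage: uncertainty accumulates while a scan is being gathered.
x, P = np.zeros(2), np.eye(2) * 1e-4
for _ in range(10):                  # ten DVL/MRU updates during one scan
    x, P = dead_reckoning_predict(x, P, v_body=np.array([0.5, 0.0]),
                                  yaw=0.1, dt=0.2, Q=np.eye(2) * 1e-3)
print(x, np.diag(P))
```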
Abstract:
Ultra-high-throughput sequencing (UHTS) techniques are evolving rapidly and may soon become an affordable and routine tool for sequencing plant DNA, even in smaller plant biology labs. Here we review recent insights into intraspecific genome variation gained from UHTS, which offers a glimpse of the rather unexpected levels of structural variability among Arabidopsis thaliana accessions. The challenges that will need to be addressed to efficiently assemble and exploit this information are also discussed.
Abstract:
Nominal unification is an extension of first-order unification where terms can contain binders and unification is performed modulo α-equivalence. Here we prove that the existence of nominal unifiers can be decided in quadratic time. First, we reduce nominal unification problems, in linear time, to a sequence of freshness constraints and equalities between atoms modulo a permutation, using ideas from Paterson and Wegman's first-order unification algorithm. Second, we prove that solvability of these reduced problems can be checked in quadratic time. Finally, we point out how, using ideas of Brown and Tarjan for unbalanced merging, these reduced problems could be solved more efficiently.
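For readers unfamiliar with the baseline that nominal unification extends, the sketch below is a plain Robinson-style first-order unifier with an occurs check. It handles neither binders, α-equivalence, freshness constraints nor permutations, and it is not the quadratic-time nominal procedure of the paper; the term representation (uppercase strings as variables, tuples as compound terms) is an assumption made for the example.

```python
# Terms: variables are strings starting with an uppercase letter; compound
# terms are tuples (function_symbol, arg1, ..., argn); other strings are constants.
def is_var(t):
    return isinstance(t, str) and t[:1].isupper()

def walk(t, subst):
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def occurs(v, t, subst):
    t = walk(t, subst)
    if t == v:
        return True
    return isinstance(t, tuple) and any(occurs(v, a, subst) for a in t[1:])

def unify(s, t, subst=None):
    """Return a most general unifier as a dict, or None if none exists."""
    subst = {} if subst is None else subst
    s, t = walk(s, subst), walk(t, subst)
    if s == t:
        return subst
    if is_var(s):
        return None if occurs(s, t, subst) else {**subst, s: t}
    if is_var(t):
        return unify(t, s, subst)
    if isinstance(s, tuple) and isinstance(t, tuple) \
            and s[0] == t[0] and len(s) == len(t):
        for a, b in zip(s[1:], t[1:]):
            subst = unify(a, b, subst)
            if subst is None:
                return None
        return subst
    return None

# f(X, g(a)) unifies with f(b, g(Y)) under {X: 'b', Y: 'a'}.
print(unify(('f', 'X', ('g', 'a')), ('f', 'b', ('g', 'Y'))))
```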
Abstract:
Background: We previously derived a clinical prognostic algorithm to identify patients with pulmonary embolism (PE) who are at low risk of short-term mortality and could be safely discharged early or treated entirely in an outpatient setting. Objectives: To externally validate the clinical prognostic algorithm in an independent patient sample. Methods: We validated the algorithm in 983 consecutive patients prospectively diagnosed with PE at the emergency department of a university hospital. Patients with none of the algorithm's 10 prognostic variables (age ≥ 70 years, cancer, heart failure, chronic lung disease, chronic renal disease, cerebrovascular disease, pulse ≥ 110/min, systolic blood pressure < 100 mm Hg, oxygen saturation < 90%, and altered mental status) at baseline were defined as low-risk. We compared 30-day overall mortality among low-risk patients based on the algorithm between the validation and the original derivation sample. We also assessed the rate of PE-related and bleeding-related mortality among low-risk patients. Results: Overall, the algorithm classified 16.3% of patients with PE as low-risk. Mortality at 30 days was 1.9% among low-risk patients and did not differ between the validation and the original derivation sample. Among low-risk patients, only 0.6% died from definite or possible PE, and none died from bleeding. Conclusions: This study validates an easy-to-use clinical prognostic algorithm for PE that accurately identifies patients at low risk of short-term mortality. Low-risk patients based on our algorithm are potential candidates for less costly outpatient treatment.
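Since the abstract lists all ten prognostic variables, the rule it describes (low risk = none of the variables present at baseline) can be written as a direct check. The sketch below only encodes that rule with hypothetical field names; it is an illustration, not a validated clinical tool.

```python
def is_low_risk(p):
    """True if a patient with PE has none of the 10 prognostic variables
    listed in the abstract. Field names are hypothetical; illustrative only."""
    high_risk_findings = [
        p["age_years"] >= 70,
        p["cancer"],
        p["heart_failure"],
        p["chronic_lung_disease"],
        p["chronic_renal_disease"],
        p["cerebrovascular_disease"],
        p["pulse_per_min"] >= 110,
        p["systolic_bp_mmHg"] < 100,
        p["oxygen_saturation_pct"] < 90,
        p["altered_mental_status"],
    ]
    return not any(high_risk_findings)

example = dict(age_years=54, cancer=False, heart_failure=False,
               chronic_lung_disease=False, chronic_renal_disease=False,
               cerebrovascular_disease=False, pulse_per_min=88,
               systolic_bp_mmHg=128, oxygen_saturation_pct=96,
               altered_mental_status=False)
print(is_low_risk(example))  # True: none of the 10 variables is present
```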
Abstract:
The development and testing of an iterative reconstruction algorithm for emission tomography based on Bayesian statistical concepts are described. The algorithm uses the entropy of the generated image as a prior distribution, can be accelerated by the choice of an exponent, and converges uniformly to feasible images through the choice of a single adjustable parameter. A feasible image is defined as one that is consistent with the initial data (i.e., an image that, if it were truly the source of radiation in a patient, could have generated the initial data by the Poisson process that governs radioactive disintegration). The fundamental ideas of Bayesian reconstruction are discussed, along with the use of an entropy prior with an adjustable contrast parameter, the use of likelihood with data increment parameters as the conditional probability, and the development of the new fast maximum a posteriori with entropy (FMAPE) algorithm by the successive substitution method. It is shown that, in the maximum likelihood estimator (MLE) and FMAPE algorithms, the only correct choice of initial image for the iterative procedure in the absence of a priori knowledge about the image configuration is a uniform field.
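The FMAPE update itself is not reproduced in the abstract; as background, the sketch below shows the classical MLE (MLEM) iteration it builds on, started from the uniform initial image the abstract recommends. The system matrix and data here are random placeholders, not a real tomographic geometry.

```python
import numpy as np

def mlem(A, y, n_iter=50):
    """Classical MLEM iteration for emission tomography:
    x_j <- x_j / sum_i A_ij * sum_i A_ij * y_i / (A x)_i.
    Started from a uniform image, as the abstract recommends in the absence
    of prior knowledge. Illustrative baseline, not the FMAPE algorithm."""
    n_pix = A.shape[1]
    x = np.full(n_pix, y.sum() / n_pix)       # uniform initial image
    sensitivity = A.sum(axis=0)               # sum_i A_ij
    for _ in range(n_iter):
        projection = A @ x                    # expected counts (A x)_i
        ratio = y / np.maximum(projection, 1e-12)
        x = x / sensitivity * (A.T @ ratio)   # multiplicative update
    return x

# Toy problem: random system matrix and Poisson-distributed counts.
rng = np.random.default_rng(0)
A = rng.random((40, 16))
x_true = rng.random(16)
y = rng.poisson(A @ x_true)
print(mlem(A, y)[:4])
```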
Abstract:
Tannery residues and coal mine waste are heavily polluting sources in Brazil, mainly in the southern states of Rio Grande do Sul and Santa Catarina. To study the effects of residues of chrome leather tanning (sludge and leather shavings) and coal waste on soybean and maize crops, a field experiment has been in progress since 1996 at the Federal University of Rio Grande do Sul Experimental Station, in the county of Eldorado do Sul, Brazil. The residues were applied twice (growing seasons 1996/97 and 1999/00). Tannery residues were applied in amounts according to their neutralizing value, at rates of up to 86.8 t ha-1, supplying from 671 to 1,342 kg ha-1 of Cr(III); coal waste was applied at a total rate of 164 t ha-1. Crop yield and dry matter production were evaluated, as well as the nutrient (N, P, K, Ca, Mg, Cu and Zn) and Cr contents. Crop yields with tannery sludge application were similar to those obtained with N and lime supplied as mineral amendments. Plant Cr absorption did not increase significantly with residue application. Tannery sludge can also be used to neutralize the high acidity developed in the soil by coal mine waste.
Abstract:
We consider stochastic partial differential equations with multiplicative noise. We derive an algorithm for the computer simulation of these equations. The algorithm is applied to study domain growth of a model with a conserved order parameter. The numerical results corroborate previous analytical predictions obtained by linear analysis.
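The paper's specific algorithm is not reproduced in the abstract; the sketch below is only a naive explicit Euler-Maruyama (Itô) discretisation of a one-dimensional diffusion equation with multiplicative noise, ∂t φ = D ∂xx φ + σ φ ξ(x,t), to illustrate the kind of simulation involved. It ignores the interpretation (Itô vs. Stratonovich) and stability issues that a careful algorithm, such as the one derived in the paper, must handle, and the model is not the conserved-order-parameter model studied there.

```python
import numpy as np

def step(phi, dt, dx, D, sigma, rng):
    """One explicit Euler-Maruyama step of d(phi) = D phi_xx dt + sigma phi dW
    on a 1-D periodic grid. Naive illustration, not the paper's algorithm."""
    lap = (np.roll(phi, -1) - 2.0 * phi + np.roll(phi, 1)) / dx**2
    # Space-time white noise scales like sqrt(dt/dx) on a lattice.
    noise = rng.standard_normal(phi.size) * np.sqrt(dt / dx)
    return phi + dt * D * lap + sigma * phi * noise

rng = np.random.default_rng(1)
phi = 1.0 + 0.01 * rng.standard_normal(128)   # small fluctuations around 1
for _ in range(1000):
    phi = step(phi, dt=1e-4, dx=0.1, D=1.0, sigma=0.1, rng=rng)
print(phi.mean(), phi.std())
```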
Abstract:
We apply majorization theory to study the quantum algorithms known so far and find that there is a majorization principle underlying the way they operate. Grover's algorithm is a neat instance of this principle, where majorization works step by step until the optimal target state is found. Extensions of this situation are also found in algorithms based on quantum adiabatic evolution and in the family of quantum phase-estimation algorithms, including Shor's algorithm. We state that in quantum algorithms the time arrow is a majorization arrow.
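As a concrete illustration of the claim for Grover's algorithm, the sketch below computes the exact probability distribution over the N basis states after each Grover iteration (marked-state amplitude sin((2k+1)θ)/√M, unmarked cos((2k+1)θ)/√(N−M), with sin θ = √(M/N)) and checks that each step's distribution majorizes the previous one up to the optimal iteration. The parameters N = 256, M = 1 are arbitrary choices for the example.

```python
import numpy as np

def grover_distribution(N, M, k):
    """Probability over the N basis states after k Grover iterations,
    with M marked states."""
    theta = np.arcsin(np.sqrt(M / N))
    p_marked = np.sin((2 * k + 1) * theta) ** 2 / M
    p_unmarked = np.cos((2 * k + 1) * theta) ** 2 / (N - M)
    return np.concatenate([np.full(M, p_marked), np.full(N - M, p_unmarked)])

def majorizes(p, q):
    """True if p majorizes q: every prefix sum of sorted(p) >= that of sorted(q)."""
    cp = np.cumsum(np.sort(p)[::-1])
    cq = np.cumsum(np.sort(q)[::-1])
    return np.all(cp >= cq - 1e-12)

N, M = 256, 1
k_opt = int(np.floor(np.pi / (4 * np.arcsin(np.sqrt(M / N)))))
dists = [grover_distribution(N, M, k) for k in range(k_opt + 1)]
# Step-by-step majorization up to the optimal number of iterations.
print(all(majorizes(dists[k + 1], dists[k]) for k in range(k_opt)))  # True
```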
Abstract:
Tests for bioaccessibility are useful in human health risk assessment. No research data aimed at determining bioaccessible arsenic (As) in areas affected by gold mining and smelting activities have been published in Brazil so far. Samples were collected from four areas: a private natural land reserve of Cerrado; mine tailings; overburden; and refuse from gold smelting of a mining company in Paracatu, Minas Gerais. The total, bioaccessible and Mehlich-1-extractable As levels were determined. Based on the reproducibility and the accuracy/precision of the in vitro gastrointestinal (IVG) method for determining bioaccessible As in the reference material NIST 2710, it was concluded that this procedure is adequate to determine bioaccessible As in soil and tailing samples from gold mining areas in Brazil. All samples from the studied mining area contained low percentages of bioaccessible As.
Abstract:
We herein present a preliminary practical algorithm for evaluating complementary and alternative medicine (CAM) for children that relies on basic bioethical principles and considers the influence of CAM on global child healthcare. CAM is currently involved in almost all sectors of pediatric care and frequently represents a challenge to the pediatrician. The aim of this article is to provide a decision-making tool to assist the physician, especially as it remains difficult to keep up to date with the latest developments in the field. The reasonable application of our algorithm, together with common sense, should enable the pediatrician to decide whether pediatric (P)-CAM represents potential harm to the patient and allow ethically sound counseling. In conclusion, we propose a pragmatic algorithm designed to evaluate P-CAM, briefly explain the underlying rationale, and give a concrete clinical example.
Abstract:
We present a numerical method for spectroscopic ellipsometry of thick transparent films. Assuming an analytical expression for the dispersion of the refractive index (containing several unknown coefficients), the procedure fits the coefficients at a fixed thickness; the thickness is then varied within a range around its approximate value. The sample thickness is taken to be the one that gives the best fit, and the refractive index is defined by the coefficients obtained for that thickness.
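The abstract describes the outer structure of the method (fit the dispersion coefficients at each candidate thickness, then keep the thickness with the smallest residual). The sketch below shows only that structure, with a hypothetical Cauchy dispersion law n(λ) = A + B/λ² and a placeholder toy_signal function standing in for the actual ellipsometric model, which the abstract does not specify.

```python
import numpy as np
from scipy.optimize import least_squares

def cauchy_n(wl_nm, A, B):
    """Hypothetical Cauchy dispersion law n(lambda) = A + B / lambda^2."""
    return A + B / wl_nm**2

def toy_signal(wl_nm, d_nm, A, B):
    """Placeholder for the spectroscopic response of a thick transparent film:
    an amplitude factor plus a thickness-dependent interference term.
    The real psi/delta model is not given in the abstract."""
    n = cauchy_n(wl_nm, A, B)
    r = (n - 1.0) / (n + 1.0)
    return r**2 * (1.0 + np.cos(4.0 * np.pi * n * d_nm / wl_nm))

def fit_film(wl_nm, measured, thickness_grid, coeff_guess=(1.5, 3e3)):
    """Fit the dispersion coefficients at each candidate thickness and keep
    the thickness whose best fit gives the smallest residual."""
    best = None
    for d in thickness_grid:
        res = least_squares(lambda c: toy_signal(wl_nm, d, *c) - measured,
                            x0=coeff_guess)
        if best is None or res.cost < best[2]:
            best = (d, res.x, res.cost)
    return best  # (thickness_nm, [A, B], residual)

wl = np.linspace(400.0, 800.0, 200)
data = toy_signal(wl, d_nm=300.0, A=1.46, B=3.5e3)   # synthetic measurement
print(fit_film(wl, data, np.arange(200.0, 400.0, 5.0)))
```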
Abstract:
The construction of a soil after surface coal mining involves heavy machinery traffic during the topographic regeneration of the area, resulting in compaction of the relocated soil layers. This leads to problems with water infiltration and redistribution along the new profile, causing water erosion and consequently hampering the revegetation of the reconstructed soil. The planting of species useful in the process of soil decompaction is a promising strategy for the recovery of soil structural quality. This study investigated the influence of different perennial grasses, planted in September/October 2007, on the recovery of reconstructed soil aggregation in a coal mining area of the Companhia Riograndense de Mineração, located in Candiota-RS. The treatments consisted of planting: T1 - Cynodon dactylon cv. vaquero; T2 - Urochloa brizantha; T3 - Panicum maximum; T4 - Urochloa humidicola; T5 - Hemarthria altissima; T6 - Cynodon dactylon cv. tifton 85. Bare reconstructed soil adjacent to the experimental area was used as the control treatment (T7), and natural soil adjacent to the mining area, covered with native vegetation, was used as the reference area (T8). Disturbed and undisturbed soil samples were collected in October 2009 (0.00-0.05 and 0.10-0.15 m layers) to determine the percentage of macro- and microaggregates, mean weight diameter (MWD) of aggregates, organic matter content, bulk density, and macro- and microporosity. The lower values of macroaggregates and MWD in the surface layer than in the subsurface layer of the reconstructed soil resulted from the high degree of compaction caused by the traffic of heavy machinery on the clayey material. After 24 months, all grass treatments showed improvements in soil aggregation compared to the bare reconstructed soil (control), mainly in the 0.00-0.05 m layer, particularly the two Urochloa treatments (T2 and T4) and Hemarthria altissima (T5). However, the large differences between the treatments with grasses and the natural soil (reference) indicate that the recovery of the pre-mining soil structure could take decades.