980 results for Point cloud processing
Abstract:
Objective: The purpose of this study was to compare the dental movement that occurs during the processing of maxillary complete dentures with 3 different base thicknesses, using 2 investment methods and microwave polymerization. Methods: A sample of 42 denture models was randomly divided into 6 groups (n = 7), with base thicknesses of 1.25, 2.50, and 3.75 mm and gypsum or silicone flask investment. Points were demarcated on the distal surface of the second molars and on the back of the gypsum cast at the alveolar ridge level to allow linear and angular measurement using AutoCAD software. The data were subjected to two-way analysis of variance and to Tukey and Fisher post hoc tests. Results: Angular analysis of the methods and their interactions showed a statistical difference (P = 0.023) when the magnitudes of molar inclination were compared. Tooth movement was greater for thin-base prostheses, 1.25 mm (-0.234), than for thick ones, 3.75 mm (0.2395), with antagonistic behavior. Prosthesis investment with silicone (0.053) showed greater vertical change than gypsum investment (0.032). There was a difference between the points of analysis, demonstrating that the changes were not symmetric. Conclusions: All groups evaluated showed a change in the position of the artificial teeth after processing. The complete denture with a thin base (1.25 mm) and silicone investment showed the worst results, whereas the intermediate thickness (2.50 mm) proved to be ideal for the denture base.
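The statistical treatment described above (two-way analysis of variance followed by post hoc comparisons) can be illustrated with a minimal Python sketch using statsmodels; the synthetic data, column names, and group coding below are illustrative placeholders, not the study's data.

    # Minimal sketch of a two-way ANOVA with Tukey post hoc comparisons,
    # using synthetic stand-in data (7 specimens per thickness x investment group).
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf
    from statsmodels.stats.multicomp import pairwise_tukeyhsd

    rng = np.random.default_rng(0)
    rows = []
    for thickness in (1.25, 2.50, 3.75):
        for investment in ("gypsum", "silicone"):
            for _ in range(7):
                rows.append({"thickness": thickness,
                             "investment": investment,
                             "displacement": rng.normal(0.0, 0.1)})
    df = pd.DataFrame(rows)

    # Two-way ANOVA with interaction between base thickness and investment method
    model = smf.ols("displacement ~ C(thickness) * C(investment)", data=df).fit()
    print(sm.stats.anova_lm(model, typ=2))

    # Tukey HSD over the six thickness/investment groups
    groups = df["thickness"].astype(str) + "/" + df["investment"]
    print(pairwise_tukeyhsd(df["displacement"], groups))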
Abstract:
Peanuts are likely to be infested by fungi, with consequent contamination by aflatoxin, in post-harvest industries. A hazard analysis and critical control point (HACCP) plan is proposed for a typical Brazilian post-harvest industry, from raw in-shell reception to the transportation of unpeeled peanuts. Codex Alimentarius Commission guidelines were followed, and four critical control points (CCP) for aflatoxin were identified. The process steps with the highest probability of aflatoxin occurrence (risk) are the in-shell reception, the dried in-shell storage, and the unpeeled kernel storage. During the storage steps there is a lack of control of air moisture and temperature. Therefore, there is no option but to keep rigid monitoring and control over each CCP, and to divert lots with high aflatoxin levels to either oil or seed production. Attempts to correlate the aflatoxin levels with the rainfall showed an irregular trend of the toxin level. (c) 2008 Elsevier Ltd. All rights reserved.
Abstract:
Indium-tin oxide nanowires were deposited by excimer laser ablation onto catalyst-free oxidized silicon substrates at a low temperature of 500 °C in a nitrogen atmosphere. The nanowires have branches with spheres at the tips, indicating a vapor-liquid-solid (VLS) growth. The deposition time and pressure have a strong influence on the areal density and length of the nanowires. At the earlier stages of growth, lower pressures promote a larger number of nucleation centers. With the increase in deposition time, both the number and length of the wires increase up to an areal density of about 70 wires/μm². After this point all the material arriving at the substrate is used for lengthening the existing wires and their branches. The nanowires present the single-crystalline cubic bixbyite structure of indium oxide, oriented in the [100] direction. These structures have potential applications in electrical and optical nanoscale devices.
Abstract:
Until mid 2006, SCIAMACHY data processors for the operational retrieval of nitrogen dioxide (NO2) column data were based on the historical version 2 of the GOME Data Processor (GDP). On top of known problems inherent to GDP 2, ground-based validations of SCIAMACHY NO2 data revealed issues specific to SCIAMACHY, like a large cloud-dependent offset occurring at Northern latitudes. In 2006, the GDOAS prototype algorithm of the improved GDP version 4 was transferred to the off-line SCIAMACHY Ground Processor (SGP) version 3.0. In parallel, the calibration of SCIAMACHY radiometric data was upgraded. Before operational switch-on of SGP 3.0 and public release of upgraded SCIAMACHY NO2 data, we have investigated the accuracy of the algorithm transfer: (a) by checking the consistency of SGP 3.0 with prototype algorithms; and (b) by comparing SGP 3.0 NO2 data with ground-based observations reported by the WMO/GAW NDACC network of UV-visible DOAS/SAOZ spectrometers. This delta-validation study concludes that SGP 3.0 is a significant improvement with respect to the previous processor IPF 5.04. For three particular SCIAMACHY states, the study reveals unexplained features in the slant columns and air mass factors, although the quantitative impact on SGP 3.0 vertical columns is not significant.
Abstract:
Simulations of overshooting, tropical deep convection using a Cloud Resolving Model with bulk microphysics are presented in order to examine the effect on the water content of the TTL (Tropical Tropopause Layer) and lower stratosphere. This case study is a subproject of the HIBISCUS (Impact of tropical convection on the upper troposphere and lower stratosphere at global scale) campaign, which took place in Bauru, Brazil (22° S, 49° W), from the end of January to early March 2004. Comparisons between 2-D and 3-D simulations suggest that the use of 3-D dynamics is vital in order to capture the mixing between the overshoot and the stratospheric air, which caused evaporation of ice and resulted in an overall moistening of the lower stratosphere. In contrast, a dehydrating effect was predicted by the 2-D simulation due to the extra time, allowed by the lack of mixing, for the ice transported to the region to precipitate out of the overshoot air. Three different strengths of convection are simulated in 3-D by applying successively lower heating rates (used to initiate the convection) in the boundary layer. Moistening is produced in all cases, indicating that convective vigour is not a factor in whether moistening or dehydration is produced by clouds that penetrate the tropopause, since the weakest case only just did so. An estimate of the moistening effect of these clouds on an air parcel traversing a convective region is made based on the domain mean simulated moistening and the frequency of convective events observed by the IPMet (Instituto de Pesquisas Meteorológicas, Universidade Estadual Paulista) radar (S-band type at 2.8 GHz) to have the same 10 dBZ echo top height as those simulated. These suggest a fairly significant mean moistening of 0.26, 0.13 and 0.05 ppmv in the strongest, medium and weakest cases, respectively, for heights between 16 and 17 km. Since the cold point and WMO (World Meteorological Organization) tropopause in this region lie at ∼15.9 km, this is likely to represent direct stratospheric moistening. Much more moistening is predicted for the 15-16 km height range, with increases of 0.85-2.8 ppmv. However, this air would have to be lofted through the tropopause via the Brewer-Dobson circulation in order for it to have a stratospheric effect. Whether this is likely is uncertain and, in addition, the dehydration of air as it passes through the cold trap and the number of times that trajectories sample convective regions need to be taken into account to gauge the overall stratospheric effect. Nevertheless, the results suggest a potentially significant role for convection in determining the stratospheric water content. Sensitivity tests exploring the impact of increased aerosol numbers in the boundary layer suggest that a corresponding rise in cloud droplet numbers at cloud base would increase the number concentrations of the ice crystals transported to the TTL, which had the effect of reducing the fall speeds of the ice and causing a ∼13% rise in the mean vapour increase in both the 15-16 and 16-17 km height ranges when compared to the control case. Increases in the total water were much larger, being 34% and 132% higher for the same height ranges, but it is unclear whether the extra ice will be able to evaporate before precipitating from the region. These results suggest a possible impact of natural and anthropogenic aerosols on how convective clouds affect stratospheric moisture levels.
Abstract:
Nowadays, L1 SBAS signals can be used in combined GPS+SBAS data processing. However, such a situation restricts studies over short baselines. Besides increasing satellite availability, the orbit configuration of SBAS satellites is different from that of GPS. In order to analyze how these characteristics can impact GPS positioning in the southeast of Brazil, experiments involving GPS-only and combined GPS+SBAS data were performed. Solutions using single point and relative positioning were computed to show the impact on satellite geometry, positioning accuracy, and short-baseline ambiguity resolution. Results showed that the inclusion of SBAS satellites can improve positioning accuracy. Nevertheless, the poor quality of the data broadcast by these satellites limits their usage. © Springer-Verlag Berlin Heidelberg 2012.
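As background to the geometry argument in the abstract above, the following is a minimal numpy sketch of how dilution of precision (DOP) is computed from receiver-to-satellite unit vectors; the line-of-sight vectors are invented for illustration and do not correspond to any real GPS or SBAS constellation.

    # Sketch: dilution of precision from satellite geometry.
    # The unit line-of-sight vectors below are made up for illustration only.
    import numpy as np

    def dop(unit_los):
        # Return (PDOP, GDOP) for an (N x 3) array of unit line-of-sight vectors
        G = np.hstack([-unit_los, np.ones((unit_los.shape[0], 1))])  # design matrix
        Q = np.linalg.inv(G.T @ G)                                   # cofactor matrix
        return np.sqrt(np.trace(Q[:3, :3])), np.sqrt(np.trace(Q))

    gps = np.array([[0.3, 0.4, 0.87],
                    [-0.5, 0.2, 0.84],
                    [0.6, -0.6, 0.53],
                    [-0.2, -0.7, 0.68]])
    gps = gps / np.linalg.norm(gps, axis=1, keepdims=True)
    sbas = np.array([[0.9, 0.1, 0.42]])
    sbas = sbas / np.linalg.norm(sbas)

    print("GPS only   (PDOP, GDOP):", dop(gps))
    print("GPS + SBAS (PDOP, GDOP):", dop(np.vstack([gps, sbas])))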
Abstract:
Grinding is a parts-finishing process for advanced products and surfaces. However, continuous friction between the workpiece and the grinding wheel causes the latter to lose its sharpness, thus impairing the grinding results. This is when the dressing process is required, which consists of sharpening the worn grains of the grinding wheel. The dressing conditions strongly affect the performance of the grinding operation; hence, monitoring them throughout the process can increase its efficiency. The objective of this study was to estimate the wear of a single-point dresser using intelligent systems whose inputs were obtained by the digital processing of acoustic emission signals. Two intelligent systems, the multilayer perceptron and the Kohonen neural network, were compared in terms of their classifying ability. The harmonic content of the acoustic emission signal was found to be influenced by the condition of the dresser and, when used to feed the neural networks, it allows the condition of the tool under study to be classified.
Abstract:
Eumelanin pigments show hydration-dependent conductivity, broad-band UV-vis absorption, and chelation of metal ions. Solution-processing of synthetic eumelanins opens new possibilities for the characterization of eumelanin in thin film form and its integration into bioelectronic devices. We investigate the effect of different synthesis routes and processing solvents on the growth, the morphology, and the chemical composition of eumelanin thin films using atomic force microscopy and X-ray photoelectron spectroscopy. We further characterize the films by transient electrical current measurements obtained at 50% to 90% relative humidity, relevant for bioelectronic applications. We show that the use of dimethyl sulfoxide is preferable over ammonia solution as processing solvent, yielding homogeneous films with surface roughnesses below 0.5 nm and a chemical composition in agreement with the eumelanin molecular structure. These eumelanin films grow in a quasi layer-by-layer mode, each layer being composed of nanoaggregates, 1-2 nm high and 10-30 nm wide. The transient electrical measurements using a planar two-electrode device suggest that there are two contributions to the current, electronic and ionic, the latter being increasingly dominant at higher hydration, and point to the importance of time-dependent electrical characterization of eumelanin films. This journal is © 2013 The Royal Society of Chemistry.
Abstract:
Grinding is a workpiece finishing process for advanced products and surfaces. However, the constant friction between workpiece and grinding wheel causes the latter to lose its sharpness, thereby impairing the result of the grinding process. When this occurs, the dressing process is essential to sharpen the worn grains of the grinding wheel. The dressing conditions strongly influence the performance of the grinding operation; hence, monitoring them throughout the process can increase its efficiency. The purpose of this study was to classify the wear condition of a single-point dresser using intelligent systems whose inputs were obtained by digitally processing acoustic emission signals. Two multilayer perceptron (MLP) neural networks were compared for their classification ability, one using the root mean square (RMS) statistics and another the ratio of power (ROP) statistics as input. In this study, it was found that the harmonic content of the acoustic emission signal is influenced by the condition of the dresser, and that the condition of the tool under study can be classified by using the aforementioned statistics to feed a neural network. © IFAC.
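The two dresser-monitoring abstracts above both describe feeding statistics of the acoustic emission signal into a multilayer perceptron. A minimal Python sketch of that kind of chain is given below, computing a windowed RMS value and a ratio-of-power (ROP) feature and training a scikit-learn MLP; the sampling rate, frequency band, signals, and labels are invented placeholders, not values from either study.

    # Sketch of an AE-based dresser condition classifier: RMS and ratio-of-power
    # features feeding an MLP. All signals, labels, and band limits are synthetic.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    FS = 1_000_000  # assumed AE sampling rate (Hz)

    def features(window):
        rms = np.sqrt(np.mean(window ** 2))
        spectrum = np.abs(np.fft.rfft(window)) ** 2
        freqs = np.fft.rfftfreq(window.size, d=1.0 / FS)
        band = (freqs >= 50e3) & (freqs <= 150e3)    # assumed band of interest
        rop = spectrum[band].sum() / spectrum.sum()  # ratio of power in the band
        return [rms, rop]

    # Synthetic AE windows and wear labels (0 = sharp dresser, 1 = worn)
    rng = np.random.default_rng(0)
    X_raw = rng.standard_normal((200, 4096))
    y = rng.integers(0, 2, 200)
    X = np.array([features(w) for w in X_raw])

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000).fit(X_tr, y_tr)
    print("test accuracy:", clf.score(X_te, y_te))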
Abstract:
The representation of real objects in virtual environments has applications in many areas, such as cartography, mixed reality, and reverse engineering. These objects can be generated in two ways: manually, with CAD (Computer Aided Design) tools, or automatically, by means of surface reconstruction techniques. The simpler the 3D model, the easier it is to process and store. However, reconstruction methods can generate very detailed virtual elements, which can cause problems when processing the resulting mesh, because it contains many edges and polygons that must be handled at visualization time. In this context, simplification algorithms can be applied to eliminate polygons from the resulting mesh without changing its topology, generating a lighter mesh with fewer irrelevant details. The project aimed at the study, implementation, and comparative testing of simplification algorithms applied to meshes generated through a reconstruction pipeline based on point clouds. This work proposes the simplification step as a complement to the pipeline developed by Ono et al. (2012), which performs reconstruction from point clouds obtained with a Microsoft Kinect using the Poisson algorithm.
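Since the abstract above describes the kind of point-cloud pipeline this search is about (Poisson reconstruction followed by mesh simplification), a minimal sketch using the Open3D Python library is added here; the synthetic sphere scan, Poisson depth, and target triangle count are illustrative assumptions, not the parameters of the cited pipeline.

    # Sketch of a reconstruct-then-simplify pipeline with Open3D.
    # A sampled sphere stands in for a Kinect-style point cloud.
    import open3d as o3d

    sphere = o3d.geometry.TriangleMesh.create_sphere(radius=1.0)
    sphere.compute_vertex_normals()
    pcd = sphere.sample_points_poisson_disk(5000)  # synthetic input point cloud

    # Poisson surface reconstruction (depth controls octree resolution / detail)
    mesh, densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(pcd, depth=8)
    print("reconstructed:", len(mesh.triangles), "triangles")

    # Simplification by quadric error decimation, keeping roughly 10% of the triangles
    target = max(len(mesh.triangles) // 10, 100)
    simplified = mesh.simplify_quadric_decimation(target_number_of_triangles=target)
    print("simplified:", len(simplified.triangles), "triangles")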
Abstract:
The human dentition is naturally translucent, opalescent, and fluorescent. Differences between the level of fluorescence of tooth structure and restorative materials may result in distinct metameric properties and, consequently, perceptibly disparate esthetic behavior, which impairs the esthetic result of the restorations, frustrating both patients and staff. In this study, we evaluated the level of fluorescence of different composites: Durafill in shade A2 (Du), Charisma in shade A2 (Ch), Venus in shade A2 (Ve), Opallis enamel and dentin in shade A2 (OPD and OPE), Point 4 in shade A2 (P4), Z100 in shade A2 (Z1), Z250 in shade A2 (Z2), Te-Econom in shade A2 (TE), Tetric Ceram in shade A2 (TC), Tetric Ceram N in shades A1, A2, and A4 (TN1, TN2, TN4), Four Seasons enamel and dentin in shade A2 (4SE and 4SD), Empress Direct enamel and dentin in shade A2 (EDE and EDD), and Brilliant in shade A2 (Br). Cylindrical specimens were prepared, coded, and photographed in a standardized manner with a Canon EOS digital camera (ISO 400, f/2.8 aperture, and 1/30 shutter speed), in a dark environment under the action of UV light (25 W). The images were analyzed with the ScanWhite©-DMC/Darwin systems software. The results showed statistical differences between the groups (p < 0.05), and between these same groups and the average fluorescence of the dentition of young subjects (18 to 25 years) and adults (40 to 45 years) taken as controls. It can be concluded that the composites Z100, Z250 (3M ESPE), and Point 4 (Kerr) do not match the fluorescence of the human dentition, and that the fluorescence of the materials was affected by their own shade.