986 results for surface modeling
Abstract:
The goal of this thesis is to implement software for creating 3D models from point clouds. Point clouds are acquired with stereo cameras, monocular systems, or laser scanners. The created 3D models are either triangular meshes or NURBS (Non-Uniform Rational B-Splines) models. Triangular models are constructed from selected areas of the point clouds, and the resulting triangular models are translated into a set of quads. The quads are further translated into an estimated grid structure used for NURBS surface approximation. The final result is a set of NURBS surfaces that represents the whole model. The problem was not straightforward to solve: the selected triangular surface reconstruction algorithm did not deal well with noise in the point clouds. To handle this, a clustering method is introduced to simplify the model and remove noise. Because the smaller point clouds produced by clustering gave better results, the points in the clusters were also used to better estimate the grids for the NURBS models. Overall results were good when the point cloud did not contain much noise: point clouds with small amounts of error produced solid triangular models, and NURBS surface reconstruction performed well on solid models.
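As a rough illustration of the early stages of such a pipeline (clustering to suppress noise, then triangular surface reconstruction), here is a minimal sketch using the open-source Open3D library. The input file name, the parameter values, and the specific choices of DBSCAN clustering and Poisson reconstruction are assumptions for illustration, not the software described in the thesis.

```python
# Hypothetical sketch: denoise a point cloud by clustering, then reconstruct
# a triangle mesh. Not the thesis software; Open3D is used as a stand-in.
import numpy as np
import open3d as o3d

pcd = o3d.io.read_point_cloud("scan.ply")  # hypothetical input file

# Keep only points assigned to a cluster; DBSCAN labels noise points as -1.
labels = np.array(pcd.cluster_dbscan(eps=0.05, min_points=20))
pcd = pcd.select_by_index(np.where(labels >= 0)[0])

# Poisson reconstruction needs per-point normals.
pcd.estimate_normals()
mesh, _ = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(pcd, depth=8)
o3d.io.write_triangle_mesh("mesh.ply", mesh)
```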
Abstract:
Monitoring Earth's terrestrial water conditions is critically important to many hydrological applications, such as global food production; assessing water resources sustainability; and flood, drought, and climate change prediction. These needs have motivated the development of pilot monitoring and prediction systems for terrestrial hydrologic and vegetative states, but to date only at rather coarse spatial resolutions (∼10–100 km) over continental to global domains. Adequately addressing critical water cycle science questions and applications requires systems that are implemented globally at much higher resolutions, on the order of 1 km, resolutions referred to as hyperresolution in the context of global land surface models. This opinion paper sets forth the needs for, and benefits of, a system that would monitor and predict the Earth's terrestrial water, energy, and biogeochemical cycles. We discuss six major challenges in developing such a system: improved representation of surface-subsurface interactions due to fine-scale topography and vegetation; improved representation of land-atmospheric interactions and the resulting spatial information on soil moisture and evapotranspiration; inclusion of water quality as part of the biogeochemical cycle; representation of human impacts from water management; utilizing massively parallel computer systems and recent computational advances to solve hyperresolution models that will have up to 10^9 unknowns; and developing the required in situ and remote sensing global data sets. We deem the development of a global hyperresolution model for monitoring the terrestrial water, energy, and biogeochemical cycles a "grand challenge" to the community, and we call upon the international hydrologic community and the hydrological science support infrastructure to endorse the effort.
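A back-of-the-envelope calculation shows where a figure on the order of 10^9 unknowns comes from; the land area and layer count below are assumed round numbers, not values from the paper.

```python
# Rough count of unknowns for a 1 km global land grid (assumed numbers).
land_area_km2 = 1.5e8        # ~1.5 x 10^8 km^2 of global land area
columns = land_area_km2      # one model column per km^2 at 1 km resolution
layers_per_column = 10       # assumed soil/subsurface layers
unknowns = columns * layers_per_column
print(f"{unknowns:.1e} unknowns")  # ~1.5e+09, i.e., on the order of 10^9
```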
Abstract:
The present study describes a method for the assay of glycerol kinase (GK) from baker's yeast. Standardization of the glycerol kinase activity assay was accomplished using a diluted enzymatic preparation containing glycerol phosphate oxidase (GPO) and glycerol kinase. The mixture was incubated at 60 degrees C for 15 min, and the reaction was stopped by the addition of SDS solution. A first set of experiments was carried out to investigate the individual effects of temperature (T), pH, and substrate concentration (S) on GK activity and stability. The pH and temperature stability tests showed that the enzyme was highly stable at pH 6.0-8.0, and its thermal stability was completely maintained up to 50 degrees C for 1 h. The K(m) of the enzyme for glycerol was calculated to be 2 mM and V(max) to be 1.15 U/mL. In addition, modeling and optimization of the reaction conditions were attempted by response surface methodology (RSM), which indicated that higher activity values would be attained at temperatures between 52 and 56 degrees C, pH around 10.2-10.5, and substrate concentrations from 150 to 170 mM. This low-cost method for glycerol kinase determination in a sequence of reactions is of great importance for many industries, such as the food, sugar, and alcohol industries. RSM proved to be an adequate approach for modeling the reaction and optimizing the reaction conditions to maximize glycerol kinase activity.
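For illustration, the reported constants can be inserted into the Michaelis-Menten equation to estimate reaction velocity at a given glycerol concentration (a minimal sketch; the function name is ours):

```python
# Michaelis-Menten velocity with the reported constants:
# Km = 2 mM (glycerol), Vmax = 1.15 U/mL.
def gk_velocity(s_mM, km_mM=2.0, vmax=1.15):
    """Reaction velocity in U/mL at substrate concentration s_mM (mM)."""
    return vmax * s_mM / (km_mM + s_mM)

print(gk_velocity(150.0))  # ~1.13 U/mL: near saturation in the optimized range
```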
Abstract:
This chapter presents the conceptual and applied approaches used to capture the interaction of anesthetic hypnotic drugs with opioid drugs, as used in the clinical anesthetic state, including the graphic and mathematical approaches used to describe hypnotic/opioid anesthetic drug interactions. The chapter is not a review article about interaction modeling; it focuses on specific drug interactions within a quite narrow field, anesthesia.
Abstract:
BACKGROUND: Propofol and sevoflurane display additivity for gamma-aminobutyric acid receptor activation, loss of consciousness, and tolerance of skin incision. Information about their interaction regarding electroencephalographic suppression is unavailable. This study examined that interaction, as well as the interaction on the probability of tolerance of shake and shout and of three noxious stimulations, using a response surface methodology. METHODS: Sixty patients preoperatively received different combined concentrations of propofol (0–12 µg/ml) and sevoflurane (0–3.5 vol.%) according to a crisscross design (274 concentration pairs, 3 to 6 per patient). After a pseudo-steady state had been reached, the authors recorded bispectral index, state and response entropy, and the responses to shake and shout, tetanic stimulation, laryngeal mask airway insertion, and laryngoscopy. For the analysis of the probability of tolerance by logistic regression, a Greco interaction model was used. For the separate analysis of bispectral index and state and response entropy suppression, a fractional Emax Greco model was used. All calculations were performed with NONMEM V (GloboMax LLC, Hanover, MD). RESULTS: Additivity was found for all endpoints. The Ce50,PROP/Ce50,SEVO pairs were 3.68 µg/ml / 1.53 vol.% for bispectral index suppression, 2.34 µg/ml / 1.03 vol.% for tolerance of shake and shout, 5.34 µg/ml / 2.11 vol.% for tetanic stimulation, 5.92 µg/ml / 2.55 vol.% for laryngeal mask airway insertion, and 6.55 µg/ml / 2.83 vol.% for laryngoscopy. CONCLUSION: For both electroencephalographic suppression and tolerance of stimulation, the interaction of propofol and sevoflurane was identified as additive. The response surface data can be used for more rational dose finding in the case of sequential administration and coadministration of propofol and sevoflurane.
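For reference, the Greco model used for the tolerance endpoints is commonly written in the following general form, in which the additivity reported here corresponds to an interaction parameter α = 0 (a sketch of the standard form, not the study's exact parameterization):

```latex
P(\text{tolerance}) = \frac{U^{\gamma}}{1 + U^{\gamma}},
\qquad
U = \frac{C_{\text{prop}}}{Ce_{50,\text{PROP}}}
  + \frac{C_{\text{sevo}}}{Ce_{50,\text{SEVO}}}
  + \alpha \, \frac{C_{\text{prop}}}{Ce_{50,\text{PROP}}}
           \cdot \frac{C_{\text{sevo}}}{Ce_{50,\text{SEVO}}}
```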
Abstract:
Semi-automatic building detection and extraction is a topic of growing interest due to its potential application in areas such as cadastral information systems, cartographic revision, and GIS. One existing strategy for building extraction is to use a digital surface model (DSM), represented by a cloud of known points on a visible surface and comprising features such as trees or buildings. Conventional surface modeling using stereo-matching techniques has its drawbacks, the most obvious being the effects of building height on perspective, shadows, and occlusions. The laser scanner, a more recently developed technological tool, can collect accurate DSMs with high spatial frequency. This paper presents a methodology for the semi-automatic modeling of buildings that combines a region-growing algorithm with line-detection methods applied over the DSM.
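To make the region-growing component concrete, a minimal sketch on a gridded DSM follows; the 4-neighborhood, the mean-height tolerance criterion, and all names are assumptions for illustration, not the paper's actual algorithm.

```python
# Hypothetical region growing on a DSM stored as a 2D numpy array:
# accept a neighbor if its height is close to the region's running mean.
from collections import deque
import numpy as np

def grow_region(dsm, seed, tol=0.5):
    """Boolean mask of cells connected to `seed` with similar height."""
    rows, cols = dsm.shape
    mask = np.zeros(dsm.shape, dtype=bool)
    mask[seed] = True
    queue = deque([seed])
    height_sum, count = float(dsm[seed]), 1
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if (0 <= rr < rows and 0 <= cc < cols and not mask[rr, cc]
                    and abs(dsm[rr, cc] - height_sum / count) <= tol):
                mask[rr, cc] = True
                height_sum += float(dsm[rr, cc])
                count += 1
                queue.append((rr, cc))
    return mask
```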
Abstract:
BACKGROUND: The interaction of sevoflurane and opioids can be described by response surface modeling using the hierarchical model. We expanded this for the combined administration of sevoflurane, opioids, and 66 vol.% nitrous oxide (N2O), using historical data on the motor and hemodynamic responsiveness to incision, i.e., the minimum alveolar concentration (MAC) and the MAC needed to block autonomic reflexes to nociceptive stimuli, respectively. METHODS: Four potential actions of 66 vol.% N2O were postulated: (1) N2O is equivalent to A ng/ml of fentanyl (additive); (2) N2O reduces the C50 of fentanyl by a factor B; (3) N2O is equivalent to X vol.% of sevoflurane (additive); (4) N2O reduces the C50 of sevoflurane by a factor Y. These four actions, and all their combinations, were fitted to the data using NONMEM (version VI, Icon Development Solutions, Ellicott City, MD), assuming identical interaction parameters (A, B, X, Y) for movement and sympathetic responses. RESULTS: Sixty-six vol.% N2O evokes an additive effect corresponding to 0.27 ng/ml fentanyl (A) together with an additive effect corresponding to 0.54 vol.% sevoflurane (X). Parameters B and Y did not improve the fit. CONCLUSION: The effect of nitrous oxide can be incorporated into the hierarchical interaction model with a simple extension. The model can be used to predict the probability of movement and sympathetic responses during sevoflurane anesthesia, taking into account interactions with opioids and 66 vol.% N2O.
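Under the retained additive actions (1) and (3), the extension amounts to shifting the effect-site concentrations that enter the hierarchical model whenever 66 vol.% N2O is present (our notation, summarizing the reported result):

```latex
C^{\text{eff}}_{\text{fent}} = C_{\text{fent}} + A, \quad A = 0.27\ \text{ng/ml};
\qquad
C^{\text{eff}}_{\text{sevo}} = C_{\text{sevo}} + X, \quad X = 0.54\ \text{vol.\%}
```

with A = X = 0 in the absence of N2O.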
Abstract:
The Everglades Depth Estimation Network (EDEN) is an integrated network of real-time water-level monitoring, ground-elevation modeling, and water-surface modeling that provides scientists and managers with current (2000-present), online water-stage and water-depth information for the entire freshwater portion of the Greater Everglades. Continuous daily spatial interpolations of the EDEN network stage data are presented on a grid with 400-meter spacing. EDEN offers a consistent and documented dataset that can be used by scientists and managers to: (1) guide large-scale field operations, (2) integrate hydrologic and ecological responses, and (3) support biological and ecological assessments that measure ecosystem responses to the implementation of the Comprehensive Everglades Restoration Plan (CERP) (U.S. Army Corps of Engineers, 1999). The target users are biologists and ecologists examining trophic-level responses to hydrodynamic changes in the Everglades. The first objective of this report is to validate the spatially continuous EDEN water-surface model for the Everglades, Florida, developed by Pearlstine et al. (2007), using an independent field-measured dataset. The second objective is to demonstrate two applications of the EDEN water-surface model: estimating site-specific ground elevation from the validated water-surface model and observed water-depth data, and creating water-depth hydrographs for tree islands. We found no statistically significant differences between model-predicted and field-observed water-stage data in either southern Water Conservation Area (WCA) 3A or WCA 3B. Tree-island elevations were derived by subtracting field water-depth measurements from the predicted EDEN water surface. Water-depth hydrographs were then computed by subtracting tree-island elevations from the EDEN water stage. Overall, the model is reliable, with a root mean square error (RMSE) of 3.31 cm; by region, the RMSE is 2.49 cm in WCA 3A and 7.77 cm in WCA 3B. This new landscape-scale hydrological model has wide applications for ongoing research and management efforts that are vital to restoration of the Florida Everglades. The accurate, high-resolution hydrological data generated over broad spatial and temporal scales by the EDEN model provide a previously missing key to understanding the habitat requirements of, and linkages among, native and invasive populations, including fish, wildlife, wading birds, and plants. The EDEN model is a powerful tool that could be adapted for other ecosystem-scale restoration and management programs worldwide.
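The two demonstrated applications reduce to simple arithmetic on the modeled water surface; the sketch below uses made-up stage and depth values purely for illustration, not EDEN data.

```python
import numpy as np

# Hypothetical example values (meters).
eden_stage = np.array([2.10, 2.05, 1.98, 1.90])  # modeled water-surface time series
field_depth = 0.35                               # observed water depth on survey date

# Application 1: site-specific ground elevation = stage minus observed depth
# (survey assumed to fall on the first date of the series).
ground_elevation = eden_stage[0] - field_depth

# Application 2: water-depth hydrograph = stage series minus elevation.
depth_hydrograph = eden_stage - ground_elevation
print(ground_elevation, depth_hydrograph)
```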
Abstract:
The world's largest fossil oyster reef, formed by the giant oyster Crassostrea gryphoides and located at Stetten (north of Vienna, Austria), has been studied by Harzhauser et al. (2015, 2016) and Djuricic et al. (2016). Digital documentation of this unique geological site is provided by terrestrial laser scanning (TLS) at the millimeter scale. Obtaining meaningful results is not merely a matter of data acquisition with a suitable device; it requires proper planning, data management, and postprocessing. Terrestrial laser scanning technology has high potential for providing the precise 3D mapping that serves as the basis for automatic object detection in different scenarios; however, it faces challenges given the large amounts of data involved and the irregular geometry of an oyster reef. We provide a detailed description of the techniques and strategy used for data collection and processing in Djuricic et al. (2016). Laser scanning made it possible to measure the surface points of an estimated 46,840 shells; these oyster specimens are up to 60 cm long, and their surfaces are modeled with a high accuracy of 1 mm. In addition to the laser scanning measurements, more than 300 photographs were captured, and an orthophoto mosaic was generated with a ground sampling distance (GSD) of 0.5 mm. This high-resolution 3D information and the photographic texture serve as the basis for ongoing and future geological and paleontological analyses. Moreover, they provide unprecedented documentation for conservation issues at a unique natural heritage site.
Abstract:
An efficient technique to cut polygonal meshes as a step in the geometric modeling of topographic and geological data has been developed. In boundary-represented models of outcropping strata and faulted horizons, polygonal meshes often intersect each other. TRICUT determines the line of intersection and re-triangulates the area of contact. Along this line the mesh is split into two or more parts, which can be selected for removal. User interaction takes place in the 3D model space; the intersection, selection, and removal are under graphic control. The visualization of outcropping geological structures in digital terrain models is improved by determining intersections against a slightly shifted terrain model. The outcrop line thus becomes a surface that overlaps the terrain in its initial position; the area of this overlapping surface changes with the strike and dip of the structure, the morphology, and the offset. Applications of TRICUT to several real datasets are shown. TRICUT is implemented in C++ using the Visualization Toolkit in conjunction with the RAPID and TRIANGLE libraries. The program runs under Linux and UNIX using the Mesa OpenGL library. This work gives an example of solving a complex 3D geometric problem by integrating available, robust public-domain software.
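TRICUT's own code is not reproduced here; as a rough analogue of its intersect-and-split step, VTK's vtkIntersectionPolyDataFilter computes the line of intersection of two polygonal meshes and re-triangulates each surface along it (a sketch assuming two hypothetical PLY input meshes, not TRICUT itself):

```python
import vtk

def load_mesh(path):
    """Read a polygonal mesh from a PLY file."""
    reader = vtk.vtkPLYReader()
    reader.SetFileName(path)
    reader.Update()
    return reader.GetOutput()

horizon = load_mesh("horizon.ply")   # hypothetical input meshes
terrain = load_mesh("terrain.ply")

f = vtk.vtkIntersectionPolyDataFilter()
f.SetInputData(0, horizon)
f.SetInputData(1, terrain)
f.Update()

intersection_line = f.GetOutput(0)  # the line of intersection
split_horizon = f.GetOutput(1)      # first mesh, re-triangulated along the line
```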
Abstract:
The present experiment aimed to study the influence of point positioning in an irradiated field on the production of a planialtimetric plan. A planialtimetric survey was carried out in a 14-acre experimental area with well-defined topographic variations. Planialtimetric maps were produced by manual procedures and with the Datageosis and Topoesalq software packages. Datageosis drew all the contour lines after numerical surface modeling; Topoesalq provided only height reports, so its contour lines were drawn manually; the third method was entirely manual. Because the planialtimetric representations differed, longitudinal profiles were taken at the sites where the plans diverged most. When the profiles and plans were compared, the plan produced by Datageosis was found to represent the relief best. Subsequently, only the irradiated field points were evaluated; each point had positioned readings before and after each relief map variation. Processing by the three methods resulted in plans that adequately represented the local planialtimetry, according to the control profiles. It was concluded that the planning of the field procedure should be suited to the subsequent data-processing method so that the resulting planialtimetric plan conforms to the local topography.
Abstract:
1. Distance sampling is a widely used technique for estimating the size or density of biological populations. Many distance sampling designs and most analyses use the software Distance.
2. We briefly review distance sampling and its assumptions, outline the history, structure, and capabilities of Distance, and provide hints on its use.
3. Good survey design is a crucial prerequisite for obtaining reliable results. Distance has a survey design engine, with a built-in geographic information system, that allows properties of different proposed designs to be examined via simulation and survey plans to be generated.
4. A first step in the analysis of distance sampling data is modeling the probability of detection (a standard form is sketched after this abstract). Distance contains three increasingly sophisticated analysis engines for this: conventional distance sampling, which models detection probability as a function of distance from the transect and assumes all objects at zero distance are detected; multiple-covariate distance sampling, which allows covariates in addition to distance; and mark–recapture distance sampling, which relaxes the assumption of certain detection at zero distance.
5. All three engines allow estimation of density or abundance, stratified if required, with associated measures of precision calculated either analytically or via the bootstrap.
6. Advanced analysis topics covered include the use of multipliers to allow analysis of indirect surveys (such as dung or nest surveys), the density surface modeling analysis engine for spatial and habitat modeling, and information about accessing the analysis engines directly from other software.
7. Synthesis and applications. Distance sampling is a key method for producing abundance and density estimates in challenging field conditions. The theory underlying the methods continues to expand to cope with realistic estimation situations. In step with theoretical developments, we describe state-of-the-art software that implements these methods and makes them accessible to practicing ecologists.
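For orientation, conventional distance sampling typically fits a parametric detection function such as the half-normal and converts the average detection probability within the truncation distance into a density estimate (standard textbook form, not notation specific to Distance):

```latex
g(x) = \exp\!\left(-\frac{x^{2}}{2\sigma^{2}}\right),
\qquad
\hat{P}_a = \frac{1}{w}\int_{0}^{w} g(x)\,dx,
\qquad
\hat{D} = \frac{n}{2\,w\,L\,\hat{P}_a}
```

where x is the perpendicular distance from the line transect, w the truncation distance, L the total transect length, and n the number of detections.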