947 results for "Numerical surface modeling"


Relevance: 90.00%

Abstract:

We have calculated the equilibrium shape of the axially symmetric meniscus along which a spherical bubble contacts a flat liquid surface, by analytically integrating the Young-Laplace equation in the presence of gravity, in the limit of large Bond numbers. This method has the advantage that it provides semi-analytical expressions for key geometrical properties of the bubble in terms of the Bond number. Results are in good overall agreement with experimental data and are consistent with fully numerical (Surface Evolver) calculations. In particular, we are able to describe how the bubble shape changes from hemispherical, with a shallow flat bottom, to lenticular, with a deeper, curved bottom, as the Bond number is decreased.
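As a toy illustration of the controlling parameter in this problem, the Bond number compares gravitational and capillary forces on the bubble; a minimal sketch with hypothetical water/air values (not data from the paper):

```python
# Bond number Bo = rho * g * R^2 / gamma: ratio of gravitational to
# capillary forces for a bubble of radius R at a liquid surface.
def bond_number(rho, g, R, gamma):
    """rho: liquid density [kg/m^3], g: gravity [m/s^2],
    R: bubble radius [m], gamma: surface tension [N/m]."""
    return rho * g * R**2 / gamma

# Example: a 2 mm air bubble on water (rho ~ 1000 kg/m^3, gamma ~ 0.072 N/m).
# Small Bo corresponds to the lenticular regime described in the abstract,
# large Bo to the hemispherical regime.
Bo = bond_number(1000.0, 9.81, 2e-3, 0.072)
```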

Relevance: 90.00%

Abstract:

Pulmonary crackling and the formation of liquid bridges are problems that have attracted the attention of scientists for centuries. To study these phenomena, a canonical cubic lattice-gas-like model was developed to explain the rupture of liquid bridges in lung airways [A. Alencar et al., 2006, PRE]. Here, we further develop this model and add entropy analysis to study thermodynamic properties such as free energy and force. The simulations were performed using the Monte Carlo method with the Metropolis algorithm. Exchanges between gas and liquid particles were performed randomly according to Kawasaki dynamics and weighted by the Boltzmann factor. Each particle, which can be solid (s), liquid (l), or gas (g), has 26 neighbors (6 + 12 + 8, at distances 1, √2, and √3, respectively). The energy of a lattice site m is calculated as E_m = Σ_{k=1}^{26} J_{i(m)j(k)}, where i, j ∈ {g, l, s}. Specifically, we studied the surface free energy of a liquid bridge trapped between two planes as its height is changed. Two methods were considered: first, only the internal energy was calculated; then, the entropy was also taken into account. No difference in surface free energy was found between these two methods. We calculated the force exerted by the liquid bridge between the two planes from the numerical surface free energy. This force is strong for small heights and decreases as the distance between the two planes (the height) is increased. The liquid-gas system was also characterized by studying the variation of internal energy and heat capacity with temperature. For this, simulations were performed with the same proportion of liquid and gas particles but different lattice sizes. The scaling of the liquid-gas system was also studied, at low temperature, using different values of the interaction J_ij.
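The sampling scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' code: the coupling values, lattice size, and temperature are hypothetical, and solid particles are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

# States: 0 = gas, 1 = liquid. Hypothetical couplings J[i, j]
# (only liquid-liquid attraction here, for simplicity).
J = {(0, 0): 0.0, (1, 1): -1.0, (0, 1): 0.0, (1, 0): 0.0}
L = 6
lattice = rng.integers(0, 2, size=(L, L, L))

# The 26 neighbor offsets of a cubic site (6 + 12 + 8, at distances
# 1, sqrt(2), sqrt(3)).
OFFSETS = [(dx, dy, dz) for dx in (-1, 0, 1) for dy in (-1, 0, 1)
           for dz in (-1, 0, 1) if (dx, dy, dz) != (0, 0, 0)]

def site_energy(lat, x, y, z):
    """E_m = sum over the 26 neighbors of J[i(m), j(k)], periodic bounds."""
    s = lat[x, y, z]
    return sum(J[(s, lat[(x + dx) % L, (y + dy) % L, (z + dz) % L])]
               for dx, dy, dz in OFFSETS)

def kawasaki_step(lat, beta):
    """Attempt one particle exchange between two random sites, accepted by
    the Metropolis rule (the rare case of adjacent sites is not treated
    specially in this sketch)."""
    a = tuple(rng.integers(0, L, 3))
    b = tuple(rng.integers(0, L, 3))
    if lat[a] == lat[b]:
        return False                        # identical states: nothing to swap
    e_old = site_energy(lat, *a) + site_energy(lat, *b)
    lat[a], lat[b] = lat[b], lat[a]
    e_new = site_energy(lat, *a) + site_energy(lat, *b)
    dE = e_new - e_old
    if dE <= 0 or rng.random() < np.exp(-beta * dE):
        return True                         # accept the exchange
    lat[a], lat[b] = lat[b], lat[a]         # reject: swap back
    return False

n_liquid = int(lattice.sum())
for _ in range(2000):
    kawasaki_step(lattice, beta=2.0)
```

Kawasaki dynamics exchanges particles rather than flipping them, so the number of liquid particles is conserved, as the assertion below the loop would confirm.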

Relevance: 90.00%

Abstract:

BACKGROUND: The interaction of sevoflurane and opioids can be described by response surface modeling using the hierarchical model. We expanded this for combined administration of sevoflurane, opioids, and 66 vol.% nitrous oxide (N2O), using historical data on the motor and hemodynamic responsiveness to incision: the minimum alveolar concentration and the minimum alveolar concentration to block autonomic reflexes to nociceptive stimuli, respectively. METHODS: Four potential actions of 66 vol.% N2O were postulated: (1) N2O is equivalent to A ng/ml of fentanyl (additive); (2) N2O reduces the C50 of fentanyl by a factor B; (3) N2O is equivalent to X vol.% of sevoflurane (additive); (4) N2O reduces the C50 of sevoflurane by a factor Y. These four actions, and all combinations, were fitted to the data using NONMEM (version VI, Icon Development Solutions, Ellicott City, MD), assuming identical interaction parameters (A, B, X, Y) for movement and sympathetic responses. RESULTS: Sixty-six vol.% N2O evokes an additive effect corresponding to 0.27 ng/ml fentanyl (A) combined with an additive effect corresponding to 0.54 vol.% sevoflurane (X). Parameters B and Y did not improve the fit. CONCLUSION: The effect of nitrous oxide can be incorporated into the hierarchical interaction model with a simple extension. The model can be used to predict the probability of movement and sympathetic responses during sevoflurane anesthesia, taking into account interactions with opioids and 66 vol.% N2O.
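Under the additive result reported in the abstract, folding 66 vol.% N2O into the hierarchical model amounts to shifting the effective opioid and sevoflurane concentrations by fixed equivalents. A minimal sketch (the helper function and the example concentrations are hypothetical; only the constants A and X come from the abstract):

```python
# Additive N2O equivalents from the abstract's fitted model.
A_FENTANYL_EQ = 0.27   # ng/ml fentanyl equivalent of 66 vol.% N2O
X_SEVO_EQ = 0.54       # vol.% sevoflurane equivalent of 66 vol.% N2O

def effective_concentrations(fentanyl_ngml, sevo_volpct, n2o_present):
    """Return (effective fentanyl, effective sevoflurane) to feed into the
    hierarchical interaction model; hypothetical helper, not the NONMEM code."""
    if n2o_present:
        return fentanyl_ngml + A_FENTANYL_EQ, sevo_volpct + X_SEVO_EQ
    return fentanyl_ngml, sevo_volpct

fe, se = effective_concentrations(1.0, 1.5, True)
```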

Relevance: 90.00%

Abstract:

The Everglades Depth Estimation Network (EDEN) is an integrated network of real-time water-level monitoring, ground-elevation modeling, and water-surface modeling that provides scientists and managers with current (2000-present), online water-stage and water-depth information for the entire freshwater portion of the Greater Everglades. Continuous daily spatial interpolations of the EDEN network stage data are presented on a grid with 400-square-meter spacing. EDEN offers a consistent and documented dataset that can be used by scientists and managers to (1) guide large-scale field operations, (2) integrate hydrologic and ecological responses, and (3) support biological and ecological assessments that measure ecosystem responses to the implementation of the Comprehensive Everglades Restoration Plan (CERP) (U.S. Army Corps of Engineers, 1999). The target users are biologists and ecologists examining trophic-level responses to hydrodynamic changes in the Everglades. The first objective of this report is to validate the spatially continuous EDEN water-surface model for the Everglades, Florida, developed by Pearlstine et al. (2007), using an independent field-measured dataset. The second objective is to demonstrate two applications of the EDEN water-surface model: estimating site-specific ground elevation from the validated water surface and observed water-depth data, and creating water-depth hydrographs for tree islands. We found no statistically significant differences between model-predicted and field-observed water-stage data in either southern Water Conservation Area (WCA) 3A or WCA 3B. Tree-island elevations were derived by subtracting field water-depth measurements from the predicted EDEN water surface. Water-depth hydrographs were then computed by subtracting tree-island elevations from the EDEN water stage. Overall, the model is reliable, with a root mean square error (RMSE) of 3.31 cm; by region, the RMSE is 2.49 cm in WCA 3A and 7.77 cm in WCA 3B. This new landscape-scale hydrological model has wide applications for ongoing research and management efforts that are vital to restoration of the Florida Everglades. The accurate, high-resolution hydrological data generated over broad spatial and temporal scales by the EDEN model provide a previously missing key to understanding the habitat requirements of, and linkages among, native and invasive populations, including fish, wildlife, wading birds, and plants. The EDEN model is a powerful tool that could be adapted for other ecosystem-scale restoration and management programs worldwide.
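The two applications described above reduce to simple stage/depth arithmetic. A minimal sketch with made-up numbers (not EDEN data), plus the RMSE statistic used to summarize model fit:

```python
import numpy as np

# Hypothetical daily EDEN water stage at one grid cell, in meters.
stage = np.array([2.10, 2.05, 1.98, 1.90])
measured_depth_day0 = 0.35          # field-measured water depth on day 0, m

# Application 1: site-specific ground elevation = stage - observed depth.
ground_elev = stage[0] - measured_depth_day0

# Application 2: water-depth hydrograph = daily stage - derived elevation.
hydrograph = stage - ground_elev

def rmse(pred, obs):
    """Root mean square error between model-predicted and field-observed values."""
    pred, obs = np.asarray(pred, float), np.asarray(obs, float)
    return float(np.sqrt(np.mean((pred - obs) ** 2)))
```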

Relevance: 90.00%

Abstract:

The world's largest fossil oyster reef, formed by the giant oyster Crassostrea gryphoides and located in Stetten (north of Vienna, Austria), has been studied by Harzhauser et al. (2015, 2016) and Djuricic et al. (2016). Digital documentation of this unique geological site is provided by terrestrial laser scanning (TLS) at the millimeter scale. Obtaining meaningful results is not merely a matter of data acquisition with a suitable device; it requires proper planning, data management, and postprocessing. Terrestrial laser scanning has a high potential for providing the precise 3D mapping that serves as the basis for automatic object detection in different scenarios; however, it faces challenges in the presence of large amounts of data and the irregular geometry of an oyster reef. We provide a detailed description of the techniques and strategy used for data collection and processing in Djuricic et al. (2016). Laser scanning made it possible to measure surface points on an estimated 46,840 shells; the oyster specimens are up to 60 cm long, and their surfaces are modeled with a high accuracy of 1 mm. In addition to the laser scanning measurements, more than 300 photographs were captured, and an orthophoto mosaic was generated with a ground sampling distance (GSD) of 0.5 mm. This high-resolution 3D information and the photographic texture serve as the basis for ongoing and future geological and paleontological analyses. Moreover, they provide unprecedented documentation for conservation issues at a unique natural heritage site.

Relevance: 90.00%

Abstract:

This paper deals with fractional differential equations with dependence on a Caputo fractional derivative of real order. The goal is to show, based on concrete examples and data from several experiments, that fractional differential equations may model certain problems more efficiently than ordinary differential equations. A numerical optimization approach based on least-squares approximation is used to determine the order of the fractional operator that best describes the real data, as well as other related parameters.
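One common numerical route to such problems (an assumption here; the paper does not specify its discretization) is the Grünwald-Letnikov approximation of the fractional derivative. A sketch for the test problem D^alpha y = -lambda y, which least-squares fitting of alpha would call repeatedly:

```python
import numpy as np

def gl_weights(alpha, n):
    """Grünwald-Letnikov coefficients c_j = (-1)^j * binom(alpha, j),
    computed by the standard recursion c_j = c_{j-1} * (1 - (alpha+1)/j)."""
    c = np.empty(n + 1)
    c[0] = 1.0
    for j in range(1, n + 1):
        c[j] = c[j - 1] * (1.0 - (alpha + 1.0) / j)
    return c

def solve_relaxation(alpha, lam, h, steps, y0=1.0):
    """Explicit GL scheme for D^alpha y = -lam * y on a uniform grid of
    step h (a sketch: no initial-condition correction terms)."""
    c = gl_weights(alpha, steps)
    y = [y0]
    for n in range(1, steps + 1):
        hist = sum(c[j] * y[n - j] for j in range(1, n + 1))  # memory term
        y.append(h**alpha * (-lam * y[-1]) - hist)
    return np.array(y)

y = solve_relaxation(alpha=0.8, lam=1.0, h=0.01, steps=200)
```

For alpha = 1 the weights collapse to (1, -1, 0, 0, ...), recovering the ordinary first difference, which is a convenient sanity check.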

Relevance: 80.00%

Abstract:

An efficient technique to cut polygonal meshes as a step in the geometric modeling of topographic and geological data has been developed. In boundary-represented models of outcropping strata and faulted horizons, polygonal meshes often intersect each other. TRICUT determines the line of intersection and re-triangulates the area of contact. Along this line the mesh is split into two or more parts, which can be selected for removal. User interaction takes place in the 3D model space; the intersection, selection, and removal are under graphic control. The visualization of outcropping geological structures in digital terrain models is improved by computing intersections against a slightly shifted terrain model. The outcrop line thus becomes a surface which overlaps the terrain in its initial position. The area of this overlapping surface changes with the strike and dip of the structure, the morphology, and the offset. Applications of TRICUT to different real datasets are shown. TRICUT is implemented in C++ using the Visualization Toolkit in conjunction with the RAPID and TRIANGLE libraries. The program runs under Linux and UNIX using the Mesa OpenGL library. This work gives an example of solving a complex 3D geometric problem by integrating available robust public-domain software. (C) 2002 Elsevier B.V. Ltd. All rights reserved.
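The core intersection step can be illustrated by cutting a single triangle with a plane. This is a minimal sketch of one assumed sub-step, not TRICUT's actual implementation (degenerate on-plane vertices are ignored, and the plane stands in for the second mesh):

```python
import numpy as np

def triangle_plane_cut(tri, n, d):
    """Return the two endpoints of the segment where triangle `tri`
    (3x3 array of vertices) crosses the plane n.x = d, or None if the
    triangle does not properly cross the plane."""
    s = tri @ n - d                          # signed distances of the vertices
    pts = []
    for i in range(3):
        a, b = s[i], s[(i + 1) % 3]
        if a * b < 0:                        # this edge crosses the plane
            t = a / (a - b)                  # linear interpolation parameter
            pts.append(tri[i] + t * (tri[(i + 1) % 3] - tri[i]))
    return np.array(pts) if len(pts) == 2 else None

tri = np.array([[0.0, 0.0, 0.0], [2.0, 0.0, 0.0], [0.0, 2.0, 0.0]])
seg = triangle_plane_cut(tri, n=np.array([1.0, 0.0, 0.0]), d=1.0)
```

Collecting such segments over all intersecting triangle pairs yields the cut line along which the mesh is then re-triangulated and split.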

Relevance: 80.00%

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance: 80.00%

Abstract:

1. Distance sampling is a widely used technique for estimating the size or density of biological populations. Many distance sampling designs and most analyses use the software Distance.
2. We briefly review distance sampling and its assumptions, outline the history, structure and capabilities of Distance, and provide hints on its use.
3. Good survey design is a crucial prerequisite for obtaining reliable results. Distance has a survey design engine, with a built-in geographic information system, that allows properties of different proposed designs to be examined via simulation, and survey plans to be generated.
4. A first step in analysis of distance sampling data is modeling the probability of detection. Distance contains three increasingly sophisticated analysis engines for this: conventional distance sampling, which models detection probability as a function of distance from the transect and assumes all objects at zero distance are detected; multiple-covariate distance sampling, which allows covariates in addition to distance; and mark–recapture distance sampling, which relaxes the assumption of certain detection at zero distance.
5. All three engines allow estimation of density or abundance, stratified if required, with associated measures of precision calculated either analytically or via the bootstrap.
6. Advanced analysis topics covered include the use of multipliers to allow analysis of indirect surveys (such as dung or nest surveys), the density surface modeling analysis engine for spatial and habitat modeling, and information about accessing the analysis engines directly from other software.
7. Synthesis and applications. Distance sampling is a key method for producing abundance and density estimates in challenging field conditions. The theory underlying the methods continues to expand to cope with realistic estimation situations. In step with these theoretical developments, state-of-the-art software that implements the methods makes them accessible to practicing ecologists.
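A minimal sketch of the conventional-distance-sampling calculation described in point 4, assuming a half-normal detection function and hypothetical survey numbers (this is not the Distance software's code):

```python
import numpy as np

# Half-normal detection function g(x) = exp(-x^2 / (2 sigma^2)); detection
# is certain on the line itself, g(0) = 1, as conventional distance
# sampling assumes.
def g(x, sigma):
    return np.exp(-x**2 / (2.0 * sigma**2))

def effective_strip_width(sigma, w, n_grid=10_000):
    """mu = integral of g from 0 to the truncation distance w (midpoint rule)."""
    x = (np.arange(n_grid) + 0.5) * (w / n_grid)
    return float(np.sum(g(x, sigma)) * (w / n_grid))

def density_estimate(n_detections, line_length_km, sigma_km, w_km):
    """D-hat = n / (2 * L * mu): objects per square km on a line transect."""
    mu = effective_strip_width(sigma_km, w_km)
    return n_detections / (2.0 * line_length_km * mu)

# Hypothetical survey: 120 detections on 40 km of transect, sigma = 25 m,
# truncation at 100 m.
D_hat = density_estimate(n_detections=120, line_length_km=40.0,
                         sigma_km=0.025, w_km=0.1)
```

In practice sigma is not known but is fitted to the observed distances by maximum likelihood; the sketch takes it as given to keep the density formula in view.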

Relevance: 80.00%

Abstract:

This thesis investigates interactive scene reconstruction and understanding using RGB-D data only. Indeed, we believe that in the near future depth cameras will remain a cheap, low-power 3D sensing alternative that is also suitable for mobile devices. Our contributions therefore build on state-of-the-art approaches to achieve advances in three challenging scenarios: mobile mapping, large-scale surface reconstruction, and semantic modeling. First, we describe an effective approach to Simultaneous Localization And Mapping (SLAM) on platforms with limited resources, such as a tablet device. Unlike previous methods, dense reconstruction is achieved by reprojection of RGB-D frames, while local consistency is maintained by deploying relative bundle adjustment principles. We show quantitative results comparing our technique to the state of the art, as well as detailed reconstructions of environments ranging from rooms to small apartments. Then, we address large-scale surface modeling from depth maps exploiting parallel GPU computing. We develop a real-time camera tracking method based on the popular KinectFusion system, together with an online surface alignment technique capable of counteracting drift errors and closing small loops. We show very high quality meshes outperforming existing methods on publicly available datasets as well as on data recorded with our RGB-D camera, even in complete darkness. Finally, we present our Semantic Bundle Adjustment framework, which combines object detection and SLAM in a unified system. Although the mathematical framework we describe is not restricted to a particular sensing technology, the experimental section again refers only to RGB-D sensing. We discuss successful implementations of our algorithm, showing the benefit of joint object detection, camera tracking, and environment mapping.

Relevance: 80.00%

Abstract:

Temporal hollowing due to temporal muscle atrophy after standard skull base surgery is common. Various techniques have been previously described to correct the disfiguring defect. Most often reconstruction is performed using freehand molded polymethylmethacrylate cement. This method and material are insufficient in terms of aesthetic results and implant characteristics. We herein propose reconstruction of such defects with a polyetheretherketone (PEEK)-based patient-specific implant (PSI) including soft-tissue augmentation to preserve normal facial topography. We describe a patient who presented with a large temporo-orbital hemangioma that had been repaired with polymethylmethacrylate 25 years earlier. Because of a toxic skin atrophy fistula, followed by infection and meningitis, this initial implant had to be removed. The large, disfiguring temporo-orbital defect was reconstructed with a PEEK-based PSI. The lateral orbital wall and the temporal muscle atrophy were augmented with computer-aided design and surface modeling techniques. The operative procedure to implant and adopt the reconstructed PEEK-based PSI was simple, and an excellent cosmetic outcome was achieved. The postoperative clinical course was uneventful over a 5-year follow-up period. Polyetheretherketone-based combined bony and soft contour remodeling is a feasible and effective method for cranioplasty including combined bone and soft-tissue reconstruction of temporo-orbital defects. Manual reconstruction of this cosmetically delicate area carries an exceptional risk of disfiguring results. Augmentation surgery in this anatomic location needs accurate PSIs to achieve satisfactory cosmetic results. The cosmetic outcome achieved in this case is superior compared with previously reported techniques.

Relevance: 80.00%

Abstract:

We developed an anatomical mapping technique to detect hippocampal and ventricular changes in Alzheimer disease (AD). The resulting maps are sensitive to longitudinal changes in brain structure as the disease progresses. An anatomical surface modeling approach was combined with surface-based statistics to visualize the region and rate of atrophy in serial MRI scans and isolate where these changes link with cognitive decline. Fifty-two high-resolution MRI scans were acquired from 12 AD patients (age: 68.4 +/- 1.9 years) and 14 matched controls (age: 71.4 +/- 0.9 years), each scanned twice (2.1 +/- 0.4 years apart). 3D parametric mesh models of the hippocampus and temporal horns were created in sequential scans and averaged across subjects to identify systematic patterns of atrophy. As an index of radial atrophy, 3D distance fields were generated relating each anatomical surface point to a medial curve threading down the medial axis of each structure. Hippocampal atrophic rates and ventricular expansion were assessed statistically using surface-based permutation testing and were faster in AD than in controls. Using color-coded maps and video sequences, these changes were visualized as they progressed anatomically over time. Additional maps localized regions where atrophic changes linked with cognitive decline. Temporal horn expansion maps were more sensitive to AD progression than maps of hippocampal atrophy, but both maps correlated with clinical deterioration. These quantitative, dynamic visualizations of hippocampal atrophy and ventricular expansion rates in aging and AD may provide a promising measure to track AD progression in drug trials. (C) 2004 Elsevier Inc. All rights reserved.
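The radial-atrophy index described above (the distance from each anatomical surface point to a medial curve threading the structure) can be sketched with plain point sets; the coordinates below are hypothetical, and the medial curve is represented by sampled points rather than a fitted curve:

```python
import numpy as np

def radial_distances(surface_pts, medial_pts):
    """For each surface point, the minimum Euclidean distance to the
    (sampled) medial curve: a simple 3D distance field on the surface."""
    diff = surface_pts[:, None, :] - medial_pts[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=2)).min(axis=1)

# Hypothetical mesh vertices and medial-axis samples.
surface = np.array([[1.0, 0.0, 0.0], [0.0, 2.0, 0.0], [0.0, 0.0, 3.0]])
medial = np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
r = radial_distances(surface, medial)
```

Comparing such distance fields between sequential scans, vertex by vertex, gives the radial atrophy (or expansion) rates that the surface-based permutation tests then assess.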

Relevance: 80.00%

Abstract:

In addition to enhancing agricultural productivity, synthetic nitrogen (N) and phosphorus (P) fertilizer application in croplands has dramatically altered the global nutrient budget, water quality, greenhouse gas balance, and their feedbacks to the climate system. However, for lack of geospatial fertilizer-input data, current Earth system and land surface modeling studies have to ignore, or use over-simplified data (e.g., static, spatially uniform fertilizer use) to characterize, agricultural N and P inputs over decadal or century-long periods. We therefore developed a global time-series gridded dataset of annual synthetic N and P fertilizer use rates in croplands, matched with the HYDE 3.2 historical land use maps, at a resolution of 0.5° latitude by 0.5° longitude for 1900-2013. Our data indicate that N and P fertilizer use rates increased by approximately 8 times and 3 times, respectively, since 1961, when IFA (International Fertilizer Industry Association) and FAO (Food and Agriculture Organization) surveys of country-level fertilizer input became available. Considering cropland expansion, the increase in total fertilizer consumption is even larger. Hotspots of agricultural N fertilizer use shifted from the U.S. and Western Europe in the 1960s to East Asia in the early 21st century. P fertilizer input shows a similar pattern, with an additional hotspot in Brazil. We find a global increase in the fertilizer N/P ratio of 0.8 g N/g P per decade (p < 0.05) during 1961-2013, which may have important implications for human impacts on agroecosystem functions in the long run. Our data can serve as a critical input driver for regional and global assessments of agricultural productivity, crop yield, agriculture-derived greenhouse gas balance, global nutrient budgets, land-to-aquatic nutrient loss, and ecosystem feedbacks to the climate system.
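The reported N/P-ratio trend is a linear fit of the ratio against time. A minimal sketch with synthetic, purely illustrative data (the real series comes from the gridded dataset, not from this formula):

```python
import numpy as np

# Synthetic global fertilizer N/P mass-ratio series, 1961-2013, built with
# a known slope of 0.08 (g N / g P) per year for illustration.
years = np.arange(1961, 2014)
np_ratio = 4.0 + 0.08 * (years - 1961)

# Ordinary least-squares linear trend; polyfit returns [slope, intercept].
slope_per_year = np.polyfit(years, np_ratio, 1)[0]
slope_per_decade = 10.0 * slope_per_year   # compare with ~0.8 g N/g P per decade
```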

Relevance: 60.00%

Abstract:

A finite difference method for simulating voltammograms of electrochemically driven enzyme catalysis is presented. The method enables any enzyme mechanism to be simulated. The finite difference equations can be represented as a matrix equation containing a nonlinear sparse matrix. This equation has been solved using the software package Mathematica. Our focus is on the use of cyclic voltammetry since this is the most commonly employed electrochemical method used to elucidate mechanisms. The use of cyclic voltammetry to obtain data from systems obeying Michaelis-Menten kinetics is discussed, and we then verify our observations on the Michaelis-Menten system using the finite difference simulation. Finally, we demonstrate how the method can be used to obtain mechanistic information on a real redox enzyme system, the complex bacterial molybdoenzyme xanthine dehydrogenase.
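A minimal explicit sketch of the ingredients (the paper instead solves an implicit nonlinear sparse matrix system in Mathematica): one-dimensional substrate diffusion toward the electrode with Michaelis-Menten consumption by the enzyme. All parameter values are hypothetical and chosen only to satisfy the explicit stability bound:

```python
import numpy as np

D, dx, dt = 1e-5, 1e-3, 1e-3   # diffusion coefficient, grid step, time step
vmax, km = 5.0, 0.1            # Michaelis-Menten parameters (hypothetical)
c = np.ones(50)                # substrate concentration profile, bulk-normalized

lam = D * dt / dx**2           # explicit scheme is stable for lam <= 0.5
for _ in range(1000):
    # Central-difference diffusion update on the interior nodes.
    interior = c[1:-1] + lam * (c[2:] - 2 * c[1:-1] + c[:-2])
    # Michaelis-Menten consumption rate v = vmax * c / (km + c).
    mm = vmax * c[1:-1] / (km + c[1:-1])
    c[1:-1] = interior - dt * mm
    c[0] = 0.0                 # electrode surface: substrate fully consumed
    c[-1] = 1.0                # bulk boundary: concentration held at 1
```

The current in a voltammogram would follow from the concentration gradient at the electrode; coupling this update to a time-varying electrode potential, as in cyclic voltammetry, is the part the paper's full treatment handles.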

Relevance: 50.00%

Abstract:

PECUBE is a three-dimensional thermal-kinematic code capable of solving the heat production-diffusion-advection equation under a temporally varying surface boundary condition. It was initially developed to assess the effects of time-varying surface topography (relief) on low-temperature thermochronological datasets. Thermochronometric ages are predicted by tracking the time-temperature histories of rock-particles ending up at the surface and by combining these with various age-prediction models. In the decade since its inception, the PECUBE code has been under continuous development as its use became wider and addressed different tectonic-geomorphic problems. This paper describes several major recent improvements in the code, including its integration with an inverse-modeling package based on the Neighborhood Algorithm, the incorporation of fault-controlled kinematics, several different ways to address topographic and drainage change through time, the ability to predict subsurface (tunnel or borehole) data, prediction of detrital thermochronology data and a method to compare these with observations, and the coupling with landscape-evolution (or surface-process) models. Each new development is described together with one or several applications, so that the reader and potential user can clearly assess and make use of the capabilities of PECUBE. We end with describing some developments that are currently underway or should take place in the foreseeable future. (C) 2012 Elsevier B.V. All rights reserved.
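Age prediction from a tracked time-temperature history can be sketched with a single nominal closure temperature. This is a deliberate simplification (PECUBE couples its thermal solution to much more elaborate age-prediction models), and the path and closure value below are hypothetical:

```python
import numpy as np

def predicted_age(time_ma, temp_c, closure_c):
    """Age at which a monotonically cooling rock particle last cooled
    through closure_c, by linear interpolation; time_ma decreases toward
    0 (= present). If the path never exceeds closure_c, the whole history
    is retained and the age equals the start of the path."""
    for i in range(len(temp_c) - 1):
        if temp_c[i] >= closure_c > temp_c[i + 1]:
            f = (temp_c[i] - closure_c) / (temp_c[i] - temp_c[i + 1])
            return time_ma[i] + f * (time_ma[i + 1] - time_ma[i])
    return time_ma[0]

t = np.array([30.0, 20.0, 10.0, 0.0])      # Ma before present
T = np.array([250.0, 180.0, 90.0, 20.0])   # degrees C along the particle path
age = predicted_age(t, T, closure_c=110.0) # hypothetical closure temperature
```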