932 results for Numerical surface modeling


Relevance:

90.00%

Publisher:

Abstract:

Pulmonary crackling and the formation of liquid bridges are problems that have attracted the attention of scientists for centuries. To study these phenomena, a canonical cubic lattice-gas-like model was developed to explain the rupture of liquid bridges in lung airways [A. Alencar et al., 2006, PRE]. Here, we further develop this model and add entropy analysis to study thermodynamic properties such as free energy and force. The simulations were performed using the Monte Carlo method with the Metropolis algorithm. Exchanges between gas and liquid particles were performed randomly according to Kawasaki dynamics and weighted by the Boltzmann factor. Each particle, which can be solid (s), liquid (l) or gas (g), has 26 neighbors: 6 + 12 + 8, at distances 1, √2 and √3, respectively. The energy of a lattice site m is calculated by the expression E_m = ∑_{k=1}^{26} J_{i(m)j(k)}, where i, j ∈ {g, l, s}. Specifically, we studied the surface free energy of a liquid bridge trapped between two planes as its height is changed. Two methods were considered: first, only the internal energy was calculated; then, the entropy was also taken into account. No difference in the surface free energy was found between these two methods. We calculated the force exerted by the liquid bridge between the two planes using the numerical surface free energy. This force is strong at small heights and decreases as the distance between the two planes is increased. The liquid-gas system was also characterized by studying the variation of internal energy and heat capacity with temperature, using simulations with the same proportion of liquid and gas particles but different lattice sizes. The scaling of the liquid-gas system was also studied at low temperature using different values of the interaction J_ij.
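
As an illustration of the simulation scheme described above (not the authors' code), the sketch below performs a single nonlocal Kawasaki gas-liquid exchange on a periodic cubic lattice with 26-neighbor interactions and a Metropolis acceptance test; the coupling matrix J, lattice size, and temperature are placeholder assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Particle types
GAS, LIQ, SOL = 0, 1, 2

# Placeholder couplings J[i][j] (symmetric). For simplicity the same J is used for
# all 26 neighbors; the original model may weight the three neighbor shells differently.
J = np.array([[0.0,  0.0,  0.0],
              [0.0, -1.0, -0.5],
              [0.0, -0.5,  0.0]])

L = 16                                               # lattice size (assumption)
lattice = rng.choice([GAS, LIQ], size=(L, L, L), p=[0.5, 0.5])

# 26 neighbor offsets: 6 at distance 1, 12 at sqrt(2), 8 at sqrt(3)
NEIGH = [(dx, dy, dz)
         for dx in (-1, 0, 1) for dy in (-1, 0, 1) for dz in (-1, 0, 1)
         if (dx, dy, dz) != (0, 0, 0)]

def site_energy(lat, x, y, z):
    """E_m = sum over the 26 neighbors of J_{i(m) j(k)} (periodic boundaries)."""
    i = lat[x, y, z]
    return sum(J[i, lat[(x + dx) % L, (y + dy) % L, (z + dz) % L]]
               for dx, dy, dz in NEIGH)

def kawasaki_step(lat, beta):
    """Attempt one gas-liquid exchange, accepted with the Metropolis rule."""
    a = tuple(rng.integers(0, L, 3))
    b = tuple(rng.integers(0, L, 3))
    if lat[a] == lat[b] or SOL in (lat[a], lat[b]):
        return
    e_old = site_energy(lat, *a) + site_energy(lat, *b)
    lat[a], lat[b] = lat[b], lat[a]
    e_new = site_energy(lat, *a) + site_energy(lat, *b)
    dE = e_new - e_old
    if dE > 0 and rng.random() >= np.exp(-beta * dE):
        lat[a], lat[b] = lat[b], lat[a]              # reject: undo the swap

for _ in range(10_000):
    kawasaki_step(lattice, beta=2.0)
```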

Relevance:

90.00%

Publisher:

Abstract:

BACKGROUND: The interaction of sevoflurane and opioids can be described by response surface modeling using the hierarchical model. We expanded this for combined administration of sevoflurane, opioids, and 66 vol.% nitrous oxide (N2O), using historical data on the motor and hemodynamic responsiveness to incision, the minimal alveolar concentration, and the minimal alveolar concentration to block autonomic reflexes to nociceptive stimuli, respectively. METHODS: Four potential actions of 66 vol.% N2O were postulated: (1) N2O is equivalent to A ng/ml of fentanyl (additive); (2) N2O reduces the C50 of fentanyl by factor B; (3) N2O is equivalent to X vol.% of sevoflurane (additive); (4) N2O reduces the C50 of sevoflurane by factor Y. These four actions, and all combinations, were fitted to the data using NONMEM (version VI, Icon Development Solutions, Ellicott City, MD), assuming identical interaction parameters (A, B, X, Y) for movement and sympathetic responses. RESULTS: Sixty-six vol.% nitrous oxide evokes an additive effect corresponding to 0.27 ng/ml fentanyl (A) together with an additive effect corresponding to 0.54 vol.% sevoflurane (X). Parameters B and Y did not improve the fit. CONCLUSION: The effect of nitrous oxide can be incorporated into the hierarchical interaction model with a simple extension. The model can be used to predict the probability of movement and sympathetic responses during sevoflurane anesthesia, taking into account interactions with opioids and 66 vol.% N2O.
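
The sketch below is schematic only and is not the published hierarchical sevoflurane-opioid model: it merely illustrates how the fitted additive equivalents reported above (A = 0.27 ng/ml fentanyl, X = 0.54 vol.% sevoflurane for 66 vol.% N2O) are folded into the effective drug concentrations before any response-probability model is evaluated; the sigmoid and its parameters are placeholders.

```python
# Schematic only: shows how the additive N2O equivalents from the abstract modify
# the effective concentrations. The sigmoid below is a placeholder, not the
# published hierarchical interaction model.

A_FENT_EQ = 0.27   # ng/ml fentanyl equivalent of 66 vol.% N2O (parameter A)
X_SEVO_EQ = 0.54   # vol.% sevoflurane equivalent of 66 vol.% N2O (parameter X)

def effective_concentrations(sevo_vol_pct, fentanyl_ng_ml, n2o_66pct=True):
    """Add the N2O equivalents to both drug axes when 66 vol.% N2O is present."""
    if n2o_66pct:
        return sevo_vol_pct + X_SEVO_EQ, fentanyl_ng_ml + A_FENT_EQ
    return sevo_vol_pct, fentanyl_ng_ml

def p_no_response(sevo_eff, fent_eff, c50_sevo=2.0, c50_fent=1.5, gamma=4.0):
    """Placeholder sigmoid on a combined potency scale (illustration only)."""
    u = sevo_eff / c50_sevo + fent_eff / c50_fent   # additive potency units
    return u**gamma / (1.0 + u**gamma)

sevo_eff, fent_eff = effective_concentrations(1.5, 1.0)
print(f"P(no movement) ~ {p_no_response(sevo_eff, fent_eff):.2f}")
```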

Relevance:

90.00%

Publisher:

Abstract:

The Everglades Depth Estimation Network (EDEN) is an integrated network of real-time water-level monitoring, ground-elevation modeling, and water-surface modeling that provides scientists and managers with current (2000-present), online water-stage and water-depth information for the entire freshwater portion of the Greater Everglades. Continuous daily spatial interpolations of the EDEN network stage data are presented on a grid with 400-square-meter spacing. EDEN offers a consistent and documented dataset that can be used by scientists and managers to: (1) guide large-scale field operations, (2) integrate hydrologic and ecological responses, and (3) support biological and ecological assessments that measure ecosystem responses to the implementation of the Comprehensive Everglades Restoration Plan (CERP) (U.S. Army Corps of Engineers, 1999). The target users are biologists and ecologists examining trophic-level responses to hydrodynamic changes in the Everglades. The first objective of this report is to validate the spatially continuous EDEN water-surface model for the Everglades, Florida, developed by Pearlstine et al. (2007), by using an independent field-measured dataset. The second objective is to demonstrate two applications of the EDEN water-surface model: to estimate site-specific ground elevation by using the validated EDEN water-surface model and observed water-depth data, and to create water-depth hydrographs for tree islands. We found no statistically significant differences between model-predicted and field-observed water-stage data in either southern Water Conservation Area (WCA) 3A or WCA 3B. Tree island elevations were derived by subtracting field water-depth measurements from the predicted EDEN water surface. Water-depth hydrographs were then computed by subtracting tree island elevations from the EDEN water stage. Overall, the model is reliable, with a root mean square error (RMSE) of 3.31 cm. By region, the RMSE is 2.49 cm in WCA 3A and 7.77 cm in WCA 3B. This new landscape-scale hydrological model has wide applications for ongoing research and management efforts that are vital to restoration of the Florida Everglades. The accurate, high-resolution hydrological data generated over broad spatial and temporal scales by the EDEN model provide a previously missing key to understanding the habitat requirements and linkages among native and invasive populations, including fish, wildlife, wading birds, and plants. The EDEN model is a powerful tool that could be adapted for other ecosystem-scale restoration and management programs worldwide.
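
The two applications described above reduce to simple arithmetic on the interpolated water surface; the sketch below, with made-up numbers rather than EDEN data, shows the relationships: ground elevation = EDEN water surface minus observed water depth, and water-depth hydrograph = EDEN water stage minus estimated elevation.

```python
import numpy as np

# Hypothetical example values (cm, all relative to the same vertical datum)
eden_stage = np.array([310.0, 305.5, 298.2, 290.7])   # EDEN water surface at a tree island over time
observed_depth_on_survey_day = 42.0                    # field-measured water depth on day 0

# Application 1: site-specific ground elevation from the validated water surface
ground_elevation = eden_stage[0] - observed_depth_on_survey_day

# Application 2: water-depth hydrograph for the tree island
depth_hydrograph = eden_stage - ground_elevation

print(f"elevation = {ground_elevation:.1f} cm, depths = {depth_hydrograph}")
```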

Relevance:

90.00%

Publisher:

Abstract:

The world's largest fossil oyster reef, formed by the giant oyster Crassostrea gryphoides and located in Stetten (north of Vienna, Austria), has been studied by Harzhauser et al. (2015, 2016) and Djuricic et al. (2016). Digital documentation of this unique geological site is provided by terrestrial laser scanning (TLS) at the millimeter scale. Obtaining meaningful results is not merely a matter of data acquisition with a suitable device; it requires proper planning, data management, and postprocessing. Terrestrial laser scanning technology has a high potential for providing precise 3D mapping that serves as the basis for automatic object detection in different scenarios; however, it faces challenges in the presence of large amounts of data and the irregular geometry of an oyster reef. We provide a detailed description of the techniques and strategy used for data collection and processing in Djuricic et al. (2016). Laser scanning made it possible to measure surface points on an estimated 46,840 shells; these oyster specimens are up to 60 cm long, and their surfaces are modeled with a high accuracy of 1 mm. In addition to the laser scanning measurements, more than 300 photographs were captured, and an orthophoto mosaic was generated with a ground sampling distance (GSD) of 0.5 mm. This high-resolution 3D information and the photographic texture serve as the basis for ongoing and future geological and paleontological analyses. Moreover, they provide unprecedented documentation for conservation issues at a unique natural heritage site.

Relevance:

90.00%

Publisher:

Abstract:

This paper deals with fractional differential equations with dependence on a Caputo fractional derivative of real order. The goal is to show, based on concrete examples and experimental data from several experiments, that fractional differential equations may model certain problems more effectively than ordinary differential equations. A numerical optimization approach based on least-squares approximation is used to determine the order of the fractional operator that best describes the real data, as well as other related parameters.
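
As a hedged illustration of the kind of fitting described (not the paper's own code, data, or discretization), the sketch below solves the fractional relaxation equation D^α y = -λ y with a Grünwald-Letnikov scheme and uses least squares to recover the order α and rate λ from noisy synthetic data.

```python
import numpy as np
from scipy.optimize import least_squares

def solve_fractional_relaxation(alpha, lam, y0, t):
    """Grünwald-Letnikov scheme for the Caputo equation D^alpha y = -lam * y, y(0) = y0.

    Works on z = y - y0 so the Caputo and Riemann-Liouville derivatives coincide.
    """
    h = t[1] - t[0]                        # assumes a uniform time grid
    n = len(t)
    w = np.empty(n)                        # GL weights: w_0 = 1, w_j = w_{j-1} * (1 - (alpha + 1)/j)
    w[0] = 1.0
    for j in range(1, n):
        w[j] = w[j - 1] * (1.0 - (alpha + 1.0) / j)
    z = np.zeros(n)
    ha = h**alpha
    for k in range(1, n):
        hist = np.dot(w[1:k + 1], z[k - 1::-1])          # sum_{j=1}^{k} w_j z_{k-j}
        z[k] = -(lam * y0 * ha + hist) / (1.0 + lam * ha)
    return z + y0

# Synthetic "experimental" data (placeholder for real measurements)
t = np.linspace(0.0, 5.0, 200)
rng = np.random.default_rng(1)
y_obs = solve_fractional_relaxation(0.7, 1.2, 1.0, t) + 0.01 * rng.standard_normal(t.size)

def residuals(params):
    alpha, lam = params
    return solve_fractional_relaxation(alpha, lam, 1.0, t) - y_obs

fit = least_squares(residuals, x0=[0.9, 1.0], bounds=([0.05, 0.01], [1.0, 10.0]))
print("estimated order alpha = %.3f, rate lambda = %.3f" % tuple(fit.x))
```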

Relevance:

80.00%

Publisher:

Abstract:

Transient hyperopic refractive shifts occur on a timescale of weeks in some patients after initiation of therapy for hyperglycemia, and are usually followed by recovery to the original refraction. Possible lenticular origin of these changes is considered in terms of a paraxial gradient index model. Assuming that the lens thickness and curvatures remain unchanged, as observed in practice, it appears possible to account for initial hyperopic refractive shifts of up to a few diopters by reduction in refractive index near the lens center and alteration in the rate of change between center and surface, so that most of the index change occurs closer to the lens surface. Restoration of the original refraction depends on further change in the refractive index distribution with more gradual changes in refractive index from the lens center to its surface. Modeling limitations are discussed.

Relevance:

80.00%

Publisher:

Abstract:

We developed an anatomical mapping technique to detect hippocampal and ventricular changes in Alzheimer disease (AD). The resulting maps are sensitive to longitudinal changes in brain structure as the disease progresses. An anatomical surface modeling approach was combined with surface-based statistics to visualize the region and rate of atrophy in serial MRI scans and isolate where these changes link with cognitive decline. Fifty-two high-resolution MRI scans were acquired from 12 AD patients (age: 68.4 ± 1.9 years) and 14 matched controls (age: 71.4 ± 0.9 years), each scanned twice (2.1 ± 0.4 years apart). 3D parametric mesh models of the hippocampus and temporal horns were created in sequential scans and averaged across subjects to identify systematic patterns of atrophy. As an index of radial atrophy, 3D distance fields were generated relating each anatomical surface point to a medial curve threading down the medial axis of each structure. Hippocampal atrophic rates and ventricular expansion were assessed statistically using surface-based permutation testing and were faster in AD than in controls. Using color-coded maps and video sequences, these changes were visualized as they progressed anatomically over time. Additional maps localized regions where atrophic changes linked with cognitive decline. Temporal horn expansion maps were more sensitive to AD progression than maps of hippocampal atrophy, but both maps correlated with clinical deterioration. These quantitative, dynamic visualizations of hippocampal atrophy and ventricular expansion rates in aging and AD may provide a promising measure to track AD progression in drug trials.
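
The radial-atrophy index described above is essentially the distance from each surface vertex to a medial curve; a minimal numpy sketch of that computation (with hypothetical synthetic arrays, not the authors' pipeline) is:

```python
import numpy as np

def radial_distance_field(surface_vertices, medial_curve_points):
    """For each surface vertex, distance to the nearest sampled point on the medial curve.

    surface_vertices: (N, 3) array of mesh vertex coordinates
    medial_curve_points: (M, 3) array of points sampled along the medial axis curve
    """
    diff = surface_vertices[:, None, :] - medial_curve_points[None, :, :]   # (N, M, 3)
    return np.sqrt((diff ** 2).sum(axis=-1)).min(axis=1)

# Hypothetical data: a noisy tube of radius ~5 around a straight medial axis
rng = np.random.default_rng(0)
medial = np.column_stack([np.zeros(50), np.zeros(50), np.linspace(0, 40, 50)])
theta = rng.uniform(0, 2 * np.pi, 500)
z = rng.uniform(0, 40, 500)
r = 5.0 + 0.5 * rng.standard_normal(500)
surface = np.column_stack([r * np.cos(theta), r * np.sin(theta), z])

distances = radial_distance_field(surface, medial)
print(distances.mean())   # ~5.0 for this synthetic tube
```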

Relevance:

80.00%

Publisher:

Abstract:

Population-based brain mapping provides great insight into the trajectory of aging and dementia, as well as brain changes that normally occur over the human life span. We describe three novel brain mapping techniques, cortical thickness mapping, tensor-based morphometry (TBM), and hippocampal surface modeling, which offer enormous power for measuring disease progression in drug trials, and shed light on the neuroscience of brain degeneration in Alzheimer's disease (AD) and mild cognitive impairment (MCI). We report the first time-lapse maps of cortical atrophy spreading dynamically in the living brain, based on averaging data from populations of subjects with Alzheimer's disease and normal subjects imaged longitudinally with MRI. These dynamic sequences show a rapidly advancing wave of cortical atrophy sweeping from limbic and temporal cortices into higher-order association and ultimately primary sensorimotor areas, in a pattern that correlates with cognitive decline. A complementary technique, TBM, reveals the 3D profile of atrophic rates at each point in the brain. A third technique, hippocampal surface modeling, plots the profile of shape alterations across the hippocampal surface. The three techniques provide moderate to highly automated analyses of images, have been validated on hundreds of scans, and are sensitive to clinically relevant changes in individual patients and groups undergoing different drug treatments. We compare time-lapse maps of AD, MCI, and other dementias, correlate these changes with cognition, and relate them to similar time-lapse maps of childhood development, schizophrenia, and HIV-associated brain degeneration. Strengths and weaknesses of these different imaging measures for basic neuroscience and drug trials are discussed.

Relevance:

80.00%

Publisher:

Abstract:

The objective of this thesis is to develop a framework to conduct velocity-resolved, scalar-modeled (VR-SM) simulations, which will enable accurate simulations at higher Reynolds and Schmidt (Sc) numbers than are currently feasible. The framework established will serve as a first step to enable future simulation studies for practical applications. To achieve this goal, in-depth analyses of the physical, numerical, and modeling aspects related to Sc >> 1 are presented, specifically when modeling in the viscous-convective subrange. Transport characteristics are scrutinized by examining scalar-velocity Fourier mode interactions in Direct Numerical Simulation (DNS) datasets and suggest that scalar modes in the viscous-convective subrange do not directly affect large-scale transport for high Sc. Further observations confirm that discretization errors inherent in numerical schemes can be sufficiently large to wipe out any meaningful contribution from subfilter models. This provides strong incentive to develop more effective numerical schemes to support high-Sc simulations. To lower numerical dissipation while maintaining physically and mathematically appropriate scalar bounds during the convection step, a novel method of enforcing bounds is formulated, specifically for use with cubic Hermite polynomials. Boundedness of the transported scalar is enforced by applying derivative-limiting techniques, and physically plausible single sub-cell extrema are allowed to exist to help minimize numerical dissipation. The proposed bounding algorithm results in significant performance gains in DNS of turbulent mixing layers and of homogeneous isotropic turbulence. Next, the combined physical/mathematical behavior of the subfilter scalar-flux vector is analyzed in homogeneous isotropic turbulence by examining the vector orientation in the strain-rate eigenframe. The results indicate no discernible dependence on the modeled scalar field, and lead to the identification of the tensor-diffusivity model as a good representation of the subfilter flux. Velocity-resolved, scalar-modeled simulations of homogeneous isotropic turbulence are conducted to confirm the behavior theorized in these a priori analyses, and suggest that the tensor-diffusivity model is ideal for use in the viscous-convective subrange. Simulations of a turbulent mixing layer are also discussed, with the partial objective of analyzing the Schmidt number dependence of a variety of scalar statistics. Large-scale statistics are confirmed to be relatively independent of the Schmidt number for Sc >> 1, which is explained by the dominance of subfilter dissipation over resolved molecular dissipation in the simulations. Overall, the VR-SM framework presented is quite effective in predicting large-scale transport characteristics of high-Schmidt-number scalars; however, predicting subfilter quantities would entail additional modeling intended specifically for this purpose. The VR-SM simulations presented in this thesis provide the opportunity to overlap with experimental studies, while at the same time creating an assortment of baseline datasets for future validation of LES models, thereby satisfying the objectives outlined for this work.
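
For illustration, here is a minimal sketch of derivative-limited cubic Hermite interpolation in one dimension. It uses a Fritsch-Carlson-style monotone limiter, which is a stricter, generic stand-in for the thesis's bounding scheme (the scheme described above additionally permits single sub-cell extrema); the data and grid are placeholders.

```python
import numpy as np

def limited_hermite_derivatives(x, y):
    """Fritsch-Carlson-style derivative limiting for cubic Hermite interpolation.

    Returns nodal derivatives m such that the piecewise cubic Hermite interpolant
    is monotone on each interval, and therefore bounded by the nodal data.
    """
    h = np.diff(x)
    d = np.diff(y) / h                        # secant slopes on each interval
    m = np.empty_like(y)
    m[1:-1] = 0.5 * (d[:-1] + d[1:])          # initial guess: average of adjacent secants
    m[0], m[-1] = d[0], d[-1]
    for i in range(len(d)):
        if d[i] == 0.0:                       # flat interval: force a flat cubic
            m[i] = m[i + 1] = 0.0
            continue
        a, b = m[i] / d[i], m[i + 1] / d[i]
        if a < 0.0: m[i] = 0.0; a = 0.0       # derivative opposing the data trend
        if b < 0.0: m[i + 1] = 0.0; b = 0.0
        s = a * a + b * b
        if s > 9.0:                           # clamp into the monotonicity region
            tau = 3.0 / np.sqrt(s)
            m[i], m[i + 1] = tau * a * d[i], tau * b * d[i]
    return m

def hermite_eval(x, y, m, xq):
    """Evaluate the cubic Hermite interpolant at query points xq."""
    k = np.clip(np.searchsorted(x, xq) - 1, 0, len(x) - 2)
    h = x[k + 1] - x[k]
    t = (xq - x[k]) / h
    h00 = 2 * t**3 - 3 * t**2 + 1
    h10 = t**3 - 2 * t**2 + t
    h01 = -2 * t**3 + 3 * t**2
    h11 = t**3 - t**2
    return h00 * y[k] + h10 * h * m[k] + h01 * y[k + 1] + h11 * h * m[k + 1]

x = np.linspace(0.0, 1.0, 9)
y = np.clip(np.sin(6 * x), 0.0, 1.0)          # bounded data with a flat region
m = limited_hermite_derivatives(x, y)
xq = np.linspace(0.0, 1.0, 200)
yq = hermite_eval(x, y, m, xq)
print(yq.min(), yq.max())                     # stays within [min(y), max(y)]
```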

Relevance:

80.00%

Publisher:

Abstract:

The effects of disk flexibility and multistage coupling on the dynamics of bladed disks with and without blade mistuning are investigated. Both free and forced responses are examined using finite element representations of example single-stage and two-stage rotor models. The reported work demonstrates the importance of proper treatment of interstage (stage-to-stage) boundaries in order to adequately capture disk-blade modal interaction in eigenfrequency veering regions. The modified disk-blade modal interactions resulting from interstage-coupling-induced changes in disk flexibility are found to have a significant impact on (a) tuned responses due to excitations passing through eigenfrequency veering regions, and (b) a design's sensitivity to blade mistuning. Hence, the findings in this paper suggest that multistage analyses may be required when excitations are expected to fall in or near eigenfrequency veering regions or when the sensitivity to blade mistuning is to be accounted for. Conversely, the observed sensitivity to disk flexibility also indicates that the severity of unfavorable structural interblade coupling may be reduced significantly by redesigning the disk(s) and the stage-to-stage connectivity. The relatively drastic effects of such modifications illustrated in this work indicate that the design modifications required to alleviate veering-related response problems may be less comprehensive than might have been expected.
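
Eigenfrequency veering, which is central to the discussion above, can be illustrated with a toy two-degree-of-freedom system (not the finite element rotor models of the paper): as a tuning parameter sweeps one uncoupled frequency through the other, the coupled eigenfrequency loci approach and then diverge instead of crossing.

```python
import numpy as np

# Toy 2-DOF system: two unit masses with grounded stiffnesses k1 (fixed) and k2 (swept),
# connected by a coupling spring kc. With kc > 0 the eigenfrequency loci veer apart
# instead of crossing as k2 sweeps through k1; with kc = 0 they would cross.
k1, kc = 1.0, 0.05
for k2 in np.linspace(0.8, 1.2, 9):
    K = np.array([[k1 + kc, -kc],
                  [-kc, k2 + kc]])
    w2 = np.sort(np.linalg.eigvalsh(K))            # squared natural frequencies (M = I)
    print(f"k2 = {k2:.3f}  ->  omega = {np.sqrt(w2[0]):.4f}, {np.sqrt(w2[1]):.4f}")
```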

Relevance:

80.00%

Publisher:

Abstract:

An efficient technique to cut polygonal meshes as a step in the geometric modeling of topographic and geological data has been developed. In boundary-represented models of outcropping strata and faulted horizons, polygonal meshes often intersect each other. TRICUT determines the line of intersection and re-triangulates the area of contact. Along this line the mesh is split into two or more parts, which can be selected for removal. The user interaction takes place in the 3D model space. The intersection, selection and removal are under graphic control. The visualization of outcropping geological structures in digital terrain models is improved by determining intersections against a slightly shifted terrain model. Thus, the outcrop line becomes a surface which overlaps the terrain in its initial position. The area of this overlapping surface changes with respect to the strike and dip of the structure, the morphology and the offset. Some applications of TRICUT to different real datasets are shown. TRICUT is implemented in C++ using the Visualization Toolkit in conjunction with the RAPID and TRIANGLE libraries. The program runs under LINUX and UNIX using the MESA OpenGL library. This work gives an example of solving a complex 3D geometric problem by integrating available robust public domain software. (C) 2002 Elsevier B.V. Ltd. All rights reserved.
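
As a rough, modern analogue of the intersection/re-triangulation step (not the TRICUT implementation, which builds on RAPID and TRIANGLE in C++), VTK's Python bindings can compute the intersection line of two polygonal surfaces and split them along it; vtkIntersectionPolyDataFilter is assumed to be available in the installed VTK, and the spheres are placeholder geometry.

```python
# Sketch only: uses VTK's vtkIntersectionPolyDataFilter as a stand-in for the
# mesh-intersection step described for TRICUT. Requires the `vtk` Python package.
import vtk

# Two intersecting triangulated surfaces (spheres as placeholder geometry)
sphere1 = vtk.vtkSphereSource()
sphere1.SetCenter(0.0, 0.0, 0.0)
sphere1.SetRadius(1.0)

sphere2 = vtk.vtkSphereSource()
sphere2.SetCenter(0.8, 0.0, 0.0)
sphere2.SetRadius(1.0)

intersector = vtk.vtkIntersectionPolyDataFilter()
intersector.SetInputConnection(0, sphere1.GetOutputPort())
intersector.SetInputConnection(1, sphere2.GetOutputPort())
intersector.SplitFirstOutputOn()       # re-triangulate the first surface along the line
intersector.SplitSecondOutputOn()      # and the second surface as well
intersector.Update()

line_of_intersection = intersector.GetOutput(0)   # polyline(s) where the meshes meet
split_surface_1 = intersector.GetOutput(1)        # first mesh, re-triangulated along the line
split_surface_2 = intersector.GetOutput(2)        # second mesh, re-triangulated along the line
print("intersection points:", line_of_intersection.GetNumberOfPoints())
```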

Relevance:

80.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

80.00%

Publisher:

Abstract:

1. Distance sampling is a widely used technique for estimating the size or density of biological populations. Many distance sampling designs and most analyses use the software Distance. 2. We briefly review distance sampling and its assumptions, outline the history, structure and capabilities of Distance, and provide hints on its use. 3. Good survey design is a crucial prerequisite for obtaining reliable results. Distance has a survey design engine, with a built-in geographic information system, that allows properties of different proposed designs to be examined via simulation, and survey plans to be generated. 4. A first step in the analysis of distance sampling data is modeling the probability of detection. Distance contains three increasingly sophisticated analysis engines for this: conventional distance sampling, which models detection probability as a function of distance from the transect and assumes all objects at zero distance are detected; multiple-covariate distance sampling, which allows covariates in addition to distance; and mark–recapture distance sampling, which relaxes the assumption of certain detection at zero distance. 5. All three engines allow estimation of density or abundance, stratified if required, with associated measures of precision calculated either analytically or via the bootstrap. 6. Advanced analysis topics covered include the use of multipliers to allow analysis of indirect surveys (such as dung or nest surveys), the density surface modeling analysis engine for spatial and habitat modeling, and information about accessing the analysis engines directly from other software. 7. Synthesis and applications. Distance sampling is a key method for producing abundance and density estimates in challenging field conditions. The theory underlying the methods continues to expand to cope with realistic estimation situations. In step with theoretical developments, state-of-the-art software that implements these methods is described, making them accessible to practicing ecologists.
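
To make the conventional-distance-sampling idea concrete, here is a small sketch independent of the Distance software, using made-up perpendicular distances: it fits a half-normal detection function by maximum likelihood and converts the fit into a line-transect density estimate. The truncation distance, transect length, and data are all placeholder assumptions.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize_scalar

# Hypothetical line-transect data: perpendicular distances (m) of detections
rng = np.random.default_rng(3)
distances = np.abs(rng.normal(0.0, 25.0, size=120))
w = 80.0                                   # truncation distance (m)
L_total = 12_000.0                         # total transect length (m)
distances = distances[distances <= w]
n = distances.size

def effective_half_width(sigma):
    """mu = integral_0^w exp(-x^2 / (2 sigma^2)) dx for the half-normal detection function."""
    return sigma * np.sqrt(2.0 * np.pi) * (norm.cdf(w / sigma) - 0.5)

def negloglik(sigma):
    """Negative log-likelihood of the truncated half-normal perpendicular-distance pdf."""
    g = np.exp(-distances**2 / (2.0 * sigma**2))
    return -np.sum(np.log(g / effective_half_width(sigma)))

sigma_hat = minimize_scalar(negloglik, bounds=(1.0, 500.0), method="bounded").x
mu_hat = effective_half_width(sigma_hat)           # effective strip half-width (m)
density = n / (2.0 * L_total * mu_hat)             # objects per square metre
print(f"sigma = {sigma_hat:.1f} m, mu = {mu_hat:.1f} m, density = {density * 1e4:.2f} per ha")
```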

Relevance:

80.00%

Publisher:

Abstract:

This thesis investigates interactive scene reconstruction and understanding using only RGB-D data. Indeed, we believe that, in the near future, depth cameras will remain a cheap, low-power 3D sensing alternative that is also suitable for mobile devices. Therefore, our contributions build on top of state-of-the-art approaches to achieve advances in three main challenging scenarios, namely mobile mapping, large-scale surface reconstruction, and semantic modeling. First, we will describe an effective approach to Simultaneous Localization And Mapping (SLAM) on platforms with limited resources, such as a tablet device. Unlike previous methods, dense reconstruction is achieved by reprojection of RGB-D frames, while local consistency is maintained by deploying relative bundle adjustment principles. We will show quantitative results comparing our technique to the state of the art, as well as detailed reconstructions of various environments ranging from rooms to small apartments. Then, we will address large-scale surface modeling from depth maps exploiting parallel GPU computing. We will develop a real-time camera tracking method based on the popular KinectFusion system and an online surface alignment technique capable of counteracting drift errors and closing small loops. We will show very high-quality meshes that outperform existing methods on publicly available datasets as well as on data recorded with our RGB-D camera, even in complete darkness. Finally, we will move to our Semantic Bundle Adjustment framework to effectively combine object detection and SLAM in a unified system. Though the mathematical framework we will describe is not restricted to a particular sensing technology, in the experimental section we will again refer only to RGB-D sensing. We will discuss successful implementations of our algorithm, showing the benefit of jointly performing object detection, camera tracking and environment mapping.
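
The dense-reconstruction step mentioned above hinges on back-projecting each RGB-D frame through the camera intrinsics into 3D. A minimal numpy sketch of that operation, with placeholder intrinsics rather than the thesis's pipeline, is:

```python
import numpy as np

def backproject_depth(depth, fx, fy, cx, cy, pose=np.eye(4)):
    """Back-project a depth image (meters) into a 3D point cloud in world coordinates.

    depth: (H, W) array; zero entries are treated as missing measurements.
    fx, fy, cx, cy: pinhole intrinsics; pose: 4x4 camera-to-world transform.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts_cam = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    pts_cam = pts_cam[pts_cam[:, 2] > 0]                 # drop invalid (zero-depth) pixels
    # Transform into the world frame with the estimated camera pose
    return pts_cam @ pose[:3, :3].T + pose[:3, 3]

# Placeholder intrinsics roughly matching a 640x480 consumer depth camera
depth = np.full((480, 640), 1.5)            # fake flat depth map, 1.5 m everywhere
cloud = backproject_depth(depth, fx=525.0, fy=525.0, cx=319.5, cy=239.5)
print(cloud.shape)                          # (307200, 3)
```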

Relevance:

80.00%

Publisher:

Abstract:

Temporal hollowing due to temporal muscle atrophy after standard skull base surgery is common. Various techniques have been previously described to correct the disfiguring defect. Most often, reconstruction is performed using freehand-molded polymethylmethacrylate cement. This method and material are insufficient in terms of aesthetic results and implant characteristics. We herein propose reconstruction of such defects with a polyetheretherketone (PEEK)-based patient-specific implant (PSI) including soft-tissue augmentation to preserve normal facial topography. We describe a patient who presented with a large temporo-orbital hemangioma that had been repaired with polymethylmethacrylate 25 years earlier. Because of a toxic skin atrophy fistula, followed by infection and meningitis, this initial implant had to be removed. The large, disfiguring temporo-orbital defect was reconstructed with a PEEK-based PSI. The lateral orbital wall and the atrophied temporal muscle were augmented with computer-aided design and surface modeling techniques. The operative procedure to implant and adapt the reconstructed PEEK-based PSI was simple, and an excellent cosmetic outcome was achieved. The postoperative clinical course was uneventful over a 5-year follow-up period. Polyetheretherketone-based combined bony and soft contour remodeling is a feasible and effective method for cranioplasty, including combined bone and soft-tissue reconstruction of temporo-orbital defects. Manual reconstruction of this cosmetically delicate area carries an exceptional risk of disfiguring results. Augmentation surgery in this anatomic location needs accurate PSIs to achieve satisfactory cosmetic results. The cosmetic outcome achieved in this case is superior compared with previously reported techniques.