961 results for surface modeling


Relevance: 60.00%

Publisher:

Abstract:

This paper demonstrates a modeling and design approach that couples computational mechanics techniques with numerical optimisation and statistical models for virtual prototyping and testing in different application areas concerning the reliability of electronic packages. The integrated software modules provide a design engineer in the electronic manufacturing sector with fast design and process solutions by optimising key parameters and taking into account the complexity of certain operational conditions. The integrated modeling framework is obtained by coupling the multi-physics finite element framework PHYSICA with the numerical optimisation tool VisualDOC into a fully automated design tool for the solution of electronic packaging problems. Response Surface Modeling Methodology and Design of Experiments statistical tools, together with numerical optimisation techniques, are demonstrated as part of the modeling framework. Two different problems are discussed and solved using the integrated numerical FEM-optimisation tool. First, an example of thermal management of an electronic package on a board is illustrated. The location of the device is optimised to ensure reduced junction temperature and stress in the die, subject to a given cooling air profile and other heat-dissipating active components. In the second example, thermo-mechanical simulations of solder creep deformation are presented to predict flip-chip reliability and are subsequently used to optimise the lifetime of solder interconnects under thermal cycling.
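
As a hedged illustration of the general response-surface idea used here (not of the PHYSICA/VisualDOC coupling itself), the sketch below fits a quadratic response surface to a small design-of-experiments sample and then minimises the cheap surrogate; the objective function, parameter ranges and names are hypothetical placeholders for an FEM run.

```python
# Minimal response-surface sketch: fit a quadratic surrogate to DOE samples
# of a hypothetical junction-temperature model, then minimise the surrogate.
import numpy as np
from itertools import product
from scipy.optimize import minimize

def junction_temp(x, y):
    # Placeholder "expensive simulation" standing in for an FEM run.
    return 80 + 0.5 * (x - 2.0) ** 2 + 0.8 * (y - 1.0) ** 2 + 0.2 * x * y

# Full-factorial design of experiments over two design parameters.
levels = np.linspace(0.0, 4.0, 5)
X = np.array(list(product(levels, levels)))          # (25, 2) design points
T = np.array([junction_temp(x, y) for x, y in X])    # simulated responses

# Quadratic response surface: T ~ 1, x, y, x^2, y^2, x*y
A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                     X[:, 0] ** 2, X[:, 1] ** 2, X[:, 0] * X[:, 1]])
coef, *_ = np.linalg.lstsq(A, T, rcond=None)

def surrogate(p):
    x, y = p
    return coef @ np.array([1.0, x, y, x * x, y * y, x * y])

# Optimise the cheap surrogate instead of the expensive simulation.
res = minimize(surrogate, x0=[1.0, 1.0], bounds=[(0, 4), (0, 4)])
print("optimal device location (surrogate):", res.x)
```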

Relevance: 60.00%

Publisher:

Abstract:

An efficient technique to cut polygonal meshes as a step in the geometric modeling of topographic and geological data has been developed. In boundary-represented models of outcropping strata and faulted horizons, polygonal meshes often intersect each other. TRICUT determines the line of intersection and re-triangulates the area of contact. Along this line the mesh is split into two or more parts, which can be selected for removal. The user interaction takes place in the 3D model space; the intersection, selection and removal are under graphic control. The visualization of outcropping geological structures in digital terrain models is improved by determining intersections against a slightly shifted terrain model. Thus, the outcrop line becomes a surface which overlaps the terrain in its initial position. The area of this overlapping surface changes with respect to the strike and dip of the structure, the morphology and the offset. Some applications of TRICUT to different real datasets are shown. TRICUT is implemented in C++ using the Visualization Toolkit in conjunction with the RAPID and TRIANGLE libraries. The program runs under Linux and UNIX using the Mesa OpenGL library. This work gives an example of solving a complex 3D geometric problem by integrating available robust public-domain software. (C) 2002 Elsevier B.V. Ltd. All rights reserved.
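
TRICUT itself is a C++ tool; as a rough sketch of the underlying operation (computing the line of intersection between two triangulated surfaces and splitting the meshes along it), VTK's Python bindings expose an intersection filter. The two spheres below are placeholders for the geological surface meshes, and this is a generic illustration, not the TRICUT implementation.

```python
# Sketch of a mesh-mesh intersection with VTK's Python bindings; two spheres
# stand in for the geological surface meshes.
import vtk

surf_a = vtk.vtkSphereSource()
surf_a.SetCenter(0.0, 0.0, 0.0)
surf_b = vtk.vtkSphereSource()
surf_b.SetCenter(0.3, 0.0, 0.0)

cutter = vtk.vtkIntersectionPolyDataFilter()
cutter.SetInputConnection(0, surf_a.GetOutputPort())
cutter.SetInputConnection(1, surf_b.GetOutputPort())
cutter.Update()

line_of_intersection = cutter.GetOutput(0)  # polyline where the meshes meet
split_mesh_a = cutter.GetOutput(1)          # first mesh re-triangulated along the line
print("intersection points:", line_of_intersection.GetNumberOfPoints())
```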

Relevance: 60.00%

Publisher:

Abstract:

The present experiment aimed to study the influence of point positioning in a field surveyed by irradiation (radial survey) on the production of a planialtimetric plan. A planialtimetric survey was carried out in a 14-acre experimental area with well-defined topographic variations. Planialtimetric maps were produced by a manual procedure and with the Datageosis and Topoesalq packages. Datageosis generated all contour curves after numerical surface modeling, whereas Topoesalq provided only height reports, so its contour curves were drawn manually; the third method was entirely manual. Because the planialtimetric representations differed, longitudinal profiles were taken at the sites where the plans diverged most. When the profiles and plans were compared, the plan produced by Datageosis was found to represent the relief best. Subsequently, only the irradiated field points were evaluated, each with readings positioned before and after every relief variation. Processing by the three methods produced plans that represented the local planialtimetry adequately according to the control profiles. It was concluded that the field procedure should be planned to suit the subsequent data-processing method, so that the resulting planialtimetric plan conforms to the local topography.
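
As a generic illustration of the numerical surface modeling step (the plan-drawing packages named above are not reproduced here), scattered survey points can be interpolated onto a regular grid and contoured; the coordinates and elevations below are synthetic.

```python
# Generic numerical surface modeling sketch: interpolate scattered survey
# points onto a regular grid and derive contour levels (synthetic data).
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(0)
east, north = rng.uniform(0, 100, 200), rng.uniform(0, 100, 200)   # m
elev = 500 + 0.1 * east - 0.05 * north + rng.normal(0, 0.2, 200)   # m

xi, yi = np.meshgrid(np.linspace(0, 100, 101), np.linspace(0, 100, 101))
grid = griddata((east, north), elev, (xi, yi), method="cubic")

contour_levels = np.arange(np.nanmin(grid), np.nanmax(grid), 1.0)  # 1 m interval
print("contour levels:", contour_levels.round(1))
```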

Relevance: 60.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance: 60.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance: 60.00%

Publisher:

Abstract:

1. Distance sampling is a widely used technique for estimating the size or density of biological populations. Many distance sampling designs and most analyses use the software Distance. 2. We briefly review distance sampling and its assumptions, outline the history, structure and capabilities of Distance, and provide hints on its use. 3. Good survey design is a crucial prerequisite for obtaining reliable results. Distance has a survey design engine, with a built-in geographic information system, that allows properties of different proposed designs to be examined via simulation, and survey plans to be generated. 4. A first step in analysis of distance sampling data is modeling the probability of detection. Distance contains three increasingly sophisticated analysis engines for this: conventional distance sampling, which models detection probability as a function of distance from the transect and assumes all objects at zero distance are detected; multiple-covariate distance sampling, which allows covariates in addition to distance; and mark–recapture distance sampling, which relaxes the assumption of certain detection at zero distance. 5. All three engines allow estimation of density or abundance, stratified if required, with associated measures of precision calculated either analytically or via the bootstrap. 6. Advanced analysis topics covered include the use of multipliers to allow analysis of indirect surveys (such as dung or nest surveys), the density surface modeling analysis engine for spatial and habitat modeling, and information about accessing the analysis engines directly from other software. 7. Synthesis and applications. Distance sampling is a key method for producing abundance and density estimates in challenging field conditions. The theory underlying the methods continues to expand to cope with realistic estimation situations. In step with theoretical developments, state-of-the-art software that implements these methods is described, making them accessible to practicing ecologists.
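
The conventional distance sampling engine mentioned in point 4 amounts to fitting a detection function and converting it into an effective strip width; a minimal half-normal sketch (not the Distance software itself, with made-up perpendicular distances and effort) looks like this.

```python
# Minimal conventional distance sampling sketch: half-normal detection
# function fitted to perpendicular distances, followed by a density estimate.
import numpy as np

distances_m = np.array([2.0, 5.0, 1.0, 12.0, 7.0, 3.0, 9.0, 4.0, 6.0, 2.5])  # made up
effort_km = 4.0                               # total transect length, made up

# MLE for the half-normal scale: sigma^2 equals the mean squared distance.
sigma = np.sqrt(np.mean(distances_m ** 2))

# Effective strip half-width: integral of exp(-x^2 / (2 sigma^2)) from 0 to infinity.
esw_m = sigma * np.sqrt(np.pi / 2.0)

n = len(distances_m)
density_per_km2 = n / (2.0 * (esw_m / 1000.0) * effort_km)   # objects per km^2
print(f"sigma = {sigma:.2f} m, ESW = {esw_m:.2f} m, D = {density_per_km2:.1f} / km^2")
```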

Relevance: 60.00%

Publisher:

Abstract:

This thesis investigates interactive scene reconstruction and understanding using RGB-D data only. Indeed, we believe that, in the near future, depth cameras will remain a cheap and low-power 3D sensing alternative that is also suitable for mobile devices. Therefore, our contributions build on top of state-of-the-art approaches to achieve advances in three main challenging scenarios, namely mobile mapping, large-scale surface reconstruction and semantic modeling. First, we will describe an effective approach dealing with Simultaneous Localization And Mapping (SLAM) on platforms with limited resources, such as a tablet device. Unlike previous methods, dense reconstruction is achieved by reprojection of RGB-D frames, while local consistency is maintained by deploying relative bundle adjustment principles. We will show quantitative results comparing our technique to the state of the art, as well as detailed reconstructions of various environments ranging from rooms to small apartments. Then, we will address large-scale surface modeling from depth maps exploiting parallel GPU computing. We will develop a real-time camera tracking method based on the popular KinectFusion system and an online surface alignment technique capable of counteracting drift errors and closing small loops. We will show very high quality meshes outperforming existing methods on publicly available datasets, as well as on data recorded with our RGB-D camera even in complete darkness. Finally, we will move to our Semantic Bundle Adjustment framework to effectively combine object detection and SLAM in a unified system. Though the mathematical framework we will describe is not restricted to a particular sensing technology, in the experimental section we will again refer only to RGB-D sensing. We will discuss successful implementations of our algorithm, showing the benefit of joint object detection, camera tracking and environment mapping.
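
A core step in dense RGB-D mapping of the kind described above, back-projecting a depth frame into a 3D point cloud with pinhole intrinsics, can be sketched as follows; the intrinsics and the constant depth frame are placeholder values, not those of any specific sensor or of this thesis.

```python
# Sketch: back-project a depth image to a 3D point cloud with pinhole intrinsics.
import numpy as np

fx, fy, cx, cy = 525.0, 525.0, 319.5, 239.5   # placeholder camera intrinsics
depth_m = np.full((480, 640), 2.0)            # synthetic 2 m depth frame

v, u = np.indices(depth_m.shape)              # pixel rows (v) and columns (u)
z = depth_m
x = (u - cx) * z / fx
y = (v - cy) * z / fy
points = np.stack([x, y, z], axis=-1).reshape(-1, 3)   # one 3D point per pixel
print("cloud size:", points.shape)
```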

Relevance: 60.00%

Publisher:

Abstract:

Temporal hollowing due to temporal muscle atrophy after standard skull base surgery is common. Various techniques have been previously described to correct the disfiguring defect. Most often reconstruction is performed using freehand molded polymethylmethacrylate cement. This method and material are insufficient in terms of aesthetic results and implant characteristics. We herein propose reconstruction of such defects with a polyetheretherketone (PEEK)-based patient-specific implant (PSI) including soft-tissue augmentation to preserve normal facial topography. We describe a patient who presented with a large temporo-orbital hemangioma that had been repaired with polymethylmethacrylate 25 years earlier. Because of a toxic skin atrophy fistula, followed by infection and meningitis, this initial implant had to be removed. The large, disfiguring temporo-orbital defect was reconstructed with a PEEK-based PSI. The lateral orbital wall and the temporal muscle atrophy were augmented with computer-aided design and surface modeling techniques. The operative procedure to implant and adopt the reconstructed PEEK-based PSI was simple, and an excellent cosmetic outcome was achieved. The postoperative clinical course was uneventful over a 5-year follow-up period. Polyetheretherketone-based combined bony and soft contour remodeling is a feasible and effective method for cranioplasty including combined bone and soft-tissue reconstruction of temporo-orbital defects. Manual reconstruction of this cosmetically delicate area carries an exceptional risk of disfiguring results. Augmentation surgery in this anatomic location needs accurate PSIs to achieve satisfactory cosmetic results. The cosmetic outcome achieved in this case is superior compared with previously reported techniques.

Relevance: 60.00%

Publisher:

Abstract:

We developed an anatomical mapping technique to detect hippocampal and ventricular changes in Alzheimer disease (AD). The resulting maps are sensitive to longitudinal changes in brain structure as the disease progresses. An anatomical surface modeling approach was combined with surface-based statistics to visualize the region and rate of atrophy in serial MRI scans and isolate where these changes link with cognitive decline. Fifty-two high-resolution MRI scans were acquired from 12 AD patients (age: 68.4 +/- 1.9 years) and 14 matched controls (age: 71.4 +/- 0.9 years), each scanned twice (2.1 +/- 0.4 years apart). 3D parametric mesh models of the hippocampus and temporal horns were created in sequential scans and averaged across subjects to identify systematic patterns of atrophy. As an index of radial atrophy, 3D distance fields were generated relating each anatomical surface point to a medial curve threading down the medial axis of each structure. Hippocampal atrophic rates and ventricular expansion were assessed statistically using surface-based permutation testing and were faster in AD than in controls. Using color-coded maps and video sequences, these changes were visualized as they progressed anatomically over time. Additional maps localized regions where atrophic changes linked with cognitive decline. Temporal horn expansion maps were more sensitive to AD progression than maps of hippocampal atrophy, but both maps correlated with clinical deterioration. These quantitative, dynamic visualizations of hippocampal atrophy and ventricular expansion rates in aging and AD may provide a promising measure to track AD progression in drug trials. (C) 2004 Elsevier Inc. All rights reserved.
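
The radial-atrophy index described above, the distance from each anatomical surface point to a medial curve, can be sketched generically as a nearest-point distance computation; the tube-like surface and straight medial axis below are synthetic stand-ins, not hippocampal meshes.

```python
# Sketch of the radial-distance index: distance from each surface vertex to the
# nearest sample on a medial curve (synthetic tube surface, straight axis).
import numpy as np

rng = np.random.default_rng(1)
theta = rng.uniform(0, 2 * np.pi, 500)
z = rng.uniform(0, 10, 500)
surface = np.column_stack([np.cos(theta), np.sin(theta), z])     # tube of radius 1

medial_curve = np.column_stack([np.zeros(200), np.zeros(200),
                                np.linspace(0, 10, 200)])        # central axis samples

# Radial distance = distance to the closest medial-curve sample.
diff = surface[:, None, :] - medial_curve[None, :, :]
radial = np.sqrt((diff ** 2).sum(-1)).min(axis=1)
print("mean radial distance:", radial.mean())    # close to 1 for this synthetic tube
```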

Relevance: 60.00%

Publisher:

Abstract:

In addition to enhancing agricultural productivity, synthetic nitrogen (N) and phosphorus (P) fertilizer application in croplands has dramatically altered the global nutrient budget, water quality, greenhouse gas balance, and their feedbacks to the climate system. However, due to the lack of geospatial fertilizer input data, current Earth system/land surface modeling studies have had to ignore or use over-simplified data (e.g., static, spatially uniform fertilizer use) to characterize agricultural N and P input over decadal or century-long periods. We therefore develop a global time-series gridded dataset of annual synthetic N and P fertilizer use rates in croplands, matched with the HYDE 3.2 historical land use maps, at a resolution of 0.5° latitude by 0.5° longitude for 1900-2013. Our data indicate that N and P fertilizer use rates increased by approximately 8 times and 3 times, respectively, since 1961, when the IFA (International Fertilizer Industry Association) and FAO (Food and Agriculture Organization) surveys of country-level fertilizer input became available. Considering cropland expansion, the increase in total fertilizer consumption is even larger. Hotspots of agricultural N fertilizer use shifted from the U.S. and Western Europe in the 1960s to East Asia in the early 21st century. P fertilizer input shows a similar pattern, with an additional hotspot in Brazil. We find a global increase in the fertilizer N/P ratio of 0.8 g N/g P per decade (p < 0.05) during 1961-2013, which may have important implications for human impacts on agroecosystem functions in the long run. Our data can serve as a critical input driver for regional and global assessments of agricultural productivity, crop yield, agriculture-derived greenhouse gas balance, global nutrient budgets, land-to-aquatic nutrient loss, and ecosystem feedbacks to the climate system.
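
The reported trend in the fertilizer N/P ratio (about 0.8 g N per g P per decade over 1961-2013) is the slope of a simple linear regression of the annual ratio on year; the sketch below shows the calculation with made-up annual totals rather than the actual IFA/FAO series.

```python
# Sketch: linear trend of the fertilizer N/P ratio over time (synthetic series).
import numpy as np

years = np.arange(1961, 2014)
n_use = np.linspace(12.0, 110.0, years.size)   # made-up global N use, Tg N per year
p_use = np.linspace(4.5, 18.0, years.size)     # made-up global P use, Tg P per year

ratio = n_use / p_use                          # g N per g P
slope_per_year, intercept = np.polyfit(years, ratio, 1)
print(f"N/P ratio trend: {slope_per_year * 10:.2f} g N / g P per decade")
```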

Relevance: 40.00%

Publisher:

Abstract:

The time evolution of the film thickness and domain formation of octadecylamine molecules adsorbed on a mica surface is investigated using atomic force microscopy. The adsorbed film thickness is determined by measuring the height profile across the mica-amine interface of a mica surface partially immersed in a 15 mM solution of octadecylamine in chloroform. Using this novel procedure, adsorption of amine on mica is found to occur in three distinct stages, with morphologically distinct domain formation and growth occurring during each stage. In the first stage, where adsorption is primarily in the thin-film regime, an average film thickness of 0.2 (+/- 0.3) nm is formed for exposure times below 30 s and 0.8 (+/- 0.2) nm for 60 s of immersion time. During this stage, large sample-spanning domains are observed. The second stage, which occurs between 60 and 300 s, is associated with a regime of rapid film growth, and the film thickness increases from about 0.8 to 25 nm during this stage. Once the thick-film regime is established, further exposure to the amine solution results in an increase in the domain area, and a regime of lateral domain growth is observed. In this stage, the domain area coverage grows from 38 to 75%, and the FTIR spectra reveal an increased level of crystallinity in the film. Using a diffusion-controlled model and a two-step Langmuir isotherm, the time evolution of the film growth is quantitatively captured. The model predicts the time at which the thin-to-thick film transition occurs as well as the time required for complete film growth at longer times. The Ward-Tordai equation is also solved to determine the model parameters in the monolayer (thin-film) regime, which occurs during the initial stages of film growth.
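
The short-time, diffusion-controlled limit of the Ward-Tordai description referred to above is the standard result Gamma(t) = 2 c0 sqrt(D t / pi); a small sketch evaluating it is given below, with an assumed diffusion coefficient (the paper's fitted parameter values are not reproduced here).

```python
# Sketch: diffusion-controlled (short-time Ward-Tordai) surface excess growth.
import numpy as np

c0 = 15e-3 * 1e3          # bulk concentration: 15 mM expressed as mol/m^3
D = 5e-10                 # assumed diffusion coefficient of the amine, m^2/s
t = np.array([10.0, 30.0, 60.0, 300.0])    # immersion times, s

gamma = 2.0 * c0 * np.sqrt(D * t / np.pi)  # surface excess, mol/m^2
for ti, gi in zip(t, gamma):
    print(f"t = {ti:5.0f} s  ->  Gamma = {gi:.2e} mol/m^2")
```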

Relevance: 40.00%

Publisher:

Abstract:

Solar UV radiation is harmful to life on Earth, but fortunately atmospheric oxygen and ozone absorb almost all of the most energetic UVC photons. However, part of the UVB radiation and much of the UVA radiation reaches the surface of the Earth, where it affects human health, the environment and materials, and drives atmospheric and aquatic photochemical processes. In order to quantify these effects and processes there is a need for ground-based UV measurements and radiative transfer modeling to estimate the amounts of UV radiation reaching the biosphere. Satellite measurements, with their near-global spatial coverage and long-term data continuity, offer an attractive option for estimation of surface UV radiation. This work focuses on methods based on radiative transfer theory for estimating the UV radiation reaching the surface of the Earth. The objectives of the thesis were to implement the surface UV algorithm originally developed at NASA Goddard Space Flight Center for estimation of surface UV irradiance from measurements of the Dutch-Finnish built Ozone Monitoring Instrument (OMI), to improve the original surface UV algorithm especially with respect to snow cover, to validate the OMI-derived daily surface UV doses against ground-based measurements, and to demonstrate how the satellite-derived surface UV data can be used to study the effects of UV radiation. The thesis consists of seven original papers and a summary. The summary includes an introduction to the OMI instrument, a review of the methods used for modeling of surface UV using satellite data, and the conclusions of the main results of the original papers. The first two papers describe the algorithm used for estimation of surface UV amounts from OMI measurements as well as the unique Very Fast Delivery processing system developed for processing the OMI data received at the Sodankylä satellite data centre. The third and fourth papers present algorithm improvements related to the surface UV albedo of snow-covered land. The fifth paper presents the results of the comparison of the OMI-derived daily erythemal doses with those calculated from ground-based measurement data; it gives an estimate of the expected accuracy of the OMI-derived surface UV doses for various atmospheric and other conditions, and discusses the causes of the differences between the satellite-derived and ground-based data. The last two papers demonstrate the use of the satellite-derived surface UV data: the sixth paper presents an assessment of photochemical decomposition rates in the aquatic environment, and the seventh paper presents the use of satellite-derived daily surface UV doses for planning outdoor material weathering tests.
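
Erythemally weighted doses like those compared in the fifth paper are obtained by weighting the spectral irradiance with the CIE erythema action spectrum and integrating over wavelength (and time); the sketch below uses a flat placeholder spectrum rather than OMI-derived irradiances, and is not the OMI surface UV algorithm itself.

```python
# Sketch: erythemally weighted UV irradiance using the CIE erythema action
# spectrum (the spectral irradiance values here are placeholders, not OMI data).
import numpy as np

def cie_erythema(wl_nm):
    """CIE standard erythema action spectrum."""
    wl = np.asarray(wl_nm, dtype=float)
    s = np.ones_like(wl)                                            # <= 298 nm
    s = np.where((wl > 298) & (wl <= 328), 10 ** (0.094 * (298 - wl)), s)
    s = np.where((wl > 328) & (wl <= 400), 10 ** (0.015 * (140 - wl)), s)
    return s

wl = np.arange(290.0, 401.0, 1.0)              # wavelengths, nm, 1 nm spacing
spectral_irrad = np.full_like(wl, 0.5)         # placeholder, W m^-2 nm^-1

# Weighted integral over wavelength (rectangle rule, 1 nm steps).
erythemal_irrad = (cie_erythema(wl) * spectral_irrad).sum() * 1.0
print(f"erythemally weighted irradiance: {erythemal_irrad:.3f} W/m^2")
```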

Relevance: 40.00%

Publisher:

Abstract:

The objective of this paper is to investigate and model the characteristics of the prevailing volatility smiles and surfaces on the DAX and ESX index options markets. Continuing the line of Implied Volatility Function research, the Standardized Log-Moneyness model is introduced and fitted to historical data. The model replaces the constant volatility parameter of the Black & Scholes pricing model with a matrix of volatilities with respect to moneyness and maturity, and is tested out of sample. Considering the dynamics, the results support the hypotheses put forward in this study, implying that the smile increases in magnitude when maturity and ATM volatility decrease, and that a change in the underlying asset is negatively correlated, and a change in time to maturity positively correlated, with implied ATM volatility. Further, the Standardized Log-Moneyness model indicates an improvement in pricing accuracy compared to previous Implied Volatility Function models, although the parameters of the models must be re-estimated continuously if they are to fully capture the changing dynamics of the volatility smiles.
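
The core replacement the model makes, swapping the constant Black & Scholes volatility for a volatility looked up by moneyness and maturity, can be sketched as follows; the smile values and the standardisation below are illustrative, not the estimated DAX/ESX surfaces.

```python
# Sketch: Black & Scholes call price with the constant volatility replaced by a
# volatility interpolated from a made-up smile in standardised log-moneyness.
import numpy as np
from scipy.stats import norm

def bs_call(S, K, T, r, sigma):
    d1 = (np.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

S, r, T, atm_vol = 100.0, 0.02, 0.5, 0.20

# Made-up smile: implied vol as a function of standardised log-moneyness.
std_logm_grid = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
smile_vols    = np.array([0.27, 0.23, 0.20, 0.21, 0.24])

K = 90.0
F = S * np.exp(r * T)                                # forward price
std_logm = np.log(K / F) / (atm_vol * np.sqrt(T))    # standardised log-moneyness
sigma_K = np.interp(std_logm, std_logm_grid, smile_vols)
print(f"smile vol = {sigma_K:.3f}, call price = {bs_call(S, K, T, r, sigma_K):.2f}")
```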

Relevance: 40.00%

Publisher:

Abstract:

Plastic-coated paper is shown to possess reflectivity characteristics quite similar to those of the surface of water. This correspondence has been used with a conversion factor to model a sea surface by means of plastic-coated paper. Such a paper model is then suitably illuminated and photographed, yielding physically simulated daylight imagery of the sea surface under controlled conditions. A simple example of sinusoidal surface simulation is described.

Relevance: 40.00%

Publisher:

Abstract:

The present work is based on four static molds using nozzles of different port diameter, port angle, and immersion depth. It has been observed that the meniscus is wavy. The wave amplitude shows a parabolic variation with the nozzle exit velocity. The dimensionless amplitude is found to vary linearly with the Froude number. Vortex formation and bubble entrainment by the wave occurs at the meniscus beyond a critical flow rate, depending upon the nozzle configuration, immersion depth, and the mold aspect ratio.
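
A minimal illustration of the reported scaling (dimensionless wave amplitude varying linearly with the Froude number of the nozzle exit flow) is given below; the exit velocities, the characteristic length and the fitted coefficients are assumptions for the sketch, not the measured values from the four molds.

```python
# Sketch: Froude-number scaling of the meniscus wave amplitude (made-up data).
import numpy as np

g = 9.81                    # gravitational acceleration, m/s^2
d_port = 0.05               # assumed nozzle port diameter as characteristic length, m
u_exit = np.array([0.5, 0.8, 1.1, 1.4, 1.7])      # assumed exit velocities, m/s

froude = u_exit / np.sqrt(g * d_port)             # Fr = U / sqrt(g d)
amp_dimless = 0.02 + 0.05 * froude                # synthetic "measurements"

slope, intercept = np.polyfit(froude, amp_dimless, 1)
print(f"dimensionless amplitude = {intercept:.3f} + {slope:.3f} * Fr")
```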