944 results for 3D model acquisition
Abstract:
This work presents a finite difference technique for simulating three-dimensional free surface flows governed by the Upper-Convected Maxwell (UCM) constitutive equation. A Marker-and-Cell approach is employed to represent the fluid free surface and formulations for calculating the non-Newtonian stress tensor on solid boundaries are developed. The complete free surface stress conditions are employed. The momentum equation is solved by an implicit technique while the UCM constitutive equation is integrated by the explicit Euler method. The resulting equations are solved by the finite difference method on a 3D-staggered grid. By using an exact solution for fully developed flow inside a pipe, validation and convergence results are provided. Numerical results include the simulation of the transient extrudate swell and the comparison between jet buckling of UCM and Newtonian fluids.
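As a rough illustration of the explicit Euler treatment of the constitutive equation, the sketch below integrates the UCM model for a homogeneous simple shear flow. This is not the paper's staggered-grid discretization coupled to a flow solver; the velocity gradient and material parameters are hypothetical test values.

```python
import numpy as np

# Explicit Euler update for the UCM model,
#   lam * (dtau/dt - L tau - tau L^T) + tau = 2 mu D,
# rearranged as dtau/dt = L tau + tau L^T + (2 mu D - tau) / lam.
def ucm_euler_step(tau, L, mu, lam, dt):
    """Advance the extra-stress tensor tau by one explicit Euler step."""
    D = 0.5 * (L + L.T)  # rate-of-strain tensor
    dtau_dt = L @ tau + tau @ L.T + (2.0 * mu * D - tau) / lam
    return tau + dt * dtau_dt

# Hypothetical simple shear flow with unit shear rate.
L = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0],
              [0.0, 0.0, 0.0]])
tau = np.zeros((3, 3))
for _ in range(1000):
    tau = ucm_euler_step(tau, L, mu=1.0, lam=0.5, dt=0.01)
# tau approaches the analytical steady state for UCM simple shear:
# tau_xy = mu * gdot and tau_xx = 2 * lam * mu * gdot**2.
```

Being explicit, the step is only stable for time steps well below the relaxation time, which is one reason the momentum equation in the paper is treated implicitly instead.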
Abstract:
We use two-photon polymerization to fabricate 3D scaffolds with precise control over pore size and shape for studying cell migration in 3D. These scaffolds allow movement of cells in all directions. The fabrication, imaging, and quantitative analysis methods developed here can be used to perform systematic cell studies in 3D.
Abstract:
5-HT(1A) receptor antagonists have been employed to treat depression, but the lack of structural information on this receptor hampers the design of specific and selective ligands. In this study, we performed CoMFA studies on a training set of arylpiperazines (high-affinity 5-HT(1A) receptor ligands); to produce an effective alignment of the data set, a pharmacophore model was generated using Galahad. A statistically significant model was obtained, indicating good internal consistency and predictive ability for untested compounds. The information gathered from our receptor-independent pharmacophore hypothesis is in good agreement with results from independent studies using different approaches. Therefore, this work provides important insights into the chemical and structural basis of the molecular recognition of these compounds. (C) 2010 Elsevier Masson SAS. All rights reserved.
Abstract:
In this paper, we present a 3D face photography system based on a facial expression training dataset composed of both facial range images (3D geometry) and facial texture (2D photographs). The proposed system obtains a 3D geometric representation of a face given as a 2D photograph, which undergoes a series of transformations through the estimated texture and geometry spaces. In the training phase, facial landmarks are obtained by an active shape model (ASM) extracted from the 2D gray-level photographs. Principal component analysis (PCA) is then used to represent the face dataset, defining an orthonormal basis of texture and another of geometry. In the reconstruction phase, the input is a face image to which the ASM is matched. The extracted facial landmarks and the face image are fed through the PCA basis transform, and a 3D version of the 2D input image is built. Experimental tests using a new dataset of 70 facial expressions from ten subjects as a training set show rapidly reconstructed 3D faces that maintain spatial coherence consistent with human perception, corroborating the efficiency and applicability of the proposed system.
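The texture-to-geometry mapping via paired PCA bases can be sketched as below. This is only a schematic stand-in for the system described above (no ASM step, synthetic vectors in place of real range images and textures); the least-squares map between texture and geometry coefficients is an assumption made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical paired training data: flattened texture vectors and
# corresponding flattened range-image (geometry) vectors sharing a
# common low-dimensional latent structure.
n_samples, tex_dim, geo_dim, n_comp = 70, 120, 90, 10
coeffs = rng.normal(size=(n_samples, n_comp))
A_tex = rng.normal(size=(n_comp, tex_dim))   # latent-to-texture map
A_geo = rng.normal(size=(n_comp, geo_dim))   # latent-to-geometry map
textures = coeffs @ A_tex
geometries = coeffs @ A_geo

def pca_basis(X, k):
    """Return (mean, top-k principal directions) of row-wise data X."""
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:k]

mu_t, P_t = pca_basis(textures, n_comp)      # texture basis
mu_g, P_g = pca_basis(geometries, n_comp)    # geometry basis

# Least-squares map from texture coefficients to geometry coefficients.
C_t = (textures - mu_t) @ P_t.T
C_g = (geometries - mu_g) @ P_g.T
M, *_ = np.linalg.lstsq(C_t, C_g, rcond=None)

def reconstruct_geometry(texture):
    """Estimate a 3D geometry vector from a new 2D texture vector."""
    c = (texture - mu_t) @ P_t.T
    return mu_g + (c @ M) @ P_g

est = reconstruct_geometry(textures[0])
```

On this synthetic data the map is exact; with real faces the reconstruction is only as good as the shared structure between the texture and geometry spaces.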
Abstract:
A three-dimensional, time-dependent hydrodynamic and heat transport model of Lake Binaba, a shallow and small dam reservoir in Ghana, has been developed, emphasizing the simulation of its dynamics and thermal structure. Most numerical studies of temperature dynamics in reservoirs are based on one- or two-dimensional models, which are not applicable to reservoirs characterized by complex flow patterns and unsteady heat exchange between the atmosphere and the water surface. Continuity, momentum, and temperature transport equations have been solved. Proper assignment of boundary conditions, especially surface heat fluxes, proved crucial in simulating the lake's hydrothermal dynamics. The model is based on the Reynolds-Averaged Navier-Stokes equations with a Boussinesq approximation and a standard k − ε turbulence closure to solve the flow field. The thermal model includes a heat source term that accounts for short-wave radiation as well as heat convection at the free surface, which is a function of air temperature, wind velocity, and the stability conditions of the atmospheric boundary layer over the water surface. The governing equations have been solved with OpenFOAM, an open-source, freely available CFD toolbox. At its core, OpenFOAM has a set of efficient C++ modules that are used to build solvers. It uses collocated, polyhedral numerics that can be applied on unstructured meshes and can be easily extended to run in parallel. A new solver has been developed to solve the hydrothermal model of the lake. The simulated temperature was compared against a 15-day field data set. Simulated and measured temperature profiles at the probe locations show reasonable agreement. The model may be able to compute the total heat storage of water bodies to estimate evaporation from the water surface.
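To give a flavor of the surface heat flux dependence on air temperature and wind mentioned above, the sketch below uses a standard bulk-aerodynamic formula for the sensible (convective) heat flux. The transfer coefficient is a hypothetical neutral-stability value, not the thesis's calibrated parameterization, which also depends on atmospheric stability.

```python
def convective_heat_flux(t_air, t_surf, wind_speed,
                         rho_air=1.2, cp_air=1005.0, c_h=1.3e-3):
    """Bulk-aerodynamic estimate of the sensible heat flux (W/m^2)
    at the water surface; positive values warm the water.
    rho_air (kg/m^3), cp_air (J/kg/K) are standard air properties;
    c_h is a hypothetical transfer coefficient for neutral stability."""
    return rho_air * cp_air * c_h * wind_speed * (t_air - t_surf)

# Hypothetical conditions: warm air over cooler water, light breeze.
q = convective_heat_flux(t_air=30.0, t_surf=27.0, wind_speed=3.0)
```

In a full model this term enters the free-surface boundary condition together with short-wave radiation and the latent heat flux.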
Abstract:
This paper studies a model of a sequential auction where bidders are allowed to acquire further information about their valuations of the object in the middle of the auction. It is shown that, in any equilibrium where the distribution of the final price is atomless, a bidder's best response has a simple characterization. In particular, the optimal information acquisition point is the same, regardless of the other bidders' actions. This makes it natural to focus on symmetric, undominated equilibria, as in the Vickrey auction. An existence theorem for such a class of equilibria is presented. The paper also presents some results and numerical simulations that compare this sequential auction with the one-shot auction. Sequential auctions typically yield more expected revenue for the seller than their one-shot counterparts. So the possibility of mid-auction information acquisition can provide an explanation for why sequential procedures are more often adopted.
Abstract:
This thesis examines the creation of value in private equity and in particular analyzes value creation in 3G Capital's acquisition of Burger King. A specific model is applied that decomposes value creation into several drivers, in order to answer the question of how value creation can be addressed in private equity investments. Although previous research by Achleitner et al. (2010) introduced a specific model that addresses value creation in private equity, that model was neither applied to an individual company nor linked to indirect drivers that explain the dynamics and rationales behind the creation of value. This paper, in turn, applies the quantitative model to an ongoing private equity investment and thereby provides several extensions to turn it into a better forecasting model for ongoing investments, instead of only analyzing, from an ex post perspective, a deal that has already been divested. The chosen research approach is a case study of the Burger King buyout that includes, first, an extensive review of the current state of the academic literature; second, a quantitative calculation and qualitative interpretation of different direct value drivers; third, a qualitative breakdown of indirect drivers; and lastly, a recapitulating discussion of value creation and value drivers. Presenting a very successful private equity investment and elaborately demonstrating the dynamics and mechanisms that drive value creation in this case provides important implications for other private equity firms, as well as public firms, in developing their own approach to value creation.
Abstract:
My dissertation focuses on dynamic aspects of coordination processes such as the reversibility of early actions, the option to delay decisions, and learning about the environment from the observation of other people's actions. This study proposes the use of tractable dynamic global games in which players privately and passively learn about their actions' true payoffs and are able to adjust early investment decisions as new information arrives, in order to investigate the consequences of liquidity shocks for the performance of a Tobin tax as a policy intended to foster coordination success (chapter 1), and the adequacy of a Tobin tax for reducing an economy's vulnerability to sudden stops (chapter 2). It then analyzes players' incentives to acquire costly information in a sequential decision setting (chapter 3). In chapter 1, a continuum of foreign agents decide whether or not to enter an investment project. A fraction λ of them are hit by liquidity restrictions in a second period and are forced to withdraw early investment or are precluded from investing in the interim period, depending on the actions they chose in the first period. Players not affected by the liquidity shock are able to revise early decisions. Coordination success is increasing in aggregate investment and decreasing in the aggregate volume of capital exit. Without liquidity shocks, aggregate investment is (in a pivotal contingency) invariant to frictions such as a tax on short-term capital. In this case, a Tobin tax always increases the incidence of success. In the presence of liquidity shocks, this invariance result no longer holds in equilibrium. A Tobin tax becomes harmful to aggregate investment, which may reduce the incidence of success if the economy does not benefit enough from avoiding capital reversals. It is shown that the Tobin tax that maximizes the ex-ante probability of successfully coordinated investment is decreasing in the liquidity shock.
Chapter 2 studies the effects of a Tobin tax in the same setting as the global game model of chapter 1, except that the liquidity shock is stochastic, i.e., there is also aggregate uncertainty about the extent of the liquidity restrictions. It identifies conditions under which, in the unique equilibrium of the model with a low probability of liquidity shocks but large dry-ups, a Tobin tax is welfare improving, helping agents to coordinate on the good outcome. The model provides a rationale for a Tobin tax in economies that are prone to sudden stops. The optimal Tobin tax tends to be larger when capital reversals are more harmful and when the fraction of agents hit by liquidity shocks is smaller. Chapter 3 focuses on information acquisition in a sequential decision game with payoff complementarity and information externality. When information is cheap relative to players' incentive to coordinate actions, only the first player chooses to process information; the second player learns about the true payoff distribution from observing the first player's decision and follows her action. Miscoordination requires that both players privately process information, which tends to happen when information is expensive and the prior knowledge about the distribution of payoffs has a large variance.
Abstract:
Humans perceive in three dimensions; our world is three-dimensional, and it is becoming increasingly digital too. We feel the need to capture and preserve our existence in digital form, perhaps due to our own mortality. We also need to reproduce objects, or create small identical copies, in order to prototype, test, or study them. Some objects have been lost over time and are accessible only through old photographs. With robust model generation from photographs, we can draw on one of the largest human data sets and reproduce real-world objects digitally, and physically with printers. What is the current state of development in three-dimensional reconstruction from photographs, both in the commercial world and in the open source world? And what tools are available for developers to build their own reconstruction software? To answer these questions, several pieces of software were tested, from full commercial packages to small open source projects, including libraries aimed at computer vision. To bring the 3D models into the real world, a 3D printer was built, tested, and analyzed, and its problems and weaknesses evaluated. Lastly, a small piece of software with limited capabilities was developed using a computer vision library.
Abstract:
The seismic method is of extreme importance in geophysics. Mainly associated with oil exploration, this line of research concentrates most of the investment in the area. Acquisition, processing, and interpretation of seismic data are the stages that constitute a seismic study. Seismic processing in particular is focused on producing an image that represents the geological structures in the subsurface. It has evolved significantly in recent decades, driven by the demands of the oil industry and by technological advances in hardware that brought greater storage and digital processing capabilities, enabling the development of more sophisticated processing algorithms, such as those that use parallel architectures. One of the most important steps in seismic processing is imaging. Migration of seismic data is one of the techniques used for imaging, with the goal of obtaining a seismic section image that represents the geological structures as accurately and faithfully as possible. The result of migration is a 2D or 3D image in which it is possible to identify faults and salt domes, among other structures of interest, such as potential hydrocarbon reservoirs. However, a high-quality, accurate migration can be very time-consuming, due to the heuristics of the mathematical algorithms and the extensive amount of input and output data involved; it may take days, weeks, or even months of uninterrupted execution on supercomputers, representing large computational and financial costs that can make these methods impractical. Aiming at performance improvement, this work parallelized the core of a Reverse Time Migration (RTM) algorithm using the Open Multi-Processing (OpenMP) parallel programming model, given the large computational effort required by this migration technique.
Furthermore, analyses such as speedup and efficiency were performed and, ultimately, the degree of algorithmic scalability was identified with respect to the technological advances expected in future processors.
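The speedup and efficiency metrics mentioned above have the standard definitions S(p) = T1/Tp and E(p) = S(p)/p, which can be computed as follows (the wall-clock timings are hypothetical):

```python
def speedup(t_serial, t_parallel):
    """Classical speedup S(p) = T1 / Tp."""
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, n_threads):
    """Parallel efficiency E(p) = S(p) / p."""
    return speedup(t_serial, t_parallel) / n_threads

# Hypothetical wall-clock times for an RTM kernel on 1 and 8 threads.
s = speedup(3600.0, 600.0)        # S(8) = 6.0
e = efficiency(3600.0, 600.0, 8)  # E(8) = 0.75
```

An efficiency below 1 reflects serial fractions and overheads (Amdahl's law), which is what a scalability analysis of the kind described above quantifies.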
Abstract:
A contaminated site downstream of a municipal solid waste disposal site in Brazil was investigated using a 3D resistivity and induced polarization (IP) imaging technique. The purpose of this investigation was to detect and delineate the contamination plume produced by the waste. The area was selected based on previous geophysical investigations and chemical analyses carried out on site, which indicated the presence of a contamination plume in the area. The resistivity model successfully imaged the presence of waste (rho < 20 Omega m), the water table depth, and the groundwater flow direction. A conductive anomaly (rho < 20 Omega m) outside the waste placement was interpreted as a contamination plume. The chargeability model was also able to image the presence of waste (m > 31 mV/V), the water table depth, and the groundwater flow direction. A higher-chargeability zone (m > 31 mV/V) outside the waste placement, following the conductive anomaly, was interpreted as a contamination plume. Normalized chargeability (MN = m/rho) confirmed the polarizable zone, which could be an effect of an increase in salinity (contamination plume) and the presence of clay in the environment.
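A minimal sketch of how the normalized chargeability and the plume-signature thresholds quoted above could be applied to inverted model values; the cell values are hypothetical:

```python
import numpy as np

# Hypothetical inverted model values at a few cells.
resistivity = np.array([12.0, 150.0, 8.0, 60.0])   # ohm.m
chargeability = np.array([40.0, 10.0, 35.0, 5.0])  # mV/V

# Normalized chargeability MN = m / rho, as used in the study.
mn = chargeability / resistivity

# Flag cells matching the interpreted plume signature
# (rho < 20 ohm.m and m > 31 mV/V).
plume = (resistivity < 20.0) & (chargeability > 31.0)
```

Normalizing chargeability by resistivity helps separate genuine polarization effects from purely conductive ones, which is why the study uses it to confirm the polarizable zone.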
Abstract:
The continuing development of new materials makes systems lighter and stronger, permitting more complex systems that provide more functionality and flexibility, which in turn demands more effective evaluation of their structural health. Smart material technology has become an area of increasing interest in this field. The combination of smart materials and artificial neural networks is an excellent tool for pattern recognition, making it well suited for monitoring and fault classification of equipment and structures. In order to identify a fault, the neural network must be trained using a set of solutions to the corresponding forward variational problem. After the training process, the network can successfully solve the inverse variational problem in the context of monitoring and fault detection because of its pattern recognition and interpolation capabilities. The structural frequency response function (FRF) is a fundamental element of structural dynamic analysis, and it can be extracted from measured electric impedance through the electromechanical interaction of a piezoceramic and a structure. In this paper we use the FRF obtained from a mathematical model (FEM) to generate the training data for the neural networks; damage can then be identified by measuring electric impedance, since suitable data normalization correlates the FRF and the electrical impedance.
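To illustrate the kind of FRF signature such a network is trained to recognize, the sketch below computes the receptance FRF of a single-degree-of-freedom model and shows how a stiffness loss shifts the resonance peak. All parameter values are hypothetical and stand in for the paper's FEM-generated training data.

```python
import numpy as np

def frf(omega, m, c, k):
    """Receptance FRF H(w) = 1 / (k - m w^2 + i c w) of a 1-DOF model."""
    return 1.0 / (k - m * omega**2 + 1j * c * omega)

omega = np.linspace(1.0, 100.0, 2000)           # rad/s
healthy = np.abs(frf(omega, m=1.0, c=0.5, k=2500.0))  # wn = 50 rad/s
damaged = np.abs(frf(omega, m=1.0, c=0.5, k=2025.0))  # 19% stiffness loss

# Damage appears as a shift of the resonance peak toward lower
# frequency: the kind of pattern a trained network can classify.
peak_healthy = omega[np.argmax(healthy)]
peak_damaged = omega[np.argmax(damaged)]
```

In the paper's setting the FRF features come from measured electric impedance rather than an analytical model, but the classification target, changes in the FRF pattern, is the same.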
Abstract:
PURPOSE: To assess the acquisition of suture skills by novice medical students training on an ethylene-vinyl acetate bench model. METHODS: Sixteen medical students without previous surgical experience (novices) were randomly divided into two groups. For one hour, group A practiced sutures on an ethylene-vinyl acetate (EVA) bench model with instructor feedback, while group B (control) received faculty-directed training based on books and instructional videos. All students underwent both pre- and post-tests performing two- and three-dimensional sutures on ox tongue. All recorded performances were evaluated by two blinded evaluators using the Global Rating Scale. RESULTS: Although both groups performed better (p<0.05) in the post-test than in the pre-test, analysis of the post-test showed that group A (EVA) outperformed (p<0.05) group B (control). CONCLUSION: The ethylene-vinyl acetate bench model allowed novice students to acquire suture skills faster than the traditional model of teaching.
Abstract:
In the Fazenda Belém oil field (Potiguar Basin, Ceará State, Brazil), sinkholes and sudden terrain collapses frequently occur, associated with an unconsolidated sedimentary cap covering the Jandaíra karst. This research was carried out in order to understand the mechanisms that generate these collapses. The main tool used was Ground Penetrating Radar (GPR). The work develops along two lines: one concerns methodological improvements in GPR data processing, while the other concerns the geological study of the Jandaíra karst. The second line was strongly supported both by the analysis of outcropping karst structures (in other regions of the Potiguar Basin) and by the interpretation of radargrams of the subsurface karst at Fazenda Belém. An adequate flow for processing GPR data was designed and tested, adapted from a standard seismic processing flow. The changes were introduced to take into account important differences between GPR and reflection seismic methods, in particular: poor coupling between source and ground, the mixed phase of the wavelet, low signal-to-noise ratio, single-channel acquisition, and the strong influence of wave propagation effects, notably dispersion. High-frequency components of the GPR pulse suffer more pronounced attenuation than low-frequency components, resulting in resolution losses in the radargrams. At Fazenda Belém, a suitable GPR processing flow is especially needed because of both the very high level of air-wave events and the complexity of the imaged subsurface karst structures. The key point of the processing flow was an improved correction of the attenuation effects on the GPR pulse, based on their influence on the amplitude and phase spectra of the GPR signals.
In dielectric media with low to moderate losses, the propagated signal suffers significant changes only in its amplitude spectrum; that is, the phase spectrum of the propagated signal remains practically unaltered over the usual travel time ranges. Based on this fact, it is shown using real data that the judicious application of the well-known tools of time gain and spectral balancing can efficiently correct the attenuation effects. The proposed approach can be applied in heterogeneous media and does not require precise knowledge of the attenuation parameters of the media. As an additional benefit, the judicious application of spectral balancing promotes a partial deconvolution of the data without changing its phase. In other words, spectral balancing acts in a way similar to a zero-phase deconvolution. In GPR data, the resolution increase obtained with spectral balancing is greater than that obtained with spike or predictive deconvolution. The evolution of the Jandaíra karst in the Potiguar Basin is associated with at least three events of subaerial exposure of the carbonate platform, during the Turonian, Santonian, and Campanian. In the Fazenda Belém region, during the mid-Miocene, the Jandaíra karst was covered by continental siliciclastic sediments. These sediments partially filled the void space associated with the dissolution structures and fractures. Therefore, the development of the karst in this region was attenuated in comparison with other places in the Potiguar Basin where the karst is exposed.
At Fazenda Belém, the generation of sinkholes and terrain collapses is controlled mainly by: (i) the presence of an unconsolidated sedimentary cap thick enough to cover the karst completely but with a sediment volume lower than the available space associated with the dissolution structures in the karst; (ii) the existence of important SW-NE and NW-SE structural alignments, which promote a localized increase in hydraulic connectivity, allowing the channeling of groundwater and thus facilitating carbonate dissolution; and (iii) the existence of a hydraulic barrier to groundwater flow, associated with the Açu-4 Unit. The terrain collapse mechanism at Fazenda Belém follows this temporal evolution: meteoric water infiltrates through the unconsolidated sedimentary cap and promotes its remobilization into the void space associated with the dissolution structures in the Jandaíra Formation. This remobilization begins at the base of the sedimentary cap, where the flow becomes more abrasive due to a change from laminar to turbulent regime as the groundwater flow reaches the open karst structures. The remobilized sediments progressively fill the void karst space from bottom to top, so the void space continuously migrates upwards, ultimately reaching the surface and causing the sudden terrain collapses that are observed. This phenomenon is particularly active during the rainy season, when the water table, normally located in the karst, may temporarily lie in the unconsolidated sedimentary cap.
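The time-gain and spectral-balancing steps discussed in this abstract can be sketched on a synthetic single trace as follows; the trace, gain constant, and whitening level are hypothetical illustrations, not the thesis's actual processing parameters.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical GPR trace: a shallow and an attenuated deep reflection
# plus weak noise.
n, dt = 512, 0.5e-9                  # samples, sample interval (s)
t = np.arange(n) * dt
trace = np.zeros(n)
trace[100] = 1.0                     # shallow reflector
trace[300] = 0.05                    # attenuated deep reflector
trace += 1e-4 * rng.normal(size=n)

# 1) Time gain: multiply by exp(alpha * t) to compensate amplitude decay.
alpha = 1.0e7                        # hypothetical gain constant (1/s)
gained = trace * np.exp(alpha * t)

# 2) Spectral balancing: flatten the amplitude spectrum while keeping
# the phase untouched, acting like a zero-phase deconvolution.
spec = np.fft.rfft(gained)
eps = 1e-3 * np.abs(spec).max()      # stabilization against noise blow-up
balanced = np.fft.irfft(spec / (np.abs(spec) + eps), n=n)
```

Because only the amplitude spectrum is modified, the phase (and hence event timing) is preserved, which is the property the processing flow exploits.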
Abstract:
The aim of this work was to describe the methodological procedures required to develop 3D digital imaging of the external and internal geometry of outcrop analogues of reservoirs and to build a Virtual Outcrop Model (VOM). Imaging of the external geometry was acquired using laser scanner, geodesic GPS, and total station procedures, while the internal geometry was imaged by GPR (Ground Penetrating Radar). The resulting VOMs were then enriched with much more detailed information through the addition of geological data and gamma-ray and permeability profiles. To demonstrate the methodological procedures used in this work, two outcrops located in the eastern part of the Parnaíba Basin were selected for adapted VOMs. The first exposes rocks of the aeolian deposits of the Piauí Formation (Neo-Carboniferous) and tidal flat deposits of the Pedra de Fogo Formation (Permian), which appear in large outcrops located between Floriano and Teresina (Piauí). The second area, located in the Sete Cidades National Park, also in Piauí, exposes rocks of the Cabeças Formation, deposited in fluvial-deltaic systems during the Late Devonian. From the data of the adapted VOMs it was possible to identify lines, surfaces, and 3D geometries and, therefore, to quantify the geometries of interest. Among the parameterization values obtained, the most relevant was a table of the thicknesses and widths measured in channel and lobe deposits at the Paredão and Biblioteca outcrops. This table can in fact be used as input for stochastic simulation of reservoirs. An example of the direct use of such a table and the associated radargrams was the identification of bounding surfaces in the aeolian sites of the Piauí Formation.
Although radargrams supply only two-dimensional data, lines acquired along a mesh of profiles were used to add a third dimension to the imaging of the internal geometry. This approach appears to be valid for all the studied outcrops. In conclusion, the tool presented here can become a new methodology in which the advantages of digital imaging acquired with the laser scanner (precision, accuracy, and speed of acquisition) are combined with the total station procedure (precision) and the classical digital photomosaic technique.