966 results for 3D Modeling
Abstract:
At the beginning of the 21st century, geology is moving toward new approaches that demand the ability to work with diverse information and new tools. Within this context, analog characterization plays an important role in predicting and understanding lateral changes in geometry and facies distribution. In the present work, a methodology was developed for integrating geological and geophysical data from recent transitional deposits, for modeling petroleum reservoirs, and for calculating volumes and the uncertainties associated with them. For this purpose, planialtimetric and geophysical (Ground Penetrating Radar) surveys were carried out in three areas of the Parnaíba River. With this information, it was possible to visualize the overlap of different estuarine channels and to delimit the channel geometry (width and thickness). Two of the main reservoir-modeling software packages were used for three-dimensional visualization and modeling. These studies were performed with the collected parameters and data from two reservoirs. The first was built with Potiguar Basin well data available in the literature, corresponding to the Açu IV unit. In the second case, a real database from the North Sea was used. Different workflows were created in the reservoir-modeling procedures, generating five study cases with their respective volume calculations. Afterwards, an analysis was carried out to quantify the uncertainties in the geological modeling and their influence on the volume. This analysis was oriented toward testing the seed generation and the use of analog data in model construction.
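The volumetric uncertainty analysis described in this abstract is commonly implemented as a Monte Carlo sweep over reservoir parameters, with a fixed random seed making each realization reproducible. A minimal sketch follows; the parameter ranges and the simple percentile summary are hypothetical illustrations, not the thesis data:

```python
import random

def hydrocarbon_pore_volume(area_m2, thickness_m, ntg, porosity, sw):
    # HCPV = gross rock volume * net-to-gross * porosity * (1 - water saturation)
    return area_m2 * thickness_m * ntg * porosity * (1.0 - sw)

def monte_carlo_volume(n=10000, seed=42):
    rng = random.Random(seed)  # fixing the seed makes runs reproducible
    volumes = []
    for _ in range(n):
        v = hydrocarbon_pore_volume(
            area_m2=rng.uniform(1e6, 2e6),   # hypothetical parameter ranges
            thickness_m=rng.uniform(10, 30),
            ntg=rng.uniform(0.5, 0.9),
            porosity=rng.uniform(0.15, 0.25),
            sw=rng.uniform(0.2, 0.4),
        )
        volumes.append(v)
    volumes.sort()
    # P10/P50/P90 percentiles summarize the volumetric uncertainty
    return {p: volumes[int(n * p / 100)] for p in (10, 50, 90)}

stats = monte_carlo_volume()
print(stats[10] < stats[50] < stats[90])
```

Re-running with a different seed probes exactly the sensitivity the abstract mentions: how much of the spread in volume comes from the stochastic realization rather than from the data.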
Abstract:
Background: Cysticercosis and hydatidosis seriously affect human health and are responsible for considerable economic losses in animal husbandry in developing and developed countries. S3Pvac and EG95 are the only field-trial-tested vaccine candidates against cysticercosis and hydatidosis, respectively. S3Pvac is composed of three peptides (KETc1, GK1 and KETc12), originally identified in a Taenia crassiceps cDNA library. S3Pvac, whether synthetically or recombinantly expressed, is effective against experimentally and naturally acquired cysticercosis. Methodology/Principal Findings: In this study, the homologous sequences of two of the S3Pvac peptides, GK1 and KETc1, were identified and further characterized in Taenia crassiceps WFU, Taenia solium, Taenia saginata, Echinococcus granulosus and Echinococcus multilocularis. Comparisons of the nucleotide and amino acid sequences coding for KETc1 and GK1 revealed significant homologies across these species. The predicted secondary structure of GK1 is almost identical between the species, while 3D modeling revealed some differences in the C-terminal region of KETc1. A KETc1 variant with a deletion of three C-terminal amino acids protected against experimental murine cysticercosis to the same extent as the entire peptide. On the contrary, immunization with the truncated GK1 failed to induce protection. Immunolocalization studies revealed the non-stage-specificity of the two S3Pvac epitopes and their persistence in the larval tegument of all species and in adult Taenia tapeworms. Conclusions/Significance: These results indicate that GK1 and KETc1 may be considered candidates for inclusion in the formulation of a multivalent and multistage vaccine against these cestodiases because of their enhancing effects on other available vaccine candidates.
Abstract:
Objectives: The clinical translation of stem cell-based Regenerative Endodontics demands further development of suitable injectable scaffolds. Puramatrix™ is a defined, self-assembling peptide hydrogel that polymerizes instantaneously under normal physiological conditions. Here, we assessed the compatibility of Puramatrix™ with dental pulp stem cell (DPSC) growth and differentiation. Methods: DPSCs were grown in 0.05-0.25% Puramatrix™. Cell viability was measured colorimetrically using the WST-1 assay. Cell morphology was observed by 3D modeling using confocal microscopy. In addition, we used the human tooth slice model with Puramatrix™ to verify DPSC differentiation into odontoblast-like cells, as measured by expression of DSPP and DMP-1. Results: DPSCs survived and proliferated in Puramatrix™ for at least three weeks in culture. Confocal microscopy revealed that cells seeded in Puramatrix™ presented morphological features of healthy cells, and some cells exhibited cytoplasmic elongations. Notably, after 21 days in tooth slices containing Puramatrix™, DPSCs expressed DMP-1 and DSPP, putative markers of odontoblastic differentiation. Significance: Collectively, these data suggest that self-assembling peptide hydrogels might be useful injectable scaffolds for stem cell-based Regenerative Endodontics. © 2012 Academy of Dental Materials.
Abstract:
Geophysical methods are widely used in mineral exploration. This paper discusses the results of geological and geophysical studies of supergene manganese deposits in southern Brazil. Mineralized zones described in geological surveys were characterized by low resistivity (20 Ω·m) and high chargeability (30 ms), a pattern also found in oxide and sulfide mineral deposits. Pseudo-3D modeling of the geophysical data allowed mapping at several depths. The association of high chargeability with low resistivity may define a pattern for high-grade gonditic manganese ore. Large areas of high chargeability and high resistivity may result from the accumulation of manganese and iron hydroxides, due to weathering of the gonditic ore, followed by dissolution, percolation and precipitation.
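The two geoelectrical signatures described above reduce to a simple per-cell classification rule. The sketch below uses the thresholds reported in the abstract (about 20 Ω·m and 30 ms); the label names are illustrative, not the authors' terminology:

```python
def classify_zone(resistivity_ohm_m, chargeability_ms):
    """Classify a survey cell by resistivity/chargeability pattern.

    Thresholds (~20 ohm·m, ~30 ms) follow the abstract; the exact cutoffs
    and labels are assumptions for illustration.
    """
    low_res = resistivity_ohm_m <= 20.0
    high_chg = chargeability_ms >= 30.0
    if low_res and high_chg:
        return "high-grade gonditic Mn ore (candidate)"
    if high_chg:  # high chargeability with high resistivity
        return "Mn/Fe hydroxide accumulation (candidate)"
    return "background"

print(classify_zone(15.0, 35.0))
print(classify_zone(150.0, 35.0))
```

Applied cell by cell to the pseudo-3D inversion grid, such a rule produces the depth-slice maps the paper refers to.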
Abstract:
Mass reduction combined with mechanical performance in service has been the goal of many projects in the transportation area, given the advantages that mass reduction can bring. However, a simple material substitution, without designing a new geometry that contributes to the best component performance, often makes the replacement unviable. This study investigated the advantages of replacing the front suspension lower arm of the BAJA SAE prototype of Equipe Piratas do Vale (Universidade Paulista, Campus Guaratinguetá), currently produced in steel, with a new component made of carbon fiber composite. The new geometry was developed to provide the best possible performance for this component as well as easy manufacturing. The study used 3D modeling tools and computer simulations via the finite element method. The first stage of this work consisted of calculating the estimated maximum tire/ground contact force for a prototype landing after a jump from one meter high, a laboratory drop test with the current vehicle, 3D modeling of the current front suspension lower arm, finite element simulation, and analysis of the critical regions. After the analysis of the current component, a new geometry for the part under study was designed and simulated in order to reduce the component mass and provide a technological innovation using composite materials. With this work, a theoretical component mass reduction of 25.15% was obtained while maintaining the mechanical strength necessary for appropriate component performance under load.
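A common first estimate for the tire/ground contact force after a one-meter drop uses an energy balance: the potential energy m·g·h is absorbed over the combined tire/suspension deflection. The sketch below illustrates that estimate; the mass and deflection values are hypothetical, not taken from the thesis:

```python
def impact_force_estimate(mass_kg, drop_height_m, deceleration_dist_m, g=9.81):
    """First-order average contact force from an energy balance.

    Assumes all potential energy (m*g*h) is absorbed as work over the
    deceleration distance d: m*g*h = F_avg * d  ->  F_avg = m*g*h / d.
    Ignores tire/suspension nonlinearity and dynamic amplification.
    """
    return mass_kg * g * drop_height_m / deceleration_dist_m

# Hypothetical example: 230 kg prototype, 1 m drop, 0.15 m combined travel
f = impact_force_estimate(230.0, 1.0, 0.15)
print(round(f))  # average contact force in newtons
```

An estimate like this typically serves as the load input for the finite element simulation, with the laboratory drop test used to calibrate the assumed deflection.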
Abstract:
This work addresses the design and analysis of a thrust frame system for a liquid-fuel rocket engine. The project was developed following the design requirements established by the Division of Space Propulsion of the Institute of Aeronautics and Space. The layout of the structure was developed with the aid of 3D modeling software, and static and dynamic analyses were performed using a finite element package. The results of the analyses helped define the layout of the structure, which met all design requirements. The safety factor and the mass achieved were comfortably low, which may be useful in the future, since the liquid-fuel rocket engine is still in development.
Abstract:
The finite element method is of great importance for the development and analysis of a new product, whether still being designed or already on the market, that is subject to some specific demand or special application. The tower crane, an essential piece of equipment in modern construction for increasing productivity and safety on construction sites, is required for many types of special applications, day after day, in many kinds of work. This growing need to handle special projects for the tower crane highlights the importance of developing and improving knowledge of more accurate and practical calculation methods, such as the finite element method, for greater agility and precision in responding to a new project. The tower crane is characterized by the maximum load moment at which it can operate with a given amount of load. The tower crane analyzed in this work, for example, has a rated capacity of 85 metric tons, based on the basic dimensional data of a physical tower crane of the crane company Liebherr in Guaratinguetá. Thus, the project analysis begins with the three-dimensional representation of the crane lines in AutoCAD software, conversion of this model to a format accepted by ANSYS Workbench, and completion of the 3D modeling of the structural components in the Design module of the ANSYS software. After the structural modeling is completed, the simulation is performed in the static simulation mode of ANSYS Workbench. The standards adopted are DIN (Deutsches Institut für Normung) and EN 14439 (Europäische Norm 14439), along with some NRs (Brazilian regulatory standards) related to the specific safety class of tower cranes, which will be referenced throughout the work.
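The defining quantity mentioned above, the maximum load moment, is simply load times working radius. A minimal sketch of the capacity check follows; interpreting the crane's "85 metric tons" rating as a load moment in tonne-meters is an assumption made here for illustration only:

```python
def within_moment_capacity(load_t, radius_m, capacity_tm=85.0):
    """Check a (load, radius) pair against a maximum load moment.

    The 85 figure echoes the abstract; reading it as tonne-meters is an
    assumption for this sketch, not a statement about the actual crane.
    """
    return load_t * radius_m <= capacity_tm

print(within_moment_capacity(5.0, 15.0))  # 75 t·m -> True
print(within_moment_capacity(5.0, 20.0))  # 100 t·m -> False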
Abstract:
Producing building budgets quickly and accurately is a challenge faced by companies in the sector. The cost estimation process starts from the quantity takeoff, and this quantification has historically been performed through analysis of the project, the scope of work, and project information contained in 2D drawings, text files and spreadsheets. This method is, in many cases, flawed, influencing management decision-making, since it is closely coupled to time and cost management. In this scenario, this work presents a critical analysis of the conventional quantity takeoff process, based on quantification from 2D drawings, contrasting it with the use of the software Autodesk Revit 2016, which applies building information modeling concepts to automate quantity takeoff from the 3D model of the construction. It is noted that the 3D modeling process should be aligned with the goals of the budgeting. The use of BIM programs provides several benefits compared with the traditional quantity takeoff process, representing gains in productivity, transparency and assertiveness.
Abstract:
Among the experimental methods commonly used to characterize the behaviour of a full-scale system, dynamic tests are the most complete and efficient procedures. A dynamic test is an experimental process that defines a set of characteristic parameters of the dynamic behaviour of the system, such as the natural frequencies of the structure, the mode shapes and the corresponding modal damping values. An assessment of these modal characteristics can be used both to verify the theoretical assumptions of the project and to monitor the performance of the structural system during its operational use. The thesis is structured in the following chapters. The first, introductory chapter recalls some basic notions of structural dynamics, focusing the discussion on systems with multiple degrees of freedom (MDOF), which can represent a generic real system under study when it is excited by a harmonic force or in free vibration. The second chapter is entirely centred on the dynamic identification of a structure subjected to an experimental test in forced vibration. It first describes the construction of the frequency response function (FRF) through the classical FFT of the recorded signal. A different method, also in the frequency domain, is subsequently introduced; it allows the FRF to be computed accurately using the geometric characteristics of the ellipse that represents the direct input-output comparison. The two methods are compared, and attention is then focused on some advantages of the proposed methodology. The third chapter focuses on the study of real structures subjected to experimental tests in which the force is not known, as in ambient or impact tests. In this analysis we decided to use the continuous wavelet transform (CWT), which allows a simultaneous investigation of a generic signal x(t) in the time and frequency domains. The CWT is first introduced to process free oscillations, with excellent results in terms of frequencies, damping values and vibration modes.
Its application in the case of ambient vibrations yields accurate modal parameters of the system, although some important observations should be made regarding the damping. The fourth chapter again addresses the post-processing of data acquired after a vibration test, this time through the application of the discrete wavelet transform (DWT). In the first part, the results obtained by the DWT are compared with those obtained by the application of the CWT. Particular attention is given to the use of the DWT as a tool for filtering the recorded signal; in fact, in the case of ambient vibrations, the signals are often affected by a significant level of noise. The fifth chapter focuses on another important aspect of the identification process: model updating. In this chapter, starting from the modal parameters obtained from environmental vibration tests performed on the Humber Bridge in England by the University of Porto in 2008 and the University of Sheffield, a finite element model of the bridge is built, in order to establish which type of model captures the real dynamic behaviour of the bridge most accurately. The sixth chapter draws the conclusions of the presented research. They concern the application of a frequency-domain method to evaluate the modal parameters of a structure and its advantages, the advantages of applying a procedure based on wavelet transforms in the identification process for tests with unknown input, and finally the problem of 3D modeling of systems with many degrees of freedom and with different types of uncertainty.
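The free-decay processing described for chapter three can be illustrated with the most elementary identification tool, the logarithmic decrement, on a synthetic single-DOF record (the thesis's wavelet machinery is a refinement of this idea; the signal parameters below are invented for the sketch):

```python
import math

def synth_free_decay(f_n, zeta, dt, n):
    """Synthetic SDOF free vibration x(t) = exp(-zeta*w_n*t) * cos(w_d*t)."""
    w_n = 2 * math.pi * f_n
    w_d = w_n * math.sqrt(1 - zeta ** 2)  # damped natural frequency
    return [math.exp(-zeta * w_n * i * dt) * math.cos(w_d * i * dt)
            for i in range(n)]

def log_decrement(x, dt):
    """Estimate (frequency, damping ratio) from successive positive peaks."""
    peaks = [i for i in range(1, len(x) - 1)
             if x[i] > x[i - 1] and x[i] > x[i + 1] and x[i] > 0]
    # average peak spacing gives the damped period
    period = (peaks[-1] - peaks[0]) * dt / (len(peaks) - 1)
    delta = math.log(x[peaks[0]] / x[peaks[1]])  # logarithmic decrement
    zeta = delta / math.sqrt(4 * math.pi ** 2 + delta ** 2)
    return 1.0 / period, zeta

x = synth_free_decay(f_n=2.0, zeta=0.02, dt=0.005, n=2000)
f_est, z_est = log_decrement(x, 0.005)
print(round(f_est, 2), round(z_est, 3))
```

On noisy ambient-vibration records this naive peak-picking breaks down, which is precisely the motivation for the CWT/DWT approach: the wavelet transform isolates each mode in time-frequency before the decay envelope is fitted.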
Abstract:
“Cartographic heritage” is different from “cartographic history”. The second term refers to the study of the development of surveying and drawing techniques related to maps through time, i.e. through the different types of cultural environment that formed the background for the creation of maps. The first term concerns the whole body of ancient maps, together with these different cultural environments, which history has handed down to us and which we perceive as cultural values to be preserved and made available to many users (the public, institutions, experts). Unfortunately, ancient maps often suffer preservation problems of their analog support, mostly due to aging. Today, metric recovery in digital form and digital processing of historical cartography make it possible to preserve the map heritage. Moreover, modern geomatic techniques give us new chances of using historical information, which would be unachievable on analog supports. In this PhD thesis, the whole digital workflow for the recovery and elaboration of ancient cartography is reported, with a special emphasis on the use of digital tools in the preservation and elaboration of cartographic heritage.
It is possible to divide the workflow into three main steps, which reflect the chapter structure of the thesis itself:
• map acquisition: conversion of the ancient map support from analog to digital, by means of high-resolution scanning or 3D surveying (digital photogrammetry or laser scanning techniques); this process must be performed carefully, with special instruments, in order to reduce deformation as much as possible;
• map georeferencing: reproducing in the digital image the native metric content of the map, or even improving it, by selecting a large number of still-existing ground control points; in this way it is possible to understand the projection features of the historical map, as well as to evaluate and represent the degree of deformation induced by the old type of cartographic transformation (which may be unknown to us), by surveying errors or by support deformation, usually all errors of too high a value with respect to our standards;
• data elaboration and management in a digital environment, by means of modern software tools: vectorization, giving the map a new and more attractive graphic appearance (for instance, by creating a 3D model), superimposing it on current base maps, comparing it with other maps, and finally inserting it in a GIS or WebGIS environment as a specific layer.
The study is supported by several case histories, each of them interesting from the point of view of at least one digital cartographic elaboration step.
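The georeferencing step amounts to estimating a map-to-world transformation from ground control points. The simplest variant is a least-squares affine fit, sketched below on synthetic points; historical maps with unknown projections often need higher-order polynomial or projective models instead:

```python
import numpy as np

def fit_affine(src_pts, dst_pts):
    """Least-squares affine transform from map to world coordinates.

    src_pts, dst_pts: matched (x, y) ground control point pairs (>= 3).
    Returns the 2x3 matrix [[a, b, tx], [c, d, ty]].
    """
    src = np.asarray(src_pts, dtype=float)
    dst = np.asarray(dst_pts, dtype=float)
    # design matrix: each control point contributes a row [x, y, 1]
    A = np.hstack([src, np.ones((len(src), 1))])
    coeffs, _, _, _ = np.linalg.lstsq(A, dst, rcond=None)
    return coeffs.T  # shape (2, 3)

def apply_affine(M, pt):
    x, y = pt
    return (M[0, 0] * x + M[0, 1] * y + M[0, 2],
            M[1, 0] * x + M[1, 1] * y + M[1, 2])

# synthetic control points: a pure translation by (100, 50)
src = [(0, 0), (1, 0), (0, 1), (1, 1)]
dst = [(100, 50), (101, 50), (100, 51), (101, 51)]
M = fit_affine(src, dst)
print(apply_affine(M, (0.5, 0.5)))
```

The residuals of such a fit at the control points are exactly the "degree of deformation" the step above seeks to evaluate and represent.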
The ancient maps taken into account are the following:
• three maps of the Po river delta, made at the end of the 16th century by a famous land surveyor, Ottavio Fabri (he is the sole author of the first map, co-author with Gerolamo Pontara of the second, and co-author with Bonajuto Lorini and others of the third), who wrote a methodological textbook in which he explains a new topographical instrument, the squadra mobile (mobile square), invented and used by himself; today all three maps are preserved in the State Archive of Venice;
• the Ichnoscenografia of Bologna by Filippo de’ Gnudi, made in 1702 and today preserved in the Archiginnasio Library of Bologna; it is a scenographic view of the city, captured in a bird’s-eye flight, but also with an ichnographic value, as the author himself declares;
• the map of Bologna by the periti (surveyors) Gregorio Monari and Antonio Laghi, the first map of the city derived from a systematic survey, even though it was made only ten years (1711–1712) after the map by de’ Gnudi; in this map the scenographic view was abandoned in favor of a more correct representation by means of orthogonal projection; today the map is preserved in the State Archive of Bologna;
• the Gregorian Cadastre of Bologna, made in 1831 and updated until 1927, now preserved in the State Archive of Bologna; it is composed of 140 maps and 12 brogliardi (register volumes).
In particular, the three maps of the Po river delta and the Cadastre were studied with respect to their acquisition procedure. Moreover, the Po delta maps were analyzed from the georeferencing point of view, and the Cadastre was analyzed with respect to a possible GIS insertion. Finally, the Ichnoscenografia was used to illustrate a possible application of digital elaboration, namely 3D modeling.
Last but not least, we must not forget that the study of an ancient map should start, whenever possible, from the consultation of the precious original analog document; analysis by means of current digital techniques then opens up new research opportunities in a rich and modern multidisciplinary context.
Abstract:
The assessment of the safety of existing bridges and viaducts led the Ministry of Public Works of the Netherlands to finance a specific campaign aimed at studying the response of the elements of these infrastructures. This activity therefore focuses on the investigation of the behaviour of reinforced concrete slabs under concentrated loads, adopting finite element modeling and comparison with experimental results. These elements are characterized by shear behaviour and shear failure, whose modeling is, from a computational point of view, a hard challenge, due to the brittle behaviour combined with three-dimensional effects. The numerical modeling of the failure is studied through Sequentially Linear Analysis (SLA), an alternative finite element method with respect to traditional incremental and iterative approaches. The comparison between the two numerical techniques, carried out on one of the experimental tests executed on reinforced concrete slabs, represents one of the first such works and comparisons in a three-dimensional setting. The advantage of SLA is that it avoids the well-known convergence problems of typical non-linear analyses by directly specifying a damage increment, in terms of a reduction of stiffness and strength in a particular finite element, instead of increasing the load or displacement on the whole structure. For the first time, particular attention has been paid to specific aspects of the slabs, such as accurate modeling of the constraints and the sensitivity of the solution to the mesh density. This detailed parametric analysis proved a strong influence of the tensile fracture energy, the mesh density and the chosen model on the solution in terms of the force-displacement diagram, the distribution of the crack patterns and the shear failure mode.
SLA showed great potential, but it requires further development in two aspects of modeling: load conditions (constant and proportional loads) and the softening behaviour of brittle materials (such as concrete) in the three-dimensional field, in order to widen its horizons in these new contexts of study.
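The core SLA idea, scaling a linear solution until one element reaches its strength, then reducing that element's stiffness and strength and repeating, can be shown on a toy system of two parallel springs. This is only a sketch of the event-by-event logic; real SLA uses calibrated saw-tooth softening diagrams derived from the tensile fracture energy, not the crude halving rule used here:

```python
def sla_parallel_springs(k, ft, n_events=5, factor=0.5):
    """Toy Sequentially Linear Analysis on parallel springs under a unit load.

    Each 'event': scale the linear solution until the critical spring reaches
    its strength ft, record the (load, displacement) point, then reduce that
    spring's stiffness and strength by `factor` (a crude saw-tooth law).
    """
    k, ft = list(k), list(ft)
    curve = []
    for _ in range(n_events):
        active = [i for i in range(len(k)) if k[i] > 0]
        if not active:
            break
        K = sum(k)
        # under a unit load, spring i carries the fraction k[i]/K, so the
        # load multiplier at which it reaches its strength is ft[i]*K/k[i]
        idx = min(active, key=lambda i: ft[i] * K / k[i])
        lam = ft[idx] * K / k[idx]
        curve.append((lam, lam / K))   # (critical load, displacement)
        k[idx] *= factor               # saw-tooth: reduce stiffness...
        ft[idx] *= factor              # ...and residual strength
    return curve

curve = sla_parallel_springs([10.0, 20.0], [1.0, 1.0], n_events=4)
print(curve[0])  # first cracking event
```

Because each step is a plain linear solve, no Newton iterations are involved, which is exactly how SLA sidesteps the convergence problems of incremental-iterative analyses on brittle, snap-back-prone structures.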
Abstract:
Throughout this research, the whole life cycle of a building is analyzed, with a special focus on the most common issues affecting the construction sector nowadays, such as safety. The goal is to enhance the management of the entire construction process in order to reduce the risk of accidents. The contemporary trend is to look for new tools capable of reducing, or even eliminating, the most common mistakes that usually lead to safety risks. That is one of the main reasons why new technologies and tools have been introduced in the field. The one we focus on is the so-called BIM: Building Information Modeling. With the term BIM we refer to a wider and more complex analysis tool than simple 3D modeling software. Through BIM technologies we are able to generate a multi-dimensional 3D model which contains all the information about the project. This innovative approach aims at a better understanding and control of the project by taking the entire life cycle into consideration, resulting in a faster and more sustainable way of managing it. Furthermore, BIM software allows all the information to be shared among the different aspects of the project and among the different participants involved, thus improving cooperation and communication. In addition, BIM software provides smart tools that simulate and visualize the process in advance, thus preventing issues that might not have been taken into consideration during the design process. This leads to higher chances of avoiding risks, delays and cost increases. Using a hospital case study, we apply this approach to the completion of a safety plan, with a special focus on the construction phase.
Abstract:
BACKGROUND: Infantile hypophosphatasia (IH) is an inherited disorder characterized by defective bone mineralization and a deficiency of alkaline phosphatase activity. OBJECTIVE/DESIGN: The aim of the study was to evaluate a new compound heterozygous TNSALP mutation for its residual enzyme activity and to localize the affected amino acid residues in a 3D model. PATIENT: We report on a 4-week-old girl with craniotabes, severe ossification defects, and failure to thrive. Typical clinical features such as low serum alkaline phosphatase, high serum calcium concentration, increased urinary calcium excretion, and nephrocalcinosis were observed. Vitamin D was withdrawn and the patient was started on calcitonin and hydrochlorothiazide. Nonetheless, the girl died at the age of 5 months from respiratory failure. RESULTS: Sequence analysis of the patient's TNSALP gene revealed two heterozygous mutations [c.653T>C (I201T), c.1171C>T (R374C)]. Transfection studies of the unique I201T variant in COS-7 cells yielded a mutant TNSALP protein with only residual enzyme activity (3.7%) compared with the wild type, whereas the R374C variant was previously shown to reduce normal activity to 10.3%. 3D modeling of the mutated enzyme showed that I201T resides in a region that does not belong to any known functional site. CONCLUSION: We note that I201, which has been conserved during evolution, is buried in a hydrophobic pocket and, therefore, the I→T change should affect its functional properties. Residue R374 is located at the interface between monomers, and it has previously been suggested that this mutation affects dimerization. These findings explain the patient's clinical picture and severe course.
Abstract:
For broadcasting purposes, mixed reality, the combination of real and virtual scene content, has become ubiquitous nowadays. Mixed reality recording still requires expensive studio setups and is often limited to simple color keying. We present a system for mixed reality applications which uses depth keying and provides three-dimensional mixing of real and artificial content. It features enhanced realism through automatic shadow computation, which we consider a core requirement for realism and a convincing visual perception, besides the correct alignment of the two modalities and correct occlusion handling. Furthermore, we present a way to support the placement of virtual content in the scene. The core feature of our system is the incorporation of a time-of-flight (ToF) camera device. This device delivers real-time depth images of the environment at a reasonable resolution and quality. The camera is used to build a static environment model, and it also allows correct handling of mutual occlusions between real and virtual content, shadow computation and enhanced content planning. The presented system is inexpensive, compact, mobile and flexible, and provides convenient calibration procedures. Chroma keying is replaced by depth keying, which is efficiently performed on the graphics processing unit (GPU) using an environment model and the current ToF camera image. Automatic extraction and tracking of dynamic scene content is thereby performed, and this information is used for the planning and alignment of virtual content. An additional valuable feature is that depth maps of the mixed content are available in real time, which makes the approach suitable for future 3DTV productions. This paper gives an overview of the whole system, including camera calibration, environment model generation, real-time keying and mixing of virtual and real content, shadowing for virtual content, and dynamic object tracking for content planning.
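The depth-keying step described above reduces, per pixel, to a comparison of the real (ToF) depth with the virtual (renderer) depth: the nearer surface wins, which is what produces correct mutual occlusion. The paper runs this comparison in a GPU shader; the sketch below is a CPU illustration of the same rule on toy one-pixel images:

```python
def depth_key(real_rgb, real_depth, virt_rgb, virt_depth):
    """Per-pixel depth keying: the nearer surface wins (mutual occlusion).

    Inputs are same-sized 2D grids (lists of rows); depths in metres.
    On the GPU the identical comparison runs per fragment.
    """
    h, w = len(real_depth), len(real_depth[0])
    out = [[None] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if virt_depth[y][x] < real_depth[y][x]:
                out[y][x] = virt_rgb[y][x]   # virtual object occludes the real scene
            else:
                out[y][x] = real_rgb[y][x]   # real scene occludes the virtual object
    return out

real = [[(10, 10, 10)]]; rdepth = [[2.0]]
virt = [[(200, 0, 0)]];  vdepth = [[1.5]]
print(depth_key(real, rdepth, virt, vdepth))  # virtual pixel is nearer
```

Replacing the live ToF depth with the static environment model in this comparison is what lets the system key dynamic foreground objects without a chroma screen.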