32 results for Molecules - Models - Computer simulation
Abstract:
This Master's degree dissertation makes a comparative study between internal air temperature data simulated with the thermal analysis application DesignBuilder 1.2 and data registered in loco with HOBO® Temp Data Loggers in a Social Housing Prototype (HIS), located at the Central Campus of the Federal University of Rio Grande do Norte (UFRN). The prototype was designed and built following the thermal comfort strategies recommended for the local climate, using cellular concrete panels supplied by Construtora DoisA, a collaborator of the research project REPESC - Rede de Pesquisa em Eficiência Energética de Sistemas Construtivos (Research Network on Energy Efficiency of Construction Systems), part of the Habitare program. The methodology carefully examined the problem and reviewed the bibliography, analyzing the major aspects related to computer simulation of the thermal performance of buildings, such as the climatic characterization of the region under study and the users' thermal comfort demands. With DesignBuilder 1.2 as the simulation tool, theoretical alterations were made to the prototype and then compared with the thermal comfort parameters adopted, based on the area's current technical literature. The comparative studies were analyzed through graphical outputs for a better understanding of air temperature amplitudes and thermal comfort conditions. The data used to characterize the external air temperature were obtained from the Test Reference Year (TRY) defined for the study area (Natal-RN). The author also performed comparative studies with data registered in the years 2006, 2007 and 2008 at a Davis Precision Station weather station located at the Instituto Nacional de Pesquisas Espaciais (INPE-CRN, National Institute of Space Research), in an area neighboring UFRN's Central Campus.
The conclusions drawn from the comparative studies between the computer simulations and the local records obtained from the studied prototype point out that simulating naturally ventilated buildings is quite a complex task, due to the application's limitations, mainly owing to the complexity of air flow phenomena, the influence of comfort conditions in the surrounding areas, and the climate records. Lastly, regarding the use of DesignBuilder 1.2 in the present study, one may conclude that it is a good tool for computer simulation, although it needs some adjustments to improve its reliability. There is a need for continued research, considering the occupation of the prototype by users, as well as the thermal loads of the equipment, in order to check sensitivity
Abstract:
The building envelope is the principal means of interaction between the indoor and outdoor environments, with direct influence on the thermal and energy performance of the building. By intervening in the envelope, with the proposal of specific architectural elements, it is possible to promote passive conditioning strategies such as natural ventilation. Cross ventilation is recommended by NBR 15220-3 as the main bioclimatic strategy for the hot and humid climate of Natal/RN, offering, among other benefits, the thermal comfort of occupants. The analysis tools for natural ventilation, on the other hand, cover a variety of techniques, from simplified calculation methods to computational fluid dynamics, whose limitations are discussed in several papers, but without detailing the problems encountered. In this sense, the present study aims to evaluate the potential of wind catchers, envelope elements used to increase natural ventilation in buildings, through simplified CFD simulation, and seeks to quantify the limitations encountered during the analysis. The procedure adopted to evaluate the implementation and efficiency of these elements was simulation with the software DesignBuilder CFD (Computational Fluid Dynamics). A base case was defined, to which wind catchers with various settings were added, in order to compare them with each other and appreciate the differences in the air flows and air speeds encountered. Initially, sensitivity tests were performed for familiarization with the software and observation of simulation patterns, mapping the settings used and the simulation time for each case. The results show the limitations encountered during the simulation process, as well as an overview of the efficiency and potential of wind catchers: an increase in ventilation with the use of catchers, differences in air flow patterns, a significant increase in indoor air speeds, and changes due to the different element geometries.
It is considered that the software used can help designers during preliminary analysis in the early stages of design
Abstract:
An unfolding method for linear intercept and section area distributions was implemented for structures with spherical grains. Although the unfolding routine depends on the grain shape, structures with spheroidal grains can also be treated by it; grains of non-spheroidal shape can be treated only as an approximation. A software package was developed in two parts: the first calculates the probability matrix, and the second uses this matrix and minimizes the chi-square. The results are presented with any number of size classes, as required. The probability matrix was determined by means of linear intercept and section area distributions created by computer simulation. Using curve fitting, the probability matrix for spheres of any size could be determined. Two kinds of tests were carried out to prove the efficiency of the technique. The theoretical tests represent ideal cases, in which the software was able to recover the proposed grain size distribution exactly. In the second test, a structure was simulated in the computer and images of its slices were used to produce the corresponding linear intercept and section area distributions, which were then unfolded. This test is a better simulation of reality, and its results show deviations from the real size distribution caused by statistical fluctuations. The unfolding of the linear intercept distribution works perfectly, but the unfolding of the section area distribution does not, due to a failure in the chi-square minimization: the minimization method uses a matrix inversion routine, and the matrix generated by this procedure cannot be inverted. Another minimization method must be used
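The unfolding step described above can be sketched as follows. This is a minimal illustration, not the thesis's implementation: the probability matrix values and function names are hypothetical, and a plain least-squares solve stands in for the chi-square minimization, sidestepping the direct matrix inversion that failed for the section area case.

```python
import numpy as np

def unfold(P, measured):
    """Recover grain-size class frequencies g from a measured
    distribution m, assuming m is approximately P @ g, where P is the
    probability matrix relating each true size class to the measured
    intercept/section classes. Least squares avoids inverting P
    directly; negative frequencies are clipped to zero.
    """
    g, *_ = np.linalg.lstsq(P, measured, rcond=None)
    return np.clip(g, 0.0, None)

# Hypothetical probability matrix for two sphere size classes and
# three intercept-length classes (values are illustrative only).
P = np.array([[0.7, 0.2],
              [0.2, 0.5],
              [0.1, 0.3]])
true_g = np.array([10.0, 5.0])
measured = P @ true_g          # a consistent, noise-free histogram
recovered = unfold(P, measured)
```

On a noise-free, full-rank system like this one, the least-squares solution recovers the proposed size distribution exactly, mirroring the "ideal case" tests reported above; with noisy measured distributions the recovered frequencies deviate, as the second test found.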
Abstract:
Oil wells subjected to cyclic steam injection present important challenges for the development of well cementing systems, mainly due to tensile stresses caused by thermal gradients during their useful life. Cement sheath failures in wells using conventional high-compressive-strength systems have led to the use of cement systems that are more flexible and/or ductile, with emphasis on Portland cement systems with latex addition. Recent research efforts have presented geopolymeric systems as alternatives. These cementing systems are based on the alkaline activation of amorphous aluminosilicates such as metakaolin or fly ash and display advantageous properties such as high compressive strength, fast setting and thermal stability. Basic geopolymeric formulations can be found in the literature which meet basic oil industry specifications such as rheology, compressive strength and thickening time. In this work, new geopolymeric formulations were developed, based on metakaolin, potassium silicate, potassium hydroxide, silica fume and mineral fiber, using the state of the art in chemical composition, mixture modeling and additivation to optimize the properties most relevant for oil well cementing. Starting from molar ratios considered ideal in the literature (SiO2/Al2O3 = 3.8 and K2O/Al2O3 = 1.0), a study of dry mixtures was performed based on the compressive packing model, resulting in an optimal volume of 6% for the added solid material. This material (silica fume and mineral fiber) works both as an additional silica source (in the case of silica fume) and as mechanical reinforcement, especially in the case of the mineral fiber, which increased the tensile strength. The first triaxial mechanical study of this class of materials was performed. For comparison, a mechanical study of conventional latex-based cementing systems was also carried out.
Despite differences in failure mode (brittle for geopolymers, ductile for latex-based systems), the superior uniaxial compressive strength (37 MPa for the geopolymeric slurry P5 versus 18 MPa for the conventional slurry P2), similar triaxial behavior (friction angle of 21° for both P5 and P2) and lower stiffness in the elastic region (5.1 GPa for P5 versus 6.8 GPa for P2) of the geopolymeric systems allowed them to withstand a similar amount of mechanical energy (155 kJ/m³ for P5 versus 208 kJ/m³ for P2), noting that geopolymers work in the elastic regime, without the microcracking present in latex-based systems. Therefore, the geopolymers studied in this work must be designed for application in the elastic region, to avoid brittle failure. Finally, the tensile strength of geopolymers is originally poor (1.3 MPa for the geopolymeric slurry P3) because of their brittle structure; however, after additivation with mineral fiber, the tensile strength became equivalent to that of the latex-based systems (2.3 MPa for P5 and 2.1 MPa for P2). The technical viability of the conventional and proposed formulations was evaluated for the whole well life, including the stresses due to cyclic steam injection, using finite element-based simulation software. It was verified that conventional slurries are viable up to 400 °F (204 °C) and geopolymeric slurries above 500 °F (260 °C)
Abstract:
In this work we developed a computer simulation program for the physics of porous structures, written in C++ and running on a GeForce 9600 GT with the PhysX chip, originally developed for video games. With this tool, the capacity for physical interaction between simulated objects is enlarged, allowing the simulation of porous structures such as reservoir rocks, and of structures with high density. The initial procedure of the simulation is the construction of a cubic porous structure consisting of spheres, both of a single size and of varying sizes; structures can also be simulated with various volume fractions. The results are divided into two parts: in the first, the spheres are treated as solid grains, i.e., the matrix phase represents the porosity; in the second, the spheres are treated as pores, so that the matrix phase represents the solid phase. The simulations in both cases are the same, but the simulated structures are intrinsically different. To validate the results produced by the program, simulations were performed varying the number of grains, the grain size distribution and the void fraction of the structure. All results were statistically reliable and consistent with the literature. The mean values and distributions of the stereological parameters measured, such as linear intercept, section perimeter, section area and mean free path, agree with the results reported in the literature for the simulated structures. The results may help the understanding of real structures.
Abstract:
This work focuses on the creation and application of a dynamic simulation software package to study the hard metal (WC-Co) structure. The technology used to increase the hardware capacity was a GeForce 9600 GT GPU along with the PhysX chip, created to make games more realistic. The software simulates the three-dimensional carbide structure as a cubic box in which the tungsten carbide (WC) grains are modeled as triangular prisms and truncated triangular prisms. The program proved effective in verification tests, ranging from calculations of parameter measures to the capacity to increase the number of dynamically simulated particles. It was possible to investigate both the mean values and the distributions of the stereological parameters used to characterize the carbide structure through cutting planes. Based on the cutting planes of the analyzed structures, we investigated the linear intercepts, the area intercepts and the perimeters of the sections of the intercepted grains, as well as the binder phase of the structure, by calculating the mean value and distribution of the free path. Since the literature shows, almost consensually, that the distribution of the linear intercepts is lognormal, this suggests that the grain distribution is also lognormal. Thus, a routine was added to the program which made a more detailed investigation of this issue possible. We observed that it is possible, under certain values of the parameters that define the shape and size of the prismatic grains, to obtain linear intercept distributions that approach the lognormal shape. In the simulations performed, we observed that the distribution curves of the linear and area intercepts, as well as of the section perimeters, are consistent with static computer simulation studies of these parameters.
Abstract:
In the Brazilian Northeast there are reservoirs with heavy oil which use steam flooding as a recovery method. This process reduces the oil viscosity, increasing its mobility and consequently the oil recovery. Steam injection is a thermal method and can occur in continuous or cyclic form. Cyclic steam stimulation (CSS) can be repeated several times, each cycle consisting of three stages: steam injection, soaking time and a production phase. CSS becomes less efficient as the number of cycles increases. Thus, this work aims to study the influence of compositional models on cyclic steam injection and the effects of some parameters, such as injection rate, steam quality and temperature of the injected steam, analyzing the influence of the number of pseudocomponents on oil rate, cumulative oil, oil recovery and simulation time. The analyzed situations were compared with the three-phase, three-component fluid model known as black oil. Simulations were done using commercial software (CMG) on a homogeneous reservoir with characteristics similar to those found in the Brazilian Northeast. It was observed that increasing the number of components increases the simulation time. As for the parameters analyzed, the steam rate and steam quality influence cumulative oil and oil recovery, whereas the number of components had little influence on oil recovery but did influence gas production
Abstract:
This work presents the development of a model and computer simulation of a sucker rod pumping system. The model takes into account the well geometry, the flow through the tubing, the dynamic behavior of the rod string and an induction motor model. The rod string was modeled using lumped (concentrated) parameters, allowing systems of ordinary differential equations to simulate its behavior
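A lumped-parameter rod string of the kind described above can be sketched as a chain of masses joined by spring-damper elements, driven at the top by the surface stroke. This is a generic illustration under assumed parameter values, not the thesis's model: the masses, stiffness, damping, stroke and frequency below are arbitrary, and the pump load at the bottom is omitted.

```python
import numpy as np

def simulate_rod_string(n=5, m=100.0, k=5e4, c=50.0,
                        stroke=1.0, omega=2 * np.pi / 5,
                        t_end=10.0, dt=1e-3):
    """Lumped-parameter rod string: n masses joined by spring-damper
    elements. The top node follows a prescribed sinusoidal surface
    stroke; the bottom node is free. The resulting system of ODEs is
    integrated with semi-implicit Euler. All values are illustrative.
    """
    x = np.zeros(n)                 # displacements of the lumped masses
    v = np.zeros(n)                 # velocities
    for step in range(int(t_end / dt)):
        t = step * dt
        top = 0.5 * stroke * np.sin(omega * t)   # surface driving motion
        f = np.zeros(n)
        f[0] += k * (top - x[0]) - c * v[0]      # coupling to the drive
        for i in range(n - 1):                   # neighbour coupling
            rel = k * (x[i + 1] - x[i]) + c * (v[i + 1] - v[i])
            f[i] += rel
            f[i + 1] -= rel
        v += (f / m) * dt
        x += v * dt
    return x, v

positions, velocities = simulate_rod_string()
```

With the driving frequency well below the elements' natural frequency, the string follows the surface stroke quasi-statically; stiffer or longer strings shift the response toward the wave-like behavior that motivates dynamic rod-string models.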
Abstract:
The objective of thermal recovery is to heat the reservoir and the oil in it in order to increase its recovery. Several heavy oil reservoirs are located in the Potiguar Basin whose primary recovery energy yields only a small oil flow rate, which makes these reservoirs great candidates for the application of an enhanced oil recovery method, especially a thermal one. Steam injection can occur in a cyclic or continuous manner. Continuous steam injection occurs through injection wells, in whose vicinity a steam zone forms and expands, displacing oil with improved viscosity and mobility towards the producing wells. Another possible mechanism of oil displacement in reservoirs subjected to continuous steam injection is the distillation of the oil by the steam: at high temperatures, its lighter fractions can be vaporized, changing the composition of the produced and residual oil or increasing the amount of oil produced. In this context, this work aims to study the influence of compositional models on continuous steam injection through the analysis of some parameters, such as steam injection rate and injection temperature. Various comparative analyses were made with different fluid models, ranging from an elementary one with 3 pseudocomponents to fluid models with increasing numbers of pseudocomponents. A commercial numerical simulator was used, with a homogeneous reservoir model with features similar to those found in northeastern Brazil. The conclusions include the increase of simulation time with the number of pseudocomponents, the significant influence of the injection rate on the cumulative oil production, and the small influence of the number of pseudocomponents on the flow rates and cumulative oil production
Abstract:
The Electrical Submersible Pump (ESP) has been one of the most appropriate artificial lift solutions in onshore and offshore applications. Typical operating conditions involve adverse temperatures, viscous fluids and gassy environments. The difficulties in equipment maintenance and setup contribute to the increasing costs of oil production in deep water; therefore, optimization through automation can be an excellent approach to decrease costs and failures in subsurface equipment. This work describes a computer simulation of the ESP artificial lift method. The tool models the dynamic behavior of the ESP system, considering the source and transmission model for the electric energy supplied to the motor, the electric motor model (including thermal calculation), tubing flow simulation, centrifugal pump behavior simulation with fluid nature effects, and reservoir requirements. In addition, there are three-dimensional animations for each ESP subsystem (transformer, motor, pump, seal, gas separator, command unit). This computer simulation proposes an improvement in the monitoring of oil wells for the maximization of well production. Currently, proprietary simulators are tied to specific equipment manufacturers, so equipment from other manufacturers cannot be simulated; the proposed approach supports equipment from diverse manufacturers
Abstract:
Modern wireless systems employ adaptive techniques to provide high throughput while observing the desired coverage, Quality of Service (QoS) and capacity. An alternative to further enhance the data rate is to apply cognitive radio concepts, in which a system is able to exploit unused spectrum on existing licensed bands by sensing the spectrum and opportunistically accessing unused portions. Techniques like Automatic Modulation Classification (AMC) can help, or even be vital, in such scenarios. Usually, AMC implementations rely on some form of signal pre-processing, which may introduce a high computational cost or make assumptions about the received signal which may not hold (e.g. Gaussianity of the noise). This work proposes a new method to perform AMC which uses a similarity measure from the Information Theoretic Learning (ITL) framework known as the correntropy coefficient. It is capable of extracting similarity measurements over a pair of random processes using higher-order statistics, yielding better similarity estimates than, e.g., the correlation coefficient. Experiments carried out by means of computer simulation show that the proposed technique presents a high success rate in the classification of digital modulations, even in the presence of additive white Gaussian noise (AWGN)
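The similarity measure named above can be sketched as follows. This is a minimal sample estimator of the centered correntropy coefficient with a Gaussian kernel, as commonly defined in the ITL literature, not the thesis's exact AMC pipeline; the bandwidth value is an assumption.

```python
import numpy as np

def correntropy_coefficient(x, y, sigma=1.0):
    """Centered correntropy coefficient between two sample vectors,
    using a Gaussian kernel of bandwidth sigma. Identical signals give
    exactly 1; the kernel makes the measure sensitive to higher-order
    statistics, unlike the ordinary correlation coefficient.
    """
    def kernel(a, b):
        return np.exp(-(a - b) ** 2 / (2.0 * sigma ** 2))

    def centered(a, b):
        # E[k(a_i, b_i)] minus the mean kernel value over all cross pairs
        joint = np.mean(kernel(a, b))
        marginal = np.mean(kernel(a[:, None], b[None, :]))
        return joint - marginal

    num = centered(x, y)
    den = np.sqrt(centered(x, x) * centered(y, y))
    return num / den

t = np.linspace(0.0, 10.0, 200)
eta_same = correntropy_coefficient(np.sin(t), np.sin(t))
eta_diff = correntropy_coefficient(np.sin(t), np.cos(t))
```

In an AMC setting, coefficients like these could be computed between the received signal and candidate modulation templates, with the largest coefficient selecting the classified modulation.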
Abstract:
This paper describes the study, computer simulation and implementation feasibility of vector speed control of an induction motor, using the Extended Kalman Filter as an estimator of the rotor flux. The motivation for this work is the use of a control system that requires no sensors on the machine shaft, thus providing a considerable cost reduction in the drives and their maintenance, as well as increased reliability, robustness and noise immunity compared with control systems based on conventional sensors
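The estimator named above follows the standard discrete-time Extended Kalman Filter recursion, sketched below in its generic textbook form; the rotor-flux state model of the paper is not reproduced here, and the toy usage (a linear scalar system, where the EKF reduces to the ordinary Kalman filter) uses assumed noise covariances.

```python
import numpy as np

def ekf_step(x, P, z, f, F, h, H, Q, R):
    """One predict-update cycle of a discrete Extended Kalman Filter.
    f, h : (possibly nonlinear) state-transition and measurement maps
    F, H : their Jacobians evaluated at the current estimate
    Q, R : process and measurement noise covariances
    """
    # Predict
    x_pred = f(x)
    P_pred = F @ P @ F.T + Q
    # Update
    innovation = z - h(x_pred)
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ innovation
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Toy usage: estimate a constant scalar state from noisy measurements.
rng = np.random.default_rng(0)
F = H = np.eye(1)
Q = np.array([[1e-6]])
R = np.array([[0.1]])
x, P = np.zeros(1), np.eye(1)
true_value = 3.0
for _ in range(200):
    z = np.array([true_value]) + rng.normal(0.0, 0.3, 1)
    x, P = ekf_step(x, P, z, lambda s: s, F, lambda s: s, H, Q, R)
```

For the sensorless drive application, f and F would encode the motor's electrical dynamics and the state would include the rotor flux components, which the same recursion then estimates from stator measurements.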
Abstract:
This paper investigates the cognitive processes that operate in the understanding of narratives, in this case the novel Macunaíma, by Mário de Andrade. Our work belongs to the field of Embodiment-based Cognitive Linguistics and, due to its interdisciplinary nature, it dialogues with theoretical and methodological frameworks from Psycholinguistics, Cognitive Psychology and the Neurosciences. We adopt an exploratory research design, with adapted recall and cloze tests applied to postgraduate students, all native speakers of Brazilian Portuguese. The choice of Macunaíma as the novel and initial motivation for this proposal is due to the fact that it is a fantastic narrative, consisting of events, circumstances and characters that are clearly distant from what is experienced in everyday life. Thus, the novel provides adequate data to investigate the configuration of meaning within an understanding-based model. We therefore seek to answer questions that are still, in general, scarcely explored in the field of Cognitive Linguistics, such as: to what extent is the activation of mental models (schemas and frames) related to the process of understanding narratives? How are we able to construct sense even when words or phrases are not part of our linguistic repertoire? Why do we get emotionally involved when reading a text, even though it is fiction? To answer them, we assume the theoretical stance that meaning is not in the text: it is constructed through language, conceived as a result of the integration between the biological apparatus (which results in the creation of abstract image schemas) and the sociocultural one (which results in the creation of frames). In this sense, perception, cognitive processing, and the reception and transmission of the information described are directly related to how language comprehension occurs. We believe that the results of our study may contribute to the cognitive studies of language and to the development of language learning and teaching methodologies
Abstract:
In this work we have studied, by Monte Carlo computer simulation, several properties that characterize damage spreading in the Ising model, defined on Bravais lattices (the square and triangular lattices) and on the Sierpinski gasket. First, we investigated the antiferromagnetic model on the triangular lattice with a uniform magnetic field, under Glauber dynamics; the chaotic-frozen critical frontier that we obtained coincides, within error bars, with the paramagnetic-ferromagnetic frontier of the static transition. Using heat-bath dynamics, we studied the ferromagnetic model on the Sierpinski gasket and showed that there are two times that characterize the relaxation of the damage: one of them satisfies the generalized scaling theory proposed by Henley (critical exponent z ~ A/T at low temperatures), while the other does not obey any of the known scaling theories. Finally, we used time series analysis methods to study, under Glauber dynamics, the damage in the ferromagnetic Ising model on a square lattice. We obtained a Hurst exponent of 0.5 at high temperatures, which grows to 1 close to the temperature TD that separates the chaotic and frozen phases
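The damage spreading procedure used above can be sketched as follows: two replicas of the system, initially differing by a single spin, are evolved with the same sequence of random numbers, and the Hamming distance (the damage) between them is tracked. This is an illustrative 2D ferromagnetic heat-bath version with arbitrary parameters (J = 1, k_B = 1), not the lattices or dynamics combinations studied in the thesis.

```python
import numpy as np

def damage_spreading(L=16, T=5.0, sweeps=20, seed=1):
    """Damage spreading for the 2D ferromagnetic Ising model under
    heat-bath dynamics. Replicas a and b differ by one flipped spin and
    are updated with the SAME random numbers; returns the damage (the
    fraction of differing spins) after each sweep.
    """
    rng = np.random.default_rng(seed)
    a = rng.choice([-1, 1], size=(L, L))
    b = a.copy()
    b[0, 0] *= -1                      # initial damage: one flipped spin
    damage = [np.mean(a != b)]
    for _ in range(sweeps):
        for _ in range(L * L):
            i, j = rng.integers(0, L, size=2)
            r = rng.random()           # shared random number for both replicas
            for s in (a, b):
                h = (s[(i + 1) % L, j] + s[(i - 1) % L, j]
                     + s[i, (j + 1) % L] + s[i, (j - 1) % L])
                p_up = 1.0 / (1.0 + np.exp(-2.0 * h / T))  # heat-bath rule
                s[i, j] = 1 if r < p_up else -1
        damage.append(np.mean(a != b))
    return damage

d = damage_spreading()
```

Tracking how this damage signal relaxes, or treating it as a time series (e.g. for Hurst-exponent estimation), is the kind of analysis the abstract describes for locating the chaotic-frozen frontier.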
Abstract:
The research behind this Master's dissertation started with the installation of a DC sputtering system, from its first stage, the adaptation of a refrigeration system, through the introduction of a heating system for the chamber using a thermal belt, to the deposition of a series of single crystal nanometric Fe/MgO(100) film samples. The deposition rates of some materials, such as Fe, Py and Cu, were investigated with an Atomic Force Microscope (AFM). Among the single crystal samples, five have the same growth parameters and a thickness of 250 Å, except for the temperature, which varies in steps of fifty degrees from one sample to another, from 100 °C to 300 °C. Three other samples also have the same deposition parameters and a temperature of 300 °C, but with thicknesses of 62.5 Å, 150 Å and 250 Å. Magneto-Optical Kerr Effect (MOKE) magnetization curve measurements and Ferromagnetic Resonance (FMR) measurements were made in order to study the influence of the temperature and thickness on the samples' magnetic properties. In the present dissertation we discuss these techniques, and the experimental results are interpreted using phenomenological models, by simulation, and discussed from a physical point of view, taking into account the terms of the system's free magnetic energy. The results show the growth of the cubic anisotropy field (Hac) as the sample deposition temperature increases, presenting an asymptotic behavior similar to the characteristic charging curve of a capacitor in an RC circuit. A similar behavior of Hac was also observed with increasing sample thickness. The 250 Å sample, grown at 300 °C, presented an Hac field close to the Fe bulk value