941 results for FLOW OF FLUIDS - Orifices
Abstract:
The relationship between industry, waste, and urbanism is one fraught with problems across the United States and in particular American cities. The interrelated nature of these systems of flows is in critical need of re-evaluation. This thesis critiques the system of Municipal Solid Waste Management as it currently exists in American cities as a necessary yet undesirable ‘invisible infrastructure’. Industry and waste environments have been pushed to the periphery of urban environments, severing the relationship between the urban environment we inhabit and the one that is required to support the way we live. The flow of garbage from cities of high density to landscapes of waste has created a model of valuing waste as a linear system that separates input from output. This thesis aims to investigate ways that industry, waste, and urban ecologies can work to reinforce one another. The goal of this thesis is to repair the physical and mental separation of waste and public activity through architecture. This thesis will propose ways to tie urban waste infrastructure and public amenities together through the merging of architecture and landscape to create new avenues for public engagement with waste processes.
Abstract:
Part 18: Optimization in Collaborative Networks
Abstract:
319 p.
Abstract:
Many geological formations consist of crystalline rocks that have very low matrix permeability but allow flow through an interconnected network of fractures. Understanding the flow of groundwater through such rocks is important in considering disposal of radioactive waste in underground repositories. A specific area of interest is the conditioning of fracture transmissivities on measured values of pressure in these formations. This is the process where the values of fracture transmissivities in a model are adjusted to obtain a good fit of the calculated pressures to measured pressure values. While there are existing methods to condition transmissivity fields on transmissivity, pressure and flow measurements for a continuous porous medium there is little literature on conditioning fracture networks. Conditioning fracture transmissivities on pressure or flow values is a complex problem because the measurements are not linearly related to the fracture transmissivities and they are also dependent on all the fracture transmissivities in the network. We present a new method for conditioning fracture transmissivities on measured pressure values based on the calculation of certain basis vectors; each basis vector represents the change to the log transmissivity of the fractures in the network that results in a unit increase in the pressure at one measurement point whilst keeping the pressure at the remaining measurement points constant. The fracture transmissivities are updated by adding a linear combination of basis vectors and coefficients, where the coefficients are obtained by minimizing an error function. A mathematical summary of the method is given. This algorithm is implemented in the existing finite element code ConnectFlow developed and marketed by Serco Technical Services, which models groundwater flow in a fracture network. Results of the conditioning are shown for a number of simple test problems as well as for a realistic large scale test case.
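The update step described above can be illustrated with a minimal sketch. The basis matrix, measurement values, and damping-free single-step update below are hypothetical illustrations of the stated idea (each basis vector raises the pressure at one measurement point by one unit while holding the others fixed), not the ConnectFlow implementation; in practice the pressures depend nonlinearly on the transmissivities, so the step would be iterated with a fresh flow solve each time.

```python
def condition_transmissivities(log_T, B, p_calc, p_meas):
    """One linearized conditioning step for fracture log-transmissivities.

    log_T  : current log transmissivities, one per fracture
    B      : basis vectors as a list of rows, one row per fracture;
             column j is the log-T change that raises the pressure at
             measurement j by one unit, leaving the others unchanged
    p_calc : pressures from the current flow solve at the measurement points
    p_meas : measured pressures at the same points
    """
    # Pressure misfit to remove at each measurement point.
    residual = [m - c for m, c in zip(p_meas, p_calc)]
    # Coefficients minimizing the pressure misfit: with the basis-vector
    # property above, the linearized optimum is simply the residual itself.
    alpha = residual
    # Update: add the linear combination of basis vectors and coefficients.
    return [t + sum(b * a for b, a in zip(row, alpha))
            for t, row in zip(log_T, B)]

# Toy example: 3 fractures, 2 measurement points (illustrative numbers).
log_T = [-6.0, -7.0, -6.5]
B = [[0.10, 0.00],
     [0.02, 0.08],
     [0.00, 0.12]]
p_calc = [101.0, 99.0]
p_meas = [102.0, 100.0]
log_T_new = condition_transmissivities(log_T, B, p_calc, p_meas)
```

After the update, the flow problem would be re-solved and the step repeated until the calculated pressures match the measurements to within tolerance.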
Abstract:
We investigate the structure of strongly nonlinear Rayleigh–Bénard convection cells in the asymptotic limit of large Rayleigh number and fixed, moderate Prandtl number. Unlike the flows analyzed in prior theoretical studies of infinite Prandtl number convection, our cellular solutions exhibit dynamically inviscid constant-vorticity cores. By solving an integral equation for the cell-edge temperature distribution, we are able to predict, as a function of cell aspect ratio, the value of the core vorticity, details of the flow within the thin boundary layers and rising/falling plumes adjacent to the edges of the convection cell, and, in particular, the bulk heat flux through the layer. The results of our asymptotic analysis are corroborated using full pseudospectral numerical simulations and confirm that the heat flux is maximized for convection cells that are roughly square in cross section.
Abstract:
With its powerful search engines and billions of published pages, the Worldwide Web has become the ultimate tool to explore the human experience. But, despite the advent of the digital revolution, e-books, at their core, have remained remarkably similar to their printed siblings. This has resulted in a clear dichotomy between two ways of reading: on one side, the multi-dimensional world of the Web; on the other, the linearity of books and e-books. My investigation of the literature indicates that the focus of attempts to merge these two modes of production, and hence of reading, has been the insertion of interactivity into fiction. As I will show in the Literature Review, a clear thrust of research since the early 1990s, and in my opinion the most significant, has concentrated on presenting the reader with choices that affect the plot. This has resulted in interactive stories in which the structure of the narrative can be altered by the reader of experimental fiction. The interest in this area of research is not surprising, as the interaction of readers with the fabric of the narrative provides a fertile ground for exploring, analysing, and discussing issues of plot consistency and continuity. I found in the literature several papers concerned with the effects of hyperlinking on literature, but none about how hyperlinked material and narrative could be integrated without compromising the narrative flow as designed by the author. It led me to think that the researchers had accepted hypertextuality and the linear organisation of fiction as being antithetical, thereby ignoring the possibility of exploiting the first while preserving the second. All the works I consulted were focussed on exploring the possibilities provided to authors (and readers) by hypertext or how hypertext literature affects literary criticism. This was true in earlier works by Landow and Harpold and remained true in later works by Bolter and Grusin. 
To quote another example, in his book Hypertext 3.0, Landow states: “Most who have speculated on the relation between hypertextuality and fiction concentrate [...] on the effects it will have on linear narrative”, and “hypertext opens major questions about story and plot by apparently doing away with linear organization” (Landow, 2006, pp. 220, 221). In other words, the authors have added narrative elements to Web pages, effectively placing their stories in a subordinate role. By focussing on “opening up” the plots, the researchers have missed the opportunity to maintain the integrity of their stories and use hyperlinked information to provide interactive access to backstory and factual bases. This would represent a missing link between the traditional way of reading, in which the readers have no influence on the path the author has laid out for them, and interactive narrative, in which the readers choose their way across alternatives, thereby, at least to a certain extent, creating their own path. It would be, to continue the metaphor, as if the readers could follow the main path created by the author while being able to get “sidetracked” into exploring hyperlinked material. In Hypertext 3.0, Landow refers to an “Axial structure [of hypertext] characteristic of electronic books and scholarly books with foot-and endnotes” versus a “Network structure of hypertext” (Landow, 2006, p. 70). My research aims at generalising the axial structure and extending it to fiction without losing the linearity at its core. In creative nonfiction, the introduction of places, scenes, and settings, together with characterisation, brings to life the facts without altering them; while much fiction draws on facts to provide a foundation, or narrative elements, for the work. But how can the reader distinguish between facts and representations? For example, to what extent do dialogues and perceptions present what was actually said and thought? 
Some authors of creative nonfiction use end-notes to provide comments and citations while minimising disruption to the flow of the main text, but end-notes are limited in scope and constrained in space. Each reader should be able to enjoy the narrative as if it were a novel but also to explore the facts at the level of detail s/he needs. For this to be possible, end-notes should provide a Web-like way of exploring in more detail what the author has already researched. My research aims to develop ways of integrating narrative prose and hyperlinked documents into a Hyperbook. Its goal is to create a new writing paradigm in which a story incorporates a gateway to detailed information. While creative nonfiction uses the techniques of fictional writing to provide reportage of actual events and fact-based fiction illuminates the affectual dimensions of what happened (e.g., Kate Grenville's The Secret River and Hilary Mantel's Wolf Hall), Hyperbooks go one step further and link narrative prose to the details of the events on which the narrative is based or, more generally, to information the reader might find of interest. My dissertation introduces and utilises Hyperbooks to engage in two parallel types of investigation: to build knowledge about Italian WWII POWs held in Australia and present it as part of a novella in Hyperbook format, and to develop a new piece of technology capable of extending the writing and reading process.
Abstract:
320 p.
Abstract:
Natural ventilation is an efficient bioclimatic strategy, one that provides thermal comfort, healthier air, and passive cooling to the building. However, disregard for environmental quality, the uncertainties involved in the phenomenon, and the popularization of artificial climate systems serve as excuses for those who neglect the benefits of passive cooling. Unfamiliarity with the concept may be lessened if ventilation is considered at every step of the project, especially in the initial phase, in which decisions have a great impact on the construction process. The tools available to quantify the impact of design decisions consist basically of renovation (air-change) rate calculations or computational fluid dynamics (CFD) simulations, both somewhat removed from the project's execution and ill-suited to parametric studies. Thus, we chose to verify, through computer simulation, the representativeness of the results of a simplified air renovation rate calculation method, as well as to make it more compatible with the questions relevant to the first phases of the design process. The case object consists of a model resulting from the recommendations of the Código de Obras de Natal/RN, customized according to NBR 15220. The study has shown the complexity of incorporating a CFD tool into the process and the need for a method capable of generating data at a rate compatible with the flow of ideas created and discarded during the project's development. At the end of our study, we discuss the concessions necessary for carrying out the simulations, the applicability and limitations of both the tools used and the method adopted, and the representativeness of the results obtained
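A simplified renovation-rate estimate of the kind mentioned above can be sketched as follows. This is the generic early-design formula for wind-driven airflow through an opening (Q = Cv·A·v, with an opening effectiveness Cv of roughly 0.5-0.6 for wind perpendicular to the opening), not the specific method evaluated in the thesis; the room and wind values are purely illustrative.

```python
def wind_driven_flow(area_m2, wind_speed_ms, effectiveness=0.55):
    """Airflow Q [m^3/s] through an opening: Q = Cv * A * v.
    Cv ~ 0.5-0.6 for wind roughly perpendicular to the opening."""
    return effectiveness * area_m2 * wind_speed_ms

def air_changes_per_hour(flow_m3s, room_volume_m3):
    """Renovation (air-change) rate: ACH = 3600 * Q / V."""
    return 3600.0 * flow_m3s / room_volume_m3

# Illustrative case: 2 m^2 window opening, 2 m/s wind, 4 x 5 x 3 m room.
Q = wind_driven_flow(2.0, 2.0)              # 2.2 m^3/s
ach = air_changes_per_hour(Q, 4.0 * 5.0 * 3.0)
```

The appeal of such a formula in early design is exactly what the abstract argues for: it produces numbers at the speed at which design ideas are created and discarded, whereas a CFD run does not.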
Abstract:
Steam injection is the most widely used enhanced recovery method for the extraction of heavy oil. In this type of procedure gravitational segregation commonly occurs, and this phenomenon can affect oil production; therefore, it should be considered in continuous steam injection projects. For many years, gravitational segregation was not adequately considered in reservoir engineering calculation procedures. The effect of gravity causes the segregation of fluids inside the porous medium according to their densities. Reservoir simulation results made it possible to account for gravity, and it became apparent that its effects could significantly affect reservoir performance. It is known that gravitational segregation can happen in almost every case where a light fluid is injected, especially steam, and that it occurs with greater intensity in viscous oil reservoirs. This work discusses the influence on segregation of rock-reservoir parameters such as viscosity, permeability, thickness, gas cap, and porosity. Starting from a model that exhibits the phenomenon with greater intensity, operational parameters such as the steam flow rate, the distance between the injector and producer wells, and the completion interval were optimized, which contributed to the reduction of gravity override, thus increasing oil recovery. Greater technical-economic viability was shown for the model with a well spacing of 100 m. The analysis was performed using the CMG simulator (Computer Modelling Group, STARS 2007.11), in which the interaction between the studied variables was observed in heavy oil reservoirs with characteristics similar to those of the Brazilian Northeast
Abstract:
Gas injection has become the most important IOR process in the United States; 2006 marked the first time that gas injection IOR production surpassed that of steam injection. In Brazil, the installation of a petrochemical complex in the Northeast (Bahia State) offers opportunities for the injection of gases in the fields located in the Recôncavo Basin. Field-scale gas injection applications have almost always been associated with design and operational difficulties. The mobility ratio between the injected gas and the displaced oil bank, which controls the volumetric sweep in gas processes, is typically unfavorable due to the relatively low viscosity of the injected gas. Furthermore, the difference between their densities results in severe gravity segregation of fluids in the reservoir, consequently leading to poor volumetric sweep control. Of the gas injection processes applied today, the WAG (water-alternating-gas) process is the most popular. However, in attempting to solve the mobility problems, the WAG process gives rise to other problems associated with increased water saturation in the reservoir, including diminished gas injectivity and increased competition with the flow of oil. The low field performance of WAG floods, with oil recoveries in the range of 5-10%, is a clear indication of these problems. In order to find an effective alternative to WAG, the Gas Assisted Gravity Drainage (GAGD) process was developed. It is designed to take advantage of the gravity force, allowing vertical segregation between the injected CO2 and the reservoir crude oil due to their density difference. The process consists of placing horizontal producers near the bottom of the pay zone and injecting gas through existing vertical wells in the field. Homogeneous models were used in this work, which can be extrapolated to commercial application for fields located in the Northeast of Brazil.
The simulations were performed in a CMG simulator, STARS 2007.11, in which some parameters and their interactions were analyzed. The results have shown that CO2 injection in the GAGD process significantly increased both the rate and the final recovery of oil
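The unfavorable mobility ratio mentioned above can be made concrete with a short sketch. The definition M = (kr_g/μ_g)/(kr_o/μ_o) is the standard one for a displacing gas and a displaced oil; the relative permeability and viscosity values below are illustrative order-of-magnitude numbers, not data from the study.

```python
def mobility_ratio(kr_gas, mu_gas, kr_oil, mu_oil):
    """Mobility ratio of a gas flood:
    M = (kr_g / mu_g) / (kr_o / mu_o).
    M >> 1 is unfavorable: the gas fingers through the oil bank
    and the volumetric sweep is poor."""
    return (kr_gas / mu_gas) / (kr_oil / mu_oil)

# Illustrative values: gas viscosity ~0.02 cP versus oil ~2 cP.
M = mobility_ratio(kr_gas=0.3, mu_gas=0.02, kr_oil=0.8, mu_oil=2.0)
# M is far above 1, which is why gas floods rely on gravity
# (as in GAGD) or alternating water (as in WAG) for sweep control.
```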
Abstract:
In Brazil and around the world, oil companies are looking for, and expect the development of, new technologies and processes that can increase the oil recovery factor in mature reservoirs in a simple and inexpensive way. Recent research has developed a new process, called Gas Assisted Gravity Drainage (GAGD), classified as a gas injection IOR method. The process, which is undergoing pilot testing in the field, is being extensively studied through physical scale models and laboratory core floods, owing to its high oil recoveries relative to other gas injection IOR methods. It consists of injecting gas at the top of a reservoir through horizontal or vertical injector wells and displacing the oil, taking advantage of the natural gravity segregation of fluids, toward a horizontal producer well placed at the bottom of the reservoir. To study this process, a homogeneous reservoir was modelled, together with a multi-component fluid model with characteristics similar to those of light-oil Brazilian fields, in a compositional simulator in order to optimize the operational parameters. The process was simulated in GEM (CMG, 2009.10). The operational parameters studied were the gas injection rate, the type of injected gas, and the locations of the injector and producer wells. The presence of a water drive was also studied. The results showed that the maximum vertical spacing between the two wells produced the maximum oil recovery in GAGD, and that the largest injection rate yielded the largest recovery factors. This parameter controls the advance speed of the injected gas front and determines whether or not the gravitational force dominates the oil recovery process. Natural gas performed better than CO2, and the presence of an aquifer in the reservoir had little influence on the process. The economic analysis found that injecting natural gas is more economically beneficial than injecting CO2
Abstract:
The objective of thermal recovery is to heat the reservoir and the oil in it in order to increase its recovery. In the Potiguar Basin there are several heavy oil reservoirs whose primary recovery energy yields only a small oil flow, which makes these reservoirs great candidates for the application of an enhanced oil recovery method, especially a thermal one. Steam injection can occur in a cyclic or continuous manner. Continuous steam injection occurs through injection wells, in whose vicinity a steam zone forms and expands, consequently displacing the oil, with improved viscosity and mobility, toward the producing wells. Another possible mechanism of oil displacement in reservoirs subjected to continuous steam injection is steam distillation of the oil: at high temperatures, its lighter fractions can be vaporized, changing the composition of the produced and residual oil and increasing the amount of oil produced. In this context, this work aims to study the influence of compositional models on continuous steam injection through the analysis of parameters such as the steam injection rate and the injection temperature. Various comparative analyses were carried out using fluid models ranging from an elementary one, with 3 pseudocomponents, to fluid models with increasing numbers of pseudocomponents. A commercial numerical simulator was used for the study, with a homogeneous reservoir model featuring characteristics similar to those found in northeastern Brazil. The study found that simulation time increases with the number of pseudocomponents, that the injection rate has a significant influence on cumulative oil production, and that the number of pseudocomponents has little influence on the flow rates and cumulative oil production
Abstract:
Oil production and exploration techniques have evolved in recent decades in order to increase fluid flow rates and optimize the use of the required equipment. The Electric Submersible Pumping (ESP) lift method is based on the use of an electric downhole motor to drive a centrifugal pump and transport the fluids to the surface. ESP is an option that has been gaining ground among artificial lift methods due to its ability to handle large liquid flow rates in onshore and offshore environments. The performance of a well equipped with an ESP system is intrinsically related to the operation of the centrifugal pump; it is the pump that has the function of turning the motor power into Head. In the present work, a computer model to analyze the three-dimensional flow in a centrifugal pump used in Electric Submersible Pumping has been developed. Using the commercial program ANSYS® CFX®, and initially water as the working fluid, the geometry and simulation parameters were defined in order to approximate what occurs, in terms of flow, inside the channels of the pump impeller and diffuser. Three different geometry conditions were initially tested to determine which is most suitable for solving the problem. After choosing the most appropriate geometry, three mesh conditions were analyzed, and the obtained values were compared to the experimental Head characteristic curve provided by the manufacturer. The results approached the experimental curve, and the simulation time and model convergence were satisfactory considering that the studied problem involves numerical analysis. After the tests with water, oil was used in the simulations, and the results were compared to a methodology used in the petroleum industry to correct for viscosity.
In general, for the models with water and oil, the single-phase results were consistent with the experimental curves, and these three-dimensional computer models serve as a preliminary evaluation for the analysis of two-phase flow inside the channels of centrifugal pumps used in ESP systems
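The pump's role of converting motor power into Head can be illustrated with the classical Euler head of a centrifugal impeller, the ideal (loss- and slip-free) upper bound that CFD and manufacturer curves are compared against. The stage geometry and operating point below are hypothetical round numbers, not the pump studied in the thesis.

```python
import math

def euler_head(rpm, r2_m, b2_m, beta2_deg, flow_m3s, g=9.81):
    """Ideal (Euler) head of a centrifugal impeller.

    u2   = omega * r2                 blade tip speed
    c_m2 = Q / (2*pi*r2*b2)           meridional velocity at impeller exit
    c_u2 = u2 - c_m2 / tan(beta2)     tangential fluid velocity at exit
    H    = u2 * c_u2 / g              head with no losses or slip
    """
    omega = 2.0 * math.pi * rpm / 60.0
    u2 = omega * r2_m
    c_m2 = flow_m3s / (2.0 * math.pi * r2_m * b2_m)
    c_u2 = u2 - c_m2 / math.tan(math.radians(beta2_deg))
    return u2 * c_u2 / g

# Hypothetical ESP stage: 3500 rpm, 50 mm tip radius, 8 mm exit
# width, 25 degree backswept blade angle, 40 m^3/day of liquid.
H = euler_head(3500, 0.050, 0.008, 25.0, 40.0 / 86400.0)
```

A real stage delivers less than this ideal Head because of slip, friction, and recirculation losses, which is precisely the gap that the three-dimensional CFD model aims to capture.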
Abstract:
The present study provides a methodology that gives a predictive character to computer simulations based on detailed models of the geometry of a porous medium. We use the software FLUENT to investigate the flow of a viscous Newtonian fluid through a random fractal medium that simplifies a two-dimensional disordered porous medium representing a petroleum reservoir. This fractal model is formed by obstacles of various sizes whose size distribution follows a power law, the exponent of which is defined as the fractal dimension of fractionation, Dff, of the model, characterizing the fragmentation process of these obstacles. The obstacles are randomly placed in a rectangular channel. The modeling process incorporates modern concepts, such as scaling laws, to analyze the influence of the heterogeneity found in the porosity and permeability fields, in such a way as to characterize the medium in terms of its fractal properties. This procedure allows us to numerically analyze measurements of the permeability k and the drag coefficient Cd and to propose power-law relationships for these properties under various modeling schemes. The purpose of this research is to study the variability introduced by these heterogeneities: the velocity field and other details of the viscous fluid dynamics are obtained by numerically solving the continuity and Navier-Stokes equations at the pore level, and we observe how the fractal dimension of fractionation of the model affects its hydrodynamic properties. This study considered two classes of models: models with constant porosity, MPC, and models with varying porosity, MPV. The results allowed us to find numerical relationships between the permeability, the drag coefficient, and the fractal dimension of fractionation of the medium. Based on these numerical results, we have proposed scaling relations and algebraic expressions involving the relevant parameters of the phenomenon.
In this study, analytical equations were determined for Dff as a function of the geometrical parameters of the models. We also found a relation between the permeability and the drag coefficient, showing that they are inversely proportional to each other. The difference in behavior is most striking in the MPV class of models: the fact that the porosity varies in these models is an additional factor that plays a significant role in the flow analysis. Finally, the results proved satisfactory and consistent, which demonstrates the effectiveness of the methodology for all the applications analyzed in this study.
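Scaling relations of the kind proposed above (e.g., k or Cd as a power of Dff) are typically extracted by a least-squares fit in log-log space. The sketch below shows that standard procedure on synthetic data; the exponent and prefactor are invented for the example and are not results from the study.

```python
import math

def power_law_fit(x, y):
    """Fit y = c * x**a by ordinary least squares in log-log space.
    Returns (c, a): a is the slope of log y vs log x, c the prefactor."""
    lx = [math.log(v) for v in x]
    ly = [math.log(v) for v in y]
    n = len(x)
    mx = sum(lx) / n
    my = sum(ly) / n
    a = (sum((u - mx) * (v - my) for u, v in zip(lx, ly))
         / sum((u - mx) ** 2 for u in lx))
    c = math.exp(my - a * mx)
    return c, a

# Synthetic data generated from k = 2 * Dff**-1.5, so the fit is exact.
dff = [1.1, 1.3, 1.5, 1.7, 1.9]
k = [2.0 * d ** -1.5 for d in dff]
c, a = power_law_fit(dff, k)
```

On real simulation data the fit is not exact, and the scatter of the points about the fitted line is itself a measure of how well a single power law describes the hydrodynamic property.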
MINING AND VERIFICATION OF TEMPORAL EVENTS WITH APPLICATIONS IN COMPUTER MICRO-ARCHITECTURE RESEARCH
Abstract:
Computer simulation programs are essential tools for scientists and engineers to understand a particular system of interest. As expected, the complexity of the software increases with the depth of the model used. In addition to the exigent demands of software engineering, verification of simulation programs is especially challenging because the models represented are complex and ridden with unknowns that will be discovered by developers in an iterative process. To manage such complexity, advanced verification techniques for continually matching the intended model to the implemented model are necessary. Therefore, the main goal of this research work is to design a useful verification and validation framework that is able to identify model representation errors and is applicable to generic simulators. The framework that was developed and implemented consists of two parts. The first part is First-Order Logic Constraint Specification Language (FOLCSL) that enables users to specify the invariants of a model under consideration. From the first-order logic specification, the FOLCSL translator automatically synthesizes a verification program that reads the event trace generated by a simulator and signals whether all invariants are respected. The second part consists of mining the temporal flow of events using a newly developed representation called State Flow Temporal Analysis Graph (SFTAG). While the first part seeks an assurance of implementation correctness by checking that the model invariants hold, the second part derives an extended model of the implementation and hence enables a deeper understanding of what was implemented. The main application studied in this work is the validation of the timing behavior of micro-architecture simulators. The study includes SFTAGs generated for a wide set of benchmark programs and their analysis using several artificial intelligence algorithms. 
This work improves the computer architecture research and verification processes as shown by the case studies and experiments that have been conducted.
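The kind of invariant checking FOLCSL performs over an event trace can be illustrated with a minimal sketch. The event names and the specific invariant ("every issued instruction is eventually committed") are hypothetical examples, and the plain-Python check below stands in for the verification program that the FOLCSL translator would synthesize from a first-order logic specification.

```python
def check_commit_invariant(trace):
    """Check a simple temporal invariant over an event trace:
    every ('issue', id) event must be followed later in the trace
    by a matching ('commit', id) event.
    Returns the sorted list of violating ids (empty if it holds)."""
    pending = set()
    for kind, eid in trace:
        if kind == "issue":
            pending.add(eid)       # instruction in flight
        elif kind == "commit":
            pending.discard(eid)   # obligation discharged
    return sorted(pending)         # anything left violates the invariant

# A trace that respects the invariant, and one that does not.
good = [("issue", 1), ("issue", 2), ("commit", 1), ("commit", 2)]
bad = [("issue", 1), ("commit", 1), ("issue", 2)]
violations = check_commit_invariant(bad)   # [2]
```

A tool in the spirit of the framework would generate such checkers automatically from declarative specifications and run them over the simulator's emitted trace, flagging any model representation errors the invariants expose.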