945 results for Continuous steam injection and reservoir simulation


Relevance: 100.00%

Abstract:

Recent technological developments have made it possible to design various microdevices involving fluid flow and heat transfer. The proper design of such systems requires that the governing physics be investigated. Because complex geometries at micro scales are difficult to study with experimental techniques, computational tools are developed to analyze and simulate flow and heat transfer in microgeometries. However, conventional numerical methods based on the Navier-Stokes equations fail to predict certain aspects of microflows, such as the nonlinear pressure distribution, increased mass flow rate, slip flow, and temperature jump at solid boundaries. This necessitates new computational methods, grounded in kinetic theory, that are both accurate and computationally efficient. In this study, the lattice Boltzmann method (LBM) was used to investigate flow and heat transfer in micro-sized geometries. The LBM is based on the Boltzmann equation, which remains valid over the whole range of rarefaction observed in microflows. Results were obtained for isothermal channel flows at Knudsen numbers above 0.01 at different pressure ratios. LBM solutions for micro-Couette and micro-Poiseuille flow agreed well, in pressure distribution and velocity field, with analytical solutions valid in the slip flow regime (0.01 < Kn < 0.1) and with direct simulation Monte Carlo solutions valid in the transition regime (0.1 < Kn < 10). The isothermal LBM was then extended to flows involving heat transfer. The method was first validated for continuum channel flows with and without constrictions by comparing the thermal LBM results against accurate analytical and finite element solutions. Finally, the thermal LBM was extended to account for rarefaction and used to analyze the behavior of gas flow in microchannels.
The major finding of this research is that the newly developed particle-based method described here can serve as an alternative numerical tool for studying non-continuum effects observed in micro-electro-mechanical systems (MEMS).
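The collide-stream-bounce-back cycle at the heart of the LBM can be sketched for the Couette case. The following minimal isothermal D2Q9 BGK code is an illustrative sketch, not the thesis's implementation, and omits the rarefaction extensions the thesis develops; it reproduces the linear Couette velocity profile between a resting and a moving wall:

```python
import numpy as np

# D2Q9 lattice: discrete velocities, weights, and opposite directions
c = np.array([[0,0],[1,0],[0,1],[-1,0],[0,-1],[1,1],[-1,1],[-1,-1],[1,-1]])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)
opp = np.array([0, 3, 4, 1, 2, 7, 8, 5, 6])

def equilibrium(rho, ux, uy):
    """Second-order expansion of the Maxwell-Boltzmann equilibrium."""
    feq = np.empty((9,) + rho.shape)
    usq = ux**2 + uy**2
    for i in range(9):
        cu = c[i, 0]*ux + c[i, 1]*uy
        feq[i] = w[i] * rho * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)
    return feq

def couette(ny=16, nx=4, u_wall=0.1, tau=0.8, steps=5000):
    """BGK-LBM Couette flow: bottom wall at rest, top wall moving at u_wall."""
    rho = np.ones((ny, nx))
    ux = np.zeros((ny, nx)); uy = np.zeros((ny, nx))
    f = equilibrium(rho, ux, uy)
    for _ in range(steps):
        rho = f.sum(axis=0)
        ux = np.einsum('i,iyx->yx', c[:, 0], f) / rho
        uy = np.einsum('i,iyx->yx', c[:, 1], f) / rho
        # collision: BGK relaxation toward local equilibrium
        fpost = f + (equilibrium(rho, ux, uy) - f) / tau
        # streaming: shift each population along its lattice velocity
        f = np.array([np.roll(np.roll(fpost[i], c[i, 0], axis=1), c[i, 1], axis=0)
                      for i in range(9)])
        # halfway bounce-back at the resting bottom wall (row 0)
        for i in (2, 5, 6):                  # directions entering from below
            f[i][0, :] = fpost[opp[i]][0, :]
        # halfway bounce-back at the moving top wall (row ny-1)
        for i in (4, 7, 8):                  # directions entering from above
            j = opp[i]                       # direction that left toward the wall
            f[i][-1, :] = fpost[j][-1, :] - 6*w[j]*rho[-1, :]*c[j, 0]*u_wall
    rho = f.sum(axis=0)
    return np.einsum('i,iyx->yx', c[:, 0], f) / rho  # ux field
```

With halfway bounce-back, the walls sit half a lattice spacing outside the boundary rows, so the converged profile is `u(j) = u_wall * (j + 0.5) / ny`.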

Parallel processing is prevalent in many manufacturing and service systems. Many manufactured products are built and assembled from several components fabricated on parallel lines. An example of this configuration is observed at a manufacturing facility equipped to assemble and test web servers. Characteristics of a typical web server assembly line are multiple products, job circulation, and parallel processing. The primary objective of this research was to develop analytical approximations to predict performance measures of manufacturing systems with job failures and parallel processing. The analytical formulations extend previous queueing models used in assembly manufacturing systems in that they can handle serial and various parallel-processing configurations with multiple product classes and job circulation due to random part failures. In addition, correction terms obtained via regression analysis were added to the approximations to reduce the error between the analytical approximations and the simulation models. Markovian and general manufacturing systems were studied, with multiple product classes, job circulation due to failures, and fork-join systems modeling parallel processing. In both the Markovian and general cases, the approximations without correction terms performed quite well for one- and two-product problem instances. However, the flow time error increased as the number of products and the net traffic intensity increased. Therefore, correction terms for single and fork-join stations were developed via regression analysis to handle more than two products. Numerical comparisons showed that the approximations perform remarkably well when the correction factors are used. On average, the flow time error was reduced from 38.19% to 5.59% in the Markovian case and from 26.39% to 7.23% in the general case.
All the equations in the analytical formulations were implemented as a set of MATLAB scripts. With this set, operations managers of web server assembly lines, or of manufacturing and service systems with similar characteristics, can estimate system performance measures and make judicious decisions, especially in setting delivery due dates, capacity planning, and bottleneck mitigation.
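As a generic illustration of the decomposition-style approximations involved (a hypothetical sketch, not the dissertation's actual formulations), a single station with job recirculation due to failures can be approximated with M/M/1 formulas, inflating the arrival rate by the expected number of revisits:

```python
def station_flow_time(lam, mu, p_fail):
    """Approximate expected flow time at a single M/M/1 station where each
    job fails with probability p_fail and recirculates through the station.

    lam: external arrival rate; mu: service rate; p_fail: failure probability.
    The effective arrival rate is lam / (1 - p_fail), each visit has M/M/1
    sojourn time 1 / (mu - lam_eff), and a job makes a geometric number of
    visits with mean 1 / (1 - p_fail)."""
    lam_eff = lam / (1.0 - p_fail)
    if lam_eff >= mu:
        raise ValueError("unstable station: effective utilisation >= 1")
    w_visit = 1.0 / (mu - lam_eff)      # sojourn time per visit
    visits = 1.0 / (1.0 - p_fail)       # mean number of passes through station
    return visits * w_visit
```

With `p_fail = 0`, this reduces to the textbook M/M/1 flow time `1 / (mu - lam)`; regression-based correction terms of the kind described above would then be fitted to the gap between such approximations and simulation.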

An area of about 22,000 km² on the northern Blake Plateau, off the coast of South Carolina, contains an estimated 2 billion metric tons of phosphorite concretions and about 1.2 billion metric tons of mixed ferromanganese-phosphorite pavement. Other offshore phosphorites occur between the Blake Plateau and known continental deposits, buried under variable thicknesses of sediment. The phosphorite resembles other marine phosphorites in composition, consisting primarily of carbonate-fluorapatite with some calcite, minor quartz, and other minerals. The apatite is optically pseudo-isotropic and contains about 6% CO3(2-) substituting for PO4(3-) in its structure. JOIDES drilling and other evidence show that the phosphorite is a lag deposit derived from Miocene strata correlatable with phosphatic Middle Tertiary sediments on the continent. It has undergone variable cycles of erosion, reworking, partial dissolution, and reprecipitation. Its present form varies from phosphatized carbonate debris, loose pellets, and pebbles to continuous pavements, plates, and conglomeratic boulders weighing hundreds of kilograms. No primary phosphatization is currently taking place on the Blake Plateau. The primary phosphate-depositing environment involved reducing conditions and required at least the temporary absence of the powerful Gulf Stream current that now sweeps the bottom of the Blake Plateau and has eroded away the bulk of the Hawthorne-equivalent sediments with which the phosphorites were once associated.

The object of this study is the construction of metaphor and metonymy in comics. The work is situated within Embodied Cognitive Linguistics, specifically the Neural Theory of Language (FELDMAN, 2006); consistent with this theoretical and methodological framework, it also draws on the notions of categorization (LAKOFF & JOHNSON, 1999), embodiment (GIBBS, 2005), figurativity (GIBBS, 1994; BERGEN, 2005), and mental simulation (BARSALOU, 1999; FELDMAN, 2006). The hypothesis defended is that the construction of figurativity in texts composed of verbal and nonverbal mechanisms is linked to the activation of neural structures related to our actions and perceptions. Language is thus considered a cognitive faculty connected to the brain apparatus and to bodily experiences, in such a way that it provides samples of the continuous process of meaning (re)construction performed by the reader, who (re)defines his or her views of the world as certain neural networks are (or stop being) activated during linguistic processing. The data obtained in the analysis show that, in comics, reading the graphic and verbal languages together appears to play an important role in the construction of figurativity, including cases of metaphors that are metonymically motivated. These preliminary conclusions were drawn from the analysis of data taken from V de Vingança (MOORE; LLOYD, 2006). The corpus study was guided by the methodology of introspection, i.e., the individual analysis of linguistic aspects as manifested in one's own cognition (TALMY, 2005).

Understanding the occurrence and flow of groundwater in the subsurface, together with the associated hydrogeological context, is fundamental to water exploitation. In sedimentary aquifers, these factors are primarily controlled by the geometry of the pore system. Thus, microstructural characterization, including the interconnectivity of the pore system, is essential for determining the macroscopic properties of porosity and permeability of the reservoir rock, and can be approached through statistical characterization based on two-dimensional analysis. This analysis is performed on a computing platform using images of thin sections of reservoir rock, allowing prediction of effective porosity and hydraulic conductivity. For the Barreiras Aquifer, such parameters are usually derived from the interpretation of aquifer tests, a practice that typically involves complex logistics in terms of equipment and personnel, in addition to high operating costs. Digital image analysis and processing is therefore presented as a practical, inexpensive alternative for characterizing hydraulic parameters. The methodology follows a workflow of sampling, preparation of thin sections and their respective images, segmentation, geometric characterization, three-dimensional reconstruction, and flow simulation. In this research, computational image analysis of thin sections has shown aquifer storage coefficients ranging from 0.035 to 0.12, with an average of 0.076, while the hydrogeological substrate (associated with the top of the carbonate sequence, which does not outcrop in the region) presents effective porosities on the order of 2%.
Regarding the flow regime, the methodology yields hydraulic conductivity values below those reported in the literature, with a mean of 1.04 × 10^-6 m/s and fluctuations between 3.61 × 10^-8 m/s and 2.94 × 10^-6 m/s, probably due to the larger scale of the field studies and the heterogeneity of the medium studied.
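The image-analysis step can be illustrated with a minimal sketch: porosity as the pore-pixel fraction of a segmented thin-section image, paired with a classical Kozeny-Carman relation standing in for the study's flow-simulation step. Both functions and their parameters are illustrative assumptions, not the thesis's actual pipeline:

```python
import numpy as np

def effective_porosity(binary_image):
    """Estimate 2-D porosity as the areal fraction of pore pixels.
    binary_image: 2-D array where 1 marks pore space and 0 marks grains."""
    return float(np.asarray(binary_image).mean())

def kozeny_carman_k(phi, s, c=5.0):
    """Classical Kozeny-Carman permeability estimate
    k = phi^3 / (c * s^2 * (1 - phi)^2),
    where phi is porosity, s the specific surface area of the pore-grain
    interface, and c an empirical shape constant (often taken as ~5)."""
    return phi**3 / (c * s**2 * (1.0 - phi)**2)
```

In practice the binary image comes from segmenting a micrograph of the thin section; three-dimensional reconstruction and direct flow simulation, as described above, replace the simple closed-form estimate used here.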

The pre-salt province contains large volumes of light, good-quality oil, a reality that places Brazil in a strategic position in the face of the great worldwide demand for energy. The province hosts the largest discoveries in the world of the last ten years, including the Libra, Franco, and Lula fields, each containing more than 8 billion barrels of recoverable oil. To develop and optimize the production of these fields, a study was conducted to select improved oil recovery methods. The main motivations were the presence of carbon dioxide (CO2) as a contaminant, the strategic decision not to discard it, and the high GOR (gas-oil ratio) of the reservoir fluid. The method should take advantage of the uniquely abundant resources available: seawater and produced gas. Combining these resources in water-alternating-gas (WAG) injection thus became a good option. In this master's dissertation, a reservoir model with average characteristics of the Brazilian pre-salt was developed, to which the water-alternating-gas method was applied. The production of this reservoir was analyzed with respect to parameters such as the fluid injected first, the position of the injection well completions, the water and gas injection rates, and the cycle time. The results showed good performance of the method, with gains of up to 26% in recovery factor relative to primary recovery, whereas water injection and gas injection applied individually did not exceed a 10% gain. The most influential parameter was the cycle time, with higher recovery factors obtained with shorter cycles.

The successful performance of a hydrological model is usually challenged by the quality of the sensitivity analysis, calibration, and uncertainty analysis carried out in the modeling exercise and the subsequent simulation results. This is especially important under changing climatic conditions, where additional uncertainties associated with climate models and downscaling processes increase the complexity of the hydrological modeling system. In response to these challenges, and to improve the performance of hydrological models under changing climatic conditions, this research proposed five new methods for supporting hydrological modeling. First, a design-of-experiment-aided sensitivity analysis and parameterization (DOE-SAP) method was proposed to identify the significant parameters and provide more reliable sensitivity analysis for improving parameterization in hydrological modeling. In the case study, better calibration results were achieved, along with a more advanced sensitivity analysis of the significant parameters and their interactions. Second, a comprehensive uncertainty evaluation scheme was developed to evaluate three uncertainty analysis methods: sequential uncertainty fitting version 2 (SUFI-2), generalized likelihood uncertainty estimation (GLUE), and parameter solution (ParaSol). The results showed that SUFI-2 performed better than the other two methods in terms of calibration and uncertainty analysis results, and the proposed evaluation scheme was shown to be capable of selecting the most suitable uncertainty method for a case study. Third, a novel sequential multi-criteria-based calibration and uncertainty analysis (SMC-CUA) method was proposed to improve the efficiency of calibration and uncertainty analysis and to control the phenomenon of equifinality.
The results showed that the SMC-CUA method provided better uncertainty analysis results with higher computational efficiency than the SUFI-2 and GLUE methods, and controlled parameter uncertainty and the equifinality effect without sacrificing simulation performance. Fourth, an innovative response-based statistical evaluation method (RESEM) was proposed for estimating the effects of propagated uncertainty and providing long-term predictions of hydrological responses under changing climatic conditions. Using RESEM, the uncertainty propagated from statistical downscaling to hydrological modeling can be evaluated. Fifth, an integrated simulation-based evaluation system for uncertainty propagation analysis (ISES-UPA) was proposed for investigating the effects and contributions of different uncertainty components to the total uncertainty propagated from statistical downscaling. Using ISES-UPA, the uncertainty from statistical downscaling, the uncertainty from hydrological modeling, and the total uncertainty from the two sources can be compared and quantified. The feasibility of all the methods has been tested using hypothetical and real-world case studies. The proposed methods can also be integrated as a hydrological modeling system to better support hydrological studies under changing climatic conditions, and the results can serve as scientific references for decision makers seeking to reduce the potential risk of damage caused by extreme events in long-term water resource management and planning.
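Of the uncertainty methods compared, GLUE is the simplest to outline. The sketch below is a generic Monte Carlo implementation with a Nash-Sutcliffe-style likelihood, not the study's code: parameter sets are sampled from a prior, scored against observations, and only "behavioural" sets above a likelihood threshold are retained to form a weighted predictive ensemble:

```python
import numpy as np

def glue(model, observed, prior_sampler, n_samples=2000, threshold=0.5, seed=0):
    """Minimal GLUE sketch.

    model: callable mapping a parameter set to a simulated series.
    observed: observed series (array-like).
    prior_sampler: callable rng -> parameter set, sampling the prior.
    Returns the behavioural parameter sets and the likelihood-weighted
    mean prediction. Assumes at least one behavioural set is found."""
    rng = np.random.default_rng(seed)
    obs = np.asarray(observed, float)
    var_obs = obs.var()
    kept_params, kept_sims, weights = [], [], []
    for _ in range(n_samples):
        theta = prior_sampler(rng)
        sim = model(theta)
        # Nash-Sutcliffe efficiency as an informal likelihood measure
        nse = 1.0 - np.mean((sim - obs) ** 2) / var_obs
        if nse > threshold:                 # behavioural set
            kept_params.append(theta)
            kept_sims.append(sim)
            weights.append(nse)
    sims = np.array(kept_sims)
    wts = np.array(weights)
    wts /= wts.sum()
    mean_pred = (wts[:, None] * sims).sum(axis=0)   # weighted ensemble mean
    return kept_params, mean_pred
```

The spread of the behavioural ensemble, rather than a single best fit, is what GLUE reports as predictive uncertainty; equifinality shows up as many distinct parameter sets achieving similar likelihoods.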

The organisational decision-making environment is complex, and decision makers must deal with uncertainty and ambiguity on a continuous basis. Managing decision problems and implementing a solution requires an understanding of the complexity of the decision domain, to the point where the problem and its complexity, as well as the requirements for supporting decision makers, can be described. Research in the Decision Support Systems domain has been extensive over the last thirty years, with an emphasis on the development of further technology and better applications on the one hand, and, on the other, a social approach focused on understanding what decision making is about and how developers and users should interact. This research project takes a combined approach that endeavours to understand the thinking behind managers' decision making, as well as their informational and decisional guidance and decision support requirements. It utilises a cognitive framework, developed in 1985 by Humphreys and Berkeley, that juxtaposes the mental processes and ideas of decision problem definition and problem solution, developed in tandem through cognitive refinement of the problem based on the analysis and judgement of the decision maker. The framework separates what is essentially a continuous process into five distinct levels of abstraction of managers' thinking and suggests a structure for the underlying cognitive activities. Alter (2004) argues that decision support provides a richer basis than decision support systems for both practice and research. The literature on decision support, especially regarding modern high-profile systems such as Business Intelligence and Business Analytics, can give the impression that all 'smart' organisations use decision support and data analytics capabilities for all of their key decision-making activities. However, this empirical investigation indicates a very different reality.

With the increasing prevalence and capabilities of autonomous systems in complex heterogeneous manned-unmanned environments (HMUEs), an important consideration is the impact of automation on the optimal assignment of human personnel. The US Navy implemented optimal staffing techniques in the 1990s and 2000s with a "minimal staffing" approach. The results were poor, leading to a degradation of naval preparedness. Clearly, another approach to determining optimal staffing is necessary. To this end, the goal of this research is to develop human performance models for use in determining the optimal manning of HMUEs. The human performance models are developed using an agent-based simulation of the aircraft carrier flight deck, a representative safety-critical HMUE. The Personnel Multi-Agent Safety and Control Simulation (PMASCS) simulates and analyzes the effects of introducing generalized maintenance-crew skill sets and accelerated failure repair times on the overall performance and safety of the carrier flight deck. A behavioral model of the operator types (ordnance officers, chocks and chains, fueling officers, plane captains, and maintenance operators) is presented here, along with an aircraft failure model. The main focus of this work is on the maintenance operators and aircraft failure modeling, since they have a direct impact on total launch time, a primary metric of carrier deck performance. With PMASCS, I explore the effects of two variables on the total launch time of 22 aircraft: (1) the skill level of maintenance operators and (2) aircraft failure repair times while on the catapult (Phase 4 repair times). It is found that neither introducing a generic skill set for maintenance crews nor introducing a technology that accelerates Phase 4 repair times improves the average total launch time of 22 aircraft.
An optimal manning level of 3 maintenance crews is found under all conditions, the point beyond which additional maintenance crews do not reduce the total launch time. A further discussion addresses how these results change if operations are relieved of the bottleneck of installing the holdback bar at launch time.
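The diminishing-returns behaviour behind such an optimal crew count can be illustrated with a toy event-driven sketch. All parameters and logic here are hypothetical and far simpler than PMASCS; the point is only that once every failed aircraft finds a free crew immediately, extra crews stop reducing the last launch time:

```python
import heapq
import random

def total_launch_time(n_crews, n_aircraft=22, p_fail=0.3, prep=2.0,
                      repair=10.0, launch_interval=1.0, seed=1):
    """Toy sketch: aircraft become ready at a fixed cadence; a failed
    aircraft must wait for one of n_crews maintenance crews (identical
    repair durations) before it can launch. Returns the time at which
    the last aircraft launches."""
    random.seed(seed)
    crew_free = [0.0] * n_crews          # earliest time each crew is available
    heapq.heapify(crew_free)
    last_launch = 0.0
    for k in range(n_aircraft):
        ready = k * launch_interval + prep
        if random.random() < p_fail:     # aircraft fails and needs repair
            start = max(ready, heapq.heappop(crew_free))
            done = start + repair
            heapq.heappush(crew_free, done)
            ready = done
        last_launch = max(last_launch, ready)
    return last_launch
```

Sweeping `n_crews` upward, the last launch time decreases and then plateaus, mirroring the finding that a fixed number of crews is optimal under all conditions.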

Along with increasing oceanic CO2 concentrations, enhanced stratification confines phytoplankton to shallower upper mixed layers with altered light regimes and nutrient concentrations. Here, we investigate the effects of elevated pCO2 in combination with light or nitrogen limitation on 13C fractionation (εp) in four dinoflagellate species. We cultured Gonyaulax spinifera and Protoceratium reticulatum in dilute batches under low-light (LL) and high-light (HL) conditions, and grew Alexandrium fundyense and Scrippsiella trochoidea in nitrogen-limited continuous cultures (LN) and nitrogen-replete batches (HN). The observed CO2 dependency of εp remained unaffected by light availability for both G. spinifera and P. reticulatum, though at HL εp was consistently lower by about 2.7 per mil over the tested CO2 range for P. reticulatum. This may reflect increased uptake of (13C-enriched) bicarbonate fueled by increased ATP production under HL conditions. The observed CO2 dependency of εp disappeared under LN conditions in both A. fundyense and S. trochoidea. The generally higher εp under LN may be associated with lower organic carbon production rates and/or higher ATP:NADPH ratios. CO2-dependent εp under non-limiting conditions has been observed in several dinoflagellate species, showing potential for a new CO2 proxy. Our results, however, demonstrate that light and nitrogen limitation also affect εp, illustrating the need to carefully consider prevailing environmental conditions.

Two direct-sampling correlator-type receivers for differential chaos shift keying (DCSK) communication systems under frequency non-selective fading channels are proposed. The receivers operate on the same hardware platform but differ in architecture. In the first scheme, the sum-delay-sum (SDS) receiver, the sum of all samples in a chip period is correlated with its delayed version, and the correlation value accumulated over each bit period is compared with a fixed threshold to decide the binary value of the recovered bit. In the second scheme, the delay-sum-sum (DSS) receiver, the correlation of all samples with their delayed versions is computed in each chip period, and the sum of the correlation values over each bit period is compared with the threshold to recover the data. The conventional DCSK transmitter, the frequency non-selective Rayleigh fading channel, and the two proposed receivers are modelled mathematically in the discrete-time domain. The bit error rate performance of the receivers is evaluated through both theoretical analysis and numerical simulation. The comparison shows that both proposed receivers perform well over the studied channel; performance improves as the number of paths increases, and the DSS receiver outperforms the SDS receiver.
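Both receivers build on the basic DCSK correlation decision. The sketch below is a baseband, ideally sampled illustration of that principle only (using a Chebyshev map as the chaotic generator, a common but assumed choice), not the proposed direct-sampling SDS/DSS architectures or the fading channel model:

```python
import numpy as np

def chaotic_sequence(beta, x0):
    """Chebyshev-map chaotic chips: x_{n+1} = 1 - 2*x_n^2 on [-1, 1]."""
    seq = np.empty(beta)
    x = x0
    for n in range(beta):
        x = 1.0 - 2.0 * x * x
        seq[n] = x
    return seq

def dcsk_modulate(bits, beta, rng):
    """DCSK: per bit, transmit beta reference chips followed by the same
    chips multiplied by +1 (bit 1) or -1 (bit 0)."""
    frames = []
    for b in bits:
        ref = chaotic_sequence(beta, rng.uniform(-1.0, 1.0))
        frames.append(np.concatenate([ref, (1.0 if b else -1.0) * ref]))
    return np.concatenate(frames)

def dcsk_demodulate(signal, beta):
    """Correlate each frame's data half with its (delayed) reference half
    and threshold the correlation at zero to recover the bit."""
    bits = []
    for k in range(len(signal) // (2 * beta)):
        frame = signal[k * 2 * beta:(k + 1) * 2 * beta]
        corr = float(np.dot(frame[:beta], frame[beta:]))
        bits.append(1 if corr > 0 else 0)
    return bits
```

In the proposed receivers, each chip is additionally oversampled; the SDS scheme sums the samples of a chip before correlating, while the DSS scheme correlates sample-wise and then sums, which is where their performance difference arises.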

Different types of serious games have been used to teach computer science topics, including computer games, mobile games, Lego-based games, virtual worlds, and web-based games, and different evaluation techniques have been applied, such as questionnaires, interviews, discussions, and tests. Simulation has been widely used in computer science as a motivational and interactive learning tool. This paper aims to evaluate the possibility of successfully implementing simulation in computer programming modules. A framework is proposed to measure the impact of serious games on enhancing students' understanding of key computer science concepts. Experiments will be conducted with students of the EEECS at Queen's University Belfast to test the framework and obtain results.

In recent years, the adaptation of Wireless Sensor Networks (WSNs) to application areas requiring mobility has increased the security threats against the confidentiality, integrity, and privacy of information, as well as against network connectivity. Since key management plays an important role in securing both information and connectivity, a proper authentication and key management scheme is required in mobility-enabled applications, where the authentication of a node with the network is a critical issue. In this paper, we present an authentication and key management scheme supporting node mobility in a heterogeneous WSN consisting of several low-capability sensor nodes and a few high-capability sensor nodes. We analyze the proposed solution analytically using MATLAB and by simulation in OMNeT++, showing that it requires less memory and provides better network connectivity and resilience against attacks than some existing schemes. We also propose two levels of authentication methods for mobile sensor nodes for secure authentication and key establishment.
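As a generic illustration of symmetric-key node authentication and session-key establishment (an assumed textbook-style exchange, not the scheme proposed in the paper), a challenge-response between a mobile node and a high-capability node might look like:

```python
import hashlib
import hmac
import os

def node_response(node_key, node_id, challenge):
    """Mobile node side: prove possession of the pre-shared key by
    computing an HMAC over its identity and the fresh challenge."""
    return hmac.new(node_key, node_id + challenge, hashlib.sha256).digest()

def head_verify(stored_key, node_id, challenge, response):
    """High-capability node side: recompute the MAC for the claimed
    identity and compare in constant time to resist timing attacks."""
    expected = hmac.new(stored_key, node_id + challenge,
                        hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

def session_key(shared_key, challenge):
    """Derive a pairwise session key bound to this exchange, so each
    re-authentication after node movement yields a fresh key."""
    return hashlib.sha256(shared_key + challenge + b"wsn-session").digest()
```

A typical flow: the high-capability node sends a random challenge (e.g. `os.urandom(16)`), verifies the mobile node's response, and both sides derive the same session key; a full scheme would add mutual authentication, key pre-distribution, and revocation, as addressed in the paper.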

Despite its long record of successful use in human vaccines, the mechanisms underlying the immunomodulatory effects of alum are not fully understood. Alum is a potent inducer of interleukin-1 (IL-1) secretion in vitro in dendritic cells and macrophages via Nucleotide-binding domain and leucine-rich repeat-containing (NLR) family, pyrin domain-containing 3 (NLRP3) inflammasome activation. However, the contribution of IL-1 to alum-induced innate and adaptive immune responses is controversial and the role of IL-1α following alum injection has not been addressed. This study shows that IL-1 is dispensable for alum-induced antibody and CD8 T cell responses to ovalbumin. However, IL-1 is essential for neutrophil infiltration into the injection site, while recruitment of inflammatory monocytes and eosinophils is IL-1 independent. Both IL-1α and IL-1β are released at the site of injection and contribute to the neutrophil response. Surprisingly, these effects are NLRP3-inflammasome independent as is the infiltration of other cell populations. However, while NLRP3 and caspase 1 were dispensable, alum-induced IL-1β at the injection site was dependent on the cysteine protease cathepsin S. Overall, these data demonstrate a previously unreported role for cathepsin S in IL-1β secretion, show that inflammasome formation is dispensable for alum-induced innate immunity and reveal that IL-1α and IL-1β are both necessary for alum-induced neutrophil influx in vivo.