927 results for Continuous steam injection and reservoir simulation
Abstract:
A crosswell data set contains a range of angles limited only by the geometry of the source and receiver configuration, the separation of the boreholes and the depth to the target. However, the wide-angle reflections present in crosswell imaging result in amplitude-versus-angle (AVA) features not usually observed in surface data. These features include reflections from angles that are near critical and beyond critical for many of the interfaces; some of these reflections are visible only over a small range of angles, presumably near their critical angle. High-resolution crosswell seismic surveys were conducted over a Silurian (Niagaran) reef at two fields in northern Michigan, Springdale and Coldspring. The Springdale wells extended to much greater depths than the reef, so imaging was conducted from both above and beneath the reef. Combining the images obtained from above with those obtained from beneath provides additional information: first, by exhibiting ranges of angles that differ between the two images, especially for reflectors at shallow depths, and second, by providing additional constraints on the solutions of the Zoeppritz equations. Inversion of seismic data for impedance has become a standard part of the workflow for quantitative reservoir characterization. Inversion of crosswell data using either deterministic or geostatistical methods can, however, give poor results because of the phase change beyond the critical angle; simultaneous pre-stack inversion of partial angle stacks may therefore be best conducted with angles restricted to less than critical. Deterministic inversion is designed to yield only a single best-fit model of elastic properties, while geostatistical inversion produces multiple models (realizations) of elastic properties, lithology and reservoir properties. Geostatistical inversion produces results with far more detail than deterministic inversion.
The difference in detail between the two types of inversion becomes increasingly pronounced for thinner reservoirs, particularly those beyond the vertical resolution of the seismic data. For any interface imaged from both above and beneath, the resulting AVA character must arise from identical contrasts in elastic properties in the two sets of images, albeit in reverse order. An inversion approach that handles both data sets simultaneously, at pre-critical angles, is demonstrated in this work. The main exploration problem for carbonate reefs is determining the porosity distribution. Images of elastic properties, obtained from deterministic and geostatistical simultaneous inversion of a high-resolution crosswell seismic survey, were used to characterise the internal structure and reservoir properties (porosity) of a Niagaran reef in Michigan. The images obtained are the best of any Niagaran pinnacle reef to date.
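The restriction to sub-critical angles mentioned above can be illustrated with a short sketch: the critical angle follows from Snell's law, and below it the angle-dependent P-wave reflection coefficient is commonly approximated by the Aki–Richards linearisation of the Zoeppritz equations. The velocities and densities here are hypothetical placeholders, not values from the Springdale or Coldspring surveys.

```python
import numpy as np

def critical_angle(v1, v2):
    """Critical incidence angle (degrees) for a velocity increase v1 -> v2."""
    if v2 <= v1:
        return None  # no critical angle when velocity decreases across the interface
    return np.degrees(np.arcsin(v1 / v2))

def aki_richards_rpp(theta_deg, vp1, vs1, rho1, vp2, vs2, rho2):
    """Approximate P-wave reflection coefficient vs angle (Aki-Richards),
    valid only for angles below critical."""
    theta = np.radians(theta_deg)
    vp, vs, rho = (vp1 + vp2) / 2, (vs1 + vs2) / 2, (rho1 + rho2) / 2
    dvp, dvs, drho = vp2 - vp1, vs2 - vs1, rho2 - rho1
    k = (vs / vp) ** 2
    a = 0.5 * (1 + np.tan(theta) ** 2) * dvp / vp
    b = -4 * k * np.sin(theta) ** 2 * dvs / vs
    c = 0.5 * (1 - 4 * k * np.sin(theta) ** 2) * drho / rho
    return a + b + c

# Hypothetical interface: 3000 m/s over 4500 m/s
print(critical_angle(3000.0, 4500.0))  # ~41.8 degrees
print(aki_richards_rpp(20.0, 3000, 1500, 2.3, 4500, 2400, 2.6))
```

Beyond the critical angle returned by `critical_angle`, the reflection becomes complex-valued (a phase rotation), which is exactly why the abstract recommends restricting simultaneous pre-stack inversion to smaller angles.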
Abstract:
Developing an effective impact evaluation framework, managing and conducting rigorous impact evaluations, and developing a strong research and evaluation culture within development communication organisations present many challenges. This is especially so when both the community and the organisational context are continually changing and the outcomes of programs are complex and difficult to identify clearly.

This paper presents a case study from a research project, conducted from 2007-2010, that aims to address these challenges and issues, entitled Assessing Communication for Social Change: A New Agenda in Impact Assessment. Building on previous development communication projects which used ethnographic action research, this project is developing, trialling and rigorously evaluating a participatory impact assessment methodology for assessing the social change impacts of community radio programs in Nepal. This project is a collaboration between Equal Access – Nepal (EAN), Equal Access – International, local stakeholders and listeners, a network of trained community researchers, and a research team from two Australian universities. A key element of the project is the establishment of an organisational culture within EAN that values and supports the impact assessment process being developed, which is based on continuous action learning and improvement. The paper describes the situation related to monitoring and evaluation (M&E) and impact assessment before the project began, in which EAN was often reliant on time-bound studies and ‘success stories’ derived from listener letters and feedback. We then outline the various strategies used in an effort to develop stronger and more effective impact assessment and M&E systems, and the gradual changes that have occurred to date. These changes include a greater understanding of the value of adopting a participatory, holistic, evidence-based approach to impact assessment.
We also critically review the many challenges experienced in this process, including:

• Tension between the pressure from donors to ‘prove’ impacts and the adoption of a bottom-up, participatory approach based on ‘improving’ programs in ways that meet community needs and aspirations.
• Resistance from the content teams to changing their existing M&E practices and to the perceived complexity of the approach.
• Lack of meaningful connection between the M&E and content teams.
• Human resource problems and lack of capacity in analysing qualitative data and reporting results.
• Contextual challenges, including extreme poverty, wide cultural and linguistic diversity, poor transport and communications infrastructure, and political instability.
• A general lack of acceptance of the importance of evaluation within Nepal, where problems are often accepted as fate or ‘natural’ rather than investigated.
Abstract:
Information uncertainty, which is inherent in many real-world applications, adds complexity to the visualisation problem. Despite the increasing number of research papers in the literature, much more work is needed. The aims of this chapter are threefold: (1) to provide a comprehensive analysis of the requirements of visualising information uncertainty and their dimensions of complexity; (2) to review and assess current progress; and (3) to discuss remaining research challenges. We focus on four areas: information uncertainty modelling; visualisation techniques; management of information uncertainty modelling, propagation and visualisation; and the uptake of uncertainty visualisation in application domains.
Abstract:
3D Virtual Environments (VEs) are real: they exist as digital worlds, with the advantage of having none of the constraints of the real world. As such they are the perfect training ground for design students, who can create, build and experiment with design solutions without the constraints of real-world projects. This paper reports on an educational setting used to explore a model for using VEs such as Second Life (SL), developed by Linden Labs in California, as a collaborative environment for design education. A postgraduate landscape architecture learning environment within a collaborative design unit was developed to integrate this model, where the primary focus was the application of three-dimensional tools within design, not as a presentation tool but as a design tool. The focus of the unit and its aims and objectives are outlined before describing the use of SL in the unit. Attention is focused on the collaboration and learning experience before discussing the outcomes, student feedback, future projects using this model and the potential for further research. This study aims to contribute to current research on teaching and learning design in interactive VEs. We present a case study of our first application of this model.
Abstract:
Established Monte Carlo user codes BEAMnrc and DOSXYZnrc permit the accurate and straightforward simulation of radiotherapy experiments and treatments delivered from multiple beam angles. However, when an electronic portal imaging detector (EPID) is included in these simulations, treatment delivery from non-zero beam angles becomes problematic. This study introduces CTCombine, a purpose-built code for rotating selected CT data volumes, converting CT numbers to mass densities, combining the results with model EPIDs and writing output in a form which can easily be read and used by the dose calculation code DOSXYZnrc. The geometric and dosimetric accuracy of CTCombine’s output has been assessed by simulating simple and complex treatments applied to a rotated planar phantom and a rotated humanoid phantom and comparing the resulting virtual EPID images with the images acquired using experimental measurements and independent simulations of equivalent phantoms. It is expected that CTCombine will be useful for Monte Carlo studies of EPID dosimetry as well as other EPID imaging applications.
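Two of the operations the abstract attributes to CTCombine, rotating a CT volume and converting CT numbers to mass densities, can be sketched as follows. This is an illustrative stand-in, not CTCombine's actual calibration or rotation code: the two-segment HU-to-density curve and the restriction to cardinal (90-degree) rotations are assumptions made for the example.

```python
import numpy as np

def hu_to_density(hu):
    """Piecewise-linear CT-number -> mass-density conversion (illustrative
    two-segment curve; real calibration curves are scanner-specific)."""
    hu = np.asarray(hu, dtype=float)
    # Below 0 HU: interpolate between air (-1000 HU, ~0 g/cm^3) and water (0 HU, 1 g/cm^3)
    low = np.clip(1.0 + hu / 1000.0, 0.0, None)
    # Above 0 HU: roughly +0.0005 g/cm^3 per HU toward bone
    high = 1.0 + 0.0005 * hu
    return np.where(hu < 0, low, high)

def rotate_volume_90(volume, k=1):
    """Rotate a CT volume by k*90 degrees about the slice axis, mimicking a
    phantom rotated to match a cardinal beam angle."""
    return np.rot90(volume, k=k, axes=(1, 2))

ct = np.full((2, 4, 4), -1000.0)  # toy volume: air everywhere ...
ct[:, 1, 2] = 0.0                 # ... plus one water voxel (0 HU)
rho = hu_to_density(rotate_volume_90(ct))
print(rho.shape, rho.max())       # (2, 4, 4) 1.0
```

Arbitrary gantry angles would need an interpolating rotation rather than `np.rot90`, which is part of what makes non-zero beam angles problematic for the standard codes.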
Abstract:
Telecommunications is a key component of any country's economic infrastructure, requiring a vast amount of capital injection and ongoing technical support and innovation. Many developing countries experience handicaps in accessing capital and sustaining the required technical capability in their industrialisation process. Therefore, attracting both capital investment and expertise by attuning the developing country's economic policies and legal environment to meet investors' expectations is a priority. Privatisation has been seen as a triumph by international institutions such as the World Bank, and as a major requirement for developing economies to industrialise. However, from a regulatory perspective, this process is far from straightforward. Implementing economic policies requires a number of regulations and regulatory instruments to be in place. Apart from the need for an independent regulator, regulatory outcomes often depend on the willingness of various stakeholders to comply with the courses of action undertaken by authorities. This article examines the factors steering the processes and changes in the telecommunication reforms of Indonesia and China.
Abstract:
Botnets are large networks of compromised machines under the control of a bot master. These botnets constantly evolve their defences to allow the continuation of their malicious activities. The constant development of new botnet mitigation strategies and their subsequent defensive countermeasures has led to a technological arms race, one which the bot masters have significant incentives to win. This dissertation analyzes the current and future states of the botnet arms race by introducing a taxonomy of botnet defences and a simulation framework for evaluating botnet techniques. The taxonomy covers current botnet techniques and highlights possible future techniques for further analysis under the simulation framework. This framework allows evaluation of the effect that techniques such as reputation systems and proof-of-work schemes have on the resources required to disable a peer-to-peer botnet. Given the increase in the resources required, our results suggest that the prospects of eliminating the botnet threat are limited.
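As a toy illustration of the kind of question such a simulation framework can answer, the sketch below models a defender (or attacker) poisoning a P2P botnet's peer lists with sybil entries, where each injected entry costs a fixed amount of work. The model, its parameters and its cost units are inventions for this example, not the dissertation's framework; its point is only that a per-entry proof-of-work requirement scales the total resource bill linearly.

```python
import random

def poisoning_cost(n_bots=500, peers_per_bot=8, pow_cost=1.0,
                   target_fraction=0.5, seed=1):
    """Resources spent to poison a target fraction of all peer-list slots
    in a P2P botnet, paying `pow_cost` units of work per injected entry."""
    rng = random.Random(seed)
    poisoned = set()  # (bot, slot) pairs already holding a sybil entry
    target = int(target_fraction * n_bots * peers_per_bot)
    cost = 0.0
    while len(poisoned) < target:
        entry = (rng.randrange(n_bots), rng.randrange(peers_per_bot))
        cost += pow_cost  # work is paid whether or not the slot was new
        poisoned.add(entry)
    return cost

# A 5x higher proof-of-work cost means 5x the resources for the same coverage.
print(poisoning_cost(pow_cost=5.0) / poisoning_cost(pow_cost=1.0))  # 5.0
```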
Abstract:
In this column, Dr. Peter Corke of CSIRO, Australia, gives us a description of MATLAB Toolboxes he has developed. He has been passionately developing tools to enable students and teachers to better understand the theoretical concepts behind classical robotics and computer vision through easy and intuitive simulation and visualization. The results of this labor of love have been packaged as MATLAB Toolboxes: the Robotics Toolbox and the Vision Toolbox. –Daniela Rus, RAS Education Cochair
Abstract:
Consumer personal information is now a valuable commodity for most corporations. Concomitant with increased value is the expansion of new legal obligations to protect personal information. Mandatory data breach notification laws are an important new development in this regard. Such laws require a corporation that has suffered a data breach involving personal information, such as a computer hacking incident, to notify those persons who may have been affected by the breach. Regulators may also need to be notified. Australia currently does not have a mandatory data breach notification law, but this may be about to change. The Australian Law Reform Commission has suggested that a data breach notification scheme be implemented through the Privacy Act 1988 (Cth). However, the notification of data breaches may already be required under the continuous disclosure regime stipulated by the Corporations Act 2001 (Cth) and the Australian Stock Exchange (ASX) Listing Rules. Accordingly, this article examines whether the notification of data breaches is a statutory requirement of the existing continuous disclosure regime and whether the ASX should therefore be notified of such incidents.
Abstract:
With the advances in computer hardware and software development techniques in the past 25 years, digital computer simulation of train movement and traction systems has been widely adopted as a standard computer-aided engineering tool [1] during the design and development stages of existing and new railway systems. Simulators of different approaches and scales are used extensively to investigate various kinds of system studies. Simulation is now proven to be the cheapest means to carry out performance prediction and system behaviour characterisation. When computers were first used to study railway systems, they were mainly employed to perform repetitive but time-consuming computational tasks, such as matrix manipulations for power network solution and exhaustive searches for optimal braking trajectories. With only simple high-level programming languages available at the time, full advantage of the computing hardware could not be taken. Hence, structured simulations of the whole railway system were not very common. Most applications focused on isolated parts of the railway system; it is more appropriate to regard those applications as primarily mechanised calculations rather than simulations. However, a railway system consists of a number of subsystems, such as train movement, power supply and traction drives, which inevitably contain many complexities and diversities. These subsystems interact frequently with each other while the trains are moving, and each has its own special features in different railway systems. To further complicate the simulation requirements, constraints like track geometry, speed restrictions and friction have to be considered, not to mention possible non-linearities and uncertainties in the system.
In order to provide a comprehensive and accurate account of system behaviour through simulation, a large amount of data has to be organised systematically to ensure easy access and efficient representation, and the interactions and relationships among the subsystems should be defined explicitly. These requirements call for sophisticated and effective simulation models for each component of the system. The software development techniques available nowadays allow the evolution of such simulation models. Not only can the applicability of the simulators be greatly enhanced by advanced software design; maintainability and modularity, for easy understanding and further development, and portability across hardware platforms are also encouraged. The objective of this paper is to review the development of a number of approaches to simulation models. Attention is given in particular to models for train movement, power supply systems and traction drives. These models have been successfully used to resolve various ‘what-if’ issues effectively in a wide range of applications, such as speed profiles, energy consumption and run times.
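To give a flavour of the train-movement models reviewed, the sketch below integrates a point-mass train under a constant-effort/constant-power traction characteristic against a quadratic (Davis-type) running resistance, yielding a run time and traction energy for a fixed distance. All parameter values are illustrative placeholders, not drawn from any particular railway, and braking is deliberately omitted.

```python
def simulate_run(mass_kg=400e3, f_max=300e3, p_max=4e6, v_limit=25.0,
                 distance=5000.0, dt=0.5):
    """Point-mass train movement simulation: tractive effort limited by a
    maximum force and a maximum power, opposed by Davis-type resistance
    a + b*v + c*v^2, with a simple speed limit (coasting, no braking)."""
    a_davis, b_davis, c_davis = 3e3, 60.0, 12.0  # N, N*s/m, N*s^2/m^2 (illustrative)
    t, v, x, energy = 0.0, 0.0, 0.0, 0.0
    while x < distance:
        effort = min(f_max, p_max / max(v, 0.1))  # power-limited effort at speed
        if v >= v_limit:
            effort = 0.0                          # coast once at the speed limit
        resistance = a_davis + b_davis * v + c_davis * v * v
        accel = (effort - resistance) / mass_kg
        energy += effort * v * dt                 # traction energy drawn (J)
        v = max(v + accel * dt, 0.0)
        x += v * dt
        t += dt
    return t, energy / 3.6e6                      # run time (s), energy (kWh)

t, kwh = simulate_run()
print(round(t, 1), round(kwh, 1))
```

Varying the inputs answers exactly the ‘what-if’ questions mentioned above, e.g. how a lower speed limit trades run time against energy consumption.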
Abstract:
Training designed to support and strengthen higher-order mental abilities now often involves immersion in virtual reality, where dangerous real-world scenarios can be safely replicated. However, despite the growing popularity of advanced training simulations, methods for evaluating their use rely heavily on subjective measures or analysis of final outcomes. Without dynamic, objective performance measures, the outcome of training, in terms of impact on cognitive skills and the ability to transfer newly acquired skills to the real world, is unknown. The relationship between affective intensity and cognitive learning provides a potential new approach to ensuring that the cognitive processes which occur prior to final outcomes, such as problem-solving and decision-making, are adequately evaluated. This paper describes the technical aspects of pilot work recently undertaken to develop a new measurement tool designed to objectively track individual affect levels during simulation-based training.
Abstract:
This paper examines the ground-water flow problem associated with the injection of certain corrosive fluids into mineral-bearing rock and their subsequent recovery. The aim is to dissolve the minerals in situ and then recover them in solution. In general, it is not possible to recover all the injected fluid, which is of concern both economically and environmentally. However, a new strategy is proposed here that allows all the leaching fluid to be recovered. A mathematical model of the situation is solved approximately using an asymptotic solution, and exactly using a boundary integral approach. Solutions are shown for two-dimensional flow, which is of some practical interest as it is achievable in old mine tunnels, for example.
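The injection-recovery configuration can be caricatured with the textbook source-sink pair from two-dimensional potential flow, in which every streamline leaving the injector terminates at the recovery well of equal strength. This is only a schematic analogue of full recovery, not the asymptotic or boundary-integral model of the paper; the well positions and strength below are arbitrary.

```python
import numpy as np

# Source-sink pair: complex potential w(z) = (q / 2*pi) * [ln(z - z_inj) - ln(z - z_rec)]
q = 1.0                          # injection rate (arbitrary units); sink of equal strength
z_inj, z_rec = -1.0 + 0j, 1.0 + 0j

def velocity(z):
    """Conjugate velocity u - i*v = dw/dz at a field point z."""
    return (q / (2 * np.pi)) * (1.0 / (z - z_inj) - 1.0 / (z - z_rec))

# Midway between the wells the two contributions reinforce each other,
# so the flow runs directly from injector to recovery well.
w = velocity(0j)
print(round(w.real, 4), round(w.imag, 4))  # 0.3183 0.0  (= 1/pi, purely along the x-axis)
```

A boundary integral method, as used in the paper, would instead distribute such singularities along domain boundaries to satisfy more realistic conditions.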
Abstract:
This is an invited presentation made as a short preview of the virtual environment research work being undertaken at QUT in the Business Process Management (BPM) research group, known as BPMVE. Three projects are covered: spatial process visualisation, with applications to airport check-in processes; collaborative process modelling using a virtual world BPMN editing tool; and business process simulation in virtual worlds using Open Simulator and the YAWL workflow system. In addition, the relationship of this work to organisational psychology is briefly explored. Full video/audio is available at: http://www.youtube.com/user/BPMVE#p/u/1/rp506c3pPms