77 results for Continuous steam injection and reservoir simulation
Abstract:
Established Monte Carlo user codes BEAMnrc and DOSXYZnrc permit the accurate and straightforward simulation of radiotherapy experiments and treatments delivered from multiple beam angles. However, when an electronic portal imaging detector (EPID) is included in these simulations, treatment delivery from non-zero beam angles becomes problematic. This study introduces CTCombine, a purpose-built code for rotating selected CT data volumes, converting CT numbers to mass densities, combining the results with model EPIDs and writing output in a form which can easily be read and used by the dose calculation code DOSXYZnrc. The geometric and dosimetric accuracy of CTCombine’s output has been assessed by simulating simple and complex treatments applied to a rotated planar phantom and a rotated humanoid phantom and comparing the resulting virtual EPID images with the images acquired using experimental measurements and independent simulations of equivalent phantoms. It is expected that CTCombine will be useful for Monte Carlo studies of EPID dosimetry as well as other EPID imaging applications.
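A minimal sketch of one step CTCombine performs, the conversion of CT numbers to mass densities before the volume is written for DOSXYZnrc. The piecewise-linear ramp below, and its breakpoints, are illustrative assumptions about a typical HU-to-density calibration, not CTCombine's actual tables.

```python
import numpy as np

# Illustrative HU-to-density calibration breakpoints (assumed values)
hu_points      = np.array([-1000.0, 0.0, 1000.0, 3000.0])  # CT numbers [HU]
density_points = np.array([0.001,   1.0, 1.6,    2.8])     # mass density [g/cm^3]

def ct_to_density(ct_volume):
    """Piecewise-linear map from CT numbers to mass densities, applied voxelwise."""
    return np.interp(ct_volume, hu_points, density_points)

# A three-voxel toy 'volume': air, water-like tissue, dense bone
print(ct_to_density(np.array([-1000.0, 40.0, 1500.0])))
```

Rotating the CT volume to the desired gantry angle and appending the model EPID would then be array operations on the same density grid before the phantom is written in a form DOSXYZnrc can read.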
Abstract:
Telecommunications is a key component in any country's economic infrastructure, requiring a vast amount of capital injection and ongoing technical support and innovation. Many developing countries experience handicaps in accessing capital and sustaining the required technical capability in their industrialisation process. Therefore, attracting both capital investments and expertise by attuning the developing country's economic policies and legal environment to meet investors' expectations is a priority. Privatisation has been seen as a triumph by international institutions such as the World Bank, and as a major requirement for developing economies to industrialise. However, from a regulatory perspective, this process is far from straightforward. Implementing economic policies requires a number of regulations and regulatory instruments to be in place. Apart from the need for an independent regulator, regulatory outcomes are often dependent on the willingness of various stakeholders to comply with the courses of action undertaken by authorities. This article examines the factors steering the processes and changes in the telecommunication reforms of Indonesia and China.
Abstract:
Botnets are large networks of compromised machines under the control of a bot master. These botnets constantly evolve their defences to allow the continuation of their malicious activities. The constant development of new botnet mitigation strategies and their subsequent defensive countermeasures has led to a technological arms race, one which the bot masters have significant incentives to win. This dissertation analyzes the current and future states of the botnet arms race by introducing a taxonomy of botnet defences and a simulation framework for evaluating botnet techniques. The taxonomy covers current botnet techniques and highlights possible future techniques for further analysis under the simulation framework. This framework allows the evaluation of the effect that techniques such as reputation systems and proof-of-work schemes have on the resources required to disable a peer-to-peer botnet. Given the increase in the resources required, our results suggest that the prospects of eliminating the botnet threat are limited.
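To make the proof-of-work defence mentioned above concrete, here is a minimal hashcash-style sketch; the message, difficulty and choice of SHA-256 are illustrative assumptions, not details from the dissertation. A peer can demand that messages it accepts carry a valid proof, raising the cost for anyone attempting to flood or poison the network.

```python
import hashlib
import itertools

DIFFICULTY = 18  # leading zero bits required (assumed for illustration)

def solve(message: bytes) -> int:
    """Find a nonce whose SHA-256 digest with the message has DIFFICULTY leading zero bits."""
    target = 1 << (256 - DIFFICULTY)
    for nonce in itertools.count():
        digest = hashlib.sha256(message + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce

def verify(message: bytes, nonce: int) -> bool:
    """Cheap check that the expensive puzzle was actually solved."""
    digest = hashlib.sha256(message + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - DIFFICULTY))

nonce = solve(b"peer-announce")                # costly for the sender
print(nonce, verify(b"peer-announce", nonce))  # cheap for the receiver
```

The asymmetry (expensive to solve, cheap to verify) is what shifts the resource balance: in a framework like the one described, such schemes multiply the work an attacker needs to inject enough traffic to disrupt a peer-to-peer botnet.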
Abstract:
In this column, Dr. Peter Corke of CSIRO, Australia, gives us a description of MATLAB Toolboxes he has developed. He has been passionately developing tools to enable students and teachers to better understand the theoretical concepts behind classical robotics and computer vision through easy and intuitive simulation and visualization. The results of this labor of love have been packaged as MATLAB Toolboxes: the Robotics Toolbox and the Vision Toolbox. –Daniela Rus, RAS Education Cochair
Abstract:
Consumer personal information is now a valuable commodity for most corporations. Concomitant with increased value is the expansion of new legal obligations to protect personal information. Mandatory data breach notification laws are an important new development in this regard. Such laws require a corporation that has suffered a data breach involving personal information, such as a computer hacking incident, to notify those persons who may have been affected by the breach. Regulators may also need to be notified. Australia currently does not have a mandatory data breach notification law, but this may be about to change. The Australian Law Reform Commission has suggested that a data breach notification scheme be implemented through the Privacy Act 1988 (Cth). However, the notification of data breaches may already be required under the continuous disclosure regime stipulated by the Corporations Act 2001 (Cth) and the Australian Stock Exchange (ASX) Listing Rules. Accordingly, this article examines whether the notification of data breaches is a statutory requirement of the existing continuous disclosure regime and whether the ASX should therefore be notified of such incidents.
Abstract:
With the advances in computer hardware and software development techniques in the past 25 years, digital computer simulation of train movement and traction systems has been widely adopted as a standard computer-aided engineering tool [1] during the design and development stages of existing and new railway systems. Simulators of different approaches and scales are used extensively to investigate various kinds of system studies. Simulation is now proven to be the cheapest means to carry out performance prediction and system behaviour characterisation. When computers were first used to study railway systems, they were mainly employed to perform repetitive but time-consuming computational tasks, such as matrix manipulations for power network solutions and exhaustive searches for optimal braking trajectories. With only simple high-level programming languages available at the time, full advantage of the computing hardware could not be taken. Hence, structured simulations of the whole railway system were not very common. Most applications focused on isolated parts of the railway system; it is more appropriate to regard those applications as primarily mechanised calculations rather than simulations. However, a railway system consists of a number of subsystems, such as train movement, power supply and traction drives, which inevitably involve many complexities and diversities. These subsystems interact frequently with each other while the trains are moving, and they have their own special features in different railway systems. To further complicate the simulation requirements, constraints like track geometry, speed restrictions and friction have to be considered, not to mention possible non-linearities and uncertainties in the system. In order to provide a comprehensive and accurate account of system behaviour through simulation, a large amount of data has to be organised systematically to ensure easy access and efficient representation, and the interactions and relationships among the subsystems should be defined explicitly. These requirements call for sophisticated and effective simulation models for each component of the system. The software development techniques available nowadays allow the evolution of such simulation models. Not only can the applicability of the simulators be greatly enhanced by advanced software design, but maintainability and modularity, for easy understanding and further development, and portability across hardware platforms are also encouraged. The objective of this paper is to review the development of a number of approaches to simulation models. Attention is, in particular, given to models for train movement, power supply systems and traction drives. These models have been successfully used to enable various ‘what-if’ issues to be resolved effectively in a wide range of applications, such as speed profiles, energy consumption and run times.
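As a hedged illustration of the train-movement component such simulators contain, the sketch below integrates a point-mass train along the track under tractive effort, Davis-equation running resistance and gradient force. All numerical coefficients are assumptions for illustration, not values from the paper or any cited model.

```python
import numpy as np

M = 300e3      # train mass [kg] (assumed)
F_MAX = 300e3  # adhesion-limited tractive effort [N] (assumed)
P_MAX = 3e6    # traction power limit [W] (assumed)
A, B, C = 2000.0, 30.0, 4.0  # Davis coefficients [N, N/(m/s), N/(m/s)^2] (assumed)

def tractive_effort(v):
    """Effort limited by adhesion at low speed and by power above base speed."""
    return min(F_MAX, P_MAX / max(v, 0.1))

def resistance(v, grade=0.0):
    """Davis running resistance plus the gravity component on a gradient."""
    return A + B * v + C * v * v + M * 9.81 * grade

def simulate(dt=0.5, t_end=300.0):
    """Explicit-Euler integration of speed v and position s under full power."""
    v, s = 0.0, 0.0
    for _ in np.arange(0.0, t_end, dt):
        accel = (tractive_effort(v) - resistance(v)) / M
        v = max(v + accel * dt, 0.0)
        s += v * dt
    return v, s

v, s = simulate()
print(f"after 300 s: v = {v:.1f} m/s, distance = {s / 1000:.2f} km")
```

A full simulator couples many such train models to the power supply and traction drive models the paper reviews; this fragment shows only the mechanical core.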
Abstract:
Training designed to support and strengthen higher-order mental abilities now often involves immersion in Virtual Reality, where dangerous real-world scenarios can be safely replicated. However, despite the growing popularity of advanced training simulations, methods for evaluating their use rely heavily on subjective measures or analysis of final outcomes. Without dynamic, objective performance measures, the outcome of training, in terms of its impact on cognitive skills and the ability to transfer newly acquired skills to the real world, is unknown. The relationship between affective intensity and cognitive learning provides a potential new approach to ensure that the cognitive processes which occur prior to final outcomes, such as problem-solving and decision-making, are adequately evaluated. This paper describes the technical aspects of pilot work recently undertaken to develop a new measurement tool designed to objectively track individual affect levels during simulation-based training.
Abstract:
This paper examines the ground-water flow problem associated with the injection and recovery of certain corrosive fluids into mineral-bearing rock. The aim is to dissolve the minerals in situ and then recover them in solution. In general, it is not possible to recover all the injected fluid, which is of concern economically and environmentally. However, a new strategy is proposed here that allows all the leaching fluid to be recovered. A mathematical model of the situation is solved approximately using an asymptotic solution, and exactly using a boundary integral approach. Solutions are shown for two-dimensional flow, which is of some practical interest as it is achievable in old mine tunnels, for example.
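For orientation, a minimal sketch of the class of two-dimensional models involved, assuming ideal (potential) flow with an injection well of strength $Q$ at $z_s$ and a recovery well at $z_r$; this is a textbook source-sink formulation, not necessarily the paper's actual model:

\[
  w(z) \;=\; \frac{Q}{2\pi}\,\ln(z - z_s) \;-\; \frac{Q}{2\pi}\,\ln(z - z_r),
  \qquad z = x + \mathrm{i}\,y ,
\]

where $w$ is the complex potential whose real part is the velocity potential and whose imaginary part is the stream function. For this isolated equal-strength pair, every streamline leaving the source terminates at the sink, so all injected fluid is recovered; once a background ground-water flow is superposed, some streamlines escape the sink's capture zone, which is the loss a recovery strategy of the kind proposed must eliminate.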
Abstract:
This is an invited presentation made as a short preview of the virtual environment research work being undertaken at QUT in the Business Process Management (BPM) research group, known as BPMVE. Three projects are covered: spatial process visualisation, with applications to airport check-in processes; collaborative process modelling, using a virtual world BPMN editing tool; and business process simulation in virtual worlds, using Open Simulator and the YAWL workflow system. In addition, the relationship of this work to Organisational Psychology is briefly explored. Full Video/Audio is available at: http://www.youtube.com/user/BPMVE#p/u/1/rp506c3pPms
Abstract:
Gradient-based approaches to direct policy search in reinforcement learning have received much recent attention as a means to solve problems of partial observability and to avoid some of the problems associated with policy degradation in value-function methods. In this paper we introduce GPOMDP, a simulation-based algorithm for generating a biased estimate of the gradient of the average reward in Partially Observable Markov Decision Processes (POMDPs) controlled by parameterized stochastic policies. A similar algorithm was proposed by Kimura, Yamamura, and Kobayashi (1995). The algorithm's chief advantages are that it requires storage of only twice the number of policy parameters, uses one free parameter β ∈ [0,1) (which has a natural interpretation in terms of bias-variance trade-off), and requires no knowledge of the underlying state. We prove convergence of GPOMDP, and show how the correct choice of the parameter β is related to the mixing time of the controlled POMDP. We briefly describe extensions of GPOMDP to controlled Markov chains, continuous state, observation and control spaces, multiple agents, higher-order derivatives, and a version for training stochastic policies with internal states. In a companion paper (Baxter, Bartlett, & Weaver, 2001) we show how the gradient estimates generated by GPOMDP can be used in both a traditional stochastic gradient algorithm and a conjugate-gradient procedure to find local optima of the average reward. ©2001 AI Access Foundation and Morgan Kaufmann Publishers. All rights reserved.
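A minimal sketch of the GPOMDP estimator described above, applied to a toy two-state environment with a logistic (Bernoulli) policy. The eligibility-trace and averaging updates follow the algorithm as summarised in the abstract; the environment, policy parameterisation and all constants are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def policy_probs(theta, obs):
    """P(action = 1 | obs) under a logistic policy with one weight per observation."""
    p = 1.0 / (1.0 + np.exp(-theta[obs]))
    return np.array([1.0 - p, p])

def grad_log_policy(theta, obs, action):
    """Gradient of log mu(action | theta, obs): (action - p) on the active weight."""
    g = np.zeros_like(theta)
    g[obs] = action - policy_probs(theta, obs)[1]
    return g

def step_env(state, action):
    """Toy dynamics: action 1 tends to move toward state 1, which pays reward 1."""
    next_state = 1 if rng.random() < (0.9 if action == 1 else 0.1) else 0
    return next_state, float(next_state)

def gpomdp(theta, beta=0.9, T=100_000):
    """Biased gradient estimate; storage is only two vectors the size of theta."""
    z = np.zeros_like(theta)      # eligibility trace, discounted by beta
    delta = np.zeros_like(theta)  # running average of reward-weighted traces
    state = 0
    for t in range(T):
        obs = state  # fully observed here; a POMDP would emit an observation y_t
        action = int(rng.choice(2, p=policy_probs(theta, obs)))
        z = beta * z + grad_log_policy(theta, obs, action)
        state, reward = step_env(state, action)
        delta += (reward * z - delta) / (t + 1)
    return delta

print(gpomdp(np.zeros(2)))  # positive entries: raising theta raises average reward
```

Larger β reduces the bias of the estimate but increases its variance, which is the bias-variance trade-off the abstract ties to the mixing time of the controlled POMDP.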
Abstract:
Concrete is commonly used as a primary construction material for tall buildings. Load-bearing components such as columns and walls in concrete buildings are subjected to instantaneous and long-term axial shortening caused by the time-dependent effects of shrinkage, creep and elastic deformation. Reinforcing steel content, variable concrete modulus, the volume-to-surface-area ratio of the elements and environmental conditions govern axial shortening. The impact of differential axial shortening among columns and core shear walls escalates with increasing building height. Differential axial shortening of gravity-loaded elements in geometrically complex and irregular buildings results in permanent distortion and deflection of the structural frame, which have a significant impact on building envelopes, building services, secondary systems and the lifetime serviceability and performance of a building. Existing numerical methods commonly used in design to quantify axial shortening are mainly based on elastic analytical techniques and are therefore unable to capture the complexity of non-linear time-dependent effects. Ambient measurements of axial shortening using vibrating wire, external mechanical strain, and electronic strain gauges are available to verify pre-estimated values from the design stage. Installing these gauges permanently embedded in or on the surface of concrete components for continuous measurements during and after construction, with adequate protection, is uneconomical, inconvenient and unreliable; therefore such methods are rarely, if ever, used in the actual practice of building construction. This research project has developed a rigorous numerical procedure that encompasses linear and non-linear time-dependent phenomena for the prediction of axial shortening of reinforced concrete structural components at the design stage. This procedure takes into consideration (i) construction sequence, (ii) time-varying values of Young's modulus of reinforced concrete and (iii) creep and shrinkage models that account for variability resulting from environmental effects. The capabilities of the procedure are illustrated through examples. In order to update previous predictions of axial shortening during the construction and service stages of the building, this research has also developed a vibration-based procedure using ambient measurements. This procedure takes into consideration the changes in the vibration characteristics of the structure during and after construction. The application of this procedure is illustrated through numerical examples which also highlight its features. The vibration-based procedure can also be used as a tool to assess the structural health and performance of key structural components in the building during construction and service life.
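To make the time-dependent effects concrete, here is a back-of-envelope sketch of the total axial strain of a loaded column using generic hyperbolic creep and shrinkage curves of the shape used in simplified code models (ACI 209-style). Every coefficient is an assumption for illustration; the thesis develops far more rigorous, construction-sequence-aware models.

```python
E28 = 30e9           # Young's modulus at loading [Pa] (assumed)
sigma = 10e6         # sustained axial compressive stress [Pa] (assumed)
phi_ult = 2.0        # ultimate creep coefficient (assumed)
eps_sh_ult = 500e-6  # ultimate shrinkage strain (assumed)

def creep_coeff(t):
    """Hyperbolic growth of the creep coefficient; t in days after loading."""
    return phi_ult * t**0.6 / (10.0 + t**0.6)

def shrinkage(t):
    """Hyperbolic growth of shrinkage strain; t in days after curing."""
    return eps_sh_ult * t / (35.0 + t)

def axial_strain(t):
    """Elastic strain amplified by creep, plus stress-independent shrinkage."""
    return (sigma / E28) * (1.0 + creep_coeff(t)) + shrinkage(t)

for t in (28, 365, 3650):
    print(f"t = {t:5d} d: total axial strain = {axial_strain(t):.6e}")
```

Even this crude model shows the long-term strain reaching several times the instantaneous elastic value, which is why differential shortening between differently loaded columns and walls accumulates with building height.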
Abstract:
Digital human modelling (DHM) has today matured from research into industrial application. In the automotive domain, DHM has become a commonly used tool in virtual prototyping and human-centred product design. While the current generation of DHM supports the ergonomic evaluation of new vehicle designs during the early design stages of the product, by modelling anthropometry, posture and motion or predicting discomfort, the future of DHM will be dominated by CAE methods, realistic 3D design, and musculoskeletal and soft tissue modelling down to the micro-scale of molecular activity within single muscle fibres. As a driving force for DHM development, the automotive industry has traditionally used human models in the manufacturing sector (production ergonomics, e.g. assembly) and the engineering sector (product ergonomics, e.g. safety, packaging). In product ergonomics applications, DHM share many common characteristics, creating a unique subset of DHM. These models are optimised for a seated posture, interface to a vehicle seat through standardised methods and provide linkages to vehicle controls. As a tool, they need to interface with other analytic instruments and integrate into complex CAD/CAE environments. Important aspects of current DHM research are functional analysis, model integration and task simulation. Digital (virtual, analytic) prototypes or digital mock-ups (DMU) provide expanded support for testing and verification and consider task-dependent performance and motion. Beyond rigid body mechanics, soft tissue modelling is evolving to become standard in future DHM. When addressing advanced issues beyond the physical domain, for example anthropometry and biomechanics, modelling of human behaviours and skills is also integrated into DHM. The latest developments include a more comprehensive approach through implementing perceptual, cognitive and performance models, representing human behaviour on a non-physiological level. Through the integration of algorithms from the artificial intelligence domain, a vision of the virtual human is emerging.
Abstract:
Quality-oriented management systems and methods have become the dominant business and governance paradigm. From this perspective, satisfying customers' expectations by supplying reliable, good-quality products and services is the key factor for an organization and even a government. During recent decades, Statistical Quality Control (SQC) methods have been developed as the technical core of quality management and the continuous improvement philosophy, and are now applied widely to improve the quality of products and services in the industrial and business sectors. Recently, SQC tools, in particular quality control charts, have been used in healthcare surveillance. In some cases, these tools have been modified and developed to better suit the characteristics and needs of the health sector. It seems that some of the work in the healthcare area has evolved independently of the development of industrial statistical process control methods. Therefore, analysing and comparing the paradigms and characteristics of quality control charts and techniques across the different sectors presents opportunities for transferring knowledge and for future development in each sector. Meanwhile, the capabilities of the Bayesian approach, particularly Bayesian hierarchical models and computational techniques in which all uncertainty is expressed as a structure of probability, facilitate decision making and cost-effectiveness analyses. Therefore, this research investigates the use of the quality improvement cycle in a health setting using clinical data from a hospital. The need for clinical data for monitoring purposes is investigated in two respects. A framework and appropriate tools from the industrial context are proposed and applied to evaluate and improve data quality in the available datasets and data flow; then a data-capturing algorithm using Bayesian decision-making methods is developed to determine an economical sample size for statistical analyses within the quality improvement cycle. Once clinical data quality is ensured, some characteristics of control charts in the health context, including the necessity of monitoring attribute data and correlated quality characteristics, are considered. To this end, multivariate control charts from an industrial context are adapted to monitor the radiation delivered to patients undergoing diagnostic coronary angiograms, and various risk-adjusted control charts are constructed and investigated for monitoring binary outcomes of clinical interventions as well as post-intervention survival time. Meanwhile, the adoption of a Bayesian approach is proposed as a new framework for the estimation of the change point following a control chart's signal. This estimate aims to facilitate the search for root causes within the quality improvement cycle, since it narrows the search for the potential causes of detected changes to a tighter time-frame prior to the signal. This approach yields highly informative estimates of change point parameters, since the results are obtained as probability distributions. Using Bayesian hierarchical models and Markov chain Monte Carlo computational methods, Bayesian estimators of the time and magnitude of various change scenarios, including step changes, linear trends and multiple changes in a Poisson process, are developed and investigated.
The benefits of change point investigation are revisited and promoted for the monitoring of hospital outcomes, where the developed Bayesian estimator reports the true time of shifts, compared against a priori known causes, detected by control charts monitoring the rate of excess usage of blood products and of major adverse events during and after cardiac surgery in a local hospital. The development of the Bayesian change point estimators is then pursued in healthcare surveillance for processes in which the pre-intervention characteristics of patients affect the outcomes. In this setting, the Bayesian estimator is first extended to capture the patient mix, via covariates, through the risk models underlying risk-adjusted control charts. Variations of the estimator are developed to estimate the true time of step changes and linear trends in the odds ratio of intensive care unit outcomes in a local hospital. Second, the Bayesian estimator is extended to identify the time of a shift in mean survival time after a clinical intervention that is being monitored by risk-adjusted survival time control charts. In this context, the survival time after a clinical intervention is also affected by patient mix, and the survival function is constructed using a survival prediction model. The simulation studies undertaken in each research component, and the results obtained, strongly recommend the developed Bayesian estimators as an alternative for change point estimation within the quality improvement cycle in healthcare surveillance as well as in industrial and business contexts. The superiority of the proposed Bayesian framework and estimators is enhanced when the probability quantification, flexibility and generalisability of the developed models are also considered. The advantages of the Bayesian approach seen in the general context of quality control may also extend to the industrial and business domains where quality monitoring was initially developed.
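A minimal sketch of the simplest of the change scenarios above: a single step change in a Poisson process. The thesis uses hierarchical models and MCMC; in this stripped-down version, Gamma-Poisson conjugacy lets the two rates be integrated out so the posterior of the change point can be computed exactly on a grid. The priors and simulated data are assumptions for illustration.

```python
import numpy as np
from scipy.special import gammaln

rng = np.random.default_rng(1)
n, true_tau = 100, 60
y = np.concatenate([rng.poisson(4.0, true_tau),       # in-control counts
                    rng.poisson(7.0, n - true_tau)])  # counts after the shift

a, b = 1.0, 0.5  # Gamma(shape a, rate b) prior on both Poisson rates (assumed)

def log_post_tau(y, a, b):
    """log p(tau | y) up to a constant; tau = index of the last pre-change count."""
    n = len(y)
    cum = np.cumsum(y)
    taus = np.arange(1, n)  # candidate change points
    s1 = cum[taus - 1]      # total counts before the change
    s2 = cum[-1] - s1       # total counts after the change
    lp = (gammaln(a + s1) - (a + s1) * np.log(b + taus)
          + gammaln(a + s2) - (a + s2) * np.log(b + (n - taus)))
    return taus, lp - lp.max()

taus, lp = log_post_tau(y, a, b)
post = np.exp(lp)
post /= post.sum()
print("posterior mode of tau:", taus[np.argmax(post)], "| true value:", true_tau)
```

Because the output is a full posterior distribution over the change point rather than a single estimate, credible intervals and probabilities of competing change times come for free, which is the probability quantification advantage highlighted above.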
Abstract:
Real-time remote sales assistance is an underdeveloped component of online sales services. Solutions involving web page text chat, telephony and video support prove problematic when seeking to remotely guide customers through their sales processes, especially with configurations of physically complex artefacts. Recently, there has been great interest in the application of virtual worlds and augmented reality to create synthetic environments for the remote sale of physical artefacts. However, there is a lack of analysis and development of appropriate software services to support these processes. We extend our previous work with the detailed design of configuration context services to support the management of an interactive sales session using augmented reality. We detail the context and configuration services required, presenting a novel data service that streams configuration information to the vendor for business analytics. We expect that a fully implemented configuration management service, based on our design, will improve the remote sales experience for customers and vendors alike via analysis of the streamed information.
Abstract:
Workflow patterns have been recognized as the theoretical basis for modelling recurring problems in workflow systems. One form of workflow patterns, known as the resource patterns, characterises the behaviour of resources in workflow systems. Despite the fact that many resource patterns have been discovered, they are still excluded from many workflow system implementations. One of the reasons could be obscurity in the behaviour of, and interaction between, resources and a workflow management system. Thus, we provide a modelling and visualization approach for the resource patterns, enabling a resource behaviour modeller to intuitively see the specific resource patterns involved in the lifecycle of a workitem. We believe this research can be extended to benefit not only workflow modelling, but also other applications, such as model validation, human resource behaviour modelling, and workflow model visualization.