977 results for policy simulation
Abstract:
Regional planning faces numerous decision-making uncertainties arising from the complex interdependencies between urban and regional centres. How to achieve sustainable planning solutions across regions is a key uncertainty, stemming from a lack of information about whether the outcomes proposed in a plan's objectives are actually achieved. Regional plan implementation and its impact on environmental, social and economic outcomes have been little explored within Australian urban and regional planning research. Despite a desire to improve conditions across Australian regions, ambiguity persists regarding the results of regional planning efforts. Of the variables affecting regional planning, scholars argue that governance has a significant impact on achieving outcomes (see Pahl-Wostl 2009). To better analyse the impact of governance, we propose a set of governance indicators for examining decisions across regional planning institutions and apply it to governance models across Queensland's regions. We contend that these governance indicators can support a more rigorous assessment of the impacts of governance models on plan implementation and outcomes, and thereby a better understanding of the relationship between planning and outcomes across urban and regional areas.
Abstract:
Let a and s denote the inter-arrival times and service times in a GI/GI/1 queue. Let a(n), s(n) be random variables whose distributions are the distributions of a and s estimated from i.i.d. samples of a and s of sizes n. Let w be a random variable with the stationary distribution π of the waiting times of the queue with input (a, s). We consider the problem of estimating E[w^α], α > 0, and π via simulations when (a(n), s(n)) are used as input. Conditions for the accuracy of the asymptotic estimate, continuity of the asymptotic variance and uniformity in the rate of convergence to the estimate are obtained. We also obtain rates of convergence for sample moments, the empirical process and the quantile process for regenerative processes. Robust estimates are also obtained when an outlier-contaminated sample of a and s is provided. In the process we obtain consistency, continuity and asymptotic normality of M-estimators for stationary sequences. Some robustness results for Markov processes are included.
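The simulation approach above rests on the Lindley recursion W_{k+1} = max(0, W_k + S_k − A_{k+1}) for GI/GI/1 waiting times. The sketch below estimates E[w] by simulating the recursion; the exponential distributions, rates, sample sizes and burn-in are illustrative assumptions (and the true distributions stand in for the estimated ones), not the paper's setting:

```python
import random

def waiting_times(arrival_rate, service_rate, n, seed=0):
    """GI/GI/1 waiting times via the Lindley recursion
       W_{k+1} = max(0, W_k + S_k - A_{k+1})."""
    rng = random.Random(seed)
    w, out = 0.0, []
    for _ in range(n):
        s = rng.expovariate(service_rate)   # service time S_k
        a = rng.expovariate(arrival_rate)   # next inter-arrival time A_{k+1}
        w = max(0.0, w + s - a)
        out.append(w)
    return out

# Plug-in estimate of E[w] (the alpha = 1 moment), discarding a burn-in
waits = waiting_times(arrival_rate=0.8, service_rate=1.0, n=50_000)
mean_wait = sum(waits[10_000:]) / len(waits[10_000:])
```

For these illustrative M/M/1 rates (traffic intensity 0.8) the stationary mean wait is 0.8/(1.0 − 0.8) = 4.0, which the plug-in estimate approaches as the sample grows.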
Abstract:
In a medical negligence context, and under the causation provisions enacted pursuant to Civil Liability Legislation in most Australian jurisdictions, the normative concept of “scope of liability” requires consideration of whether, and why, a medical practitioner should be responsible for a patient’s harm. As such, it places a limit on the extent to which practitioners are deemed liable for a breach of the duty of care owed by them, in circumstances where a legal factual connection between that breach and the causation of a patient’s harm has already been shown. It has been said that a determination of causation requires ‘the identification and articulation of an evaluative judgement by reference to “the purposes and policy of the relevant part of the law”’: Wallace v Kam (2013) 297 ALR 383, 388. Accordingly, one of the normative factors falling within scope of liability is an examination of the content and purpose of the rule or duty of care violated – that is, its underlying policy and whether this supports an attribution of legal responsibility upon a practitioner. In this context, and with reference to recent jurisprudence, this paper considers: the policy relevant to a practitioner’s duty of care in each of the areas of diagnosis, treatment and advice; how this has been used to determine an appropriate scope of liability for the purpose of the causation inquiry in medical negligence claims; and whether such an approach is problematic for medical standards or decision-making.
Abstract:
This paper presents the modeling and analysis of a voltage source converter (VSC) based back-to-back (BTB) HVDC link. The case study considers the response to changes in active and reactive power and to a disturbance caused by a single line-to-ground (SLG) fault. The controllers at each terminal are designed to inject a variable (in magnitude and phase angle), sinusoidal, balanced set of voltages to regulate the active and reactive power. It is also possible to regulate the converter bus (AC) voltage by controlling the injected reactive power. The analysis is carried out using both a d-q model (neglecting harmonics in the VSC output voltages) and a detailed three-phase model of the VSC. While the eigenvalue analysis and controller design are based on the d-q model, the transient simulation considers both models.
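The active/reactive power regulation described above is commonly realised in a voltage-oriented d-q frame, where with v_q = 0 the standard relations P = (3/2)·v_d·i_d and Q = −(3/2)·v_d·i_q invert directly to current references. The per-unit values and sign convention below are assumptions for illustration, not taken from the paper:

```python
def dq_current_refs(p_ref, q_ref, v_d):
    """Current references in a voltage-oriented d-q frame (v_q = 0):
       P = 1.5 * v_d * i_d  and  Q = -1.5 * v_d * i_q.
       Sign convention and per-unit scaling are assumptions."""
    i_d = 2.0 * p_ref / (3.0 * v_d)
    i_q = -2.0 * q_ref / (3.0 * v_d)
    return i_d, i_q

# Per-unit example: 0.9 pu active power, 0.3 pu reactive power
i_d, i_q = dq_current_refs(0.9, 0.3, v_d=1.0)
```

Substituting the references back into the power relations recovers the requested P and Q, which is the self-consistency an inner current loop would then enforce.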
Abstract:
This paper presents real-time simulation models of electrical machines on an FPGA platform. Implementation of real-time numerical integration methods with digital logic elements is discussed, and several integration methods are presented. A real-time simulation of a DC machine is carried out on the FPGA platform and important transient results are presented. These results are compared with simulation results obtained from commercial off-line simulation software.
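A minimal software analogue of the fixed-step integration such a platform implements is forward Euler applied to the DC machine equations; the machine parameters and step size below are illustrative assumptions, not those of the paper:

```python
def dc_machine_step(i, w, v, dt, R=1.0, L=0.05, K=0.5, J=0.01, B=0.001):
    """One forward-Euler step of a separately excited DC machine:
       L di/dt = v - R*i - K*w   (armature circuit)
       J dw/dt = K*i - B*w       (mechanical, no load torque).
       Parameter values are illustrative."""
    di = (v - R * i - K * w) / L
    dw = (K * i - B * w) / J
    return i + dt * di, w + dt * dw

# Start-up transient at a constant 100 V armature voltage
i, w = 0.0, 0.0
dt = 1e-4                      # fixed step, as in a hardware pipeline
for _ in range(50_000):        # 5 s of simulated time
    i, w = dc_machine_step(i, w, v=100.0, dt=dt)
```

At steady state v = R·i + K·w and K·i = B·w, giving roughly w ≈ 199 rad/s and i ≈ 0.4 A for these assumed parameters; an FPGA realisation would compute the same update with fixed-point arithmetic in dedicated logic.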
Abstract:
We propose a dynamic mathematical model of tissue oxygen transport by a preexisting three-dimensional microvascular network that provides nutrients for an in situ cancer at the very early stage of primary microtumour growth. The expanding tumour consumes oxygen as it invades the surrounding tissue and coopts host vessels. Cooption, remodelling and collapse of the preexisting vessels are modelled through the changes in haemodynamic conditions caused by the growing tumour. A detailed computational model of oxygen transport in tumour tissue is developed by considering (a) the time-varying oxygen advection-diffusion equation within the microvessel segments, (b) the oxygen flux across the vessel walls, and (c) oxygen diffusion and consumption within the tumour and surrounding healthy tissue. The results show the oxygen concentration distribution at different time points of early tumour growth. In addition, the influence of preexisting vessel density on oxygen transport is discussed. The proposed model not only provides a quantitative approach for investigating the interactions between tumour growth and oxygen delivery, but is also extendable to modelling the transport of other molecules or chemotherapeutic drugs in future studies.
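The transport terms in (a) and (c) can be sketched in one dimension: an explicit finite-difference update for ∂c/∂t = −u ∂c/∂x + D ∂²c/∂x² − k·c (advection, diffusion, consumption). The grid, coefficients and boundary conditions below are illustrative, not the paper's 3D network model:

```python
def step_oxygen(c, u, D, k, dx, dt):
    """Explicit update with first-order upwind advection (u > 0) for
       dc/dt = -u dc/dx + D d2c/dx2 - k*c  (k*c is tissue consumption)."""
    n = len(c)
    new = c[:]
    for i in range(1, n - 1):
        adv = -u * (c[i] - c[i - 1]) / dx
        dif = D * (c[i + 1] - 2.0 * c[i] + c[i - 1]) / dx ** 2
        new[i] = c[i] + dt * (adv + dif - k * c[i])
    new[0] = 1.0          # inlet held at normalised concentration
    new[-1] = new[-2]     # zero-gradient outflow
    return new

c = [0.0] * 50
c[0] = 1.0
for _ in range(20_000):   # march to roughly steady state
    c = step_oxygen(c, u=0.5, D=0.01, k=0.2, dx=0.02, dt=1e-4)
```

The explicit step obeys dt ≤ dx²/(2D) and the advective CFL limit here, so the profile stays bounded and decays monotonically downstream as consumption removes oxygen.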
Abstract:
A three-dimensional (3D) mathematical model of tumour growth in the avascular phase and vessel remodelling in host tissues is proposed, with emphasis on the interactions between tumour growth and the hypoxic micro-environment in host tissues. The hybrid model includes a continuum part, such as the distributions of oxygen and vascular endothelial growth factors (VEGFs), and a discrete part comprising the tumour cells (TCs) and blood vessel networks. The simulation shows the dynamic process of avascular tumour growth from a few initial cells to an equilibrium state under varied vessel networks. After a phase of rapidly increasing TC numbers, more and more host vessels collapse due to the stress exerted by the growing tumour. In addition, the region of oxygen consumption expands with the enlarging tumour. The study also discusses the effects of certain factors on tumour growth, including the density and configuration of the preexisting vessel networks and the blood oxygen content. The model enables us to examine the relationship between early tumour growth and the hypoxic micro-environment in host tissues, which can be useful for further applications such as tumour metastasis and the initiation of tumour angiogenesis.
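The feedback that drives the equilibrium above, discrete cell proliferation throttled by a shared oxygen supply, can be caricatured in a few lines. All rates and thresholds are invented for illustration, and the spatial, vascular and VEGF detail of the model is omitted entirely:

```python
def grow_tumour(steps, n0=1.0, rate=0.2, o2_init=1.0, uptake=0.002, thresh=0.3):
    """Toy hybrid loop: tumour cells proliferate only while the shared
       oxygen level exceeds a division threshold; oxygen falls with
       tumour size. Reaches an equilibrium ('avascular') size.
       All parameters are illustrative assumptions."""
    n = n0
    hist = []
    for _ in range(steps):
        o2 = max(0.0, o2_init - uptake * n)   # oxygen falls with tumour size
        if o2 > thresh:                       # division needs enough oxygen
            n *= 1.0 + rate
        hist.append((n, o2))
    return hist

hist = grow_tumour(200)
n_final, o2_final = hist[-1]
```

Growth is initially near-exponential and then halts once oxygen drops to the division threshold, mirroring the rapid-growth-then-equilibrium behaviour reported in the abstract.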
Abstract:
Background: Coronary tortuosity (CT) is a common coronary angiographic finding. Whether CT leads to an apparent reduction in coronary pressure distal to the tortuous segment of the coronary artery is still unknown. The purpose of this study is to determine the impact of CT on coronary pressure distribution by numerical simulation. Methods: 21 idealized models were created to investigate the influence of coronary tortuosity angle (CTA) and coronary tortuosity number (CTN) on coronary pressure distribution. A 2D incompressible Newtonian flow was assumed and the computational simulation was performed using the finite volume method. CTAs of 30°, 60°, 90° and 120° and CTNs of 0, 1, 2, 3, 4 and 5 were examined under both steady and pulsatile conditions, and the changes of outlet pressure and inlet velocity during the cardiac cycle were considered. Results: Coronary pressure distribution was affected by both CTA and CTN. We found that the pressure drop between the start and the end of the CT segment decreased with CTA, as did the length of the CT segment. An increase in CTN resulted in an increase in the pressure drop. Conclusions: Compared to no CT, CT can result in a greater decrease of coronary blood pressure, depending on the severity of tortuosity, and severe CT may cause myocardial ischemia.
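A lumped-parameter caricature of why more bends mean more pressure drop is a Poiseuille loss along the segment plus a minor-loss term per bend. This is not the study's finite-volume simulation; the loss coefficient, flow rate and geometry below are all assumptions:

```python
import math

def tortuous_pressure_drop(Q, r, L, ctn, K_bend=0.2, mu=3.5e-3, rho=1060.0):
    """Rough pressure-drop estimate (Pa) for a tortuous segment:
       Poiseuille loss 8*mu*L*Q/(pi*r^4) plus a lumped minor loss
       K_bend*rho*v^2/2 per bend (ctn bends). Illustrative only."""
    v = Q / (math.pi * r ** 2)                       # mean velocity, m/s
    poiseuille = 8.0 * mu * L * Q / (math.pi * r ** 4)
    bend_loss = ctn * K_bend * 0.5 * rho * v ** 2
    return poiseuille + bend_loss

# Assumed coronary-scale values: 1 mL/s flow, 1.5 mm radius, 5 cm segment
dp_straight = tortuous_pressure_drop(Q=1e-6, r=1.5e-3, L=0.05, ctn=0)
dp_tortuous = tortuous_pressure_drop(Q=1e-6, r=1.5e-3, L=0.05, ctn=5)
```

The monotone increase of the estimate with CTN reproduces the qualitative trend reported in the Results, though the magnitudes here carry no clinical meaning.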
Abstract:
The rupture of atherosclerotic plaques is known to be associated with the stresses that act on or within the arterial wall. Extreme wall tensile stress (WTS) is usually recognized as a primary trigger for the rupture of vulnerable plaque. The present study used in-vivo high-resolution multi-spectral magnetic resonance imaging (MRI) for carotid arterial plaque morphology reconstruction. Image segmentation of the different plaque components was based on the multi-spectral MRI and co-registered across sequences for each patient. Stress analysis was performed on four subjects with different plaque burdens by fluid-structure interaction (FSI) simulations. Wall shear stress distributions are highly related to the degree of stenosis, although their magnitude is much lower than that of the WTS in the fibrous cap. WTS is higher at the luminal wall and lower at the outer wall, with the lowest stress in the lipid region. Local stress concentrations are well confined to the thinner fibrous cap region, usually at the plaque shoulder; the relative stress variation in the fibrous cap during a cardiac cycle can be a potential indicator of the fatigue process in a thin fibrous cap. Based on the stress analysis of the four subjects, a risk assessment in terms of mechanical factors could be made, which may be helpful in clinical practice. However, patient-specific analyses of more subjects are desirable for plaque-stability studies.
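The sensitivity of cap stress to cap thickness can be illustrated with the thin-wall (Laplace) relation σ = P·r/t, a first-order stand-in for the FSI computation above; the pressure, radius and thickness values are illustrative assumptions:

```python
def cap_stress_kpa(p_mmhg, lumen_radius_mm, cap_thickness_mm):
    """Thin-wall (Laplace) estimate of circumferential wall tensile
       stress, sigma = P * r / t, returned in kPa. A first-order
       illustration, not the MRI-based FSI model of the study."""
    p_kpa = p_mmhg * 0.1333          # mmHg -> kPa
    return p_kpa * lumen_radius_mm / cap_thickness_mm

thick_cap = cap_stress_kpa(120.0, 3.0, 0.65)    # thicker fibrous cap
thin_cap = cap_stress_kpa(120.0, 3.0, 0.065)    # 10x thinner cap
```

Because stress scales inversely with thickness in this idealisation, a tenfold thinner cap carries tenfold higher tensile stress, which is why local concentrations sit in the thin-cap and shoulder regions.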
Abstract:
It is well accepted that over 50% of cerebral ischemic events result from the rupture of vulnerable carotid atheroma and subsequent thrombosis. Such strokes are potentially preventable by carotid interventions. Selection of patients for intervention is currently based on the severity of carotid luminal stenosis. It is, however, widely accepted that luminal stenosis alone may not be an adequate predictor of risk. To evaluate the effects of the degree of luminal stenosis and plaque morphology on plaque stability, we used a coupled nonlinear time-dependent model with flow-plaque interaction simulation to perform flow and stress/strain analysis for a stenotic artery with a plaque. The Navier-Stokes equations in the Arbitrary Lagrangian-Eulerian (ALE) formulation were used as the governing equations for the fluid. The Ogden strain energy function was used for both the fibrous cap and the lipid pool. The plaque principal stresses and flow conditions were calculated for every case while varying the fibrous cap thickness from 0.1 to 2 mm and the degree of luminal stenosis from 10% to 90%. Severe stenosis led to high flow velocities and high shear stresses, but a low or even negative pressure at the throat of the stenosis. A higher degree of stenosis and a thinner fibrous cap led to larger plaque stresses, and a 50% decrease in fibrous cap thickness resulted in a 200% increase in maximum stress. This model suggests that fibrous cap thickness is critically related to plaque vulnerability and that, even in the presence of only moderate stenosis, it may play an important role in the future risk stratification of patients, once identified in vivo using high-resolution MR imaging.
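The low or even negative throat pressure reported above follows qualitatively from continuity plus Bernoulli: the throat velocity rises as the lumen narrows, and the pressure falls with the square of that velocity. The sketch below is an idealisation with no viscous losses or wall motion, and the inlet values are assumptions:

```python
def throat_conditions(v0, stenosis, p0=13.3e3, rho=1060.0):
    """Continuity + Bernoulli estimate at a stenosis throat:
       area reduction 'stenosis' in [0, 1) gives v_t = v0 / (1 - stenosis)
       and p_t = p0 + rho/2 * (v0**2 - v_t**2). Idealised, illustrative."""
    v_t = v0 / (1.0 - stenosis)
    p_t = p0 + 0.5 * rho * (v0 ** 2 - v_t ** 2)
    return v_t, p_t

# Assumed inlet: 0.3 m/s velocity, 13.3 kPa (~100 mmHg) pressure
v50, p50 = throat_conditions(v0=0.3, stenosis=0.5)
v90, p90 = throat_conditions(v0=0.3, stenosis=0.9)
```

Going from 50% to 90% area reduction multiplies the throat velocity and sharply depresses the throat pressure; with still higher inlet velocities the idealised throat pressure can cross zero, consistent with the negative pressures the simulations found.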
Abstract:
Considering ultrasound propagation through complex composite media as an array of parallel sonic rays, a comparison of computer simulated prediction with experimental data has previously been reported for transmission mode (where one transducer serves as transmitter, the other as receiver) in a series of ten acrylic step-wedge samples, immersed in water, exhibiting varying degrees of transit time inhomogeneity. In this study, the same samples were used but in pulse-echo mode, where the same ultrasound transducer served as both transmitter and receiver, detecting both ‘primary’ (internal sample interface) and ‘secondary’ (external sample interface) echoes. A transit time spectrum (TTS) was derived, describing the proportion of sonic rays with a particular transit time. A computer simulation was performed to predict the transit time and amplitude of various echoes created, and compared with experimental data. Applying an amplitude-tolerance analysis, 91.7±3.7% of the simulated data was within ±1 standard deviation (STD) of the experimentally measured amplitude-time data. Correlation of predicted and experimental transit time spectra provided coefficients of determination (R2) ranging from 100.0% to 96.8% for the various samples tested. The results acquired from this study provide good evidence for the concept of parallel sonic rays. Further, deconvolution of experimental input and output signals has been shown to provide an effective method to identify echoes otherwise lost due to phase cancellation. Potential applications of pulse-echo ultrasound transit time spectroscopy (PE-UTTS) include improvement of ultrasound image fidelity by improving spatial resolution and reducing phase interference artefacts.
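The parallel-ray picture underlying the transit time spectrum can be sketched directly: each ray crosses d mm of acrylic and the remaining path in water, so its transit time is t = d/c_acrylic + (D − d)/c_water. The sound speeds and geometry below are nominal assumed values, not the paper's calibration:

```python
def ray_transit_times_us(step_mm, total_path_mm=50.0,
                         c_acrylic=2750.0, c_water=1480.0):
    """Transit time (microseconds) of each parallel sonic ray crossing
       d mm of acrylic and (total_path_mm - d) mm of water.
       Sound speeds are nominal literature values (assumptions)."""
    return [1000.0 * (d / c_acrylic + (total_path_mm - d) / c_water)
            for d in step_mm]

steps = [0.0, 5.0, 10.0, 15.0, 20.0]   # assumed acrylic step thicknesses, mm
tts = ray_transit_times_us(steps)
```

Because sound travels faster in acrylic than in water, rays through thicker steps arrive earlier; a histogram of these times over all rays is exactly a transit time spectrum in the sense used above.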
Abstract:
Three simulations of evapotranspiration were done with two values of time step, viz. 10 min and one day. Inputs to the model were weather data, including directly measured upward and downward radiation, and soil characteristics. Three soils were used for each simulation. Analysis of the results shows that the time step has a direct influence on the prediction of potential evapotranspiration, but a complex interaction of this effect with the soil moisture characteristic, the rate of increase of ground cover and bare soil evaporation determines the actual transpiration predicted. The results indicate that as small a time step as possible should be used in the simulation.
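The time-step effect can be reproduced with a toy water-balance model in which actual evapotranspiration is limited by the water remaining in the soil; all parameters are invented for illustration and are unrelated to the paper's soils or weather data:

```python
def simulate_drying(dt_days, days=30, w0=100.0, pet=5.0, k=0.05):
    """Toy soil-water balance stepped with forward Euler: actual
       evapotranspiration AET = min(PET, k*W) is limited by the stored
       water W (mm). Parameters are illustrative assumptions."""
    w = w0
    total_aet = 0.0
    for _ in range(int(days / dt_days)):
        aet = min(pet, k * w)
        w = max(0.0, w - aet * dt_days)
        total_aet += aet * dt_days
    return w, total_aet

w_daily, aet_daily = simulate_drying(dt_days=1.0)                 # one-day step
w_fine, aet_fine = simulate_drying(dt_days=10.0 / (24 * 60))      # 10-min step
```

The daily step over-depletes the store (water removed early in a step cannot limit evaporation later in that step), so the two step sizes predict measurably different cumulative evapotranspiration, in line with the paper's recommendation of as small a time step as possible.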
Abstract:
The literature contains many examples of digital procedures for the analytical treatment of electroencephalograms, but there is as yet no standard by which those techniques may be judged or compared. This paper proposes one method of generating an EEG, based on a computer program for Zetterberg's simulation. It is assumed that the statistical properties of an EEG may be represented by stationary processes having rational transfer functions, realised by a system of software filters and random number generators. The model represents neither the neurological mechanisms responsible for generating the EEG, nor any particular type of EEG record; transient phenomena such as spikes, sharp waves and alpha bursts are also excluded. The basis of the program is a valid ‘partial’ statistical description of the EEG; that description is then used to produce a digital representation of a signal which, if plotted sequentially, might or might not by chance resemble an EEG; that is unimportant. What is important is that the statistical properties of the series remain those of a real EEG; it is in this sense that the output is a simulation of the EEG. There is considerable flexibility in the form of the output, i.e. its alpha, beta and delta content, which may be selected by the user, the same selected parameters always producing the same statistical output. The filtered outputs from the random number sequences may be scaled to provide realistic power distributions in the accepted EEG frequency bands and then summed to create a digital output signal, the ‘stationary EEG’. It is suggested that the simulator might act as a test input to digital analytical techniques for the EEG, a simulator which would enable at least a substantial part of those techniques to be compared and assessed in an objective manner. The equations necessary to implement the model are given.
The program has been run on a DEC1090 computer but is suitable for any microcomputer having more than 32 kBytes of memory; the execution time required to generate a 25 s simulated EEG is in the region of 15 s.
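The filtered-random-number construction can be sketched with a single rational-transfer-function channel: white noise through an AR(2) resonator peaked in the alpha band. This illustrates the idea only; it is not Zetterberg's model, and the sampling rate, pole radius and centre frequency are assumptions:

```python
import math
import random

def simulate_alpha_eeg(n, fs=256.0, f0=10.0, r=0.95, seed=1):
    """'Stationary EEG' sketch: Gaussian white noise through an AR(2)
       resonator (a rational transfer function) peaked at f0 Hz.
       x[k] = a1*x[k-1] + a2*x[k-2] + e[k]. Parameters are assumptions."""
    rng = random.Random(seed)
    a1 = 2.0 * r * math.cos(2.0 * math.pi * f0 / fs)
    a2 = -r * r
    x1 = x2 = 0.0
    out = []
    for _ in range(n):
        x = a1 * x1 + a2 * x2 + rng.gauss(0.0, 1.0)
        out.append(x)
        x2, x1 = x1, x
    return out

def band_power(sig, f, fs=256.0):
    """Squared magnitude of the DFT at frequency f (single-bin estimate)."""
    re = sum(s * math.cos(2.0 * math.pi * f * k / fs) for k, s in enumerate(sig))
    im = sum(s * math.sin(2.0 * math.pi * f * k / fs) for k, s in enumerate(sig))
    return re * re + im * im

sig = simulate_alpha_eeg(int(25 * 256))   # 25 s of 'EEG' at 256 Hz
p_alpha = band_power(sig, 10.0)
p_gamma = band_power(sig, 40.0)
```

A full simulator would sum several such channels scaled to the desired alpha, beta and delta powers; fixing the seed gives the repeatability the paper requires (the same parameters always producing the same statistical output).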