Abstract:
When asymptotic series methods are applied in order to solve problems that arise in applied mathematics in the limit that some parameter becomes small, they are unable to demonstrate behaviour that occurs on a scale that is exponentially small compared to the algebraic terms of the asymptotic series. There are many examples of physical systems where behaviour on this scale has important effects and, as such, a range of techniques known as exponential asymptotic techniques were developed that may be used to examine behaviour on this exponentially small scale. Many problems in applied mathematics may be represented by behaviour within the complex plane, which may subsequently be examined using asymptotic methods. These problems frequently demonstrate behaviour known as Stokes phenomenon, which involves the rapid switching of behaviour on an exponentially small scale in the neighbourhood of some curve known as a Stokes line. Exponential asymptotic techniques have been applied in order to obtain an expression for this exponentially small switching behaviour in the solutions to ordinary and partial differential equations. The problem of potential flow over a submerged obstacle has been previously considered in this manner by Chapman & Vanden-Broeck (2006). By representing the problem in the complex plane and applying an exponential asymptotic technique, they were able to detect the switching, and subsequent behaviour, of exponentially small waves on the free surface of the flow in the limit of small Froude number, specifically considering the case of flow over a step with one Stokes line present in the complex plane. We consider an extension of this work to flow configurations with multiple Stokes lines, such as flow over an inclined step, or flow over a bump or trench.
The resultant expressions are analysed, and demonstrate interesting implications, such as the presence of exponentially sub-subdominant intermediate waves and the possibility of trapped surface waves for flow over a bump or trench. We then consider the effect of multiple Stokes lines in higher order equations, particularly investigating the behaviour of higher-order Stokes lines in the solutions to partial differential equations. These higher-order Stokes lines switch off the ordinary Stokes lines themselves, adding a layer of complexity to the overall Stokes structure of the solution. Specifically, we consider the different approaches taken by Howls et al. (2004) and Chapman & Mortimer (2005) in applying exponential asymptotic techniques to determine the higher-order Stokes phenomenon behaviour in the solution to a particular partial differential equation.
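The switching behaviour described above can be sketched schematically. As a hedged illustration (the symbols and the specific form are generic, not taken from this thesis), a solution with an optimally truncated asymptotic series plus an exponentially small remainder may be written

```latex
u(z;\epsilon) \sim \sum_{n=0}^{N-1} \epsilon^{n} a_{n}(z)
  \;+\; \mathcal{S}(z)\, A(z)\, e^{-\chi(z)/\epsilon},
  \qquad \epsilon \to 0,
```

where the Stokes multiplier \(\mathcal{S}(z)\) jumps rapidly (typically through an error-function profile) from 0 to 1 as \(z\) crosses a Stokes line, on which \(\operatorname{Im}\chi = 0\) and \(\operatorname{Re}\chi > 0\). With several singularities \(\chi_j\) there are several Stokes lines, and a higher-order Stokes line is a curve across which one of these ordinary Stokes lines is itself switched on or off.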
Abstract:
A research study was conducted in a key area of project management: stakeholder and relationship management through communication - ‘the soft skills’. It was conducted with Diploma of Project Management graduates from one Australian Registered Training Organisation (RTO), the Australian College of Project Management (ACPM). The study was designed to initially identify the qualifications and project management experience of the participants. Further, it identified the respondents’ understanding of and attitude to commonly held principles and literature within the project management field as it relates to the soft skills of projects. This is specifically connected to their project experience and knowledge, approach to project communications, and the stakeholders’ needs. Some of the literature showed that the management and application of the project soft skills by project managers may actually be a recipe for project success. Hence, an important underpinning of this study was that the project manager can enhance project success (or reduce the impact of failure) by identifying and prioritising stakeholders, and by developing and implementing strategies for engaging and communicating with them. The use of a positivist approach to this research study allowed for the evaluation and understanding of respondents’ attitudes to the emergent theories of successful projects being delivered through the management of stakeholders, communications, and relationships. Consequently, a quantitative approach to this study was undertaken. The participants were drawn from graduates who completed the Diploma of Project Management at the ACPM between January 2004 and December 2007 only. A list of graduates was collated from this period, indicating that a total of 656 graduates had completed the qualification. The data collection for this study was done in one phase only.
The questionnaire was emailed individually by the researcher directly to the selected potential respondents. Subsequently, a total of 44 responses were received, providing an overall response rate of 43%. Two key factors emerged from the survey questionnaire: firstly, the need for the soft skills to be incorporated in project management curricula and education programs, and secondly, that successful projects are delivered through the management and application of the project soft skills. It is expected that the findings of this study will be provided across various forums (such as vocational education and training, and project management conferences) and via project management bodies such as the Australian Institute of Project Management (AIPM) to inform learning and provide greater insight into the soft skills of project management. It is the contention of the researcher that this quantitative study of Diploma of Project Management graduates’ views and attitudes highlights the importance of project soft skills in the delivery of successful projects, as well as their place among the competencies of a successful project manager. This study also revealed the value of project experience and knowledge as it pertains to the management and application of the project soft skills.
Abstract:
Shared leadership has been identified as a key governance base for the future of government and Catholic schools in Queensland, the state’s two largest providers of school education. Shared leadership values the contributions that many individuals can make through collaboration and teamwork. It claims to improve organisational performance and reduce the increasing pressures faced by principals. However, despite these positive features, shared leadership is generally not well understood, not well accepted and not valued by those who practise or study leadership. A collective case study method was chosen, incorporating a series of semi-structured interviews with principals and the use of official school documents. The study has explored the current understanding and practice of shared leadership in four Queensland schools and investigated its potential for use.
Abstract:
Immersive environments are part of a recent media innovation that allows users to become so involved within a computer-based simulated environment that they feel part of that virtual world (Grigorovici, 2003). A specific example is Second Life, an internet-based, three-dimensional immersive virtual world in which users create an online representation of themselves (an avatar) to play games and interact socially with thousands of people simultaneously. This study focuses on Second Life as an example of an immersive environment, as it is the largest adult freeform virtual world, home to 12 million avatars (Iowa State University, 2008). Already in Second Life there are more than 100 real-life brands from a range of industries, including automotive, professional services, consumer goods and travel, among others (KZero, 2007; New Business Horizons, 2009). Compared to traditional advertising media, this interactive medium can immerse users in the environment. As a result of this interactivity, users can become more involved with a virtual environment, resulting in prolonged usage over weeks, months and even years. It can also facilitate presence. Despite these developments, little is known about the effectiveness of marketing messages in a virtual world context. Marketers are incorporating products into Second Life using a strategy of online product placement. This study, therefore, explores the perceived effectiveness of online product placement in Second Life in terms of effects on product/brand recall, purchase intentions and trial. This research examines the association between individuals’ involvement with Second Life and the effectiveness of online product placement. In addition, it investigates the association between immersion and product placement involvement.
It also examines the impact of product placement involvement on online product placement effectiveness and the role of presence in affecting this relationship. An exploratory study was conducted for this research using semi-structured in-depth interviews conducted face-to-face, by email and in-world. The sample comprised 24 active Second Life users. Results indicate that product placement effectiveness is not directly associated with Second Life involvement; rather, effectiveness is impacted through the effect of Second Life involvement on product placement involvement. A positive relationship was found between individuals’ product placement involvement and online product placement effectiveness. Findings also indicate that online product placement effectiveness is not directly associated with immersion. Rather, it appears that effectiveness is impacted through the effect of immersion on product placement involvement. Moreover, higher levels of presence appear to have a positive impact on the relationship between product placement involvement and product placement effectiveness. Finally, a model was developed from this qualitative study for future testing. In terms of theoretical contributions, this study provides a new model for testing the effectiveness of product placement within immersive environments. From a methodological perspective, in-world interviews were undertaken as a new research method. In terms of a practical contribution, the findings identified useful information for marketers and advertising agencies that aim to promote their products in immersive virtual environments like Second Life.
Abstract:
Water Sensitive Urban Design (WSUD) systems have the potential to mitigate the hydrologic disturbance and water quality concerns associated with stormwater runoff from urban development. In the last few years WSUD has been strongly promoted in South East Queensland (SEQ) and new developments are now required to use WSUD systems to manage stormwater runoff. However, there has been limited field evaluation of WSUD systems in SEQ and consequently knowledge of their effectiveness in the field, under storm events, is limited. The objective of this research project was to assess the effectiveness of WSUD systems installed in a residential development, under real storm events. To achieve this objective, a constructed wetland, a bioretention swale and a bioretention basin were evaluated for their ability to improve the hydrologic and water quality characteristics of stormwater runoff from urban development. The monitoring focused on storm events, with sophisticated event monitoring stations measuring the inflow to and outflow from the WSUD systems. Data analysis undertaken confirmed that the constructed wetland, bioretention basin and bioretention swale improved the hydrologic characteristics by reducing peak flow. The bioretention systems, particularly the bioretention basin, also reduced the runoff volume and frequency of flow, meeting key objectives of current urban stormwater management. The pollutant loads were reduced by the WSUD systems to above or just below the regional guidelines, showing significant reductions in TSS (70-85%), TN (40-50%) and TP (50%). The load reduction of NOx and PO₄³⁻ by the bioretention basin was poor (<20%), whilst the constructed wetland effectively reduced the load of these pollutants in the outflow by approximately 90%. The primary reason for the load reduction in the wetland was a reduction in concentration in the outflow, showing efficient treatment of stormwater by the system.
In contrast, the concentrations of key pollutants exiting the bioretention basin were higher than in the inflow. However, as the volume of stormwater exiting the bioretention basin was significantly lower than the inflow, a load reduction was still achieved. Calibrated MUSIC modelling showed that the bioretention basin and, in particular, the constructed wetland were undersized, with 34% and 62% of stormwater respectively bypassing the treatment zones in the devices. Over the long term, a large proportion of runoff would not receive treatment, considerably reducing the effectiveness of the WSUD systems.
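The distinction between concentration and load drawn above is worth making explicit: a pollutant load is concentration multiplied by volume, so a device can raise outflow concentration yet still cut the load if it sheds enough runoff volume. The sketch below illustrates this with invented numbers, not values from the study.

```python
# Hypothetical illustration: pollutant *load* = concentration x volume, so a
# bioretention basin can raise outflow concentration yet still reduce the
# load if it retains enough runoff volume. All numbers here are invented.

def load_kg(concentration_mg_per_L, volume_kL):
    # 1 mg/L x 1 kL = 1 g, so divide by 1000 to express the load in kg
    return concentration_mg_per_L * volume_kL / 1000.0

inflow_load = load_kg(concentration_mg_per_L=2.0, volume_kL=500.0)   # 1.0 kg
outflow_load = load_kg(concentration_mg_per_L=3.0, volume_kL=200.0)  # 0.6 kg

reduction = 1.0 - outflow_load / inflow_load
print(f"load reduction: {reduction:.0%}")  # 40% despite a higher concentration
```

This is the mechanism the abstract describes for the bioretention basin: higher outflow concentration, but a net load reduction driven by volume retention.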
Abstract:
Art is most often at the margins of community life, seen as a distraction or entertainment only; an individual’s whim. It is generally seen as being without a useful role to play in that community. This is a perception of grown-ups; children seem readily to accept an engagement with art making. Our research has shown that when an individual is drawn into a crafted art project where they have an actual involvement with the direction and production of the art work, they become deeply engaged on multiple levels. This is true of all age groups. Artists skilled in community collaboration are able to produce art of value that transcends the usual judgements of worth. It gives people a licence to unfetter their imagination and then cooperatively be drawn back to a reachable visual solution. If you engage with children in a community, you engage the extended family at some point. The primary methodology was to produce a series of educationally valid projects at the Cherbourg State School that had a resonance in that community, then revisit and refine them where necessary and develop a new series that extended all of their positive aspects. This was done over a period of five years. The art made during this time is excellent. The children know it, as do their families, staff at the school, members of the local community and others who have viewed it in exhibitions in far places like Brisbane and Melbourne. This art, and the way it has been made, has been acknowledged as useful by the children, teachers and the community, in educational and social terms. The school is a better place to be. The art making of the last five years has become an integral part of the way the school now operates, and its influence has begun to seep into other parts of the community. Art needs to be taken from the margins and put to work at the centre.
Abstract:
Power transformers are among the most important and costly items of equipment in power generation, transmission and distribution systems. The current average age of transformers in Australia is around 25 years and there is a strong economic incentive to use them for up to 50 years or more. As transformers operate, they degrade under varying loading and environmental stress conditions. In today’s competitive energy market, with the penetration of distributed energy sources, transformers are stressed more while receiving only the minimum required maintenance. Modern asset management programs try to increase the usable lifetime of power transformers with prognostic techniques using condition indicators. In the case of oil-filled transformers, condition monitoring methods based on dissolved gas analysis, polarization studies, partial discharge studies, frequency response analysis (to check mechanical integrity), IR heat monitoring and other vibration monitoring techniques are in use. In the current research program, studies have been initiated to identify the degradation of insulating materials by the electrical relaxation technique known as dielectrometry. Aging leads to degradation products, principally moisture and other oxidized products, due to fluctuating thermal and electrical loading. By applying repetitive low frequency, high voltage sine wave perturbations in the range of 100 to 200 V peak across available terminals of a power transformer, the conductive and polarization parameters of insulation aging are identified. An in-house novel digital instrument was developed to record the low leakage response of repetitive polarization currents in a three-terminal configuration. The technique was tested on three known transformers rated at 5 kVA or more. The effects of the stressing polarization voltage level, polarizing wave shapes and various terminal configurations provide characteristic aging relaxation information. Using different analyses, sensitive parameters of aging are identified and presented in this thesis.
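To make the relaxation idea concrete: a common textbook model (not the thesis's instrument) represents the measured "absorption" current as a steady conduction term plus a sum of Debye relaxation exponentials. The sketch below uses invented parameters; aged (moist, oxidised) insulation typically shows a larger conduction term and slower relaxation branches.

```python
import math

# Hedged sketch of a dielectric relaxation model, NOT the thesis's method:
# polarization current = steady conduction current + sum of Debye branches.
# All parameter values below are invented for illustration.

def polarization_current(t, i_cond, branches):
    """branches: list of (amplitude_A, tau_s) Debye relaxation branches."""
    return i_cond + sum(A * math.exp(-t / tau) for A, tau in branches)

branches = [(5e-9, 2.0), (2e-9, 50.0)]  # a fast and a slow relaxation branch
i_cond = 1e-10                           # steady conduction current (A)

# At long times the exponentials die away, leaving only conduction -- which
# is why long-duration polarization current records reveal insulation state.
i_early = polarization_current(0.1, i_cond, branches)
i_late = polarization_current(500.0, i_cond, branches)
print(f"{i_early:.3e} A -> {i_late:.3e} A")
```

Fitting recorded currents to such a model is one way the conductive and polarization contributions can be separated.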
Abstract:
In this thesis, the relationship between air pollution and human health has been investigated utilising a Geographic Information System (GIS) as an analysis tool. The research focused on how vehicular air pollution affects human health. The main objective of this study was to analyse the spatial variability of pollutants, taking Brisbane City in Australia as a case study, by identifying areas of high concentration of air pollutants and their relationship with the number of deaths caused by air pollutants. A correlation test was performed to establish the relationship between air pollution, the number of deaths from respiratory disease, and the total distance travelled by road vehicles in Brisbane. GIS was utilised to investigate the spatial distribution of the air pollutants. The main finding of this research is the comparison between spatial and non-spatial analysis approaches, which indicated that correlation analysis and simple GIS buffer analysis using the average levels of air pollutants from a single monitoring station, or from a group of a few monitoring stations, is a relatively simple method for assessing the health effects of air pollution. There was a significant positive correlation between the variables under consideration, and the research shows a decreasing trend in the concentration of nitrogen dioxide at the Eagle Farm and Springwood sites and an increasing trend at the CBD site. Statistical analysis shows that there exists a positive relationship between the level of emissions and the number of deaths, though the impact is not uniform, as certain sections of the population are more vulnerable to exposure. Further statistical tests found that elderly people over 75 years of age and children between 0 and 15 years of age are the people most vulnerable to air pollution exposure. A non-spatial approach alone may be insufficient for an appropriate evaluation of the impact of air pollutant variables and their inter-relationships. It is important to evaluate the spatial features of air pollutants before modelling the air pollution-health relationships.
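The non-spatial correlation test described above can be sketched as follows. The data are invented for illustration (they are not the Brisbane measurements), and the Pearson coefficient is computed directly rather than through a statistics library.

```python
import math

# Illustrative sketch of the kind of correlation test described: annual NO2
# levels versus respiratory deaths. The data below are invented, not the
# thesis's monitoring-station or mortality records.

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

no2_ppb = [12, 15, 14, 18, 21, 19, 23]   # hypothetical annual mean NO2
deaths = [30, 34, 33, 40, 44, 41, 49]    # hypothetical annual death counts

r = pearson_r(no2_ppb, deaths)
print(f"r = {r:.3f}")  # strongly positive for this invented series
```

As the abstract notes, such a non-spatial test is only a first step; the spatial distribution of the pollutant still needs to be examined in the GIS.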
Abstract:
The high morbidity and mortality associated with atherosclerotic coronary vascular disease (CVD) and its complications are being lessened by the increased knowledge of risk factors, effective preventative measures and proven therapeutic interventions. However, significant CVD morbidity remains and sudden cardiac death continues to be a presenting feature for some subsequently diagnosed with CVD. Coronary vascular disease is also the leading cause of anaesthesia-related complications. Stress electrocardiography/exercise testing is predictive of 10-year risk of CVD events and the cardiovascular variables used to score this test are monitored peri-operatively. Similar physiological time-series datasets are being subjected to data mining methods for the prediction of medical diagnoses and outcomes. This study aims to find predictors of CVD using anaesthesia time-series data and patient risk factor data. Several pre-processing and predictive data mining methods are applied to this data. Physiological time-series data related to anaesthetic procedures are subjected to pre-processing methods for removal of outliers, calculation of moving averages as well as data summarisation and data abstraction methods. Feature selection methods of both wrapper and filter types are applied to derived physiological time-series variable sets alone and to the same variables combined with risk factor variables. The ability of these methods to identify subsets of highly correlated but non-redundant variables is assessed. The major dataset is derived from the entire anaesthesia population and subsets of this population are considered to be at increased anaesthesia risk based on their need for more intensive monitoring (invasive haemodynamic monitoring and additional ECG leads).
Because of the unbalanced class distribution in the data, majority class under-sampling and the Kappa statistic, together with the misclassification rate and area under the ROC curve (AUC), are used for evaluation of models generated using different prediction algorithms. The performance of models derived from feature-reduced datasets reveals the filter method, Cfs subset evaluation, to be most consistently effective, although Consistency-derived subsets tended to slightly increase accuracy but markedly increase complexity. The use of misclassification rate (MR) for model performance evaluation is influenced by class distribution. This could be eliminated by consideration of the AUC or Kappa statistic, as well as by evaluation of subsets with an under-sampled majority class. The noise and outlier removal pre-processing methods produced models with MR ranging from 10.69 to 12.62, with the lowest value being for data from which both outliers and noise were removed (MR 10.69). For the raw time-series dataset, MR is 12.34. Feature selection results in a reduction in MR to 9.8-10.16, with the time segmented summary data (dataset F) MR being 9.8 and the raw time-series summary data (dataset A) being 9.92. However, for all time-series-only datasets, the complexity is high. For most pre-processing methods, Cfs could identify a subset of correlated and non-redundant variables from the time-series-alone datasets, but models derived from these subsets are of one leaf only. MR values are consistent with the class distribution in the subset folds evaluated in the n-fold cross-validation method. For models based on Cfs-selected time-series-derived and risk factor (RF) variables, the MR ranges from 8.83 to 10.36, with dataset RF_A (raw time-series data and RF) being 8.85 and dataset RF_F (time segmented time-series variables and RF) being 9.09.
The models based on counts of outliers and counts of data points outside the normal range (Dataset RF_E) and on derived variables based on time series transformed using Symbolic Aggregate Approximation (SAX) with associated time-series pattern cluster membership (Dataset RF_G) perform the least well, with MR of 10.25 and 10.36 respectively. For coronary vascular disease prediction, nearest neighbour (NNge) and the support vector machine based method, SMO, have the highest MR of 10.1 and 10.28, while logistic regression (LR) and the decision tree (DT) method, J48, have MR of 8.85 and 9.0 respectively. DT rules are the most comprehensible and clinically relevant. The predictive accuracy increase achieved by the addition of risk factor variables to time-series variable based models is significant. The addition of time-series derived variables to models based on risk factor variables alone is associated with a trend to improved performance. Data mining of feature-reduced anaesthesia time-series variables together with risk factor variables can produce compact and moderately accurate models able to predict coronary vascular disease. Decision tree analysis of time-series data combined with risk factor variables yields rules which are more accurate than models based on time-series data alone. The limited additional value provided by electrocardiographic variables when compared to the use of risk factors alone is similar to recent suggestions that exercise electrocardiography (exECG) under standardised conditions has limited additional diagnostic value over risk factor analysis and symptom pattern. The pre-processing used in this study had limited effect when time-series variables and risk factor variables are used as model input.
In the absence of risk factor input, the use of time-series variables after outlier removal, and of time-series variables based on physiological variable values being outside the accepted normal range, is associated with some improvement in model performance.
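Two of the evaluation ideas above, majority-class under-sampling and the Kappa statistic, can be sketched in a few lines. The data are invented; the point is why Kappa is preferred to raw misclassification rate under a skewed class distribution.

```python
import random

# Hedged sketch of two evaluation techniques named in the study, on invented
# data: (1) under-sample the majority class to balance the training set, and
# (2) score with Cohen's kappa, which corrects for chance agreement.

def undersample(records, label_key="cvd"):
    pos = [r for r in records if r[label_key] == 1]
    neg = [r for r in records if r[label_key] == 0]
    majority, minority = (neg, pos) if len(neg) > len(pos) else (pos, neg)
    return minority + random.sample(majority, len(minority))

def cohens_kappa(y_true, y_pred):
    n = len(y_true)
    p_obs = sum(t == p for t, p in zip(y_true, y_pred)) / n
    labels = set(y_true) | set(y_pred)
    p_exp = sum((y_true.count(c) / n) * (y_pred.count(c) / n) for c in labels)
    return (p_obs - p_exp) / (1 - p_exp)

random.seed(0)
data = [{"cvd": 1}] * 20 + [{"cvd": 0}] * 180   # 10% positive class
balanced = undersample(data)
print(len(balanced))  # 40: 20 positives + 20 sampled negatives

# A classifier that always predicts the majority class scores 90% accuracy
# (misclassification rate 10%) on the raw data, yet kappa = 0: no skill.
y_true = [1] * 20 + [0] * 180
y_pred = [0] * 200
print(cohens_kappa(y_true, y_pred))  # 0.0
```

This is the effect the abstract describes: misclassification rate is influenced by class distribution, which Kappa (or AUC, or under-sampling before evaluation) avoids.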
Abstract:
Aims: To develop clinical protocols for acquiring PET images, performing CT-PET registration and tumour volume definition based on the PET image data, for radiotherapy for lung cancer patients and then to test these protocols with respect to levels of accuracy and reproducibility. Method: A phantom-based quality assurance study of the processes associated with using registered CT and PET scans for tumour volume definition was conducted to: (1) investigate image acquisition and manipulation techniques for registering and contouring CT and PET images in a radiotherapy treatment planning system, and (2) determine technology-based errors in the registration and contouring processes. The outcomes of the phantom image based quality assurance study were used to determine clinical protocols. Protocols were developed for (1) acquiring patient PET image data for incorporation into the 3DCRT process, particularly for ensuring that the patient is positioned in their treatment position; (2) CT-PET image registration techniques and (3) GTV definition using the PET image data. The developed clinical protocols were tested using retrospective clinical trials to assess levels of inter-user variability which may be attributed to the use of these protocols. A Siemens Somatom Open Sensation 20-slice CT scanner and a Philips Allegro stand-alone PET scanner were used to acquire the images for this research. The Philips Pinnacle3 treatment planning system was used to perform the image registration and contouring of the CT and PET images. Results: Both the attenuation-corrected and transmission images obtained from standard whole-body PET staging clinical scanning protocols were acquired and imported into the treatment planning system for the phantom-based quality assurance study. Protocols for manipulating the PET images in the treatment planning system, particularly for quantifying uptake in volumes of interest and window levels for accurate geometric visualisation, were determined.
The automatic registration algorithms were found to have sub-voxel levels of accuracy, with transmission scan-based CT-PET registration more accurate than emission scan-based registration of the phantom images. Respiration-induced image artifacts were not found to influence registration accuracy, while inadequate pre-registration overlap of the CT and PET images was found to result in large registration errors. A threshold value based on a percentage of the maximum uptake within a volume of interest was found to accurately contour the different features of the phantom despite the lower spatial resolution of the PET images. Appropriate selection of the threshold value is dependent on target-to-background ratios and the presence of respiratory motion. The results from the phantom-based study were used to design, implement and test clinical CT-PET fusion protocols. The patient PET image acquisition protocols enabled patients to be successfully identified and positioned in their radiotherapy treatment position during the acquisition of their whole-body PET staging scan. While automatic registration techniques were found to reduce inter-user variation compared to manual techniques, there was no significant difference in the registration outcomes for transmission or emission scan-based registration of the patient images using the protocol. Tumour volumes contoured on registered patient CT-PET images, using the tested threshold values and viewing windows determined from the phantom study, demonstrated less inter-user variation for the primary tumour volume contours than those contoured using only the patient’s planning CT scans. Conclusions: The developed clinical protocols allow a patient’s whole-body PET staging scan to be incorporated, manipulated and quantified in the treatment planning process to improve the accuracy of gross tumour volume localisation in 3D conformal radiotherapy for lung cancer.
Image registration protocols which factor in potential software-based errors, combined with adequate user training, are recommended to increase the accuracy and reproducibility of registration outcomes. A semi-automated adaptive threshold contouring technique incorporating a PET windowing protocol accurately defines the geometric edge of a tumour volume using PET image data from a stand-alone PET scanner, including 4D target volumes.
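The percentage-of-maximum thresholding idea described above can be sketched on a toy image. The array and the 40% figure below are illustrative assumptions, not values from the thesis (which notes the appropriate fraction depends on target-to-background ratio and motion).

```python
# Hedged sketch of percentage-of-maximum threshold contouring: voxels at or
# above a chosen fraction of the peak uptake in a volume of interest are
# included in the contour. The toy slice and the 40% fraction are invented.

def threshold_contour(uptake, fraction=0.40):
    peak = max(v for row in uptake for v in row)
    cutoff = fraction * peak
    return [[1 if v >= cutoff else 0 for v in row] for row in uptake]

# Toy 2D "slice": a hot tumour (peak 10.0) blurred into background (~1.0),
# mimicking the limited spatial resolution of PET.
slice_suv = [
    [1.0, 1.2, 1.1, 1.0],
    [1.1, 6.0, 10.0, 1.3],
    [1.0, 5.5, 8.0, 1.2],
    [1.0, 1.1, 1.3, 1.0],
]

mask = threshold_contour(slice_suv)
for row in mask:
    print(row)
# rows: [0,0,0,0], [0,1,1,0], [0,1,1,0], [0,0,0,0]
```

A clinical implementation would operate on 3D (or 4D) voxel data inside the planning system, but the thresholding step itself is this simple comparison.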
Abstract:
Reputation and proof-of-work systems have been outlined as methods bot masters will soon use to defend their peer-to-peer botnets. These techniques are designed to prevent Sybil attacks, such as those that led to the downfall of the Storm botnet. To evaluate the effectiveness of these techniques, a botnet that employed them was simulated, and the amount of resources required to stage a successful Sybil attack against it was measured. While the proof-of-work system was found to increase the resources required for a successful Sybil attack, the reputation system was found to lower the amount of resources required to disable the botnet.
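The proof-of-work idea can be illustrated with a minimal sketch, assuming a generic hash-puzzle admission scheme (the identity format and difficulty are invented, not details of the simulated botnet): each fake peer identity must find a nonce whose hash falls below a target, so each Sybil costs real computation.

```python
import hashlib

# Hedged sketch of a hash-puzzle proof-of-work admission scheme: a joining
# peer must find a nonce giving a SHA-256 digest with `difficulty_bits`
# leading zero bits. Identity string and difficulty are invented.

def solve_pow(identity: str, difficulty_bits: int = 16) -> int:
    target = 2 ** (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{identity}:{nonce}".encode()).hexdigest()
        if int(digest, 16) < target:
            return nonce
        nonce += 1

nonce = solve_pow("sybil-peer-0001")
print(f"admission nonce found: {nonce}")
# Expected work per identity scales as 2**difficulty_bits hash trials, so
# flooding the network with thousands of Sybil identities becomes expensive.
```

Verification, by contrast, is a single hash, which is what makes such schemes attractive as an admission control for peer-to-peer overlays.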
Abstract:
The reliability of Critical Infrastructure is considered to be a fundamental expectation of modern societies. These large-scale socio-technical systems have always, due to their complex nature, been faced with threats challenging their ongoing functioning. However, increasing uncertainty, in addition to the trend of infrastructure fragmentation, has made reliable service provision not only a key organisational goal but a major continuity challenge, especially given the highly interdependent network conditions that exist both regionally and globally. The notion of resilience as an adaptive capacity supporting infrastructure reliability under conditions of uncertainty and change has emerged as a critical capacity for systems of infrastructure and the organisations responsible for their reliable management. This study explores infrastructure reliability through the lens of resilience from an organisation and system perspective, using two recognised resilience-enhancing management practices, High Reliability Theory (HRT) and Business Continuity Management (BCM), to better understand how this phenomenon manifests within a partially fragmented (corporatised) critical infrastructure industry - the Queensland Electricity Industry. The methodological approach involved a single case study design (industry) with embedded sub-units of analysis (organisations), utilising in-depth interviews and document analysis to elicit findings. Derived from detailed assessment of BCM and reliability-enhancing characteristics, findings suggest that the industry as a whole exhibits resilient functioning; however, this was found to manifest at different levels across the industry and in different combinations. Whilst there were distinct differences in resilient capabilities at the organisational level, differences were less marked at a systems (industry) level, with many common understandings carried over from the pre-corporatised operating environment.
These Heritage Factors were central to understanding the systems level cohesion noted in the work. The findings of this study are intended to contribute to a body of knowledge encompassing resilience and high reliability in critical infrastructure industries. The research also has value from a practical perspective, as it suggests a range of opportunities to enhance resilient functioning under increasingly interdependent, networked conditions.
Abstract:
An Asset Management (AM) life cycle constitutes a set of processes that align with the development, operation and maintenance of assets in order to meet the requirements and objectives of the stakeholders of the business. The scope of AM is often broad within an organisation due to the interactions between its internal elements, such as human resources, finance, technology, engineering operation, information technology and management, as well as external elements such as governance and environment. Due to the complexity of AM processes, it has been proposed that, in order to optimise asset management activities, process modelling initiatives should be adopted. Although organisations adopt AM principles and carry out AM initiatives, most do not document or model their AM processes, let alone enact those processes (semi-)automatically using a computer-supported system. There is currently a lack of knowledge describing how to model AM processes in a methodical and suitable manner so that the processes are streamlined and optimised and are ready for deployment in a computerised way. This research aims to overcome this deficiency by developing an approach that will aid organisations in constructing AM process models quickly and systematically whilst using the most appropriate techniques, such as workflow technology. Currently, there is a wealth of information within the individual domains of AM and workflow. Both fields are gaining significant popularity in many industries, thus fuelling the need for research exploring the possible benefits of their cross-disciplinary application. This research is thus inspired to investigate these two domains and exploit the application of workflow to the modelling and execution of AM processes. Specifically, it will investigate appropriate methodologies for applying workflow techniques to AM frameworks.
One of the benefits of applying workflow models to AM processes is the ability to accommodate both ad hoc and evolutionary changes over time. In addition, this can automate an AM process as well as support the coordination and collaboration of the people involved in carrying out the process. A workflow management system (WFMS) can be used to support the design and enactment (i.e. execution) of processes and to cope with changes that occur to a process during its enactment. So far, little literature can be found documenting a systematic approach to modelling the characteristics of AM processes. In order to obtain a workflow model for AM processes, commonalities and differences between different AM processes need to be identified; this is the fundamental step in developing a conscientious workflow model for AM processes. Therefore, the first stage of this research focuses on identifying the characteristics of AM processes, especially AM decision-making processes. The second stage is to review a number of contemporary workflow techniques and choose a suitable technique for application to AM decision-making processes. The third stage is to develop an intermediate, ameliorated AM decision process definition that improves the current process description and is ready for modelling using the workflow language selected in the previous stage. All these lead to the fourth stage, in which a workflow model for an AM decision-making process is developed. The process model is then deployed (semi-)automatically in a state-of-the-art WFMS, demonstrating the benefits of applying workflow technology to the domain of AM. Given that the information in the AM decision-making process is captured at an abstract level within the scope of this work, the deployed process model can be used as an executable guideline for carrying out an AM decision process in practice.
Moreover, it can be used as a vanilla system that, once incorporated with rich information from a specific AM decision-making process (e.g. building construction or power plant maintenance), is able to support the automation of such a process in a more elaborate way.
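The core idea of enacting an AM decision process through a workflow engine can be illustrated with a minimal sketch. The task names, the condition threshold and the dictionary-based asset record below are hypothetical, introduced only for illustration; they are not taken from the thesis or from any particular WFMS.

```python
# Minimal illustrative sketch: an asset-management decision process
# modelled as an ordered list of tasks enacted in sequence, loosely
# mirroring how a WFMS executes a deployed process model.
# All task names and thresholds are hypothetical.

def assess_condition(asset):
    # Stage 1: capture the asset's current condition score (0..1),
    # defaulting to 0.5 when no inspection data is present.
    asset["condition"] = asset.get("condition", 0.5)
    return asset

def decide_action(asset):
    # Stage 2: the decision point — branch on condition score.
    asset["action"] = "repair" if asset["condition"] < 0.4 else "maintain"
    return asset

def schedule_work(asset):
    # Stage 3: enact the chosen action (here, simply record it).
    asset["scheduled"] = True
    return asset

# The workflow model: an ordered sequence of tasks.
WORKFLOW = [assess_condition, decide_action, schedule_work]

def enact(workflow, asset):
    # Enactment: pass the case (asset record) through each task in turn.
    for task in workflow:
        asset = task(asset)
    return asset

result = enact(WORKFLOW, {"name": "transformer-07", "condition": 0.3})
```

In a real WFMS the control flow, branching conditions and participant roles would be captured declaratively in a workflow language rather than hard-coded, but the sketch shows the separation between the process definition (`WORKFLOW`) and its enactment.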
Abstract:
This thesis examines advanced North American environmental mitigation schemes for their applicability to Queensland. Compensatory wetland mitigation banking, in particular, is concerned with in-perpetuity management and protection - the basic concerns of the Queensland public about its unique environment. The process has actively engaged the North American market and become a thriving industry that (for the most part) effectively designs, creates and builds (or enhances) environmental habitat. A methodology was designed to undertake a comprehensive review of the history, evolution and concepts of the North American wetland mitigation banking system, before and after the implementation of a significant new compensatory wetland mitigation banking regulation in 2008. The Delphi technique was then used to determine the principles and working components of wetland mitigation banking. The results were then applied to formulate a questionnaire reviewing Australian market-based instruments (including offsetting policies) against these North American principles. Following this, two case studies established guiding principles for implementation based on two components of the North American wetland mitigation banking program. The outcomes confirmed that environmental banking is a workable concept in North America and that it is worth applying in Queensland. The majority of offsetting policies in Australia have adopted some principles of the North American mitigation programs. Examination reveals, however, that they fail to provide adequate incentives for private landowners to participate because the essential trading mechanisms are not employed. Much can thus be learnt from the North American situation, where private enterprise has devised appropriate free-market concepts. The consequent environmental banking process (as adapted from the North American programs) should be implemented in Queensland, focusing on engaging the private sector, where the majority of naturally productive lands are managed.
Abstract:
This work seeks to fill some of the gap in the economics and behavioural economics literature pertaining to the decision-making processes of individuals in extreme environmental situations (life-and-death events). These essays specifically examine the sinkings of the R.M.S. Titanic, on 14 April 1912, and the R.M.S. Lusitania, on 7 May 1915, using econometric (multivariate) analysis techniques. The results show that even under extreme life-and-death conditions, social norms matter and are reflected in the survival probabilities of individuals on board the Titanic. However, the comparative analysis of the Titanic and Lusitania shows that social norms take time to organise and become effective. In the presence of such time constraints, the traditional "homo economicus" model of individual behaviour becomes evident as a survival-of-the-fittest competition.
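The kind of multivariate analysis described, modelling an individual's survival probability as a function of personal attributes, can be sketched as a logistic regression. The toy records below are fabricated purely for illustration (the thesis uses the actual passenger manifests), and the hand-rolled gradient-ascent fit stands in for a standard econometrics package:

```python
import math

# Illustrative only: a logistic regression of survival (1 = survived)
# on two fabricated covariates, female and first_class, echoing the
# multivariate survival-probability analysis described in the abstract.
data = [
    # (female, first_class, survived) — fabricated toy sample
    (1, 1, 1), (1, 1, 1), (1, 0, 1), (1, 0, 0),
    (0, 1, 1), (0, 1, 0), (0, 0, 0), (0, 0, 0),
]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Fit coefficients (intercept, female, first_class) by gradient
# ascent on the log-likelihood of the logit model.
w = [0.0, 0.0, 0.0]
for _ in range(800):
    grad = [0.0, 0.0, 0.0]
    for female, first, survived in data:
        x = (1, female, first)
        p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)))
        for i in range(3):
            grad[i] += (survived - p) * x[i]
    w = [wi + 0.1 * gi for wi, gi in zip(w, grad)]

# In this toy sample, positive fitted coefficients on female and
# first_class indicate higher survival odds for women and for
# first-class passengers, the sort of social-norm effect the
# thesis tests on the real data.
female_effect, class_effect = w[1], w[2]
```

A production analysis would instead use a maximum-likelihood routine from an econometrics library and report standard errors, but the sketch shows how covariates map to survival probabilities through the logit link.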