923 results for Real Options Theory


Relevance:

20.00%

Publisher:

Abstract:

Traditional approaches to teaching criminal law in Australian law schools rely on lectures that focus on the transmission of abstracted and decontextualised knowledge, with breadth of content often prioritised at the expense of depth. This paper discusses The Sapphire Vortex, a blended learning environment that combines a suite of online modules using Second Life machinima to depict a narrative involving a series of criminal offences and the ensuing courtroom proceedings, expert commentary by practising lawyers, and class discussions.

Relevance:

20.00%

Publisher:

Abstract:

Evaluation of the Get REAL programme in an inclusive primary school setting has indicated its effectiveness in promoting pro-social behaviour for children with high-functioning autism. However, two children with co-morbid diagnoses and complex personal circumstances showed less consistent improvements. To explain their unique trajectories, which could not readily be derived from quantitative studies, an exploratory case study approach was used to examine contextual influences on patterns of progress. Multiple data sources included coded video footage from the Get REAL programme, school conduct reports, and parent and classroom teacher reports using the Strengths and Difficulties Questionnaire. While the results support the efficacy of the Get REAL programme for the two children, they also highlight the value of co-ordinated strategies and collaborative, individualised approaches in more complex cases. This paper outlines the Get REAL intervention and a range of other school and support agency strategies that influenced progress.

Relevance:

20.00%

Publisher:

Abstract:

Australian queer (GLBTIQ) university student activist media is an important site of self-representation. Community media is a significant site for the development of queer identity and community, and a key part of queer politics. This paper reviews my research into queer student media, which is grounded in a queer theoretical perspective. Rob Cover argues that queer theoretical approaches that study media products fail to consider the material contexts that contribute to their construction. I use an ethnographic approach to examine how editors construct queer identity and community in queer student media. My research contributes to queer media scholarship by addressing the gap that Cover identifies, and to the rich scholarship on negotiations of queer community.

Relevance:

20.00%

Publisher:

Abstract:

A clear understanding of the cognitive-emotional processes underpinning desires to overconsume foods and adopt sedentary lifestyles can inform the development of more effective interventions to promote healthy eating and physical activity. The Elaborated Intrusion Theory of Desire offers a framework that can help in this endeavour through its emphasis on the roles of intrusive thoughts and the elaboration of multisensory imagery. There is now substantial evidence that tasks competing with food-related imagery for limited working memory resources can reduce desires to eat that food, and that positive imagery can promote functional behaviour. Mindfulness meditation can also short-circuit the elaboration of dysfunctional cognition. Functional Decision Making is an approach that applies laboratory-based research on desire to provide a motivational intervention that establishes and entrenches behaviour changes, so that healthy eating and physical activity become everyday habits.

Relevance:

20.00%

Publisher:

Abstract:

The aim of this paper is to implement a Game-Theory-based offline mission path planner for aerial inspection of large linear infrastructure. Like most real-world optimisation problems, mission path planning involves a number of objectives that ideally should be minimised simultaneously. The goal of this work is therefore to develop a Multi-Objective (MO) optimisation tool able to provide a set of optimal solutions for the inspection task, given the environment data, the mission requirements, and the definition of the objectives to minimise. Results indicate the robustness of the method and its capability to find the trade-offs among the Pareto-optimal solutions.
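
The abstract does not show any of the underlying machinery, but the core of an MO planner of this kind is a non-dominance test. The following Python sketch filters a set of candidate mission plans down to their Pareto-optimal subset; the two objectives (path length and an aggregate risk score) and all values are illustrative assumptions, not those defined in the paper.

```python
# Minimal sketch of Pareto filtering for a multi-objective path planner.
# The objectives (length, risk) are illustrative assumptions.
from typing import List, Tuple

Objectives = Tuple[float, float]  # (path_length, inspection_risk)

def dominates(a: Objectives, b: Objectives) -> bool:
    """True if a is no worse than b in every objective and strictly
    better in at least one (minimisation)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(candidates: List[Objectives]) -> List[Objectives]:
    """Return the non-dominated subset of the candidate solutions."""
    return [c for c in candidates
            if not any(dominates(other, c) for other in candidates if other != c)]

if __name__ == "__main__":
    # Each tuple is (path length in km, aggregate risk score) for one plan.
    plans = [(12.0, 0.8), (10.5, 1.2), (15.0, 0.3), (11.0, 0.9), (10.5, 1.5)]
    print(pareto_front(plans))  # the trade-off set presented to the operator
```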

Relevance:

20.00%

Publisher:

Abstract:

What are the information practices of teen content creators? In the United States, over two thirds of teens have participated in creating and sharing content in online communities that are developed for the purpose of allowing users to be producers of content. This study investigates how teens participating in digital participatory communities find and use information, as well as how they experience that information. From this investigation emerged a model of their information practices while creating and sharing content such as film-making, visual artwork, storytelling, music, programming, and website design in digital participatory communities. The research uses grounded theory methodology in a social constructionist framework to investigate the research problem: what are the information practices of teen content creators? Data were gathered through semi-structured interviews and observation of teens' digital communities. Analysis occurred concurrently with data collection, and the principle of constant comparison was applied in analysis. As findings were constructed from the data, additional data were collected until a substantive theory was constructed and no new information emerged from data collection. The theory constructed from the data describes five information practices of teen content creators: learning community, negotiating aesthetic, negotiating control, negotiating capacity, and representing knowledge. Describing the five information practices requires three descriptive components: the community of practice, the experiences of information, and the information actions. The experiences of information include information as participation, inspiration, collaboration, process, and artifact. The information actions include activities in the categories of gathering, thinking, and creating. The experiences of information and the information actions intersect in the information practices, which are situated within a specific community of practice, such as a digital participatory community. Finally, the information practices interact and build upon one another, which is represented in a graphic model and accompanying explanation.

Relevance:

20.00%

Publisher:

Abstract:

A numerical study using large eddy simulation is carried out to investigate the heat and toxic gases released by fires in real road tunnels. Tunnel fire disasters over the previous decade have drawn increasing research attention to safe and reliable ventilation designs. In this research, a real tunnel with a 10 MW fire (approximately the heat release rate of a burning bus) at its mid-point is simulated using FDS (Fire Dynamics Simulator) for different ventilation velocities. Carbon monoxide concentration and temperature vertical profiles are presented for various locations to explore the flow field. It is found that increasing the longitudinal ventilation velocity reduces the vertical profile gradients of both CO concentration and smoke temperature. However, a relatively large longitudinal ventilation velocity leads to a high similarity between the vertical profile of CO volume concentration and that of the temperature rise.
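
FDS writes device (sensor) output as CSV files, and a study like this typically post-processes those files into vertical profiles. The following Python sketch shows one plausible way to do that with pandas; the file names, the device-naming scheme (sensor height encoded in the device ID), and the 60 s averaging window are all illustrative assumptions, not the study's actual setup.

```python
# Sketch: post-processing FDS device output into vertical profiles.
# Assumes one *_devc.csv per ventilation-velocity case, with device IDs
# such as "TC_z1.0" ... "CO_z4.5" (height in metres after "_z"); these
# IDs and file names are illustrative assumptions.
import pandas as pd

def vertical_profile(devc_csv: str, prefix: str) -> pd.Series:
    """Time-average the last 60 s of every device whose ID starts with
    `prefix`, indexed by sensor height parsed from the device ID."""
    df = pd.read_csv(devc_csv, skiprows=1)  # row 0 of an FDS devc file holds units
    steady = df[df["Time"] >= df["Time"].max() - 60.0]
    cols = [c for c in df.columns if c.startswith(prefix)]
    heights = [float(c.split("_z")[1]) for c in cols]
    return pd.Series(steady[cols].mean().values, index=heights).sort_index()

for case in ["tunnel_v1.5_devc.csv", "tunnel_v2.5_devc.csv"]:
    temp = vertical_profile(case, "TC_z")  # temperature profile
    co = vertical_profile(case, "CO_z")    # CO volume-fraction profile
    print(case, "temperature span over height:", temp.max() - temp.min())
```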

Relevance:

20.00%

Publisher:

Abstract:

An advanced rule-based Transit Signal Priority (TSP) control method is presented in this paper. An online transit travel time prediction model is the key component of the proposed method, enabling the selection of the most appropriate TSP plan for the prevailing traffic and transit conditions. The new method also adopts a priority plan re-development feature that allows modifying, or even switching, an already implemented priority plan to accommodate changes in traffic conditions. The proposed method utilizes the conventional green extension and red truncation strategies as well as two new strategies: green truncation and queue clearance. The new method is evaluated in microsimulation against a typical active TSP strategy and a base-case scenario with no TSP control. The evaluation results indicate that the proposed method can significantly reduce bus delay and improve service regularity, with negligible adverse impacts on non-transit street traffic.
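
The published selection logic is not reproduced in the abstract, but a rule-based TSP controller of this kind maps a predicted bus arrival time and the current signal state to one of the four strategies. The Python sketch below illustrates such a mapping; the rules, thresholds, and field names are assumptions for illustration, not the paper's logic.

```python
# Sketch of rule-based TSP plan selection driven by a predicted bus
# arrival time. Strategy names follow the paper; the selection rules
# and thresholds are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class SignalState:
    phase: str              # "green" or "red" on the bus approach
    time_to_change: float   # seconds until the current phase ends
    queue_clear_time: float # est. seconds to discharge the standing queue

def select_tsp_plan(predicted_arrival: float, s: SignalState,
                    max_extension: float = 10.0) -> str:
    if s.phase == "green":
        gap = predicted_arrival - s.time_to_change
        if gap <= 0:
            return "no action"        # bus arrives within the current green
        if gap <= max_extension:
            return "green extension"  # hold the green a little longer
        return "green truncation"     # end the green early, recall it sooner
    # red phase on the bus approach
    if predicted_arrival <= s.queue_clear_time:
        return "queue clearance"      # start green early to flush the queue
    if predicted_arrival < s.time_to_change:
        return "red truncation"       # shorten the red
    return "no action"

print(select_tsp_plan(8.0, SignalState("green", 5.0, 12.0)))  # -> green extension
```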

Relevance:

20.00%

Publisher:

Abstract:

This paper presents an efficient face detection method suitable for real-time surveillance applications. Improved efficiency is achieved by constraining the search window of an AdaBoost face detector to pre-selected regions. First, the proposed method takes a sparse grid of sample pixels from the image to reduce whole-image scan time. A fusion of foreground segmentation and skin-colour segmentation is then used to select candidate face regions. Finally, a classifier-based face detector is applied only to the selected regions to verify the presence of a face (the Viola-Jones detector is used in this paper). The proposed system is evaluated using 640 x 480 pixel test images and compared with other relevant methods. Experimental results show that the proposed method reduces the detection time to 42 ms, whereas the Viola-Jones detector alone requires 565 ms (on a desktop processor). This improvement makes the face detector suitable for real-time applications. Furthermore, the proposed method requires 50% of the computation time of the best competing method, while reducing the false positive rate by 3.2% and maintaining the same hit rate.
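
A minimal version of this staged pipeline can be assembled from standard OpenCV components: a background subtractor and a skin-colour mask to pre-select candidate regions, then a Haar cascade (the stock Viola-Jones frontal-face model) applied only inside them. The sketch below follows that structure; the HSV skin range, grid stride, and minimum region size are assumptions, not the paper's tuned values.

```python
# Sketch of the staged pipeline: motion/skin pre-selection followed by
# a Viola-Jones cascade restricted to candidate regions. Thresholds and
# the HSV skin range are illustrative assumptions.
import cv2
import numpy as np

bg = cv2.createBackgroundSubtractorMOG2()
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def detect_faces(frame: np.ndarray, stride: int = 4):
    small = np.ascontiguousarray(frame[::stride, ::stride])  # sparse pixel grid
    fg = bg.apply(small) > 0                                 # foreground mask
    hsv = cv2.cvtColor(small, cv2.COLOR_BGR2HSV)
    skin = cv2.inRange(hsv, (0, 40, 60), (25, 180, 255)) > 0  # assumed skin range
    mask = cv2.resize((fg & skin).astype(np.uint8) * 255,
                      (frame.shape[1], frame.shape[0]))
    faces = []
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        x, y, w, h = cv2.boundingRect(c)
        if w * h < 400:
            continue  # region too small to contain a face
        roi = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
        # Run the cascade only inside the candidate region:
        for (fx, fy, fw, fh) in cascade.detectMultiScale(roi, 1.1, 3):
            faces.append((x + fx, y + fy, fw, fh))
    return faces
```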

Relevance:

20.00%

Publisher:

Abstract:

Grounded theory, first developed by Glaser and Strauss in the 1960s, was introduced into nursing education as a distinct research methodology in the 1970s. The methodology is grounded in a critique of the then-dominant approach to social inquiry, which imposed "enduring" theoretical propositions onto study data. Rather than starting from a set theoretical framework, grounded theory relies on researchers distinguishing meaningful constructs from generated data and then identifying an appropriate theory. Grounded theory is thus particularly useful for investigating complex issues and behaviours not previously addressed, and for concepts and relationships in particular populations or places that are still undeveloped or weakly connected. Grounded theory data analysis processes include open, axial and selective levels of coding. The purpose of this article was to explore the grounded theory research process and provide an initial understanding of this methodology.

Relevance:

20.00%

Publisher:

Abstract:

Many of the more extreme bushfire-prone landscapes in Australia are located in colder climate regions. For such sites, the National Construction Code requires that houses both satisfy the Australian Standard for bushfire (AS 3959:2009) and achieve a 6-star energy rating. Combined, these requirements present a considerable challenge to the construction of affordable housing, a problem often exacerbated by the complex topography of bushfire-prone landscapes. Dr Weir presents a series of case studies from his architectural practice which highlight the need for further design-led research into affordable housing: a ground-up, holistic approach to design that reconciles energy performance, human behaviour, bushland conservation and bushfire safety.

Relevance:

20.00%

Publisher:

Abstract:

Deploying networked control systems (NCSs) over wireless networks is becoming more and more popular. However, the widely used transport layer protocols, Transmission Control Protocol (TCP) and User Datagram Protocol (UDP), are not designed for real-time applications, so they may be unsuitable for many NCS scenarios because of their limitations in reliability and/or delay performance, both critical concerns for real-time control systems. Considering a typical type of NCS with periodic and sporadic real-time traffic, this paper proposes a highly reliable transport layer protocol featuring a packet loss-sensitive retransmission mechanism and a prioritized transmission mechanism. The packet loss-sensitive retransmission mechanism is designed to improve the reliability of all traffic flows, while the prioritized transmission mechanism offers differentiated services for periodic and sporadic flows. Simulation results show that the proposed protocol has better reliability than UDP and better delay performance than TCP over wireless networks, particularly when channel errors and congestion occur.
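
The abstract names the two mechanisms but not their realisation. As a rough illustration, the Python sketch below combines a strict-priority send queue (sporadic before periodic traffic) with a per-flow retransmission budget, so that loss-sensitive packets are re-queued while stale periodic samples are not; the flow classes, budgets, and API are assumptions, not the proposed protocol itself.

```python
# Sketch of the two mechanisms described: a loss-sensitive retransmission
# budget plus strict priority between sporadic (e.g. alarm) and periodic
# (sensor sample) flows. Names and budget values are assumptions.
import heapq
import itertools

PRIORITY = {"sporadic": 0, "periodic": 1}     # lower value = sent first
RETX_BUDGET = {"sporadic": 5, "periodic": 1}  # retransmissions allowed

class NcsSendQueue:
    def __init__(self):
        self._heap, self._seq = [], itertools.count()

    def push(self, flow: str, payload: bytes, retx: int = 0):
        heapq.heappush(self._heap,
                       (PRIORITY[flow], next(self._seq), flow, payload, retx))

    def pop(self):
        _, _, flow, payload, retx = heapq.heappop(self._heap)
        return flow, payload, retx

    def on_loss(self, flow: str, payload: bytes, retx: int):
        """Re-queue a lost packet only while its budget allows; a stale
        periodic sample is better replaced by the next one than resent."""
        if retx < RETX_BUDGET[flow]:
            self.push(flow, payload, retx + 1)

q = NcsSendQueue()
q.push("periodic", b"sensor#42")
q.push("sporadic", b"ALARM")   # jumps ahead of the queued periodic sample
print(q.pop()[0])              # -> sporadic
```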

Relevance:

20.00%

Publisher:

Abstract:

The ability to estimate asset reliability and the probability of failure is critical to reducing maintenance costs, operation downtime, and safety hazards. Predicting the survival time and the probability of failure at a future time is an indispensable requirement in prognostics and asset health management. In traditional reliability models, the lifetime of an asset is estimated using failure event data alone; however, statistically sufficient failure event data are often difficult to attain in real-life situations due to poor data management, effective preventive maintenance, and the small populations of identical assets in use. Condition indicators and operating environment indicators are two types of covariate data that are normally obtained in addition to failure event and suspended data, and they contain significant information about the state and health of an asset. Condition indicators reflect the level of degradation of an asset, while operating environment indicators accelerate or decelerate its lifetime. When these data are available, an alternative to traditional reliability analysis is to model the condition indicators, the operating environment indicators, and their failure-generating mechanisms using a covariate-based hazard model.

The literature review indicates that a number of covariate-based hazard models have been developed, all based on the principle of the Proportional Hazard Model (PHM). However, most of these models have not attracted much attention in the field of machinery prognostics, and the prominence of PHM has to some extent stifled attempts at developing alternatives, although a number of alternative models have been suggested. The existing covariate-based hazard models fail to fully utilise the three types of asset health information (failure event data, i.e. observed and/or suspended; condition data; and operating environment data) in a single model so as to produce more effective hazard and reliability predictions. In addition, current research shows that condition indicators and operating environment indicators have different characteristics and are non-homogeneous covariate data: condition indicators act as response (dependent) variables, whereas operating environment indicators act as explanatory (independent) variables. Nevertheless, these non-homogeneous covariate data are modelled in the same way in the existing covariate-based hazard models. The related and yet more imperative question is how both kinds of indicator should be effectively modelled and integrated into a covariate-based hazard model.

This work presents a new approach to these challenges. The new covariate-based hazard model, termed the Explicit Hazard Model (EHM), explicitly and effectively incorporates all three types of asset health information into hazard and reliability prediction, and also captures the relationship between actual asset health and both condition measurements and operating environment measurements. The theoretical development of the model and its parameter estimation method are demonstrated in this work. EHM assumes that the baseline hazard is a function of both time and the condition indicators. Condition indicators provide information about the health condition of an asset; they therefore update and reform the baseline hazard of EHM according to the health state of the asset at a given time t. Examples of condition indicators include the vibration of rotating machinery, the level of metal particles in engine oil analysis, and wear in a component, to name but a few. Operating environment indicators in this model are failure accelerators and/or decelerators: they enter the covariate function of EHM and may increase or decrease the hazard relative to the baseline. These indicators are caused by the environment in which an asset operates and are not explicitly captured by the condition indicators (e.g. loads, environmental stresses, and other dynamically changing environmental factors). While the effects of the operating environment indicators may be nil in EHM, the condition indicators are always present, because they are observed and measured for as long as an asset remains operational.

EHM has several advantages over the existing covariate-based hazard models. First, it utilises three different sources of asset health data (population characteristics, condition indicators, and operating environment indicators) to predict hazard and reliability effectively. Second, it explicitly investigates the relationship between the condition and operating environment indicators associated with the hazard of an asset. Furthermore, the proportionality assumption, from which most covariate-based hazard models suffer, is not required in EHM.

Depending on the sample size of failure/suspension times, EHM takes two forms: semi-parametric and non-parametric. The semi-parametric EHM assumes a specified lifetime distribution (the Weibull distribution) for the baseline hazard. In many industrial applications, however, failure event data are sparse and their analysis involves complex distributional shapes about which little is known. To avoid the restrictive assumption of a specified lifetime distribution for failure event histories, the non-parametric EHM, a distribution-free model, has been developed. The development of EHM in these two forms is another merit of the model.

A case study using laboratory experiment data was conducted to validate the practicality of both the semi-parametric and non-parametric EHMs. The performance of the newly developed models is appraised by comparing their estimates with those of the other existing covariate-based hazard models; the comparison demonstrates that both forms of EHM outperform the existing models. Future research directions are also identified, including a new parameter estimation method for time-dependent covariate effects and missing data, the application of EHM to both repairable and non-repairable systems using field data, and a decision support model linked to the estimated reliability results.
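
The functional form is only described qualitatively above. Purely as an illustration of the stated structure (a Weibull-type baseline driven by both time and a condition indicator, scaled by an exponential term in the operating environment indicators), the following Python sketch evaluates such a hazard and the corresponding reliability. All parameter values and the exact way the condition indicator enters the baseline are assumptions, not the thesis' fitted model.

```python
# Illustrative sketch of a semi-parametric EHM-style hazard: a Weibull
# baseline that depends on time and a condition indicator, multiplied by
# an exponential term in the environment indicators. Forms and values
# are assumptions for illustration only.
import math

def ehm_hazard(t, condition, environment,
               beta=2.0, eta=1000.0, alpha=0.5, gamma=(0.3,)):
    """h(t | z) = h0(t, condition) * exp(gamma . environment)."""
    # The condition indicator (e.g. a vibration level) reforms the baseline:
    baseline = (beta / eta) * (t / eta) ** (beta - 1) * math.exp(alpha * condition)
    # Environment indicators accelerate (+) or decelerate (-) the hazard:
    return baseline * math.exp(sum(g * z for g, z in zip(gamma, environment)))

def reliability(t, condition, environment, steps=1000):
    """R(t) = exp(-cumulative hazard), trapezoidal rule, covariates held fixed."""
    dt = t / steps
    h = [ehm_hazard(i * dt, condition, environment) for i in range(steps + 1)]
    return math.exp(-sum((h[i] + h[i + 1]) / 2.0 * dt for i in range(steps)))

# The hazard grows with operating hours and with observed degradation:
print(reliability(500.0, condition=0.2, environment=(1.0,)))
```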

Relevance:

20.00%

Publisher:

Abstract:

Power system restoration after a large-area outage involves many factors, and the procedure is usually very complicated. A decision-making support system can therefore be developed to find the optimal black-start strategy. To evaluate candidate black-start strategies, a number of indices, usually both qualitative and quantitative, are employed. However, it may not be possible to synthesize these indices directly, and the indices may interact with one another to different extents. In the existing black-start decision-making methods, qualitative and quantitative indices cannot be well synthesized, and the interactions among indices are not taken into account. The vague set, an extended version of the well-developed fuzzy set, can be employed to deal with decision-making problems with interacting attributes. Given this background, the vague set is first employed in this work to represent the indices so as to facilitate comparisons among them. A concept of the vague-valued fuzzy measure is then presented, and on that basis a mathematical model for black-start decision-making is developed. Compared with the existing methods, the proposed method can deal with the interactions among indices and represent the fuzzy information more reasonably. Finally, an actual power system is used to demonstrate the basic features of the developed model and method.
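
As a concrete illustration of the representation involved, the Python sketch below encodes an index value as a vague value (truth membership t, false membership f, with t + f <= 1, so the support is the interval [t, 1 - f]) and ranks candidate strategies by a weighted sum of the common score function t - f. The weighted sum is a simple stand-in for the paper's vague-valued fuzzy measure, which additionally captures interactions among indices; all index names, weights, and values are assumptions.

```python
# Sketch of vague-set scoring for black-start strategy ranking. The
# additive aggregation is a simplification of the paper's fuzzy-measure
# approach; index names, weights, and values are assumptions.
from dataclasses import dataclass

@dataclass(frozen=True)
class VagueValue:
    t: float  # evidence for
    f: float  # evidence against (t + f <= 1; 1 - t - f is hesitancy)

    def __post_init__(self):
        assert 0.0 <= self.t and 0.0 <= self.f and self.t + self.f <= 1.0

    def interval(self):
        return (self.t, 1.0 - self.f)

    def score(self) -> float:
        return self.t - self.f

def rank_strategies(strategies: dict, weights: dict) -> list:
    """Rank strategies by the weighted sum of per-index scores."""
    value = lambda idx: sum(weights[k] * v.score() for k, v in idx.items())
    return sorted(strategies, key=lambda s: value(strategies[s]), reverse=True)

plans = {
    "restart unit A": {"speed": VagueValue(0.7, 0.1), "risk": VagueValue(0.5, 0.3)},
    "restart unit B": {"speed": VagueValue(0.6, 0.2), "risk": VagueValue(0.8, 0.1)},
}
print(rank_strategies(plans, {"speed": 0.5, "risk": 0.5}))
```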

Relevance:

20.00%

Publisher:

Abstract:

Chatrooms, for example Internet Relay Chat, are generally multi-user, multi-channel and multi-server chat systems which run over the Internet and provide a protocol for real-time, text-based conferencing between users all over the world. While a well-trained human observer is able to understand who is chatting with whom, there are no efficient and accurate automated tools to determine the groups of users conversing with each other. A precursor to analysing evolving cyber-social phenomena is to first determine what the conversations are and which groups of chatters are involved in each; we consider this problem in this paper. We propose an algorithm to discover all groups of users that are engaged in conversation. The algorithm is based on a statistical model of a chatroom founded on our experience with real chatrooms, and it requires no semantic analysis of the conversations: it rests purely on the statistical information contained in the sequence of posts. We improve the accuracy by applying graph algorithms to clean the statistical information. We present experimental results indicating that the conversing groups in a chatroom can be determined automatically, purely on the basis of statistical analysis.
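
To make the purely statistical idea concrete, the Python sketch below accumulates pairwise evidence whenever two users post within a small window of each other in the message sequence, then forms conversation groups as connected components over the sufficiently strong pairs. The window size and evidence threshold are illustrative assumptions; the paper's actual statistical model and graph-cleaning algorithms are more elaborate.

```python
# Sketch: conversation-group discovery from post order alone. Users
# whose posts repeatedly occur close together accumulate pairwise
# weight; thresholded connected components become groups. Window size
# and threshold are illustrative assumptions.
from collections import defaultdict

def conversation_groups(posts, window=5, threshold=3):
    """posts: chronological list of (author, text); text is unused --
    no semantic analysis, only the sequence of posts."""
    weight = defaultdict(int)
    authors = [a for a, _ in posts]
    for i, a in enumerate(authors):
        for b in authors[i + 1 : i + 1 + window]:  # posts soon after a's
            if b != a:
                weight[frozenset((a, b))] += 1
    # Keep only edges with enough evidence, then take connected components.
    adj = defaultdict(set)
    for pair, w in weight.items():
        if w >= threshold:
            u, v = pair
            adj[u].add(v)
            adj[v].add(u)
    groups, seen = [], set()
    for user in list(adj):
        if user in seen:
            continue
        stack, comp = [user], set()
        while stack:
            u = stack.pop()
            if u not in comp:
                comp.add(u)
                stack.extend(adj[u] - comp)
        seen |= comp
        groups.append(comp)
    return groups
```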