973 results for probability models
Abstract:
This thesis concerns the mathematical modelling of moving fluid interfaces in a Hele-Shaw cell: an experimental device in which fluid flow is studied by sandwiching the fluid between two closely separated plates. Analytic and numerical methods are developed to gain new insights into interfacial stability and bubble evolution, and the influence of different boundary effects is examined. In particular, the properties of the velocity-dependent kinetic undercooling boundary condition are analysed, with regard to the selection of a discrete set of possible shapes of travelling fingers of fluid, the formation of corners on the interface, and the interaction of kinetic undercooling with the better-known effect of surface tension. Explicit solutions to the problem of an expanding or contracting ring of fluid are also developed.
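For orientation, the following is a minimal sketch of how such boundary conditions are commonly written for a Hele-Shaw interface; the exact scaling and sign conventions used in the thesis are not stated in this abstract, so the formulation below is an assumed standard one.

```latex
% Assumed standard (dimensionless) Hele-Shaw formulation, not quoted from the thesis.
\begin{align*}
  \nabla^2 p &= 0
    && \text{in the fluid region } \Omega(t) \text{ (Darcy flow, } \mathbf{v} = -\nabla p\text{)}, \\
  v_n &= -\frac{\partial p}{\partial n}
    && \text{on } \partial\Omega(t) \text{ (kinematic condition)}, \\
  p &= \gamma\,\kappa + c\,v_n
    && \text{on } \partial\Omega(t) \text{ (surface tension } \gamma\kappa
       \text{ plus velocity-dependent kinetic undercooling } c\,v_n\text{)}.
\end{align*}
```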
Abstract:
This paper evaluates the efficiency of a number of popular corpus-based distributional models in performing literature-based discovery on very large document sets, including online collections. Literature-based discovery is the process of identifying previously unknown connections from text, often published literature, that could lead to the development of new techniques or technologies. It has attracted growing research interest ever since Swanson's serendipitous discovery of the therapeutic effects of fish oil on Raynaud's disease in 1986. The successful application of distributional models in automating the identification of the indirect associations underpinning literature-based discovery has been widely demonstrated in the medical domain. However, we wish to investigate the computational complexity of distributional models for literature-based discovery on much larger document collections, as they may provide computationally tractable solutions to tasks such as predicting future disruptive innovations. In this paper we perform a computational complexity analysis of four successful corpus-based distributional models to evaluate their fitness for such tasks. Our results indicate that corpus-based distributional models that store their representations in fixed dimensions provide superior efficiency on literature-based discovery tasks.
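To illustrate why fixed-dimension representations are attractive at this scale, here is a minimal, hypothetical sketch in the spirit of random indexing: every term receives a context vector of constant dimension, so storage does not grow with the number of documents, and indirect (A-B-C) associations surface as vector similarity. The dimension, corpus and helper names are illustrative assumptions, not the four models analysed in the paper.

```python
import numpy as np

D = 512  # fixed representation dimension (illustrative assumption)

def index_vector(term, d=D, nonzeros=8):
    """Sparse ternary 'random index' vector for a term (hypothetical helper)."""
    v = np.zeros(d)
    local = np.random.default_rng(abs(hash(term)) % (2**32))
    positions = local.choice(d, size=nonzeros, replace=False)
    v[positions] = local.choice([-1.0, 1.0], size=nonzeros)
    return v

def build_context_vectors(documents):
    """Accumulate fixed-dimension context vectors; memory is O(vocabulary * D),
    independent of collection size, which is what keeps very large corpora tractable."""
    context = {}
    for doc in documents:
        terms = set(doc.lower().split())
        doc_signature = sum(index_vector(t) for t in terms)
        for t in terms:
            context.setdefault(t, np.zeros(D))
            context[t] += doc_signature - index_vector(t)  # exclude the term itself
    return context

def indirect_association(context, a, c):
    """Cosine similarity between terms that may never co-occur directly."""
    va, vc = context[a], context[c]
    return float(va @ vc / (np.linalg.norm(va) * np.linalg.norm(vc) + 1e-12))

# Toy usage: 'fish_oil' and 'raynaud' are linked only through 'blood_viscosity'.
docs = ["fish_oil reduces blood_viscosity",
        "raynaud symptoms worsen with high blood_viscosity"]
ctx = build_context_vectors(docs)
print(indirect_association(ctx, "fish_oil", "raynaud"))
```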
Abstract:
In biology, we frequently observe different species existing within the same environment. For example, there are many cell types in a tumour, or different animal species may occupy a given habitat. In modelling interactions between such species, we often make use of the mean field approximation, whereby spatial correlations between the locations of individuals are neglected. Whilst this approximation holds in certain situations, this is not always the case, and care must be taken to ensure the mean field approximation is only used in appropriate settings. In circumstances where the mean field approximation is unsuitable we need to include information on the spatial distributions of individuals, which is not a simple task. In this paper we provide a method that overcomes many of the failures of the mean field approximation for an on-lattice volume-excluding birth-death-movement process with multiple species. We explicitly take into account spatial information on the distribution of individuals by including partial differential equation descriptions of lattice site occupancy correlations. We demonstrate how to derive these equations for the multi-species case, and show results specific to a two-species problem. We compare averaged discrete results to both the mean field approximation and our improved method which incorporates spatial correlations. We note that the mean field approximation fails dramatically in some cases, predicting very different behaviour from that seen upon averaging multiple realisations of the discrete system. In contrast, our improved method provides excellent agreement with the averaged discrete behaviour in all cases, thus providing a more reliable modelling framework. Furthermore, our method is tractable as the resulting partial differential equations can be solved efficiently using standard numerical techniques.
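As a concrete (much simplified, single-species) illustration of where the mean field approximation breaks down, the sketch below averages stochastic realisations of a volume-excluding birth and movement process on a 1D lattice and compares them with the mean field logistic prediction, which ignores the clustering that develops when movement is slow. All rates and sizes are illustrative assumptions, not the two-species model of the paper.

```python
import numpy as np

L = 100          # lattice sites (1D, periodic; illustrative)
P_BIRTH = 0.01   # per-step probability of placing an offspring into a chosen neighbour
P_MOVE = 0.1     # per-step movement probability (slow movement => strong clustering)
STEPS = 400
REALISATIONS = 50

def sweep(lattice, rng):
    """One update sweep of a volume-excluding birth/movement process."""
    for i in rng.permutation(np.flatnonzero(lattice)):
        target = (i + rng.choice([-1, 1])) % L
        if lattice[target] == 0:                     # volume exclusion: target must be empty
            if rng.random() < P_BIRTH:
                lattice[target] = 1                  # birth into the empty neighbour
            elif rng.random() < P_MOVE:
                lattice[target], lattice[i] = 1, 0   # move to the empty neighbour

rng = np.random.default_rng(1)
density = np.zeros(STEPS)
for _ in range(REALISATIONS):
    lattice = np.zeros(L, dtype=int)
    lattice[:5] = 1                                  # clustered initial condition
    for t in range(STEPS):
        density[t] += lattice.mean()
        sweep(lattice, rng)
density /= REALISATIONS

# Mean field approximation: dC/dt = P_BIRTH * C * (1 - C), spatial correlations neglected.
C = np.empty(STEPS)
C[0] = 5 / L
for t in range(1, STEPS):
    C[t] = C[t - 1] + P_BIRTH * C[t - 1] * (1 - C[t - 1])

print("averaged lattice density:", density[-1])
print("mean field prediction:   ", C[-1])  # overestimates growth for clustered populations
```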
Abstract:
Electricity is the cornerstone of modern life. It is essential to economic stability and growth, jobs and improved living standards. Electricity is also the fundamental ingredient for a dignified life; it is the source of such basic human requirements as cooked food, a comfortable living temperature and essential health care. For these reasons, it is unimaginable that today's economies could function without electricity and the modern energy services that it delivers. Somewhat ironically, however, the current approach to electricity generation also contributes to two of the gravest and most persistent problems threatening the livelihood of humans. These problems are anthropogenic climate change and sustained human poverty. To address these challenges, the global electricity sector must reduce its reliance on fossil fuel sources. In this context, the object of this research is twofold. The first is to consider the design of the Renewable Energy (Electricity) Act 2000 (Cth) (Renewable Electricity Act), which represents Australia's primary regulatory approach to increasing the production of renewably sourced electricity. This analysis is conducted by reference to the regulatory models that exist in Germany and Great Britain. Within this context, the thesis then evaluates whether the Renewable Electricity Act is designed effectively to contribute to a more sustainable and dignified electricity generation sector in Australia. On the basis of this appraisal of the Renewable Electricity Act, the thesis contends that while certain aspects of the regulatory regime have merit, ultimately its design does not represent an effective and coherent regulatory approach to increasing the production of renewably sourced electricity. In this regard, the thesis proposes a number of recommendations to reform the existing regime. These recommendations are not intended to provide instantaneous or simple solutions to the current regulatory regime. Instead, their purpose is to establish the legal foundations for an effective regulatory regime designed to increase the production of renewably sourced electricity in Australia, in order to contribute to a more sustainable and dignified approach to electricity production.
Abstract:
This paper presents a novel framework for the modelling of passenger facilitation in a complex environment. The research is motivated by the challenges in the airport complex system, where there are multiple stakeholders, differing operational objectives and complex interactions and interdependencies between different parts of the airport system. Traditional methods for airport terminal modelling do not explicitly address the need for understanding causal relationships in a dynamic environment. Additionally, existing Bayesian Network (BN) models, which provide a means for capturing causal relationships, only present a static snapshot of a system. A method to integrate a BN complex systems model with stochastic queuing theory is developed based on the properties of the Poisson and Exponential distributions. The resultant Hybrid Queue-based Bayesian Network (HQBN) framework enables the simulation of arbitrary factors, their relationships, and their effects on passenger flow, and vice versa. A case study implementation of the framework is demonstrated on the inbound passenger facilitation process at Brisbane International Airport. The predicted outputs of the model, in terms of cumulative passenger flow at intermediary and end points in the inbound process, are found to have an $R^2$ goodness of fit of 0.9994 and 0.9982 respectively over a 10-hour test period. The utility of the framework is demonstrated on a number of usage scenarios including real-time monitoring and 'what-if' analysis. This framework provides the ability to analyse and simulate a dynamic complex system, and can be applied to other socio-technical systems such as hospitals.
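For readers unfamiliar with the queueing side, the sketch below illustrates the building block exploited by the hybrid framework: with Poisson arrivals and Exponential service times a processing point behaves as an M/M/1 queue, and a conditioning factor (here a toy 'extra checks' state in the spirit of a BN node) simply changes the service rate. All rates and names are illustrative assumptions, not the Brisbane case-study parameters.

```python
import random

random.seed(0)

def simulate_mm1(arrival_rate, service_rate, n_passengers=20000):
    """Single-server queue with Poisson arrivals and Exponential service.
    Returns the mean time a passenger spends in the system (waiting + service)."""
    t_arrival = 0.0
    server_free_at = 0.0
    total_time = 0.0
    for _ in range(n_passengers):
        t_arrival += random.expovariate(arrival_rate)   # Poisson process => exponential gaps
        start = max(t_arrival, server_free_at)
        server_free_at = start + random.expovariate(service_rate)
        total_time += server_free_at - t_arrival
    return total_time / n_passengers

# Hypothetical upstream factor conditioning the service rate, in the spirit of a BN node.
service_rate_given_factor = {"normal": 1.0, "extra_checks": 0.7}  # passengers per minute
arrival_rate = 0.5                                                # passengers per minute

for factor, mu in service_rate_given_factor.items():
    simulated = simulate_mm1(arrival_rate, mu)
    analytic = 1.0 / (mu - arrival_rate)   # M/M/1 mean time in system, valid while mu > lambda
    print(f"{factor:>12}: simulated {simulated:.2f} min, analytic {analytic:.2f} min")
```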
Abstract:
Introduction. The purpose of this chapter is to address the question raised in the chapter title. Specifically, how can models of motor control help us understand low back pain (LBP)? There are several classes of models that have been used in the past for studying spinal loading, stability, and risk of injury (see Reeves and Cholewicki (2003) for a review of past modeling approaches), but for the purpose of this chapter we will focus primarily on models used to assess motor control and its effect on spine behavior. This chapter consists of four sections. The first section discusses why a shift in modeling approaches is needed to study motor control issues. We will argue that the current approach for studying the spine system is limited and not well suited for assessing motor control issues related to spine function and dysfunction. The second section will explore how models can be used to gain insight into how the central nervous system (CNS) controls the spine. This segues nicely into the next section, which will address how models of motor control can be used in the diagnosis and treatment of LBP. Finally, the last section will deal with the issue of model verification and validity. This issue is important since modeling accuracy is critical for obtaining useful insight into the behavior of the system being studied. This chapter is not intended to be a critical review of the literature, but is instead intended to capture some of the discussion raised during the 2009 Spinal Control Symposium, with some elaboration on certain issues. Readers interested in more details are referred to the cited publications.
Abstract:
This article examines manual textual categorisation by human coders, with the hypothesis that the law of total probability may be violated for difficult categories. An empirical evaluation was conducted to compare a one-step categorisation task with a two-step categorisation task using crowdsourcing. It was found that the law of total probability was violated. Both quantum and classical probabilistic interpretations of this violation are presented. Further studies are required to resolve whether quantum models are more appropriate for this task.
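The test at the heart of the study can be stated in a few lines: under classical probability, the proportion of coders assigning category A in one step should equal the marginal recovered from the two-step task. The numbers below are placeholders for illustration only, not the crowdsourcing results.

```python
# Classical law of total probability: P(A) = P(A|B)P(B) + P(A|not B)P(not B).
# All values are illustrative placeholders, not data from the study.

p_A_one_step = 0.62       # proportion assigning category A directly (one-step task)

p_B = 0.55                # step 1: proportion assigning the intermediate category B
p_A_given_B = 0.80        # step 2: A given B was assigned
p_A_given_not_B = 0.30    # step 2: A given B was not assigned

p_A_two_step = p_A_given_B * p_B + p_A_given_not_B * (1 - p_B)

print(f"one-step P(A) = {p_A_one_step:.3f}")
print(f"two-step P(A) = {p_A_two_step:.3f}")
print(f"difference    = {p_A_one_step - p_A_two_step:+.3f}  "
      "(a reliably nonzero difference violates the law of total probability)")
```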
Abstract:
This project’s aim was to create new experimental models in small animals for the investigation of infections related to bone fracture fixation implants. Animal models are essential in orthopaedic trauma research and this study evaluated new implants and surgical techniques designed to improve standardisation in these experiments, and ultimately to minimise the number of animals needed in future work. This study developed and assessed procedures using plates and inter-locked nails to stabilise fractures in rabbit thigh bones. Fracture healing was examined with mechanical testing and histology. The results of this work contribute to improvements in future small animal infection experiments.
Abstract:
Pavlovian fear conditioning is a robust technique for examining behavioral and cellular components of fear learning and memory. In fear conditioning, the subject learns to associate a previously neutral stimulus with an inherently noxious co-stimulus. The learned association is reflected in the subjects' behavior upon subsequent re-exposure to the previously neutral stimulus or the training environment. Using fear conditioning, investigators can obtain a large amount of data that describe multiple aspects of learning and memory. In a single test, researchers can evaluate functional integrity in fear circuitry, which is both well characterized and highly conserved across species. Additionally, the availability of sensitive and reliable automated scoring software makes fear conditioning amenable to high-throughput experimentation in the rodent model; thus, this model of learning and memory is particularly useful for pharmacological and toxicological screening. Due to the conserved nature of fear circuitry across species, data from Pavlovian fear conditioning are highly translatable to human models. We describe equipment and techniques needed to perform and analyze conditioned fear data. We provide two examples of fear conditioning experiments, one in rats and one in mice, and the types of data that can be collected in a single experiment.
Abstract:
Pavlovian fear conditioning, also known as classical fear conditioning, is an important model in the study of the neurobiology of normal and pathological fear. Progress in the neurobiology of Pavlovian fear also enhances our understanding of disorders such as posttraumatic stress disorder (PTSD) and assists with developing effective treatment strategies. Here we describe how Pavlovian fear conditioning is a key tool for understanding both the neurobiology of fear and the mechanisms underlying variations in fear memory strength observed across different phenotypes. First, we discuss how Pavlovian fear models aspects of PTSD. Second, we describe the neural circuits of Pavlovian fear and the molecular mechanisms within these circuits that regulate fear memory. Finally, we show how fear memory strength is heritable, and describe genes that are specifically linked both to changes in Pavlovian fear behavior and to its underlying neural circuitry. These emerging data begin to define the essential genes, cells and circuits that contribute to normal and pathological fear.
Communication models of institutional online communities: the role of the ABC cultural intermediary
Abstract:
The co-creation of cultural artefacts has been democratised given the recent technological affordances of information and communication technologies. Web 2.0 technologies have enabled greater possibilities of citizen inclusion within the media conversations of their nations. For example, the Australian audience has more opportunities to collaboratively produce and tell their story to a broader audience via the public service media (PSM) facilitated platforms of the Australian Broadcasting Corporation (ABC). However, providing open collaborative production for the audience gives rise to a problem: how might the PSM manage the interests of all stakeholders and align those interests with its legislated Charter? This paper considers this problem through the ABC's user-created content participatory platform, ABC Pool, and highlights the cultural intermediary as the role responsible for managing these tensions. This paper also suggests cultural intermediation is a useful framework for other media organisations engaging in co-creative activities with their audiences.
Abstract:
Earthwork planning is considered in this article, and a generic block partitioning and modelling approach is devised to provide strategic plans at various levels of detail. Conceptually this approach is more accurate and comprehensive than others, for instance those that are section based. In response to environmental concerns, the metric for decision making was fuel consumption and emissions. Haulage distance and gradient are also included, as they are important components of these metrics. Advantageously, the fuel consumption metric is generic, captures the physical difficulty of travelling over inclines of different gradients, and is consistent across all hauling vehicles. For validation, the proposed models and techniques have been applied to a real-world road project. The numerical investigations have demonstrated that the models can be solved with relatively little CPU time. The proposed block models also result in solutions of superior quality, i.e. they have reduced fuel consumption and cost. Furthermore, the plans differ considerably from those based solely upon a distance-based metric, thus demonstrating a need for industry to reflect upon their current practices.
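To make the gradient-aware metric concrete, here is a generic, physics-style sketch of a haul fuel estimate (coefficients, payloads and the no-credit treatment of downhill legs are illustrative assumptions, not the calibrated metric of the article): two hauls of the same length can differ substantially in fuel once gradient is included, which is exactly what a distance-only metric misses.

```python
GRAVITY = 9.81  # m/s^2

def haul_fuel_litres(distance_m, rise_m, payload_kg,
                     rolling_coeff=0.03, fuel_per_joule=3.0e-8):
    """Generic estimate of fuel for one loaded haul.
    Energy = rolling resistance over the haul distance + work against gravity on the rise;
    downhill legs receive no credit here (a conservative simplification).
    All coefficients are illustrative assumptions."""
    rolling_energy = rolling_coeff * payload_kg * GRAVITY * distance_m
    grade_energy = payload_kg * GRAVITY * max(rise_m, 0.0)
    return (rolling_energy + grade_energy) * fuel_per_joule

# Two candidate hauls of identical length between cut and fill blocks:
flat = haul_fuel_litres(distance_m=500, rise_m=0, payload_kg=30000)
uphill = haul_fuel_litres(distance_m=500, rise_m=25, payload_kg=30000)  # about a 5% grade
print(f"flat haul:   {flat:.2f} L")
print(f"uphill haul: {uphill:.2f} L  (a distance-only metric would rank these as equal)")
```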
Abstract:
Executive Summary

Emergency health is a critical component of Australia's health system, and emergency departments (EDs) are increasingly congested as a result of growing demand and blocked access to inpatient beds. The Emergency Health Services Queensland (EHSQ) study aims to identify the factors driving increased demand for emergency health care and to evaluate strategies which may safely reduce future demand growth. This monograph addresses the perspectives of users of both ambulance services and EDs.

The research reported here aimed to identify the perspectives of users of emergency health services, both ambulance services and public hospital EDs, and to identify the factors that they took into consideration when exercising their choice of location for acute health care. A cross-sectional survey design was used, involving a survey of patients or their carers presenting to the EDs of a stratified sample of eight hospitals. A specific-purpose questionnaire was developed based on a novel theoretical model which had been derived from analysis of the literature (Monograph 1). Two survey versions were developed: one for adult patients (self-complete) and one for children (to be completed by parents/guardians). The questionnaires measured perceptions of social support, health status, illness severity and self-efficacy; beliefs and attitudes towards ED and ambulance services; reasons for using these services; and actions taken prior to the service request.

The survey was conducted at a stratified sample of eight hospitals representing major cities (four), inner regional (two) and outer regional and remote (two) areas. Due to practical limitations, data were collected for ambulance and ED users within hospital EDs, while patients were waiting for or under treatment. A sample size quota was determined for each ED based on its 2009/10 presentation volumes. The data collection was conducted by four members of the research team and a group of eight interviewers between March and May 2011 (corresponding to the autumn season). Of the total of 1608 patients in all eight emergency departments, the interviewers were able to approach 1361 (85%) patients and seek their consent to participate in the study. In total, 911 valid surveys were available for analysis (response rate = 67%).

These studies demonstrate that patients elected to attend hospital EDs in a considered fashion after weighing up alternatives, and there is no evidence of deliberate or ill-informed misuse.
• Patients attending ED have high levels of social support and self-efficacy that speak to the considered and purposeful nature of the exercise of choice.
• About one third of patients have new conditions, while two thirds have chronic illnesses.
• More than half the attendees (53.1%) had consulted a healthcare professional prior to making the decision.
• The decision to seek urgent care at an ED was mostly constructed around the patient's perception of the urgency and severity of their illness, reinforced by a strong perception that the hospital ED was the correct location for them (better specialised staff, better care for my condition, other options not as suitable).
• 33% of respondents held private hospital insurance but nevertheless attended a public hospital ED.

Similarly, patients exercised considered and rational judgements in their choice to seek help from the ambulance service.
• The decision to call for ambulance assistance was based on a strong perception about the severity of the illness (too severe to use other means of transport) and that other options were not considered appropriate.
• The decision also appeared influenced by a perception that the ambulance provided appropriate access to the ED which was considered most appropriate for their particular condition (too severe to go elsewhere, all facilities in one spot, better specialised and better care).
• In 43.8% of cases a health care professional advised use of the ambulance.
• Only a small number of people perceived that the ambulance should be freely available regardless of severity or appropriateness.

These findings confirm a growing understanding that the choice of professional emergency health care services is not made lightly, but rather is made by reasonable people exercising a judgement which is influenced by public awareness of the risks of acute ill health and which is most often informed by health professionals. It is also made on the basis of a rational weighing up of alternatives and a deliberate and considered choice to seek assistance from a service which the patient perceived was most appropriate to their needs at that time. These findings add weight to dispensing with public perceptions that ED and ambulance congestion is a result of inappropriate choices by patients. The challenge for health services is to better understand patients' needs and to design and validate services that meet those needs. The failure of our health system to do so should not be grounds for blaming the patient or claiming inappropriate patient choices.
Abstract:
Process-Aware Information Systems (PAISs) support executions of operational processes that involve people, resources, and software applications on the basis of process models. Process models describe vast, often infinite, collections of process instances, i.e., workflows supported by the systems. With the increasing adoption of PAISs, large process model repositories have emerged in companies and public organizations. These repositories constitute significant information resources. Accurate and efficient retrieval of process models and/or process instances from such repositories is of interest for multiple reasons, e.g., searching for similar models/instances, filtering, reuse, standardization, process compliance checking, verification of formal properties, etc. This paper proposes a technique for indexing process models that relies on their alternative representations, called untanglings. We show the use of untanglings for retrieval of process models, based on the process instances that they specify, via a solution to the total executability problem. Experiments with industrial process models demonstrate that the proposed retrieval approach is up to three orders of magnitude faster than the state of the art.
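As a generic illustration of the index-then-verify retrieval pattern that such indexing enables (this is not the untanglings representation itself; the repository, labels and helper below are hypothetical): a cheap structural index prunes the repository so that the expensive executability check runs only on a small candidate set.

```python
from collections import defaultdict

# Toy repository: each process model is reduced to the set of activity labels it can execute.
# In the paper the index is built from untanglings; here a naive activity-set index stands in.
repository = {
    "claim_handling_v1": {"register", "assess", "approve", "pay"},
    "claim_handling_v2": {"register", "assess", "reject"},
    "procurement":       {"order", "receive", "pay"},
}

index = defaultdict(set)                     # activity label -> models containing it
for model, activities in repository.items():
    for a in activities:
        index[a].add(model)

def expensive_executability_check(model, trace):
    """Placeholder for the costly check (e.g. replaying the instance on the model).
    Here it simply requires every activity of the trace to appear in the model."""
    return set(trace) <= repository[model]

def retrieve(trace):
    """Index first: only models containing every activity of the trace are candidates;
    the expensive check is then run on that (much smaller) candidate set."""
    candidates = set.intersection(*(index[a] for a in trace)) if trace else set(repository)
    return [m for m in candidates if expensive_executability_check(m, trace)]

print(retrieve(["register", "assess", "approve", "pay"]))   # -> ['claim_handling_v1']
```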