947 results for Stochastic models
Abstract:
Background: Models of service provision and professional training differ between countries. This study investigates a specialist intellectual disabilities model and a generic mental health model, comparing psychiatrists' knowledge and competencies, and service quality and accessibility, in meeting the mental health needs of people with intellectual disabilities. Method: Data were collected from consultant and trainee psychiatrists working within a specialist intellectual disabilities model (UK) and a generic mental health model (Australia). Results: The sample sizes were 294 (UK) and 205 (Australia). Statistically significant differences were found: UK participants held positive views of the specialist intellectual disabilities service model they worked within, which demonstrated flexible and accessible working practices, responded to the range of mental health needs of the population with intellectual disabilities, and provided a wide range of treatments and supports. The UK participants were knowledgeable, well trained and confident in their work, and wanted to work with people with intellectual disabilities. In all of these areas, the converse was found for the Australian generic mental health service model. Conclusions: The specialist intellectual disabilities model of service provision and training has advantages over the generic mental health model.
Abstract:
Channels are becoming an increasingly important area for companies to innovate in, as they provide direct points of contact with their customers. However, little is known with regard to multi-channel strategies that embody strategic brand values, or how customers experience these channels collectively. The purpose of this paper is to investigate how organisations configure multi-channel strategies to communicate their brand value and experience to their customers. Data were collated from sixty companies in the retail sector through a content analysis methodology. The results uncovered commonalities, captured in four meta-models that link common brand values, the intended emotive experience, individual channels and the customer segment. These meta-models are titled: High Quality, Trust, Convenience and Community. The research also presents the implications of a multi-channel design tool, based on the findings of this study, to help reinforce company brand values and design an overall connected customer experience.
Abstract:
Malaria has been eliminated from over 40 countries, with an additional 39 currently planning for, or committed to, elimination. Information on the likely impact of available interventions, and the time required, is urgently needed to help plan resource allocation. Mathematical modelling has been used to investigate the impact of various interventions; the strength of the conclusions is boosted when several models with differing formulations produce similar results. Here we use an individual-based stochastic simulation model of seasonal Plasmodium falciparum transmission to predict that transmission can be interrupted, and parasite reintroductions controlled, in villages of 1,000 individuals where the entomological inoculation rate is <7 infectious bites per person per year, using chemotherapy and bed net strategies. Above this transmission intensity, bed nets and symptomatic treatment alone were not sufficient to interrupt transmission and control the importation of malaria for at least 150 days. Our model results suggest that 1) stochastic events affect the likelihood of successfully interrupting transmission, with large variability in the times required; 2) the relative reduction in morbidity caused by the interventions was age-group specific and changed over time; and 3) the post-intervention changes in morbidity were larger than the corresponding impact on transmission. These results generally agree with the conclusions from previously published models. However, the model also predicted changes in parasite population structure as a result of improved treatment of symptomatic individuals: the survival probability of introduced parasites was reduced, leading to an increase in the prevalence of sub-patent infections in semi-immune individuals. This novel finding requires further investigation in the field because, if confirmed, such a change would have a negative impact on attempts to eliminate the disease from areas of moderate transmission.
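The abstract's first point, that stochastic events drive large variability in the time to interruption, can be illustrated with a toy individual-based simulation. This is a minimal sketch, not the authors' model; the SIS-style structure and all parameter values here are invented for illustration.

```python
import random

def simulate_elimination(n=1000, init_prev=0.2, beta=0.005,
                         clear_prob=0.10, max_days=3650, rng=None):
    """Toy SIS-style individual-based simulation (illustrative only).

    Each day, every susceptible is infected with probability
    beta * infected / n, and every infected individual clears
    infection (e.g. via treatment) with probability clear_prob.
    Returns the day on which transmission was interrupted, or None
    if it was not interrupted within max_days.
    """
    rng = rng or random.Random(0)
    infected = int(n * init_prev)
    for day in range(1, max_days + 1):
        force = beta * infected / n  # per-susceptible daily infection risk
        new_inf = sum(rng.random() < force for _ in range(n - infected))
        cleared = sum(rng.random() < clear_prob for _ in range(infected))
        infected += new_inf - cleared
        if infected == 0:
            return day
    return None

# Replicate runs expose the stochastic variability in time-to-elimination.
times = [simulate_elimination(rng=random.Random(seed)) for seed in range(20)]
```

With these invented parameters every replicate eliminates, but the elimination day varies from run to run, mirroring the variability the abstract describes.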
Abstract:
"This collection of papers offers a broad synopsis of state-of-the-art mathematical methods used in modeling the interaction between tumors and the immune system. These papers were presented at the four-day workshop on Mathematical Models of Tumor-Immune System Dynamics held in Sydney, Australia from January 7th to January 10th, 2013. The workshop brought together applied mathematicians, biologists, and clinicians actively working in the field of cancer immunology to share their current research and to increase awareness of the innovative mathematical tools that are applicable to the growing field of cancer immunology. Recent progress in cancer immunology and advances in immunotherapy suggest that the immune system plays a fundamental role in host defense against tumors and could be utilized to prevent or cure cancer. Although theoretical and experimental studies of tumor-immune system dynamics have a long history, there are still many unanswered questions about the mechanisms that govern the interaction between the immune system and a growing tumor. The multidimensional nature of these complex interactions requires a cross-disciplinary approach to capture more realistic dynamics of the essential biology. The papers presented in this volume explore these issues and the results will be of interest to graduate students and researchers in a variety of fields within mathematical and biological sciences."--Publisher website
Abstract:
This paper presents an event-based failure model to predict the number of failures that occur in water distribution assets. Often, such models have been based on analysis of historical failure data combined with pipe characteristics and environmental conditions. In this paper weather data have been added to the model to take into account the commonly observed seasonal variation of the failure rate. The theoretical basis of existing logistic regression models is briefly described in this paper, along with the refinements made to the model for inclusion of seasonal variation of weather. The performance of these refinements is tested using data from two Australian water authorities.
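The idea of adding a seasonal weather effect to a failure model can be sketched with harmonic covariates in a simple logistic regression. This is a hedged illustration only: the paper's event-based model, its covariates and its data are different, and everything below (the four-year synthetic record, the winter-peaking failure odds, the gradient-ascent fit) is invented.

```python
import numpy as np

rng = np.random.default_rng(42)
days = np.arange(1461)  # four years of synthetic daily records for a pipe cohort
sin_t = np.sin(2 * np.pi * days / 365.25)
cos_t = np.cos(2 * np.pi * days / 365.25)

# Hypothetical ground truth: failure odds peak when cos_t is at its maximum.
true_logit = -3.0 + 1.0 * cos_t
failures = rng.random(days.size) < 1.0 / (1.0 + np.exp(-true_logit))

# Fit a logistic model with harmonic seasonal covariates by gradient ascent
# on the mean log-likelihood.
X = np.column_stack([np.ones(days.size), sin_t, cos_t])
w = np.zeros(3)
for _ in range(3000):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w += 0.5 * X.T @ (failures - p) / days.size

p_hat = 1.0 / (1.0 + np.exp(-X @ w))  # fitted seasonal failure probability
```

The fitted coefficient on `cos_t` recovers the invented seasonal peak; real applications would use recorded weather series and pipe characteristics rather than harmonic proxies.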
Abstract:
The business model concept is gaining traction in different disciplines but is still criticized for being fuzzy and vague and lacking consensus on its definition and compositional elements. In this paper we set out to advance our understanding of the business model concept by addressing three areas of foundational research: business model definitions, business model elements, and business model archetypes. We define a business model as a representation of the value logic of an organization in terms of how it creates and captures customer value. This abstract and generic definition is made more specific and operational by the compositional elements that need to address the customer, value proposition, organizational architecture (firm and network level) and economics dimensions. Business model archetypes complement the definition and elements by providing a more concrete and empirical understanding of the business model concept. The main contributions of this paper are (1) explicitly including the customer value concept in the business model definition and focussing on value creation, (2) presenting four core dimensions that business model elements need to cover, (3) arguing for flexibility by adapting and extending business model elements to cater for different purposes and contexts (e.g. technology, innovation, strategy), (4) stressing a more systematic approach to business model archetypes by using business model elements for their description, and (5) suggesting the use of business model archetype research for the empirical exploration and testing of business model elements and their relationships.
Abstract:
A discrete agent-based model on a periodic lattice of arbitrary dimension is considered. Agents move to nearest-neighbor sites by a motility mechanism accounting for general interactions, which may include volume exclusion. The partial differential equation describing the average occupancy of the agent population is derived systematically. A diffusion equation arises for all types of interactions and is nonlinear except for the simplest interactions. In addition, multiple species of interacting subpopulations give rise to an advection-diffusion equation for each subpopulation. This work extends and generalizes previous specific results, providing a construction method for determining the transport coefficients in terms of a single conditional transition probability, which depends on the occupancy of sites in an influence region. These coefficients characterize the diffusion of agents in a crowded environment in biological and physical processes.
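The motility mechanism described above, nearest-neighbour moves subject to volume exclusion, can be sketched as a random sequential update on a one-dimensional periodic lattice. This is a minimal illustrative realisation, not the paper's general construction; lattice size, initial condition and update rule are chosen for demonstration.

```python
import random

def step(lattice, rng):
    """One Monte Carlo step of a 1-D exclusion process on a periodic lattice.

    Make len(lattice) motility attempts: pick a random site; if it holds
    an agent, the agent tries to hop to a random nearest neighbour, and
    the hop is aborted if the target site is occupied (volume exclusion).
    """
    L = len(lattice)
    for _ in range(L):
        i = rng.randrange(L)
        if lattice[i]:
            j = (i + rng.choice((-1, 1))) % L
            if not lattice[j]:
                lattice[i], lattice[j] = 0, 1

rng = random.Random(1)
L = 200
lattice = [1 if 80 <= i < 120 else 0 for i in range(L)]  # initial block of 40 agents
for _ in range(500):
    step(lattice, rng)
```

Averaging the occupancy over many such realisations produces the column-density profile whose continuum limit is the (generally nonlinear) diffusion equation the paper derives.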
Abstract:
This study analyses and compares the cost efficiency of Japanese steam power generation companies using the fixed and random Bayesian frontier models. We show that it is essential to account for heterogeneity in modelling the performance of energy companies. Results from the model estimation also indicate that restricting CO2 emissions can lead to a decrease in total cost. The study finally discusses the efficiency variations between the energy companies under analysis, and elaborates on the managerial and policy implications of the results.
Abstract:
This thesis has contributed to the advancement of knowledge in disease modelling by addressing crucial issues in modelling health data over space and time. The research has led to an increased understanding of spatial scales, temporal scales and spatial smoothing for modelling diseases, in terms of both methodology and applications. This research is of particular significance to researchers seeking to employ statistical modelling techniques over space and time in various disciplines. A broad class of statistical models is employed to assess what impact spatial and temporal scales have on simulated and real data.
Abstract:
Identifying railway capacity is an important task that can determine "in principle" whether a network can handle an intended traffic flow, and whether there is any free capacity left for additional train services. Capacity determination techniques can also be used to identify how best to improve an existing network at least cost. In this article an optimization approach is applied to a case study of the Iran national railway, in order to identify its current capacity and to optimally expand it under a variety of technical conditions. This railway is very important in Iran and will be upgraded extensively in the coming years; hence the conclusions in this article may help in that endeavour. A sensitivity analysis is recommended to evaluate a wider range of possible scenarios, so that more useful lower and upper bounds can be provided for the performance of the system.
Abstract:
Approximately half of prostate cancers (PCa) carry TMPRSS2-ERG translocations; however, the clinical impact of this genomic alteration remains enigmatic. Expression of v-ets erythroblastosis virus E26 oncogene like (avian) gene (ERG) promotes prostatic epithelial dysplasia in transgenic mice and acquisition of epithelial-to-mesenchymal transition (EMT) characteristics in human prostatic epithelial cells (PrECs). To explore whether ERG-induced EMT in PrECs was associated with therapeutically targetable transformation characteristics, we established stable populations of BPH-1, PNT1B and RWPE-1 immortalized human PrEC lines that constitutively express flag-tagged ERG3 (fERG). All fERG-expressing populations exhibited characteristics of in vitro and in vivo transformation. Microarray analysis revealed >2000 commonly dysregulated genes in the fERG-PrEC lines. Functional analysis revealed evidence that fERG cells underwent EMT and acquired invasive characteristics. The fERG-induced EMT transcript signature was exemplified by suppressed expression of E-cadherin and keratins 5, 8, 14 and 18; elevated expression of N-cadherin, N-cadherin 2 and vimentin, and of the EMT transcriptional regulators Snail, Zeb1 and Zeb2, and lymphoid enhancer-binding factor-1 (LEF-1). In BPH-1 and RWPE-1-fERG cells, fERG expression is correlated with increased expression of integrin-linked kinase (ILK) and its downstream effectors Snail and LEF-1. Interfering RNA suppression of ERG decreased expression of ILK, Snail and LEF-1, whereas small interfering RNA suppression of ILK did not alter fERG expression. Interfering RNA suppression of ERG or ILK impaired fERG-PrEC Matrigel invasion. Treating fERG-BPH-1 cells with the small molecule ILK inhibitor, QLT-0267, resulted in dose-dependent suppression of Snail and LEF-1 expression, Matrigel invasion and reversion of anchorage-independent growth. These results suggest that ILK is a therapeutically targetable mediator of ERG-induced EMT and transformation in PCa.
Abstract:
Computational models in physiology often integrate functional and structural information from a large range of spatio-temporal scales from the ionic to the whole organ level. Their sophistication raises both expectations and scepticism concerning how computational methods can improve our understanding of living organisms and also how they can reduce, replace and refine animal experiments. A fundamental requirement to fulfil these expectations and achieve the full potential of computational physiology is a clear understanding of what models represent and how they can be validated. The present study aims at informing strategies for validation by elucidating the complex interrelations between experiments, models and simulations in cardiac electrophysiology. We describe the processes, data and knowledge involved in the construction of whole ventricular multiscale models of cardiac electrophysiology. Our analysis reveals that models, simulations, and experiments are intertwined, in an assemblage that is a system itself, namely the model-simulation-experiment (MSE) system. Validation must therefore take into account the complex interplay between models, simulations and experiments. Key points for developing strategies for validation are: 1) understanding sources of bio-variability is crucial to the comparison between simulation and experimental results; 2) robustness of techniques and tools is a pre-requisite to conducting physiological investigations using the MSE system; 3) definition and adoption of standards facilitates interoperability of experiments, models and simulations; 4) physiological validation must be understood as an iterative process that defines the specific aspects of electrophysiology the MSE system targets, and is driven by advancements in experimental and computational methods and the combination of both.
Abstract:
Spatial data are now prevalent in a wide range of fields including environmental and health science. This has led to the development of a range of approaches for analysing patterns in these data. In this paper, we compare several Bayesian hierarchical models for analysing point-based data based on the discretization of the study region, resulting in grid-based spatial data. The approaches considered include two parametric models and a semiparametric model. We highlight the methodology and computation for each approach. Two simulation studies are undertaken to compare the performance of these models for various structures of simulated point-based data that resemble environmental data. A case study of a real dataset is also conducted to demonstrate a practical application of the modelling approaches. Goodness-of-fit statistics are computed to compare estimates of the intensity functions. The deviance information criterion is also considered as an alternative model evaluation criterion. The results suggest that the adaptive Gaussian Markov random field model performs well for highly sparse point-based data where there are large variations or clustering across the space, whereas the discretized log Gaussian Cox process produces a good fit for dense and clustered point-based data. One should generally consider the nature and structure of the point-based data in order to choose the appropriate method for modelling discretized spatial point-based data.
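The grid-based approaches compared here all start from the same preprocessing step: discretizing point locations into cell counts over the study region. A minimal sketch of that step, with invented point data on the unit square:

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical point-based data: 500 event locations in the unit square.
points = rng.random((500, 2))

# Discretise the study region into a 10 x 10 grid of cell counts,
# producing the grid-based spatial data the hierarchical models take as input.
counts, x_edges, y_edges = np.histogram2d(
    points[:, 0], points[:, 1], bins=10, range=[[0, 1], [0, 1]])
```

A Poisson model for these counts with a latent spatial field on the log-intensity (e.g. a Gaussian Markov random field prior) then yields the kind of hierarchical model the paper compares.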
Abstract:
Existing crowd counting algorithms rely on holistic, local or histogram-based features to capture crowd properties. Regression is then employed to estimate the crowd size. Insufficient testing across multiple datasets has made it difficult to compare and contrast different methodologies. This paper presents an evaluation across multiple datasets to compare holistic, local and histogram-based methods, and to compare various image features and regression models. A K-fold cross-validation protocol is followed to evaluate the performance across five public datasets: UCSD, PETS 2009, Fudan, Mall and Grand Central. Image features are categorised into five types: size, shape, edges, keypoints and textures. The regression models evaluated are: Gaussian process regression (GPR), linear regression, K nearest neighbours (KNN) and neural networks (NN). The results demonstrate that local features outperform equivalent holistic and histogram-based features; that optimal performance is observed using all image features except textures; and that GPR outperforms linear, KNN and NN regression.
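As an illustration of the regression step, below is a minimal Gaussian process regression (RBF kernel plus a noise term) mapping a single hypothetical holistic feature, say foreground area, to a crowd count. The feature values, counts and hyperparameters are invented; the paper's feature sets, kernels and evaluation protocol are of course richer.

```python
import numpy as np

def gpr_predict(X_train, y_train, X_test, length=1.0, sigma_n=0.1):
    """Posterior mean of a zero-mean GP with an RBF kernel.

    length  -- kernel lengthscale
    sigma_n -- observation noise standard deviation
    """
    def k(a, b):
        d = a[:, None] - b[None, :]
        return np.exp(-0.5 * (d / length) ** 2)

    K = k(X_train, X_train) + sigma_n ** 2 * np.eye(len(X_train))
    alpha = np.linalg.solve(K, y_train)   # K^{-1} y
    return k(X_test, X_train) @ alpha     # k_* K^{-1} y

# Hypothetical training pairs: a scalar feature vs. observed crowd count.
area = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
count = np.array([5.0, 11.0, 14.0, 21.0, 24.0, 31.0])
pred = gpr_predict(area, count, np.array([1.75]))
```

Interior test points interpolate smoothly between the training counts, which is one reason GPR copes well with the noisy, roughly monotone feature-to-count mappings found in crowd counting.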