975 results for Average models


Relevance:

20.00%

Publisher:

Abstract:

We consider a discrete agent-based model on a one-dimensional lattice and a two-dimensional square lattice, where each agent is a dimer occupying two sites. Agents move by vacating one occupied site in favor of a nearest-neighbor site and obey either a strict simple exclusion rule or a weaker constraint that permits partial overlaps between dimers. Using indicator variables and careful probability arguments, a discrete-time master equation for these processes is derived systematically within a mean-field approximation. In the continuum limit, nonlinear diffusion equations that describe the average agent occupancy of the dimer population are obtained. In addition, we show that multiple species of interacting subpopulations give rise to advection-diffusion equations. Averaged discrete simulation data compare very well with the solutions of the continuum partial differential equation models. Since many cell types are elongated rather than circular, this work offers insight into the population-level behavior of collective cellular motion.
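
As a minimal illustration of the kind of process described, the sketch below simulates dimers on a one-dimensional periodic lattice under the strict simple-exclusion rule; the lattice size, dimer count and random-sequential update scheme are illustrative choices, not the paper's exact setup.

```python
import numpy as np

def simulate_dimers(L=200, n_dimers=40, steps=10_000, seed=0):
    """Random sequential updates of dimers on a 1D periodic lattice.

    Each dimer occupies sites (h, h+1); a move shifts it one site left
    or right and is accepted only if the target site is empty
    (strict simple exclusion).
    """
    rng = np.random.default_rng(seed)
    occupied = np.zeros(L, dtype=bool)
    # Place dimers on disjoint site pairs (2*i, 2*i+1).
    heads = rng.choice(L // 2, size=n_dimers, replace=False) * 2
    for h in heads:
        occupied[h] = occupied[(h + 1) % L] = True
    heads = list(heads)

    for _ in range(steps):
        i = rng.integers(n_dimers)               # pick a dimer at random
        h = heads[i]
        if rng.random() < 0.5:                   # attempt a right move
            target, vacate, new_h = (h + 2) % L, h, (h + 1) % L
        else:                                    # attempt a left move
            target, vacate, new_h = (h - 1) % L, (h + 1) % L, (h - 1) % L
        if not occupied[target]:                 # exclusion check
            occupied[vacate] = False
            occupied[target] = True
            heads[i] = new_h
    return occupied
```

Averaging `occupied` over many seeded realizations gives the mean occupancy profile that the continuum nonlinear diffusion equation is meant to approximate.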

Relevance:

20.00%

Publisher:

Abstract:

This study analyses and compares the cost efficiency of Japanese steam power generation companies using the fixed and random Bayesian frontier models. We show that it is essential to account for heterogeneity in modelling the performance of energy companies. Results from the model estimation also indicate that restricting CO2 emissions can lead to a decrease in total cost. The study finally discusses the efficiency variations between the energy companies under analysis, and elaborates on the managerial and policy implications of the results.
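
The paper estimates Bayesian fixed- and random-effects frontier models; as a rough classical analogue, the sketch below codes the log-likelihood of a normal/half-normal stochastic cost frontier and maximises it with scipy. The regressor layout is hypothetical and this is not the paper's Bayesian estimator.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def neg_loglik(theta, X, ln_cost):
    """Normal/half-normal stochastic cost frontier: ln C = X beta + v + u,
    with v ~ N(0, sigma_v^2) noise and u ~ |N(0, sigma_u^2)| inefficiency
    (inefficiency raises cost, so the composed error is v + u)."""
    k = X.shape[1]
    beta, ln_sv, ln_su = theta[:k], theta[k], theta[k + 1]
    sv, su = np.exp(ln_sv), np.exp(ln_su)
    sigma = np.hypot(sv, su)            # composed standard deviation
    lam = su / sv                       # inefficiency-to-noise ratio
    eps = ln_cost - X @ beta            # composed residual
    ll = (np.log(2 / sigma) + norm.logpdf(eps / sigma)
          + norm.logcdf(eps * lam / sigma))
    return -ll.sum()

# Synthetic demonstration; columns of X would be a constant, log output
# and log input prices for each generator-year observation.
rng = np.random.default_rng(0)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n)])
eps = rng.normal(0, 0.2, n) + np.abs(rng.normal(0, 0.4, n))
ln_cost = X @ np.array([1.0, 0.6]) + eps
theta0 = np.zeros(X.shape[1] + 2)
fit = minimize(neg_loglik, theta0, args=(X, ln_cost), method="BFGS")
print(fit.x[:2])   # frontier coefficient estimates
```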

Relevance:

20.00%

Publisher:

Abstract:

This thesis has contributed to the advancement of knowledge in disease modelling by addressing interesting and crucial issues relevant to modelling health data over space and time. The research has led to an increased understanding of spatial scales, temporal scales, and spatial smoothing for modelling diseases, in terms of both methodology and applications. This research is of particular significance to researchers seeking to employ statistical modelling techniques over space and time in various disciplines. A broad class of statistical models is employed to assess the impact that spatial and temporal scales have on simulated and real data.

Relevance:

20.00%

Publisher:

Abstract:

Identifying railway capacity is an important task: it can establish, in principle, whether the network can handle an intended traffic flow and whether there is any free capacity left for additional train services. Capacity determination techniques can also be used to identify how best to improve an existing network at the least cost. In this article an optimization approach is applied to a case study of the Iran national railway, in order to identify its current capacity and to optimally expand it under a variety of technical conditions. This railway is very important in Iran and will be upgraded extensively in the coming years; hence the conclusions of this article may help in that endeavor. A sensitivity analysis is recommended to evaluate a wider range of possible scenarios, so that more useful lower and upper bounds can be provided for the performance of the system.
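
The abstract does not give the optimization formulation, but a minimal reading treats line capacity as the bottleneck throughput of consecutive track sections. The sketch below expresses that as a tiny linear program in PuLP; the corridor, section capacities and variable names are invented for illustration.

```python
from pulp import LpMaximize, LpProblem, LpVariable, value

# Hypothetical corridor of three track sections, trains/day each
cap = {"A-B": 90, "B-C": 60, "C-D": 80}

prob = LpProblem("line_capacity", LpMaximize)
flow = LpVariable("through_trains", lowBound=0)
prob += flow                       # objective: maximise daily throughput
for c in cap.values():
    prob += flow <= c              # every through train crosses each section

prob.solve()
print("line capacity:", value(flow))   # bottleneck section B-C gives 60
```

The paper's expansion question would enter such a model as integer upgrade variables that raise section capacities at a cost, subject to a budget.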

Relevance:

20.00%

Publisher:

Abstract:

Approximately half of prostate cancers (PCa) carry TMPRSS2-ERG translocations; however, the clinical impact of this genomic alteration remains enigmatic. Expression of v-ets erythroblastosis virus E26 oncogene like (avian) gene (ERG) promotes prostatic epithelial dysplasia in transgenic mice and acquisition of epithelial-to-mesenchymal transition (EMT) characteristics in human prostatic epithelial cells (PrECs). To explore whether ERG-induced EMT in PrECs was associated with therapeutically targetable transformation characteristics, we established stable populations of BPH-1, PNT1B and RWPE-1 immortalized human PrEC lines that constitutively express flag-tagged ERG3 (fERG). All fERG-expressing populations exhibited characteristics of in vitro and in vivo transformation. Microarray analysis revealed >2000 commonly dysregulated genes in the fERG-PrEC lines. Functional analysis revealed evidence that fERG cells underwent EMT and acquired invasive characteristics. The fERG-induced EMT transcript signature was exemplified by suppressed expression of E-cadherin and keratins 5, 8, 14 and 18; elevated expression of N-cadherin, N-cadherin 2 and vimentin, and of the EMT transcriptional regulators Snail, Zeb1 and Zeb2, and lymphoid enhancer-binding factor-1 (LEF-1). In BPH-1 and RWPE-1-fERG cells, fERG expression is correlated with increased expression of integrin-linked kinase (ILK) and its downstream effectors Snail and LEF-1. Interfering RNA suppression of ERG decreased expression of ILK, Snail and LEF-1, whereas small interfering RNA suppression of ILK did not alter fERG expression. Interfering RNA suppression of ERG or ILK impaired fERG-PrEC Matrigel invasion. Treating fERG-BPH-1 cells with the small molecule ILK inhibitor, QLT-0267, resulted in dose-dependent suppression of Snail and LEF-1 expression, Matrigel invasion and reversion of anchorage-independent growth. These results suggest that ILK is a therapeutically targetable mediator of ERG-induced EMT and transformation in PCa.

Relevance:

20.00%

Publisher:

Abstract:

Computational models in physiology often integrate functional and structural information from a large range of spatio-temporal scales from the ionic to the whole organ level. Their sophistication raises both expectations and scepticism concerning how computational methods can improve our understanding of living organisms and also how they can reduce, replace and refine animal experiments. A fundamental requirement to fulfil these expectations and achieve the full potential of computational physiology is a clear understanding of what models represent and how they can be validated. The present study aims at informing strategies for validation by elucidating the complex interrelations between experiments, models and simulations in cardiac electrophysiology. We describe the processes, data and knowledge involved in the construction of whole ventricular multiscale models of cardiac electrophysiology. Our analysis reveals that models, simulations, and experiments are intertwined, in an assemblage that is a system itself, namely the model-simulation-experiment (MSE) system. Validation must therefore take into account the complex interplay between models, simulations and experiments. Key points for developing strategies for validation are: 1) understanding sources of bio-variability is crucial to the comparison between simulation and experimental results; 2) robustness of techniques and tools is a pre-requisite to conducting physiological investigations using the MSE system; 3) definition and adoption of standards facilitates interoperability of experiments, models and simulations; 4) physiological validation must be understood as an iterative process that defines the specific aspects of electrophysiology the MSE system targets, and is driven by advancements in experimental and computational methods and the combination of both.

Relevance:

20.00%

Publisher:

Abstract:

Existing crowd counting algorithms rely on holistic, local or histogram-based features to capture crowd properties. Regression is then employed to estimate the crowd size. Insufficient testing across multiple datasets has made it difficult to compare and contrast different methodologies. This paper presents an evaluation across multiple datasets to compare holistic, local and histogram-based methods, and to compare various image features and regression models. A K-fold cross-validation protocol is followed to evaluate the performance across five public datasets: UCSD, PETS 2009, Fudan, Mall and Grand Central. Image features are categorised into five types: size, shape, edges, keypoints and textures. The regression models evaluated are: Gaussian process regression (GPR), linear regression, K nearest neighbours (KNN) and neural networks (NN). The results demonstrate that local features outperform equivalent holistic and histogram-based features; that optimal performance is observed using all image features except textures; and that GPR outperforms linear, KNN and NN regression.
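
A minimal sketch of the regression comparison under K-fold cross-validation follows, using scikit-learn stand-ins for the four model families; the feature matrix is a random placeholder, since the extracted size/shape/edge/keypoint/texture features cannot be reproduced from the abstract.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold, cross_val_score
from sklearn.neighbors import KNeighborsRegressor
from sklearn.neural_network import MLPRegressor

X = np.random.rand(200, 10)    # placeholder per-frame feature matrix
y = np.random.rand(200) * 50   # placeholder per-frame crowd counts

models = {
    "GPR": GaussianProcessRegressor(),
    "linear": LinearRegression(),
    "KNN": KNeighborsRegressor(n_neighbors=5),
    "NN": MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000),
}
cv = KFold(n_splits=5, shuffle=True, random_state=0)
for name, model in models.items():
    # mean absolute error, averaged over the K held-out folds
    mae = -cross_val_score(model, X, y, cv=cv,
                           scoring="neg_mean_absolute_error").mean()
    print(f"{name}: MAE = {mae:.2f}")
```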

Relevance:

20.00%

Publisher:

Abstract:

Background: Extreme heat is a leading weather-related cause of illness and death in many locations across the globe, including subtropical Australia. The possibility of increasingly frequent and severe heat waves warrants continued efforts to reduce this health burden, which could be accomplished by targeting intervention measures toward the most vulnerable communities. Objectives: We sought to quantify spatial variability in heat-related morbidity in Brisbane, Australia, to highlight regions of the city with the greatest risk. We also aimed to find area-level social and environmental determinants of high risk within Brisbane. Methods: We used a series of hierarchical Bayesian models to examine city-wide and intracity associations between temperature and morbidity using a 2007–2011 time series of geographically referenced hospital admissions data. The models accounted for long-term time trends, seasonality, and day of week and holiday effects. Results: On average, a 10°C increase in daily maximum temperature during the summer was associated with a 7.2% increase in hospital admissions (95% CI: 4.7, 9.8%) on the following day. Positive statistically significant relationships between admissions and temperature were found for 16 of the city’s 158 areas; negative relationships were found for 5 areas. High-risk areas were associated with a lack of high income earners and higher population density. Conclusions: Geographically targeted public health strategies for extreme heat may be effective in Brisbane, because morbidity risk was found to be spatially variable. Emergency responders, health officials, and city planners could focus on short- and long-term intervention measures that reach communities in the city with lower incomes and higher population densities, including reduction of urban heat island effects.
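
A much-simplified, single-area analogue of this analysis can be written as a Poisson GLM of daily admissions on the previous day's maximum temperature, with trend, seasonal and day-of-week controls. The column names are assumptions, and the paper's hierarchical Bayesian structure over city areas is not reproduced here.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# df: one row per summer day with hypothetical columns
#   admissions, tmax, date
# df = pd.read_csv("brisbane_admissions.csv", parse_dates=["date"])

def fit_heat_model(df: pd.DataFrame):
    df = df.sort_values("date").copy()
    df["tmax_lag1"] = df["tmax"].shift(1)    # previous-day exposure
    df["dow"] = df["date"].dt.dayofweek      # day-of-week effects
    df["doy"] = df["date"].dt.dayofyear      # crude seasonal control
    df["trend"] = np.arange(len(df))         # long-term time trend
    model = smf.glm(
        "admissions ~ tmax_lag1 + C(dow) + doy + trend",
        data=df.dropna(),
        family=sm.families.Poisson(),
    ).fit()
    # exp(10 * beta) approximates the relative risk of a 10 degree C rise
    rr10 = np.exp(10 * model.params["tmax_lag1"])
    return model, rr10
```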

Relevance:

20.00%

Publisher:

Abstract:

It is often said that Australia is a world leader in rates of copyright infringement for entertainment goods. In 2012, the hit television show Game of Thrones was the most downloaded television show over BitTorrent, and estimates suggest that Australians accounted for the largest national share, nearly 10%, of the 3-4 million downloads each week. The season finale of 2013 was downloaded over a million times within 24 hours of its release, and again Australians were the largest bloc of illicit downloaders over BitTorrent, despite our relatively small population. This trend has led the former US Ambassador to Australia to implore Australians to stop 'stealing' digital content, and rightsholders to push for increasing sanctions on copyright infringers. The Australian Government is looking to respond by requiring Internet Service Providers to issue warnings and potentially punish consumers who are alleged by industry groups to have infringed copyright. This is the logical next step in deterring infringement, given that the operators of infringing networks (like The Pirate Bay, for example) are out of regulatory reach. This steady ratcheting up of the strength of copyright, however, comes at a significant cost to user privacy and autonomy, and while the decentralisation of enforcement reduces costs, it also reduces the due process safeguards provided by the judicial process. This article presents qualitative evidence that substantiates a common intuition: one of the major reasons that Australians seek out illicit downloads of content like Game of Thrones in such numbers is that it is more difficult to access legitimately in Australia. The geographically segmented way in which copyright is exploited at an international level has given rise to a ‘tyranny of digital distance’, where Australians have less access to copyright goods than consumers in other countries. Compared to consumers in the US and the EU, Australians pay more for digital goods, have less choice in distribution channels, are exposed to substantial delays in access, and are sometimes denied access completely. In this article we focus our analysis on premium film and television offerings, like Game of Thrones, and through semi-structured interviews, explore how choices in distribution affect the willingness of Australian consumers to seek out infringing copies of copyright material. Game of Thrones provides an excellent case study through which to frame this analysis: it is both one of the least legally accessible television offerings and one of the most downloaded through filesharing networks of recent times. Our analysis shows that at the same time as rightsholder groups, particularly in the film and television industries, are lobbying for stronger laws to counter illicit distribution, the business practices of their member organisations are counter-productively increasing incentives for consumers to infringe. The lack of accessibility and high prices of copyright goods in Australia lead to substantial economic waste. The unmet consumer demand means that Australian consumers are harmed by lower access to information and entertainment goods than consumers in other jurisdictions. The higher rates of infringement that fulfil some of this unmet demand increase enforcement costs for copyright owners and impose burdens either on our judicial system or on private entities – like ISPs – who may be tasked with enforcing the rights of third parties.
Most worryingly, the lack of convenient and cheap legitimate digital distribution channels risks undermining public support for copyright law. Our research shows that consumers blame rightsholders for failing to meet market demand, and this encourages a social norm that infringing copyright, while illegal, is not morally wrongful. The implications are as simple as they are profound: Australia should not take steps to increase the strength of copyright law at this time. The interests of the public and those of rightsholders align better when there is effective competition in distribution channels and consumers can legitimately get access to content. While foreign rightsholders are seeking enhanced protection for their interests, increasing enforcement is likely to increase their ability to engage in lucrative geographical price-discrimination, particularly for premium content. This is only likely to increase the degree to which Australian consumers feel that their interests are not being met and, consequently, to further undermine the legitimacy of copyright law. If consumers are to respect copyright law, increasing sanctions for infringement without enhancing access and competition in legitimate distribution channels could be dangerously counter-productive. We suggest that rightsholders’ best strategy for addressing infringement in Australia at this time is to ensure that Australians can access copyright goods in a timely, affordable, convenient, fair, and lawful manner.

Relevance:

20.00%

Publisher:

Abstract:

Agent-based modeling and simulation (ABMS) may fit well with entrepreneurship research and practice because the core concepts and basic premises of entrepreneurship coincide with the characteristics of ABMS. However, it is difficult to find cases where ABMS has been applied to entrepreneurship research. To apply ABMS to entrepreneurship and organization studies, designing a conceptual model is important; thus, various mixed-method approaches are being attempted to design conceptual models effectively. As a new mixed-method approach to ABMS, this study proposes a bibliometric approach to designing agent-based models, which establishes and analyzes a domain corpus. The study presents an example on the venture creation process: the results of multi-agent simulations of the venturing process, based on the bibliometric approach, are close to each nation's surveyed data on venturing activities. In conclusion, with the bibliometric approach proposed in this study, the agents and agent behaviors related to a phenomenon can be extracted effectively, and a conceptual model for ABMS can be designed from them. This study contributes to entrepreneurship and organization studies by promoting the application of ABMS.
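
For readers unfamiliar with ABMS, the sketch below shows the general shape of a venture-creation agent model; the agent attributes and transition rule are invented for illustration and are not the paper's bibliometrically derived conceptual model.

```python
import random

class Entrepreneur:
    """Illustrative agent progressing nascent -> startup -> established."""
    def __init__(self):
        self.stage = "nascent"
        self.resources = random.random()

    def step(self, climate):
        # Chance of advancing a stage grows with the agent's resources
        # and a (hypothetical) national entrepreneurial-climate parameter.
        if self.stage != "established" and random.random() < self.resources * climate:
            self.stage = "startup" if self.stage == "nascent" else "established"
        self.resources = min(1.0, self.resources + 0.05)

def run(n_agents=1000, steps=20, climate=0.3, seed=1):
    random.seed(seed)
    agents = [Entrepreneur() for _ in range(n_agents)]
    for _ in range(steps):
        for a in agents:
            a.step(climate)
    return {s: sum(a.stage == s for a in agents)
            for s in ("nascent", "startup", "established")}

print(run())   # counts per stage after 20 steps
```

Calibrating parameters such as `climate` against national survey data on venturing activities is the sort of comparison the abstract reports.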

Relevance:

20.00%

Publisher:

Abstract:

This digital poster (which was on display at "The Cube", Queensland University of Technology) demonstrates how specification parameters can be extracted from a product library repository for use in augmenting the information contents of the objects in a local BIM tool (Revit in this instance).

Relevance:

20.00%

Publisher:

Abstract:

Rating systems are used by many websites, allowing customers to rate available items according to their own experience. Reputation models then aggregate the available ratings to generate reputation scores for items. A problem with current reputation models is that they aim to enhance accuracy on sparse datasets without considering how they perform on dense datasets. In this paper, we propose a novel reputation model that generates more accurate reputation scores for items on any dataset, whether dense or sparse. The proposed model is a weighted average method in which the weights are generated using the normal distribution. Experiments show promising results for the proposed model over state-of-the-art ones on both sparse and dense datasets.
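
The abstract describes the model only as a weighted average whose weights come from the normal distribution. One plausible reading, sketched below, weights each rating by the normal density of its distance from the sample mean, which down-weights outlying ratings; this specific weighting choice is an assumption, not the paper's exact scheme.

```python
import numpy as np
from scipy.stats import norm

def reputation(ratings):
    """Normal-weighted average of an item's ratings."""
    r = np.asarray(ratings, dtype=float)
    mu, sd = r.mean(), r.std(ddof=0) or 1.0   # guard against zero spread
    w = norm.pdf(r, loc=mu, scale=sd)         # Gaussian weight per rating
    return float(np.average(r, weights=w))

print(reputation([5, 5, 4, 5, 1]))   # the outlier rating 1 is down-weighted
```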

Relevance:

20.00%

Publisher:

Abstract:

As the cost of mineral fertilisers increases globally, organic soil amendments (OAs) from agricultural sources are increasingly being used as substitutes for nitrogen. However, the impact of OAs on the production of greenhouse gases (CO2 and N2O) is not well understood. A 60-day laboratory incubation experiment was conducted to investigate the impacts of applying OAs (equivalent to 296 kg N ha−1 on average) on N2O and CO2 emissions and soil properties of clay and sandy loam soils from sugar cane production. The experiment included six treatments: an un-amended (UN) control and five OAs, namely raw mill mud (MM), composted mill mud (CM), high-N compost (HC), rice husk biochar (RB), and raw mill mud plus rice husk biochar (MB). These OAs were incubated at 60, 75 and 90% water-filled pore space (WFPS) at 25°C, with urea (equivalent to 200 kg N ha−1) added to the soils thirty days after the incubation commenced. Results showed that WFPS did not influence CO2 emissions over the 60 days, but the magnitude of emissions as a proportion of C applied followed the order RB < CM < MB < HC, with models being developed to predict CO2 and N2O emissions as a function of the dry matter and C/N ratio of the OAs, the WFPS, and the soil CEC. Application of RB reduced N2O emissions by as much as 42-64%, depending on WFPS. The reductions in both CO2 and N2O emissions after application of RB were due to a reduced bioavailability of C, not immobilisation of N. These findings show that the effect of OAs on soil GHG emissions can vary substantially depending on their chemical properties. OAs with a high availability of labile C and N can lead to elevated emissions of CO2 and N2O, while rice husk biochar showed potential in reducing overall soil GHG emissions.
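
The emission models described are regressions on amendment and soil covariates; a minimal sketch of that kind of fit follows, with hypothetical column names, since the abstract only lists the predictors (dry matter, C/N ratio, WFPS, CEC) and not the functional form.

```python
import pandas as pd
import statsmodels.formula.api as smf

# df: one row per incubation unit, with hypothetical columns
#   co2, n2o, dry_matter, cn_ratio, wfps, cec
# df = pd.read_csv("incubation.csv")

def fit_emission_models(df: pd.DataFrame):
    # Separate linear models for each gas, on the listed predictors
    co2 = smf.ols("co2 ~ dry_matter + cn_ratio + wfps + cec", data=df).fit()
    n2o = smf.ols("n2o ~ dry_matter + cn_ratio + wfps + cec", data=df).fit()
    return co2, n2o
```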

Relevance:

20.00%

Publisher:

Abstract:

The purpose of this book by two Australian authors is to: introduce the audience to the full complement of contextual elements found within program theory; offer practical suggestions for engaging with theories of change, theories of action and logic models; and provide substantial evidence for this approach through scholarly literature and practice case studies, together with the authors' combined 60 years of experience.

Relevance:

20.00%

Publisher:

Abstract:

Land-use regression (LUR) is a technique that can improve the accuracy of air pollution exposure assessment in epidemiological studies. Most LUR models are developed for single cities, which places limitations on their applicability to other locations. We sought to develop a model to predict nitrogen dioxide (NO2) concentrations with national coverage of Australia by using satellite observations of tropospheric NO2 columns combined with other predictor variables. We used a generalised estimating equation (GEE) model to predict annual and monthly average ambient NO2 concentrations measured by a national monitoring network from 2006 through 2011. The best annual model explained 81% of spatial variation in NO2 (absolute RMS error=1.4 ppb), while the best monthly model explained 76% (absolute RMS error=1.9 ppb). We applied our models to predict NO2 concentrations at the ~350,000 census mesh blocks across the country (a mesh block is the smallest spatial unit in the Australian census). National population-weighted average concentrations ranged from 7.3 ppb (2006) to 6.3 ppb (2011). We found that a simple approach using tropospheric NO2 column data yielded models with slightly better predictive ability than those produced using a more involved approach that required simulation of surface-to-column ratios. The models were capable of capturing within-urban variability in NO2, and offer the ability to estimate ambient NO2 concentrations at monthly and annual time scales across Australia from 2006–2011. We are making our model predictions freely available for research.
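
A minimal sketch of the GEE land-use regression follows, grouping repeated annual observations by monitoring site; the predictor set and column names are assumptions, as the abstract does not list the final model variables.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# df: one row per site-year with hypothetical columns
#   no2, sat_no2_column, road_density, impervious, site_id
# df = pd.read_csv("no2_sites.csv")

def fit_lur(df: pd.DataFrame):
    model = smf.gee(
        "no2 ~ sat_no2_column + road_density + impervious",
        groups="site_id",                        # repeated measures per monitor
        data=df,
        cov_struct=sm.cov_struct.Exchangeable(), # within-site correlation
        family=sm.families.Gaussian(),
    ).fit()
    return model
```

Predictions at census mesh blocks would then come from `model.predict` applied to a data frame of the same predictors computed for each mesh block.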