954 results for Transport Modelling


Relevance: 20.00%

Publisher:

Abstract:

Spreading cell fronts play an essential role in many physiological processes. Classically, models of this process are based on the Fisher-Kolmogorov equation; however, such continuum representations are not always suitable as they do not explicitly represent behaviour at the level of individual cells. Additionally, many models examine only the large-time asymptotic behaviour, where a travelling wave front with a constant speed has been established. Many experiments, such as a scratch assay, never display this asymptotic behaviour, and in these cases the transient behaviour must be taken into account. We examine the transient and asymptotic behaviour of moving cell fronts using techniques that go beyond the continuum approximation, via a volume-excluding birth-migration process on a regular one-dimensional lattice. We approximate the averaged discrete results using three methods: (i) mean-field, (ii) pair-wise, and (iii) one-hole approximations. We discuss the performance of these methods, in comparison to the averaged discrete results, across a range of parameter space, examining both the transient and asymptotic behaviours. The one-hole approximation, based on techniques from statistical physics, is not capable of predicting transient behaviour but provides excellent agreement with the asymptotic behaviour of the averaged discrete results, provided that cells are proliferating fast enough relative to their rate of migration. The mean-field and pair-wise approximations give indistinguishable asymptotic results, which agree with the averaged discrete results when cells are migrating much more rapidly than they are proliferating. The pair-wise approximation performs better in the transient region than the mean-field approximation, despite having the same asymptotic behaviour. Our results show that each approximation only works in specific situations; thus we must be careful to use a suitable approximation for a given system, otherwise inaccurate predictions could be made.
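
As a toy illustration of the kind of volume-excluding birth-migration process described above (not the authors' implementation; the lattice size, rates, and random sequential update scheme are assumptions), a one-dimensional exclusion process can be sketched as:

```python
import random

def step(lattice, Pm, Pp):
    """One random sequential update sweep: each chosen agent attempts a
    migration with probability Pm and a proliferation with probability Pp;
    moves into occupied sites are aborted (volume exclusion)."""
    L = len(lattice)
    for _ in range(sum(lattice)):
        i = random.randrange(L)
        if lattice[i] != 1:
            continue                          # picked an empty site
        if random.random() < Pm:              # migration attempt
            j = i + random.choice((-1, 1))
            if 0 <= j < L and lattice[j] == 0:
                lattice[i], lattice[j] = 0, 1
                i = j
        if random.random() < Pp:              # proliferation attempt
            j = i + random.choice((-1, 1))
            if 0 <= j < L and lattice[j] == 0:
                lattice[j] = 1

def simulate_front(L=200, occupied=20, Pm=1.0, Pp=0.05, steps=500, seed=1):
    """Seed the left end of the lattice and track the front position
    (rightmost occupied site) as the population spreads and grows."""
    random.seed(seed)
    lattice = [1] * occupied + [0] * (L - occupied)
    for _ in range(steps):
        step(lattice, Pm, Pp)
    return lattice, max(i for i, s in enumerate(lattice) if s == 1)
```

Averaging the front position over many realisations, and comparing against the mean-field (Fisher-Kolmogorov) prediction, is the kind of comparison the abstract describes.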

Relevance: 20.00%

Publisher:

Abstract:

Philosophical inquiry in the teaching and learning of mathematics has received continued, albeit limited, attention over many years (e.g., Daniel, 2000; English, 1994; Lafortune, Daniel, Fallascio, & Schleider, 2000; Kennedy, 2012a). The rich contributions these communities can offer school mathematics, however, have not received the deserved recognition, especially from the mathematics education community. This is a perplexing situation given the close relationship between the two disciplines and their shared values for empowering students to solve a range of challenging problems, often unanticipated, and often requiring broadened reasoning. In this article, I first present my understanding of philosophical inquiry as it pertains to the mathematics classroom, taking into consideration the significant work that has been undertaken on socio-political contexts in mathematics education (e.g., Skovsmose & Greer, 2012). I then consider one approach to advancing philosophical inquiry in the mathematics classroom, namely, through modelling activities that require interpretation, questioning, and multiple approaches to solution. The design of these problem activities, set within life-based contexts, provides an ideal vehicle for stimulating philosophical inquiry.

Relevance: 20.00%

Publisher:

Abstract:

The number of office building retrofit projects is increasing. These projects are characterised by processes which have a close relationship with waste generation and therefore demand a high level of waste management. In a preliminary study reported separately, we identified seven critical factors of on-site waste generation in office building retrofit projects. Through semi-structured interviews and Interpretive Structural Modelling (ISM), this research further investigated the interrelationships among these critical waste factors, to identify each factor's level of influence on waste generation and propose effective solutions for waste minimization. "Organizational commitment" was identified as the fundamental issue for waste generation in the ISM system. Factors related to planning, design and construction processes were found to lie in the middle levels of the ISM model but still had significant impacts on the system as a whole. Based on the interview findings and ISM analysis results, some practical solutions were proposed for waste minimization in building retrofit projects: (1) reusable and adaptable fit-out design; (2) a system for as-built drawings and building information; (3) integrated planning for retrofitting work process and waste management; and (4) waste benchmarking development for retrofit projects. This research will provide a better understanding of waste issues associated with building retrofit projects and facilitate enhanced waste minimization.
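
The level-partitioning step of Interpretive Structural Modelling can be sketched as follows; the small reachability matrix below is a hypothetical stand-in, not the study's seven-factor matrix:

```python
def ism_levels(M):
    """ISM level partitioning from a reachability matrix M, where
    M[i][j] == 1 means factor i reaches (influences) factor j and the
    matrix is assumed already transitively closed (with M[i][i] == 1).
    A factor belongs to the current top level when its reachability set
    is contained in its antecedent set."""
    n = len(M)
    remaining = set(range(n))
    levels = []
    while remaining:
        level = []
        for i in remaining:
            reach = {j for j in remaining if M[i][j]}
            ante = {j for j in remaining if M[j][i]}
            if reach & ante == reach:        # top-level condition
                level.append(i)
        levels.append(sorted(level))
        remaining -= set(level)
    return levels

# Hypothetical chain: factor 0 drives factor 1, which drives factor 2.
M = [[1, 1, 1],
     [0, 1, 1],
     [0, 0, 1]]
```

In the resulting hierarchy, factors in the bottom-most level act as fundamental drivers of the system, which is the role "organizational commitment" plays in the study; top-level factors are the most dependent.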

Relevance: 20.00%

Publisher:

Abstract:

Object classification is plagued by the issue of session variation. Session variation describes any variation that makes one instance of an object look different from another, for instance due to pose or illumination variation. Recent work in the challenging task of face verification has shown that session variability modelling provides a mechanism to overcome some of these limitations. However, for computer vision purposes, it has only been applied in the limited setting of face verification. In this paper we propose a local region based inter-session variability (ISV) modelling approach, termed Local ISV, so that local session variations can be modelled, and apply it to challenging real-world data. We demonstrate the efficacy of this technique on a challenging real-world fish image database which includes images taken underwater, providing significant real-world session variations. This Local ISV approach provides a relative performance improvement of, on average, 23% on the challenging MOBIO, Multi-PIE and SCface face databases. It also provides a relative performance improvement of 35% on our challenging fish image dataset.
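
A minimal toy of session-variability compensation, assuming purely additive per-session offsets and a balanced design; real ISV learns a low-dimensional session subspace from data, so this is only a sketch of the underlying idea:

```python
def session_compensate(x, session_ids):
    """Crude additive session compensation: estimate each session's
    offset as the mean deviation of its samples from the global mean,
    then subtract it.  (A balanced-design stand-in for ISV's learned
    session subspace, not the paper's method.)"""
    mean = sum(x) / len(x)
    out = list(x)
    for s in set(session_ids):
        idx = [i for i in range(len(x)) if session_ids[i] == s]
        offset = sum(x[i] for i in idx) / len(idx) - mean
        for i in idx:
            out[i] = x[i] - offset
    return out

# Two classes (true means 0 and 5), two sessions (offsets -2 and +2):
features = [-2.0, 2.0, 3.0, 7.0]
sessions = [0, 1, 0, 1]
cleaned = session_compensate(features, sessions)
```

After compensation the samples collapse onto their class means, which is the effect session variability modelling aims for before classification.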

Relevance: 20.00%

Publisher:

Abstract:

Food waste is a current challenge that both developing and developed countries face. This project applied a novel combination of available methods in mechanical, agricultural and food engineering to address these challenges. A systematic approach was devised to investigate possibilities for reducing food waste and increasing the efficiency of industry by applying engineering concepts and theories, including experimental, mathematical and computational modelling methods. This study highlights the impact of a comprehensive understanding of how agricultural and food materials respond to mechanical operations, and its direct relation to the volume of food wasted globally.

Relevance: 20.00%

Publisher:

Abstract:

During the evolution of the music industry, developments in the media environment have required music firms to adapt in order to survive. Changes in broadcast radio programming during the 1950s; the Compact Cassette during the 1970s; and the deregulation of media ownership during the 1990s are all examples of changes which have heavily affected the music industry. This study explores similar contemporary dynamics, examines how decision makers in the music industry perceive and make sense of the developments, and reveals how they revise their business strategies based on their mental models of the media environment. A qualitative system dynamics model is developed in order to support the reasoning brought forward by the study. The model is empirically grounded, but is also based on previous music industry research and a theoretical platform constituted by concepts from evolutionary economics and sociology of culture. The empirical data primarily consist of 36 personal interviews with decision makers in the American, British and Swedish music industrial ecosystems. The study argues that the proposed model explains contemporary music industry dynamics more effectively than music industry models presented by previous research initiatives. Supported by the model, the study is able to show how "new" media outlets make old music business models obsolete and challenge the industry's traditional power structures. It is no longer possible to expose music at one outlet (usually broadcast radio) in the hope that it will lead to sales of the same music at another (e.g. a compact disc). The study shows that many music industry decision makers still have not embraced the new logic, and have not yet challenged their traditional mental models of the media environment. Rather, they remain focused on preserving the pivotal role held by the CD and other physical distribution technologies. Further, the study shows that while many music firms remain attached to the old models, other firms, primarily music publishers, have accepted the transformation, and have reluctantly recognised the realities of a virtualised environment.

Relevance: 20.00%

Publisher:

Abstract:

Fire safety has become an important part of structural design due to the ever increasing loss of property and lives during fires. Conventionally the fire rating of load bearing wall systems made of Light gauge Steel Frames (LSF) is determined using fire tests based on the standard time-temperature curve in ISO 834 [1]. However, modern commercial and residential buildings make use of thermoplastic materials, which means considerably higher fuel loads. Hence a detailed research study into the fire performance of LSF walls was undertaken using realistic design fire curves developed based on the Eurocode parametric [2] and Barnett's BFD [3] curves, using both full scale fire tests and numerical studies. It included LSF walls without cavity insulation, and the recently developed externally insulated composite panel system. This paper presents the details of finite element models developed to simulate the full scale fire tests of LSF wall panels under realistic design fires. Finite element models of LSF walls exposed to realistic design fires were developed, and analysed under both transient and steady state fire conditions using the measured stud time-temperature curves. Transient state analyses were performed to simulate fire test conditions while steady state analyses were performed to obtain the load ratio versus time and failure temperature curves of LSF walls. Details of the developed finite element models and the results, including the axial deformation and lateral deflection versus time curves, and the stud failure modes and times, are presented in this paper. Comparison with fire test results demonstrates the ability of the developed finite element models to predict the performance and fire resistance ratings of LSF walls under realistic design fires.
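
The two families of fire curves mentioned can be sketched directly from their published closed forms; the Eurocode expansion factor Gamma, which depends on compartment openings and lining thermal properties, is left as an input rather than reproduced here:

```python
import math

def iso834_temperature(t_min, ambient=20.0):
    """ISO 834 standard fire curve: gas temperature in deg C after
    t_min minutes of exposure."""
    return ambient + 345.0 * math.log10(8.0 * t_min + 1.0)

def eurocode_parametric_heating(t_star, ambient=20.0):
    """EN 1991-1-2 parametric heating phase, with t_star = Gamma * t
    the expanded time in hours."""
    return ambient + 1325.0 * (1.0
                               - 0.324 * math.exp(-0.2 * t_star)
                               - 0.204 * math.exp(-1.7 * t_star)
                               - 0.472 * math.exp(-19.0 * t_star))
```

Realistic design fires replace the unbounded ISO 834 curve with parametric curves whose heating rate and duration reflect the actual fuel load, which is the motivation the abstract gives for the higher thermoplastic fuel loads.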

Relevance: 20.00%

Publisher:

Abstract:

The global financial crisis (GFC) in 2008 rocked local, regional, and state economies throughout the world. Several intermediate outcomes of the GFC have been well documented in the literature, including loss of jobs and reduced income. Relatively little research has, however, examined the impacts of the GFC on individual level travel behaviour change. To address this shortcoming, HABITAT panel data were employed to estimate a multinomial logit model to examine mode switching behaviour between 2007 (pre-GFC) and 2009 (post-GFC) of a cohort of baby boomers in Brisbane, Australia, a city within a developed country that has been, on many metrics, the least affected by the GFC. In addition, a Poisson regression model was estimated to model the number of trips made by individuals in 2007, 2008, and 2009. The South East Queensland Travel Survey datasets were used to develop this model. Four linear regression models were estimated to assess the effects of the GFC on time allocated to travel during a day: one for each of the three travel modes (public transport, active transport, and less environmentally friendly transport), and an overall travel time model irrespective of mode. The results reveal that individuals who lost their job or whose income was reduced between 2007 and 2009 were more likely to switch to public transport. Individuals also made significantly fewer trips in 2008 and 2009 compared to 2007. Individuals spent significantly less time using less environmentally friendly transport but more time using public transport in 2009. Baby boomers switched to more environmentally friendly travel modes during the GFC.
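
A multinomial logit model assigns each travel mode a choice probability via the softmax of its systematic utility. A minimal sketch, with hypothetical utilities standing in for the study's estimated coefficients:

```python
import math

def mnl_probabilities(utilities):
    """Multinomial logit choice probabilities:
    P(mode m) = exp(V_m) / sum_k exp(V_k), where V_m is the systematic
    utility of mode m."""
    vmax = max(utilities)                    # subtract max for numerical stability
    expv = [math.exp(v - vmax) for v in utilities]
    total = sum(expv)
    return [e / total for e in expv]

# Hypothetical utilities for public, active, and car travel:
p_public, p_active, p_car = mnl_probabilities([0.4, -0.1, 0.9])
```

In the study, a job loss or income reduction would enter the utilities through estimated coefficients, shifting probability mass toward public transport.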

Relevance: 20.00%

Publisher:

Abstract:

Computational models represent a highly suitable framework, not only for testing biological hypotheses and generating new ones but also for optimising experimental strategies. As one surveys the literature devoted to cancer modelling, it is obvious that immense progress has been made in applying simulation techniques to the study of cancer biology, although the full impact has yet to be realised. For example, there are excellent models to describe cancer incidence rates or factors for early disease detection, but these predictions are unable to explain the functional and molecular changes that are associated with tumour progression. In addition, it is crucial that interactions between mechanical effects, and intracellular and intercellular signalling, are incorporated in order to understand cancer growth, its interaction with the extracellular microenvironment and invasion of secondary sites. There is a compelling need to tailor new, physiologically relevant in silico models that are specialised for particular types of cancer, such as ovarian cancer owing to its unique route of metastasis, and that are capable of investigating anti-cancer therapies and generating both qualitative and quantitative predictions. This Commentary will focus on how computational simulation approaches can advance our understanding of ovarian cancer progression and treatment, in particular with the help of multicellular cancer spheroids, and thus can inform biological hypotheses and experimental design.

Relevance: 20.00%

Publisher:

Abstract:

Due to knowledge gaps in relation to urban stormwater quality processes, an in-depth understanding of model uncertainty can enhance decision making. Uncertainty in stormwater quality models can originate from a range of sources, such as the complexity of urban rainfall-runoff-stormwater pollutant processes and the paucity of observed data. Unfortunately, studies relating to epistemic uncertainty, which arises from the simplification of reality, are limited, and such uncertainty is often deemed mostly unquantifiable. This paper presents a statistical modelling framework for ascertaining epistemic uncertainty associated with pollutant wash-off under a regression modelling paradigm, using Ordinary Least Squares Regression (OLSR) and Weighted Least Squares Regression (WLSR) methods with a Bayesian/Gibbs sampling statistical approach. The study results confirmed that WLSR assuming probability distributed data provides more realistic uncertainty estimates of the observed and predicted wash-off values compared to OLSR modelling. It was also noted that the Bayesian/Gibbs sampling approach is superior to the classical statistical and deterministic approaches most commonly adopted in water quality modelling. The study outcomes confirmed that the prediction error associated with wash-off replication is relatively high due to limited data availability. The uncertainty analysis also highlighted the variability of the wash-off modelling coefficient k as a function of complex physical processes, which is primarily influenced by surface characteristics and rainfall intensity.
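
For a single-predictor wash-off regression, the WLSR point estimate has a simple closed form; this sketch shows only that deterministic step, not the Bayesian/Gibbs sampling layer the study adds, and the weights (e.g. inverse variances) are illustrative:

```python
def wls_line(x, y, w):
    """Weighted least squares fit of y = a + b*x with weights w,
    from the weighted normal equations; reduces to OLS when all
    weights are equal."""
    Sw = sum(w)
    Sx = sum(wi * xi for wi, xi in zip(w, x))
    Sy = sum(wi * yi for wi, yi in zip(w, y))
    Sxx = sum(wi * xi * xi for wi, xi in zip(w, x))
    Sxy = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
    b = (Sw * Sxy - Sx * Sy) / (Sw * Sxx - Sx * Sx)
    a = (Sy - b * Sx) / Sw
    return a, b
```

Down-weighting noisy observations is what lets WLSR produce the more realistic uncertainty estimates the study reports; a Bayesian treatment then places distributions over a and b rather than point estimates.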

Relevance: 20.00%

Publisher:

Abstract:

This research explored the metaphor of biological evolution as a way of solving architectural design problems. Drawing from fields such as language grammars, algorithms and cellular biology, it examined ways of encoding design information for processing. The aim of this work is to help build software that supports the architectural design process and allows designers to examine more variations.
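
The evolutionary metaphor can be sketched as a minimal genetic algorithm over bit-string "genotypes"; the bit encoding and the stand-in objective below are illustrative assumptions, not the thesis's design grammar:

```python
import random

def evolve(fitness, length=12, pop_size=30, gens=60, seed=0):
    """Minimal genetic algorithm: encoded designs are bit strings,
    evolved by truncation selection, single-point crossover, and
    point mutation."""
    random.seed(seed)
    pop = [[random.randint(0, 1) for _ in range(length)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]           # keep the fitter half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, length)    # single-point crossover
            child = a[:cut] + b[cut:]
            child[random.randrange(length)] ^= 1  # point mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

# A stand-in "design objective": maximise the number of 1-bits.
best = evolve(fitness=sum)
```

In the thesis's setting, the fitness function would instead score a decoded candidate design, letting designers generate and compare many variations automatically.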

Relevance: 20.00%

Publisher:

Abstract:

Electricity network investment and asset management require accurate estimation of future demand in energy consumption within specified service areas. For this purpose, simple models are typically developed to predict future trends in electricity consumption using various methods and assumptions. This paper presents a statistical model to predict electricity consumption in the residential sector at the Census Collection District (CCD) level over the state of New South Wales, Australia, based on spatial building and household characteristics. Residential household demographic and building data from the Australian Bureau of Statistics (ABS) and actual electricity consumption data from electricity companies are merged for 74% of the 12,000 CCDs in the state. Eighty percent of the merged dataset is randomly set aside to establish the model using regression analysis, and the remaining 20% is used to independently test the accuracy of model prediction against actual consumption. In 90% of the cases, the predicted consumption is shown to be within 5 kWh per dwelling per day of actual values, with an overall state accuracy of -1.15%. Given a future scenario with a shift in climate zone and a growth in population, the model is used to identify the geographical or service areas that are most likely to have increased electricity consumption. Such geographical representation can be of great benefit when assessing alternatives to the centralised generation of energy; having such a model gives a quantifiable method for selecting the 'most' appropriate system when a review or upgrade of the network infrastructure is required.
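
The 80/20 evaluation protocol can be sketched as follows; the straight-line model and the synthetic consumption records are stand-ins for the study's multi-predictor regression and ABS/CCD data:

```python
import random

def fit_line(pts):
    """Ordinary least squares fit of y = a + b*x."""
    n = len(pts)
    sx = sum(p[0] for p in pts); sy = sum(p[1] for p in pts)
    sxx = sum(p[0] ** 2 for p in pts); sxy = sum(p[0] * p[1] for p in pts)
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return (sy - b * sx) / n, b

def holdout_accuracy(data, tol=5.0, seed=0):
    """Randomly reserve 20% of records for testing, fit on the other
    80%, and report the share of test predictions within `tol`
    kWh/dwelling/day of the actual value."""
    random.seed(seed)
    data = data[:]
    random.shuffle(data)
    cut = int(0.8 * len(data))
    train, test = data[:cut], data[cut:]
    a, b = fit_line(train)
    hits = sum(1 for x, y in test if abs((a + b * x) - y) <= tol)
    return hits / len(test)

# Hypothetical records of (household size, kWh per dwelling per day):
data = [(x, 8.0 + 4.0 * x + (1.5 if i % 2 else -1.5))
        for i, x in enumerate([1, 2, 3, 4, 5, 6] * 5)]
share = holdout_accuracy(data, tol=5.0)
```

The study's "90% of cases within 5 kWh" figure corresponds to this kind of within-tolerance share computed on the held-out 20%.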

Relevance: 20.00%

Publisher:

Abstract:

Security models for two-party authenticated key exchange (AKE) protocols have developed over time to prove the security of AKE protocols even when the adversary learns certain secret values. In this work, we address more granular leakage: partial leakage of long-term secrets of protocol principals, even after the session key is established. We introduce a generic key exchange security model, which can be instantiated allowing bounded or continuous leakage, even when the adversary learns certain ephemeral secrets or session keys. Our model is the strongest known partial-leakage-based security model for key exchange protocols. We propose a generic construction of a two-pass leakage-resilient key exchange protocol that is secure in the proposed model, by introducing a new concept: the leakage-resilient NAXOS trick. We identify a special property for public-key cryptosystems: pair generation indistinguishability, and show how to obtain the leakage-resilient NAXOS trick from a pair generation indistinguishable leakage-resilient public-key cryptosystem.
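
The original NAXOS trick, which the paper's leakage-resilient variant refines, derives the Diffie-Hellman exponent by hashing the ephemeral and long-term secrets together, so that leaking the stored ephemeral secret alone reveals nothing about the exponent. A sketch over a deliberately toy group (the group parameters, hash choice, and byte encoding are assumptions for illustration):

```python
import hashlib

# Toy Diffie-Hellman group for illustration only; a real protocol
# would use a standardised large prime-order group.
P = 2 ** 127 - 1          # a Mersenne prime
G = 3

def H1(*parts):
    """Hash byte strings to an integer (a stand-in random oracle)."""
    return int.from_bytes(hashlib.sha256(b"||".join(parts)).digest(), "big")

def naxos_exponent(esk, lsk):
    """The NAXOS trick: the DH exponent actually used is H1(esk, lsk),
    so an adversary who learns only the ephemeral secret esk still
    cannot compute the exponent while the long-term secret lsk is
    hidden."""
    return H1(esk, lsk) % (P - 1)

def ephemeral_public(esk, lsk):
    """The value G^{H1(esk, lsk)} sent in the protocol message."""
    return pow(G, naxos_exponent(esk, lsk), P)
```

The paper's contribution is a leakage-resilient version of this trick, built from a pair generation indistinguishable leakage-resilient public-key cryptosystem, so that even partial leakage of lsk can be tolerated; that construction is not reproduced here.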

Relevance: 20.00%

Publisher:

Abstract:

Exact solutions of partial differential equation models describing the transport and decay of single and coupled multispecies problems can provide insight into the fate and transport of solutes in saturated aquifers. Most previous analytical solutions are based on integral transform techniques, meaning that the initial condition is restricted in the sense that the choice of initial condition has an important impact on whether or not the inverse transform can be calculated exactly. In this work we describe and implement a technique that produces exact solutions for single and multispecies reactive transport problems with more general, smooth initial conditions. We achieve this by using a different method to invert a Laplace transform which produces a power series solution. To demonstrate the utility of this technique, we apply it to two example problems with initial conditions that cannot be solved exactly using traditional transform techniques.
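
The mechanism can be illustrated on the simplest decay building block: 1/(s+k) = sum_{n>=0} (-k)^n / s^{n+1}, and inverting term by term using L^{-1}{1/s^{n+1}} = t^n/n! recovers e^{-kt}. A sketch of that term-by-term inversion (the paper's coupled multispecies solutions are far more general):

```python
import math

def series_inverse_laplace_decay(k, t, n_terms=30):
    """Power-series Laplace inversion for F(s) = 1/(s+k):
    expanding F(s) = sum_{n>=0} (-k)^n / s^{n+1} and inverting each
    term gives c(t) = sum_{n>=0} (-k*t)^n / n!, the Taylor series of
    exp(-k*t)."""
    return sum((-k * t) ** n / math.factorial(n) for n in range(n_terms))
```

Because each term inverts exactly, the method is not restricted to initial conditions whose transforms happen to invert in closed form, which is the flexibility the abstract highlights.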

Relevance: 20.00%

Publisher:

Abstract:

The use of graphics processing unit (GPU) parallel processing is becoming part of mainstream statistical practice. The reliance of Bayesian statistics on Markov chain Monte Carlo (MCMC) methods makes the applicability of parallel processing not immediately obvious. It is illustrated that computing the likelihood using GPU parallel processing yields substantial gains in computational time for MCMC and other methods of evaluation. Examples use data from the Global Terrorism Database to model terrorist activity in Colombia from 2000 through 2010, with a likelihood based on the explicit convolution of two negative-binomial processes. Results show decreases in computational time by a factor of over 200. Factors influencing these improvements, and guidelines for programming parallel implementations of the likelihood, are discussed.
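
The expensive step is evaluating, for every observation, the explicit convolution of two negative binomial pmfs; each term of the sum is independent of the others, which is why the computation maps well onto GPU threads. A scalar sketch (the parameterisation below, counting failures before the r-th success, is an assumption, as the paper's exact parameterisation is not given in the abstract):

```python
from math import comb

def nb_pmf(k, r, p):
    """Negative binomial pmf: probability of k failures before the
    r-th success, with per-trial success probability p."""
    return comb(k + r - 1, k) * (p ** r) * ((1 - p) ** k)

def conv_pmf(z, r1, p1, r2, p2):
    """Explicit convolution of two negative-binomial processes:
    P(X + Y = z) = sum_j P(X = j) * P(Y = z - j).  On a GPU, each
    observation (or each term of the sum) can be assigned to its own
    thread."""
    return sum(nb_pmf(j, r1, p1) * nb_pmf(z - j, r2, p2)
               for j in range(z + 1))
```

A useful correctness check: when both components share the same p, the convolution of NB(r1, p) and NB(r2, p) is exactly NB(r1 + r2, p).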