315 results for Data-driven modelling


Relevance: 30.00%

Abstract:

The process of spray drying is applied in a number of contexts. One such application is the production of a synthetic rock used for storage of nuclear waste. To establish a framework for a model of the spray drying process for this application, we develop here a model describing evaporation from droplets of pure water, such that the model may later be extended to account for the presence of colloid within the droplet. We develop a spherically symmetric model and formulate continuum equations describing mass, momentum, and energy balance in both the liquid and gas phases from first principles. We establish appropriate boundary conditions at the surface of the droplet, including a generalised Clapeyron equation that accurately describes the temperature at the droplet surface. To account for the experimental design, we introduce a simplified model of the platinum ball and wire into the system via a thin-wire problem. The resulting system of equations is transformed to simplify the finite volume solution scheme. The results of numerical simulation are compared with data collected for validation, and the sensitivity of the model to variations in key parameters, and to the choice between the Clausius–Clapeyron and generalised Clapeyron equations, is investigated. Good agreement is found between the model and the experimental data, despite the simplicity of the platinum phase model.
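For reference, the classical Clausius–Clapeyron relation that the generalised surface condition is contrasted with takes the following standard form (notation assumed here; the generalised equation used in the model adds corrections that are not reproduced):

```latex
\frac{dP_{\mathrm{sat}}}{dT}=\frac{L\,P_{\mathrm{sat}}}{R_{v}T^{2}}
\quad\Longrightarrow\quad
P_{\mathrm{sat}}(T)=P_{\mathrm{ref}}\exp\!\left[\frac{L}{R_{v}}\left(\frac{1}{T_{\mathrm{ref}}}-\frac{1}{T}\right)\right],
```

where P_sat is the saturation vapour pressure at droplet surface temperature T, L the (assumed constant) specific latent heat of vaporisation, and R_v the specific gas constant of water vapour.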

Relevance: 30.00%

Abstract:

Online communities have fundamentally changed how humans connect and are now so common that they are fundamental to the human experience. As the Internet developed from Web 1.0 to Web 2.0, the functionality of these communities far exceeded initial expectations. They have shifted from simple places for sharing information to means of accessing products and services that bridge the online and offline worlds. This shift has disrupted many industries, transportation being one such sector. Both private transport providers and public transport systems face competition from online communities that are able to link service providers and customers more effectively and innovatively. These types of communities fall under what has been popularised as collaborative consumption or the sharing economy. The aim of this study is to explore the role of Design-led Innovation in the creation of digital futures, specifically online connected communities for successful new mobility solutions. To explore this proposition, multiple data collection methods are proposed: i) Content Analysis; ii) a Comparative Qualitative Study consisting of Qualitative Interviews and Focus Groups / Design Workshops; and iii) an Action Research Cycle of Embedded Practice. The multidisciplinary nature of this study grounds the research in a novel position, contributing new knowledge to the field of design as well as a deeper understanding of the larger, fast-growing online community phenomenon.

Relevance: 30.00%

Abstract:

The export of sediments from coastal catchments can have detrimental impacts on estuaries and nearshore reef ecosystems such as the Great Barrier Reef. Catchment management approaches aimed at reducing sediment loads require monitoring to evaluate their effectiveness in reducing loads over time. However, load estimation is not a trivial task, due to the complex behaviour of constituents in natural streams, the variability of water flows, and often a limited amount of data. Regression is commonly used for load estimation and provides a fundamental tool for trend estimation by standardising for other time-specific covariates such as flow. This study investigates whether load estimates and the resultant power to detect trends can be enhanced by (i) modelling the error structure so that temporal correlation can be better quantified, (ii) making use of predictive variables, and (iii) identifying an efficient and feasible sampling strategy that may be used to reduce sampling error. To achieve this, we propose a new regression model that includes an innovative compounding-errors model structure and uses two additional predictive variables (average discounted flow and turbidity). By combining this modelling approach with a new, regularly optimised sampling strategy, which adds uniformity to the event sampling strategy, the predictive power was increased to 90%. Using the enhanced regression model proposed here, it was possible to detect a trend of 20% over 20 years. This result is in stark contrast to previous conclusions presented in the literature.
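As a rough illustration of points (i) and (ii), the sketch below fits a log-linear load-style regression with flow and turbidity as predictors and AR(1) errors using statsmodels. This is a minimal stand-in, not the paper's compounding-errors model, and all data, variable names, and effect sizes are synthetic assumptions:

```python
# Minimal sketch: log-linear concentration regression with AR(1) errors.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 520                                    # e.g. weekly samples over ten years
t = np.arange(n) / 52.0                    # time in years
log_flow = rng.normal(0.0, 1.0, n)         # hypothetical log discharge
log_turb = 0.6 * log_flow + rng.normal(0.0, 0.5, n)   # correlated turbidity

eps = np.zeros(n)                          # AR(1) noise: temporal correlation
for i in range(1, n):
    eps[i] = 0.5 * eps[i - 1] + rng.normal(0.0, 0.3)

trend = -0.01                              # about -1% per year on the log scale
log_conc = 1.0 + trend * t + 0.8 * log_flow + 0.4 * log_turb + eps

X = sm.add_constant(np.column_stack([t, log_flow, log_turb]))
fit = sm.GLSAR(log_conc, X, rho=1).iterative_fit(maxiter=5)
print(fit.params)   # [intercept, trend, log_flow, log_turb]
print(fit.bse)      # standard errors account for the fitted AR(1) structure
```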

Relevance: 30.00%

Abstract:

Deriving an estimate of optimal fishing effort, or even an approximate estimate, is very valuable for managing fisheries with multiple target species. The most challenging task is allocating effort to individual species when only the total effort is recorded. Spatial information on the distribution of each species within a fishery can be used to justify the allocations, but often such information is not available. To determine the long-term overall effort required to achieve maximum sustainable yield (MSY) and maximum economic yield (MEY), we consider three methods for allocating effort: (i) optimal allocation, which optimally allocates effort among target species; (ii) fixed proportions, which chooses proportions based on past catch data; and (iii) economic allocation, which splits effort based on the expected catch value of each species. Determining the overall fishing effort required to achieve these management objectives is a maximisation problem subject to constraints arising from economic and social considerations. We illustrate the approaches using a case study of the Moreton Bay Prawn Trawl Fishery in Queensland (Australia). The results were consistent across the three methods. Importantly, our analysis demonstrated that the optimal total effort is very sensitive to daily fishing costs: the effort ranged from 9500–11 500 boat-days down to 6000–7000, 4000, and 2500 boat-days for daily cost estimates of $0, $500, $750, and $950, respectively. The zero daily cost corresponds to the MSY, while a daily cost of $750 most closely represents the actual present fishing cost. Given the recent debate on which costs should be factored into analyses deriving MEY, our findings highlight the importance of including an appropriate cost function for practical management advice. The approaches developed here could be applied to other multispecies fisheries where only aggregated fishing effort data are recorded, as the literature on this type of modelling is sparse.
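A minimal single-species sketch of the cost sensitivity described above, using a Schaefer surplus-production model and scipy: at zero daily cost the optimum coincides with the effort at MSY, and it declines as the daily cost rises. The model form and every parameter value are hypothetical, not taken from the Moreton Bay assessment:

```python
# Minimal sketch: optimal effort vs daily cost under a Schaefer model.
from scipy.optimize import minimize_scalar

r, K, q = 0.8, 2.0e6, 1.0e-4   # growth rate, carrying capacity (kg), catchability
price = 10.0                   # $ per kg of catch

def profit(effort, daily_cost):
    biomass = max(K * (1.0 - q * effort / r), 0.0)   # equilibrium biomass
    catch = q * effort * biomass                     # equilibrium yield (kg)
    return price * catch - daily_cost * effort

for cost in (0.0, 500.0, 750.0, 950.0):
    res = minimize_scalar(lambda E: -profit(E, cost),
                          bounds=(0.0, r / q), method="bounded")
    print(f"daily cost ${cost:>4.0f}: optimal effort ~ {res.x:,.0f} boat-days")
```

With these made-up parameters, the optimum falls from 4000 boat-days at zero cost (the MSY effort, r/2q) to roughly 2100 boat-days at $950 per day, reproducing the qualitative pattern reported above.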

Relevance: 30.00%

Abstract:

The contemporary methodology for growth modelling of organisms is based on continuous trajectories, which hinders the modelling of stepwise growth in crustacean populations. Growth models for fish normally assume a continuous function, but a different type of model is needed for crustacean growth: crustaceans must moult in order to grow, so their growth is a discontinuous process due to the periodic shedding of the exoskeleton at moulting. This stepwise growth through the moulting process makes growth estimation more complex. Stochastic approaches can be used to model discontinuous growth, commonly known as "jumps" (Figure 1); however, we need to ensure that the stochastic growth model produces only positive jumps. In view of this, we introduce a subordinator, a special case of a Lévy process. A subordinator is a non-decreasing Lévy process, which assists in modelling crustacean growth and in understanding individual variability and stochasticity in moulting periods and increments. We develop methods for parameter estimation and illustrate them with a dataset from laboratory experiments. The motivating dataset is from the ornate rock lobster, Panulirus ornatus, found between Australia and Papua New Guinea. Owing to sex effects on growth (Munday et al., 2004), we estimate the growth parameters separately for each sex. Since all hard parts are shed at each moult, exact age determination of a lobster is challenging; however, the growth parameters for the moult process can be estimated from tank data through (i) inter-moult periods and (ii) moult increments. We derive a joint density made up of two functions, one for moult increments and the other for the time intervals between moults; we take these to be conditionally independent given pre-moult length and the inter-moult periods, by the Markov property. Hence, the parameters of each function can be estimated separately. Subsequently, we integrate the two functions via a Monte Carlo method, obtaining a population mean for crustacean growth (e.g. the red curve in Figure 1).
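A minimal simulation sketch of the stepwise-growth idea: exponentially distributed inter-moult periods combined with gamma-distributed positive increments give a compound Poisson subordinator, one simple special case of the Lévy processes discussed above. Function names and parameter values are hypothetical:

```python
# Minimal sketch: simulate non-decreasing jump growth paths.
import numpy as np

rng = np.random.default_rng(7)

def simulate_growth(t_max, rate=0.1, shape=2.0, scale=1.5, length0=30.0):
    """Return (moult_times, lengths): a non-decreasing step path (mm)."""
    times, lengths = [0.0], [length0]
    t = 0.0
    while True:
        t += rng.exponential(1.0 / rate)        # inter-moult period (days)
        if t > t_max:
            break
        increment = rng.gamma(shape, scale)     # strictly positive jump
        times.append(t)
        lengths.append(lengths[-1] + increment)
    return np.array(times), np.array(lengths)

paths = [simulate_growth(365.0) for _ in range(1000)]
final = np.array([lengths[-1] for _, lengths in paths])
print(f"mean length after one year: {final.mean():.1f} mm")  # population mean
```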

Relevance: 30.00%

Abstract:

Statistical methods are often used to analyse commercial catch and effort data to provide standardised fishing effort and/or a relative index of fish abundance for input into stock assessment models. Achieving reliable results has proved difficult in Australia's Northern Prawn Fishery (NPF), due to a combination of factors such as the biological characteristics of the animals, aspects of the fleet dynamics, and changes in fishing technology. For this set of data, we compared four modelling approaches (linear models, mixed models, generalised estimating equations, and generalised linear models) with respect to the resulting standardised fishing effort or relative index of abundance. We also varied the number and form of vessel covariates in the models. Within a subset of data from this fishery, modelling correlation structures did not alter the conclusions drawn from simpler statistical models, and the random-effects models yielded similar results. This is because the estimators are all consistent even if the correlation structure is mis-specified, and the data set is very large. However, the standard errors from the different models differed, suggesting that the methods differ in statistical efficiency. We suggest that there is value in modelling the variance function and the correlation structure, to make valid and efficient statistical inferences and to gain insight into the data. We found that fishing power was separable from the indices of prawn abundance only when we offset the impact of vessel characteristics at assumed values from external sources. This may be due to the large degree of confounding within the data, and the extreme temporal changes in certain aspects of individual vessels, the fleet, and the fleet dynamics.
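A minimal sketch of the model comparison described above, fitting a linear model, a mixed model, and a GEE with an exchangeable working correlation to synthetic vessel-grouped catch-rate data in statsmodels. The variable names and effect sizes are invented for illustration:

```python
# Minimal sketch: three estimators for a vessel-grouped catch-rate trend.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n_vessels, n_years = 30, 15
df = pd.DataFrame([{"vessel": v, "year": y}
                   for v in range(n_vessels) for y in range(n_years)])
vessel_eff = rng.normal(0.0, 0.4, n_vessels)        # vessel "fishing power"
df["engine_kw"] = 200 + 10 * df["vessel"]           # a vessel covariate
df["log_cpue"] = (2.0 - 0.03 * df["year"]           # declining abundance index
                  + 0.001 * df["engine_kw"]
                  + vessel_eff[df["vessel"]]
                  + rng.normal(0.0, 0.3, len(df)))

lm  = smf.ols("log_cpue ~ year + engine_kw", data=df).fit()
mix = smf.mixedlm("log_cpue ~ year + engine_kw", data=df,
                  groups=df["vessel"]).fit()
gee = smf.gee("log_cpue ~ year + engine_kw", groups="vessel", data=df,
              cov_struct=sm.cov_struct.Exchangeable()).fit()

# Point estimates of the year trend agree closely across methods, while the
# standard errors differ with the treatment of within-vessel correlation.
for name, res in [("OLS", lm), ("Mixed", mix), ("GEE", gee)]:
    print(f"{name:>5}: year = {res.params['year']:.4f} "
          f"(se {res.bse['year']:.4f})")
```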

Relevance: 30.00%

Abstract:

Records of shrimp growth and water quality made during 12 crops from each of 48 ponds, over a period of 6.5 years, were provided by a commercial shrimp farm in Queensland, Australia. These data were analysed with a new growth model derived from the Gompertz model. The results indicate that water temperature, mortality, and pond age significantly affect growth rates. After 180 days, shrimp reach 34 g at a constant 30 °C, but only 15 g after the same amount of time at 20 °C. Mortality, by thinning the density of shrimp in the ponds, increased the growth rate, but the effect is small. With continual production, growth rates at first remained steady, then appeared to decrease for the sixth and seventh crops, after which they increased steadily with each crop. It appears that conservative pond management, together with a gradual improvement in husbandry techniques, particularly feed management, brought about this change. This has encouraging implications for the long-term sustainability of the farming methods used. The growth model can be used to predict the productivity, and hence profitability, of new aquaculture locations or new production strategies.
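A minimal sketch of fitting the base Gompertz curve to weight-at-age data with scipy; the model described above adds covariate effects (temperature, mortality, pond age) on top of a form like this. The data and parameter values below are synthetic, chosen only to echo the 34 g at 180 days figure:

```python
# Minimal sketch: fit a Gompertz growth curve to synthetic weight-at-age data.
import numpy as np
from scipy.optimize import curve_fit

def gompertz(t, w_inf, b, k):
    """Weight (g) at age t (days): w_inf * exp(-b * exp(-k * t))."""
    return w_inf * np.exp(-b * np.exp(-k * t))

rng = np.random.default_rng(11)
age = np.linspace(10, 180, 40)
true = gompertz(age, 36.0, 4.5, 0.025)            # roughly 34 g at 180 days
weight = true + rng.normal(0.0, 1.0, age.size)    # measurement noise

(w_inf, b, k), _ = curve_fit(gompertz, age, weight, p0=(30.0, 4.0, 0.02))
print(f"w_inf = {w_inf:.1f} g, b = {b:.2f}, k = {k:.4f} per day")
print(f"predicted weight at 180 days: {gompertz(180, w_inf, b, k):.1f} g")
```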

Relevance: 30.00%

Abstract:

Statistical analyses of health program participation seek to address a number of objectives compatible with the evaluation of demand for current resources. In this spirit, a spatial hierarchical model is developed for disentangling patterns in participation at the small-area level, as a function of population-based demand and additional variation. For the former, a constrained gravity model is proposed to quantify factors associated with spatial choice and to account for competition effects when programs are delivered through multiple clinics. The implications of gravity model misspecification within a mixed-effects framework are also explored. The proposed model is applied to participation data from a no-fee mammography program in Brisbane, Australia. Attention is paid to the interpretation of the various model outputs and their relevance for public health policy.
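A minimal sketch of the spatial-choice-with-competition idea behind a constrained gravity model: Huff-style probabilities that residents of each small area attend each clinic, combining clinic attractiveness with distance decay. All numbers and the decay form are hypothetical:

```python
# Minimal sketch: Huff-style clinic choice probabilities with competition.
import numpy as np

distance = np.array([[2.0, 5.0, 9.0],     # km from 4 small areas ...
                     [4.0, 3.0, 8.0],
                     [7.0, 4.0, 2.0],
                     [10.0, 6.0, 3.0]])   # ... to 3 clinics
attract = np.array([1.0, 1.8, 1.2])       # clinic attractiveness weights
beta = 1.5                                # distance-decay exponent

utility = attract * distance ** (-beta)   # competing destinations
prob = utility / utility.sum(axis=1, keepdims=True)   # rows sum to 1

population_demand = np.array([500.0, 800.0, 650.0, 300.0])  # expected users
expected_visits = prob * population_demand[:, None]
print(expected_visits.sum(axis=0))        # expected load on each clinic
```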

Relevance: 30.00%

Abstract:

Traffic incidents are recognised as one of the key sources of non-recurrent congestion, which often leads to reduced travel time reliability (TTR), a key metric of roadway performance. A method is proposed here to quantify the impacts of traffic incidents on TTR on freeways. The method uses historical data to establish recurrent speed profiles and identifies non-recurrent congestion based on its negative impact on speeds. The locations and times of incidents are used to identify incidents among non-recurrent congestion events. Buffer time is employed to measure TTR, and extra buffer time is defined as the extra delay caused by traffic incidents. This reliability measure indicates how much extra travel time travellers require to arrive at their destination on time with 95% certainty in the case of an incident, over and above the travel time that would have been required under recurrent conditions. An extra buffer time index (EBTI) is defined as the ratio of extra buffer time to recurrent travel time, with zero being the best case (no delay). A Tobit model is used to identify and quantify factors that affect EBTI, using a selected freeway segment in the Southeast Queensland network in Australia. Both fixed- and random-parameter Tobit specifications are tested. The estimation results reveal that models with random parameters offer a superior statistical fit for all types of incidents, suggesting the presence of unobserved heterogeneity across segments. Which factors influence EBTI depends on the type of incident. In addition, changes in TTR as a result of traffic incidents are related to the characteristics of the incidents (multiple vehicles involved, incident duration, major incidents, etc.) and to traffic characteristics.
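One plausible formalisation of the measures above, in notation assumed here rather than taken from the paper: with TT^95 the 95th-percentile travel time and the bar denoting the mean recurrent travel time,

```latex
\mathrm{EBT}=TT^{95}_{\mathrm{incident}}-TT^{95}_{\mathrm{recurrent}},
\qquad
\mathrm{EBTI}=\frac{\mathrm{EBT}}{\overline{TT}_{\mathrm{recurrent}}},
```

so that EBTI is zero whenever an incident adds no delay beyond recurrent conditions.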

Relevance: 30.00%

Abstract:

We formalise and present a new generic, multifaceted complex-system approach for modelling complex business enterprises. Our method has a strong focus on integrating the various data types available in an enterprise, which represent the diverse perspectives of various stakeholders. We explain the challenges faced and define a novel approach to converting diverse data types into usable Bayesian probability forms. The data types that can be integrated include historic data, survey data, management planning data, expert knowledge, and incomplete data. The structural complexities of the complex-system modelling process, arising from the various decision contexts, are also explained, along with a solution. This new application of complex-system models as a management tool for decision making is demonstrated using a railway transport case study, which shows how the new approach can be utilised to develop a customised decision support model for a specific enterprise. Various decision scenarios are also provided to illustrate the versatility of the decision model at different phases of enterprise operations, such as planning and control.
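A minimal sketch of one way two of the data types above (expert knowledge and historic data) can be converted into a common Bayesian form: an expert-elicited Dirichlet prior over a discrete system state, conjugately updated with observed counts. The railway scenario, state names, and numbers are hypothetical:

```python
# Minimal sketch: combine expert knowledge and historic counts via a
# Dirichlet-multinomial conjugate update.
import numpy as np

states = ["on_time", "minor_delay", "major_delay"]   # e.g. train service state

# Expert knowledge: elicited probabilities, weighted by an "equivalent
# sample size" reflecting confidence in the expert.
expert_probs = np.array([0.80, 0.15, 0.05])
expert_weight = 50.0
prior = expert_probs * expert_weight                 # Dirichlet parameters

# Historic data: observed counts of each state from operations logs.
counts = np.array([620.0, 130.0, 50.0])

posterior = prior + counts                           # conjugate update
posterior_mean = posterior / posterior.sum()
for s, p in zip(states, posterior_mean):
    print(f"P({s}) = {p:.3f}")
```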

Relevance: 30.00%

Abstract:

Background: Standard methods for quantifying IncuCyte ZOOM™ assays involve measurements of how rapidly the initially vacant area becomes re-colonised with cells as a function of time. Unfortunately, these measurements give no insight into the details of the cellular-level mechanisms acting to close the initially vacant area. We provide an alternative method that enables us to quantify the roles of cell motility and cell proliferation separately. To achieve this, we calibrate standard data available from IncuCyte ZOOM™ images to the solution of the Fisher-Kolmogorov model. Results: The Fisher-Kolmogorov model is a reaction-diffusion equation that has been used to describe collective cell spreading driven by cell migration, characterised by a cell diffusivity, D, and carrying-capacity-limited proliferation with proliferation rate, λ, and carrying-capacity density, K. By analysing temporal changes in cell density in several subregions located well behind the initial position of the leading edge, we estimate λ and K. Given these estimates, we then apply automatic leading-edge detection algorithms to the images produced by the IncuCyte ZOOM™ assay and match these data with a numerical solution of the Fisher-Kolmogorov equation to provide an estimate of D. We demonstrate this method by applying it to interpret a suite of IncuCyte ZOOM™ assays using PC-3 prostate cancer cells, obtaining estimates of D, λ and K. Comparing estimates of D, λ and K for a control assay with estimates for assays where epidermal growth factor (EGF) is applied in varying concentrations confirms that EGF enhances the rate of scratch closure and that this stimulation is driven by an increase in D and λ, whereas K is relatively unaffected by EGF. Conclusions: Our approach for estimating D, λ and K from an IncuCyte ZOOM™ assay provides more detail about cellular-level behaviour than standard methods for analysing these assays. In particular, our approach can be used to quantify the balance of cell migration and cell proliferation and, as we demonstrate, to quantify how the addition of growth factors affects these processes individually.
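For reference, the Fisher-Kolmogorov model referred to above is the reaction-diffusion equation

```latex
\frac{\partial C}{\partial t}=D\,\nabla^{2}C+\lambda C\left(1-\frac{C}{K}\right),
```

where C(x, t) is the cell density. Well behind the leading edge the density is nearly spatially uniform, so the diffusion term is negligible and the density follows logistic growth; this is what allows λ and K to be estimated from those subregions before D is matched at the leading edge.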

Relevance: 30.00%

Abstract:

Masonry under compression is affected by the properties of its constituents and their interfaces. In spite of extensive investigations of the behaviour of masonry under compression, the information in the literature cannot be regarded as comprehensive, owing to the ongoing invention of new-generation products such as polymer-modified thin-layer mortared masonry and drystack masonry. As comprehensive experimental studies are very expensive, an analytical model inspired by damage mechanics is developed in this paper and applied to the prediction of the compressive behaviour of masonry. The model incorporates a parabolic, progressively softening stress-strain curve for the units and a progressively stiffening stress-strain curve, up to a threshold strain, for the combined mortar and unit-mortar interfaces. The model simulates the mutual constraints imposed by each of these constituents through their respective tensile and compressive behaviour and volumetric changes. The advantage of the model is that it requires only the properties of the constituents, treats masonry as a continuum, and computes the average properties of the composite masonry prisms/wallettes; it does not require discretisation of the prism or wallette as in finite element methods. The capability of the model to capture the phenomenological behaviour of masonry, with appropriate elastic response, stiffness degradation, and post-peak softening, is presented through numerical examples. The fitting of experimental data to the model parameters is demonstrated through calibration against selected test data on units and mortar from the literature; the calibrated model is shown to predict quite well the responses of masonry built using the corresponding units and mortar. Through a series of sensitivity studies, the model is also shown to predict the masonry strength appropriately for changes to the properties of the units and mortar, the mortar joint thickness, and the ratio of the height of the unit to the mortar joint thickness. The unit strength is shown to affect the masonry strength significantly. Although the mortar strength has only a marginal effect, reduction in mortar joint thickness is shown to have a profound effect on the masonry strength. The results obtained from the model are compared with the various provisions of the Australian Masonry Structures Standard AS3700 (2011) and Eurocode 6.
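A minimal numerical sketch of a parabolic, progressively softening stress-strain law of the kind attributed to the units above (the Hognestad-type parabola shown is one common choice, not necessarily the exact law used in the paper; all values are hypothetical):

```python
# Minimal sketch: parabolic stress-strain curve peaking at f_c, softening after.
import numpy as np

def parabolic_stress(strain, f_c=20.0, eps_0=0.002):
    """Stress (MPa): rises parabolically to f_c at eps_0, softens beyond."""
    x = strain / eps_0
    return f_c * (2.0 * x - x ** 2)       # peak f_c at x = 1, softening after

strain = np.linspace(0.0, 0.004, 9)
for e, s in zip(strain, parabolic_stress(strain)):
    print(f"strain {e:.4f}: stress {max(s, 0.0):5.1f} MPa")  # clip any tail
```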

Relevance: 30.00%

Abstract:

A healthy, transparent cornea depends upon the regulation of fluid, nutrient, and oxygen transport through the tissue to sustain cell metabolism and other processes critical to normal functioning. This research considers the corneal geometry and investigates oxygen distribution using a two-dimensional Monod kinetic model, showing that previous studies make assumptions that lead to predictions of near-anoxic levels of oxygen tension in the limbal regions of the cornea. It also compares experimental spatial and temporal data with the predictions of novel mathematical models of distributed mitotic rates during corneal epithelial wound healing.
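At steady state, a Monod (Michaelis-Menten) oxygen model of the kind named above balances diffusion against saturable consumption; in generic notation assumed here rather than taken from the paper,

```latex
D\,\nabla^{2}c=\frac{Q_{\max}\,c}{K_{m}+c},
```

where c is the oxygen tension, D the diffusivity, Q_max the maximal consumption rate, and K_m the half-saturation constant. Unlike a constant-consumption assumption, the Monod term shuts consumption down as c approaches zero, which matters precisely in near-anoxic regions such as the limbus.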

Relevance: 30.00%

Abstract:

Compositional data analysis usually deals with relative information between parts, where the total (abundance, mass, amount, etc.) is unknown or uninformative. This article addresses the question of what to do when the total is known and is of interest. Tools used in this case are reviewed and analysed, in particular the relationship between the positive orthant of D-dimensional real space, the product space of the real line times the D-part simplex, and their Euclidean space structures. The first alternative corresponds to analysing the data by taking logarithms of each component; the second, to treating a log-transformed total jointly with a composition describing the distribution of the component amounts. Real data on total abundances of phytoplankton in an Australian river motivated the present study and are used for illustration.
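A minimal sketch of the second alternative: splitting each observation into a log-transformed total and a composition, the latter expressed here in centred log-ratio (clr) coordinates. The phytoplankton abundances below are made up:

```python
# Minimal sketch: log-total plus clr coordinates of the composition.
import numpy as np

def clr(x):
    """Centred log-ratio: log of parts over their geometric mean."""
    logx = np.log(x)
    return logx - logx.mean(axis=-1, keepdims=True)

# Hypothetical phytoplankton abundances (3 taxa) in 4 water samples
abund = np.array([[120.0,  40.0, 10.0],
                  [300.0,  90.0, 30.0],
                  [ 80.0,  60.0, 20.0],
                  [500.0, 100.0, 25.0]])

total = abund.sum(axis=1)
comp = abund / total[:, None]             # points in the 3-part simplex
coords = np.column_stack([np.log(total), clr(comp)])
print(coords)   # total and relative information split into separate columns
```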

Relevance: 30.00%

Abstract:

The noted 19th-century biologist Ernst Haeckel put forward the idea that the growth (ontogenesis) of an organism recapitulates the history of its evolutionary development. While this idea is defunct within biology, it has been promoted in areas such as education (an education as a recapitulation of the civilisations that came before). In the research presented in this paper, recapitulation is used as a metaphor within computer-aided design, as a way of grouping together different generations of spatial layouts. In most CAD programs, a spatial layout is represented as a series of objects (lines, or boundary representations) that stand in for walls; the relationships between spaces are not usually explicitly stated. A representation using Lindenmayer systems (originally designed for modelling plant morphology) is put forward as a way of representing the morphology of a spatial layout. The aim of this research is not just to describe an individual layout, but to find representations that link together lineages of development. This representation can be used in generative design as a way of creating more meaningful layouts that have particular characteristics. The use of genetic operators (mutation and crossover) is also considered, making the representation suitable for use with genetic algorithms.
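A minimal sketch of the Lindenmayer-system machinery assumed above: a deterministic string-rewriting engine applied over several generations. The toy alphabet and production rules are hypothetical stand-ins for symbols encoding spatial relationships between rooms:

```python
# Minimal sketch: a deterministic L-system rewriter.
def lsystem(axiom, rules, generations):
    """Apply the production rules to every symbol, once per generation."""
    state = axiom
    for _ in range(generations):
        state = "".join(rules.get(sym, sym) for sym in state)
    return state

# Toy rules: 'R' = a room that subdivides, 'c' = a corridor connection
rules = {"R": "R[cR]", "c": "cc"}
for gen in range(4):
    print(gen, lsystem("R", rules, gen))
# Successive generations record a lineage of layout development, which is
# the recapitulation metaphor used to group layouts above.
```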