722 results for Stochastic modelling
Abstract:
Currently, 1.3 billion tonnes of food are lost annually due to the lack of proper processing and preservation methods. Drying is one of the easiest and oldest methods of food processing and can contribute to reducing these huge losses, combating hunger and promoting food security. Drying increases shelf life and reduces the weight and volume of food, thus minimising packing, storage and transportation costs and enabling storage of food under ambient conditions. However, drying is a complex process which involves coupled heat and mass transfer together with physical property changes and shrinkage of the food material. Modelling of this process is essential to optimise the drying kinetics and improve the energy efficiency of the process. Since material properties vary with moisture content, models should not assume constant material properties or a constant diffusion coefficient. The objective of this paper is to develop a multiphysics-based mathematical model to simulate coupled heat and mass transfer during convective drying of fruit, considering variable material properties. This model can be used to predict the temperature and moisture distribution inside the food during drying. The effects of different drying air temperatures and drying air velocities on the drying kinetics are demonstrated. The governing equations of heat and mass transfer were solved with COMSOL Multiphysics 4.3.
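The coupled transport problem described above can be illustrated with a much-reduced sketch: one-dimensional moisture diffusion in a slab with a moisture-dependent diffusivity, solved by explicit finite differences. The exponential diffusivity law, slab dimensions and boundary treatment below are illustrative assumptions for this sketch, not the paper's COMSOL formulation.

```python
import numpy as np

def dry_slab(n=21, L=0.01, dt=0.05, steps=2000, D0=1e-8, k=2.0, M_air=0.1):
    """1-D moisture diffusion in a half-slab of thickness L (m), with an
    assumed moisture-dependent diffusivity D(M) = D0 * exp(k * M)."""
    dx = L / (n - 1)
    M = np.ones(n)                       # normalised initial moisture content
    for _ in range(steps):
        D = D0 * np.exp(k * M)           # diffusivity varies with moisture
        Dm = 0.5 * (D[:-1] + D[1:])      # interface (mid-node) diffusivities
        flux = Dm * np.diff(M) / dx      # Fickian flux between nodes
        M[1:-1] += dt / dx * (flux[1:] - flux[:-1])
        M[0] = M[1]                      # symmetry at the slab centre
        M[-1] = M_air                    # surface in equilibrium with drying air
    return M
```

The explicit scheme is only stable while `dt * D_max / dx**2 <= 0.5`; a real drying model would add the coupled energy equation and shrinkage of the domain.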
Abstract:
Background The marked increases in home Internet access and available online health information, with limited control over information quality, highlight the necessity of exploring decision-making processes in accessing and using online information, specifically in relation to children, who do not make their own health decisions. Objectives To understand the processes explaining parents’ decisions to use online health information for child health care. Methods Parents (N = 391) completed an initial questionnaire assessing the theory of planned behaviour constructs of attitude, subjective norm, and perceived behavioural control, as well as perceived risk, group norm, and additional demographic factors. Two months later, 187 parents completed a follow-up questionnaire assessing their decisions to use online information for their child’s health care, specifically to (1) diagnose and/or treat their child’s suspected medical condition/illness and (2) increase understanding about a diagnosis or treatment recommended by a health professional. Results Hierarchical multiple regression showed that, for both behaviours, attitude, subjective norm, perceived behavioural control, (lower) perceived risk, group norm, and (non-)medical background were the significant predictors of intention. For both behaviours, intention was the sole significant predictor of parents’ use of online child health information. The findings explain 77% of the variance in parents’ intention to treat/diagnose a child health problem and 74% of the variance in their intention to increase their understanding about child health concerns. Conclusions Understanding the socio-cognitive processes that guide parents’ use of online information for child health care is important given the increase in Internet usage and the sometimes-questionable quality of health information provided online.
Findings highlight parents’ thirst for information; there is an urgent need for health professionals to provide parents with evidence-based child health websites in addition to general population education on how to evaluate the quality of online health information.
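The hierarchical regression reported above proceeds by adding predictor blocks and comparing explained variance. A minimal sketch of that R-squared comparison, on synthetic data with assumed effect sizes (the variable names and coefficients below are placeholders, not the study's survey responses):

```python
import numpy as np

def r_squared(X, y):
    """R^2 from an ordinary least squares fit with an intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

# synthetic stand-ins for the theory of planned behaviour constructs
rng = np.random.default_rng(0)
n = 200
attitude, norm, pbc = rng.normal(size=(3, n))
intention = 0.8 * attitude + 0.5 * norm + 0.3 * pbc + rng.normal(scale=0.5, size=n)

r2_step1 = r_squared(np.column_stack([attitude]), intention)         # block 1
r2_step2 = r_squared(np.column_stack([attitude, norm, pbc]), intention)  # block 2
```

The increment `r2_step2 - r2_step1` is the variance attributable to the added block, which is the quantity a hierarchical analysis interprets.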
Abstract:
This paper is concerned with recent advances in the development of near wall-normal-free Reynolds-stress models, whose single point closure formulation, based on the inhomogeneity direction concept, is completely independent of the distance from the wall, and of the normal to the wall direction. In the present approach the direction of the inhomogeneity unit vector is decoupled from the coefficient functions of the inhomogeneous terms. A study of the relative influence of the particular closures used for the rapid redistribution terms and for the turbulent diffusion is undertaken, through comparison with measurements, and with a baseline Reynolds-stress model (RSM) using geometric wall normals. It is shown that wall-normal-free RSMs can be reformulated as a projection on a tensorial basis that includes the inhomogeneity direction unit vector, suggesting that the theory of the redistribution tensor closure should be revised by taking into account inhomogeneity effects in the tensorial integrity basis used for its representation.
Abstract:
Building Information Modelling (BIM) appears to be the next evolutionary link in project delivery within the AEC (Architecture, Engineering and Construction) industry. There have been several surveys of implementation at the local level, but to date little is known of the international context. This paper is a preliminary report of a large-scale electronic survey of the implementation of BIM and its impact on AEC project delivery and project stakeholders in Australia and internationally. National and regional patterns of BIM usage will be identified. These patterns will include disciplinary users, project lifecycle stages, technology integration (including software compatibility) and organisational issues such as human resources and interoperability. Also considered is the current status of the inclusion of BIM within tertiary-level curricula and the potential for the creation of a new discipline.
Abstract:
The Clarence-Moreton Basin (CMB) covers approximately 26,000 km² and is the only sub-basin of the Great Artesian Basin (GAB) in which there is flow to both the south-west and the east, although flow to the south-west is predominant. In many parts of the basin, including the catchments of the Bremer, Logan and upper Condamine Rivers in southeast Queensland, the Walloon Coal Measures are under exploration for Coal Seam Gas (CSG). In order to assess spatial variations in groundwater flow and hydrochemistry at a basin-wide scale, a 3D hydrogeological model of the Queensland section of the CMB has been developed using GoCAD modelling software. Prior to any large-scale CSG extraction, it is essential to understand the existing hydrochemical character of the different aquifers and to establish any potential linkage. To make effective use of the large amount of existing water chemistry data for the assessment of hydrochemical evolution within the different lithostratigraphic units, multivariate statistical techniques were employed.
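A common first multivariate step for water chemistry data of this kind is to standardise the ion concentrations and project the samples onto principal components, so that samples with similar hydrochemical character cluster together. The sketch below uses toy concentrations for three major ions; the values and groupings are invented for illustration, not basin data.

```python
import numpy as np

def pca_scores(X, k=2):
    """Standardise each variable, then project samples onto the first k
    principal components via the SVD of the standardised data matrix."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    return Z @ Vt[:k].T                      # sample scores on PC1..PCk

# toy samples: [Na, Cl, HCO3] concentrations (mg/L), two loose water types
X = np.array([[200., 310.,  80.], [210., 300.,  90.], [205., 305.,  85.],
              [ 30.,  20., 250.], [ 35.,  25., 240.], [ 28.,  22., 245.]])
scores = pca_scores(X)
```

On data like this, the first component separates the two water types; in practice the scores would feed a hierarchical cluster analysis across the lithostratigraphic units.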
Abstract:
Asset management (AM) processes play an important role in assisting enterprises to manage their assets more efficiently. To visualise and improve AM processes, the processes need to be modelled using certain process modelling methodologies. Understanding the requirements for AM process modelling is essential for selecting or developing effective AM process modelling methodologies. However, little research has been done on analysing the requirements. This paper attempts to fill this gap by investigating the features of AM processes. It is concluded that AM process modelling requires intuitive representation of its processes, ‘fast’ implementation of the process modelling, effective evaluation of the processes and sound system integration.
Abstract:
This paper describes a generalised linear mixed model (GLMM) approach for understanding spatial patterns of participation in population health screening, in the presence of multiple screening facilities. The models presented have a dual focus, namely the prediction of expected patient flows from regions to services and relative rates of participation by region-service combination, with both outputs having meaningful implications for the monitoring of current service uptake and provision. The novelty of this paper lies with the former focus, and an approach for distributing expected participation by region based on proximity to services is proposed. The modelling of relative rates of participation is achieved through the combination of different random effects, as a means of assigning excess participation to different sources. The methodology is applied to participation data collected from a government-funded mammography program in Brisbane, Australia.
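The proximity-based distribution of expected participation can be sketched with a simple distance-decay allocation: each region's expected participants are split across services in proportion to a decaying function of distance. The exponential decay form and the scale `phi` are assumptions for illustration, not the paper's fitted specification.

```python
import numpy as np

def expected_flows(pop, dist, phi=5.0):
    """Split each region's expected participation across services by
    proximity, using an assumed exponential distance decay (phi in km)."""
    w = np.exp(-dist / phi)                # region x service proximity weights
    w /= w.sum(axis=1, keepdims=True)      # each region's shares sum to 1
    return pop[:, None] * w                # expected patient flows

pop = np.array([100.0, 200.0])             # expected participants per region
dist = np.array([[1.0, 9.0],               # km from each region to each service
                 [4.0, 2.0]])
E = expected_flows(pop, dist)
```

Observed counts divided by these expected flows then give the relative participation rates per region-service combination that the random-effects model decomposes.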
Abstract:
My quantitative study asks how Chinese Australians’ “Chineseness” and their various resources influence their Chinese language proficiency, using an online survey and snowball sampling. ‘Operationalization’ is a challenging process which ensures that the survey design talks back to the informing theory and forwards to the analysis model. It requires attention to two core methodological concerns, namely ‘validity’ and ‘reliability’. Construction of a high-quality questionnaire is critical to the achievement of valid and reliable operationalization. A series of strategies were chosen to ensure the quality of the questions, and thus the eventual data. These strategies enable the use of structural equation modelling to examine how well the data fit the theoretical framework, which was constructed in light of Bourdieu’s theory of habitus, capital and field.
Abstract:
The R statistical environment and language has demonstrated particular strengths for interactive development of statistical algorithms, as well as data modelling and visualisation. Its current implementation has an interpreter at its core, which may result in a performance penalty in comparison to directly executing user algorithms in the native machine code of the host CPU. In contrast, the C++ language has no built-in visualisation capabilities, handling of linear algebra or even basic statistical algorithms; however, user programs are converted to high-performance machine code ahead of execution. A new method avoids possible speed penalties in R by using the Rcpp extension package in conjunction with the Armadillo C++ matrix library. In addition to the inherent performance advantages of compiled code, Armadillo provides an easy-to-use template-based meta-programming framework, allowing the automatic pooling of several linear algebra operations into one, which in turn can lead to further speedups. With the aid of Rcpp and Armadillo, conversion of linear-algebra-centred algorithms from R to C++ becomes straightforward. The algorithms retain the overall structure as well as readability, all while maintaining a bidirectional link with the host R environment. Empirical timing comparisons of R and C++ implementations of a Kalman filtering algorithm indicate a speedup of several orders of magnitude.
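The Kalman filter used for the benchmark is exactly the kind of tight linear-algebra loop that ports cleanly between matrix languages. The sketch below is written in NumPy purely to show the structure of that loop; it is not the paper's R or RcppArmadillo code, and the toy model (a noisy constant level) is an assumption for the example.

```python
import numpy as np

def kalman_filter(y, F, H, Q, R, x0, P0):
    """Standard linear Kalman filter: predict/update over measurements y."""
    x, P = x0, P0
    out = []
    for z in y:
        # predict step
        x = F @ x
        P = F @ P @ F.T + Q
        # update step
        S = H @ P @ H.T + R                      # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
        x = x + K @ (z - H @ x)
        P = (np.eye(len(x)) - K @ H) @ P
        out.append(x.copy())
    return np.array(out)

# toy example: filtering noisy measurements of a constant level of 1.0
y = np.array([[1.0], [1.0], [1.0]])
est = kalman_filter(y, F=np.eye(1), H=np.eye(1),
                    Q=np.eye(1) * 0.01, R=np.eye(1),
                    x0=np.zeros(1), P0=np.eye(1))
```

Every operation in the loop maps one-to-one onto an Armadillo expression, which is what keeps the ported C++ as readable as the original.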
Abstract:
Citizen Science projects are initiatives in which members of the general public participate in scientific research projects and perform or manage research-related tasks such as data collection and/or data annotation. Citizen Science is technologically possible and scientifically significant. However, although research teams can save time and money by recruiting general citizens to volunteer their time and skills to help with data analysis, the reliability of contributed data varies considerably. Data reliability issues are significant in the domain of Citizen Science due to the quantity and diversity of people and devices involved. Participants may submit low-quality, misleading, inaccurate, or even malicious data. Therefore, finding a way to improve data reliability has become an urgent need. This study aims to investigate techniques to enhance the reliability of data contributed by general citizens in scientific research projects, especially acoustic sensing projects. In particular, we propose to design a reputation framework to enhance data reliability, and we also investigate some critical elements that should be taken into account when developing and designing new reputation systems.
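A minimal sketch of how a reputation framework of this kind can operate: each contributor carries a score that drifts toward 1 when their annotations agree with consensus and toward 0 otherwise, and consensus itself is a reputation-weighted vote. The update rule, learning rate and threshold below are illustrative assumptions, not the framework proposed in the study.

```python
def update_reputation(rep, agreed, lr=0.2):
    """Move a contributor's reputation toward 1 on agreement with the
    consensus label, toward 0 on disagreement (assumed exponential update)."""
    target = 1.0 if agreed else 0.0
    return rep + lr * (target - rep)

def weighted_label(labels, reps):
    """Consensus over binary annotations as a reputation-weighted vote."""
    score = sum(r * l for l, r in zip(labels, reps)) / sum(reps)
    return 1 if score >= 0.5 else 0
```

Iterating the two functions lets reliable contributors dominate the consensus over time, which is the core mechanism by which such systems filter low-quality or malicious submissions.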
Abstract:
Incorporating design thinking as a generic capability at a school level is needed to ensure future generations are empowered for business innovation and active citizenship. This paper describes the methodology of an investigation into modelling design led innovation approaches from the business sector to secondary education, as part of a larger study. It builds on a previously discussed research agenda by outlining the scope, significance and limitations of currently available research in this area, examining an action research methodology utilising an Australian design immersion program case study, and discussing implications and future work. It employs a triangulated approach encompassing thematic analysis of qualitative data collection from student focus groups, semi-structured convergent interviews with teachers and facilitators, and student journals. Eventual outcomes will be reviewed and analysed within the framework of a proposed innovation matrix model for educational growth, synthesising principles responding to 21st century student outcomes. It is anticipated this research will inform a successful design led secondary education innovation model, facilitating new engagement frameworks between tertiary and secondary education sectors, as well as providing new insight into the suitability of action research in prototyping social innovation in Australia.
Abstract:
A critical step in the dissemination of ovarian cancer is the formation of multicellular spheroids from cells shed from the primary tumour. The objectives of this study were to apply bioengineered three-dimensional (3D) microenvironments for culturing ovarian cancer spheroids in vitro and simultaneously to build on a mathematical model describing the growth of multicellular spheroids in these biomimetic matrices. Cancer cells derived from human epithelial ovarian carcinoma were embedded within biomimetic hydrogels of varying stiffness and grown for up to 4 weeks. Immunohistochemistry, imaging and growth analyses were used to quantify the dependence of cell proliferation and apoptosis on matrix stiffness, long-term culture and treatment with the anti-cancer drug paclitaxel. The mathematical model was formulated as a free boundary problem in which each spheroid was treated as an incompressible porous medium. The functional forms used to describe the rates of cell proliferation and apoptosis were motivated by the experimental work, and predictions of the mathematical model were compared with the experimental output. This work aimed to establish whether it is possible to simulate solid tumour growth on the basis of data on spheroid size, cell proliferation and cell death within these spheroids. The mathematical model predictions were in agreement with the experimental data set and simulated how the growth of cancer spheroids was influenced by mechanical and biochemical stimuli including matrix stiffness, culture duration and administration of a chemotherapeutic drug. Our computational model provides new perspectives on experimental results and has informed the design of new 3D studies of chemoresistance of multicellular cancer spheroids.
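The paper's model is a free boundary problem for an incompressible porous medium, but the qualitative behaviour it captures can be sketched with a far simpler surrogate: a logistic-style radial growth law in which the net rate is proliferation minus apoptosis and growth saturates at a stiffness-limited maximum radius. All parameter values below are placeholders, not the paper's fitted rates.

```python
def spheroid_radius(days, k_p=0.5, k_a=0.1, R_max=300.0, R0=40.0, dt=0.01):
    """Forward-Euler integration of an assumed radial growth law:
    dR/dt = (k_p - k_a) * R * (1 - R/R_max) / 3, with radius in micrometres.
    The /3 converts a volumetric growth rate to radial growth (V ~ R^3)."""
    R = R0
    for _ in range(int(days / dt)):
        R += dt * (k_p - k_a) * R * (1.0 - R / R_max) / 3.0
    return R
```

Raising the apoptosis rate `k_a` (mimicking paclitaxel treatment) or lowering `R_max` (mimicking a stiffer matrix) both shrink the predicted spheroid, which is the kind of mechanistic comparison the full model makes against the imaging data.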
Abstract:
In this paper we give an overview of some very recent work, as well as presenting a new approach, on the stochastic simulation of multi-scaled systems involving chemical reactions. In many biological systems (such as genetic regulation and cellular dynamics) there is a mix between small numbers of key regulatory proteins, and medium and large numbers of molecules. In addition, it is important to be able to follow the trajectories of individual molecules by taking proper account of the randomness inherent in such a system. We describe different types of simulation techniques (including the stochastic simulation algorithm, Poisson Runge-Kutta methods and the balanced Euler method) for treating simulations in the three different reaction regimes: slow, medium and fast. We then review some recent techniques on the treatment of coupled slow and fast reactions for stochastic chemical kinetics and present a new approach which couples the three regimes mentioned above. We then apply this approach to a biologically inspired problem involving the expression and activity of LacZ and LacY proteins in E. coli, and conclude with a discussion on the significance of this work.
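The stochastic simulation algorithm mentioned above (Gillespie's direct method) draws an exponential waiting time from the total propensity, then picks the next reaction in proportion to its propensity. The sketch below runs it on a toy birth-death network, not the LacZ/LacY model; the rate constants are arbitrary illustrative values.

```python
import math
import random

def gillespie(x, rates, stoich, t_end, seed=1):
    """Gillespie direct method: x is the species count vector, rates a list
    of propensity functions of x, stoich the state-change vector per reaction."""
    rng = random.Random(seed)
    t = 0.0
    while t < t_end:
        props = [r(x) for r in rates]           # current reaction propensities
        a0 = sum(props)
        if a0 == 0:                             # no reaction can fire
            break
        t += -math.log(rng.random()) / a0       # exponential waiting time
        u, j, acc = rng.random() * a0, 0, props[0]
        while acc < u:                          # select reaction j ~ props/a0
            j += 1
            acc += props[j]
        x = [xi + s for xi, s in zip(x, stoich[j])]
    return x

# birth-death of one species: 0 -> X at rate k1, X -> 0 at rate k2 * X
k1, k2 = 10.0, 0.1
final = gillespie([0], [lambda x: k1, lambda x: k2 * x[0]],
                  [[+1], [-1]], t_end=200.0)
```

For this network the stationary copy number fluctuates around k1/k2 = 100; it is exactly this exact-but-slow method that the multi-scale approaches in the paper accelerate for the fast reaction regime.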
Abstract:
Fire safety has become an important part of structural design due to the ever-increasing loss of property and lives during fires. The fire rating of load-bearing wall systems made of light gauge steel frames (LSF) is determined using fire tests based on the standard time-temperature curve given in ISO 834. However, modern residential buildings make use of thermoplastic materials, which means considerably higher fuel loads. Hence a detailed fire research study into the performance of load-bearing LSF walls was undertaken using a series of realistic design fire curves developed based on Eurocode parametric curves and Barnett’s BFD curves. It included both full-scale fire tests and numerical studies of LSF walls without any insulation, and of the recently developed externally insulated composite panels. This paper presents the details of the fire tests first, and then the numerical models of the tested LSF wall studs. It shows that suitable finite element models can be developed to predict the fire rating of load-bearing walls under real fire conditions. The paper also describes the structural and fire performance of externally insulated LSF walls in comparison to non-insulated walls under real fires, and highlights the effects of standard and real fire curves on the fire performance of LSF walls.
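The ISO 834 standard time-temperature curve referenced above has a closed form, T = 20 + 345·log10(8t + 1) with t in minutes and T the furnace gas temperature in °C, against which the realistic design fire curves are compared:

```python
import math

def iso834(t_min):
    """ISO 834 standard fire curve: gas temperature (deg C) after t_min
    minutes of exposure, from ambient 20 deg C."""
    return 20.0 + 345.0 * math.log10(8.0 * t_min + 1.0)
```

For example, the curve reaches roughly 842 °C after 30 minutes; realistic (parametric or BFD) fires differ in both their heating rate and the decay phase that the standard curve lacks.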