583 results for Bioeconomic modelling
Abstract:
Quantifying spatial and/or temporal trends in environmental data requires measurements at multiple sites. The number of sites and the duration of measurement at each site must be balanced against equipment costs and the availability of trained staff. The split panel design combines short measurement campaigns at multiple locations with continuous monitoring at reference sites [2]. Here we present a spatio-temporal modelling approach for ultrafine particle number concentration (PNC) data recorded under a split panel design. The model describes the temporal trends and background levels at each site. The data were measured as part of the “Ultrafine Particles from Transport Emissions and Child Health” (UPTECH) project, which aims to link air quality measurements, child health outcomes and a questionnaire on each child’s history and demographics. The UPTECH project involves measuring aerosol and particle counts and local meteorology for two weeks at each of 25 primary schools and continuously at three long-term monitoring stations, together with health outcomes for a cohort of students at each school [3].
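A minimal numerical sketch of this kind of split panel model (our own illustrative formulation, not the UPTECH model itself) treats each log-scale observation as a site-specific background plus a shared temporal trend, identifiable because the reference site overlaps every campaign:

```python
import numpy as np

rng = np.random.default_rng(0)
n_sites, n_times = 5, 40

true_bg = rng.normal(3.0, 0.5, n_sites)                        # site backgrounds (log scale)
true_trend = 0.3 * np.sin(np.linspace(0, 4 * np.pi, n_times))  # shared temporal trend

# Split panel observation pattern: reference site 0 runs continuously,
# the remaining sites host consecutive short campaigns.
mask = np.zeros((n_sites, n_times), bool)
mask[0] = True
for s in range(1, n_sites):
    mask[s, (s - 1) * 10:s * 10] = True

y = true_bg[:, None] + true_trend[None, :] + rng.normal(0, 0.02, (n_sites, n_times))

# Least-squares design: one indicator column per site background and per
# trend value; a soft constraint row pins the trend mean for identifiability.
rows = [(s, t) for s in range(n_sites) for t in range(n_times) if mask[s, t]]
X = np.zeros((len(rows) + 1, n_sites + n_times))
obs = np.zeros(len(rows) + 1)
for i, (s, t) in enumerate(rows):
    X[i, s] = 1.0
    X[i, n_sites + t] = 1.0
    obs[i] = y[s, t]
X[-1, n_sites:] = 1.0                                          # constraint: sum(trend) ~ 0

coef, *_ = np.linalg.lstsq(X, obs, rcond=None)
est_bg, est_trend = coef[:n_sites], coef[n_sites:]
```

Because the reference site is observed at every time point, the shared trend is estimable across all campaigns, and each campaign site's background follows from its overlap with that trend.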
Abstract:
A new dual-scale modelling approach is presented for simulating the drying of a wet hygroscopic porous material that couples the porous medium (macroscale) with the underlying pore structure (microscale). The proposed model is applied to the convective drying of wood at low temperatures and is valid in the so-called hygroscopic range, where hygroscopically held liquid water is present in the solid phase and water exists only as vapour in the pores. Coupling between scales is achieved by imposing the macroscopic gradients of moisture content and temperature on the microscopic field using suitably defined periodic boundary conditions, which allows the macroscopic mass and thermal fluxes to be defined as averages of the microscopic fluxes over the unit cell. This novel formulation accounts for the intricate coupling of heat and mass transfer at the microscopic scale but reduces to a classical homogenisation approach if a linear relationship is assumed between the microscopic gradient and flux. Simulation results for a sample of spruce wood highlight the potential and flexibility of the new dual-scale approach. In particular, for a given unit cell configuration it is not necessary to propose the form of the macroscopic fluxes prior to the simulations because these are determined as a direct result of the dual-scale formulation.
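Using assumed symbols (the paper's notation may differ: X for moisture content, T for temperature, V for the periodic unit cell), the scale coupling described above amounts to imposing the macroscopic gradients on the cell and recovering the macroscopic mass and heat fluxes as cell averages of the microscopic ones:

```latex
\bar{\mathbf{J}}_m \;=\; \frac{1}{|V|}\int_{V}\mathbf{j}_m \,\mathrm{d}V,
\qquad
\bar{\mathbf{J}}_h \;=\; \frac{1}{|V|}\int_{V}\mathbf{j}_h \,\mathrm{d}V,
```

where the microscopic fields are driven by the imposed gradients $\nabla\bar{X}$ and $\nabla\bar{T}$ through the periodic boundary conditions. In the linear limit $\mathbf{j}_m = -\mathbf{D}\,\nabla X$, the averaging collapses to an effective-diffusivity (classical homogenisation) description, consistent with the reduction noted in the abstract.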
Abstract:
Identifying, modelling and documenting business processes usually require the collaboration of many stakeholders that may be spread across companies in inter-organizational settings. While modern process modelling technologies are beginning to provide a number of features to support remote collaboration, they lack support for the visual cues used in co-located collaboration. In this paper, we examine the importance of visual cues for collaboration tasks in collaborative process modelling. Based on this analysis, we present a prototype 3D virtual world process modelling tool that supports a number of visual cues to facilitate remote collaborative process model creation and validation. We then report on a preliminary analysis of the technology. In conclusion, we describe the future direction of our research with regard to the theoretical contributions expected from the evaluation of the tool.
Abstract:
Process modelling – the design and use of graphical documentations of an organisation’s business processes – is a key method for documenting and using information about business processes in organisational projects. Yet, despite the current interest in process modelling, the field still faces essential challenges. One of the key unanswered questions concerns the impact of process modelling in organisational practice. Process modelling initiatives call for tangible results in the form of returns on the substantial investments that organisations make to achieve improved processes. This study explores the impact of process model use on end users and its contribution to organisational success. We posit that the use of conceptual models creates impact in organisational process teams. We also report on a set of case studies in which we explore tentative evidence for how the impact of process model use develops. The results of this work provide a better understanding of process modelling impact on information practices and lead to insights into how organisations should conduct process modelling initiatives in order to achieve an optimum return on their investment.
Abstract:
A Flash Event (FE) represents a period of time when a web server experiences a dramatic increase in incoming traffic, either following a newsworthy event that has prompted users to locate and access it, or as a result of redirection from other popular web or social media sites. This usually leads to network congestion and Quality-of-Service (QoS) degradation. These events can be mistaken for Distributed Denial-of-Service (DDoS) attacks aimed at disrupting the server. Accurate detection of FEs and their distinction from DDoS attacks is important, since different actions need to be undertaken by network administrators in these two cases. However, the lack of public domain FE datasets hinders research in this area. In this paper, we present a detailed study of flash events and classify them into three broad categories. In addition, the paper describes FEs in terms of three key components: the volume of incoming traffic, the related source IP addresses, and the resources being accessed. We present such an FE model with minimal parameters and use publicly available datasets to analyse and validate our proposed model. The model can be used to generate different types of FE traffic, closely approximating real-world scenarios, in order to facilitate research into distinguishing FEs from DDoS attacks.
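As an illustration only (the parameter names and values below are our assumptions, not the paper's model), the three components can be sketched as a synthetic FE traffic generator:

```python
import numpy as np

rng = np.random.default_rng(1)

def flash_event_rate(t, baseline=50.0, peak=2000.0, t0=100.0,
                     ramp=10.0, decay=60.0):
    """Requests/sec: steady baseline, rapid ramp-up after t0, slow decay."""
    x = np.maximum(np.asarray(t, float) - t0, 0.0)
    return baseline + (peak - baseline) * (1 - np.exp(-x / ramp)) * np.exp(-x / decay)

t = np.arange(400)
rate = flash_event_rate(t)                     # component 1: traffic volume profile
requests = rng.poisson(rate)                   # per-second request counts

# Component 2: source-IP mix -- during an FE most clients are new,
# legitimate visitors (fraction assumed for illustration).
new_ip_fraction = 0.8
first_time = rng.random(int(requests.sum())) < new_ip_fraction

# Component 3: accessed resources concentrate on a few hot pages
# (Zipf-like popularity).
resources = rng.zipf(a=2.0, size=int(requests.sum()))
```

Varying the ramp and decay constants would produce the different FE categories (e.g. a slower, predictable build-up versus a sudden spike), which is the sense in which a minimal-parameter model can cover several FE types.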
Abstract:
Each year, organizations in the Australian mining industry (an asset-intensive industry) spend a substantial amount of capital (A$86 billion in 2009–10) (Statistics, 2011) on acquiring engineering assets. Engineering assets are put to use in operations to generate value. Different functions (departments) of an organization have different expectations of and requirements for each engineering asset, e.g. return on investment, reliability, efficiency, maintainability, low running cost, low or nil environmental impact, ease of disposal, and potential salvage value. Assets are acquired from suppliers, built by service providers, or built internally, and the acquisition process is supported by the procurement function. One of the most costly mistakes an organization can make is acquiring inappropriate or non-conforming assets that do not fit the purpose. The root cause of acquiring non-conforming assets lies in incorrect acquisition decisions and the process by which those decisions are made. It is very important that an asset acquisition decision is based on the inputs and criteria of every function within the organization that has a direct or indirect impact on the acquisition, utilization, maintenance and disposal of the asset. The literature review shows that there is currently no comprehensive process framework or tool available to evaluate the inclusiveness and breadth of asset acquisition decisions taken in mining organizations. This thesis discusses the various criteria and inputs that need to be considered and evaluated across functions within the organization when making an asset acquisition decision. Criteria from functions such as finance, production, maintenance, logistics, procurement, asset management, environment health and safety, materials management, and training and development need to be considered to make an effective and coherent asset acquisition decision.
The thesis also presents a tool developed to support this multi-criteria, cross-functional acquisition decision making. The contributions of this research are a decision framework based on multi-criteria, cross-functional inputs, and a tool that applies the framework to formulate integrated asset acquisition decisions.
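As a minimal illustration of such cross-functional, multi-criteria evaluation (the criteria and weights below are invented for the example, not taken from the thesis), a weighted-sum score could aggregate each function's normalised ratings of a candidate asset:

```python
# Relative importance of each function's criterion (illustrative weights).
criteria_weights = {
    "return_on_investment": 0.25,
    "reliability":          0.20,
    "maintainability":      0.15,
    "running_cost":         0.20,   # lower is better -> rating already inverted
    "environmental_impact": 0.10,   # lower is better -> rating already inverted
    "salvage_value":        0.10,
}

def asset_score(ratings):
    """Weighted sum of ratings in [0, 1]; higher is a better candidate."""
    assert set(ratings) == set(criteria_weights)
    return sum(criteria_weights[c] * ratings[c] for c in criteria_weights)

candidate_a = {"return_on_investment": 0.9, "reliability": 0.7,
               "maintainability": 0.6, "running_cost": 0.5,
               "environmental_impact": 0.8, "salvage_value": 0.4}
candidate_b = {"return_on_investment": 0.6, "reliability": 0.9,
               "maintainability": 0.8, "running_cost": 0.7,
               "environmental_impact": 0.6, "salvage_value": 0.5}
```

A real framework would negotiate the weights across finance, production, maintenance and the other functions listed above, which is precisely where the cross-functional element of the decision enters.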
Abstract:
The automotive industry has been the focus of digital human modeling (DHM) research and application for many years. In the highly competitive marketplace for personal transportation, the desire to improve the customer’s experience has driven extensive research in both the physical and cognitive interaction between the vehicle and its occupants. Human models provide vehicle designers with tools to view and analyze product interactions before the first prototypes are built, potentially improving the design while reducing cost and development time. The focus of DHM research and applications began with prediction and representation of static postures for purposes of driver workstation layout, including assessments of seat adjustment ranges and exterior vision. Now DHMs are used for seat design and assessment of driver reach and ingress/egress. DHMs and related simulation tools are expanding into the cognitive domain, with computational models of perception and motion, and into the dynamic domain with models of physical responses to ride and vibration. Moreover, DHMs are now widely used to analyze the ergonomics of vehicle assembly tasks. In this case, the analysis aims to determine whether workers can be expected to complete the tasks safely and with good quality. This preface reviews the literature to provide context for the nine new papers presented in this special issue.
Abstract:
Citizen Science projects are initiatives in which members of the general public participate in scientific research and perform or manage research-related tasks such as data collection and/or data annotation. Citizen Science is technologically possible and scientifically significant. However, because the information is gathered from the crowd, data quality is always hard to manage. There are many ways to manage data quality, and reputation management is one common approach. In recent years, many research teams have deployed audio or image sensors in natural environments in order to monitor the status of animals or plants. The collected data are analysed by ecologists. However, because the volume of collected data is extremely large and the number of ecologists is very limited, it is impossible for scientists to analyse all these data manually. Existing automated tools for processing the data remain limited in functionality, and their results are still not very accurate. Researchers have therefore turned to recruiting members of the public who are interested in helping scientific research to perform pre-processing tasks such as species tagging. Although research teams can save time and money by recruiting citizens who volunteer their time and skills to help with data analysis, the reliability of the contributed data varies considerably. This research therefore aims to investigate techniques to enhance the reliability of data contributed by citizens in scientific research projects, especially acoustic sensing projects. In particular, we aim to investigate how reputation management can be used to enhance data reliability. Reputation systems have been used to address uncertainty and improve data quality in many marketing and e-commerce domains, and the commercial organizations that have chosen to embrace reputation management have gained many benefits.
Data quality issues are significant in Citizen Science due to the quantity and diversity of the people and devices involved. However, research on reputation management in this area is relatively new. We therefore start our investigation by examining existing reputation systems in different domains. We then design novel reputation management approaches for Citizen Science projects to categorise participants and data. We have investigated critical elements that may influence data reliability in Citizen Science projects, including personal information, such as location and education, and performance information, such as the ability to recognise certain bird calls. The designed reputation framework is evaluated through a series of experiments involving many participants collecting and interpreting data, in particular environmental acoustic data. Our research into the advantages of reputation management in Citizen Science (or crowdsourcing in general) will help raise awareness among organizations that are unacquainted with its potential benefits.
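One widely used reputation mechanism of the kind discussed above is the beta reputation model; a minimal sketch (our illustration, not the thesis's exact framework) scores a contributor by the expected accuracy of their expert-verified tags:

```python
class BetaReputation:
    """Reputation = expected accuracy under a Beta(alpha, beta) posterior."""

    def __init__(self):
        self.alpha = 1.0   # prior pseudo-count of correct tags
        self.beta = 1.0    # prior pseudo-count of incorrect tags

    def update(self, correct: bool):
        """Fold in one tag that an expert has verified as correct/incorrect."""
        if correct:
            self.alpha += 1
        else:
            self.beta += 1

    @property
    def score(self) -> float:
        return self.alpha / (self.alpha + self.beta)

rep = BetaReputation()
for outcome in [True, True, True, False, True]:   # 4 verified correct, 1 wrong
    rep.update(outcome)
```

A project could then weight or filter contributed species tags by each contributor's current score, so that experienced, accurate volunteers carry more influence than newcomers.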
Abstract:
In this Column, I have teamed up with a colleague, Eike Bernhard, a doctoral student who is studying the impact of process modelling on organizational practices. Together, we want to shed light on an age-old question of Business Process Management: What is the value proposition of process modelling?
Abstract:
This paper reports on a longitudinal study of data modelling across grades 1–3. The activity engaged children in designing, implementing, and analysing a survey about their new playground. Data modelling involves investigating meaningful phenomena, deciding what is worthy of attention (identifying complex attributes), and then progressing to organising, structuring, visualising, and representing data. The core components of data modelling addressed here are children’s structuring and representing of data, with a focus on their display of metarepresentational competence (diSessa, 2004). Such competence includes students’ abilities to invent or design a variety of new representations, explain their creations, understand the roles they play, and critique and compare the adequacy of representations. Reported here are the ways in which the children structured and represented their data, the metarepresentational competence they displayed, and the links between their metarepresentational competence and conceptual competence.
Abstract:
Currently, 1.3 billion tonnes of food are lost annually due to the lack of proper processing and preservation methods. Drying is one of the easiest and oldest methods of food processing and can help reduce these huge losses, combat hunger and promote food security. Drying increases shelf life and reduces the weight and volume of food, thus minimizing packing, storage, and transportation costs and enabling storage of food under ambient conditions. However, drying is a complex process that involves coupled heat and mass transfer together with physical property changes and shrinkage of the food material. Modelling this process is essential to optimize the drying kinetics and improve the energy efficiency of the process. Since material properties vary with moisture content, models should not assume constant material properties or a constant diffusion coefficient. The objective of this paper is to develop a multiphysics-based mathematical model to simulate coupled heat and mass transfer during convective drying of fruit, considering variable material properties. This model can be used to predict the temperature and moisture distribution inside the food during drying. The effects of different drying air temperatures and drying air velocities on the drying kinetics are demonstrated. The governing equations of heat and mass transfer were solved with COMSOL Multiphysics 4.3.
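A minimal one-dimensional sketch of such a model (isothermal moisture transport only; the geometry, initial condition and diffusivity correlation are our assumptions, not the paper's COMSOL model) illustrates solving dM/dt = d/dx(D(M) dM/dx) with a moisture-dependent diffusivity:

```python
import numpy as np

thickness = 0.01                  # slab thickness [m] (assumed)
n = 51
dx = thickness / (n - 1)
M = np.full(n, 5.0)               # initial moisture content, dry basis (assumed)
M_eq = 0.5                        # equilibrium moisture at drying surfaces (assumed)

def D(M):
    """Moisture-dependent diffusivity [m^2/s] -- illustrative correlation."""
    return 1e-9 * np.exp(0.2 * M)

dt = 0.2 * dx**2 / D(M).max()     # explicit-scheme stability margin
for _ in range(5000):
    M[0] = M[-1] = M_eq                       # surfaces held at equilibrium
    Dm = D(0.5 * (M[:-1] + M[1:]))            # diffusivity at cell interfaces
    flux = Dm * np.diff(M) / dx               # Fickian flux between nodes
    M[1:-1] += dt * np.diff(flux) / dx        # conservative update
```

Because D is evaluated locally at each step, the drying front slows as the material dries, which is the behaviour a constant-diffusivity model cannot capture; the paper's full model additionally couples the temperature field and uses convective boundary conditions.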
Abstract:
Background: The substantial increases in home Internet access and available online health information, with limited control over information quality, highlight the necessity of exploring the decision-making processes involved in accessing and using online information, specifically in relation to children, who do not make their own health decisions. Objectives: To understand the processes explaining parents’ decisions to use online health information for child health care. Methods: Parents (N = 391) completed an initial questionnaire assessing the theory of planned behaviour constructs of attitude, subjective norm, and perceived behavioural control, as well as perceived risk, group norm, and additional demographic factors. Two months later, 187 parents completed a follow-up questionnaire assessing their decisions to use online information for their child’s health care, specifically to 1) diagnose and/or treat their child’s suspected medical condition/illness and 2) increase their understanding of a diagnosis or treatment recommended by a health professional. Results: Hierarchical multiple regression showed that, for both behaviours, attitude, subjective norm, perceived behavioural control, (lower) perceived risk, group norm, and (non-)medical background were significant predictors of intention. For both behaviours, intention was the sole significant predictor of parents’ use of online child health information. The findings explain 77% of the variance in parents’ intention to treat/diagnose a child health problem and 74% of the variance in their intention to increase their understanding of child health concerns. Conclusions: Understanding the socio-cognitive processes that guide parents’ use of online information for child health care is important given the increase in Internet usage and the sometimes questionable quality of health information provided online.
Findings highlight parents’ thirst for information; there is an urgent need for health professionals to provide parents with evidence-based child health websites in addition to general population education on how to evaluate the quality of online health information.
Abstract:
This paper is concerned with recent advances in the development of near wall-normal-free Reynolds-stress models, whose single-point closure formulation, based on the inhomogeneity direction concept, is completely independent of the distance from the wall and of the normal-to-the-wall direction. In the present approach the direction of the inhomogeneity unit vector is decoupled from the coefficient functions of the inhomogeneous terms. A study of the relative influence of the particular closures used for the rapid redistribution terms and for the turbulent diffusion is undertaken, through comparison with measurements and with a baseline Reynolds-stress model (RSM) using geometric wall normals. It is shown that wall-normal-free RSMs can be reformulated as a projection on a tensorial basis that includes the inhomogeneity direction unit vector, suggesting that the theory of the redistribution tensor closure should be revised by taking into account inhomogeneity effects in the tensorial integrity basis used for its representation.