960 results for Process mean shifts
Abstract:
The safety of passengers is a major concern for airports. In the event of a crisis, having an effective and efficient evacuation process in place can significantly enhance passenger safety. Hence, it is necessary for airport operators to have an in-depth understanding of the evacuation process of their airport terminal. Although evacuation models have been used in studying pedestrian behaviour for decades, little research has considered evacuees’ group dynamics and the complexity of the environment. In this paper, an agent-based model is presented to simulate the passenger evacuation process. Different exits were allocated to passengers based on their location and security level. The simulation results show that evacuation time can be influenced by passenger group dynamics. The model also provides a convenient way to design an airport evacuation strategy and examine its efficiency. The model was created using AnyLogic software and its parameters were initialised using recent research data published in the literature.
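The paper's model is built in AnyLogic; as a rough illustration of the core idea only, the sketch below (plain Python, with hypothetical names such as `Passenger` and `assign_exit`) assigns each passenger an exit by location and security level and lets each group move at its slowest member's pace, so that group dynamics affect the simulated evacuation time.

```python
# Simplified sketch, not the authors' AnyLogic model; all names are hypothetical.
import random
from dataclasses import dataclass

@dataclass
class Passenger:
    x: float           # position along the terminal (arbitrary units)
    airside: bool      # True if the passenger is past the security checkpoint
    group_id: int      # passengers in the same group stay together
    speed: float       # walking speed (units per time step)

EXITS = {"landside": 0.0, "airside": 100.0}    # assumed exit locations

def assign_exit(p: Passenger) -> float:
    """Allocate an exit based on the passenger's security level."""
    return EXITS["airside"] if p.airside else EXITS["landside"]

def evacuate(passengers, dt=1.0, max_steps=10_000):
    """Step all agents towards their exits; a group moves at its slowest member's pace."""
    for step in range(1, max_steps + 1):
        group_speed = {}
        for p in passengers:
            group_speed[p.group_id] = min(group_speed.get(p.group_id, p.speed), p.speed)
        done = True
        for p in passengers:
            target = assign_exit(p)
            if abs(p.x - target) > 1e-6:
                done = False
                move = min(group_speed[p.group_id] * dt, abs(target - p.x))
                p.x += move if target > p.x else -move
        if done:
            return (step - 1) * dt     # everyone had already reached an exit
    return None                        # did not finish within max_steps

random.seed(0)
crowd = [Passenger(x=random.uniform(0, 100), airside=random.random() < 0.5,
                   group_id=i // 3, speed=random.uniform(0.8, 1.4))
         for i in range(60)]
print("Simulated evacuation time:", evacuate(crowd))
```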
Abstract:
Several websites use a rule-based recommendation system, which generates choices based on a series of questionnaires, to recommend products to users. This approach carries a high risk of customer attrition, and the bottleneck is the questionnaire set. If the questioning process is too long, complex or tedious, users are likely to quit the questionnaire before a product is recommended to them. If the questioning process is too short, the users’ intentions cannot be gathered. The commonly used feature selection methods do not provide a satisfactory solution. We propose a novel process combining clustering, decision trees and association rule mining for a group-oriented question reduction process. The question set is reduced according to common properties that are shared by a specific group of users. When applied to a real-world website, the proposed combined method outperforms methods where the reduction of questions is done only by using association rule mining or only by observing the distribution within the group.
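As a loose illustration of the group-oriented reduction idea (not the authors' exact pipeline, and on synthetic data), the sketch below clusters users by their answer profiles and then drops, for each group, the questions whose answers are nearly constant within that group; the decision-tree and association-rule components are omitted here, and the threshold is an assumption.

```python
# Illustrative sketch only: hypothetical data, clusters and threshold.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
answers = rng.integers(0, 2, size=(200, 12))   # 200 users x 12 yes/no questions

clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(answers)

def reduced_question_set(answers, clusters, group, share_threshold=0.9):
    """Keep only the questions that are still informative for this user group."""
    group_answers = answers[clusters == group]
    keep = []
    for q in range(group_answers.shape[1]):
        counts = np.bincount(group_answers[:, q], minlength=2)
        dominant_share = counts.max() / counts.sum()
        if dominant_share < share_threshold:   # answers still vary, so keep asking
            keep.append(q)
    return keep

for g in range(4):
    print(f"group {g}: ask questions {reduced_question_set(answers, clusters, g)}")
```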
Abstract:
Recently, mean-variance analysis has been proposed as a novel paradigm to model document ranking in Information Retrieval. The main merit of this approach is that it diversifies the ranking of retrieved documents. In its original formulation, the strategy considers both the mean of the relevance estimates of retrieved documents and their variance. However, when this strategy has been empirically instantiated, the concepts of mean and variance are discarded in favour of a point-wise estimation of relevance (to replace the mean) and of a parameter to be tuned or, alternatively, a quantity dependent upon the document length (to replace the variance). In this paper we revisit this ranking strategy by going back to its roots: mean and variance. For each retrieved document, we infer a relevance distribution from a series of point-wise relevance estimations provided by a number of different systems. This is used to compute the mean and the variance of the document relevance estimates. On the TREC Clueweb collection, we show that this approach improves retrieval performance. This development could lead to new strategies to address the fusion of relevance estimates provided by different systems.
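A minimal sketch of the mean-variance idea, assuming a simple risk-averse objective of the form mean minus a tuned multiple of the variance (the exact formulation and the trade-off parameter b are assumptions, not taken from the paper):

```python
# Sketch: rank documents by the mean of per-system relevance estimates,
# penalised by the variance across systems. Data below is illustrative.
import numpy as np

# rows: documents, columns: point-wise relevance estimates from different systems
scores = np.array([
    [0.90, 0.85, 0.88],   # doc0: high and consistent
    [0.95, 0.30, 0.70],   # doc1: high on average, but systems disagree
    [0.60, 0.62, 0.58],   # doc2: moderate and consistent
])

def mean_variance_rank(scores, b=1.0):
    mean = scores.mean(axis=1)
    var = scores.var(axis=1, ddof=1)      # sample variance across systems
    objective = mean - b * var            # risk-averse ranking objective
    return np.argsort(-objective), mean, var

order, mean, var = mean_variance_rank(scores, b=1.0)
print("ranking (doc indices):", order)    # doc1 drops below doc2 due to its high variance
```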
Abstract:
The measures by which major developments are officially approved for construction are - by common agreement - complex, time-consuming, and of questionable merit in terms of maintaining ecological viability.
Abstract:
This thesis presents novel techniques for addressing the problems of continuous change and inconsistencies in large process model collections. The developed techniques treat process models as a collection of fragments and facilitate version control, standardization and automated process model discovery using fragment-based concepts. Experimental results show that the presented techniques are beneficial in consolidating large process model collections, specifically when there is a high degree of redundancy.
Abstract:
In recent years, the imperative to communicate organisational impacts to a variety of stakeholders has gained increasing importance within all sectors. Despite growing external demands for evaluation and social impact measurement, there has been limited critically informed analysis about the presumed importance of these activities to organisational success and the practical challenges faced by organisations in undertaking such assessment. In this paper, we present the findings from an action research study of five Australian small to medium social enterprises’ practices and use of evaluation and social impact analysis. Our findings have implications for social enterprise operators, policy makers and social investors regarding when, why and at what level these activities contribute to organisational performance and the fulfilment of mission.
Abstract:
This paper critically evaluates the series of inquiries that the Australian Labor government undertook during 2011-2013 into reform of Australian media, communications and copyright laws. One important driver of policy reform was the government’s commitment to building a National Broadband Network (NBN) and the implications this had for existing broadcasting and telecommunications policy, as the NBN would constitute a major driver of convergence of media and communications access devices and content platforms. These inquiries included: the Convergence Review of media and communications legislation; the Australian Law Reform Commission (ALRC) review of the National Classification Scheme; and the Independent Media Inquiry (Finkelstein Review) into Media and Media Regulation. One unusual feature of this review process was the degree to which academics were involved, not simply as providers of expert opinion, but as review chairs seconded from their universities. This paper considers the role played by activist groups in all of these inquiries and their relationship to the various participants, as well as the implications of academics being engaged in such inquiries, not simply as activist-scholars, but as those primarily responsible for delivering policy review outcomes. The paper draws upon the concept of "policy windows" in order to better understand the context in which the inquiries took place, and their relative lack of legislative impact.
Abstract:
We present a novel approach for multi-object detection in aerial videos based on tracking. The proposed method involves three main steps. Firstly, spatio-temporal saliency is employed to detect moving objects. Secondly, the detected objects are tracked by mean shift in the subsequent frames. Finally, the saliency results are fused with the weight map generated by tracking to obtain refined detection results, and in turn the refined detection results are used to update the tracking models. The proposed algorithm is evaluated on VIVID aerial videos, and the results show that our approach can reliably detect moving objects even in challenging situations. Moreover, the proposed method can process videos in real time, without time delay.
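The sketch below illustrates only the fusion step, under the assumption that the saliency map and the tracker's weight map are blended linearly and then thresholded; the blending weight, threshold and maps are hypothetical placeholders rather than the paper's actual formulation.

```python
# Fusion-step sketch with synthetic maps; not the authors' method.
import numpy as np

def fuse(saliency_map, track_weight_map, alpha=0.6, threshold=0.5):
    """Blend saliency with tracker confidence and threshold to a detection mask."""
    assert saliency_map.shape == track_weight_map.shape
    fused = alpha * saliency_map + (1.0 - alpha) * track_weight_map
    return fused >= threshold                    # boolean detection mask

rng = np.random.default_rng(1)
saliency = rng.random((120, 160))                # stand-in spatio-temporal saliency
track_weights = np.zeros((120, 160))
track_weights[40:60, 70:100] = 0.9               # tracker expects the object here
mask = fuse(saliency, track_weights)
print("detected pixels:", int(mask.sum()))
```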
Abstract:
Used frequently in food contact materials, bisphenol A (BPA) has been studied extensively in recent years, and ubiquitous exposure in the general population has been demonstrated worldwide. Characterising within- and between-individual variability of BPA concentrations is important for characterising exposure in biomonitoring studies, and this has been investigated previously in adults, but not in children. The aim of this study was to characterise the short-term variability of BPA in spot urine samples in young children. Children aged ≥2-<4 years (n = 25) were recruited from an existing cohort in Queensland, Australia, and donated four spot urine samples each over a two-day period. Samples were analysed for total BPA using isotope dilution online solid phase extraction-liquid chromatography-tandem mass spectrometry, and concentrations ranged from 0.53 to 74.5 ng/ml, with geometric mean and standard deviation of 2.70 ng/ml and 2.94 ng/ml, respectively. Sex and time of sample collection were not significant predictors of BPA concentration. The between-individual variability was approximately equal to the within-individual variability (ICC = 0.51), and this ICC is somewhat higher than previously reported literature values. This may be the result of physiological or behavioural differences between children and adults, or of the relatively short exposure window assessed. Using a bootstrapping methodology, a single sample resulted in correct tertile classification approximately 70% of the time. This study suggests that single spot samples obtained from young children provide a reliable characterisation of absolute and relative exposure over the short time window studied, but this may not hold true over longer timeframes.
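For illustration, the sketch below reproduces the two analyses on synthetic log-normal data: a one-way random-effects ICC from four spot samples per child, and a bootstrap estimate of how often a single spot sample places a child in the correct exposure tertile. The variance components and sample sizes are assumptions chosen only to mimic the study design, not the study's data.

```python
# Rough sketch on simulated data; not the study's dataset or code.
import numpy as np

rng = np.random.default_rng(0)
n_children, n_samples = 25, 4
# Simulate log-concentrations with between-child and within-child components.
child_mean = rng.normal(1.0, 0.8, size=(n_children, 1))
logc = child_mean + rng.normal(0.0, 0.8, size=(n_children, n_samples))

def icc_oneway(x):
    """One-way random-effects ICC for n subjects with k repeated measures each."""
    n, k = x.shape
    grand = x.mean()
    msb = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)                 # between-subject
    msw = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))    # within-subject
    return (msb - msw) / (msb + (k - 1) * msw)

print("ICC:", round(icc_oneway(logc), 2))

# Bootstrap: how often does one random spot sample land in the same tertile
# as the child's mean exposure?
true_tertile = np.digitize(logc.mean(axis=1),
                           np.quantile(logc.mean(axis=1), [1/3, 2/3]))
hits, n_boot = 0.0, 2000
for _ in range(n_boot):
    single = logc[np.arange(n_children), rng.integers(0, n_samples, n_children)]
    boot_tertile = np.digitize(single, np.quantile(single, [1/3, 2/3]))
    hits += (boot_tertile == true_tertile).mean()
print("correct tertile classification:", round(hits / n_boot, 2))
```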
Abstract:
A key shift in thinking about the effective learning and teaching of listening input has been observed and organized in education both locally and globally. This study probed whether metacognitive instruction delivered through a pedagogical cycle shifts high-intermediate students' English language learning and an English as a second language (ESL) teacher's teaching focus on listening input. Twenty male Iranian students aged 18 to 24 received a guided methodology including metacognitive strategies (planning, monitoring, and evaluation) over a period of three months. The study applied these strategies and probed the importance of metacognitive instruction by interviewing both the teacher and the students. The results show that metacognitive instruction helped both the ESL teacher and the students shift their thinking about teaching and learning listening input. This key shift in thinking has implications, both globally and locally, for classroom practices around listening input.
Abstract:
Temporary Traffic Control Plans (TCPs), which provide construction phasing to maintain traffic during construction operations, are an integral component of highway construction project design. Using the initial design, designers develop estimated quantities for the required TCP devices, which become the basis for bids submitted by highway contractors. However, actual as-built quantities are often significantly different from the engineer’s original estimate. The total cost of TCP phasing on highway construction projects amounts to 6–10% of the total construction cost. Variations between engineer-estimated quantities and final quantities contribute to reduced cost control and an increased chance of cost-related litigation, and affect bid rankings and selection. Statistical analyses of over 2000 highway construction projects were performed to determine the sources of variation, which were later used as the basis for developing an automated hybrid prediction model that uses multiple regression and heuristic rules to provide accurate TCP quantities and costs. The predictive accuracy of the model was demonstrated through several case studies.
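As a hedged illustration of a hybrid regression-plus-rules predictor (the features, synthetic data and rules below are hypothetical, not the model developed in the paper), one might combine a fitted multiple regression with simple heuristic adjustments:

```python
# Illustrative hybrid predictor: regression baseline plus assumed heuristic rules.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
# Hypothetical project attributes: [project length (miles), duration (days), lanes]
X = np.column_stack([rng.uniform(0.5, 12, 300),
                     rng.uniform(30, 720, 300),
                     rng.integers(2, 6, 300)])
# Synthetic "as-built" device quantities, for demonstration only.
y = 40 * X[:, 0] + 0.5 * X[:, 1] + 25 * X[:, 2] + rng.normal(0, 40, 300)

regression = LinearRegression().fit(X, y)

def predict_tcp_quantity(length_mi, duration_days, lanes, night_work=False):
    base = regression.predict([[length_mi, duration_days, lanes]])[0]
    # Heuristic rules layered on top of the regression (assumed, for illustration).
    if night_work:
        base *= 1.15           # extra devices for night-time lane closures
    if length_mi < 1.0:
        base = max(base, 50)   # minimum device count for short projects
    return round(base)

print(predict_tcp_quantity(3.2, 180, 4, night_work=True))
```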
Improving the performance of nutrition screening through a series of quality improvement initiatives
Abstract:
Background: Nutrition screening identifies patients at risk of malnutrition to facilitate early nutritional intervention. Studies have reported incompletion and error rates of 30-90% for a range of commonly used screening tools. This study aims to investigate the incompletion and error rates of 3-Minute Nutrition Screening (3-MinNS) and the effect of quality improvement initiatives in improving the overall performance of the screening tool and the referral process for at-risk patients. Methods: Annual audits were carried out from 2008-2013 on 4467 patients. Value Stream Mapping, the Plan-Do-Check-Act cycle and Root Cause Analysis were used to identify gaps and determine the best intervention. The intervention included 1) implementing a nutrition screening protocol, 2) nutrition screening training, 3) nurse empowerment for online dietetics referral of at-risk cases, 4) a closed-loop feedback system and 5) removing the component of 3-MinNS that caused the most errors, without compromising its sensitivity and specificity. Results: Nutrition screening error rates were 33% and 31%, with 5% and 8% blank or missing forms, in 2008 and 2009 respectively. For patients at risk of malnutrition, referral to dietetics took up to 7.5 days, with 10% not referred at all. After the intervention, the latter decreased to 7% (2010), 4% (2011) and 3% (2012 and 2013), and the mean turnaround time from screening to referral was reduced significantly from 4.3 ± 1.8 days to 0.3 ± 0.4 days (p < 0.001). Error rates were reduced to 25% (2010), 15% (2011), 7% (2012) and 5% (2013), and the percentage of blank or missing forms was reduced to 1% and remained at that level. Conclusion: Quality improvement initiatives were effective in reducing the incompletion and error rates of nutrition screening and led to sustainable improvements in the referral process for patients at nutritional risk.
Abstract:
The present article gives an overview of the reversible addition fragmentation chain transfer (RAFT) process. RAFT is one of the most versatile living radical polymerization systems and yields polymers of predictable chain length and narrow molecular weight distribution. RAFT relies on the rapid exchange of thiocarbonyl thio groups between growing polymeric chains. The key strengths of the RAFT process for polymer design are its high tolerance of monomer functionality and reaction conditions, the wide range of well-controlled polymeric architectures achievable, and its (in-principle) non-rate-retarding nature. This article introduces the mechanism of polymerization, the range of polymer molecular weights achievable, the range of monomers in which polymerization is controlled by RAFT, the various polymeric architectures that can be obtained, the type of end-group functionalities available to RAFT-made polymers, and the process of RAFT polymerization.
Abstract:
IT resources are indispensable in the management of Public Sector Organizations (PSOs) around the world. We investigate the factors that could leverage IT resources in PSOs in developing economies. While research on ways to leverage IT resources in private sector organizations of developed countries is substantial, our understanding of ways to leverage IT resources in the public sector in developing countries is limited. The current study aspires to address this gap in the literature by seeking to determine the key factors required to create process value from public sector IT investments in developing countries. We draw on resource-centric theories to infer the nature of the factors that could leverage IT resources in the public sector. Employing an interpretive design, we identified three factors necessary for IT process value generation in the public sector. We discuss these factors and state their implications for theory and practice.