975 results for "Average models"


Relevance:

20.00%

Publisher:

Abstract:

Physical access control systems play a central role in the protection of critical infrastructures, where both the provision of timely access and preserving the security of sensitive areas are paramount. In this paper we discuss the shortcomings of existing approaches to the administration of physical access control in complex environments. At the heart of the problem is the current dependency on human administrators to reason about the implications of the provision or revocation of staff access to an area within these facilities. We demonstrate how utilising Building Information Models (BIMs) and the capabilities they provide, including 3D representation of a facility and path-finding, can reduce intentional or accidental errors made by security administrators.
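The reasoning burden described above is essentially a graph-reachability question over the building's spatial topology. As a minimal sketch (not the paper's implementation; the room names and adjacency structure are invented for illustration), a BIM-derived room adjacency graph lets an administrator check automatically whether a proposed access grant opens a path into a sensitive area:

```python
from collections import deque

def reachable(adjacency, start, accessible):
    """Breadth-first search over rooms, moving only through rooms the
    staff member is permitted to enter."""
    seen = {start}
    queue = deque([start])
    while queue:
        room = queue.popleft()
        for nxt in adjacency.get(room, []):
            if nxt in accessible and nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Toy floor plan (hypothetical room names): the corridor connects the
# lobby, a lab, and a sensitive server room.
adjacency = {
    "lobby": ["corridor"],
    "corridor": ["lobby", "lab", "server_room"],
    "lab": ["corridor"],
    "server_room": ["corridor"],
}

# Granting corridor + server room access makes the sensitive area reachable
# from the lobby; an automated check can flag this before the administrator
# commits the change.
granted = {"lobby", "corridor", "server_room"}
exposed = "server_room" in reachable(adjacency, "lobby", granted)
```

A real system would derive `adjacency` from the BIM geometry (doors, walls, floors) rather than hand-coding it, which is precisely the capability the paper argues for.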

Abstract:

Recent efforts in mission planning for underwater vehicles have utilised predictive models to aid navigation, optimise path planning and drive opportunistic sampling. Although these models provide information at unprecedented resolutions and have proven to increase accuracy and effectiveness in multiple campaigns, most are deterministic in nature. Thus, predictions cannot be incorporated into probabilistic planning frameworks, nor do they provide any metric on the variance or confidence of the output variables. In this paper, we provide an initial investigation into determining the confidence of ocean model predictions based on the results of multiple field deployments of two autonomous underwater vehicles. For multiple missions conducted over a two-month period in 2011, we compare actual vehicle executions to simulations of the same missions through the Regional Ocean Modeling System in an ocean region off the coast of southern California. This comparison provides a qualitative analysis of the current velocity predictions for areas within the selected deployment region. Ultimately, we present a spatial heat-map of the correlation between the ocean model predictions and the actual mission executions. Knowing where the model provides unreliable predictions can be incorporated into planners to increase the utility and application of the deterministic estimations.
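The heat-map construction can be sketched as a per-cell correlation between predicted and observed current velocities. The grid size, number of time steps and the synthetic data below are stand-ins for illustration, not the ROMS predictions or AUV measurements from the actual deployments:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in: predicted and observed current speeds for a 4x4 grid
# of ocean cells over 30 mission time steps; the "observations" mostly
# follow the model plus noise.
pred = rng.normal(size=(4, 4, 30))
obs = 0.8 * pred + 0.2 * rng.normal(size=(4, 4, 30))

def correlation_map(pred, obs):
    """Per-cell Pearson correlation between the predicted and observed
    series -> the spatial 'heat map' of model reliability."""
    p = pred - pred.mean(axis=-1, keepdims=True)
    o = obs - obs.mean(axis=-1, keepdims=True)
    return (p * o).sum(-1) / np.sqrt((p ** 2).sum(-1) * (o ** 2).sum(-1))

heat = correlation_map(pred, obs)
unreliable = heat < 0.5  # cells a planner should down-weight
```

A planner can then treat low-correlation cells as regions where the deterministic prediction should not be trusted, which is the use the paper proposes.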

Abstract:

Here we present a sequential Monte Carlo approach to Bayesian sequential design that incorporates model uncertainty. The methodology is demonstrated through the development and implementation of two model discrimination utilities, mutual information and total separation, but it can also be applied more generally if one has different experimental aims. A sequential Monte Carlo algorithm is run for each rival model (in parallel), and provides a convenient estimate of the marginal likelihood of each model given the data, which can be used for model comparison and in the evaluation of utility functions. A major benefit of this approach is that it requires very little problem-specific tuning and is also computationally efficient when compared to full Markov chain Monte Carlo approaches. This research is motivated by applications in drug development and chemical engineering.
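The evidence estimate at the heart of the method can be sketched as follows: each observation is assimilated in turn, and the log of the mean incremental weight accumulates into the log marginal likelihood. This toy version (a one-parameter normal model with two invented rival models, not the paper's drug-development application) shows the model-comparison use:

```python
import numpy as np

rng = np.random.default_rng(1)

def smc_evidence(data, prior_sample, loglik, n=2000):
    """Assimilate observations one at a time: weight particles by the next
    likelihood term, add the log of the mean incremental weight to the
    running log-evidence, then resample (a basic data-tempered SMC)."""
    theta = prior_sample(n)
    log_z = 0.0
    for y in data:
        logw = loglik(y, theta)
        m = logw.max()
        w = np.exp(logw - m)
        log_z += m + np.log(w.mean())
        theta = theta[rng.choice(n, size=n, p=w / w.sum())]  # resample
    return log_z

# Two rival models for the same data: M1 pins the mean at 0, M2 gives it
# a N(0, 2^2) prior. The data are drawn with true mean 1, so M2 should win.
data = rng.normal(1.0, 1.0, size=20)
loglik = lambda y, th: -0.5 * np.log(2 * np.pi) - 0.5 * (y - th) ** 2
log_z1 = smc_evidence(data, lambda k: np.zeros(k), loglik)
log_z2 = smc_evidence(data, lambda k: rng.normal(0.0, 2.0, k), loglik)
posterior_prob_m2 = 1.0 / (1.0 + np.exp(log_z1 - log_z2))  # equal priors
```

The per-model evidences drop out of the SMC runs with no extra computation, which is the convenience the abstract highlights.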

Abstract:

Australian higher education institutions (HEIs) have entered a new phase of regulation and accreditation which includes performance-based funding relating to the participation and retention of students from social and cultural groups previously underrepresented in higher education. However, in addressing these priorities, it is critical that HEIs do not further disadvantage students from certain groups by identifying them for attention because of their social or cultural backgrounds, circumstances that are largely beyond their control. In response, many HEIs are focusing effort on university-wide approaches to enhancing the student experience, because such approaches will enhance the engagement, success and retention of all students and, in doing so, particularly benefit those students who come from underrepresented groups. Measuring and benchmarking the student experiences and engagement that arise from these efforts is well supported by extensive collections of student experience survey data. However, no comparable instrument exists that measures the capability of institutions to influence and/or enhance student experiences, where capability is an indication of how well an organisational process does what it is designed to do (Rosemann & de Bruin, 2005). This paper proposes that the concept of a maturity model (Marshall, 2010; Paulk, 1999) may be useful as a way of assessing the capability of HEIs to provide and implement student engagement, success and retention activities. We will describe the Student Engagement, Success and Retention Maturity Model (SESR-MM) (Clarke, Nelson & Stoodley, 2012; Nelson, Clarke & Stoodley, 2012) we are currently investigating.
We will discuss how our research may address the current gap by facilitating the development of an SESR-MM instrument that aims (i) to enable institutions to assess the capability of their current student engagement and retention programs and strategies to influence and respond to student experiences within the institution; and (ii) to provide institutions with the opportunity to understand various practices across the sector with a view to further improving programs and practices relevant to their context. The first aim of our research is to extend the generational approach which has been useful in considering the evolutionary nature of the first year experience (FYE) (Wilson, 2009). Three generations have been identified and explored: first generation approaches that focus on co-curricular strategies (e.g. orientation and peer programs); second generation approaches that focus on curriculum (e.g. pedagogy, curriculum design, and learning and teaching practice); and third generation approaches, also referred to as transition pedagogy, that focus on an institution-wide, integrated, holistic and intentional blend of curricular and co-curricular activities (Kift, Nelson & Clarke, 2010). The second aim of this research is to move beyond assessments of students' experiences to focus on assessing institutional processes and their capability to influence student engagement. In essence, we propose to develop and use the maturity model concept to produce an instrument that will indicate the capability of HEIs to manage and improve student engagement, success and retention programs and strategies.

References

Australian Council for Educational Research. (n.d.). Australasian Survey of Student Engagement. Retrieved from http://www.acer.edu.au/research/ausse/background

Clarke, J., Nelson, K., & Stoodley, I. (2012, July). The Maturity Model concept as framework for assessing the capability of higher education institutions to address student engagement, success and retention: New horizon or false dawn? A Nuts & Bolts presentation at the 15th International Conference on the First Year in Higher Education, "New Horizons," Brisbane, Australia.

Department of Education, Employment and Workplace Relations. (n.d.). The University Experience Survey. Advancing quality in higher education information sheet. Retrieved from http://www.deewr.gov.au/HigherEducation/Policy/Documents/University_Experience_Survey.pdf

Kift, S., Nelson, K., & Clarke, J. (2010). Transition pedagogy - a third generation approach to FYE: A case study of policy and practice for the higher education sector. The International Journal of the First Year in Higher Education, 1(1), 1-20.

Marshall, S. (2010). A quality framework for continuous improvement of e-Learning: The e-Learning Maturity Model. Journal of Distance Education, 24(1), 143-166.

Nelson, K., Clarke, J., & Stoodley, I. (2012). An exploration of the Maturity Model concept as a vehicle for higher education institutions to assess their capability to address student engagement. A work in progress. Submitted for publication.

Paulk, M. (1999). Using the Software CMM with good judgment. ASQ Software Quality Professional, 1(3), 19-29.

Wilson, K. (2009, June-July). The impact of institutional, programmatic and personal interventions on an effective and sustainable first-year student experience. Keynote address presented at the 12th Pacific Rim First Year in Higher Education Conference, "Preparing for Tomorrow Today: The First Year as Foundation," Townsville, Australia. Retrieved from http://www.fyhe.com.au/past_papers/papers09/ppts/Keithia_Wilson_paper.pdf

Abstract:

In this paper we present a methodology for designing experiments for efficiently estimating the parameters of models with computationally intractable likelihoods. The approach combines a commonly used methodology for robust experimental design, based on Markov chain Monte Carlo sampling, with approximate Bayesian computation (ABC) to ensure that no likelihood evaluations are required. The utility function considered for precise parameter estimation is based upon the precision of the ABC posterior distribution, which we form efficiently via the ABC rejection algorithm based on pre-computed model simulations. Our focus is on stochastic models and, in particular, we investigate the methodology for Markov process models of epidemics and macroparasite population evolution. The macroparasite example involves a multivariate process and we assess the loss of information from not observing all variables.
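The ABC rejection step that forms the posterior can be sketched independently of the epidemic and macroparasite models: draw parameters from the prior, simulate summary statistics from pre-computed model runs, and keep the draws whose statistics fall closest to the observed one. The Poisson toy model below is an invented stand-in (its likelihood is actually tractable, which is what lets us sanity-check the output):

```python
import numpy as np

rng = np.random.default_rng(2)

def abc_rejection(observed_stat, simulate, prior_sample, n_sim=50_000, q=0.01):
    """Likelihood-free posterior: keep the prior draws whose simulated
    summary statistic lands within the q-quantile distance of the data."""
    theta = prior_sample(n_sim)
    dist = np.abs(simulate(theta) - observed_stat)
    return theta[dist <= np.quantile(dist, q)]

# Toy model: infer a Poisson rate from the mean of 30 counts.
true_rate = 4.0
observed = rng.poisson(true_rate, 30).mean()
simulate = lambda th: rng.poisson(th[:, None], (len(th), 30)).mean(axis=1)
posterior = abc_rejection(observed, simulate, lambda k: rng.uniform(0, 10, k))
```

The accepted draws approximate the posterior; their spread is the precision measure the utility function in the paper is built on.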

Abstract:

Recently, ‘business model’ and ‘business model innovation’ have gained substantial attention in management literature and practice. However, many firms lack the capability to develop a novel business model to capture the value from new technologies. Existing literature on business model innovation highlights the central role of ‘customer value’. Further, it suggests that firms need to experiment with different business models and engage in ‘trial-and-error’ learning when participating in business model innovation. Trial-and-error processes and prototyping with tangible artifacts are fundamental characteristics of design. This conceptual paper explores the role of design-led innovation in helping firms conceive and prototype novel and meaningful business models. It provides a brief review of the conceptual discussion on business model innovation and highlights opportunities for linking it with the research stream of design-led innovation. We propose design-led business model innovation as a future research area and highlight the role that design-led prototyping and new types of artifacts and prototypes play within it. We present six propositions to outline future research avenues.

Abstract:

The identification of the primary drivers of stock returns has been of great interest to financial practitioners and academics alike for many decades. Influenced by classical financial theories such as the CAPM (Sharpe, 1964; Lintner, 1965) and APT (Ross, 1976), a linear relationship is conventionally assumed between company characteristics, as derived from their financial accounts, and forward returns. Whilst this assumption may be a fair approximation to the underlying structural relationship, it is often adopted for the purpose of convenience. It is actually quite rare that the assumptions of distributional normality and a linear relationship are explicitly assessed in advance, even though this information would help to inform the appropriate choice of modelling technique. Non-linear models have nevertheless been applied successfully to the task of stock selection in the past (Sorensen et al., 2000). However, their take-up by the investment community has been limited, despite the fact that researchers in other fields have found them to be a useful way to express knowledge and aid decision-making...
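The conventional linear assumption can be made concrete with a small cross-sectional regression. The two characteristics, coefficients and noise level below are synthetic illustrations, not results from the paper:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic cross-section: 500 stocks, two accounting-derived characteristics
# (think book-to-market and earnings yield), linearly driving forward returns.
n_stocks = 500
X = rng.normal(size=(n_stocks, 2))
beta_true = np.array([0.02, -0.01])
fwd_returns = X @ beta_true + 0.05 * rng.normal(size=n_stocks)

# The classical CAPM/APT-style assumption: fit a linear model by least
# squares, then keep the residuals so the linearity/normality assumptions
# can actually be inspected rather than taken on faith.
design = np.column_stack([np.ones(n_stocks), X])
beta_hat, *_ = np.linalg.lstsq(design, fwd_returns, rcond=None)
residuals = fwd_returns - design @ beta_hat
```

In practice one would examine `residuals` for normality and structure (e.g. a Q-Q plot) before preferring the linear form over a non-linear alternative, which is exactly the assessment the paper notes is rarely performed.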

Abstract:

Animal models typically require a known genetic pedigree to estimate quantitative genetic parameters. Here we test whether animal models can alternatively be based on estimates of relatedness derived entirely from molecular marker data. Our case study is the morphology of a wild bird population, for which we report estimates of the genetic variance-covariance matrices (G) of six morphological traits using three methods: the traditional animal model; a molecular marker-based approach to estimating heritability based on Ritland's pairwise regression method; and a new approach using a molecular genealogy arranged in a relatedness matrix (R) to replace the pedigree in an animal model. Using the traditional animal model, we found significant genetic variance for all six traits and positive genetic covariance among traits. The pairwise regression method did not return reliable estimates of quantitative genetic parameters in this population, with estimates of genetic variance and covariance typically being very small or negative. In contrast, we found mixed evidence for the use of the pedigree-free animal model. Similar to the pairwise regression method, the pedigree-free approach performed poorly when the full-rank R matrix based on the molecular genealogy was employed. However, performance improved substantially when we reduced the dimensionality of the R matrix in order to maximize the signal-to-noise ratio. Reduced-rank R matrices generated estimates of genetic variance that were much closer to those from the traditional model. Nevertheless, this method was less reliable at estimating covariances, which were often estimated to be negative. Taken together, these results suggest that pedigree-free animal models can recover quantitative genetic information, although the signal remains relatively weak.
It remains to be determined whether this problem can be overcome by the use of a more powerful battery of molecular markers and improved methods for reconstructing genealogies.
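The dimensionality-reduction step can be sketched with a standard eigendecomposition truncation of the estimated R matrix. The block-structured "families" and noise level below are invented for illustration, and the authors' actual procedure for choosing the rank may differ:

```python
import numpy as np

rng = np.random.default_rng(4)

def reduced_rank(R, k):
    """Keep the k leading eigenpairs of a relatedness matrix, discarding
    the low-variance dimensions that are dominated by marker noise."""
    vals, vecs = np.linalg.eigh(R)   # eigenvalues in ascending order
    top = vecs[:, -k:]
    return (top * vals[-k:]) @ top.T

# Noisy marker-based estimate of relatedness for two families of 10:
# 0.25 relatedness within families plus each individual's own diagonal term.
true_R = np.kron(np.eye(2), np.full((10, 10), 0.25)) + 0.75 * np.eye(20)
noise = rng.normal(scale=0.1, size=(20, 20))
R_hat = true_R + (noise + noise.T) / 2   # symmetric sampling noise
R_low = reduced_rank(R_hat, k=3)         # low-rank surrogate for the pedigree
```

The truncated `R_low` retains the dominant family structure while discarding most of the per-pair estimation noise, the signal-to-noise trade-off the abstract describes.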

Abstract:

3D models of long bones are being utilised in a number of fields, including orthopaedic implant design. Accurate reconstruction of 3D models is of utmost importance for designing implants that achieve a good alignment between two bone fragments. For this purpose, CT scanners are employed to acquire accurate bone data, exposing the individual to a high dose of ionising radiation. Magnetic resonance imaging (MRI) has been shown to be a potential alternative to computed tomography (CT) for scanning volunteers for 3D reconstruction of long bones, essentially avoiding the high radiation dose from CT. In MRI of long bones, artefacts due to random movements of the skeletal system create challenges for researchers, as they generate inaccuracies in 3D models reconstructed from data sets containing such artefacts. One defect observed during an initial study is the lateral shift artefact in the reconstructed 3D models. This artefact is believed to result from volunteers moving the leg between two successive scanning stages (the lower limb has to be scanned in at least five stages due to the limited scanning length of the scanner). As this artefact creates inaccuracies in implants designed using these models, it needs to be corrected before the 3D models are applied to implant design. Therefore, this study aimed to correct the lateral shift artefact using 3D modelling techniques. The femora of five ovine hind limbs were scanned with a 3T MRI scanner using a 3D VIBE-based protocol. The scanning was conducted in two halves, while maintaining a good overlap between them. A lateral shift was generated by moving the limb several millimetres between the two scanning stages. The 3D models were reconstructed using a multi-threshold segmentation method.
The correction of the artefact was achieved by aligning the two halves using the robust iterative closest point (ICP) algorithm, with the help of the overlapping region between the two. The models with the corrected artefact were compared with the reference model generated by CT scanning of the same sample. The results indicate that the correction of the artefact was achieved with an average deviation of 0.32 ± 0.02 mm between the corrected model and the reference model. In comparison, the model obtained from a single MRI scan generated an average error of 0.25 ± 0.02 mm when compared with the reference model. An average deviation of 0.34 ± 0.04 mm was seen when the models generated after the table was moved were compared to the reference models; thus, the movement of the table is also a contributing factor to the motion artefacts.
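The alignment step can be illustrated with a stripped-down ICP over the overlapping region. This sketch estimates only the translation component (the study used a full rigid-body ICP), and the point cloud and 4 mm shift are simulated stand-ins:

```python
import numpy as np

rng = np.random.default_rng(5)

def icp_translation(moving, fixed, iters=20):
    """Minimal translation-only ICP: repeatedly match each moving point to
    its nearest fixed point, then shift by the mean residual."""
    shift = np.zeros(moving.shape[1])
    for _ in range(iters):
        moved = moving + shift
        # Pairwise squared distances -> nearest fixed point per moved point
        d = ((moved[:, None, :] - fixed[None, :, :]) ** 2).sum(-1)
        nearest = fixed[d.argmin(axis=1)]
        shift += (nearest - moved).mean(axis=0)
    return shift

# Overlapping region of the two scan halves (mm), with a simulated lateral
# shift: the second half sees the same points in a frame moved by ~4 mm.
overlap = rng.uniform(0, 50, size=(200, 3))
lateral_shift = np.array([4.0, 0.0, 0.0])
second_half = overlap - lateral_shift
est = icp_translation(second_half, overlap)   # recovers ~lateral_shift
```

Applying the estimated transform to the whole second half re-registers the two models, which is how the overlap region drives the correction described above.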

Abstract:

Traffic safety studies demand more than current micro-simulation models can provide, as these models presume that all drivers of motor vehicles exhibit safe behaviours. Several car-following models are used in various micro-simulation models. This research compares the mainstream car-following models’ capabilities in emulating precise driver behaviour parameters such as headway and time to collision (TTC). The comparison first illustrates which model is more robust in reproducing these metrics. Secondly, the study conducted a series of sensitivity tests to further explore the behaviour of each model. Based on the outcomes of these two exploration steps, a modified structure and parameter adjustment for each car-following model is proposed to simulate more realistic vehicle movements, particularly headways and TTC below a certain critical threshold. NGSIM vehicle trajectory data are used to evaluate the modified models’ performance in assessing critical safety events within traffic flow. The simulation test outcomes indicate that the proposed modified models reproduce the frequency of critical TTC better than the generic models, while the improvement in headway is not significant. The outcome of this paper facilitates traffic safety assessment using microscopic simulation.
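The two safety metrics compared above are simple functions of the trajectory data. A minimal sketch (the 1.5 s critical threshold is illustrative, not the paper's calibrated value):

```python
def time_to_collision(gap_m, v_follower, v_leader):
    """Seconds until the follower closes the gap if both vehicles hold
    their current speeds; infinite when the gap is not closing."""
    closing_speed = v_follower - v_leader
    return gap_m / closing_speed if closing_speed > 0 else float("inf")

def critical_ttc_events(samples, threshold_s=1.5):
    """Count trajectory samples whose TTC falls below a critical threshold,
    i.e. the safety-event frequency compared across car-following models."""
    return sum(1 for gap, vf, vl in samples
               if time_to_collision(gap, vf, vl) < threshold_s)

# Follower at 20 m/s, leader at 15 m/s, 25 m apart: 25 / 5 = 5 s TTC.
ttc = time_to_collision(25.0, 20.0, 15.0)
# One of these two samples (gap 5 m, TTC 1 s) is below the 1.5 s threshold.
events = critical_ttc_events([(5.0, 20.0, 15.0), (50.0, 20.0, 15.0)])
```

Comparing `critical_ttc_events` counts between simulated and NGSIM-observed trajectories is the kind of frequency check the evaluation above performs.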

Abstract:

Cognitive radio is an emerging technology proposing dynamic spectrum access as a solution to the looming problem of spectrum scarcity caused by the growth in wireless communication systems. Under the proposed concept, non-licensed, secondary users (SU) can access spectrum owned by licensed, primary users (PU) so long as interference to the PU is kept minimal. Spectrum sensing is a crucial task in cognitive radio whereby the SU senses the spectrum to detect the presence or absence of any PU signal. Conventional spectrum sensing assumes the PU signal is ‘stationary’, remaining in the same activity state throughout the sensing cycle, while an emerging trend models the PU as ‘non-stationary’, undergoing state changes. Existing studies have focused on the non-stationary PU during the transmission period; however, very little research has considered the impact on spectrum sensing when the PU is non-stationary during the sensing period. The concept of PU duty cycle is developed as a tool to analyse the performance of spectrum sensing detectors when detecting non-stationary PU signals. New detectors are also proposed to optimise detection with respect to the duty cycle exhibited by the PU. This research consists of two major investigations. The first stage investigates the impact of duty cycle on the performance of existing detectors and the extent of the problem in existing studies. The second stage develops new detection models and frameworks to ensure the integrity of spectrum sensing when detecting non-stationary PU signals. The first investigation demonstrates that the conventional signal model formulated for a stationary PU does not accurately reflect the behaviour of a non-stationary PU. Therefore the performance calculated and assumed to be achievable by the conventional detector does not reflect the actual performance achieved.
Through analysing the statistical properties of duty cycle, performance degradation is shown to be a problem that cannot be neglected in existing sensing studies when the PU is modelled as non-stationary. The second investigation presents detectors that are aware of the duty cycle exhibited by a non-stationary PU. A two-stage detection model is proposed to improve detection performance and robustness to changes in duty cycle; this detector is most suitable for applications that require long sensing periods. A second detector, the duty-cycle-based energy detector, is formulated by integrating the distribution of duty cycle into the test statistic of the energy detector and is suitable for short sensing periods. The decision threshold is optimised with respect to the traffic model of the PU, hence the proposed detector can calculate average detection performance that reflects realistic results. A detection framework for the application of spectrum sensing optimisation is proposed to provide clear guidance on the constraints on the sensing and detection model. Following this framework ensures that the signal model accurately reflects practical behaviour and that the detection model implemented is suitable for the desired detection assumption. Based on this framework, a spectrum sensing optimisation algorithm is further developed to maximise sensing efficiency for a non-stationary PU. New optimisation constraints are derived to account for PU state changes within the sensing cycle while implementing the proposed duty-cycle-based detector.
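The degradation identified in the first investigation can be reproduced with the classic energy detector. All powers, the 40% duty cycle and the threshold below are illustrative numbers, not parameters from the thesis:

```python
import numpy as np

rng = np.random.default_rng(6)

def energy_detect(samples, threshold):
    """Classic energy detector: declare the PU present when the average
    sample energy over the sensing window exceeds the threshold."""
    return float(np.mean(np.abs(samples) ** 2)) > threshold

n = 1000
noise = rng.normal(0.0, 1.0, n)                      # H0: noise only (power 1)
pu_signal = np.sqrt(2.0) * rng.normal(0.0, 1.0, n)   # PU signal (power 2)

# Non-stationary PU: active for only the first 40% of the sensing window,
# so the average energy is ~1 + 0.4*2 = 1.8 rather than the ~3 a detector
# designed for a stationary (always-on) PU would expect -- the margin over
# the threshold, and hence the achieved detection probability, shrinks.
duty_cycle = 0.4
occupied = noise + np.where(np.arange(n) < duty_cycle * n, pu_signal, 0.0)

threshold = 1.3
false_alarm = energy_detect(noise, threshold)     # noise-only window
detection = energy_detect(occupied, threshold)    # partially occupied window
```

A duty-cycle-aware detector would instead set the threshold against the ~1.8 partial-occupancy statistic, which is the idea behind the duty-cycle-based energy detector above.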

Abstract:

Background: Malaria is a major public health burden in the tropics with the potential to significantly increase in response to climate change. Analyses of data from the recent past can elucidate how short-term variations in weather factors affect malaria transmission. This study explored the impact of climate variability on the transmission of malaria in the tropical rain forest area of Mengla County, south-west China. Methods: Ecological time-series analysis was performed on data collected between 1971 and 1999. Auto-regressive integrated moving average (ARIMA) models were used to evaluate the relationship between weather factors and malaria incidence. Results: At the time scale of months, the predictors for malaria incidence included: minimum temperature, maximum temperature, and fog day frequency. The effect of minimum temperature on malaria incidence was greater in the cool months than in the hot months. The fog day frequency in October had a positive effect on malaria incidence in May of the following year. At the time scale of years, the annual fog day frequency was the only weather predictor of the annual incidence of malaria. Conclusion: Fog day frequency was for the first time found to be a predictor of malaria incidence in a rain forest area. The one-year delayed effect of fog on malaria transmission may involve providing water input and maintaining aquatic breeding sites for mosquitoes in vulnerable times when there is little rainfall in the 6-month dry seasons. These findings should be considered in the prediction of future patterns of malaria for similar tropical rain forest areas worldwide.
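The ARIMA-with-covariates structure can be sketched with a simplified least-squares stand-in: a plain AR(1) regression with one lagged weather covariate. The real analysis fitted full ARIMA models with several predictors, and every number below is synthetic:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic stand-in for the monthly series: incidence driven by an AR(1)
# carry-over plus last month's minimum temperature (one weather term of the
# paper's models, which also used maximum temperature and fog frequency).
n = 360  # 30 years of months
min_temp = 10 + 8 * np.sin(2 * np.pi * np.arange(n) / 12) + rng.normal(0, 1, n)
inc = np.zeros(n)
for t in range(1, n):
    inc[t] = 0.6 * inc[t - 1] + 0.3 * min_temp[t - 1] + rng.normal(0, 1)

# Fit incidence_t ~ incidence_{t-1} + min_temp_{t-1} by least squares,
# a simplified ARIMAX(1,0,0)-style regression.
X = np.column_stack([np.ones(n - 1), inc[:-1], min_temp[:-1]])
coef, *_ = np.linalg.lstsq(X, inc[1:], rcond=None)
```

The fitted lag coefficients play the role of the ARIMA terms; a significant positive weather coefficient is the kind of evidence behind the minimum-temperature and fog-frequency findings reported above.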

Abstract:

In various industrial and scientific fields, conceptual models are derived from real-world problem spaces to understand and communicate the entities and coherencies they contain. Abstracted models mirror the common understanding and information demand of engineers, who apply conceptual models in performing their daily tasks. However, most standardized models in Process Management, Product Lifecycle Management and Enterprise Resource Planning lack a scientific foundation for their notation. In collaboration scenarios with stakeholders from several disciplines, tailored conceptual models complicate communication processes, as a common understanding is not shared or implemented in the specific models. To support direct communication between experts from several disciplines, a visual language is developed which allows a common visualization of discipline-specific conceptual models. For visual discrimination and to overcome visual complexity issues, conceptual models are arranged in a three-dimensional space. The visual language introduced here follows and extends established principles of Visual Language science.

Abstract:

Dengue fever is one of the world’s most important vector-borne diseases. The transmission area of this disease continues to expand due to many factors, including urban sprawl, increased travel and global warming. Current preventative techniques are based primarily on controlling mosquito vectors, as other prophylactic measures, such as a tetravalent vaccine, are unlikely to be available in the foreseeable future. However, the continually increasing dengue incidence suggests that this strategy alone is not sufficient. Epidemiological models attempt to predict future outbreaks using information on the risk factors of the disease. Through a systematic literature review, this paper aims to analyze the different modeling methods and their outputs in terms of accurately predicting disease outbreaks. We found that many previous studies have not sufficiently accounted for the spatio-temporal features of the disease in the modeling process. Yet, with advances in technology, the ability to incorporate such information, as well as socio-environmental aspects, has allowed models to serve as early warning systems, albeit limited geographically to a local scale.