623 results for Averaged models
Abstract:
Animal models typically require a known genetic pedigree to estimate quantitative genetic parameters. Here we test whether animal models can instead be based on estimates of relatedness derived entirely from molecular marker data. Our case study is the morphology of a wild bird population, for which we report estimates of the genetic variance-covariance matrices (G) of six morphological traits using three methods: the traditional animal model; a molecular marker-based approach to estimating heritability based on Ritland's pairwise regression method; and a new approach using a molecular genealogy arranged in a relatedness matrix (R) to replace the pedigree in an animal model. Using the traditional animal model, we found significant genetic variance for all six traits and positive genetic covariance among traits. The pairwise regression method did not return reliable estimates of quantitative genetic parameters in this population, with estimates of genetic variance and covariance typically being very small or negative. In contrast, we found mixed evidence for the use of the pedigree-free animal model. Like the pairwise regression method, the pedigree-free approach performed poorly when the full-rank R matrix based on the molecular genealogy was employed. However, performance improved substantially when we reduced the dimensionality of the R matrix in order to maximize the signal-to-noise ratio. Reduced-rank R matrices generated estimates of genetic variance much closer to those from the traditional model. Nevertheless, this method was less reliable at estimating covariances, which were often estimated to be negative. Taken together, these results suggest that pedigree-free animal models can recover quantitative genetic information, although the signal remains relatively weak. It remains to be determined whether this problem can be overcome by the use of a more powerful battery of molecular markers and improved methods for reconstructing genealogies.
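To illustrate the rank-reduction step, a minimal sketch follows, assuming a noisy marker-based relatedness matrix R: it keeps only the k leading eigenpairs, one simple way to discard noise-dominated dimensions. The matrix, family structure and choice of k are hypothetical, not the study's data.

```python
import numpy as np

def reduced_rank_relatedness(R, k):
    """Rank-k approximation of a symmetric relatedness matrix R,
    keeping the k eigenpairs with the largest eigenvalues."""
    eigvals, eigvecs = np.linalg.eigh(R)        # ascending eigenvalues
    idx = np.argsort(eigvals)[::-1][:k]         # indices of the k largest
    V = eigvecs[:, idx]
    return V @ np.diag(eigvals[idx]) @ V.T

# Toy example: a noisy estimate of a two-family relatedness structure
rng = np.random.default_rng(0)
true_R = np.kron(np.eye(2), np.full((5, 5), 0.5)) + 0.5 * np.eye(10)
noise = 0.05 * rng.standard_normal((10, 10))
noisy_R = true_R + (noise + noise.T) / 2        # keep the matrix symmetric
R_low = reduced_rank_relatedness(noisy_R, k=3)
print(np.round(R_low[:5, :5], 2))
```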
Abstract:
Traffic safety studies demand more than current micro-simulation models can provide, as those models presume that all drivers of motor vehicles behave safely. Several car-following models are used in various micro-simulation packages. This research compares the capability of mainstream car-following models to emulate precise driver behaviour metrics such as headways and Times to Collision (TTC). The comparison first establishes which model is more robust at reproducing these metrics. Second, a series of sensitivity tests further explores the behaviour of each model. Based on the outcomes of this two-step exploration, a modified structure and parameter adjustments are proposed for each car-following model so as to simulate more realistic vehicle movements, particularly headways and TTCs below a critical threshold. NGSIM vehicle trajectory data are used to evaluate the performance of the modified models in assessing critical safety events within traffic flow. The simulation results indicate that the proposed modified models reproduce the frequency of critical TTCs better than the generic models, while the improvement in headway reproduction is not significant. The outcome of this paper facilitates traffic safety assessment using microscopic simulation.
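As a side illustration of the two safety metrics the comparison rests on, a minimal sketch follows using their standard definitions (time headway = gap / follower speed; TTC = gap / closing speed while the follower is closing in). The trajectory values and the 3 s threshold are hypothetical, not NGSIM data.

```python
import math

def time_headway(gap_m, v_follower):
    """Time headway (s): bumper-to-bumper gap divided by follower speed."""
    return gap_m / v_follower if v_follower > 0 else math.inf

def time_to_collision(gap_m, v_follower, v_leader):
    """TTC (s): gap divided by closing speed; infinite when not closing."""
    closing = v_follower - v_leader
    return gap_m / closing if closing > 0 else math.inf

# Hypothetical trajectory samples: (gap in m, follower m/s, leader m/s)
for gap, vf, vl in [(20.0, 15.0, 12.0), (8.0, 16.0, 12.0)]:
    ttc = time_to_collision(gap, vf, vl)
    flag = "critical" if ttc < 3.0 else "safe"  # 3 s: a common TTC threshold
    print(f"headway={time_headway(gap, vf):.2f}s  TTC={ttc:.2f}s  -> {flag}")
```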
Abstract:
In various industrial and scientific fields, conceptual models are derived from real-world problem spaces in order to understand and communicate the entities and relationships they contain. Abstracted models mirror the common understanding and information demands of the engineers who apply conceptual models in their daily tasks. However, most standardized models in Process Management, Product Lifecycle Management and Enterprise Resource Planning lack a scientific foundation for their notation. In collaboration scenarios with stakeholders from several disciplines, tailored conceptual models complicate communication, as a common understanding is neither shared nor implemented in the specific models. To support direct communication between experts from several disciplines, a visual language is developed that allows a common visualization of discipline-specific conceptual models. For visual discrimination and to overcome visual complexity issues, the conceptual models are arranged in a three-dimensional space. The visual language introduced here follows and extends established principles of Visual Language science.
Abstract:
Dengue fever is one of the world’s most important vector-borne diseases. The transmission area of the disease continues to expand due to many factors, including urban sprawl, increased travel and global warming. Current preventative techniques are primarily based on controlling mosquito vectors, as other prophylactic measures, such as a tetravalent vaccine, are unlikely to be available in the foreseeable future. However, the continually increasing dengue incidence suggests that this strategy alone is not sufficient. Epidemiological models attempt to predict future outbreaks using information on the risk factors of the disease. Through a systematic literature review, this paper analyzes the different modeling methods and their outputs in terms of accurately predicting disease outbreaks. We found that many previous studies have not sufficiently accounted for the spatio-temporal features of the disease in the modeling process. Yet with advances in technology, models that incorporate such information, together with socio-environmental factors, have proven usable as early warning systems, albeit geographically limited to a local scale.
Abstract:
Time series regression models were used to examine the influence of environmental factors (soil water content and soil temperature) on emissions of nitrous oxide (N2O) from subtropical soils, taking into account temporally lagged environmental factors, autoregressive processes, and seasonality for three horticultural crops in a subtropical region of Australia. Fluxes of N2O, soil water content, and soil temperature were determined simultaneously on a weekly basis over a 12-month period in South East Queensland. Annual N2O emissions from soils under mango, pineapple, and custard apple were 1590, 1156, and 2038 g N2O-N/ha, respectively, with most emissions attributed to nitrification. The N2O-N emitted from the pineapple and custard apple crops was equivalent to 0.26 and 2.22%, respectively, of the applied mineral N. The change in soil water content was the key variable for describing N2O emissions at the weekly time-scale, while soil temperature at a lag of 1 month had a significant influence on average N2O emissions at the monthly time-scale across the three crops. After accounting for soil temperature and soil water content, both the weekly and monthly time series regression models exhibited significant autocorrelation at lags of 1–2 weeks and 1–2 months, respectively, and significant seasonality in weekly N2O emissions for the mango crop and in monthly N2O emissions for the mango and custard apple crops at this location over this time-frame. Time series regression models can explain a higher percentage of the temporal variation in N2O emissions than simple regression models using soil temperature and soil water content as drivers. Taking into account seasonal variability and temporal persistence in N2O emissions associated with soil water content and soil temperature may reduce the uncertainty surrounding estimates of N2O emissions based on limited sampling effort.
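A minimal sketch of the modeling idea follows, on synthetic weekly data: N2O emissions are regressed on the weekly change in soil water content, soil temperature lagged by roughly a month, a first-order autoregressive term, and an annual harmonic for seasonality. All values are illustrative, not the study's measurements.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 52                                           # one year of weekly data
t = np.arange(n)
swc = 0.25 + 0.1 * np.sin(2 * np.pi * t / 52) + 0.02 * rng.standard_normal(n)
temp = 20 + 8 * np.sin(2 * np.pi * (t - 8) / 52)

# Synthetic emissions: driven by the weekly change in soil water content,
# temperature lagged 4 weeks (~1 month), and AR(1) persistence
y = np.zeros(n)
for i in range(4, n):
    y[i] = (0.5 * y[i - 1] + 30 * (swc[i] - swc[i - 1])
            + 0.4 * temp[i - 4] + rng.standard_normal())

# Design matrix: intercept, Δ soil water, lagged temperature, lagged y,
# and an annual sine/cosine pair for seasonality
rows = slice(4, n)
X = np.column_stack([
    np.ones(n - 4),
    np.diff(swc, prepend=swc[0])[rows],
    temp[0:n - 4],
    y[3:n - 1],
    np.sin(2 * np.pi * t[rows] / 52),
    np.cos(2 * np.pi * t[rows] / 52),
])
beta, *_ = np.linalg.lstsq(X, y[rows], rcond=None)
print(np.round(beta, 2))   # estimated coefficients, in the order above
```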
Abstract:
The aim of this study was to identify which outcome measures or quality indicators are being used to evaluate advanced and new roles in nine allied health professions, and whether those measures evaluate outcomes of interest to the patient, the clinician, or the healthcare provider. A systematic search strategy was used: medical and allied health databases were searched, relevant articles were extracted, and studies with at least one outcome measure were evaluated. A total of 106 articles describing advanced roles were identified; however, only 23 of these described an outcome measure in sufficient detail to be included for review. The majority of the reported measures fit into the economic and process categories. The most frequently reported patient-related outcome was the satisfaction survey, whereas measures of patient health outcomes were infrequently reported. It is unclear from the studies evaluated whether new models of allied healthcare can be shown to be as safe and effective as traditional care for a given procedure. Outcome measures chosen to evaluate these services often reflect organizational needs rather than patient outcomes. Organizations need to ensure that high-quality performance measures are chosen to evaluate the success of new health service innovations. There needs to be a move away from in-house surveys that add little or no valid evidence of the effect of a new innovation, and more importance needs to be placed on patient outcomes as a measure of the quality of allied health interventions.
Abstract:
Emergency health is a critical component of Australia’s health system and one which is increasingly congested by growing demand and blocked access to inpatient beds. The Emergency Health Services Queensland (EHSQ) study aims to identify the factors driving increased demand for emergency health care and to evaluate strategies that may safely reduce future demand growth. This monograph addresses the characteristics of users of emergency health services, with the aim of identifying those that appear to contribute to demand growth. The study utilises data on patients treated by Emergency Departments (ED) and the Queensland Ambulance Service (QAS) across Queensland. ED data were derived from the Emergency Department Information System (EDIS) for the period 2001-02 through 2010-11. Ambulance data were extracted from the QAS Ambulance Information Management System (AIMS) and the electronic Ambulance Report Form (eARF) for the period 2001-02 through 2009-10. Due to discrepancies and comparability issues in the ED data, this monograph compares data from 2003-04 with 2010-11 data for 21 of the reporting EDs; a snapshot of users in the 2010-11 financial year for 31 reporting EDs is also used to describe user characteristics and to compare them with population demographics. For QAS data, the 2002-03 and 2009-10 periods were selected for detailed analyses to identify trends.
• Demand for emergency health care services is increasing, reflecting both population growth and increased relative utilisation. Per capita demand for ED attention has increased by 2% per annum over the last decade, and for ambulance attention by 3.7% per annum.
• The growth in ED demand is prominent in the more urgent triage categories, with an actual decline in less urgent patients. An estimated 55% of patients attend hospital EDs outside normal working hours. There is no evidence that patients presenting out of hours differ significantly from those presenting within working hours; they have similar triage assessments and outcomes.
• Patients suffering from injuries and poisoning comprise 28% of the ED workload (an increase of 65% over the study period), whilst declines of 32% have been observed in cardiovascular and circulatory conditions and in musculoskeletal problems.
• 25.6% of patients attending EDs are admitted to hospital. 19% of admitted patients, and 7% of patients who die in the ED, are triage category 4 or 5 on arrival.
• The average age of ED patients is 35.6 years. Demand has grown in all age groups and amongst both men and women. Men have higher ED utilisation rates in all age groups; the only group in which the growth rate for women has exceeded that for men is the 20-29 age group, with that growth particularly in the injury and poisoning categories.
• Considerable public attention has been paid to ED performance criteria. It is worth noting that 50% of all patients were treated within 33 minutes of arrival.
• Patients from lower socioeconomic areas appear to have higher utilisation rates, and the utilisation rate for Indigenous people appears to exceed those of people of European and other backgrounds. Utilisation rates for immigrants are generally lower than for the Australian-born, although it has not been possible to eliminate the confounding impact of different age and socioeconomic profiles.
• Demand for ambulance services is also increasing at a rate that exceeds population growth. Utilisation rates have increased by an average of 5% per annum in Queensland compared with 3.6% nationally, and the utilisation rate in Queensland is 27% higher than the national average.
• The growth in ambulance utilisation has also been amongst the more urgent dispatch categories, and utilisation rates are higher in rural and regional areas than in the metropolitan area. Demand for ambulance services increases with age, but growth in demand has been more prominent in younger age groups.
These findings contribute significantly to an understanding of the growth in demand for emergency health care. They show that the growth is amongst patients in genuine need of emergency care, and that the public rhetoric attributing congestion of emergency health services to inappropriate attendees cannot be substantiated. The consistency of the growth in demand over the last decade reflects not only the changing demographics of the Australian population but also changes in health status, standards of acute health care and other social factors. The growth is also amongst patients with acute injury and poisoning, which is inconsistent with chronic disease being a fundamental driver. We have also interviewed patients about their decision-making choices for acute health care and the factors that influence those decisions; this will be the subject of a third monograph and subsequent publications.
Abstract:
Developers and policy makers are consistently at odds in the debate over whether impact fees increase house prices. This debate continues despite an extensive body of theoretical and empirical international literature discussing the passing on of impact fees to home buyers and the corresponding increase in housing prices. In attempting to quantify this impact, over a dozen empirical studies have been carried out in the US and Canada since the 1980s, yet the methodologies used vary greatly, as do the results. Despite similar infrastructure funding policies in numerous developed countries, no such empirical works exist outside the US and Canada. The purpose of this research is to analyse the existing econometric models in order to identify, compare and contrast the theoretical bases, methodologies, key assumptions and findings of each. This research will assist in identifying whether further model development is required and/or whether any of these models have external validity and are readily transferable outside the US. The findings conclude that there is very little explicit rationale behind the various model selections and that significant model deficiencies appear still to exist.
Abstract:
This paper examines the case of a procurement auction for a single project, in which the breakdown of the winning bid into its component items determines the value of payments subsequently made to the bidder as the work progresses. Unbalanced bidding, or bid skewing, involves the uneven distribution of mark-up among the component items in an attempt to derive increased benefit for the unbalancer without any change in the total bid. One form of unbalanced bidding, termed Front Loading (FL), is thought to be widespread in practice: it involves overpricing the work items that occur early in the project and underpricing those that occur later, in order to enhance the bidder's cash flow. Naturally, auctioneers attempt to protect themselves from the effects of unbalancing, typically by reserving the right to reject a bid that has been detected as unbalanced. As a result, models have been developed both to unbalance bids and to detect unbalanced bids, but virtually nothing is known of their use or success. This is of particular concern for the detection methods since, without testing, there is no way of knowing the extent to which unbalanced bids remain undetected or balanced bids are falsely detected as unbalanced. This paper reports on a simulation study aimed at demonstrating the likely effects of unbalanced bid detection models in a deterministic environment involving FL unbalancing in a Texas DOT detection setting, in which bids are deemed unbalanced if an item exceeds a maximum (or fails to reach a minimum) ‘cut-off’ value determined by the Texas method. A proportion of bids are automatically and maximally unbalanced over a long series of simulated contract projects, and the profits and detection rates of the balancers and unbalancers are compared. The results show that, as expected, balanced bids are often incorrectly detected as unbalanced, with the rate of (mis)detection increasing with the proportion of FL bidders in the auction. It is also shown that, while the profit for balanced bidders remains the same irrespective of the number of FL bidders involved, the FL bidder's profit increases with the proportion of FL bidders present in the auction. Sensitivity tests show the results to be generally robust, with (mis)detection rates increasing further when there are fewer bidders in the auction and when more data are averaged to determine the baseline value, but becoming smaller or larger with increased cut-off values and increased cost and estimate variability, depending on the number of FL bidders involved. The FL bidder's expected benefit from unbalancing, on the other hand, increases when there are fewer bidders in the auction; it also increases when the cut-off rate and discount rate are increased, when there is less variability in the costs and their estimates, and when less data are used in setting the baseline values.
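A toy sketch of the mechanism follows, under simplifying assumptions (all bidders share the same item costs; the per-item baseline is the average over all bids, loosely mimicking the averaging step of the Texas method). It illustrates the headline effect: as the proportion of FL bidders grows, the baseline drifts toward the FL shape, so balanced bids start to be flagged while FL bids slip through.

```python
import numpy as np

rng = np.random.default_rng(2)
n_items, markup, shift, cutoff = 10, 0.05, 0.3, 0.15
costs = rng.uniform(80, 120, size=n_items)       # hypothetical item costs

def balanced_bid(costs):
    return costs * (1 + markup)

def front_loaded_bid(costs):
    """Overprice the early half, underprice the late half, same total."""
    bid = balanced_bid(costs)
    half = n_items // 2
    moved = bid[:half].sum() * shift             # total added to early items
    bid[:half] *= (1 + shift)
    bid[half:] -= moved / (n_items - half)       # removed from late items
    return bid

def detected(bid, baseline):
    """Simplified Texas-style rule: flag any item outside the cut-off
    band around the per-item baseline."""
    return bool(np.any(np.abs(bid / baseline - 1) > cutoff))

for n_fl in (1, 4):                              # 1 vs 4 FL bidders out of 5
    bids = [balanced_bid(costs)] * (5 - n_fl) + [front_loaded_bid(costs)] * n_fl
    baseline = np.mean(bids, axis=0)             # baseline = auction average
    print(f"{n_fl} FL bidders -> balanced flagged:",
          detected(balanced_bid(costs), baseline),
          "| FL flagged:", detected(front_loaded_bid(costs), baseline))
```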
Abstract:
A synthesis is presented of the predictive capability of a family of near-wall wall-normal-free Reynolds stress models (which are completely independent of wall topology, i.e., of the distance from the wall and the normal-to-the-wall orientation) for oblique-shock-wave/turbulent-boundary-layer interactions. For the purpose of comparison, results are also presented using a standard low-turbulence-Reynolds-number k–ε closure and a Reynolds stress model that uses geometric wall normals and wall distances. The shock-wave Mach numbers studied are in the range M_SW = 2.85–2.9 and the incoming boundary-layer-thickness Reynolds numbers are in the range Re_δ0 = 1–2×10^6. Computations were carefully checked for grid convergence. Comparison with measurements shows satisfactory agreement, improving on results obtained using a k–ε model, and highlights the relative importance of the redistribution and diffusion closures, indicating directions for future modeling work.
Abstract:
The use of Bayesian methodologies for solving optimal experimental design problems has increased, but many of these methods are computationally intensive for design problems that require a large number of design points. Here, a simulation-based approach is presented that can solve optimal design problems in which one is interested in finding a large number of (near-)optimal design points for a small number of design variables. The approach uses lower-dimensional parameterisations, consisting of a few design variables, that generate multiple design points. One then simply searches over the few design variables rather than over a large number of optimal design points, providing substantial computational savings. The methodologies are demonstrated on four applications involving nonlinear models, including the selection of sampling times for pharmacokinetic and heat transfer studies. Several Bayesian design criteria are compared and contrasted, as are several different lower-dimensional parameterisation schemes for generating the many design points.
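A minimal sketch of the idea follows for a hypothetical one-parameter exponential-decay model: fifteen sampling times are generated from just two design variables (a first time and a geometric spacing ratio), and a pseudo-Bayesian D-optimality utility is maximised by searching over those two variables only. The model, prior and criterion are illustrative, not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
theta_prior = rng.lognormal(mean=-1.0, sigma=0.3, size=200)  # prior draws

def design_times(t1, ratio, n=15):
    """Generate n sampling times from just two design variables:
    the first time t1 and a geometric spacing ratio."""
    return t1 * ratio ** np.arange(n)

def utility(times):
    """Pseudo-Bayesian D-optimality for y = exp(-theta * t): expected
    log Fisher information, averaged over the prior draws."""
    # dy/dtheta = -t * exp(-theta * t); for a scalar parameter the
    # information is the sum of squared sensitivities over the times
    sens = -times * np.exp(-np.outer(theta_prior, times))
    return np.mean(np.log((sens ** 2).sum(axis=1)))

# Search over 2 design variables instead of 15 individual time points
best = max(((t1, r) for t1 in np.linspace(0.05, 1.0, 20)
                    for r in np.linspace(1.05, 1.6, 20)),
           key=lambda d: utility(design_times(*d)))
print("best (t1, ratio):", np.round(best, 3))
print("sampling times:", np.round(design_times(*best), 2))
```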
Abstract:
This chapter is a tutorial that teaches you how to design extended finite state machine (EFSM) test models for a system that you want to test. EFSM models are more powerful and expressive than simple finite state machine (FSM) models, and are one of the most commonly used styles of models for model-based testing, especially for embedded systems. There are many languages and notations in use for writing EFSM models, but in this tutorial we write our EFSM models in the familiar Java programming language. To generate tests from these EFSM models we use ModelJUnit, which is an open-source tool that supports several stochastic test generation algorithms, and we also show how to write your own model-based testing tool. We show how EFSM models can be used for unit testing and system testing of embedded systems, and for offline testing as well as online testing.
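ModelJUnit itself is a Java library; purely to illustrate the EFSM idea and stochastic (random-walk) test generation in compact form, here is a hypothetical sketch in Python. In a real model-based testing setup, each action would also drive and check the system under test.

```python
import random

class CoffeeMachineModel:
    """A tiny EFSM: explicit states plus an 'extended' balance variable.
    The machine being modelled is hypothetical."""
    def __init__(self):
        self.state, self.balance = "IDLE", 0

    def insert_coin(self):
        self.balance += 1
        self.state = "PAID"

    def brew(self):
        self.balance -= 2
        self.state = "IDLE"

    def enabled_actions(self):
        """Guards decide which transitions are enabled right now."""
        actions = []
        if self.balance < 3:
            actions.append(self.insert_coin)
        if self.state == "PAID" and self.balance >= 2:
            actions.append(self.brew)
        return actions

def random_walk(model, steps=10, seed=0):
    """Minimal stochastic test generation: repeatedly fire a random
    enabled transition, as a greedy random tester would."""
    rng = random.Random(seed)
    for _ in range(steps):
        action = rng.choice(model.enabled_actions())
        action()
        print(f"{action.__name__:12s} -> state={model.state}, balance={model.balance}")

random_walk(CoffeeMachineModel())
```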
Abstract:
This article addresses the transformation of a process model with an arbitrary topology into an equivalent structured process model. In particular, this article studies the subclass of process models that have no equivalent well-structured representation but which, nevertheless, can be partially structured into their maximally-structured representation. The transformations are performed under a behavioral equivalence notion that preserves the observed concurrency of tasks in equivalent process models. The article gives a full characterization of the subclass of acyclic process models that have no equivalent well-structured representation, but do have an equivalent maximally-structured one, as well as proposes a complete structuring method. Together with our previous results, this article completes the solution of the process model structuring problem for the class of acyclic process models.
Abstract:
Process mining is the research area concerned with knowledge discovery from information system event logs. Within process mining, two prominent tasks can be discerned. First, process discovery deals with the automatic construction of a process model from an event log. Second, conformance checking focuses on assessing the quality of a discovered or designed process model with respect to the actual behavior captured in event logs. To this end, multiple techniques and metrics have been developed and described in the literature. However, the process mining domain still lacks a comprehensive framework for assessing the goodness of a process model from a quantitative perspective. In this study, we describe the architecture of an extensible framework within ProM that allows for the consistent, comparative and repeatable calculation of conformance metrics. Such a framework is of great value for the development and assessment of both process discovery and conformance checking techniques.
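To make the notion of a conformance metric concrete, a toy sketch follows: it scores each trace by the fraction of its events that a simple transition-relation model can replay, then averages over the log. Real frameworks such as the ProM plug-ins described here use far richer token-replay and alignment-based metrics; the model and log below are hypothetical.

```python
def trace_fitness(log, model):
    """Toy conformance metric: fraction of events per trace that the
    model (state -> {activity: next state}) can replay, averaged
    over all traces in the log."""
    total = 0.0
    for trace in log:
        ok, state = 0, "start"
        for event in trace:
            if event in model.get(state, {}):
                state = model[state][event]
                ok += 1
            # a non-replayable event is skipped and counted as a misfit
        total += ok / len(trace)
    return total / len(log)

# Hypothetical order-handling model and event log
model = {
    "start":   {"receive": "open"},
    "open":    {"check": "checked"},
    "checked": {"ship": "done"},
}
log = [
    ["receive", "check", "ship"],   # fits perfectly
    ["receive", "ship"],            # 'ship' is not enabled after 'receive'
]
print(f"fitness = {trace_fitness(log, model):.2f}")  # (3/3 + 1/2) / 2 = 0.75
```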