629 results for biological models


Abstract:

Purpose – The purpose of this paper is to provide a review of the theory and models underlying project management (PM) research degrees that encourage reflective learning. Design/methodology/approach – Review of the literature and reflection on the practice of being actively involved in conducting and supervising academic research and disseminating academic output. The paper argues the case for the potential usefulness of reflective academic research to PM practitioners. It also highlights theoretical drivers of, and barriers to, reflective academic research by PM practitioners. Findings – A reflective learning approach to research can drive practical results, though it requires a great deal of commitment and support from both academic and industry partners. Practical implications – This paper suggests how PM practitioners can engage in academic research that has practical outcomes and how to be more effective at disseminating these research outcomes. Originality/value – Advanced academic degrees, in particular those completed by PM practitioners, can provide a valuable source of innovative ideas and approaches that should be more quickly absorbed into the PM profession's sources of knowledge. The value of this paper is to critically review and facilitate a reduced adaptation time for implementing useful reflective academic research in industry.

Abstract:

The Australian income tax regime is generally regarded as a mechanism by which the Federal Government raises revenue, with much of the revenue raised used to support public spending programs. A prime example of this type of spending program is health care. However, a government may also decide that the private sector should provide a greater share of the nation's health care. To achieve such a policy it can bring about change through positive regulation, or it can use the taxation regime, via tax expenditures, not to raise revenue but to steer or influence individuals in its desired direction. When used for this purpose, tax expenditures steer taxpayers towards or away from certain behaviour by either imposing costs on, or providing benefits to them. Within the context of the health sector, the Australian Federal Government deploys social steering via the tax system, with the Medicare Levy Surcharge and the 30 percent Private Health Insurance Rebate intended to steer taxpayer behaviour towards the Government’s policy goal of increasing the amount of health provision through the private sector. These steering mechanisms are complemented by the ‘Lifetime Health Cover Initiative’. This article, through the lens of behavioural economics, considers the ways in which these assorted mechanisms might have been expected to operate and whether they encourage individuals to purchase private health insurance.
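The steering logic described above is essentially a cost comparison from the taxpayer's point of view: once the surcharge on taxable income exceeds the cost of a private policy, insuring becomes the cheaper option. A minimal sketch of that comparison follows; all rates, thresholds and premiums in it are invented for illustration and are not the actual Medicare Levy Surcharge parameters.

```python
# Hypothetical illustration of the tax-expenditure steering effect: a taxpayer
# above the surcharge threshold compares the surcharge payable with the cost
# of buying private health insurance. All figures are invented for
# illustration, NOT the actual Australian thresholds or rates.

def steering_choice(income, surcharge_rate, threshold, premium):
    """Return the cheaper option for a taxpayer given a flat surcharge."""
    surcharge = surcharge_rate * income if income > threshold else 0.0
    return "insure" if premium < surcharge else "pay surcharge"

# With a (hypothetical) 1% surcharge above a $90,000 threshold, a $1,200
# premium becomes cheaper than the surcharge once income exceeds $120,000.
print(steering_choice(150_000, 0.01, 90_000, 1_200))  # -> insure
print(steering_choice(100_000, 0.01, 90_000, 1_200))  # -> pay surcharge
```

The incentive is thus strongest for higher-income taxpayers, which is consistent with the behavioural-steering framing in the article.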

Abstract:

Airports represent the epitome of complex systems, with multiple stakeholders, multiple jurisdictions and complex interactions between many actors. The large number of existing models that capture different aspects of the airport are a testament to this. However, these existing models do not systematically consider modelling requirements, nor how stakeholders such as airport operators or airlines would make use of them. This can detrimentally impact the verification and validation of models and makes the development of extensible and reusable modelling tools difficult. This paper develops, from the Concept of Operations (CONOPS) framework, a methodology to help structure the review and development of modelling capabilities and usage scenarios. The method is applied to a review of existing airport terminal passenger models. It is found that existing models can be broadly categorised according to four usage scenarios: capacity planning, operational planning and design, security policy and planning, and airport performance review. The models, the performance metrics that they evaluate and their usage scenarios are discussed. It is found that capacity and operational planning models predominantly focus on performance metrics such as waiting time, service time and congestion, whereas performance review models attempt to link these to passenger satisfaction outcomes. Security policy models, on the other hand, focus on probabilistic risk assessment. However, there is an emerging focus on the need to capture trade-offs between multiple criteria such as security and processing time. Based on the CONOPS framework and the literature findings, guidance is provided for the development of future airport terminal models.

Abstract:

Fire safety of buildings has been recognised as very important by the building industry and the community at large. Gypsum plasterboards are widely used to protect light gauge steel frame (LSF) walls all over the world. Gypsum contains free and chemically bound water in its crystal structure. Plasterboard also contains gypsum (CaSO4.2H2O) and calcium carbonate (CaCO3). The dehydration of gypsum and the decomposition of calcium carbonate absorb heat, and thus are able to protect LSF walls from fires. Kolarkar and Mahendran (2008) developed an innovative composite wall panel system, where the insulation was sandwiched between two plasterboards to improve the thermal and structural performance of LSF wall panels under fire conditions. In order to understand the performance of gypsum plasterboards and LSF wall panels under standard fire conditions, many experiments were conducted in the Fire Research Laboratory of Queensland University of Technology (Kolarkar, 2010). Fire tests were conducted on single, double and triple layers of Type X gypsum plasterboards and load bearing LSF wall panels under standard fire conditions. However, suitable numerical models have not been developed to investigate the thermal performance of LSF walls using the innovative composite panels under standard fire conditions. Continued reliance on expensive and time consuming fire tests is not acceptable. Therefore this research developed suitable numerical models to investigate the thermal performance of both plasterboard assemblies and load bearing LSF wall panels. SAFIR, a finite element program, was used to investigate the thermal performance of gypsum plasterboard assemblies and LSF wall panels under standard fire conditions. 
Appropriate values of important thermal properties were proposed for plasterboards and insulations based on laboratory tests, a literature review and comparisons of finite element analysis results of small scale plasterboard assemblies from this research with corresponding experimental results from Kolarkar (2010). The important thermal properties (thermal conductivity, specific heat capacity and density) of gypsum plasterboard and insulation materials were proposed as functions of temperature and used in the numerical models of load bearing LSF wall panels. Using these thermal properties, the developed finite element models were able to accurately predict the time-temperature profiles of plasterboard assemblies, while they predicted them reasonably well for load bearing LSF wall systems despite the many complexities that are present in these systems under fires. This thesis presents the details of the finite element models of plasterboard assemblies and load bearing LSF wall panels, including those with the composite panels developed by Kolarkar and Mahendran (2008). It examines and compares the thermal performance of composite panels based on different insulating materials of varying densities and thicknesses using 11 small scale tests, and makes suitable recommendations for improved fire performance of stud wall panels protected by these composite panels. It also presents the thermal performance data of LSF wall systems and demonstrates the superior performance of LSF wall systems using the composite panels. Using the developed finite element models of LSF walls, this thesis has proposed new LSF wall systems with increased fire rating. The developed finite element models are particularly useful for comparing the thermal performance of different wall panel systems without time consuming and expensive fire tests.
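The idea of supplying thermal properties to the finite element model as functions of temperature can be sketched as follows. The node values below are hypothetical placeholders, not the calibrated properties proposed in the thesis: the spike between 100 and 250 C simply stands in for the heat absorbed by the dehydration of gypsum (CaSO4.2H2O).

```python
# Sketch: temperature-dependent (apparent) specific heat of gypsum
# plasterboard as a piecewise-linear function. Values are HYPOTHETICAL;
# the elevated region around 100-250 C represents the energy absorbed by
# the dehydration reactions of gypsum.

TEMPS_C = [20, 100, 150, 250, 400, 1200]          # temperature nodes (C)
CP_J_PER_KG_K = [950, 950, 15000, 950, 950, 950]  # spike = dehydration peak

def specific_heat(T):
    """Piecewise-linear apparent specific heat (J/kg.K) at temperature T (C)."""
    if T <= TEMPS_C[0]:
        return float(CP_J_PER_KG_K[0])
    if T >= TEMPS_C[-1]:
        return float(CP_J_PER_KG_K[-1])
    for i in range(1, len(TEMPS_C)):
        if T <= TEMPS_C[i]:
            f = (T - TEMPS_C[i - 1]) / (TEMPS_C[i] - TEMPS_C[i - 1])
            return CP_J_PER_KG_K[i - 1] + f * (CP_J_PER_KG_K[i] - CP_J_PER_KG_K[i - 1])

print(specific_heat(150))  # -> 15000.0, peak of the hypothetical spike
print(specific_heat(600))  # -> 950.0, back to the baseline value
```

A thermal solver such as SAFIR consumes exactly this kind of tabulated temperature-property curve, which is why the calibration of the node values against test data matters so much.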

Abstract:

The QUT Extreme Science and Engineering program provides free hands-on workshops to schools, presented by scientists and engineers to students from prep to year 12 in their own classrooms. The workshops are tied to the school curriculum and give students access to professional quality instruments, helping to stimulate their interest in science and engineering, with the aim of generating a greater take-up of STEM-related subjects in the senior high school years. In addition to engaging students in activities, workshop presenters provide role models of both genders, helping to break down preconceived ideas of the type of person who becomes a scientist or engineer and demystifying the university experience. The Extreme Science and Engineering vans have been running for 10 years and as such demonstrate a sustainable and reproducible model for schools engagement. With funding provided through QUT's Widening Participation Equity initiative (HEPPP funded), the vans, which averaged 120 school visits each year, increased to 150+ visits in 2010. Additionally, 100+ workshops (hands-on and career focused) have been presented to students from low socio-economic status schools on the three QUT campuses in 2011. While this is designed as a long-term initiative, the short-term results have been very promising: 3,000 students attended the workshops in the first six months, and teacher and student feedback has been overwhelmingly positive.

Abstract:

The question of under what conditions conceptual representation is compositional remains debatable within cognitive science. This paper proposes a well-developed mathematical apparatus for a probabilistic representation of concepts, drawing upon methods developed in quantum theory, to propose a formal test that can determine whether a specific conceptual combination is compositional or not. This test examines a joint probability distribution modeling the combination, asking whether or not it is factorizable. Empirical studies indicate that some combinations should be considered non-compositional.
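A toy version of the factorizability idea can be sketched as follows. This is a deliberately simplified classical check (does the joint equal the product of its marginals?), assuming a finite joint distribution given as a dictionary; the paper's actual test is more general than plain independence.

```python
# Simplified sketch of the factorizability test: treat a two-word conceptual
# combination as a joint distribution P(A, B) and check whether it factorizes
# into its marginals, P(a, b) = P(a) * P(b). This is the plain-independence
# special case, not the paper's full quantum-theoretic apparatus.
from itertools import product

def is_factorizable(joint, tol=1e-9):
    """joint: dict mapping (a, b) -> probability; returns True if it factorizes."""
    a_vals = {a for a, _ in joint}
    b_vals = {b for _, b in joint}
    pa = {a: sum(joint.get((a, b), 0.0) for b in b_vals) for a in a_vals}
    pb = {b: sum(joint.get((a, b), 0.0) for a in a_vals) for b in b_vals}
    return all(abs(joint.get((a, b), 0.0) - pa[a] * pb[b]) <= tol
               for a, b in product(a_vals, b_vals))

independent = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
correlated  = {(0, 0): 0.5, (1, 1): 0.5}
print(is_factorizable(independent))  # True  -> compositional under this test
print(is_factorizable(correlated))   # False -> non-compositional
```

The second distribution is perfectly correlated, so no product of marginals can reproduce it; that is the signature the test looks for.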

Abstract:

Digital human models (DHM) have evolved as useful tools for ergonomic workplace design and product development, and are found in various industries and in education. The DHM systems that dominate the market were developed for specific purposes and differ significantly, which is not only reflected in non-compatible results of DHM simulations but also provokes misunderstanding of how DHM simulations relate to real-world problems. While DHM developers are restricted by uncertainty about user needs and the lack of standards for model data, users are confined to one specific product and cannot exchange results or upgrade to another DHM system, as their previous results would be rendered worthless. Furthermore, the origin and validity of anthropometric and biomechanical data are not transparent to the user. The lack of standardisation in DHM systems has become a major roadblock to further system development, affecting all stakeholders in the DHM industry. Evidently, a framework for standardising digital human models is necessary to overcome the current obstructions.

Abstract:

The importance of actively managing and analyzing business processes is acknowledged more than ever in organizations nowadays. Business processes form an essential part of an organization and their application areas are manifold. Most organizations keep records of various activities that have been carried out for auditing purposes, but these records are rarely used for analysis. This paper describes the design and implementation of a process analysis tool that replays, analyzes and visualizes a variety of performance metrics using a process definition and its execution logs. Performing performance analysis on existing and planned process models offers organizations a great way to detect bottlenecks within their processes and allows them to make more effective process improvement decisions. Our technique is applied to processes modeled in the YAWL language. Execution logs of process instances are compared against the corresponding YAWL process model and replayed in a robust manner, taking into account any noise in the logs. Finally, performance characteristics, obtained from replaying the log in the model, are projected onto the model.
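The core of the replay idea can be sketched in a few lines. This is a minimal illustration assuming a flat log of (case, activity, event, timestamp) tuples, not the actual YAWL log format or the tool described in the paper; it derives one of the performance metrics mentioned above, average service time per activity.

```python
# Minimal sketch of deriving a performance metric from an execution log.
# Log schema (case id, activity, lifecycle event, timestamp) is an assumption
# for illustration, not the YAWL log format.
from collections import defaultdict

log = [
    ("case1", "Approve", "start", 0), ("case1", "Approve", "complete", 5),
    ("case2", "Approve", "start", 2), ("case2", "Approve", "complete", 10),
]

def avg_service_time(log):
    """Average duration between matched start/complete events, per activity."""
    starts, durations = {}, defaultdict(list)
    for case, act, event, ts in log:
        if event == "start":
            starts[(case, act)] = ts
        elif event == "complete" and (case, act) in starts:
            durations[act].append(ts - starts.pop((case, act)))
    return {act: sum(d) / len(d) for act, d in durations.items()}

print(avg_service_time(log))  # -> {'Approve': 6.5}
```

The "robust replay" in the paper goes further, tolerating noisy logs (e.g. missing or out-of-order events); here unmatched completions are simply ignored, which is one simple way of tolerating such noise.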

Abstract:

Random walk models based on an exclusion process with contact effects are often used to represent collective migration where individual agents are affected by agent-to-agent adhesion. Traditional mean field representations of these processes take the form of a nonlinear diffusion equation which, for strong adhesion, does not predict the averaged discrete behavior. We propose an alternative suite of mean-field representations, showing that collective migration with strong adhesion can be accurately represented using a moment closure approach.
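The discrete process described above can be illustrated with a minimal simulation, under simplifying assumptions (a 1D periodic lattice and a generic adhesion parameter q; this is not the exact model or parameterisation used in the paper):

```python
# Sketch of an adhesive exclusion process on a 1D periodic lattice.
# Each step, a randomly chosen agent attempts to move to a random neighbour;
# the move is aborted if the target site is occupied (exclusion), and is
# abandoned with probability 1 - (1 - q)**k when k neighbours of the origin
# site are occupied (adhesion). q is a generic adhesion strength parameter.
import random

def step(lattice, q, rng):
    n = len(lattice)
    occupied = [i for i, s in enumerate(lattice) if s]
    i = rng.choice(occupied)
    j = (i + rng.choice([-1, 1])) % n
    if lattice[j]:                                  # exclusion
        return
    k = lattice[(i - 1) % n] + lattice[(i + 1) % n]
    if rng.random() < 1 - (1 - q) ** k:             # adhesion: stay put
        return
    lattice[i], lattice[j] = 0, 1

rng = random.Random(0)
lattice = [1] * 10 + [0] * 30   # initial cluster of 10 agents
for _ in range(1000):
    step(lattice, q=0.5, rng=rng)
print(sum(lattice))  # moves conserve agents: still 10
```

Averaging many such realisations gives the "averaged discrete behavior" that the mean-field equations are meant to reproduce; for strong adhesion (q near 1) the cluster disperses slowly, which is the regime where the standard nonlinear diffusion description breaks down.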

Abstract:

In recent years, a number of phylogenetic methods have been developed for estimating molecular rates and divergence dates under models that relax the molecular clock constraint by allowing rate change throughout the tree. These methods are being used with increasing frequency, but there have been few studies into their accuracy. We tested the accuracy of several relaxed-clock methods (penalized likelihood and Bayesian inference using various models of rate change) using nucleotide sequences simulated on a nine-taxon tree. When the sequences evolved with a constant rate, the methods were able to infer rates accurately, but estimates were more precise when a molecular clock was assumed. When the sequences evolved under a model of autocorrelated rate change, rates were accurately estimated using penalized likelihood and by Bayesian inference using lognormal and exponential models of rate change, while other models did not perform as well. When the sequences evolved under a model of uncorrelated rate change, only Bayesian inference using an exponential rate model performed well. Collectively, the results provide a strong recommendation for using the exponential model of rate change if a conservative approach to divergence time estimation is required. A case study is presented in which we use a simulation-based approach to examine the hypothesis of elevated rates in the Cambrian period, and it is found that these high rate estimates might be an artifact of the rate estimation method. If this bias is present, then the ages of metazoan divergences would be systematically underestimated. The results of this study have implications for studies of molecular rates and divergence dates.

Abstract:

In phylogenetics, the unrooted model of phylogeny and the strict molecular clock model are two extremes of a continuum. Despite their dominance in phylogenetic inference, it is evident that both are biologically unrealistic and that the real evolutionary process lies between these two extremes. Fortunately, intermediate models employing relaxed molecular clocks have been described. These models open the gate to a new field of “relaxed phylogenetics.” Here we introduce a new approach to performing relaxed phylogenetic analysis. We describe how it can be used to estimate phylogenies and divergence times in the face of uncertainty in evolutionary rates and calibration times. Our approach also provides a means for measuring the clocklikeness of datasets and comparing this measure between different genes and phylogenies. We find no significant rate autocorrelation among branches in three large datasets, suggesting that autocorrelated models are not necessarily suitable for these data. In addition, we place these datasets on the continuum of clocklikeness between a strict molecular clock and the alternative unrooted extreme. Finally, we present analyses of 102 bacterial, 106 yeast, 61 plant, 99 metazoan, and 500 primate alignments. From these we conclude that our method is phylogenetically more accurate and precise than the traditional unrooted model while adding the ability to infer a timescale to evolution.

Abstract:

Background: Potyviruses are found worldwide, are spread by probing aphids and cause considerable crop damage. Potyvirus is one of the two largest plant virus genera and contains about 15% of all named plant virus species. When and why did the potyviruses become so numerous? Here we answer the first question and discuss the other. Methods and Findings: We have inferred the phylogenies of the partial coat protein gene sequences of about 50 potyviruses, and studied in detail the phylogenies of some using various methods and evolutionary models. Their phylogenies have been calibrated using historical isolation and outbreak events: the plum pox virus epidemic which swept through Europe in the 20th century, incursions of potyviruses into Australia after agriculture was established by European colonists, the likely transport of cowpea aphid-borne mosaic virus in cowpea seed from Africa to the Americas with the 16th century slave trade and the similar transport of papaya ringspot virus from India to the Americas. Conclusions/Significance: Our studies indicate that the partial coat protein genes of potyviruses have an evolutionary rate of about 1.15 × 10⁻⁴ nucleotide substitutions/site/year, and that the initial radiation of the potyviruses occurred only about 6,600 years ago, and hence coincided with the dawn of agriculture. We discuss the ways in which agriculture may have triggered the prehistoric emergence of potyviruses and fostered their speciation.
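Under a strict clock, a rate of this kind converts pairwise sequence distances into divergence times: two lineages that split t years ago each accumulate substitutions at rate r, so their distance is d = 2rt and t = d / (2r). A back-of-envelope sketch, using the rate reported above and a hypothetical distance value:

```python
# Back-of-envelope conversion of a pairwise distance into a divergence time
# using the rate inferred in the study. The distance value below is
# hypothetical, chosen only to illustrate the arithmetic.

RATE = 1.15e-4  # substitutions/site/year (partial coat protein gene)

def divergence_time(d, r=RATE):
    """Years since divergence: d = 2*r*t, so t = d / (2*r)."""
    return d / (2 * r)

print(divergence_time(0.046))  # a hypothetical d = 0.046 gives roughly 200 years
```

In practice the paper uses relaxed-clock calibrations against dated outbreak events rather than this strict-clock arithmetic, but the order-of-magnitude logic is the same.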

Abstract:

Sequence data often have competing signals that are detected by network programs or Lento plots. Such data can be formed by generating sequences on more than one tree and combining the results: a mixture model. We report that with such mixture models, the estimates of edge (branch) lengths from maximum likelihood (ML) methods that assume a single tree are biased. Based on the observed number of competing signals in real data, such a bias of ML is expected to occur frequently. Because network methods can recover competing signals more accurately, there is a need for ML methods that allow a network. A fundamental problem is that mixture models can have more parameters than can be recovered from the data, so that some mixtures are not, in principle, identifiable. We recommend that network programs be incorporated into best practice analysis, along with ML and Bayesian trees.

Abstract:

Despite recent methodological advances in inferring the time-scale of biological evolution from molecular data, the fundamental question of whether our substitution models are sufficiently well specified to accurately estimate branch-lengths has received little attention. I examine this implicit assumption of all molecular dating methods, on a vertebrate mitochondrial protein-coding dataset. Comparison with analyses in which the data are RY-coded (AG → R; CT → Y) suggests that even rates-across-sites maximum likelihood greatly under-compensates for multiple substitutions among the standard (ACGT) NT-coded data, which has been subject to greater phylogenetic signal erosion. Accordingly, the fossil record indicates that branch-lengths inferred from the NT-coded data translate into divergence time overestimates when calibrated from deeper in the tree. Intriguingly, RY-coding led to the opposite result. The underlying NT and RY substitution model misspecifications likely relate respectively to “hidden” rate heterogeneity and changes in substitution processes across the tree, for which I provide simulated examples. Given the magnitude of the inferred molecular dating errors, branch-length estimation biases may partly explain current conflicts with some palaeontological dating estimates.
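RY-coding itself is a simple two-state recoding of the data (AG → R, CT → Y), which discards the transitions within each base class, the substitutions most prone to saturation and "hidden" rate heterogeneity. A minimal sketch:

```python
# RY-coding as described in the text: purines (A, G) -> R, pyrimidines
# (C, T) -> Y. Characters outside ACGT (gaps, ambiguity codes) are left
# unchanged here, which is one simple handling choice, not a standard.

RY = {"A": "R", "G": "R", "C": "Y", "T": "Y"}

def ry_code(seq):
    """Recode a nucleotide sequence into purine/pyrimidine symbols."""
    return "".join(RY.get(base, base) for base in seq.upper())

print(ry_code("ATGCCGTA"))  # -> RYRYYRYR
```

Only transversions survive the recoding, so comparing branch lengths from NT-coded and RY-coded analyses, as done in this study, isolates how much of the signal the standard four-state model is failing to correct for.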