977 results for policy simulation
Abstract:
Efficiency of analysis using generalized estimating equations is enhanced when the intracluster correlation structure is accurately modeled. We compare two existing criteria (a quasi-likelihood information criterion, and the Rotnitzky-Jewell criterion) for identifying the true correlation structure via simulations with Gaussian or binomial response, covariates varying at cluster or observation level, and exchangeable or AR(1) intracluster correlation structure. Rotnitzky and Jewell's approach performs better when the true intracluster correlation structure is exchangeable, while the quasi-likelihood criterion performs better for an AR(1) structure.
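As a concrete illustration of the two working correlation structures being compared, the matrices can be written down directly; the cluster size `m` and parameter `rho` below are arbitrary illustrative values, not figures from the study:

```python
import numpy as np

def exchangeable_corr(m, rho):
    """Exchangeable: every pair of observations within a cluster shares correlation rho."""
    R = np.full((m, m), rho)
    np.fill_diagonal(R, 1.0)
    return R

def ar1_corr(m, rho):
    """AR(1): correlation decays geometrically with within-cluster lag, rho ** |j - k|."""
    idx = np.arange(m)
    return rho ** np.abs(idx[:, None] - idx[None, :])

R_ex = exchangeable_corr(4, 0.5)   # off-diagonals all 0.5
R_ar = ar1_corr(4, 0.5)            # first off-diagonal 0.5, second 0.25, ...
```

In a GEE analysis, candidate matrices like these serve as the working correlation, and selection criteria such as QIC or the Rotnitzky-Jewell criterion score how well each candidate matches the data.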
Abstract:
The electric field in certain electrostatic devices can be modeled by a grounded plate electrode affected by a corona discharge generated by a series of parallel wires connected to a DC high-voltage supply. The system of differential equations that describes the behaviour (i.e., charging and motion) of a conductive particle in such an electric field has been numerically solved, using several simplifying assumptions. Thus, it was possible to investigate the effect of various electrical and mechanical factors on the trajectories of conductive particles. This model has been employed to study the behaviour of coal particles in fly-ash corona separators.
Abstract:
Sheep in western Queensland have been predominantly reared for wool. When wool prices became depressed, interest in the sheep meat industry increased. For north west Queensland producers, opportunities may exist to participate in live sheep and meat export to Asia. A simulation model was developed to determine whether this sheep-producing area has the capability to provide sufficient numbers of sheep under variable climatic conditions while sustaining the land resources. Maximum capacity for sustainability of resources (as described by stock numbers) was derived from an in-depth study of the agricultural and pastoral potential of Queensland. Decades of sheep production and climatic data spanning differing seasonal conditions were collated for analysis. A ruminant biology model adapted from Grazplan was used to simulate pregnancy rate. Empirical equations predict mortalities, marking rates, and weight characteristics of sheep of various ages from simple climatic measures, stocking rate and reproductive status. The initial age structure of flocks was determined by running the model for several years with historical climatic conditions. Drought management strategies, such as progressively selling a proportion of wethers down to two-tooth age and selling the oldest ewes, were incorporated. Management decisions such as time of joining, age at which ewes were cast-for-age, wether turn-off age and turning-off rate of lambs vary with geographical area and can be specified at run time. The model is run for sequences of climatic conditions generated stochastically from distributions based on historical climatic data, correlated in some instances. The model highlights the difficulties of sustaining a consistent supply of sheep under variable climatic conditions.
Abstract:
The widespread and increasing resistance of internal parasites to anthelmintic control is a serious problem for the Australian sheep and wool industry. As part of control programmes, laboratories use the Faecal Egg Count Reduction Test (FECRT) to determine resistance to anthelmintics. It is important to have confidence in the measure of resistance, not only for the producer planning a drenching programme but also for companies investigating the efficacy of their products. The determination of resistance and corresponding confidence limits as given in the anthelmintic efficacy guidelines of the Standing Committee on Agriculture (SCA) is based on a number of assumptions. This study evaluated the appropriateness of these assumptions for typical data and compared the effectiveness of the standard FECRT procedure with that of alternative procedures. Several sets of historical experimental data from sheep and goats were analysed, showing that a negative binomial distribution describes pre-treatment helminth egg counts in faeces more appropriately than a normal distribution. Simulated egg counts for control animals were generated stochastically from negative binomial distributions, and those for treated animals from negative binomial and binomial distributions. Three methods for determining resistance when percent reduction is based on arithmetic means were applied: the first was that advocated in the SCA guidelines; the second was similar to the first but based its variance estimates on negative binomial distributions; and the third used Wadley’s method, with the distribution of the response variate assumed negative binomial and a logit link transformation. These were also compared with a fourth method, recommended by the International Co-operation on Harmonisation of Technical Requirements for Registration of Veterinary Medicinal Products (VICH) programme, in which percent reduction is based on geometric means.
A wide selection of parameters was investigated and, for each set, 1000 simulations were run. Percent reduction and confidence limits were then calculated for each method, together with the number of times in each set of 1000 simulations that the theoretical percent reduction fell within the estimated confidence limits and the number of times resistance would have been said to occur. These simulations provide the basis for setting conditions under which the methods could be recommended. The authors show that, given the distribution of helminth egg counts found in Queensland flocks, the method based on arithmetic rather than geometric means should be used, and suggest that resistance be redefined as occurring when the upper level of percent reduction is less than 95%. At least ten animals per group are required in most circumstances, though even 20 may be insufficient where the effectiveness of the product is close to the cut-off point for defining resistance.
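To make the simulation recipe concrete, here is a minimal sketch of drawing negative binomial egg counts and computing percent reduction from arithmetic means; the group sizes, means, and dispersion below are illustrative assumptions, not the study's parameter sets:

```python
import numpy as np

rng = np.random.default_rng(0)

def nb_counts(mean, k, size, rng):
    """Draw negative binomial counts parameterised by mean and dispersion k.
    NumPy uses (n, p), so convert: p = k / (k + mean)."""
    return rng.negative_binomial(k, k / (k + mean), size)

# Hypothetical flock: 10 control and 10 treated animals per group
control = nb_counts(mean=500.0, k=1.5, size=10, rng=rng)
treated = nb_counts(mean=25.0, k=1.5, size=10, rng=rng)

# Percent reduction based on arithmetic means (the approach the authors favour)
reduction = 100.0 * (1.0 - treated.mean() / control.mean())
```

Repeating such a draw 1000 times per parameter set and tabulating how often the confidence limits cover the theoretical reduction mirrors the design described above.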
Abstract:
This thesis presents an interdisciplinary analysis of how models and simulations function in the production of scientific knowledge. The work is informed by three scholarly traditions: studies on models and simulations in philosophy of science, so-called micro-sociological laboratory studies within science and technology studies, and cultural-historical activity theory. Methodologically, I adopt a naturalist epistemology and combine philosophical analysis with a qualitative, empirical case study of infectious-disease modelling. This study maintains a dual perspective throughout the analysis: it specifies the modelling practices and examines the models as objects of research. The research questions addressed in this study are: 1) How are models constructed and what functions do they have in the production of scientific knowledge? 2) What is interdisciplinarity in model construction? 3) How do models become a general research tool and why is this process problematic? The core argument is that the mediating models as investigative instruments (cf. Morgan and Morrison 1999) take questions as a starting point, and hence their construction is intentionally guided. This argument applies the interrogative model of inquiry (e.g., Sintonen 2005; Hintikka 1981), which conceives of all knowledge acquisition as a process of seeking answers to questions. The first question addresses simulation models as Artificial Nature, which is manipulated in order to answer the questions that initiated the model building. This account further develops the "epistemology of simulation" (cf. Winsberg 2003) by showing the interrelatedness of researchers and their objects in the process of modelling. The second question clarifies why interdisciplinary research collaboration is demanding and difficult to maintain.
The nature of the impediments to disciplinary interaction is examined by introducing the idea of object-oriented interdisciplinarity, which provides an analytical framework for studying the changes in the degree of interdisciplinarity, the tools and research practices developed to support the collaboration, and the mode of collaboration in relation to the historically mutable object of research. As my interest is in the models as interdisciplinary objects, the third research problem asks how we might characterise these objects, what is typical of them, and what kinds of changes happen in the process of modelling. Here I examine the tension between specified, question-oriented models and more general models, and suggest that the specified models form a group of their own. I call these Tailor-made models, in opposition to the process of building a simulation platform that aims at generalisability and utility for health policy. This tension also underlines the challenge of applying research results (or methods and tools) to discuss and solve problems in decision-making processes.
Abstract:
This article explains the context-sensitive parameters and dimensions in play at the time of an intervention, through the application of Rog’s (2012) model of contextual parameters. Rog’s model offers evaluators a structured approach to examining an intervention. The initial study provided a systematic way to clarify the scope, variables, timing, and appropriate evaluation methodology for evaluating the implementation of a government policy. Given that the government implementation of the educational intervention under study followed neither an experimental research approach nor the double cycle of an action research approach, the application of Rog’s model provided an in-depth understanding of the context-sensitive environment; it is from this clear purpose that the broader evaluation was conducted. Overall, when governments or institutions implement policy to bring about educational change, and the intervention is not guided by an appropriate evaluation approach, program evaluation is still achievable post-implementation. In this situation, Rog’s (2012) model of contextual parameters is a useful way to achieve the clarity of purpose needed to guide the program evaluation.
Abstract:
In this paper we present a novel application of scenario methods to engage a diverse constituency of senior stakeholders, with limited time availability, in debate to inform planning and policy development. Our case study project explores post-carbon futures for the Latrobe Valley region of the Australian state of Victoria. Our approach involved initial deductive development of two ‘extreme scenarios’ by a multi-disciplinary research team, based upon an extensive research programme. Over four workshops with the stakeholder constituency, these initial scenarios were discussed, challenged, refined and expanded through an inductive process, whereby participants took ‘ownership’ of a final set of three scenarios that were both comfortable and challenging to them. The outcomes of this process subsequently informed public policy development for the region. Whilst this process did not follow a single extant structured, multi-stage scenario approach, neither was it devoid of form. Here, we seek to theorise and codify elements of our process – which we term ‘scenario improvisation’ – such that others may adopt it.
Abstract:
For many complex natural resources problems, planning and management efforts involve groups of organizations working collaboratively through networks (Agranoff, 2007; Booher & Innes, 2010). These networks sometimes involve formal roles and relationships, but often include informal elements (Edelenbos & Klijn, 2007). All of these roles and relationships undergo change in response to changes in personnel, priorities and policy. There has been considerable focus in the planning and public policy literature on describing and characterizing these networks (Mandell & Keast, 2008; Provan & Kenis, 2007). However, there has been far less research assessing how networks change and adjust in response to policy and political change. In the Australian state of Queensland, Natural Resource Management (NRM) organizations were created as lead organizations to address land and water management issues on a regional basis with Commonwealth funding and state support. In 2012, a change in state government signaled a dramatic change in policy that resulted in a significant reduction of state support and commitment. In response to this change, NRM organizations have had to adapt their networks and relationships. In this study, we examine the issues of network relationships, capacity and changing relationships over time using written surveys and focus groups with NRM CEOs, managers and planners (note: data collection events scheduled for March and April 2015). The research team will meet with each of these three groups separately, conducting an in-person survey followed by a facilitated focus group discussion. The NRM participant focus groups will also be subdivided by region, which correlates with capacity (inland/low capacity; coastal/high capacity). The findings focus on how changes in state government commitment have affected NRM networks and their relationships with state agencies.
We also examine how these changes vary according to the level within the organization and the capacity of the organization. We hypothesize that: (1) NRM organizations have struggled to maintain capacity in the wake of state agency withdrawal of support; (2) NRM organizations with the lowest capacity have been most adversely affected, while some high capacity NRM organizations may have become more resilient as they have sought out other partners; (3) Network relationships at the highest levels of the organization have been affected the most by state policy change; (4) NRM relationships at the lowest levels of the organizations have changed the least, as formal relationships are replaced by informal networks and relationships.
Abstract:
Adoption is a complex social phenomenon, intimately knitted into its family law framework and shaped by the pressures affecting the family in its local social context. It is a mirror reflecting the changes in our family life and the efforts of family law to address those changes. This has caused it to be variously defined in different societies, in the same society at different times, and across a range of contemporary societies.
Abstract:
Background: Plotless density estimators are those that are based on distance measures rather than counts per unit area (quadrats or plots) to estimate the density of some usually stationary event, e.g. burrow openings, damage to plant stems, etc. These estimators typically use distance measures between events and from random points to events to derive an estimate of density. The error and bias of these estimators for the various spatial patterns found in nature have previously been examined using simulated populations only. In this study we investigated eight plotless density estimators to determine which were robust across a wide range of data sets from fully mapped field sites. These covered a wide range of situations, including animal damage to rice and corn, nest locations, active rodent burrows and distribution of plants. Monte Carlo simulations were applied to sample the data sets, and in all cases the error of the estimate (measured as relative root mean square error) was reduced with increasing sample size. The method of calculation and ease of use in the field were also used to judge the usefulness of each estimator. Estimators were evaluated in their original published forms, although the variable area transect (VAT) and ordered distance methods have been the subjects of optimization studies. Results: An estimator that was a compound of three basic distance estimators was found to be robust across all spatial patterns for sample sizes of 25 or greater. The same field methodology can be used either with the basic distance formula or with the formula used with the Kendall-Moran estimator, in which case a reduction in error may be gained for sample sizes less than 25; however, there is no improvement for larger sample sizes. The variable area transect (VAT) method performed moderately well, is easy to use in the field, and its calculations are easy to undertake.
Conclusion: Plotless density estimators can provide an estimate of density in situations where it would not be practical to lay out a plot or quadrat, and can in many cases reduce the workload in the field.
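As a sketch of the basic point-to-nearest-event distance idea these estimators share, here is a simple Monte Carlo check; the pattern size, sample size, and use of a unit torus are illustrative assumptions, not the field data sets of the study:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical random pattern: 500 events scattered on a unit torus,
# so the true density is 500 events per unit area.
events = rng.random((500, 2))

# Measure the distance from each of n random sample points to its
# nearest event, with toroidal wraparound to sidestep edge effects.
n = 50
samples = rng.random((n, 2))
diff = np.abs(samples[:, None, :] - events[None, :, :])
diff = np.minimum(diff, 1.0 - diff)                # wrap around the torus
r = np.sqrt((diff ** 2).sum(axis=2)).min(axis=1)   # nearest-event distances

# Ordered-distance estimator based on the first nearest event,
# unbiased under complete spatial randomness.
density_hat = (n - 1) / (np.pi * (r ** 2).sum())
```

The compound and Kendall-Moran estimators discussed above combine several such distance measurements per sample point, which is what buys robustness to the non-random patterns found in real field data.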
Abstract:
This report describes the development and simulation of a variable rate controller for a six-degree-of-freedom nonlinear model. The variable rate simulation model represents an off-the-shelf autopilot. Flight experiments involve risk and can be expensive; a dynamic model is therefore important for understanding the performance characteristics of the UAS in mission simulation before actual flight tests, and for obtaining parameters needed for flight. The control and guidance is implemented in Simulink. The report tests the use of the model for air search and air sampling path planning. A GUI is presented in which two experts (a mission expert, i.e. air sampling or air search, and a UAV expert) interact over a set of mission scenarios, demonstrating the benefits of the method.
Abstract:
This report provides an analysis of the cultural, policy and legal implications of ‘mash-ups’. This study provides a short history of mash-ups, explaining how the current ‘remix culture’ builds upon a range of creative antecedents and cultural traditions which valorised appropriation, quotation, and transformation. It provides modern examples of mash-ups, such as sound recordings, musical works, film and artistic works, focusing on works seen on YouTube and other online applications. In particular, it considers:
* Literary mash-ups of canonical texts, including Pride and Prejudice and Zombies, The Wind Done Gone, After the Rain, and 60 Years Later;
* Artistic mash-ups, highlighting the Obama Hope poster, the ‘Column’ case, and the competition for extending famous album covers;
* Geographical mash-ups, most notably, the Google Australia bushfires map;
* Musical mash-ups, such as The Grey Album and the work of Girl Talk; and
* Cinematic mash-ups, including remixes of There Will Be Blood and The Downfall.
This survey provides an analysis of why mash-up culture is valuable. It highlights the range of aesthetic, political, comic, and commercial impulses behind the creation and the dissemination of mash-ups. First, this report highlights the tensions between copyright law and mash-ups in particular cultural sectors. Second, this report emphasizes the importance of civil society institutions in promoting and defending mash-ups in both copyright litigation and policy debates. It provides a study of key organisations, including:
* The Fair Use Project;
* The Organization for Transformative Works;
* Public Knowledge;
* The Electronic Frontier Foundation; and
* The Chilling Effects Clearinghouse.
This report suggests that much can be learnt from this network of organisations in the United States. There is a dearth of comparable legal clinics, advocacy groups, and creative institutions in Australia. As a result, the public interest values of copyright law have only received weak, incidental support from defendant companies – such as Network Ten and IceTV – with other copyright agendas. Third, this report canvasses a succinct model for legislative reform in respect of copyright law and mash-ups. It highlights:
* The extent to which mash-ups are ‘tolerated uses’;
* The conflicting judicial precedents on substantiality in Australia and the United States;
* The debate over copyright exceptions relating to mash-ups and remixes;
* The use of the take-down and notice system under the safe harbours regime by copyright owners in respect of mash-ups;
* The impact of technological protection measures on mash-ups and remixes;
* The possibility of statutory licensing in respect of mash-ups;
* The use of Creative Commons licences;
* The impact of moral rights protection upon mash-ups;
* The interaction between economic and moral rights under copyright law; and
* Questions of copyright law, freedom of expression, and political mash-ups.
Abstract:
APSIM-ORYZA is a new functionality developed in the APSIM framework to simulate rice production while addressing management issues such as fertilisation and transplanting, which are particularly important in Korean agriculture. To validate the model for Korean rice varieties and field conditions, the measured yields and flowering times from three field experiments conducted by the Gyeonggi Agricultural Research and Extension Services (GARES) in Korea were compared against the simulated outputs for different management practices and rice varieties. Simulated yields of early-, mid- and mid-to-late-maturing varieties of rice grown in a continuous rice cropping system from 1997 to 2004 showed close agreement with the measured data. Similar results were also found for yields simulated under seven levels of nitrogen application. When different transplanting times were modelled, simulated flowering times ranged from within 3 days of the measured values for the early-maturing varieties, to up to 9 days after the measured dates for the mid- and especially mid-to-late-maturing varieties. This was associated with highly variable simulated yields which correlated poorly with the measured data. This suggests the need to accurately calibrate the photoperiod sensitivity parameters of the model for the photoperiod-sensitive rice varieties in Korea.
Abstract:
Cultivation and cropping of soils result in a decline in soil organic carbon and soil nitrogen, and can lead to reduced crop yields. The CENTURY model was used to simulate the effects of continuous cultivation and cereal cropping on total soil organic matter (C and N), carbon pools, nitrogen mineralisation, and crop yield at 6 locations in southern Queensland. The model was calibrated for each replicate from the original datasets, allowing comparisons for each replicate rather than site averages. The CENTURY model was able to satisfactorily predict the impact of long-term cultivation and cereal cropping on total organic carbon, but was less successful in simulating the different fractions and nitrogen mineralisation. The model initially over-predicted the pre-cropping soil carbon and nitrogen concentrations of the sites. To account for the unique shrinking and swelling characteristics of the Vertosol soils, the default annual decomposition rates of the slow and passive carbon pools were doubled, after which the model accurately predicted initial conditions. The ability of the model to predict carbon pool fractions varied, demonstrating the difficulty inherent in predicting the size of these conceptual pools. The strength of the model lies in its ability to closely predict the starting soil organic matter conditions, and to predict the impact of clearing, cultivation, fertiliser application, and continuous cropping on total soil carbon and nitrogen.
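The calibration step above (doubling the decomposition rates of the slow and passive pools) can be pictured with the first-order decay that multi-pool soil carbon models of this kind rest on; the rate constants here are hypothetical, not CENTURY's actual parameters:

```python
import numpy as np

def pool_carbon(c0, k, years):
    """First-order decay of a carbon pool: C(t) = C0 * exp(-k * t)."""
    t = np.arange(years + 1)
    return c0 * np.exp(-k * t)

default = pool_carbon(100.0, 0.02, 50)   # hypothetical slow-pool rate (1/yr)
doubled = pool_carbon(100.0, 0.04, 50)   # doubled rate, as in the calibration

# Doubling k halves the pool's half-life, ln(2)/k
halflife_default = np.log(2) / 0.02
halflife_doubled = np.log(2) / 0.04
```

This is why doubling the rates lowers the simulated equilibrium (pre-cropping) carbon stock: the faster the pools turn over, the less carbon they hold at steady state for the same inputs.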
Abstract:
This brief provides an overview of the Representative Payee program administered by Social Security. It discusses the program's many provisions, as well as practice tips and implications for BPA&O and PABSS personnel.