181 results for D72 - Models of Political Processes: Rent-Seeking, Elections, Legislatures, and Voting Behavior

in Queensland University of Technology - ePrints Archive


Relevance: 100.00%

Abstract:

There has been considerable research conducted over the last 20 years focused on predicting motor vehicle crashes on transportation facilities. The range of statistical models commonly applied includes binomial, Poisson, Poisson-gamma (or negative binomial), zero-inflated Poisson and negative binomial models (ZIP and ZINB), and multinomial probability models. Given the range of possible modeling approaches and the host of assumptions with each modeling approach, making an intelligent choice for modeling motor vehicle crash data is difficult. There is little discussion in the literature comparing different statistical modeling approaches, identifying which statistical models are most appropriate for modeling crash data, and providing a strong justification from basic crash principles. In the recent literature, it has been suggested that the motor vehicle crash process can successfully be modeled by assuming a dual-state data-generating process, which implies that entities (e.g., intersections, road segments, pedestrian crossings, etc.) exist in one of two states: perfectly safe and unsafe. As a result, the ZIP and ZINB are two models that have been applied to account for the preponderance of “excess” zeros frequently observed in crash count data. The objective of this study is to provide defensible guidance on how to appropriately model crash data. We first examine the motor vehicle crash process using theoretical principles and a basic understanding of the crash process. It is shown that the fundamental crash process follows a series of Bernoulli trials with unequal, independent event probabilities, also known as Poisson trials. We examine the evolution of statistical models as they apply to the motor vehicle crash process, and indicate how well they statistically approximate the crash process. We also present the theory behind dual-state process count models, and note why they have become popular for modeling crash data. A simulation experiment is then conducted to demonstrate how the “excess” zeros frequently observed in crash data arise. It is shown that the Poisson and other mixed probabilistic structures are approximations assumed for modeling the motor vehicle crash process. Furthermore, it is demonstrated that under certain (fairly common) circumstances excess zeros are observed, and that these circumstances arise from low exposure and/or inappropriate selection of time/space scales and not an underlying dual-state process. In conclusion, carefully selecting the time/space scales for analysis, including an improved set of explanatory variables and/or unobserved heterogeneity effects in count regression models, or applying small-area statistical methods (observations with low exposure) represent the most defensible modeling approaches for datasets with a preponderance of zeros.
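The abstract's central simulation argument can be illustrated with a short, hedged sketch: sites with heterogeneous, low Poisson crash rates produce more zeros than a single homogeneous Poisson fitted to the pooled mean would predict, with no dual-state ("perfectly safe") mechanism involved. The site count, rate distribution and single observation period below are illustrative assumptions, not values from the study.

    import numpy as np

    rng = np.random.default_rng(1)

    n_sites = 2000                                       # assumed number of road segments
    lam = rng.gamma(shape=0.5, scale=1.0, size=n_sites)  # heterogeneous, low crash rates per period
    counts = rng.poisson(lam)                            # one short observation period per site

    obs_zero_frac = np.mean(counts == 0)
    hom_zero_frac = np.exp(-counts.mean())               # zeros expected from a single homogeneous
                                                         # Poisson fitted to the pooled mean
    print(f"observed zero fraction:  {obs_zero_frac:.3f}")
    print(f"homogeneous Poisson:     {hom_zero_frac:.3f}")
    # The observed fraction exceeds the homogeneous prediction (Jensen's inequality),
    # so "excess" zeros appear even though every site follows an ordinary Poisson process.

Lengthening the observation period (larger rates) drives the zero fraction down in this sketch, mirroring the abstract's point that apparent dual-state behaviour is tied to low exposure and the choice of time/space scales.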

Relevance: 100.00%

Abstract:

Invasion waves of cells play an important role in development, disease and repair. Standard discrete models of such processes typically involve simulating cell motility, cell proliferation and cell-to-cell crowding effects in a lattice-based framework. The continuum-limit description is often given by a reaction–diffusion equation that is related to the Fisher–Kolmogorov equation. One of the limitations of a standard lattice-based approach is that real cells move and proliferate in continuous space and are not restricted to a predefined lattice structure. We present a lattice-free model of cell motility and proliferation, with cell-to-cell crowding effects, and we use the model to replicate invasion wave-type behaviour. The continuum-limit description of the discrete model is a reaction–diffusion equation with a proliferation term that is different from lattice-based models. Comparing lattice-based and lattice-free simulations indicates that both models lead to invasion fronts that are similar at the leading edge, where the cell density is low. Conversely, the two models make different predictions in the high-density region of the domain, well behind the leading edge. We analyse the continuum-limit descriptions of the lattice-based and lattice-free models to show that both give rise to invasion wave-type solutions that move with the same speed but have very different shapes. We explore the significance of these differences by calibrating the parameters in the standard Fisher–Kolmogorov equation using data from the lattice-free model. We conclude that estimating parameters using this kind of standard procedure can produce misleading results.
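For reference, the standard Fisher–Kolmogorov equation invoked above can be written, in generic notation that may differ from the paper's (cell density u, diffusivity D, proliferation rate \lambda, carrying capacity K), together with its classical minimum travelling-wave speed:

    \[
      \frac{\partial u}{\partial t} = D\,\frac{\partial^{2} u}{\partial x^{2}}
        + \lambda\, u\left(1 - \frac{u}{K}\right),
      \qquad
      c_{\min} = 2\sqrt{\lambda D}.
    \]

Because the minimum wave speed depends only on D and \lambda, matching the speed of a simulated front does not constrain its shape, which is consistent with the abstract's finding that two models can travel at the same speed yet differ markedly behind the leading edge.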

Relevance: 100.00%

Abstract:

In 2002, the United Nations Office on Drugs and Crime (UNODC) issued a report entitled Results of a pilot survey of forty selected organized criminal groups in sixteen countries, which established five models of organised crime. This paper reviews these and other common organised crime and drug trafficking models, and applies them to cases of South-East Asian drug trafficking in the Australian state of Queensland. The study tests the following hypotheses: (1) South-East Asian drug trafficking groups in Queensland will operate within a criminal network or core group; (2) wholesale drug distributors in Queensland will not fit consistently under any particular UN organised crime model; and (3) street dealers will have no organisational structure. The study concluded that drug trafficking or importation closely resembles a criminal network or core group structure. Wholesale dealers did not fit consistently into any UN organised crime model. Street dealers had no organisational structure, as such structure is typically found only in mid- to high-level drug trafficking.

Relevance: 100.00%

Abstract:

In his 2007 PESA keynote address, Paul Smeyers discussed the increasing regulation of child-rearing through government intervention and the generation of “experts,” citing particular examples from Europe where cases of childhood obesity and parental neglect have stirred public opinion and political debate. In his paper (this issue), Smeyers touches on a number of tensions before concluding that child-rearing qualifies as a practice in which liberal governments should be reluctant to intervene. In response, I draw on recent experiences in Australia and argue that certain tragic events of late are the result of an ethical, moral and social vacuum in which these tensions coalesce. While I agree with Smeyers that governments should be reluctant to “intervene” in the private domain of the family, I argue that there is a difference between intervention and support. In concluding, I maintain that if certain Western liberal democracies did a more comprehensive job of supporting children and their families through active social investment in primary school education, then both families and schools would be better equipped to deal with the challenges they now face.

Relevance: 100.00%

Abstract:

This thesis makes several contributions towards improved methods for encoding structure in computational models of word meaning. New methods are proposed and evaluated that encode linguistic structural features within a computational representation while retaining the ability to scale to large volumes of textual data. The methods are implemented and assessed on a range of evaluation tasks to demonstrate their effectiveness.

Relevance: 100.00%

Abstract:

This presentation discusses topics and issues that connect closely with the Conference Themes and themes in the ARACY Report Card. For example, developing models of public space that are safe, welcoming and relevant to children and young people will impact on their overall wellbeing and may help to prevent many of the tensions occurring in Australia and elsewhere around the world. This area is the subject of ongoing international debate, research and policy formation, relevant to concerns in the ARACY Report Card about children and young people’s health and safety, participation, behaviours and risks, and peer and family relationships.

Relevance: 100.00%

Abstract:

Service processes such as financial advice, booking a business trip or conducting a consulting project have emerged as units of analysis of high interest for the business process and service management communities in practice and academia. While the transactional nature of production processes is relatively well understood and deployed, the less predictable and highly interactive nature of service processes still lacks appropriate methodological grounding in many areas. This paper proposes a process laboratory framework as a new IT artefact to facilitate the holistic analysis and simulation of such service processes. Using financial services as an example, it is shown how such a process laboratory can be used to reduce the complexity of service process analysis and facilitate operational service process control.
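As a rough illustration of the kind of simulation a process laboratory could support, the sketch below runs a Monte Carlo model of a hypothetical financial-advice process whose interactive steps have variable durations and an optional client-driven rework loop. The step names, durations and rework probability are invented for illustration and are not taken from the paper.

    import random

    random.seed(7)

    # Hypothetical financial-advice process: (step name, min duration, max duration) in minutes.
    STEPS = [("intake interview", 20, 40),
             ("risk profiling",   10, 25),
             ("draft proposal",   30, 60),
             ("client review",    15, 30)]
    REWORK_PROB = 0.35   # assumed chance the client asks for a revised proposal

    def simulate_case() -> float:
        total = sum(random.uniform(lo, hi) for _, lo, hi in STEPS)
        while random.random() < REWORK_PROB:      # interactive rework loop
            total += random.uniform(20, 45)       # revise proposal and review again
        return total

    durations = sorted(simulate_case() for _ in range(10_000))
    mean = sum(durations) / len(durations)
    p90 = durations[int(0.9 * len(durations))]
    print(f"mean case duration: {mean:.0f} min, 90th percentile: {p90:.0f} min")

Even this toy model makes the control angle concrete: the long tail produced by rework, rather than the average step time, is the kind of behaviour operational service process control would need to expose.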

Relevance: 100.00%

Abstract:

Mainstream business process modelling techniques promote a design paradigm wherein the activities to be performed within a case, together with their usual execution order, form the backbone of a process model, on top of which other aspects are anchored. This paradigm, while effective in standardised and production-oriented domains, shows some limitations when confronted with processes where case-by-case variations and exceptions are the norm. In this thesis we develop the idea that the effective design of flexible process models calls for an alternative modelling paradigm, one in which process models are modularised along key business objects, rather than along activity decompositions. The research follows a design science method, starting from the formulation of a research problem expressed in terms of requirements, and culminating in a set of artifacts that have been devised to satisfy these requirements. The main contributions of the thesis are: (i) a meta-model for object-centric process modelling incorporating constructs for capturing flexible processes; (ii) a transformation from this meta-model to an existing activity-centric process modelling language, namely YAWL, showing the relation between object-centric and activity-centric process modelling approaches; and (iii) a Coloured Petri Net that captures the semantics of the proposed meta-model. The meta-model has been evaluated using a framework consisting of a set of workflow patterns. Moreover, the meta-model has been embodied in a modelling tool that has been used to capture two industrial scenarios.
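One minimal way to picture "modularising a process model along key business objects" is to give each object its own lifecycle (states plus permitted transitions) and to express synchronisation as cross-object constraints rather than as a global activity flow. The object types, states and constraint below are illustrative assumptions, not the meta-model defined in the thesis.

    from dataclasses import dataclass

    @dataclass
    class ObjectLifecycle:
        """A business object modelled by its current state and permitted transitions."""
        name: str
        state: str
        transitions: dict[str, set[str]]   # state -> allowed next states

        def move(self, new_state: str) -> None:
            if new_state not in self.transitions.get(self.state, set()):
                raise ValueError(f"{self.name}: {self.state} -> {new_state} not allowed")
            self.state = new_state

    # Two interrelated objects of a hypothetical ordering scenario.
    order   = ObjectLifecycle("Order",   "created", {"created": {"approved"}, "approved": {"shipped"}})
    invoice = ObjectLifecycle("Invoice", "open",    {"open": {"sent"}, "sent": {"paid"}})

    def ship_order() -> None:
        # Cross-object constraint standing in for explicit control flow:
        # an order may only be shipped once its invoice has been sent.
        if invoice.state not in {"sent", "paid"}:
            raise ValueError("Order cannot ship before the invoice is sent")
        order.move("shipped")

    order.move("approved")
    invoice.move("sent")
    ship_order()
    print(order.state, invoice.state)   # shipped sent

An activity-centric language such as YAWL would instead encode the same constraint as ordering between tasks, which is the gap the thesis's transformation from the object-centric meta-model is described as bridging.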

Relevance: 100.00%

Abstract:

The improvement and optimization of business processes is one of the top priorities in an organization. Although process analysis methods are mature today, business analysts and stakeholders are still hampered by communication issues. That is, analysts cannot effectively obtain accurate business requirements from stakeholders, and stakeholders are often confused about analytic results offered by analysts. We argue that using a virtual world to model a business process can benefit communication activities. We believe that virtual worlds can be used as an efficient model-view approach, improving the comprehension of business requirements and analytic results, as well as the possibility of business plan validation. A healthcare case study is provided as an instance of the approach, illustrating how intuitive it can be. As an exploratory paper, we believe this promising research can encourage further investigation of topics in the interdisciplinary area of information systems, visualization and multi-user virtual worlds.

Relevance: 100.00%

Abstract:

A pressing cost issue facing construction is the procurement of off-site pre-manufactured assemblies. In order to encourage Australian adoption of off-site manufacture (OSM), a new approach to the underlying processes is required. The advent of object-oriented digital models for construction design assumes intelligent use of data. However, the construction production system relies on traditional methods and data sources and is expected to benefit from the application of well-established business process management techniques. The integration of the old and new data sources allows for the development of business process models which, by capturing typical construction processes involving OSM, provide insights into such processes. This integrative approach is the foundation of research into the use of OSM to increase construction productivity in Australia. The purpose of this study is to develop business process models capturing the procurement, resources and information flow of construction projects. For each stage of the construction value chain, a number of sub-processes are identified. Business Process Modelling Notation (BPMN), a mainstream business process modelling standard, is used to create baseline generic construction process models. These models identify OSM decision-making points that could provide cost reductions in procurement workflow and management systems. This paper reports on phase one of ongoing research aiming to develop a prototype workflow application that can provide semi-automated support to construction processes involving OSM and assist in decision-making in the adoption of OSM, thus contributing to a sustainable built environment.

Relevance: 100.00%

Abstract:

Vendors provide reference process models as consolidated, off-the-shelf solutions to capture best practices in a given industry domain. Customers can then adapt these models to suit their specific requirements. Traditional process flexibility approaches facilitate this operation, but do not fully address it, as they do not sufficiently take into account controlled change guided by vendors' reference models. The tension between the customer's freedom to adapt reference models and the ability to incorporate vendor-initiated reference model changes with relatively low effort thus needs to be carefully balanced. This paper introduces process extensibility as a new paradigm for customizing reference processes and managing their evolution over time. Process extensibility mandates a clear recognition of the different responsibilities and interests of reference model vendors and consumers, and is concerned with keeping the effort of customer-side reference model adaptations low while allowing sufficient room for model change.

Relevance: 100.00%

Abstract:

Cognitive modelling of phenomena in clinical practice allows the operationalisation of otherwise diffuse descriptive terms such as craving or flashbacks. This supports the empirical investigation of the clinical phenomena and the development of targeted treatment interventions. This paper focuses on the cognitive processes underpinning craving, which is recognised as a motivating experience in substance dependence. We use a high-level cognitive architecture, Interacting Cognitive Subsystems (ICS), to compare two theories of craving: Tiffany's theory, centred on the control of automated action schemata, and our own Elaborated Intrusion theory of craving. Data from a questionnaire study of the subjective aspects of everyday desires experienced by a large non-clinical population are presented. Both the data and the high-level modelling support the central claim of the Elaborated Intrusion theory that imagery is a key element of craving, providing the subjective experience and mediating much of the associated disruption of concurrent cognition.

Relevance: 100.00%

Abstract:

This chapter will address psychodynamic, cognitive-behavioural, and developmental models in supervision by initially considering the historical underpinnings of each and then examining in turn some of the key processes that are evident in supervisory relationships. Case studies are included where appropriate to highlight the application of theory to practice, and several processes are fully elaborated across all models to enable a contemporary view of style and substance in the supervision context.

Relevance: 100.00%

Abstract:

We have developed a new experimental method for interrogating statistical theories of music perception by implementing these theories as generative music algorithms. We call this method Generation in Context. This method differs from most experimental techniques in music perception in that it incorporates aesthetic judgments. Generation in Context is designed to measure percepts for which the musical context is suspected to play an important role. In particular, the method is suitable for the study of perceptual parameters which are temporally dynamic. We outline a use of this approach to investigate David Temperley’s (2007) probabilistic melody model, and provide some provisional insights as to what is revealed about the model. We suggest that Temperley’s model could be improved by dynamically modulating the probability distributions according to the changing musical context.
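To make the closing suggestion concrete, the sketch below samples each pitch from a distribution formed by multiplying a key-profile weight by a proximity weight centred on the previous note, then tightens the proximity profile partway through as a stand-in for context-dependent modulation. The profile values and the rescaling rule are invented for illustration; this is a schematic of the general model family, not Temperley's published model.

    import math
    import random

    random.seed(3)

    # Toy major-key profile over the 12 pitch classes (invented weights, not Temperley's).
    KEY_PROFILE = [5, 1, 3, 1, 4, 3, 1, 4, 1, 3, 1, 2]

    def next_pitch(prev: int, key_root: int, proximity_sd: float) -> int:
        """Sample a MIDI pitch weighted by key membership and proximity to the previous note."""
        candidates = list(range(prev - 12, prev + 13))          # within an octave of the last note
        weights = [KEY_PROFILE[(p - key_root) % 12]
                   * math.exp(-((p - prev) ** 2) / (2 * proximity_sd ** 2))
                   for p in candidates]
        return random.choices(candidates, weights=weights, k=1)[0]

    melody = [60]                                               # start on middle C, key of C
    for i in range(15):
        sd = 4.0 if i < 8 else 2.0                              # assumed context-dependent rescaling
        melody.append(next_pitch(melody[-1], key_root=0, proximity_sd=sd))
    print(melody)

In the terms of the abstract, the question of interest would be whether listeners' aesthetic judgments of such output improve when the distributions are modulated by the evolving context rather than held fixed.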