915 results for 2-STATE MARKOV-PROCESSES
Abstract:
This work reviews the rationale and processes for raising revenue and allocating funds to perform information-intensive activities that are pertinent to the work of democratic government. ‘Government of the people, by the people, for the people’ expresses the idea that democratic government has no higher authority than the people who agree to be bound by its rules. Democracy depends on continually learning how to develop understandings and agreements that can sustain the voting majorities on which democratic law-making and collective action depend. The objective, expressed in constitutional terms, is to deliver ‘peace, order and good government’. Meeting this objective requires a collective intellectual authority that can understand what is possible, and a collective moral authority to understand what ought to happen in practice. Facts of life determine that a society needs to retain its collective competence despite a continual turnover of its membership as people die but life goes on. Retaining this ‘collective competence’ in matters of self-government depends on each new generation:
• acquiring a collective knowledge of how to produce the goods and services needed to sustain a society and its capacity for self-government;
• learning how to defend society diplomatically and militarily against external forces to prevent the overthrow of its self-governing capacity; and
• learning how to defend society against divisive internal forces to preserve the authority of representative legislatures, allow peaceful dispute resolution and maintain social cohesion.
Abstract:
Plant biosecurity requires statistical tools to interpret field surveillance data in order to manage pest incursions that threaten crop production and trade. Ultimately, management decisions need to be based on the probability that an area is infested or free of a pest. Current informal approaches to delimiting pest extent rely upon expert ecological interpretation of presence/absence data over space and time. Hierarchical Bayesian models provide a cohesive statistical framework that can formally integrate the available information on both pest ecology and data. The overarching method involves constructing an observation model for the surveillance data, conditional on the hidden extent of the pest and uncertain detection sensitivity. The extent of the pest is then modelled as a dynamic invasion process that includes uncertainty in ecological parameters. Modelling approaches to assimilate this information are explored through case studies on spiralling whitefly, Aleurodicus dispersus, and red banded mango caterpillar, Deanolis sublimbalis. Markov chain Monte Carlo simulation is used to estimate the probable extent of pests, given the observation and process model conditioned by surveillance data. Statistical methods, based on time-to-event models, are developed to apply hierarchical Bayesian models to early detection programs and to demonstrate area freedom from pests. The value of early detection surveillance programs is demonstrated through an application to interpret surveillance data for exotic plant pests with uncertain spread rates. The model suggests that typical early detection programs provide a moderate reduction in the probability of an area being infested but a dramatic reduction in the expected area of incursions at a given time. Estimates of spiralling whitefly extent are examined at local, district and state-wide scales.
The local model estimates the rate of natural spread and the influence of host architecture, host suitability and inspector efficiency. These parameter estimates can support the development of robust surveillance programs. Hierarchical Bayesian models for the human-mediated spread of spiralling whitefly are developed for the colonisation of discrete cells connected by a modified gravity model. By estimating dispersal parameters, the model can be used to predict the extent of the pest over time. An extended model predicts the climate restricted distribution of the pest in Queensland. These novel human-mediated movement models are well suited to demonstrating area freedom at coarse spatio-temporal scales. At finer scales, and in the presence of ecological complexity, exploratory models are developed to investigate the capacity for surveillance information to estimate the extent of red banded mango caterpillar. It is apparent that excessive uncertainty about observation and ecological parameters can impose limits on inference at the scales required for effective management of response programs. The thesis contributes novel statistical approaches to estimating the extent of pests and develops applications to assist decision-making across a range of plant biosecurity surveillance activities. Hierarchical Bayesian modelling is demonstrated as both a useful analytical tool for estimating pest extent and a natural investigative paradigm for developing and focussing biosecurity programs.
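The core inferential step the abstract describes, updating the probability that an area is infested given negative surveillance with uncertain detection sensitivity, can be illustrated with a minimal sketch. This is not the thesis's model; the Beta prior on sensitivity and all parameter values are illustrative assumptions.

```python
# Sketch: posterior probability that an area is infested after n negative
# surveillance rounds, with Monte Carlo averaging over an uncertain
# detection sensitivity (Beta prior). All numbers are illustrative.
import random

def posterior_infested(prior, n_negative, sensitivity):
    """Bayes update: P(infested | n negative surveys, known sensitivity)."""
    # Likelihood of n negatives if infested: (1 - sensitivity)^n
    # Likelihood of n negatives if free: 1
    num = prior * (1 - sensitivity) ** n_negative
    return num / (num + (1 - prior))

def posterior_uncertain(prior, n_negative, draws=10000, a=8, b=2, seed=1):
    """Average the posterior over sensitivity draws from a Beta(a, b) prior."""
    rng = random.Random(seed)
    samples = [posterior_infested(prior, n_negative, rng.betavariate(a, b))
               for _ in range(draws)]
    return sum(samples) / len(samples)
```

Each negative survey shrinks the infestation probability; uncertainty in sensitivity keeps the posterior more conservative than a fixed-sensitivity update would suggest.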
Abstract:
The field of workflow technology has burgeoned in recent years, providing a variety of means of automating business processes. It is a great source of opportunity for organisations seeking to streamline and optimise their operations. Despite these advantages, however, the current generation of workflow technologies is subject to a variety of criticisms, in terms of their restricted view of what comprises a business process, their imprecise definitions and their general inflexibility. As a remedy to these potential difficulties, in this paper we propose a series of development goals for the next generation of workflow technology. We also present newYAWL, a formally defined, multi-perspective reference language for workflow systems.
Abstract:
Competitive markets are increasingly driving new initiatives for shorter cycle times, resulting in increased overlapping of project phases. This, in turn, necessitates improving the interfaces between the different phases to be overlapped (integrated), thus allowing transfer of processes, information and knowledge from one individual or team to another. This transfer between phases, within and between projects, is one of the basic challenges to the philosophy of project management. To make the process transfer more transparent with minimal loss of momentum and project knowledge, this paper draws upon Total Quality Management (TQM) and Business Process Re-engineering (BPR) philosophies to develop a Best Practice Model for managing project phase integration. The paper presents the rationale behind the model development and outlines its two key parts: (1) a Strategic Framework and (2) an Implementation Plan. Key components of both the Strategic Framework and the Implementation Plan are presented and discussed.
Abstract:
This is the second article in a series of three that examines the legal role of medical professionals in decisions to withhold or withdraw life-sustaining treatment from adults who lack capacity. This article considers the position in Queensland, including the parens patriae jurisdiction of the Supreme Court. A review of the law in this State reveals that medical professionals play significant legal roles in these decisions. However, the law is problematic in a number of respects and this is likely to impede medical professionals’ legal knowledge in this area. The article examines the level of training medical professionals receive on issues such as advance health directives and substitute decision-making, and the available empirical evidence as to the state of medical professionals’ knowledge of the law at the end of life. It concludes that there are gaps in legal knowledge and that law reform is needed in Queensland.
Abstract:
Increasingly, almost everything we do in our daily lives is being influenced by information and communications technologies (ICTs), including the Internet. The task of governance is no exception, with an increasing number of national, state, and local governments utilizing ICTs to support government operations, engage citizens, and provide government services. As with other things, the process of governance is now being prefixed with an “e”. E-governance can range from simple Web sites that convey basic information to complex sites that transform the customary ways of delivering all sorts of government services. In this respect, local e-government is the form of e-governance that specifically focuses on the online delivery of suitable local services by local authorities. In practice, local e-government reflects four dimensions, each one dealing with the functions of government itself. The four are: (a) e-services, the electronic delivery of government information, programs, and services, often over the Internet; (b) e-management, the use of information technology to improve the management of government, which might range from streamlining business processes to improving the flow of information within government departments; (c) e-democracy, the use of electronic communication vehicles, such as e-mail and the Internet, to increase citizen participation in the public decision-making process; and (d) e-commerce, the exchange of money for goods and services over the Internet, which might include citizens paying taxes and utility bills, renewing vehicle registrations, and paying for recreation programs, or government buying office supplies and auctioning surplus equipment (Cook, LaVigne, Pagano, Dawes, & Pardo, 2002). Commensurate with the rapid increase in the process of developing e-governance tools, there has been an increased interest in benchmarking the process of local e-governance.
This benchmarking, which includes the processes involved in e-governance as well as the extent of e-governance adoption or take-up is important as it allows for improved processes and enables government agencies to move towards world best practice. It is within this context that this article discusses benchmarking local e-government. It brings together a number of discussions regarding the significance of benchmarking, best practices and actions for local e-government, and key elements of a successful local e-government project.
Abstract:
Different from conventional methods for structural reliability evaluation, such as first/second-order reliability methods (FORM/SORM) or Monte Carlo simulation based on corresponding limit state functions, a novel approach based on a dynamic object-oriented Bayesian network (DOOBN) for predicting the structural reliability of a steel bridge element is proposed in this paper. The DOOBN approach can effectively model the deterioration processes of a steel bridge element and predict its structural reliability over time. This approach is also able to achieve Bayesian updating with observed information from measurements, monitoring and visual inspection. Moreover, the computational capacity embedded in the approach can be used to facilitate integrated management and maintenance optimization in a bridge system. A steel bridge girder is used to validate the proposed approach. The predicted results are compared with those evaluated by the FORM method.
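The two operations a dynamic Bayesian network performs here, propagating a hidden deterioration state forward in time and updating the state belief from imperfect inspection evidence, can be sketched minimally. The 3-state chain, transition matrix, and observation probabilities below are invented for illustration, not taken from the paper.

```python
# Minimal sketch of dynamic-Bayesian-network-style deterioration modelling:
# a hidden condition state evolves by a Markov transition matrix, and noisy
# inspection evidence updates the belief. All probabilities are illustrative.

STATES = ["good", "fair", "poor"]
# Per-period transition probabilities (rows: from-state; degradation only)
T = [[0.90, 0.08, 0.02],
     [0.00, 0.85, 0.15],
     [0.00, 0.00, 1.00]]
# P(inspection reports state j | true state i) -- imperfect observation
O = [[0.80, 0.15, 0.05],
     [0.10, 0.80, 0.10],
     [0.05, 0.15, 0.80]]

def predict(belief):
    """One time step of the deterioration process (prior prediction)."""
    return [sum(belief[i] * T[i][j] for i in range(3)) for j in range(3)]

def update(belief, observed):
    """Bayesian update of the belief given an inspection outcome."""
    j = STATES.index(observed)
    post = [belief[i] * O[i][j] for i in range(3)]
    z = sum(post)
    return [p / z for p in post]

belief = [1.0, 0.0, 0.0]            # start as-new
for year in range(5):
    belief = predict(belief)        # five periods of deterioration
belief = update(belief, "fair")     # inspection evidence
reliability = 1.0 - belief[2]       # P(not in the worst state)
```

Reliability over time falls out of repeated `predict` steps, while each inspection pulls the belief back toward the evidence, which is the updating behaviour the abstract attributes to the DOOBN.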
Abstract:
Research on diaspora has long been dominated by approaches that centre on displacement, relocation, mixed identities, cultural hybridity, loss, yearning and disaffection. In this paper, I outline a fresh conceptual framework, franchise nation, which approaches the study of diaspora from the perspective of the state. What this framework allows is the study of the processes that states employ to woo, nurture and engage their diasporas so as to extend their sovereignty extra-territorially, i.e., statecraft. The franchise nation concept draws on the notion of cultural expediency and complements two approaches that dominate the study of statecraft today: soft power and nation branding. However, the point of this is not, to borrow Gayatri Spivak’s words, to be either pro- or anti-sovereign, but rather to stay awake to how sovereignty is “invoked, extended, deterritorialised, aggregated, [and] abrogated” (2007). Far from suggesting the imminent arrival of a post-national period, the intention with the franchise nation concept is to explicate and better understand the complexities that inhabit the terrain between diaspora, home and host nation that allow and accompany the exercise of sovereignty from afar.
Abstract:
Information overload has become a serious issue for web users. Personalisation can provide effective solutions to overcome this problem. Recommender systems are one popular personalisation tool for helping users deal with this issue. As the basis of personalisation, the accuracy and efficiency of web user profiling greatly affect the performance of recommender systems and other personalisation systems. In Web 2.0, emerging user information provides new possible solutions for profiling users. Folksonomy, or tag information, is a typical kind of Web 2.0 information. Folksonomy implies users’ topic interests and opinion information, and it has become another important source of user information for profiling users and making recommendations. However, since tags are arbitrary words given by users, folksonomy contains a lot of noise, such as tag synonyms, semantic ambiguities and personal tags. Such noise makes it difficult to profile users accurately or to make quality recommendations. This thesis investigates the distinctive features and multiple relationships of folksonomy and explores novel approaches to solve the tag quality problem and profile users accurately. Harvesting the wisdom of crowds and experts, three new user profiling approaches are proposed: a folksonomy-based user profiling approach, a taxonomy-based user profiling approach, and a hybrid user profiling approach based on both folksonomy and taxonomy. The proposed user profiling approaches are applied to recommender systems to improve their performance. Based on the generated user profiles, user- and item-based collaborative filtering approaches, combined with content filtering methods, are proposed to make recommendations. The proposed user profiling and recommendation approaches have been evaluated through extensive experiments. The effectiveness evaluation experiments were conducted on two real-world datasets collected from the Amazon.com and CiteULike websites.
The experimental results demonstrate that the proposed user profiling and recommendation approaches outperform related state-of-the-art approaches. In addition, this thesis proposes a parallel, scalable user profiling implementation based on advanced cloud computing techniques such as Hadoop, MapReduce and Cascading. The scalability evaluation experiments were conducted on a large-scale dataset collected from the Del.icio.us website. This thesis contributes to the effective use of the wisdom of crowds and experts to help users overcome information overload by providing more accurate, effective and efficient user profiling and recommendation approaches. It also contributes to better use of taxonomy information given by experts and folksonomy information contributed by users in Web 2.0.
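The basic idea of folksonomy-based profiling, representing each user by their tag usage and comparing users by profile similarity to drive recommendations, can be sketched in a few lines. This is an illustrative simplification, not the thesis's method; the users, items, and tags below are invented.

```python
# Illustrative sketch: tag-frequency user profiles from folksonomy data,
# compared by cosine similarity; a similar user's unseen items become
# recommendation candidates. Names and data are invented examples.
from collections import Counter
from math import sqrt

def profile(tag_assignments):
    """tag_assignments: list of (item, tag) pairs for one user."""
    return Counter(tag for _, tag in tag_assignments)

def cosine(p, q):
    """Cosine similarity between two tag-frequency profiles."""
    dot = sum(p[t] * q[t] for t in p)
    norm = sqrt(sum(v * v for v in p.values())) * sqrt(sum(v * v for v in q.values()))
    return dot / norm if norm else 0.0

alice = [("item1", "python"), ("item2", "bayes"), ("item3", "python")]
bob   = [("item2", "python"), ("item4", "bayes"), ("item5", "python")]

sim = cosine(profile(alice), profile(bob))
# Candidate recommendations for Alice: Bob's items she has not tagged
seen = {item for item, _ in alice}
recs = sorted({item for item, _ in bob} - seen)
```

Tag noise (synonyms, ambiguity, personal tags) is exactly what breaks this naive scheme, which is why the thesis augments folksonomy with expert taxonomy information.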
Abstract:
According to Zygmunt Bauman in Liquid Modernity (2000), the formerly solid and stable institutions of social life that characterised earlier stages of modernity have become fluid. He sees this as an outcome of the modernist project of progress itself, which in seeking to dismantle oppressive structures failed to reconstruct new roles for society, community and the individual. The un-tethering of social life from tradition in the latter stages of the twentieth century has produced unprecedented freedoms and unparalleled uncertainties, at least in the West. Although Bauman’s elaboration of some of the features and drivers of liquid modernity – increased mobility, rapid communications technologies, individualism – suggests it to be a neologism for globalisation, it is arguably also the context which has allowed this phenomenon to flourish. The qualities of fluidity, leakage, and flow that distinguish uncontained liquids also characterise globalisation, which encompasses a range of global trends and processes no longer confined to, or controlled by, the ‘container’ of the nation or state. The concept of liquid modernity helps to explain the conditions under which globalisation discourses have found a purchase and, by extension, the world in which contemporary children’s literature, media, and culture are produced. Perhaps more significantly, it points to the fluid conceptions of self and other that inform the ‘liquid’ worldview of the current generation of consumers of texts for children and young adults. This generation is growing up under the phase of globalisation we describe in this chapter.
Abstract:
The structure and dynamics of a modern business environment are very hard to model using traditional methods. Such complexity raises challenges to effective business analysis and improvement. The importance of applying business process simulation to analyze and improve business activities has been widely recognized. However, one remaining challenge is the development of approaches to human resource behavior simulation. To address this problem, we describe a novel simulation approach where intelligent agents are used to simulate human resources by performing allocated work from a workflow management system. The behavior of the intelligent agents is driven by a state transition mechanism called a Hierarchical Task Network (HTN). We demonstrate and validate our simulator via a medical treatment process case study. Analysis of the simulation results shows that the behavior driven by the HTN is consistent with the design of the workflow model. We believe these preliminary results support the development of more sophisticated agent-based human resource simulation systems.
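The core mechanism named here, an HTN that decomposes compound tasks into subtasks until only primitive actions remain, can be shown with a toy example. The task names echo the medical treatment setting but are invented for illustration; this is not the paper's task model.

```python
# Toy HTN sketch: compound tasks decompose via methods into subtasks until
# only primitive actions remain; the ordered result is the agent's plan.
# Task and method names are invented illustrations.

METHODS = {
    "treat_patient": ["register", "diagnose", "administer_treatment"],
    "diagnose": ["take_history", "run_tests"],
}
PRIMITIVE = {"register", "take_history", "run_tests", "administer_treatment"}

def decompose(task):
    """Recursively expand a task into an ordered list of primitive actions."""
    if task in PRIMITIVE:
        return [task]
    plan = []
    for subtask in METHODS[task]:
        plan.extend(decompose(subtask))
    return plan

plan = decompose("treat_patient")
```

An agent executing `plan` step by step behaves as a simulated human resource working through allocated items, which is the role the paper assigns to HTN-driven agents.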
Abstract:
In recent years, there has been a significant amount of research and development in the area of solar photocatalysis. This paper reviews and summarizes the mechanism of the photocatalytic oxidation process, the types of photocatalyst, the factors influencing photoreactor efficiency, and the most recent findings related to solar detoxification and disinfection of water contaminants. Various solar reactors for photocatalytic water purification are also briefly described. The future potential of solar photocatalysis for storm water treatment and reuse is also discussed to ensure sustainable use of solar energy and storm water resources.
Abstract:
Scholars of local government have repeatedly lamented the lack of literature on the subject (e.g., Mowbray 1997; Pini, Previte, Haslam & McKenzie 2007). As Dollery, Marshall and Worthington (2003: 1) have commented, local government has often been the ‘poor cousin of its more exalted relatives in terms of the attention it attracts from the research community.’ The exalted relatives Dollery et al. (2003) refer to are national political environments, where women’s participation has elicited significant attention. However, the dearth of research on the specific subject of women’s representation in local government is rarely acknowledged (Neyland & Tucker 1996; Whip & Fletcher 1999). This edited book attempts to redress this situation. Each chapter applies an explicit gender analysis to its specific topic of focus, making ‘gender visible in social phenomenon; [and] asking if, how, and why social processes, standards, and opportunities differ systematically for women and men’ (Howard, Risman & Sprague 2003: 1). These analyses in the local government context are critical for understanding the extent and nature of balanced representation at all levels of government. Furthermore, some women start their elective careers serving on school boards, city or town councils, or as mayors, before progressing to state and national legislative offices. Hence, the experiences of women in local government illustrate broader notions of democracy and may, for some individual women, shape their opportunities further along the political pipeline.
Abstract:
Tony Fitzgerald’s visionary leap was to see beyond localised, individual wrongdoing. He suggested remedies that were systemic, institutionalised, and directed at the underlying structural problems that led to corruption. His report said ‘the problems with which this Inquiry is concerned are not merely associated with individuals, but are institutionalized and related to attitudes which have become entrenched’ (Fitzgerald Report 1989, 13). His response was to suggest an enmeshed system of measures to not only respond reactively to future corruption, but also to prevent its recurrence through improved integrity systems. In the two decades since that report, the primary focus of corruption studies and anti-corruption activism has remained on corruption at the local level or within sovereign states. International activism was largely directed at co-ordinating national campaigns and at using international instruments to make these campaigns more effective domestically. This reflects the broader fact that, since the rise of the nation state, states have comprised the majority of the largest institutional actors and have been the most significant institution in the lives of most individuals. This made states the ‘main game in town’ for the ‘governance disciplines’ of ethics, law, political science and economics.
Abstract:
Asset health inspections can produce two types of indicators: (1) direct indicators (e.g. the thickness of a brake pad, or the crack depth on a gear), which directly relate to a failure mechanism; and (2) indirect indicators (e.g. indicators extracted from vibration signals and oil analysis data), which can only partially reveal a failure mechanism. While direct indicators enable more precise references to asset health condition, they are often more difficult to obtain than indirect indicators. The state space model provides an efficient approach to estimating direct indicators from indirect indicators. However, existing state space models for estimating direct indicators largely depend on assumptions such as discrete time, discrete state, linearity, and Gaussianity. The discrete time assumption requires fixed inspection intervals. The discrete state assumption entails discretising continuous degradation indicators, which often introduces additional errors. The linear and Gaussian assumptions are inconsistent with the nonlinear and irreversible degradation processes in most engineering assets. This paper proposes a state space model free of these assumptions. Monte Carlo-based algorithms are developed to estimate the model parameters and the remaining useful life. These algorithms are evaluated for performance using numerical simulations in MATLAB. The results show that both the parameters and the remaining useful life are estimated accurately. Finally, the new state space model is used to process vibration and crack depth data from an accelerated test of a gearbox. In this application, the new state space model shows a better fit than a state space model with linear and Gaussian assumptions.
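The kind of Monte Carlo state estimation the abstract refers to can be illustrated with a minimal bootstrap particle filter: a hidden, irreversibly growing crack depth is tracked from a noisy indirect indicator. The nonlinear growth law, noise levels, and data below are illustrative assumptions, not the paper's model.

```python
# Minimal bootstrap particle filter sketch: nonlinear, irreversible hidden
# degradation (crack depth) estimated from a noisy indirect measurement.
# All model forms and parameter values are illustrative assumptions.
import math
import random

def run_filter(observations, n=2000, seed=0):
    rng = random.Random(seed)
    particles = [0.1] * n                      # initial crack depth belief
    estimates = []
    for y in observations:
        # Propagate: nonlinear growth plus non-negative process noise,
        # so degradation never reverses
        particles = [x + 0.05 * math.exp(0.1 * x) + abs(rng.gauss(0, 0.01))
                     for x in particles]
        # Weight: Gaussian likelihood of the indirect measurement y
        w = [math.exp(-0.5 * ((y - x) / 0.05) ** 2) for x in particles]
        total = sum(w)
        w = [wi / total for wi in w]
        # Resample (simple multinomial resampling for brevity)
        particles = rng.choices(particles, weights=w, k=n)
        estimates.append(sum(particles) / n)
    return estimates

# Synthetic "true" crack depths standing in for accelerated-test data
true_depths = [0.1 + 0.055 * k for k in range(1, 6)]
est = run_filter(true_depths)
```

Because the propagation step can be any nonlinear, non-Gaussian function, this approach sidesteps the linearity and Gaussianity assumptions the abstract criticises; the remaining-useful-life estimate would follow by propagating the particle cloud forward to a failure threshold.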