933 results for Bayesian framework
Abstract:
Despite an increasing number of acclaimed abstract animations being created with motion capture technologies, there has been little detailed documentation and analysis of this approach to abstract animation production. More specifically, it is unclear what the key considerations are, and what issues practitioners might face, when integrating motion-capture movement data into their practice. In response, this study explored and documented the practice of generating abstract visual and temporal artefacts from motion-captured dance movements to compose abstract animated short films. The study has resulted in a possible framework for this form of practice and outlines five key considerations that practitioners who use motion capture in the production of abstract animated short films should take into account.
Abstract:
Provides an accessible foundation for Bayesian analysis using real-world models. This book presents an introduction to Bayesian modelling and computation through real case studies drawn from diverse fields spanning ecology, health, genetics and finance. Each chapter comprises a description of the problem, the corresponding model, the computational method, results and inferences, as well as the issues that arise in implementing these approaches. Case Studies in Bayesian Statistical Modelling and Analysis:
• Illustrates how to do Bayesian analysis in a clear and concise manner using real-world problems.
• Focuses each chapter on a real-world problem and describes how that problem may be analysed using Bayesian methods.
• Features approaches that can be used in a wide range of application areas, such as health, the environment, genetics, information science, medicine, biology, industry and remote sensing.
Case Studies in Bayesian Statistical Modelling and Analysis is aimed at statisticians, researchers and practitioners who have some expertise in statistical modelling and analysis, and some understanding of the basics of Bayesian statistics, but little experience in its application. Graduate students of statistics and biostatistics will also find this book beneficial.
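The workflow the book describes (problem, model, computation, inference) can be illustrated with the simplest possible case, a conjugate Beta-Binomial update; the occupancy numbers below are hypothetical and not drawn from the book:

```python
def beta_binomial_posterior(a, b, successes, failures):
    """Conjugate update: Beta(a, b) prior + Binomial data -> Beta posterior."""
    return a + successes, b + failures

def beta_mean(a, b):
    """Mean of a Beta(a, b) distribution, used here as a posterior summary."""
    return a / (a + b)

# Hypothetical ecology example: 7 of 10 surveyed sites found occupied,
# uniform Beta(1, 1) prior on the occupancy probability.
a_post, b_post = beta_binomial_posterior(1, 1, successes=7, failures=3)
print(a_post, b_post)                 # Beta(8, 4) posterior
print(round(beta_mean(a_post, b_post), 3))  # posterior mean 8/12 ≈ 0.667
```

Real case studies replace this closed-form update with the computational methods (e.g. MCMC) the chapters describe, but the prior-to-posterior structure is the same.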
Bayesian networks as a complex system tool in the context of a major industry and university project
Abstract:
Stakeholders commonly agree that food systems need to be urgently reformed. Yet, how food systems should be reformed is extremely contested. Public international law and regulations are uniquely placed to influence and guide law, policy, programmes and action at regional, national and local levels. Although plenty of international legal instruments intersect with food-related issues, the international regulation of food systems is fragmented, understudied and contested. To address these issues, this paper maps and analyses the public international regulatory aspects of food production with a view to providing recommendations for reform. Accordingly, the paper brings together a variety of binding and non-binding international regulatory instruments that, to varying degrees and from a range of angles, deal with the first activity in the food system: food production. The paper traces the regulatory tools from natural resources, to the farmers and farm workers who apply skill and experience, and finally to the different dimensions of world trade in food. The various regulatory instruments identified, and their collective whole, are analysed against a rights-based approach to food security.
Abstract:
Conceptual combination performs a fundamental role in creating the broad range of compound phrases utilised in everyday language. While the systematicity and productivity of language provide a strong argument in favour of assuming compositionality, this very assumption is still regularly questioned in both cognitive science and philosophy. This article provides a novel probabilistic framework for assessing whether the semantics of conceptual combinations are compositional, and so can be considered as a function of the semantics of the constituent concepts, or not. Rather than adjudicating between different grades of compositionality, the framework presented here contributes formal methods for determining a clear dividing line between compositional and non-compositional semantics. Compositionality is equated with a joint probability distribution modelling how the constituent concepts in the combination are interpreted. Marginal selectivity is emphasised as a pivotal probabilistic constraint for the application of the Bell/CH and CHSH systems of inequalities (referred to collectively as Bell-type). Non-compositionality is then equated with either a failure of marginal selectivity, or, in the presence of marginal selectivity, with a violation of Bell-type inequalities. In both non-compositional scenarios, the conceptual combination cannot be modelled using a joint probability distribution with variables corresponding to the interpretation of the individual concepts. The framework is demonstrated by applying it to an empirical scenario of twenty-four non-lexicalised conceptual combinations.
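A minimal sketch of the kind of check this framework describes, assuming four context distributions over paired ±1 interpretation outcomes (function names and example data are illustrative, not the authors' code): marginal selectivity is tested first, and only then is the CHSH quantity compared against the classical bound |S| ≤ 2.

```python
def expectation(p):
    """E[XY] for a joint distribution p over outcome pairs in {+1, -1}^2."""
    return sum(x * y * prob for (x, y), prob in p.items())

def marginal(p, index, value):
    """Marginal probability that component `index` of the outcome equals `value`."""
    return sum(prob for outcome, prob in p.items() if outcome[index] == value)

def chsh(p11, p12, p21, p22, tol=1e-9):
    """Return (marginal_selectivity_holds, |S|) for four context distributions.

    p_ij is the joint distribution when interpretation variable A_i is
    measured together with B_j.  Compositionality requires marginal
    selectivity AND |S| <= 2; either failure rules out a single joint model.
    """
    # Marginal selectivity: each concept's marginal must not depend on
    # which measurement of the other concept it is paired with.
    ms = (abs(marginal(p11, 0, 1) - marginal(p12, 0, 1)) < tol and
          abs(marginal(p21, 0, 1) - marginal(p22, 0, 1)) < tol and
          abs(marginal(p11, 1, 1) - marginal(p21, 1, 1)) < tol and
          abs(marginal(p12, 1, 1) - marginal(p22, 1, 1)) < tol)
    s = (expectation(p11) + expectation(p12)
         + expectation(p21) - expectation(p22))
    return ms, abs(s)

# A perfectly correlated, context-independent toy example: compositional.
corr = {(1, 1): 0.5, (-1, -1): 0.5, (1, -1): 0.0, (-1, 1): 0.0}
ms, s = chsh(corr, corr, corr, corr)
print(ms, s)  # True 2.0 -> marginal selectivity holds, |S| within the bound
```

Empirical interpretation frequencies for a combination like those in the paper's twenty-four-combination study would be substituted for `corr` in each context.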
Abstract:
What type of probability theory best describes the way humans make judgments under uncertainty and decisions under conflict? Although rational models of cognition have become prominent and have achieved much success, they adhere to the laws of classical probability theory despite the fact that human reasoning does not always conform to these laws. For this reason we have seen the recent emergence of models based on an alternative probabilistic framework drawn from quantum theory. These quantum models show promise in addressing cognitive phenomena that have proven recalcitrant to modeling by means of classical probability theory. This review compares and contrasts probabilistic models based on Bayesian or classical versus quantum principles, and highlights the advantages and disadvantages of each approach.
Abstract:
This research seeks to demonstrate the ways in which urban design factors, individually and in various well-considered arrangements, stimulate and encourage social activities in Brisbane’s public squares through the mapping and analysis of user behaviour. No design factor contributes to public space in isolation, so combinations of design factors, contextual and social impacts, and the local climate are considered highly influential in the way Brisbane’s public engages with public space. It is this local distinctiveness that this research seeks to ascertain. The research first pinpoints and consolidates the design factors identified and recommended in the existing literature and then maps those factors as they are observed at case study sites in Brisbane. This is then set against observational mappings of each site’s corresponding user activities and engagement. These mappings identify a number of patterns of behaviour; pertinently, that “activated” areas of social gathering actively draw people in, and that the busier a space is, the greater both the frequency and duration of people lingering in it. The study finds that simply providing respite from the urban environment (and/or weather conditions) does not adequately encourage social interaction, and that people-friendly design factors can instigate social activities which, if coexisting in a public space, can themselves draw in further users of the space. One of the primary conclusions drawn from these observations is that members of the public in Brisbane are both actively and passively social and often seek out locations where “people-watching” and being around other members of the public (both categorised as passive social activities) are facilitated and encouraged. Spaces that provide respite from the urban environment but do not sufficiently accommodate social connections and activities are less favourable and are often left abandoned despite their comparable tranquillity and available space.
Abstract:
In this paper, we used a nonconservative Lagrangian mechanics approach to formulate a new statistical algorithm for fluid registration of 3-D brain images, named SAFIRA (statistically-assisted fluid image registration algorithm). A nonstatistical version of this algorithm was implemented, in which the deformation was regularized by penalizing deviations from a zero rate of strain. In the statistical versions, the terms regularizing the deformation include the covariance of the deformation matrices (Σ) and of the vector fields (q). Here, we used a Lagrangian framework to reformulate the algorithm, showing that the regularizing terms essentially allow nonconservative work to occur during the flow. Given 3-D brain images from a group of subjects, vector fields and their corresponding deformation matrices are computed in a first round of registrations using the nonstatistical implementation. Covariance matrices for both the deformation matrices and the vector fields are then obtained and incorporated (separately or jointly) into the nonconservative terms, creating four versions of SAFIRA. We evaluated and compared the algorithms' performance on 92 3-D brain scans from healthy monozygotic and dizygotic twins; 2-D validations are also shown for corpus callosum shapes delineated at midline in the same subjects. After preliminary tests to demonstrate each method, we compared their detection power using tensor-based morphometry (TBM), a technique for analyzing local volumetric differences in brain structure, and compared the accuracy of each algorithm variant using various statistical metrics derived from the images and deformation fields. All these tests were also run with a traditional fluid method that has been widely used in TBM studies. The versions incorporating vector-based empirical statistics on brain variation were consistently more accurate than their counterparts when used for automated volumetric quantification in new brain images, suggesting the advantages of this approach for large-scale neuroimaging studies.
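As a rough illustration of the TBM step mentioned in the abstract (not the authors' implementation), local volume change at each voxel is read off the determinant of the Jacobian of the deformation field; a pure-Python 2-D sketch with a hypothetical uniform-expansion deformation:

```python
def jacobian_det_2d(def_x, def_y, i, j, h=1.0):
    """Determinant of the Jacobian of a 2-D deformation field at grid
    point (i, j), via central finite differences on grid spacing h.
    det > 1 indicates local expansion, det < 1 local contraction."""
    dxx = (def_x[i + 1][j] - def_x[i - 1][j]) / (2 * h)
    dxy = (def_x[i][j + 1] - def_x[i][j - 1]) / (2 * h)
    dyx = (def_y[i + 1][j] - def_y[i - 1][j]) / (2 * h)
    dyy = (def_y[i][j + 1] - def_y[i][j - 1]) / (2 * h)
    return dxx * dyy - dxy * dyx

# Toy deformation: uniform 10% expansion, x' = 1.1 x, y' = 1.1 y,
# so every interior point should have Jacobian determinant 1.1^2 = 1.21.
n = 5
def_x = [[1.1 * i for j in range(n)] for i in range(n)]
def_y = [[1.1 * j for j in range(n)] for i in range(n)]
print(round(jacobian_det_2d(def_x, def_y, 2, 2), 6))  # ≈ 1.21
```

A TBM study then runs voxelwise statistics on these determinant maps across subjects; the registration algorithm (here, SAFIRA) supplies the deformation fields.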
Abstract:
This paper provides an important and timely overview of a conceptual framework designed to assist with the development and evaluation of persuasive health message content. While an earlier version of this framework was presented in a prior publication by the authors in 2009, important refinements have seen it evolve in recent years, warranting an updated review. This paper outlines the Step approach to Message Design and Testing (SatMDT), presenting the theoretical evidence underpinning each of the framework’s steps and the empirical evidence demonstrating their relevance and feasibility. The development and testing of the framework have thus far been based exclusively within the road safety advertising context; however, the view expressed herein is that the framework may have broader appeal and application in the health persuasion context.
Abstract:
Industry-school partnerships (ISPs) are increasingly being recognised as a new way of providing vocational education opportunities. However, there is limited research investigating their impact on systemic (organisational and structural) and human resource (teachers and education managers) capacity to support school-to-work transitions. This paper reports on a government-led ISP established by the Queensland state government; ISPs across three industry sectors — minerals and energy; building and construction; and aviation — are included in the study. The research adopted a qualitative case study methodology and draws upon boundary crossing theory to understand how each industry sector responded to the systemic and human resource issues that emerged in each ISP. The main finding was that the systematic application of boundary crossing mechanisms by all partners produced mutually beneficial outcomes. ISPs from the three sectors adopted different models and leveraged different boundary crossing objects, but all maintained the joint vision and mutually agreed outcomes. All three ISPs genuinely crossed boundaries, albeit in different ways, and assisted teachers to co-produce industry-based curriculums and to share sector-specific knowledge and skills that enhance the school-to-work transition for school graduates.
Abstract:
This study identified the common factors that influence social care practice across disciplines (such as social work and psychology), practice fields, and geographical contexts and further developed the Practice Domain Framework as an empirically-based conceptual framework to assist practitioners in understanding practice complexities. The framework has application in critical reflection, professional supervision, interdisciplinary understanding, teamwork, management, teaching and research. A mixed-methods design was used to identify the components and structure of the refined framework. Eighteen influential factors were identified and organised into eight domains: the Societal, Structural, Organisational, Practice Field, Professional Practice, Accountable Practice, Community of Place, and Personal.
Abstract:
Compensation systems are an essential tool for linking corporate goals such as customer orientation with individual and organisational performance. While some authors empirically demonstrate the positive effects of incorporating nonfinancial measures into the compensation system, companies have encountered problems after linking pay to customer satisfaction. We argue that the reasons for this can be attributed to how customer satisfaction is measured, as well as to the missing link between customer satisfaction and customer retention and profitability in these cases. Hence, there is a strong need for a holistic reward and performance measurement model that enables an organisation to identify cause-and-effect relationships when linking rewards to nonfinancial performance measures. We present a conceptual framework of a success-chain-driven reward system that enables organisations to systematically derive a customer-oriented reward strategy. In the context of performance evaluation, we propose relying on integrated and multidimensional measurement methods.
Abstract:
A central dimension of the State’s responsibility in a liberal democracy and any just society is the protection of individuals’ central rights and freedoms, and the creation of the minimum conditions under which each individual has an opportunity to lead a life of sufficient equality, dignity and value. A special subset of this responsibility is to protect those who are unable to protect themselves from genuine harm. Substantial numbers of children suffer serious physical, emotional and sexual abuse, and neglect at the hands of their parents and caregivers or by other known parties. Child abuse and neglect occurs in a situation of extreme power asymmetry. The physical, social, behavioural and economic costs to the individual, and the social and economic costs to communities, are vast. Children are not generally able to protect themselves from serious abuse and neglect. This enlivens both the State’s responsibility to protect the child, and the debate about how that responsibility can and should be discharged. A core question arises for all societies, given that most serious child maltreatment occurs in the family sphere, is unlikely to be disclosed, causes substantial harm to both individual and community, and infringes fundamental individual rights and freedoms. The question is: how can society identify these situations so that the maltreatment can be interrupted, the child’s needs for security and safety, and health and other rehabilitation can be met, and the family’s needs can be addressed to reduce the likelihood of recurrence? This chapter proposes a theoretical framework applicable for any society that is considering justifiable and effective policy approaches to identify and respond to cases of serious child abuse and neglect. 
The core of the theoretical framework is based on major principles from both classical liberal political philosophy (Locke and Mill), and leading political philosophers from the twentieth century and the first part of the new millennium (Rawls, Rorty, Okin, Nussbaum), and is further situated within fundamental frameworks of civil and criminal law, and health and economics.
Abstract:
Scenario planning is a method widely used by strategic planners to address uncertainty about the future. However, current methods either fail to address the future behaviour and impact of stakeholders or they treat the role of stakeholders informally. We present a practical decision-analysis-based methodology for analysing stakeholder objectives and likely behaviour within contested unfolding futures. We address issues of power, interest, and commitment to achieve desired outcomes across a broad stakeholder constituency. Drawing on frameworks for corporate social responsibility (CSR), we provide an illustrative example of our approach to analyse a complex contested issue that crosses geographic, organisational and cultural boundaries. Whilst strategies can be developed by individual organisations that consider the interests of others – for example in consideration of an organisation's CSR agenda – we show that our augmentation of scenario method provides a further, nuanced, analysis of the power and objectives of all concerned stakeholders across a variety of unfolding futures. The resulting modelling framework is intended to yield insights and hence more informed decision making by individual stakeholders or regulators.
Abstract:
The total entropy utility function is considered for the dual purpose of Bayesian design for model discrimination and parameter estimation. A sequential design setting is proposed where it is shown how to efficiently estimate the total entropy utility for a wide variety of data types. Utility estimation relies on forming particle approximations to a number of intractable integrals which is afforded by the use of the sequential Monte Carlo algorithm for Bayesian inference. A number of motivating examples are considered for demonstrating the performance of total entropy in comparison to utilities for model discrimination and parameter estimation. The results suggest that the total entropy utility selects designs which are efficient under both experimental goals with little compromise in achieving either goal. As such, the total entropy utility is advocated as a general utility for Bayesian design in the presence of model uncertainty.
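One ingredient of a total-entropy-style utility, the expected information gain about the model indicator, can be sketched for a simple Bernoulli design problem. This toy uses exact enumeration over discrete outcomes rather than the paper's sequential Monte Carlo particle approximations, omits the parameter-estimation component, and all numbers are illustrative:

```python
from math import comb, log

def binom_pmf(y, n, theta):
    """Binomial probability of y successes in n trials with success prob theta."""
    return comb(n, y) * theta**y * (1 - theta)**(n - y)

def model_discrimination_utility(n, thetas, priors):
    """Expected information gain about the model indicator from n Bernoulli
    trials (one component of a total-entropy utility; the parameter-
    estimation component is omitted in this sketch)."""
    u = 0.0
    for y in range(n + 1):
        # Evidence p(y | design) marginalised over the candidate models.
        p_y = sum(pr * binom_pmf(y, n, th) for pr, th in zip(priors, thetas))
        for pr, th in zip(priors, thetas):
            p_joint = pr * binom_pmf(y, n, th)
            if p_joint > 0:
                u += p_joint * log(p_joint / (p_y * pr))
    return u  # mutual information I(M; Y | design), in nats

# Two rival models for the success probability; larger designs discriminate
# better, bounded above by the prior model entropy log(2).
thetas, priors = [0.2, 0.8], [0.5, 0.5]
print([round(model_discrimination_utility(n, thetas, priors), 3)
       for n in (1, 5, 10)])
```

In a sequential setting like the paper's, this exact sum over outcomes becomes intractable and is replaced by particle approximations from the SMC posterior, and the design maximising the (total) utility is chosen at each step.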