122 results for rational interpolants
Abstract:
It is an exciting era for molecular computation because molecular logic gates are being pushed in new directions. The use of sulfur rather than the commonplace nitrogen as the key receptor atom in metal ion sensors is one of these directions; plant cells coming within the jurisdiction of fluorescent molecular thermometers is another; combining photochromism with voltammetry for molecular electronics is yet another. Two-input logic gates benefit from old ideas such as rectifying bilayer electrodes, cyclodextrin-enhanced room-temperature phosphorescence, steric hindrance, the polymerase chain reaction, charge transfer absorption of donor–acceptor complexes and lectin–glycocluster interactions. Furthermore, the concept of photo-uncaging enables rational ways of concatenating logic gates. Computational concepts are also applied to potential cancer theranostics and to the selective monitoring of neurotransmitters in situ. Higher numbers of inputs are also accommodated with the concept of functional integration of gates, where complex input–output patterns are sought out and analysed. Molecular emulation of computational components such as demultiplexers and parity generators/checkers is achieved in related ways. Complexity of another order is tackled with molecular edge detection routines.
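The logic-gate vocabulary used above maps directly onto Boolean functions. As a purely illustrative sketch (the chemical inputs and the mapping are hypothetical, not taken from the abstract), a two-input AND gate and an odd-parity generator can be written as:

```python
# Hypothetical mapping of molecular inputs to Boolean logic: a two-input
# AND gate "fluoresces" (outputs 1) only when both receptors are occupied,
# and a parity generator over n inputs outputs 1 for an odd count of highs.

def and_gate(proton_bound: bool, sodium_bound: bool) -> int:
    """Two-input AND gate: output 1 only if both receptor sites are occupied."""
    return int(proton_bound and sodium_bound)

def parity(inputs: list) -> int:
    """Odd-parity generator/checker: 1 if an odd number of inputs are high."""
    result = 0
    for bit in inputs:
        result ^= bit
    return result

print(and_gate(True, True))   # 1
print(parity([1, 0, 1, 1]))   # 1 (three high inputs)
```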
Abstract:
While the repeated nature of Discrete Choice Experiments is advantageous from a sampling efficiency perspective, patterns of choice may differ across the tasks due, in part, to learning and fatigue. Using probabilistic decision process models, we find in a field study that learning and fatigue behavior may be exhibited by only a small subset of respondents. Most respondents in our sample show preference and variance stability consistent with rational, pre-existing and well-formed preferences. Nearly all of the remainder exhibit both learning and fatigue effects. An important aspect of our approach is that it enables learning and fatigue effects to be explored even though they were not envisaged during survey design or data collection.
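Variance (in)stability across tasks is commonly modelled through a task-specific scale parameter in a logit choice model: a scale that decays over tasks mimics fatigue, a rising one mimics learning. A minimal sketch, with made-up utilities and scale values rather than anything estimated in the study:

```python
import math

def choice_prob(utilities, scale):
    """Multinomial logit choice probabilities with a task-specific scale.

    A larger scale means lower error variance (choices more deterministic);
    a scale decaying over later tasks would mimic respondent fatigue.
    """
    exps = [math.exp(scale * u) for u in utilities]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical respondent: identical utilities in an early and a late task,
# but a lower scale (higher variance) late in the survey.
early = choice_prob([1.0, 0.0], scale=2.0)
late = choice_prob([1.0, 0.0], scale=0.5)
print(round(early[0], 3), round(late[0], 3))
```

The preferred alternative is chosen less reliably in the late task, which is exactly the noisier choice pattern that fatigue models capture.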
Abstract:
Beyond Criminal Justice presents a vision of a future without brutal, authoritarian and repressive penal regimes. Many of the papers brought together here have been unavailable for more than two decades. Their republication indicates not only their continuing theoretical importance to abolitionist studies but also how they provide important insights into the nature and legitimacy of criminal processes in the here and now. Contributors highlight the human consequences of the harms of imprisonment, evidencing the hurt, injury and damage of penal incarceration across a number of different countries in Europe. Focusing on penal power and prisoner contestation of such power, the moral and political crises of imprisonment are laid bare. The contributors to Beyond Criminal Justice explore the urgent need for a coherent, rational and morally and politically sophisticated theoretical basis for penal abolitionism. Advocating a utopian imagination alongside practical solutions already implemented in countries around Europe, and grappling with controversial debates such as abolitionist responses to rape and sexual violence, the book steps outside of common-sense assumptions regarding 'crime', punishment and 'criminal justice'. Beyond Criminal Justice will be of interest to students of criminology, zemiology, sociology, penology and critical legal studies, as well as anyone interested in rethinking the problem of 'crime' and challenging the logic of the penal rationale.
Abstract:
Seeds are traditionally considered as common or even public goods, their traits as ‘products of nature’. They are also essential to biodiversity, food security and food sovereignty. However, a suite of techno-legal interventions has legislated the enclosure of seeds: seed patents, plant variety protections, and stewardship agreements. These instruments create and protect private proprietary interests over plant material and point to the interface between seeds, capitalism, and law. In the following article, we consider the latest innovations, the bulk of which have been directed toward genetically disabling the reproductive capacities of seeds (terminator technology) or tying these capacities to outputs (‘round-up necessary’). In both instances, scarcity moves from artificial to real.
For the agro-industrial complex, the innovations are perfectly rational, as they can simultaneously control supply and demand. For those outside the complex, however, the consequences are potentially ruinous. The practices of seed-saving and exchange are no longer feasible, even covertly. Contemporary genetic controls have upped the ante by either disabling the reproductive capacity of seeds or, through cross-pollination and outcrossing, facilitating the autonomous spread of genetic modifications that, importantly, remain traceable, identifiable and therefore capable of legal protection. In both instances, genuine scarcity becomes the new standard as private interests dominate what was a public sphere.
Abstract:
Highway structures such as bridges are subject to continuous degradation primarily due to ageing and environmental factors. A rational transport policy requires the monitoring of this transport infrastructure to provide adequate maintenance and guarantee the required levels of transport service and safety. In Europe, this is now a legal requirement - a European Directive requires all member states of the European Union to implement a Bridge Management System. However, the process is expensive, requiring the installation of sensing equipment and data acquisition electronics on the bridge. This paper investigates the use of an instrumented vehicle fitted with accelerometers on its axles to monitor the dynamic behaviour of bridges as an indicator of their structural condition. This approach eliminates the need for any on-site installation of measurement equipment. A simplified half-car vehicle-bridge interaction model is used in theoretical simulations to test the possibility of extracting the dynamic parameters of the bridge from the spectra of the vehicle accelerations. The effects of vehicle speed, vehicle mass and bridge span length on the detection of the bridge dynamic parameters are investigated. The algorithm is highly sensitive to the condition of the road profile, so simulations are carried out for both smooth and rough profiles.
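The core signal-processing step described above is recovering a bridge natural frequency from the spectrum of an axle acceleration record. A minimal sketch, using a synthetic signal rather than the paper's half-car simulation (the 4 Hz bridge frequency, sampling rate and noise level are all assumptions):

```python
import numpy as np

# Synthetic axle acceleration: a bridge component at an assumed 4 Hz
# natural frequency, buried in measurement noise.
fs = 200.0                      # sampling frequency [Hz] (assumed)
t = np.arange(0.0, 10.0, 1.0 / fs)
bridge_freq = 4.0               # first bridge natural frequency [Hz] (assumed)
rng = np.random.default_rng(0)
accel = np.sin(2 * np.pi * bridge_freq * t) + 0.3 * rng.standard_normal(t.size)

# Frequency-domain analysis: locate the dominant spectral peak.
spectrum = np.abs(np.fft.rfft(accel))
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
peak = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin
print(peak)
```

With a 10 s record the frequency resolution is 0.1 Hz, so the peak lands on the bridge frequency; in practice vehicle and road-profile components also appear in the spectrum, which is why the paper studies smooth and rough profiles separately.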
Abstract:
Highway structures such as bridges are subject to continuous degradation primarily due to ageing, loading and environmental factors. A rational transport policy must monitor this infrastructure and provide adequate maintenance to guarantee the required levels of transport service and safety. Increasingly in recent years, bridges are being instrumented and monitored on an ongoing basis due to the implementation of Bridge Management Systems. This is very effective and provides a high level of protection to the public, with early warning if the bridge becomes unsafe. However, the process can be expensive and time consuming, requiring the installation of sensors and data acquisition electronics on the bridge. This paper investigates the use of an instrumented 2-axle vehicle fitted with accelerometers to monitor the dynamic behaviour of a bridge network in a simple and cost-effective manner. A simplified half-car beam interaction model is used to simulate the passage of a vehicle over a bridge. This investigation involves the frequency domain analysis of the axle accelerations as the vehicle crosses the bridge. The spectrum of the acceleration record contains noise, vehicle, bridge and road frequency components. Therefore, the bridge dynamic behaviour is monitored in simulations for both smooth and rough road surfaces. The vehicle mass and axle spacing are varied in simulations, along with bridge structural damping, in order to analyse the sensitivity of the vehicle accelerations to a change in bridge properties. These vehicle accelerations can be obtained for different periods of time and serve as a useful tool to monitor the variation of bridge frequency and damping with time.
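Alongside frequency, the abstract mentions tracking bridge damping over time. One textbook way to estimate a damping ratio from a measured frequency response is the half-power bandwidth method; the sketch below applies it to an idealised single-degree-of-freedom response, not the paper's vehicle-bridge interaction model (the 4 Hz frequency and 3% damping ratio are assumed values):

```python
import numpy as np

# Half-power bandwidth estimate of damping from a frequency response.
fn = 4.0        # natural frequency [Hz] (assumed)
zeta = 0.03     # true damping ratio to be recovered (assumed)
f = np.linspace(0.1, 10.0, 20000)
r = f / fn
# Dynamic amplification factor of a damped SDOF oscillator.
H = 1.0 / np.sqrt((1.0 - r**2) ** 2 + (2.0 * zeta * r) ** 2)

peak = H.max()
half_power = peak / np.sqrt(2.0)
band = f[H >= half_power]                  # frequencies above the -3 dB level
zeta_est = (band[-1] - band[0]) / (2.0 * fn)
print(round(zeta_est, 3))
```

The bandwidth between the two half-power points is approximately 2*zeta*fn for light damping, so dividing by 2*fn recovers the damping ratio.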
Abstract:
Accounting has been viewed, especially through the lens of the recent managerial reforms, as a neutral technology that, in the hands of rational managers, can support effective and efficient decision making. However, the introduction of new accounting practices can be framed in a variety of ways, from value-neutral procedures to ideologically-charged instruments. Focusing on financial accounting, budgeting and performance management changes in the UK central government, and through extensive textual analysis and interviews in three government departments, this paper investigates: how accounting changes are discussed and introduced at the political level through the use of global discourses; and what strategies organisational actors subsequently use to talk about and legitimate such discourses at different organisational levels. The results show that in political discussions there is consistency between the discourses (largely NPM) and the accounting-related changes that took place. The research suggests that organisational actors used a cocktail of legitimation strategies to construct a sense of the changes, with authorisation, often in combination with rationalisation, the most widely utilised. While previous literature posits that different actors tend to use the same rhetorical sequences during periods of change, this study highlights differences at different organisational levels.
Abstract:
The ability of an agent to make quick, rational decisions in an uncertain environment is paramount for its applicability in realistic settings. Markov Decision Processes (MDPs) provide such a framework, but can only model uncertainty that can be expressed as probabilities. Possibilistic counterparts of MDPs make it possible to model imprecise beliefs, yet they cannot accurately represent probabilistic sources of uncertainty and they lack the efficient online solvers found in the probabilistic MDP community. In this paper we advance the state of the art in three important ways. Firstly, we propose the first online planner for possibilistic MDPs by adapting the Monte-Carlo Tree Search (MCTS) algorithm. A key component is the development of efficient search structures to sample possibility distributions based on the DPY transformation as introduced by Dubois, Prade, and Yager. Secondly, we introduce a hybrid MDP model that allows us to express both possibilistic and probabilistic uncertainty, where the hybrid model is a proper extension of both probabilistic and possibilistic MDPs. Thirdly, we demonstrate that MCTS algorithms can readily be applied to solve such hybrid models.
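The DPY transformation mentioned above converts a normalized possibility distribution into a probability distribution by spreading the mass of each alpha-layer uniformly over the states whose possibility reaches that layer; once transformed, ordinary probabilistic sampling applies. A direct (unoptimised) sketch of the transformation, as opposed to the paper's efficient search structures built on top of it:

```python
# Dubois-Prade-Yager (DPY) transformation: possibility -> probability.
# For possibilities sorted descending pi_1 >= ... >= pi_n (pi_1 = 1,
# pi_{n+1} = 0), state at rank i gets p_i = sum_{j>=i} (pi_j - pi_{j+1}) / j.

def dpy_transform(possibilities):
    """Convert a normalized possibility distribution (max = 1) into a
    probability distribution over the same states."""
    order = sorted(range(len(possibilities)),
                   key=lambda i: possibilities[i], reverse=True)
    pi = [possibilities[i] for i in order] + [0.0]
    probs = [0.0] * len(possibilities)
    for rank, idx in enumerate(order):
        # Each layer (pi[j] - pi[j+1]) is shared among its j+1 reachable states.
        probs[idx] = sum((pi[j] - pi[j + 1]) / (j + 1)
                         for j in range(rank, len(order)))
    return probs

p = dpy_transform([1.0, 0.5, 0.5])
print([round(x, 4) for x in p])
```

Here the fully possible state receives the whole top layer plus its share of the lower one, and the result always sums to 1 when the distribution is normalized.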
Abstract:
The ability of an autonomous agent to select rational actions is vital in enabling it to achieve its goals. To do so effectively in a high-stakes setting, the agent must be capable of considering the risk and potential reward of both immediate and future actions. In this paper we provide a novel method for calculating risk alongside utility in online planning algorithms. We integrate such a risk-aware planner with a BDI agent, allowing us to build agents that can set their risk aversion levels dynamically based on their changing beliefs about the environment. To guide the design of a risk-aware agent we propose a number of principles which such an agent should adhere to and show how our proposed framework satisfies these principles. Finally, we evaluate our approach and demonstrate that a dynamically risk-averse agent is capable of achieving a higher success rate than an agent that ignores risk, while obtaining a higher utility than an agent with a static risk attitude.
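One common way to weigh risk alongside utility, in the spirit of the risk-aware planning described above, is a mean-variance criterion with an adjustable risk-aversion weight; in the paper's framework that weight would be set dynamically from the agent's beliefs. A minimal sketch with hypothetical actions and sampled outcomes (not the paper's actual formulation):

```python
# Risk-adjusted action selection: maximize E[U] - lam * Var[U],
# where lam is the agent's (here static, in the paper dynamic) risk aversion.

def mean(xs):
    return sum(xs) / len(xs)

def variance(xs):
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def best_action(outcomes_by_action, lam):
    """Pick the action maximizing mean utility minus a variance penalty."""
    return max(outcomes_by_action,
               key=lambda a: mean(outcomes_by_action[a])
                             - lam * variance(outcomes_by_action[a]))

# Hypothetical sampled utilities: a safe action vs a high-variance gamble.
actions = {"safe": [5.0, 5.0, 5.0], "risky": [0.0, 0.0, 18.0]}
print(best_action(actions, lam=0.0))   # risk-neutral agent
print(best_action(actions, lam=0.2))   # risk-averse agent
```

The risk-neutral agent prefers the gamble (higher mean), while the risk-averse one prefers the safe action, mirroring how a dynamically risk-averse agent can trade success rate against utility.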
Abstract:
Boolean games are a framework for reasoning about the rational behavior of agents whose goals are formalized using propositional formulas. Compared to normal form games, a well-studied and related game framework, Boolean games allow for an intuitive and more compact representation of the agents’ goals. So far, Boolean games have been mainly studied in the literature from the Knowledge Representation perspective, and less attention has been paid to the algorithmic issues underlying the computation of solution concepts. Although some suggestions for solving specific classes of Boolean games have been made in the literature, there is currently no work available on their practical performance. In this paper, we propose the first technique to solve general Boolean games that does not require an exponential translation to normal-form games. Our method is based on disjunctive answer set programming and computes solutions (equilibria) of arbitrary Boolean games. It can be applied to a wide variety of solution concepts, and can naturally deal with extensions of Boolean games such as constraints and costs. We present detailed experimental results in which we compare the proposed method against a number of existing methods for solving specific classes of Boolean games, as well as adaptations of methods that were initially designed for normal-form games. We found that the heuristic methods that do not require all payoff matrix entries performed well for smaller Boolean games, while our ASP-based technique is faster when the problem instances have a higher number of agents or action variables.
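To make the solution concept concrete: in a Boolean game each agent controls some propositional variables and wants its goal formula satisfied, and a pure Nash equilibrium is an assignment where no unsatisfied agent can satisfy its goal by changing only its own variables. The brute-force checker below is the exponential baseline the ASP encoding avoids; the two-agent game is a made-up example, not from the paper:

```python
from itertools import product

def is_nash(assign, goals, control):
    """assign: dict var -> bool; goals: one Boolean function per agent;
    control: one set of variables per agent. True iff no agent has a
    profitable unilateral deviation."""
    for agent, goal in enumerate(goals):
        if goal(assign):
            continue                       # already satisfied: no incentive
        own = sorted(control[agent])
        for values in product([False, True], repeat=len(own)):
            dev = dict(assign)
            dev.update(zip(own, values))
            if goal(dev):                  # deviation satisfies the goal
                return False
    return True

# Example game: agent 0 controls x and wants x AND y;
# agent 1 controls y and wants y.
goals = [lambda a: a["x"] and a["y"], lambda a: a["y"]]
control = [{"x"}, {"y"}]

equilibria = [a for a in ({"x": x, "y": y}
                          for x in (False, True) for y in (False, True))
              if is_nash(a, goals, control)]
print(equilibria)
```

Only the assignment satisfying both goals survives the check; the cost of enumerating all assignments and deviations is what motivates a declarative ASP encoding for larger games.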
Abstract:
Boolean games are a framework for reasoning about the rational behaviour of agents, whose goals are formalized using propositional formulas. They offer an attractive alternative to normal-form games, because they allow for a more intuitive and more compact encoding. Unfortunately, however, there is currently no general, tailor-made method available to compute the equilibria of Boolean games. In this paper, we introduce a method for finding the pure Nash equilibria based on disjunctive answer set programming. Our method is furthermore capable of finding the core elements and the Pareto optimal equilibria, and can easily be modified to support other forms of optimality, thanks to the declarative nature of disjunctive answer set programming. Experimental results clearly demonstrate the effectiveness of the proposed method.
Abstract:
The advent of novel genomic technologies that enable the evaluation of genomic alterations on a genome-wide scale has significantly altered the field of genomic marker research in solid tumors. Researchers have moved away from the traditional model of identifying a particular genomic alteration and evaluating the association between this finding and a clinical outcome measure to a new approach involving the identification and measurement of multiple genomic markers simultaneously within clinical studies. This in turn has presented additional challenges in considering the use of genomic markers in oncology, such as clinical study design, reproducibility, and interpretation and reporting of results. This Review will explore these challenges, focusing on microarray-based gene-expression profiling, and highlights some common failings in study design that have impacted the use of putative genomic markers in the clinic. Despite these rapid technological advances there is still a paucity of genomic markers in routine clinical use at present. A rational and focused approach to the evaluation and validation of genomic markers is needed, whereby analytically validated markers are investigated in clinical studies that are adequately powered and have pre-defined patient populations and study endpoints. Furthermore, novel adaptive clinical trial designs, incorporating putative genomic markers into prospective clinical trials, will enable the evaluation of these markers in a rigorous and timely fashion. Such approaches have the potential to facilitate the implementation of such markers into routine clinical practice and consequently enable the rational and tailored use of cancer therapies for individual patients. © 2010 Macmillan Publishers Limited. All rights reserved.
Abstract:
The recent discovery of oncogenic drivers and the subsequent development of novel targeted strategies have significantly added to the therapeutic armamentarium of anti-cancer therapies. Targeting BCR-ABL in chronic myeloid leukemia (CML) or HER2 in breast cancer has led to practice-changing clinical benefits, while promising therapeutic responses have been achieved by precision medicine approaches in EGFR mutant lung cancer, colorectal cancer and BRAF mutant melanoma. However, although initial therapeutic responses to targeted therapies can be substantial, many patients will develop disease progression within 6-12 months. The increasing application of powerful omics-based approaches and improving preclinical models have enabled the rapid identification of secondary resistance mechanisms. Herein, we discuss how this knowledge has translated into rational, novel treatment strategies for relapsed patients in genomically selected cancer populations.
Abstract:
Diabetes distress is a rational emotional response to the threat of a life-changing illness. Distinct from depression, it is rooted in the demands of diabetes management and is a product of psychological adjustment. Diabetes distress has been found to be significantly associated with HbA1c and self-care, demonstrating its clinical relevance to treatment outcomes. Interpersonal factors such as perceived support and protectiveness of partners significantly contribute to elevated distress, suggesting that these are valuable areas of focus for interventions. Pioneering large-scale research, DAWN2, gives voice to the families of those with diabetes and reaffirms the need to consider psychosocial factors in routine diabetes care. Structured diabetes education programmes are the most widely used in helping individuals cope with diabetes, but they fail to consider the psychological or interpersonal aspects of diabetes management. Psycho-educational approaches are found to be effective in reducing diabetes distress while also improving HbA1c. Certain limitations in the current literature are discussed, along with future directions. Of utmost importance is the need for health practitioners, irrespective of background, to demonstrate an understanding of diabetes distress, to engage actively in discussion with individuals struggling to cope with diabetes, and to normalize and integrate this into routine diabetes practice.
Abstract:
This paper presents a method for rational behaviour recognition that combines vision-based pose estimation with knowledge modeling and reasoning. The proposed method consists of two stages. First, RGB-D images are used in the estimation of the body postures. Then, the estimated actions are evaluated to verify that they make sense. This method requires rational behaviour to be exhibited. To comply with this requirement, this work proposes a rational RGB-D dataset with two types of sequences, some for training and some for testing. Preliminary results show that the addition of knowledge modeling and reasoning leads to a significant increase in recognition accuracy compared to a system based only on computer vision.
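The two-stage idea above (a vision stage proposes action labels, a reasoning stage rejects those that do not make sense in context) can be sketched as a simple rule filter over a ranked hypothesis list. The labels, context flags and rules below are entirely hypothetical, standing in for the paper's knowledge model:

```python
# Stage 2 of a two-stage recogniser: filter vision hypotheses with
# commonsense rules. Each rule says when a label is plausible in context.
RULES = {
    "drinking": lambda ctx: ctx.get("cup_visible", False),
    "typing": lambda ctx: ctx.get("keyboard_visible", False),
    "walking": lambda ctx: True,           # always plausible
}

def recognise(vision_ranking, context):
    """Return the highest-ranked vision hypothesis consistent with the rules."""
    for label in vision_ranking:
        check = RULES.get(label)
        if check is not None and check(context):
            return label
    return "unknown"

# Vision ranks "drinking" first, but no cup is in view, so reasoning
# demotes it in favour of the next consistent hypothesis.
print(recognise(["drinking", "typing"], {"keyboard_visible": True}))
```

This is the mechanism by which reasoning can raise recognition accuracy: implausible but visually similar actions are suppressed before a final label is reported.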