989 results for Threshold models


Relevance:

60.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

60.00%

Publisher:

Abstract:

This paper focuses on the general problem of coordinating multi-robot systems; more specifically, it addresses the self-selection of heterogeneous and specialized tasks by autonomous robots. It proposes experiments with two biologically inspired techniques based chiefly on self-organization and emergence: response threshold models and ant colony optimization. Under this approach one can speak of multi-task selection rather than multi-task allocation, meaning that the agents or robots select tasks themselves instead of being assigned them by a central controller. The key element in these algorithms is the estimation of the stimuli and the adaptive update of the thresholds: each robot performs this estimate locally, depending on the load, i.e., the number of pending tasks to be performed. The robustness of the algorithms is evaluated by perturbing the number of pending loads, to simulate a robot's error in estimating the real number of pending tasks, and by generating loads dynamically over time. The paper ends with a critical discussion of the experimental results.
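
As a rough illustration of the response threshold mechanism the abstract describes, the sketch below uses the classic rule P = s^n / (s^n + theta^n) from the swarm-intelligence literature, not necessarily the paper's exact variant; all constants, rates, and the load model are invented.

```python
# Minimal response-threshold task selection sketch (assumed parameters).
import random

N_ROBOTS, N_TASK_TYPES, STEPS = 6, 3, 100
XI, PHI = 0.1, 0.05  # threshold learning/forgetting rates (assumed)

random.seed(0)
thresholds = [[random.uniform(0.3, 0.7) for _ in range(N_TASK_TYPES)]
              for _ in range(N_ROBOTS)]
pending = [10, 5, 8]  # pending loads per task type (simulated)

def response_prob(stimulus, theta, n=2):
    """Classic response threshold rule: P = s^n / (s^n + theta^n)."""
    return stimulus ** n / (stimulus ** n + theta ** n)

for _ in range(STEPS):
    for r in range(N_ROBOTS):
        # Each robot estimates the stimuli locally from the pending loads;
        # perturbing this estimate would mimic the robustness experiments.
        total = sum(pending)
        stimuli = [p / total if total else 0.0 for p in pending]
        for t in range(N_TASK_TYPES):
            if random.random() < response_prob(stimuli[t], thresholds[r][t]):
                if pending[t]:
                    pending[t] -= 1  # robot performs one unit of task t
                thresholds[r][t] = max(0.01, thresholds[r][t] - XI)  # specialize
            else:
                thresholds[r][t] = min(1.0, thresholds[r][t] + PHI)  # forget
    if random.random() < 0.3:  # dynamic generation of new loads over time
        pending[random.randrange(N_TASK_TYPES)] += 1
```

Lowering a robot's threshold for tasks it performs is what produces specialization; the stimulus estimate stays purely local, as the abstract emphasizes.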

Relevance:

60.00%

Publisher:

Abstract:

Cellular immunity is mediated by the interaction of an αβ T cell receptor (TCR) with a peptide presented within the context of a major histocompatibility complex (MHC) molecule. Alloreactive T cells have αβ TCRs that can recognize both self- and foreign peptide–MHC (pMHC) complexes, implying that the TCR has significant complementarity with different pMHC. To characterize the molecular basis for alloreactive TCR recognition of pMHC, we have produced a soluble, recombinant form of an alloreactive αβ T cell receptor in Drosophila melanogaster cells. This recombinant TCR, 2C, is expressed as a correctly paired αβ heterodimer, with the chains covalently connected via a disulfide bond in the C-terminal region. The native conformation of the 2C TCR was probed by surface plasmon resonance (SPR) analysis by using conformation-specific monoclonal antibodies, as well as syngeneic and allogeneic pMHC ligands. The 2C interaction with H-2Kb-dEV8, H-2Kbm3-dEV8, H-2Kb-SIYR, and H-2Ld-p2Ca spans a range of affinities from Kd = 10⁻⁴ to 10⁻⁶ M for the syngeneic (H-2Kb) and allogeneic (H-2Kbm3, H-2Ld) ligands. In general, the syngeneic ligands bind with weaker affinities than the allogeneic ligands, consistent with current threshold models of thymic selection and T cell activation. Crystallization of the 2C TCR required proteolytic trimming of the C-terminal residues of the α and β chains. X-ray quality crystals of complexes of 2C with H-2Kb-dEV8, H-2Kbm3-dEV8, and H-2Kb-SIYR have been grown.
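
For orientation, the reported Kd range translates into binding free energies via ΔG = RT ln Kd. A quick back-of-the-envelope computation, ours rather than the paper's, assuming T = 298.15 K:

```python
# Convert the quoted Kd endpoints into binding free energies (assumed T).
import math

R = 1.987e-3   # kcal / (mol K), gas constant
T = 298.15     # K (assumed; actual SPR conditions may differ)

for kd in (1e-4, 1e-6):   # M, the endpoints quoted in the abstract
    dG = R * T * math.log(kd)
    print(f"Kd = {kd:.0e} M  ->  dG = {dG:.1f} kcal/mol")
# Kd = 1e-4 M gives about -5.5 kcal/mol; Kd = 1e-6 M about -8.2 kcal/mol.
```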

Relevance:

60.00%

Publisher:

Abstract:

Fundamental principles of precaution are legal maxims that call for preventive actions, perhaps as contingent interim measures while relevant information about causality and harm remains unavailable, to minimize the societal impact of potentially severe or irreversible outcomes. Such principles do not explain how to make choices or how to identify what is protective when incomplete and inconsistent scientific evidence of causation characterizes the potential hazards. Rather, they entrust lower jurisdictions, such as agencies or authorities, to make current decisions while recognizing that future information can contradict the scientific basis that supported the initial decision. After reviewing and synthesizing national and international legal aspects of precautionary principles, this paper addresses the key question: how can society manage potentially severe, irreversible, or serious environmental outcomes when variability, uncertainty, and limited causal knowledge characterize its decision-making? A decision-analytic solution is outlined that focuses on risky decisions and accounts for prior states of information and scientific beliefs that can be updated as subsequent information becomes available. As a practical and established approach to causal reasoning and decision-making under risk, inherent to precautionary decision-making, these (Bayesian) methods help decision-makers and stakeholders because they formally account for probabilistic outcomes and new information, and are consistent and replicable. Rational choice of an action from among various alternatives, defined as a choice that makes preferred consequences more likely, requires accounting for the costs, benefits, and change in risks associated with each candidate action. Decisions under any form of the precautionary principle reviewed must account for the contingent nature of scientific information, creating a link to the decision-analytic principle of expected value of information (VOI), which shows the relevance of new information relative to the initial (and smaller) set of data on which the decision was based. We exemplify this seemingly simple situation using risk management of BSE. As an integral aspect of causal analysis under risk, the methods developed in this paper permit the addition of non-linear, hormetic dose-response models to the current set of regulatory defaults, such as the linear, non-threshold models. This increase in the number of defaults is an important improvement because most variants of the precautionary principle require cost-benefit balancing. Specifically, increasing the set of causal defaults accounts for beneficial effects at very low doses. We also show and conclude that quantitative risk assessment dominates qualitative risk assessment, supporting the extension of the set of default causal models.
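
A minimal sketch of the expected value of information (VOI) idea invoked above, here in its simplest form (expected value of perfect information); the two-action, two-state decision problem and all losses are invented for illustration:

```python
# Expected value of perfect information for a toy precautionary decision.
P_HARM = 0.3                      # prior probability the hazard is real (assumed)
LOSS = {                          # (action, state) -> societal loss (invented)
    ("act",  "harm"): 2.0, ("act",  "safe"): 2.0,   # precaution has a fixed cost
    ("wait", "harm"): 10.0, ("wait", "safe"): 0.0,
}

def expected_loss(action, p_harm):
    return p_harm * LOSS[(action, "harm")] + (1 - p_harm) * LOSS[(action, "safe")]

# Best action under current (prior) information:
prior_loss = min(expected_loss(a, P_HARM) for a in ("act", "wait"))

# With perfect information we learn the state first, then choose:
posterior_loss = (P_HARM * min(LOSS[(a, "harm")] for a in ("act", "wait"))
                  + (1 - P_HARM) * min(LOSS[(a, "safe")] for a in ("act", "wait")))

evpi = prior_loss - posterior_loss   # what resolving the uncertainty is worth
print(f"prior loss {prior_loss:.2f}, with info {posterior_loss:.2f}, EVPI {evpi:.2f}")
```

If EVPI exceeds the cost of gathering the evidence, waiting for better information is itself the rational precautionary act, which is the link the paper draws between precaution and decision analysis.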

Relevance:

60.00%

Publisher:

Abstract:

Background: The objective was to determine whether the pattern of environmental and genetic influences on deviant personality scores differs from that observed for the normative range of personality, comparing results in adolescent and adult female twins. Methods: A sample of 2,796 female adolescent twins ascertained from birth records provided Junior Eysenck Personality Questionnaire data. The average age of the sample was 17.0 years (S.D. 2.3). Genetic analyses of continuous and extreme personality scores were conducted. Results were compared for 3,178 adult female twins. Results: Genetic analyses of continuous traits in adolescent female twins yielded findings similar to those in adult female twins, with genetic influences accounting for between 37% and 44% of the variance in Extraversion (Ex), Neuroticism (N), and Social Non-Conformity (SNC), with significant evidence of shared environmental influences (19%) found only for SNC in the adult female twins. Analyses of extreme personality characteristics, defined categorically, in the adolescent data and replicated in the adult female data, yielded estimates for high N and high SNC that deviated substantially (p < .05) from those obtained in the continuous trait analyses, and provided suggestive evidence that shared family environment may play a more important role in determining personality deviance than has previously been found when personality is viewed continuously. However, multiple-threshold models that assumed the same genetic and environmental determinants of both normative range variation and extreme scores gave acceptable fits for each personality dimension. Conclusions: The hypothesis of differences in genetic or environmental factors responsible for N and SNC among female twins with scores in the extreme versus normative ranges was partially supported, but not for Ex.
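
As background for the variance decompositions quoted above, Falconer's classic approximation derives the genetic (A), shared-environment (C), and non-shared-environment (E) components from monozygotic and dizygotic twin correlations; the correlations in this sketch are invented, not the study's:

```python
# Falconer's approximation for twin-based variance components (toy inputs).
def falconer(r_mz, r_dz):
    h2 = 2 * (r_mz - r_dz)   # A: additive genetic variance
    c2 = r_mz - h2           # C: shared (family) environment
    e2 = 1 - r_mz            # E: non-shared environment + error
    return h2, c2, e2

h2, c2, e2 = falconer(r_mz=0.44, r_dz=0.25)  # invented correlations
print(f"A = {h2:.2f}, C = {c2:.2f}, E = {e2:.2f}")  # A = 0.38, C = 0.06, E = 0.56
```

The multiple-threshold models the abstract mentions go a step further, testing whether the same A/C/E structure can account for both normative-range scores and categorically defined extremes.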

Relevance:

60.00%

Publisher:

Abstract:

This work addresses advances in three related areas: state-space modeling, sequential Bayesian learning, and decision analysis, along with the statistical challenges of scalability and associated dynamic sparsity. The key theme that ties the three areas together is Bayesian model emulation: solving challenging analytical and computational problems using creative model emulators. This idea drives theoretical and applied advances in non-linear, non-Gaussian state-space modeling, dynamic sparsity, decision analysis, and statistical computation, across linked contexts of multivariate time series and dynamic network studies. Examples and applications in financial time series and portfolio analysis, macroeconomics, and internet studies from computational advertising demonstrate the utility of the core methodological innovations.

Chapter 1 summarizes the three areas/problems and the key idea of emulation in those areas. Chapter 2 discusses the sequential analysis of latent threshold models using emulating models that allow for analytical filtering, enhancing the efficiency of posterior sampling. Chapter 3 examines the emulator model in decision analysis, or the synthetic model, which is equivalent to the loss function in the original minimization problem, and shows its performance in the context of sequential portfolio optimization. Chapter 4 describes a method for modeling streaming count data observed on a large network that relies on emulating the whole, dependent network model with independent, conjugate sub-models customized to each set of flows. Chapter 5 reviews these advances and offers concluding remarks.
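
As a point of reference for the analytical filtering that Chapter 2's emulators exploit, here is a minimal forward filter for a univariate linear-Gaussian state-space model; the model and all parameters are assumed, not taken from the dissertation:

```python
# Forward (Kalman) filtering for x_t = phi*x_{t-1} + w_t, y_t = x_t + v_t.
def kalman_filter(ys, phi=0.95, q=0.1, r=0.5, m0=0.0, c0=1.0):
    """Return the sequence of posterior state means given observations ys."""
    m, c, means = m0, c0, []
    for y in ys:
        a, p = phi * m, phi * phi * c + q      # predict the state forward
        k = p / (p + r)                        # Kalman gain
        m, c = a + k * (y - a), (1 - k) * p    # update with the observation
        means.append(m)
    return means

print(kalman_filter([0.3, 0.5, 0.1, -0.2]))
```

In the linear-Gaussian case these recursions are exact; the emulation idea is to exploit such tractable structure inside models, like latent threshold models, where exact filtering is otherwise unavailable.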

Relevance:

60.00%

Publisher:

Abstract:

Four Ss were run in a visual span of apprehension experiment to determine whether second choices made following incorrect first responses are at the chance level, as implied by various high threshold models proposed for this situation. The relationships between response biases on first and second choices, and between first-choice biases on trials with two or three possible responses, were also examined in terms of Luce's (1959) choice theory. The results were: (a) second-choice performance in this task appears to be determined by response bias alone, i.e., second choices were at the chance level; (b) first- and second-choice response biases were not related according to Luce's choice axiom; and (c) the choice axiom predicted with reasonable accuracy the relationships between first-choice response biases corresponding to trials with different numbers of possible response alternatives. © 1967 Psychonomic Society, Inc.
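
Luce's choice axiom, underlying points (b) and (c), is simple to state in code: the probability of choosing an item is its bias value divided by the summed values of the available alternatives, so second-choice probabilities follow by renormalizing over the remaining options. A sketch with invented bias values:

```python
# Luce's choice rule: P(i | S) = v_i / sum_{j in S} v_j (invented values).
v = {"A": 3.0, "B": 2.0, "C": 1.0}   # hypothetical response bias values

def luce_prob(choice, options):
    return v[choice] / sum(v[o] for o in options)

p_first = luce_prob("A", ["A", "B", "C"])
# Second choice after an incorrect first response "A": renormalize over the rest.
p_second = luce_prob("B", ["B", "C"])
print(f"P(A | ABC) = {p_first:.2f}, P(B | BC) = {p_second:.2f}")
```

The study's finding (b) is that observed first- and second-choice biases did not share a single value vector in this way, while (c) held across different set sizes for first choices.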

Relevance:

40.00%

Publisher:

Abstract:

Randomly diluted quantum boson and spin models in two dimensions combine the physics of classical percolation with the well-known dimensionality dependence of ordering in quantum lattice models. This combination is rather subtle for models that order in two dimensions but have no true order in one dimension, as the percolation cluster near threshold is a fractal of dimension between 1 and 2: two experimentally relevant examples are the O(2) quantum rotor and the Heisenberg antiferromagnet. We study two analytic descriptions of the O(2) quantum rotor near the percolation threshold. First a spin-wave expansion is shown to predict long-ranged order, but there are statistically rare points on the cluster that violate the standard assumptions of spin-wave theory. A real-space renormalization group (RSRG) approach is then used to understand how these rare points modify ordering of the O(2) rotor. A new class of fixed points of the RSRG equations for disordered one-dimensional bosons is identified and shown to support the existence of long-range order on the percolation backbone in two dimensions. These results are relevant to experiments on bosons in optical lattices and superconducting arrays, and also (qualitatively) for the diluted Heisenberg antiferromagnet La₂(Zn,Mg)ₓCu₁₋ₓO₄.
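
A toy illustration (ours, not the paper's analysis) of the percolation geometry involved: the largest site-percolation cluster near threshold has a mass that grows with box size roughly as r^{d_f}, with a fractal dimension d_f between 1 and 2 (about 1.896 in 2D):

```python
# Site percolation near p_c on a square lattice; box-counting the largest cluster.
import random

L, P_C = 200, 0.5927   # lattice size and 2D site-percolation threshold
random.seed(1)
occ = [[random.random() < P_C for _ in range(L)] for _ in range(L)]

def largest_cluster(occ):
    seen, best = [[False] * L for _ in range(L)], []
    for i in range(L):
        for j in range(L):
            if occ[i][j] and not seen[i][j]:
                stack, comp = [(i, j)], []
                seen[i][j] = True
                while stack:                      # iterative flood fill
                    x, y = stack.pop()
                    comp.append((x, y))
                    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        nx, ny = x + dx, y + dy
                        if 0 <= nx < L and 0 <= ny < L and occ[nx][ny] and not seen[nx][ny]:
                            seen[nx][ny] = True
                            stack.append((nx, ny))
                if len(comp) > len(best):
                    best = comp
    return best

cluster = set(largest_cluster(occ))
cx = sum(x for x, _ in cluster) // len(cluster)
cy = sum(y for _, y in cluster) // len(cluster)
for r in (5, 10, 20, 40):   # mass M(r) ~ r^{d_f}; slope of log M vs log r ~ d_f
    mass = sum(1 for x, y in cluster if abs(x - cx) <= r and abs(y - cy) <= r)
    print(r, mass)
```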

Relevance:

40.00%

Publisher:

Abstract:

We discuss a general approach to dynamic sparsity modeling in multivariate time series analysis. Time-varying parameters are linked to latent processes that are thresholded to induce zero values adaptively, providing natural mechanisms for dynamic variable inclusion/selection. We discuss Bayesian model specification, analysis and prediction in dynamic regressions, time-varying vector autoregressions, and multivariate volatility models using latent thresholding. Application to a topical macroeconomic time series problem illustrates some of the benefits of the approach in terms of statistical and economic interpretations as well as improved predictions. Supplementary materials for this article are available online. © 2013 Copyright Taylor and Francis Group, LLC.
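
The latent thresholding mechanism described above is easy to sketch: a time-varying coefficient follows a latent AR(1) process and contributes to the model only when its magnitude clears a threshold, so variables drop in and out of the regression adaptively. All parameter values below are assumed, not the paper's:

```python
# Latent thresholding sketch: b_t = beta_t * 1(|beta_t| >= d).
import random

random.seed(0)
PHI, SIGMA, D, T = 0.98, 0.05, 0.15, 200   # AR(1) persistence, innovation sd,
                                           # threshold, series length (assumed)
beta, effective = 0.3, []
for t in range(T):
    beta = PHI * beta + random.gauss(0.0, SIGMA)   # latent AR(1) process
    b_t = beta if abs(beta) >= D else 0.0          # threshold -> dynamic sparsity
    effective.append(b_t)

sparsity = sum(b == 0.0 for b in effective) / T
print(f"fraction of time the coefficient is exactly zero: {sparsity:.2f}")
```

In the full Bayesian treatment the thresholds and latent processes are given priors and sampled jointly, so the data determine when each coefficient is active.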

Relevance:

40.00%

Publisher:

Abstract:

Although financial theory rests heavily upon the assumption that asset returns are normally distributed, value indices of commercial real estate display significant departures from normality. In this paper, we apply and compare the properties of two recently proposed regime switching models for value indices of commercial real estate in the US and the UK, both of which relax the assumption that observations are drawn from a single distribution with constant mean and variance. Statistical tests of the models' specification indicate that the Markov switching model is better able to capture the non-stationary features of the data than the threshold autoregressive model, although both describe the data better than models that allow for only one state. Our results have several implications for theoretical models and empirical research in finance.
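
For concreteness, here is a hypothetical two-regime threshold autoregressive (TAR) simulator of the kind the paper compares against the Markov switching model; the coefficients and threshold are invented:

```python
# Two-regime TAR: the AR coefficient depends on whether the last value
# is above or below a threshold (all parameters invented).
import random

random.seed(2)
PHI_LOW, PHI_HIGH, THRESH, T = 0.9, 0.3, 0.0, 300

y = [0.0]
for t in range(1, T):
    phi = PHI_LOW if y[t - 1] <= THRESH else PHI_HIGH   # regime by last value
    y.append(phi * y[t - 1] + random.gauss(0.0, 1.0))

# Because persistence differs across regimes, the unconditional distribution
# is a mixture rather than a single normal, the kind of departure from
# normality the paper documents in real estate value indices.
```

The Markov switching alternative replaces the observable threshold rule with a hidden state that switches according to transition probabilities, which is the flexibility the specification tests favor here.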

Relevance:

40.00%

Publisher:

Abstract:

The goal of this paper is to examine evidence for co-integration between nominal exchange rates for Canada, the UK, Japan, Germany, Italy and France (G6) vis-à-vis the US dollar, and the relative price ratios using monthly data over the period 1973:01 to 1997:04. Motivated by the fact that exchange rate adjustment may be asymmetric, we allowed for asymmetric adjustment in exchange rates by using the threshold autoregressive model and the momentum threshold autoregressive model. We do not find any evidence of a co-integrating relationship; hence, we fail to establish long-run purchasing power parity.
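
A brief sketch (our illustration, not the paper's estimation code) of the asymmetric adjustment behind the momentum threshold autoregressive (M-TAR) specification: deviations from the long-run relation revert at different speeds depending on whether they were last rising or falling.

```python
# M-TAR style asymmetric mean reversion of a cointegration residual mu_t:
# speed rho1 applies when the deviation was recently rising, rho2 when
# falling (all parameters invented for illustration).
import random

random.seed(3)
RHO_UP, RHO_DOWN, T = -0.05, -0.30, 300   # asymmetric adjustment speeds

mu = [0.0, 0.1]
for t in range(2, T):
    rising = (mu[t - 1] - mu[t - 2]) >= 0          # Heaviside indicator on the change
    rho = RHO_UP if rising else RHO_DOWN
    d_mu = rho * mu[t - 1] + random.gauss(0.0, 0.5)
    mu.append(mu[t - 1] + d_mu)

# Cointegration requires mean reversion in both regimes (rho1, rho2 < 0);
# the paper's tests find no such relation for the G6 rates and price ratios.
```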