895 results for Anchoring heuristic


Relevance: 10.00%

Abstract:

Ecological theory predicts that communities using the same resources should have similar structure, but evolutionary constraints on colonization and niche shifts may hamper such convergence. Multitrophic communities of wasps exploiting fig fruits, which first evolved about 75 MYA, do not show long-term “inheritance” of taxonomic (lineage) composition or species diversity. However, communities on three continents have converged ecologically in the presence and relative abundance of five insect guilds that we define. Some taxa fill the same niches in each community (phylogenetic niche conservatism). However, we show that overall convergence in ecological community structure also depends on a combination of niche shifts by resident lineages and local colonizations of figs by other insect lineages. Our study explores new ground, and develops new heuristic tools, in combining ecology and phylogeny to address patterns in the complex multitrophic communities of insects on plants, which comprise a large part of terrestrial biodiversity.

Relevance: 10.00%

Abstract:

One of the most problematic aspects of the ‘Harvard School’ of liberal international theory is its failure to fulfil its own methodological ideals. Although Harvard School liberals subscribe to a nomothetic model of explanation, in practice they employ their theories as heuristic resources. Given this practice, we should expect them neither to develop candidate causal generalizations nor to be value-neutral: their explanatory insights are underpinned by value-laden choices about which questions to address and what concepts to employ. A key question for liberal theorists, therefore, is how a theory may be simultaneously explanatory and value-oriented. The difficulties inherent in resolving this problem are manifested in Ikenberry’s writing: whilst his work on constitutionalism in international politics partially fulfils the requirements of a more satisfactory liberal explanatory theory, his recent attempts to develop prescriptions for US foreign policy reproduce, in a new form, key failings of Harvard School realism.

Relevance: 10.00%

Abstract:

Roots are important to plants for a wide variety of processes, including nutrient and water uptake, anchoring and mechanical support, storage functions, and as the major interface between the plant and various biotic and abiotic factors in the soil environment. Therefore, understanding the development and architecture of roots holds potential for the manipulation of root traits to improve the productivity and sustainability of agricultural systems and to better understand and manage natural ecosystems. While lateral root development is a traceable process along the primary root and different stages can be found along this longitudinal axis of time and development, root system architecture is complex and difficult to quantify. Here, we comment on assays to describe lateral root phenotypes and propose ways to move forward regarding the description of root system architecture, also considering crops and the environment.

Relevance: 10.00%

Abstract:

Future climate change projections are often derived from ensembles of simulations from multiple general circulation models using heuristic weighting schemes. This study provides a more rigorous justification for such schemes by introducing a nested family of three simple analysis of variance frameworks. Statistical frameworks are essential in order to quantify the uncertainty associated with the estimate of the mean climate change response. The most general framework yields the “one model, one vote” weighting scheme often used in climate projection. However, a simpler additive framework is found to be preferable when the climate change response is not strongly model dependent. In such situations, the weighted multimodel mean may be interpreted as an estimate of the actual climate response, even in the presence of shared model biases. Statistical significance tests are derived to choose the most appropriate framework for specific multimodel ensemble data. The framework assumptions are explicit and can be checked using simple tests and graphical techniques. The frameworks can be used to test for evidence of nonzero climate response and to construct confidence intervals for the size of the response. The methodology is illustrated by application to North Atlantic storm track data from the Coupled Model Intercomparison Project phase 5 (CMIP5) multimodel ensemble. Despite large variations in the historical storm tracks, the cyclone frequency climate change response is not found to be model dependent over most of the region. This gives high confidence in the response estimates. Statistically significant decreases in cyclone frequency are found on the flanks of the North Atlantic storm track and in the Mediterranean basin.
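
As a rough illustration of the “one model, one vote” idea and the associated confidence interval (the ensemble layout and numbers below are invented; this is a sketch, not the paper's framework):

```python
import numpy as np
from scipy import stats

# Hypothetical ensemble: response[m, r] is the climate change response of
# model m in run r. Layout and numbers are invented for illustration.
rng = np.random.default_rng(0)
n_models, n_runs = 5, 4
model_bias = rng.normal(0.0, 0.5, size=n_models)  # model dependence
response = -1.0 + model_bias[:, None] + rng.normal(0.0, 0.3, size=(n_models, n_runs))

# "One model, one vote": average runs within each model first, then give
# every model equal weight in the ensemble mean.
model_means = response.mean(axis=1)
ensemble_mean = model_means.mean()

# Between-model spread yields a simple t confidence interval for the size
# of the mean response (n_models - 1 degrees of freedom).
sem = model_means.std(ddof=1) / np.sqrt(n_models)
low, high = stats.t.interval(0.95, df=n_models - 1, loc=ensemble_mean, scale=sem)
print(f"response: {ensemble_mean:.2f}  95% CI: [{low:.2f}, {high:.2f}]")
```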

Relevance: 10.00%

Abstract:

We construct a two-variable model which describes the interaction between local baroclinicity and eddy heat flux in order to understand aspects of the variance in storm tracks. It is a heuristic model for diabatically forced baroclinic instability close to baroclinic neutrality. The two-variable model has the structure of a nonlinear oscillator. It exhibits some realistic properties of observed storm track variability, most notably the intermittent nature of eddy activity. This suggests that apparent threshold behaviour can be more accurately and succinctly described by a simple nonlinearity. An analogy is drawn with triggering of convective events.
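
The oscillator structure can be sketched with a hypothetical pair of equations in which baroclinicity is built up by diabatic forcing and depleted by the eddy heat flux, while the flux grows on supercriticality; the exact equations and constants used in the paper are not reproduced here:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Two-variable sketch: baroclinicity b is restored by forcing F and
# depleted by eddy heat flux e; e grows only when b exceeds the neutral
# value b0. Placeholder equations chosen to reproduce the oscillator
# structure, not the paper's exact model.
F, b0, tau = 1.0, 1.0, 1.0

def rhs(t, y):
    b, e = y
    return [F - b * e,            # forcing vs. depletion by eddies
            (b - b0) * e / tau]   # eddies feed on supercriticality

sol = solve_ivp(rhs, (0.0, 200.0), [b0, 0.05], max_step=0.05)
e = sol.y[1]
# Long quiescent spells punctuated by bursts of eddy activity show the
# intermittency noted in the abstract.
print("median vs. max eddy activity:", np.median(e), e.max())
```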

Relevance: 10.00%

Abstract:

Over the last decade, due to the Gravity Recovery And Climate Experiment (GRACE) mission and, more recently, the Gravity field and steady-state Ocean Circulation Explorer (GOCE) mission, our ability to measure the ocean’s mean dynamic topography (MDT) from space has improved dramatically. Here we use GOCE to measure surface current speeds in the North Atlantic and compare our results with a range of independent estimates that use drifter data to improve small scales. We find that, with filtering, GOCE can recover 70% of the Gulf Stream strength relative to the best drifter-based estimates. In the subpolar gyre the boundary currents obtained from GOCE are close to the drifter-based estimates. Crucial to this result is careful filtering, which is required to remove small-scale errors, or noise, in the computed surface. We show that our heuristic noise metric, used to determine the degree of filtering, compares well with the quadratic sum of the mean sea surface and formal geoid errors obtained from the error variance–covariance matrix associated with the GOCE gravity model. At a resolution of 100 km the North Atlantic mean GOCE MDT error before filtering is 5 cm, with almost all of this coming from the GOCE gravity model.
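
The quadratic sum mentioned above is a root-sum-of-squares of independent error sources; a minimal sketch with invented error grids (chosen so the geoid term dominates, consistent with the abstract):

```python
import numpy as np

# Root-sum-of-squares combination of independent error sources, the
# benchmark for the heuristic noise metric. The two grids are invented
# stand-ins for the mean sea surface and GOCE geoid formal errors at
# ~100 km resolution.
mss_error = np.full((90, 180), 0.010)    # metres per grid cell
geoid_error = np.full((90, 180), 0.049)  # metres per grid cell

mdt_error = np.sqrt(mss_error**2 + geoid_error**2)
print(f"combined MDT error: {100 * mdt_error.mean():.1f} cm")  # ~5 cm
```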

Relevance: 10.00%

Abstract:

The purpose of the current article is to support the investigation of linguistic relativity in second language acquisition and to sketch the methodological and theoretical prerequisites for developing the domain into a full research program. We identify and discuss three theoretical-methodological components that we believe are needed to succeed in this enterprise. First, we highlight the importance of using nonverbal methods to study linguistic relativity effects in second language (L2) speakers. The use of nonverbal tasks is necessary in order to avoid the circularity that arises when inferences about nonverbal behavior are made on the basis of verbal evidence alone. Second, we identify and delineate the likely cognitive mechanisms underpinning cognitive restructuring in L2 speakers by introducing the theoretical framework of associative learning. By doing so, we demonstrate that the extent and nature of cognitive restructuring in L2 speakers is essentially a function of variation in individual learners’ trajectories. Third, we offer an in-depth discussion of the factors (e.g., L2 proficiency and L2 use) that characterize those trajectories, anchoring them to the framework of associative learning and reinterpreting their relative strength in predicting L2 speaker cognition.

Relevance: 10.00%

Abstract:

In the UK, architectural design is regulated through a system of design control in the public interest, which aims to secure and promote ‘quality’ in the built environment. Design control is primarily implemented by locally employed planning professionals with political oversight, and by independent design review panels staffed predominantly by design professionals. Design control has a lengthy and complex history, with the concept of ‘design’ offering a range of challenges for a regulatory system of governance. A simultaneously creative and emotive discipline, architectural design is difficult to regulate objectively or consistently, often leading to policy that is regarded as highly discretionary and flexible. This makes regulatory outcomes difficult to predict, as the approaches undertaken by the ‘agents of control’ can vary according to the individual. The role of the design controller is therefore central: tasked with the responsibility of interpreting design policy and guidance, appraising design quality and passing professional judgment. However, little is really known about what influences the way design controllers approach their task, casting a ‘veil’ over design control and shrouding the basis of their decisions. This research engaged directly with the attitudes and perceptions of design controllers in the UK, lifting this ‘veil’. Using in-depth interviews and Q-Methodology, the thesis explores this hidden element of control, revealing a number of key differences in how controllers approach and implement policy and guidance, conceptualise design quality, and rationalise their evaluations and judgments. The research develops a conceptual framework for agency in design control, consisting of six variables (Regulation; Discretion; Skills; Design Quality; Aesthetics; and Evaluation), and it is suggested that this could act as a ‘heuristic’ instrument for UK controllers, prompting more reflexivity in evaluating their own position, approaches, and attitudes, and leading to better practice and increased transparency of control decisions.

Relevance: 10.00%

Abstract:

The performance of rank dependent preference functionals under risk is comprehensively evaluated using Bayesian model averaging. Model comparisons are made at three levels of heterogeneity plus three ways of linking deterministic and stochastic models: the differences in utilities, the differences in certainty equivalents, and contextual utility. Overall, the "best model", which is conditional on the form of heterogeneity, is a form of Rank Dependent Utility or Prospect Theory that captures the majority of behaviour at both the representative agent and individual level. However, the curvature of the probability weighting function for many individuals is S-shaped, or ostensibly concave or convex, rather than the inverse S-shape commonly employed. Also, contextual utility is broadly supported across all levels of heterogeneity. Finally, the Priority Heuristic model, previously examined within a deterministic setting, is estimated within a stochastic framework; allowing for endogenous thresholds does improve model performance, although it does not compete well with the other specifications considered.
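
The contrast between inverse S-shaped and S-shaped probability weighting can be made concrete with the standard one-parameter Tversky-Kahneman (1992) form, used here purely as an illustration (the paper's estimated functional form may differ):

```python
import numpy as np

def tk_weight(p, gamma):
    """Tversky-Kahneman (1992) probability weighting function; gamma < 1
    gives the familiar inverse S-shape, gamma > 1 an S-shape."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

p = np.array([0.01, 0.25, 0.50, 0.75, 0.99])
print("inverse-S (gamma=0.6):", np.round(tk_weight(p, 0.6), 3))  # overweights small p
print("S-shaped  (gamma=1.5):", np.round(tk_weight(p, 1.5), 3))  # underweights small p
```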

Relevance: 10.00%

Abstract:

Advances in hardware and software technologies make it possible to capture streaming data. The area of Data Stream Mining (DSM) is concerned with the analysis of these vast amounts of data as they are generated in real time. Data stream classification is one of the most important DSM techniques, allowing previously unseen data instances to be classified. Unlike traditional classifiers for static data, data stream classifiers need to adapt to concept changes (concept drift) in the stream in real time in order to reflect the most recent concept in the data as accurately as possible. A recent addition to the data stream classifier toolbox is eRules, which induces and updates a set of expressive rules that can easily be interpreted by humans. However, like most rule-based data stream classifiers, eRules exhibits poor computational performance when confronted with continuous attributes. In this work, we propose an approach to deal with continuous data effectively and accurately in rule-based classifiers by using the Gaussian distribution as a heuristic for building rule terms on continuous attributes. We show, on the example of eRules, that incorporating our method for continuous attributes indeed speeds up the real-time rule induction process while maintaining a similar level of accuracy compared with the original eRules classifier. We term this new version of eRules with our approach G-eRules.
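
One plausible reading of the Gaussian heuristic, shown as a sketch rather than the exact G-eRules procedure, is to fit a normal distribution to an attribute's values within the target class and derive an interval rule term around the mean:

```python
import statistics

def gaussian_rule_term(values, attribute, k=2.0):
    """Fit a Gaussian to a continuous attribute's values within the target
    class and keep mean +/- k standard deviations as the rule term.
    A plausible sketch, not the exact G-eRules procedure."""
    mu = statistics.fmean(values)
    sigma = statistics.stdev(values)
    return attribute, mu - k * sigma, mu + k * sigma

# Hypothetical petal lengths (cm) observed for the target class:
attr, low, high = gaussian_rule_term([4.5, 4.7, 5.0, 4.4, 4.9, 5.1], "petal_length")
print(f"IF {low:.2f} <= {attr} <= {high:.2f} THEN class = target")
```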

Relevance: 10.00%

Abstract:

We define and experimentally test a public provision mechanism that meets three basic ethical requirements and allows community members to influence, via monetary bids, which of several projects is implemented. For each project, participants are assigned personal values, which can be positive or negative. We provide either public or private information about personal values. This produces two distinct public provision games, which are experimentally implemented and analyzed for various projects. In spite of the complex experimental task, participants do not rely on bidding their own personal values as an obvious simple heuristic whose general acceptance would result in fair and efficient outcomes. Rather, they rely on strategic underbidding. Although underbidding is affected by projects’ characteristics, the provision mechanism mostly leads to the implementation of the most efficient project.
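
A toy sketch of why strategic underbidding can still select the efficient project (the values and the uniform shading rule below are invented; the actual mechanism and its three ethical requirements are not reproduced here):

```python
# Toy version of the provision game: each participant holds a personal
# value, positive or negative, for every project; the project with the
# highest total bid is implemented. If all participants shade their bids
# by roughly the same factor, the ranking of projects is preserved and
# the efficient project still wins.
values = {                       # participant -> value per project
    "p1": {"A": 5, "B": -2, "C": 1},
    "p2": {"A": 3, "B": 4, "C": -1},
    "p3": {"A": -1, "B": 3, "C": 2},
}
shade = 0.5                      # bid only half of one's true value

projects = ("A", "B", "C")
total_value = {j: sum(v[j] for v in values.values()) for j in projects}
total_bid = {j: shade * total_value[j] for j in projects}

efficient = max(total_value, key=total_value.get)
implemented = max(total_bid, key=total_bid.get)
print(efficient, implemented)    # both "A": ranking survives uniform shading
```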

Relevance: 10.00%

Abstract:

To accelerate computation of the convex hull of a set of n points, a heuristic procedure is often applied first to reduce the input to a set of s points, s ≤ n, whose convex hull is the same. We present an algorithm to precondition 2D data with integer coordinates bounded by a box of size p × q before building a 2D convex hull, with three distinct advantages. First, we prove that under the condition min(p, q) ≤ n the algorithm executes in time within O(n); second, no explicit sorting of data is required; and third, the reduced set of s points forms a simple polygonal chain and thus can be directly pipelined into an O(n) time convex hull algorithm. This paper empirically evaluates and quantifies the speedup gained by preconditioning a set of points with a method based on the proposed algorithm before using common convex hull algorithms to build the final hull. A speedup factor of at least four is consistently found in experiments on various datasets when the condition min(p, q) ≤ n holds; the smaller the ratio min(p, q)/n in the dataset, the greater the speedup factor achieved.
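
A simplified variant in the spirit of this preconditioning, not the paper's exact algorithm: bucket the integer points by x column, keep only the two y-extremes per column, and emit them as a simple polygonal chain; with min(p, q) ≤ n this runs in O(n) without explicit sorting:

```python
def precondition(points, p):
    """Reduce integer points with x in [0, p] to a candidate set that
    preserves the convex hull: bucket by x (no explicit sort) and keep
    only the extreme y values in each column, emitted as a simple
    polygonal chain. A simplified sketch, not the paper's algorithm."""
    lo = [None] * (p + 1)          # min y per x column
    hi = [None] * (p + 1)          # max y per x column
    for x, y in points:
        if lo[x] is None or y < lo[x]:
            lo[x] = y
        if hi[x] is None or y > hi[x]:
            hi[x] = y
    xs = [x for x in range(p + 1) if lo[x] is not None]
    # Bottom chain left to right, then top chain right to left.
    return ([(x, lo[x]) for x in xs]
            + [(x, hi[x]) for x in reversed(xs) if hi[x] != lo[x]])

pts = [(0, 0), (3, 1), (1, 4), (3, 5), (0, 2), (2, 2), (1, 1)]
print(precondition(pts, 3))   # at most 2 points per column survive
```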

Relevance: 10.00%

Abstract:

The Team Formation Problem (TFP) has become a well-known problem in the OR literature over the last few years. In this problem, a group of individuals matching a required set of skills must be chosen so as to maximise one or several positive social attributes. Specifically, the aim of the current research is two-fold. First, two new dimensions are added to the TFP by considering multiple projects and fractions of people's dedication. This new problem is named the Multiple Team Formation Problem (MTFP). Second, an optimization model consisting of a quadratic objective function, linear constraints and integer variables is proposed for the problem. The optimization model is solved by three algorithms: a Constraint Programming approach provided by a commercial solver, a Local Search heuristic and a Variable Neighbourhood Search metaheuristic. These three algorithms constitute the first attempt to solve the MTFP, with the Variable Neighbourhood Search metaheuristic being the most efficient in almost all cases. Applications of this problem commonly appear in real-life situations, particularly with the current and ongoing development of social network analysis. Therefore, this work opens multiple paths for future research.
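
As a minimal sketch of the Local Search idea on a much-simplified quadratic team-formation objective (the synergy matrix and setup are invented; the real MTFP additionally models skills, multiple projects and fractional dedication):

```python
import itertools, random

# Split n people into two equal teams so that total within-team pairwise
# synergy is maximised: a quadratic objective over binary assignments.
random.seed(1)
n = 6
synergy = [[random.random() for _ in range(n)] for _ in range(n)]

def score(assign):
    return sum(synergy[i][j]
               for i, j in itertools.combinations(range(n), 2)
               if assign[i] == assign[j])

assign = [i % 2 for i in range(n)]       # balanced starting assignment
best = score(assign)
improved = True
while improved:                          # first-improvement swap moves
    improved = False
    for i, j in itertools.combinations(range(n), 2):
        if assign[i] != assign[j]:
            assign[i], assign[j] = assign[j], assign[i]
            s = score(assign)
            if s > best:
                best, improved = s, True
            else:
                assign[i], assign[j] = assign[j], assign[i]  # undo swap
print(best, assign)
```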

Relevance: 10.00%

Abstract:

For the diagnosis and prognosis of problems of quality of life, a multidisciplinary ecosystemic approach encompasses four dimensions of being-in-the-world, as donors and recipients: intimate, interactive, social and biophysical. Social, cultural and environmental vulnerabilities are understood and dealt with, in different circumstances of space and time, as the conjugated effect of all dimensions of being-in-the-world, as they induce the events (deficits and assets), cope with consequences (desired or undesired) and contribute to change. Instead of fragmented and reduced representations of reality, the diagnosis and prognosis of cultural, educational, environmental and health problems consider the connections (assets) and ruptures (deficits) between the different dimensions, providing a planning model to develop and evaluate research, teaching programmes, public policies and field projects. The methodology is participatory, experiential and reflexive; heuristic-hermeneutic processes unveil the cultural and epistemic paradigms that orient subject-object relationships, giving people the opportunity to reflect on their own realities, engage in new experiences and find new ways to live better in a better world. The proposal is a creative model for thought and practice, providing many opportunities for discussion, debate and the development of holistic projects integrating different scientific domains (social sciences, psychology, education, philosophy, etc.).

Relevance: 10.00%

Abstract:

Melanin granule (melanosome) dispersion within Xenopus laevis melanophores is evoked either by light or by alpha-MSH. We have previously demonstrated that the initial biochemical steps of light and alpha-MSH signaling are distinct, since the increase in cAMP observed in response to alpha-MSH was not seen after light exposure. cAMP concentrations in response to alpha-MSH were significantly lower in cells pre-exposed to light as compared to the levels in dark-adapted melanophores. Here we demonstrate the presence of an adenylyl cyclase (AC) in the Xenopus melanophore, similar to the mammalian type IX, which is inhibited by Ca(2+)-calmodulin-activated phosphatase. This finding supports the hypothesis that the cyclase could be negatively modulated by a light-promoted Ca(2+) increase. In fact, the activity of calcineurin (PP2B) phosphatase was increased by light, which could result in AC IX inhibition, thus decreasing the response to alpha-MSH. St-Ht31, a disrupting agent of the complex between protein kinase A (PKA) and A-kinase anchoring protein (AKAP), totally blocked the melanosome-dispersing response to alpha-MSH, but did not impair the photo-response in Xenopus melanophores. Sequence comparison of a melanophore AKAP partial clone with GenBank sequences showed that the anchoring protein was a gravin-like adaptor previously sequenced from Xenopus non-pigmentary tissues. Co-immunoprecipitation of Xenopus AKAP and the catalytic subunit of PKA demonstrated that PKA is associated with AKAP and is released in the presence of alpha-MSH. We conclude that in X. laevis melanophores, AKAP12 (gravin-like) contains a site for binding inactive PKA, thus compartmentalizing PKA signaling, and also possesses binding sites for PKC. Light diminishes the alpha-MSH-induced increase of cAMP by increasing calcineurin (PP2B) activity, which in turn inhibits adenylyl cyclase type IX, and/or by activating PKC, which phosphorylates the gravin-like molecule, thus destabilizing its binding to the cell membrane.