44 results for design-based inference
in CentAUR: Central Archive at the University of Reading - UK
Abstract:
The utility of an "ecologically rational" recognition-based decision rule in multichoice decision problems is analyzed, varying the type of judgment required (greater or lesser). The maximum size and range of a counterintuitive advantage associated with recognition-based judgment (the "less-is-more effect") are identified for a range of cue validity values. Greater ranges of the less-is-more effect occur when participants are asked which is the greatest of m choices (m > 2) than which is the least. Less-is-more effects also have greater range for larger values of m. This implies that the classic two-alternative forced-choice task, as studied by Goldstein and Gigerenzer (2002), may not be the most appropriate test case for less-is-more effects.
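For reference, the two-alternative baseline that this abstract argues against treating as the default test case is usually written following Goldstein and Gigerenzer (2002): with N items of which n are recognised, recognition validity alpha and knowledge validity beta, the expected proportion of correct choices is

    \[
      f(n) \;=\; \frac{2\,n\,(N-n)}{N(N-1)}\,\alpha
           \;+\; \frac{n\,(n-1)}{N(N-1)}\,\beta
           \;+\; \frac{(N-n)(N-n-1)}{N(N-1)}\cdot\frac{1}{2},
    \]

and a less-is-more effect (a peak in f(n) before n = N) arises whenever alpha exceeds beta. The multichoice (m > 2) expressions analysed in the paper are not reproduced here.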
Abstract:
Inference on the basis of recognition alone is assumed to occur prior to accessing further information (Pachur & Hertwig, 2006). A counterintuitive result of this is the “less-is-more” effect: a drop in the accuracy with which choices are made as to which of two or more items scores highest on a given criterion as more items are learned (Frosch, Beaman & McCloy, 2007; Goldstein & Gigerenzer, 2002). In this paper, we show that less-is-more effects are not unique to recognition-based inference but can also be observed with a knowledge-based strategy provided two assumptions, limited information and differential access, are met. The LINDA model, which embodies these assumptions, is presented. Analysis of the less-is-more effects predicted by LINDA and by recognition-driven inference shows that these occur for similar reasons and casts doubt upon the “special” nature of recognition-based inference. Suggestions are made for empirical tests to compare knowledge-based and recognition-based less-is-more effects.
Abstract:
In participatory design situations, the competence of the facilitator will influence the opportunities for a user group to become engaged in the process of design. Based on observation of the conversations from a series of design workshops, the performance of design facilitation expertise by an expert architect is compared with that of a less experienced architectural graduate. The skills that are the focus of this research are the conversational competences deployed by architects to engage users in the design of an architectural project. The differences between the conversational behaviour of the project architect and the less experienced graduate are used to illustrate, with examples, the effect that the performance of facilitation had on the opportunities for user engagement in design, and the learning of facilitation skills that occurred in these situations.
Abstract:
Studies of ignorance-driven decision making have been employed either to analyse, on theoretical grounds, when ignorance should prove advantageous, or to examine whether human behaviour is consistent with an ignorance-driven inference strategy (e.g., the recognition heuristic). In the current study we examine whether, under conditions where such inferences might be expected, the advantages that theoretical analyses predict are evident in human performance data. A single experiment shows that, when asked to make relative wealth judgements, participants reliably use recognition as a basis for their judgements. Their wealth judgements under these conditions are reliably more accurate when some of the target names are unknown than when participants recognize all of the names (a "less-is-more effect"). These results are consistent across a number of variations: the number of options given to participants and the nature of the wealth judgement. A basic model of recognition-based inference predicts these effects.
Abstract:
“Fast & frugal” heuristics represent an appealing way of implementing bounded rationality and decision-making under pressure. The recognition heuristic is the simplest and most fundamental of these heuristics. Simulation and experimental studies have shown that this ignorance-driven heuristic can prove superior to knowledge-based inference (Borges, Goldstein, Ortmann & Gigerenzer, 1999; Goldstein & Gigerenzer, 2002) and have shown how the heuristic could develop from ACT-R’s forgetting function (Schooler & Hertwig, 2005). Mathematical analyses also demonstrate that, under certain conditions, a “less-is-more effect” will always occur (Goldstein & Gigerenzer, 2002). The further analyses presented in this paper show, however, that these conditions may constitute a special case and that the less-is-more effect in decision-making is subject to the moderating influence of the number of options to be considered and the framing of the question.
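As a concrete illustration of the condition referred to above, the sketch below evaluates the two-alternative accuracy formula given earlier and locates the accuracy peak. The values of N, alpha and beta are illustrative assumptions, not figures from the paper:

    # A minimal sketch of the two-alternative accuracy formula from Goldstein &
    # Gigerenzer (2002); N, alpha and beta are illustrative values, not the paper's.
    def accuracy(n, N, alpha, beta):
        """Expected proportion correct when n of N items are recognised."""
        pairs = N * (N - 1)
        p_one = 2 * n * (N - n) / pairs         # exactly one item recognised -> use recognition
        p_both = n * (n - 1) / pairs            # both recognised -> use knowledge
        p_none = (N - n) * (N - n - 1) / pairs  # neither recognised -> guess
        return p_one * alpha + p_both * beta + p_none * 0.5

    N, alpha, beta = 100, 0.8, 0.6              # alpha > beta: a less-is-more effect is expected
    best_n = max(range(N + 1), key=lambda n: accuracy(n, N, alpha, beta))
    print("accuracy with full recognition:", round(accuracy(N, N, alpha, beta), 3))
    print("peak accuracy at n =", best_n, ":", round(accuracy(best_n, N, alpha, beta), 3))

With these illustrative values the peak (about 0.68) falls at n = 60, whereas recognising all 100 items yields only beta = 0.6; the paper's point is that the size and range of this advantage depend on the number of options considered and on how the question is framed.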
Abstract:
A common problem in many data-based modelling algorithms, such as associative memory networks, is the curse of dimensionality. In this paper, a new two-stage neurofuzzy system design and construction algorithm (NeuDeC) for nonlinear dynamical processes is introduced to tackle this problem effectively. A new, simple preprocessing method is initially derived and applied to reduce the rule base, followed by a fine model detection process on the reduced rule set using forward orthogonal least squares model structure detection. In both stages, new A-optimality experimental design-based criteria are used. In the preprocessing stage, a lower bound of the A-optimality design criterion is derived and applied as a subset selection metric, while in the later stage the A-optimality design criterion is incorporated into a new composite cost function that minimises model prediction error as well as penalising the model parameter variance. The use of NeuDeC leads to unbiased model parameters with low parameter variance and the additional benefit of a parsimonious model structure. Numerical examples are included to demonstrate the effectiveness of this new modelling approach for high-dimensional inputs.
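The A-optimality criterion underlying both stages can be illustrated independently of NeuDeC itself. The sketch below uses a randomly generated regressor matrix and arbitrary candidate subsets (all assumptions, not anything from the paper) and scores each subset by trace((XᵀX)⁻¹), the quantity whose lower bound and composite form the algorithm builds on:

    import numpy as np

    # Illustrative only: score candidate regressor/rule subsets by the A-optimality
    # criterion trace((X^T X)^{-1}), i.e. the sum of parameter variances up to the
    # noise scale.  Lower scores indicate better-conditioned, lower-variance estimates.
    def a_optimality(X):
        info = X.T @ X                        # information matrix of the candidate design
        return np.trace(np.linalg.inv(info))

    rng = np.random.default_rng(0)
    X_full = rng.normal(size=(200, 10))       # hypothetical full set of candidate regressors

    subset_a = [0, 1, 2, 3]                   # two arbitrary candidate rule subsets
    subset_b = [0, 1, 8, 9]
    print(a_optimality(X_full[:, subset_a]), a_optimality(X_full[:, subset_b]))

In NeuDeC the preprocessing stage uses a lower bound of this score as a cheap subset-selection metric, and the second stage folds it into a composite cost alongside the prediction error; neither refinement is reproduced in this sketch.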
Abstract:
Sampling strategies for monitoring the status and trends in wildlife populations are often determined before the first survey is undertaken. However, there may be little information about the distribution of the population, and so the sample design may be inefficient. Through time, as data are collected, more information about the distribution of animals in the survey region is obtained, but it can be difficult to incorporate this information into the survey design. This paper introduces a framework for monitoring motile wildlife populations within which the design of future surveys can be adapted using data from past surveys whilst ensuring consistency in design-based estimates of status and trends through time. In each survey, part of the sample is selected from the previous survey sample using simple random sampling. The rest is selected with inclusion probability proportional to predicted abundance. Abundance is predicted using a model constructed from previous survey data and covariates for the whole survey region. Unbiased design-based estimators of status and trends and their variances are derived from two-phase sampling theory. Simulations over the short and long term indicate that, in general, more precise estimates of status and trends are obtained using this mixed strategy than with a strategy in which all of the sample is retained or all of it is selected with probability proportional to predicted abundance. Furthermore, the mixed strategy is robust to poor predictions of abundance. Estimates of status are more precise than those obtained from a rotating panel design.
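To make the selection scheme concrete, the sketch below sets up a toy population with made-up predicted abundances, retains half of the new sample by simple random sampling from the previous sample, draws the other half with probability proportional to predicted abundance, and forms a Horvitz-Thompson-type estimate of total abundance. The inclusion probabilities are rough approximations for illustration, not the exact two-phase quantities derived in the paper:

    import numpy as np

    rng = np.random.default_rng(42)
    N = 500                                            # population units in the survey region
    predicted = rng.gamma(2.0, 5.0, size=N)            # model-predicted abundance per unit (assumed)
    true_y = rng.poisson(predicted)                    # true (unknown) abundance per unit

    prev = rng.choice(N, size=100, replace=False)      # previous survey sample

    # Part 1: simple random subsample of the previous sample.
    part1 = rng.choice(prev, size=50, replace=False)

    # Part 2: units drawn with probability proportional to predicted abundance.
    p = predicted / predicted.sum()
    part2 = rng.choice(N, size=50, replace=False, p=p)

    sample = np.union1d(part1, part2)

    # Approximate per-unit inclusion probabilities under this scheme (illustrative only).
    pi1 = (100 / N) * (50 / 100)                       # SRS into previous sample, then SRS retention
    pi2 = 1 - (1 - p) ** 50                            # rough PPS-without-replacement approximation
    pi = 1 - (1 - pi1) * (1 - pi2)                     # unit can enter via either route
    total_hat = np.sum(true_y[sample] / pi[sample])    # Horvitz-Thompson-type estimate of the total
    print(total_hat, true_y.sum())

The paper's contribution is the exact design-based estimation of status, trends and their variances under this mixed scheme, which the approximations above do not attempt.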
Abstract:
Competency management is a very important part of a well-functioning organisation. Unfortunately, competency descriptions are not uniformly specified or defined across national, sectoral, or organisational borders, leading to an opaque competency-description market with a multitude of competency frameworks and competency benchmarks. An ontology is a formalised description of a domain; it enables automated reasoning engines to be built which, by utilising the interrelations between entities, can make “intelligent” choices in different situations within the domain. By introducing formalised competency ontologies, automated tools such as skill gap analysis, training suggestion generation, and job search and recruitment can be developed which compare and contrast different competency descriptions at the semantic level. The major problem with defining a common formalised ontology for competencies is that there are so many viewpoints of competencies and competency frameworks. Work within the TRACE project has focused on finding common trends within different competency frameworks in order to allow an intermediate competency description to be made, which other frameworks can reference. This research has shown that competencies can be divided into “knowledge”, “skills” and what we call “others”. An ontology has been created on this basis, with a simple structure of different “kinds” of “knowledges” and “skills” using semantic interrelations to define the basic semantic structure of the ontology. A prototype tool for performing skill gap analysis has been developed. Personal profiles can be produced using the tool, and a skill gap analysis is performed against a desired competency profile by using an ontologically based inference engine, which is able to list the closest fit and possible proficiency gaps.
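As a flat, non-ontological stand-in for the prototype described above, the sketch below shows the general shape of a skill gap analysis over profiles split into "knowledge", "skills" and "others". The profile contents, proficiency scale and field names are hypothetical, and the semantic interrelations that the real inference engine exploits are not modelled:

    from dataclasses import dataclass, field

    @dataclass
    class Profile:
        knowledge: dict = field(default_factory=dict)   # e.g. {"relational databases": 3}
        skills: dict = field(default_factory=dict)      # e.g. {"SQL": 2}
        others: dict = field(default_factory=dict)      # e.g. {"teamwork": 2}

    def skill_gap(personal: Profile, desired: Profile):
        """Return competencies that are missing or below the desired proficiency level."""
        gaps = {}
        for kind in ("knowledge", "skills", "others"):
            have, need = getattr(personal, kind), getattr(desired, kind)
            gaps[kind] = {name: level - have.get(name, 0)
                          for name, level in need.items()
                          if have.get(name, 0) < level}
        return gaps

    me = Profile(knowledge={"relational databases": 3}, skills={"SQL": 1})
    job = Profile(knowledge={"relational databases": 2}, skills={"SQL": 3}, others={"teamwork": 2})
    print(skill_gap(me, job))   # proficiency gaps for "SQL" and "teamwork"

An ontology-backed version would additionally let related competencies (for example, a more specific skill subsumed by a broader one) partially satisfy a requirement, which a plain dictionary comparison cannot express.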
Abstract:
Inferences consistent with “recognition-based” decision-making may be drawn for various reasons other than recognition alone. We demonstrate that, for 2-alternative forced-choice decision tasks, less-is-more effects (reduced performance with additional learning) are not restricted to recognition-based inference but can also be seen in circumstances where inference is knowledge-based but item knowledge is limited. One reason why such effects may not be observed more widely is the dependence of the effect on specific values for the validity of recognition and knowledge cues. We show that both recognition and knowledge validity may vary as a function of the number of items recognized. The implications of these findings for the special nature of recognition information, and for the investigation of recognition-based inference, are discussed.
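The sketch below is one assumed toy environment, not the paper's analysis: recognition is driven by a noisy "fame" signal correlated with the criterion, and recognition validity is computed over all recognised/unrecognised pairs for different numbers of items recognised, showing that it need not stay fixed as learning proceeds:

    import random

    # Assumed toy environment: recognition follows a noisy "fame" signal correlated
    # with the criterion; recognition validity is the proportion of recognised/unrecognised
    # pairs in which the recognised item has the higher criterion value.
    random.seed(1)
    N = 1000
    criterion = [random.gauss(0, 1) for _ in range(N)]
    fame = [c + random.gauss(0, 1) for c in criterion]
    by_fame = sorted(range(N), key=lambda i: fame[i], reverse=True)

    def recognition_validity(n):
        recognised, unrecognised = by_fame[:n], by_fame[n:]
        wins = sum(criterion[i] > criterion[j] for i in recognised for j in unrecognised)
        return wins / (len(recognised) * len(unrecognised))

    for n in (100, 300, 500, 700, 900):
        print(n, round(recognition_validity(n), 3))

The direction and size of the change depend entirely on how recognition is related to the criterion; the point is only that a fixed validity value cannot be taken for granted, which is what complicates tests of recognition-based inference.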
Abstract:
Neuromuscular disorders affect millions of people worldwide. Upper limb tremor is a common symptom and, due to its complex aetiology, is difficult to compensate for except, in particular cases, by surgical intervention or drug therapy. Wearable devices that mechanically compensate for limb tremor could benefit a considerable number of patients, but the technology to assist sufferers in this way is under-developed. In this paper we propose an innovative orthosis that can dynamically suppress pathological tremor by applying viscous damping to the affected limb in a controlled manner. The orthosis utilises a new actuator design based on magneto-rheological fluids that efficiently delivers damping action in response to the instantaneous tremor frequency and amplitude.
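The sketch below is a deliberately simplified physical picture, not the orthosis controller: all parameter values are assumed, and the MR-fluid actuator is idealised as a switchable viscous damping coefficient acting on a driven mass-spring model of the limb. It shows why applying damping collapses the tremor amplitude:

    import math

    # Illustrative limb model: m*x'' + c*x' + k*x = F0*sin(2*pi*f*t), with the
    # MR-fluid actuator idealised as a controllable damping coefficient c.
    m, k = 1.5, 1480.0            # assumed effective mass (kg) and stiffness (N/m);
                                  # k chosen so the limb resonance sits near the tremor frequency
    f_tremor, F0 = 5.0, 2.0       # assumed tremor frequency (Hz) and drive amplitude (N)
    dt, T = 1e-3, 10.0

    def peak_amplitude(c):
        x = v = 0.0
        peak = 0.0
        for i in range(int(T / dt)):
            t = i * dt
            a = (F0 * math.sin(2 * math.pi * f_tremor * t) - c * v - k * x) / m
            v += a * dt            # semi-implicit Euler keeps the oscillator stable
            x += v * dt
            if t > T / 2:          # ignore the start-up transient
                peak = max(peak, abs(x))
        return peak

    print("lightly damped:", round(peak_amplitude(0.5), 4), "m")
    print("damping applied:", round(peak_amplitude(30.0), 4), "m")

The real device must additionally estimate the instantaneous tremor frequency and amplitude and modulate the damping accordingly, which this fixed-coefficient illustration does not attempt.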
Abstract:
Driven by new network and middleware technologies such as mobile broadband, near-field communication, and context awareness, the so-called ambient lifestyle will foster innovative use cases in building automation, healthcare and agriculture. In the EU project Hydra, high-level security, trust and privacy concerns such as loss of control, profiling and surveillance are considered at the outset. At the end of this project the Hydra middleware development platform will have been designed so as to enable developers to realise secure ambient scenarios, especially in the user domains of building automation, healthcare, and agriculture. This paper gives a short introduction to the Hydra project, its user domains and its approach to ensuring security by design. Based on the results of a focus group analysis of the building automation domain, typical threats are evaluated and their risks are assessed. Specific security requirements with respect to security, privacy, and trust are then derived in order to incorporate them into the Hydra Security Meta Model. How concepts such as context security, semantic security, and virtualisation support the overall Hydra approach will be introduced and illustrated on the basis of a technical building automation scenario.
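The sketch below illustrates the general shape of the risk assessment step described above; the threats and the likelihood/impact scores are hypothetical placeholders rather than findings of the Hydra focus-group analysis, and risk is taken simply as likelihood times impact to rank where security requirements are most needed:

    # Hypothetical qualitative risk assessment: rank threats by likelihood * impact.
    threats = [
        # (threat, likelihood 1-5, impact 1-5) -- illustrative values only
        ("eavesdropping on wireless sensor traffic", 4, 3),
        ("unauthorised actuation of building devices", 2, 5),
        ("profiling of occupants from usage data", 3, 4),
    ]

    ranked = sorted(((l * i, name) for name, l, i in threats), reverse=True)
    for risk, name in ranked:
        print(f"risk={risk:2d}  {name}")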
Abstract:
Driven by new network and middleware technologies such as mobile broadband, near-field communication, and context awareness, the so-called ambient lifestyle will foster innovative use cases in different domains. In the EU project Hydra, high-level security, trust and privacy concerns such as loss of control, profiling and surveillance are considered at the outset. At the end of this project the Hydra middleware development platform will have been designed so as to enable developers to realise secure ambient scenarios. This paper gives a short introduction to the Hydra project and its approach to ensuring security by design. Based on the results of a focus group analysis of the user domain "building automation", typical threats are evaluated and their risks are assessed. Specific security requirements with respect to security, privacy, and trust are then derived in order to incorporate them into the Hydra Security Meta-Model. How concepts such as context, semantic resolution of security, and virtualisation support the overall Hydra approach will be introduced and illustrated on the basis of a technical building automation scenario.
Abstract:
We report on a distributed moisture detection scheme which uses a cable design based on water-swellable hydrogel polymers. The cable modulates the loss characteristic of light guided within a multi-mode optical fibre in response to relative water potentials in the surrounding environment. Interrogation of the cable using conventional optical time-domain reflectometry (OTDR) instruments allows water ingress points to be identified and located with a spatial resolution of 50 cm. The system has been tested in a simulated tendon duct grouting experiment as a means of mapping the extent of fill along the duct during the grouting process. Voided regions were detected and identified to within 50 cm. A series of salt solutions has been used to determine the sensor behaviour over a range of water potentials. These experiments predict that measurements of soil moisture content can be made over the range 0 to −1500 kPa. Preliminary data on soil measurements have shown that the sensor can detect water pressure changes with a resolution of 45 kPa. Applications for the sensor include quality assurance of grouting procedures, verification of waterproofing barriers and soil moisture content determination (for load-bearing calculations).
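The location step relies on the standard OTDR time-of-flight relation, sketched below. The group index value is an assumption, and, under that assumption, a probe pulse of roughly 5 ns corresponds to the 50 cm spatial resolution quoted above:

    # Textbook OTDR relations, not the paper's instrument specifics: the round-trip
    # time of backscattered light locates a loss event along the fibre, and the probe
    # pulse width sets the two-point spatial resolution.
    C = 299_792_458.0          # speed of light in vacuum, m/s
    N_GROUP = 1.468            # assumed group refractive index of the fibre core

    def event_distance(round_trip_time_s: float) -> float:
        """Distance (m) from the OTDR to a reflectance/loss event."""
        return C * round_trip_time_s / (2 * N_GROUP)

    def spatial_resolution(pulse_width_s: float) -> float:
        """Two-point spatial resolution (m) set by the probe pulse width."""
        return C * pulse_width_s / (2 * N_GROUP)

    print(event_distance(1.0e-6))       # ~102 m for a 1 microsecond round trip
    print(spatial_resolution(5.0e-9))   # ~0.5 m for a 5 ns pulse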
Abstract:
Natural-ventilation potential (NVP) values can provide designers with significant information for properly designing and arranging a natural ventilation strategy at the preliminary or conceptual stage of ventilation and building design. Based on the previous study by Yang et al. [Investigating potential of natural driving forces for ventilation in four major cities in China. Building and Environment 2005;40:739–46], we developed a revised model to estimate the potential for natural ventilation, considering both thermal comfort and IAQ issues, for buildings in China. It differs from the previous model of Yang et al. in two predominant aspects: (1) the indoor air temperature varies synchronously with the outdoor air temperature rather than staying at a constant value as assumed by Yang et al., which better reflects the real characteristics of natural ventilation; and (2) a thermal comfort evaluation index is integrated into the model, so that the NVP can be predicted more reasonably. Adopting the same input parameters, the NVP values are obtained and compared with the earlier work of Yang et al. for a single building in four representative cities located in different climates: Urumqi in the severe cold region, Beijing in the cold region, Shanghai in the hot-summer and cold-winter region, and Guangzhou in the hot-summer and warm-winter region of China. Our results show that Guangzhou has the highest yearly natural-ventilation potential, followed by Shanghai, Beijing and Urumqi, which is quite distinct from the ranking of Yang et al. From the analysis, it is clear that our model evaluates the NVP values more consistently with the outdoor climate data and thus reveals the true value of NVP.
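As a rough illustration of what an NVP figure summarises, the sketch below counts the fraction of hours in a year for which outdoor air alone could keep a building within a comfort band when indoor temperature is allowed to track outdoor temperature. The comfort band, indoor-temperature offset and synthetic climates are all assumptions, none taken from the paper, and the IAQ criterion is omitted:

    import math

    def nvp(hourly_outdoor_temp_c, comfort_low=20.0, comfort_high=26.0, offset=2.0):
        """Fraction of hours for which natural ventilation alone could satisfy comfort,
        with indoor temperature assumed to track outdoor temperature plus a fixed offset."""
        ok = [comfort_low <= t + offset <= comfort_high for t in hourly_outdoor_temp_c]
        return sum(ok) / len(ok)

    def synthetic_year(mean_c, seasonal_swing_c, daily_swing_c=5.0):
        """Made-up hourly outdoor temperatures: seasonal plus daily sinusoids."""
        return [mean_c
                + seasonal_swing_c * math.sin(2 * math.pi * h / 8760)
                + daily_swing_c * math.sin(2 * math.pi * h / 24)
                for h in range(8760)]

    warm = synthetic_year(mean_c=22.0, seasonal_swing_c=6.0)    # loosely a warm-winter climate
    cold = synthetic_year(mean_c=12.0, seasonal_swing_c=14.0)   # loosely a severe-cold climate
    print(round(nvp(warm), 2), round(nvp(cold), 2))

Letting the indoor temperature follow the outdoor signal, as in the revised model, changes which hours count as ventilable and hence the resulting NVP ranking, which is the behaviour the comparison with Yang et al. highlights.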