15 results for Analysis task

in CentAUR: Central Archive University of Reading - UK


Relevance: 30.00%

Abstract:

Standard form contracts are typically developed through a negotiated consensus, unless they are proffered by one specific interest group. Previously published plans of work and other descriptions of the processes in construction projects tend to focus on operational issues, or they tend to be prepared from the point of view of one or other of the dominant interest groups. Legal practice in the UK permits those who draft contracts to define their terms as they choose. There are no definitive rulings from the courts that give an indication as to the detailed responsibilities of project participants. The science of terminology offers useful guidance for discovering and describing terms and their meanings in their practical context, but has never been used for defining terms for responsibilities of participants in the construction project management process. Organizational analysis enables the management task to be deconstructed into its elemental parts in order that effective organizational structures can be developed. Organizational mapping offers a useful technique for reducing text-based descriptions of project management roles and responsibilities to a comparable basis. Research was carried out by means of a desk study, detailed analysis of nine plans of work and focus groups representing all aspects of the construction industry. No published plan of work offers definitive guidance. There is an enormous amount of variety in the way that terms are used for identifying responsibilities of project participants. A catalogue of concepts and terms (a “Terminology”) has been compiled and indexed to enable those who draft contracts to choose the most appropriate titles for project participants. The purpose of this terminology is to enable the selection and justification of appropriate terms in order to help define roles. 
The terminology brings an unprecedented clarity to the description of roles and responsibilities in construction projects and, as such, will be helpful for anyone seeking to assemble a team and specify roles for project participants.

Relevance: 30.00%

Abstract:

Frequency recognition is an important task in many engineering fields such as audio signal processing and telecommunications engineering, for example in applications like Dual-Tone Multi-Frequency (DTMF) detection or the recognition of the carrier frequency of a Global Positioning System (GPS) signal. This paper presents results of investigations on several common Fourier Transform-based frequency recognition algorithms implemented in real time on a Texas Instruments (TI) TMS320C6713 Digital Signal Processor (DSP) core. In addition, suitable metrics are evaluated in order to ascertain which of these selected algorithms is appropriate for audio signal processing.
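As a minimal illustration of the Fourier-transform approach these algorithms share (a sketch in Python rather than the paper's real-time DSP implementation; the tone frequency and sampling rate are chosen for illustration only), a dominant frequency can be estimated from the location of the largest FFT magnitude bin:

```python
import numpy as np

def dominant_frequency(signal, sample_rate):
    """Estimate the dominant frequency via a peak search on the FFT magnitude."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return freqs[np.argmax(spectrum)]

# Synthetic check with a 697 Hz tone (one of the DTMF row frequencies)
fs = 8000                             # telephone-band sampling rate
t = np.arange(0, 0.1, 1.0 / fs)       # 100 ms of signal, 800 samples
tone = np.sin(2 * np.pi * 697.0 * t)
print(dominant_frequency(tone, fs))   # close to 697 Hz, within one FFT bin
```

With 800 samples at 8 kHz the frequency resolution is fs/N = 10 Hz, so the 697 Hz tone is resolved only to the nearest bin; practical DTMF detectors often use the Goertzel algorithm instead, which evaluates just the handful of frequencies of interest.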

Relevance: 30.00%

Abstract:

Purpose – To describe research done, as part of an EPSRC-funded project, to assist engineers working together on collaborative tasks.

Design/methodology/approach – Distributed finite state modelling and agent techniques are used successfully in a new hybrid self-organising decision-making system applied to collaborative work support. For the particular application, the tasks involved have been analysed and modelled. The system then employs a novel generic agent model, in which task and domain knowledge are isolated from the support system, which provides relevant information to the engineers.

Findings – The method is applied in the despatch of transmission commands within the control room of The National Grid Company Plc (NGC) – tasks are completed significantly faster when the system is utilised.

Research limitations/implications – The paper describes a generic approach and it would be interesting to investigate how well it works in other applications.

Practical implications – Although only one application has been studied, the methodology could equally be applied to a general class of cooperative work environments.

Originality/value – One key part of the work is the novel generic agent model that enables the task and domain knowledge, which are application-specific, to be isolated from the support system, and hence allows the method to be applied in other domains.

Relevance: 30.00%

Abstract:

This paper describes the design and implementation of an agent based network for the support of collaborative switching tasks within the control room environment of the National Grid Company plc. This work includes aspects from several research disciplines, including operational analysis, human computer interaction, finite state modelling techniques, intelligent agents and computer supported co-operative work. Aspects of these procedures have been used in the analysis of collaborative tasks to produce distributed local models for all involved users. These models have been used as the basis for the production of local finite state automata. These automata have then been embedded within an agent network together with behavioural information extracted from the task and user analysis phase. The resulting support system is capable of task and communication management within the transmission despatch environment.

Relevance: 30.00%

Abstract:

Lateral epicondylitis (LE) is hypothesized to occur as a result of repetitive, strenuous and abnormal postural activities of the elbow and wrist. There is still a lack of understanding of how wrist and forearm positions contribute to this condition during common manual tasks. In this study the wrist kinematics and the wrist extensors’ musculotendon patterns were investigated during a manual task believed to elicit LE symptoms in susceptible subjects. A 42-year-old right-handed male, with no history of LE, performed a repetitive movement involving pushing and turning a spring-loaded mechanism. Motion capture data were acquired for the upper limb and an inverse kinematic and dynamic analysis was subsequently carried out. Results illustrated the presence of eccentric contractions sustained by the extensor carpi radialis longus (ECRL), together with an almost constant level of tendon strain of both extensor carpi radialis brevis (ECRB) and extensor digitorum communis lateral (EDCL) branch. It is believed that these factors may partly contribute to the onset of LE as they are both responsible for the creation of microtears at the tendons’ origins. The methodology of this study can be used to explore muscle actions during movements that might cause or exacerbate LE.

Relevance: 30.00%

Abstract:

Understanding human movement is key to improving input devices and interaction techniques. This paper presents a study of mouse movements of motion-impaired users, with the aim of gaining a better understanding of impaired movement. The cursor trajectories of six motion-impaired users and three able-bodied users are studied according to their submovement structure. Several aspects of the movement are studied, including the frequency and duration of pauses between submovements, verification times, the number of submovements, the peak speed of submovements and the accuracy of submovements in two dimensions. Results include findings that some motion-impaired users pause more often and for longer than able-bodied users, require up to five times more submovements to complete the same task, and exhibit a correlation between error and peak submovement speed that does not exist for able-bodied users.

Relevance: 30.00%

Abstract:

The fascinating idea that tools become extensions of our body appears in artistic, literary, philosophical, and scientific works alike. In the last fifteen years, this idea has been re-framed into several related hypotheses, one of which states that tool use extends the neural representation of the multisensory space immediately surrounding the hands (variously termed peripersonal space, peri-hand space, peri-cutaneous space, action space, or near space). This and related hypotheses have been tested extensively in the cognitive neurosciences, with evidence from molecular, neurophysiological, neuroimaging, neuropsychological, and behavioural fields. Here, I briefly review the evidence for and against the hypothesis that tool use extends a neural representation of the space surrounding the hand, concentrating on neurophysiological, neuropsychological, and behavioural evidence. I then provide a re-analysis of data from six published and one unpublished experiments using the crossmodal congruency task to test this hypothesis. While the re-analysis broadly confirms the previously reported finding that tool use does not literally extend peripersonal space, the overall effect sizes are small and statistical power is low. I conclude by questioning whether the crossmodal congruency task can indeed be used to test the hypothesis that tool use modifies peripersonal space.

Relevance: 30.00%

Abstract:

Wernicke’s aphasia (WA) is a condition which results in severely disrupted language comprehension following a lesion to the left temporo-parietal region. A phonological analysis deficit has traditionally been held to be at the root of the comprehension impairment in WA, a view consistent with current functional neuroimaging, which finds areas in the superior temporal cortex responsive to phonological stimuli. However, behavioural evidence to support the link between a phonological analysis deficit and auditory comprehension has not yet been shown. This study extends seminal work by Blumstein et al. (1977) to investigate the relationship between acoustic-phonological perception, measured through phonological discrimination, and auditory comprehension in a case series of Wernicke’s aphasia participants. A novel adaptive phonological discrimination task was used to obtain reliable thresholds of the phonological perceptual distance required between nonwords before they could be discriminated. Wernicke’s aphasia participants showed significantly elevated thresholds compared to age- and hearing-matched control participants. Acoustic-phonological thresholds correlated strongly with auditory comprehension abilities in Wernicke’s aphasia. In contrast, nonverbal semantic skills showed no relationship with auditory comprehension. The results are evaluated in the context of recent neurobiological models of language; they suggest that impaired acoustic-phonological perception underlies the comprehension impairment in Wernicke’s aphasia and favour models of language which propose a leftward asymmetry in phonological analysis.

Relevance: 30.00%

Abstract:

This paper presents an investigation into learners’ and teachers’ perceptions of and criteria for task difficulty. Ten second language learners performed four oral narrative tasks and were retrospectively interviewed about which tasks they perceived as difficult, what factors affected this difficulty and how they identified and defined this task difficulty. Ten EFL/ESOL teachers were given the same tasks and asked to consider the difficulty of the tasks for their learners, and were invited to discuss the factors they believed contributed to this difficulty. Qualitative analysis of the data revealed that, although there were some differences between the two groups’ perceptions of task difficulty, there was substantial similarity between them in terms of the criteria they considered in identifying and defining task difficulty. The findings of this study lend support to the tenets of a cognitive approach to task-based language learning, and demonstrate which aspects of two models of task difficulty reflect the teachers’ and learners’ perceptions and perspectives.

Relevance: 30.00%

Abstract:

The overarching aim of the research reported here was to investigate the effects of task structure and storyline complexity of oral narrative tasks on second language task performance. Participants were 60 Iranian learners of English who performed six narrative tasks of varying degrees of structure and storyline complexity in an assessment setting. A number of detailed analytic measures were employed to examine whether there were any differences in the participants’ performances elicited by the different tasks in terms of accuracy, fluency, syntactic complexity and lexical diversity. Results of the data analysis showed that performance in the more structured tasks was more accurate and, to a large extent, more fluent than that in the less structured tasks. The results further revealed that the syntactic complexity of L2 performance was related to storyline complexity, i.e. more syntactic complexity was associated with narratives that had both foreground and background storylines. These findings strongly suggest that there is systematic variance in the participants’ performance triggered by the different aspects of task design.

Relevance: 30.00%

Abstract:

We propose, first, a simple task for eliciting attitudes toward risky choice, the SGG lottery-panel task, which consists of a series of lotteries constructed to compensate riskier options with higher risk-return trade-offs. Using Principal Component Analysis, we show that the SGG lottery-panel task is capable of capturing two dimensions of individual risky decision making, i.e. subjects’ average risk taking and their sensitivity towards variations in risk-return. From the results of a large experimental dataset, we confirm that the task systematically captures a number of regularities, such as: a tendency to risk-averse behavior (only around 10% of choices are compatible with risk neutrality); an attraction to certain payoffs compared to low-risk lotteries, compatible with the over-(under-)weighting of small (large) probabilities predicted in Prospect Theory (PT); and gender differences, i.e. males being consistently less risk averse than females, but both genders being similarly responsive to increases in risk premium. Another interesting result is that in hypothetical choices most individuals increase their risk taking in response to the increase in return to risk, as predicted by PT, while across panels with real rewards we see even more changes, but opposite to the expected pattern of riskier choices for higher risk-returns. We therefore conclude from our data that an “economic anomaly” emerges in the real-reward choices, opposite to the hypothetical choices. These findings are in line with Camerer's (1995) view that although in many domains paid subjects probably do exert extra mental effort which improves their performance, choice over money gambles is not likely to be a domain in which effort will improve adherence to rational axioms (p. 635).
Finally, we demonstrate that both dimensions of risk attitudes, average risk taking and sensitivity towards variations in the return to risk, are desirable not only to describe behavior under risk but also to explain behavior in other contexts, as illustrated by an example. In the second study, we propose three additional treatments intended to elicit risk attitudes under high stakes and mixed-outcome (gains and losses) lotteries. Using a dataset obtained from a hypothetical implementation of the tasks, we show that the new treatments are able to capture both dimensions of risk attitudes. This new dataset allows us to describe several regularities, both at the aggregate and within-subjects level. We find that in every treatment over 70% of choices show some degree of risk aversion and only between 0.6% and 15.3% of individuals are consistently risk neutral within the same treatment. We also confirm the existence of gender differences in the degree of risk taking: in all treatments females prefer safer lotteries compared to males. Regarding our second dimension of risk attitudes, we observe, in all treatments, an increase in risk taking in response to risk-premium increases. Treatment comparisons reveal other regularities, such as a lower degree of risk taking in large-stake treatments compared to low-stake treatments, and a lower degree of risk taking when losses are incorporated into the large-stake lotteries. These results are compatible with previous findings in the literature on stake-size effects (e.g., Binswanger, 1980; Bosch-Domènech & Silvestre, 1999; Hogarth & Einhorn, 1990; Holt & Laury, 2002; Kachelmeier & Shehata, 1992; Kühberger et al., 1999; Weber & Chapman, 2005; Wik et al., 2007) and domain effects (e.g., Brooks & Zank, 2005; Schoemaker, 1990; Wik et al., 2007). For small-stake treatments, however, we find that the effect of incorporating losses into the outcomes is less clear.
At the aggregate level an increase in risk taking is observed, but also more dispersion in the choices, whilst at the within-subjects level the effect weakens. Finally, regarding responses to risk premium, we find that sensitivity is lower in the mixed-lottery treatments (SL and LL) than in the gains-only treatments. In general, sensitivity to risk-return is more affected by the domain than by the stake size. Having described the properties of risk attitudes as captured by the SGG risk elicitation task and its three new versions, it is important to recall that the danger of using unidimensional descriptions of risk attitudes goes beyond their incompatibility with modern economic theories such as PT and CPT, all of which call for tests with multiple degrees of freedom. Faithful to this recommendation, the contribution of this essay is an empirically and endogenously determined bi-dimensional specification of risk attitudes, useful for describing behavior under uncertainty and for explaining behavior in other contexts. Hopefully, this will contribute to creating large datasets containing a multidimensional description of individual risk attitudes, while at the same time allowing for a robust context, compatible with present and even future, more complex descriptions of human attitudes towards risk.
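The risk-return trade-off structure of such a lottery panel can be illustrated with a small Python sketch (the payoffs and probabilities below are hypothetical, not the actual SGG panel parameters): each risky lottery is summarised by its expected value and the risk premium it offers over a certain payoff, and a risk-neutral subject would switch to the risky option as soon as the premium turns positive.

```python
def expected_value(p_win, win, lose=0.0):
    """Expected value of a binary lottery: p * win + (1 - p) * lose."""
    return p_win * win + (1 - p_win) * lose

# Hypothetical panel: a certain payoff vs. risky lotteries whose
# risk premium (EV minus the safe payoff) increases down the panel.
certain = 10.0
lotteries = [(0.5, 22.0), (0.5, 26.0), (0.5, 30.0)]  # (win prob, win amount)

for p, win in lotteries:
    ev = expected_value(p, win)
    premium = ev - certain       # compensation offered for bearing risk
    print(f"EV={ev:.1f}, risk premium={premium:.1f}")
```

A subject's switching point down such a panel gives their average risk taking, while how that switching point moves as the premium grows gives the second dimension, sensitivity to variations in risk-return.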

Relevance: 30.00%

Abstract:

In probabilistic decision tasks, an expected value (EV) of a choice is calculated and, after the choice has been made, can be updated based on a temporal difference (TD) prediction error between the EV and the reward magnitude (RM) obtained. The EV is computed as the probability of obtaining a reward × RM. To understand the contribution of different brain areas to these decision-making processes, functional magnetic resonance imaging activations related to EV versus RM (or outcome) were measured in a probabilistic decision task. Activations in the medial orbitofrontal cortex were correlated with both RM and EV, and a conjunction analysis confirmed that they extend toward the pregenual cingulate cortex. From these representations, TD reward prediction errors could be produced. Activations in areas that receive projections from the orbitofrontal cortex, including the ventral striatum, midbrain, and inferior frontal gyrus, were correlated with the TD error. Activations in the anterior insula were correlated negatively with EV, occurring when low reward outcomes were expected, and also with the uncertainty of the reward, implicating this region in basic and crucial decision-making parameters: low expected outcomes and uncertainty.
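The two quantities defined above can be written out directly (a sketch with illustrative numbers, not the study's stimulus values): EV is the reward probability times the reward magnitude, and the TD prediction error is the obtained outcome minus that expectation.

```python
def expected_value(p_reward, reward_magnitude):
    # EV = probability of obtaining a reward x reward magnitude
    return p_reward * reward_magnitude

def td_error(reward_obtained, ev):
    # Temporal-difference prediction error: outcome minus expectation
    return reward_obtained - ev

ev = expected_value(0.25, 8.0)   # EV = 2.0 for an unlikely large reward
print(td_error(8.0, ev))         # rewarded trial: positive error (6.0)
print(td_error(0.0, ev))         # unrewarded trial: negative error (-2.0)
```

The sign structure is the point: the same outcome produces a large positive error when the reward was improbable and a negative error when an expected reward fails to arrive.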

Relevance: 30.00%

Abstract:

Classical regression methods take vectors as covariates and estimate the corresponding vectors of regression parameters. When addressing regression problems on covariates of more complex form such as multi-dimensional arrays (i.e. tensors), traditional computational models can be severely compromised by ultrahigh dimensionality as well as complex structure. By exploiting the special structure of tensor covariates, the tensor regression model provides a promising solution to reduce the model’s dimensionality to a manageable level, thus leading to efficient estimation. Most of the existing tensor-based methods independently estimate each individual regression problem based on tensor decomposition, which allows simultaneous projections of an input tensor in more than one direction along each mode. In practice, multi-dimensional data are often collected under the same or very similar conditions, so that the data share some common latent components but can also have independent parameters for each regression task. Therefore, it is beneficial to analyse regression parameters among all the regressions in a linked way. In this paper, we propose a tensor regression model based on Tucker Decomposition, which identifies not only the common components of parameters across all the regression tasks, but also independent factors contributing to each particular regression task simultaneously. Under this paradigm, the number of independent parameters along each mode is constrained by a sparsity-preserving regulariser. Linked multiway parameter analysis and sparsity modelling further reduce the total number of parameters, with lower memory cost than their tensor-based counterparts. The effectiveness of the new method is demonstrated on real data sets.
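The Tucker structure underlying such models can be sketched in a few lines of NumPy (random data, illustrative dimensions; this shows only how a shared core and per-mode factor matrices generate the full coefficient tensor, not the authors' estimation procedure or regulariser):

```python
import numpy as np

def mode_n_product(tensor, matrix, mode):
    """n-mode product: multiply a tensor by a matrix along the given mode."""
    return np.moveaxis(np.tensordot(matrix, tensor, axes=(1, mode)), 0, mode)

rng = np.random.default_rng(0)

# Tucker model of a 3-way coefficient tensor: a small shared core
# (common latent components) expanded by per-mode factor matrices
# (which can carry the task-specific parameters).
core = rng.standard_normal((2, 2, 2))
factors = [rng.standard_normal((d, 2)) for d in (4, 5, 3)]

W = core
for mode, A in enumerate(factors):
    W = mode_n_product(W, A, mode)

print(W.shape)  # (4, 5, 3): the full regression coefficient tensor
```

Here the Tucker factorisation stores 2·2·2 + 4·2 + 5·2 + 3·2 = 32 numbers instead of the 4·5·3 = 60 entries of the full tensor, and the saving grows rapidly with the size of each mode.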

Relevance: 30.00%

Abstract:

We define and experimentally test a public provision mechanism that meets three basic ethical requirements and allows community members to influence, via monetary bids, which of several projects is implemented. For each project, participants are assigned personal values, which can be positive or negative. We provide either public or private information about personal values. This produces two distinct public provision games, which are experimentally implemented and analyzed for various projects. In spite of the complex experimental task, participants do not rely on bidding their own personal values as an obvious simple heuristic whose general acceptance would result in fair and efficient outcomes. Rather, they rely on strategic underbidding. Although underbidding is affected by projects’ characteristics, the provision mechanism mostly leads to the implementation of the most efficient project.