99 results for Formative constructs
Abstract:
There has been a clear lack of common data exchange semantics for inter-organisational workflow management systems, where research has mainly focused on technical issues rather than language constructs. This paper presents the neutral data exchange semantics required for workflow integration within the AXAEDIS framework and presents the mechanism for object discovery from the object repository where little or no knowledge about the object is available. The paper also presents a workflow-independent integration architecture with the AXAEDIS framework.
Abstract:
Space applications are challenged by the reliability of parallel computing systems (FPGAs) employed in spacecraft due to Single-Event Upsets. The work reported in this paper aims to achieve self-managing systems that are reliable for space applications by applying autonomic computing constructs to parallel computing systems. A novel technique, 'Swarm-Array Computing', inspired by swarm robotics and built on the foundations of autonomic and parallel computing, is proposed as a path to achieve autonomy. The constitution of swarm-array computing, comprising four constituents, namely the computing system, the problem/task, the swarm and the landscape, is considered. Three approaches that bind these constituents together are proposed. The feasibility of one of the three proposed approaches is validated on the SeSAm multi-agent simulator, and landscapes representing the computing space and problem are generated using MATLAB.
Abstract:
A self-study course for learning to program using the C programming language has been developed. A Learning Object approach was used in the design of the course. One of the benefits of the Learning Object approach is that the learning material can be reused for different purposes. The course developed is designed so that learners can choose the pedagogical approach most suited to their personal learning requirements. For all learning approaches a set of common Assessment Learning Objects (ALOs, or tests) has been created. The design of formative assessments with ALOs can be carried out by the Instructional Designer grouping ALOs to correspond to a specific assessment intention. The course is non-credit earning, so there is no summative assessment; all assessment is formative. In this paper examples of ALOs are presented together with their uses as decided by the Instructional Designer and learner. Personalisation of the formative assessment of skills can be decided by the Instructional Designer or the learner using a repository of pre-designed ALOs. The process of combining ALOs can be carried out manually or in a semi-automated way using metadata that describes the ALO and the skill it is designed to assess.
Abstract:
The work reported in this paper proposes Swarm-Array computing, a novel technique inspired by swarm robotics and built on the foundations of autonomic and parallel computing. The approach aims to apply autonomic computing constructs to parallel computing systems and in effect achieve the self-ware objectives that describe self-managing systems. The constitution of swarm-array computing, comprising four constituents, namely the computing system, the problem/task, the swarm and the landscape, is considered. Approaches that bind these constituents together are proposed. Space applications employing FPGAs are identified as a potential area for applying swarm-array computing to build reliable systems. The feasibility of a proposed approach is validated on the SeSAm multi-agent simulator, and landscapes are generated using MATLAB.
Abstract:
In this study, we report on the development and psychometric evaluation of the Risk-Taking (RT) and Self-Harm (SH) Inventory for Adolescents (RTSHIA), a self-report measure designed to assess adolescent RT and SH in community and clinical settings. 651 young people from secondary schools in England, ranging in age from 11.6 to 18.7 years, and 71 young people referred to mental health services for SH behavior in London, between the ages of 11.9 and 17.5 years, completed the RTSHIA along with standardized measures of adolescent psychopathology. Two factors emerged from the principal axis factoring, and RT and SH were further validated by a confirmatory factor analysis as related, but different, constructs, rather than elements of a single continuum. Inter-item and test–retest reliabilities were high for both components (Cronbach's α = .85, rtt = .90; Cronbach's α = .93, rtt = .87), and considerable evidence emerged in support of the measure's convergent, concurrent, and divergent validity. The findings are discussed with regard to the potential usefulness of the RTSHIA for research and clinical purposes with adolescents.
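The internal-consistency statistics quoted above can be reproduced from item-level data. A minimal sketch in Python, using invented item scores rather than the RTSHIA sample (the helper name `cronbach_alpha` is ours):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]                         # number of items
    item_vars = items.var(axis=0, ddof=1)      # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Illustrative data only (not the RTSHIA sample): 5 respondents, 4 items
scores = np.array([
    [3, 4, 3, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
    [1, 2, 1, 2],
    [3, 3, 4, 3],
])
alpha = cronbach_alpha(scores)  # high alpha: items covary strongly
```

Values near the .85–.93 range reported above indicate that the items of a component behave as measures of one underlying construct.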
Abstract:
Programming is a skill which requires knowledge of both the basic constructs of the computer language used and techniques employing these constructs. How these are used in any given application is determined intuitively, and this intuition is based on experience of programs already written. One aim of this book is to describe the techniques and give practical examples of the techniques in action - to provide some experience. Another aim of the book is to show how a program should be developed, in particular how a relatively large program should be tackled in a structured manner. These aims are accomplished essentially by describing the writing of one large program, a diagram generator package, in which a number of useful programming techniques are employed. Also, the book provides a useful program, with an in-built manual describing not only what the program does, but also how it does it, with full source code listings. This means that the user can, if required, modify the package to meet particular requirements. A floppy disk is available from the publishers containing the program, including listings of the source code. All the programs are written in Modula-2, using JPI's Top Speed Modula-2 system running on IBM-PCs and compatibles. This language was chosen as it is an ideal language for implementing large programs and it is the main language taught in the Cybernetics Department at the University of Reading. There are some aspects of the Top Speed implementation which are not standard, so suitable comments are given when these occur. Although implemented in Modula-2, many of the techniques described here are appropriate to other languages, such as Pascal or C. The book and programs are based on a second-year undergraduate course taught at Reading to Cybernetics students, entitled Algorithms and Data Structures.
Useful techniques are described for the reader to use, applications where they are appropriate are recommended, but detailed analyses of the techniques are not given.
Abstract:
This paper describes the design, implementation and testing of a high-speed controlled stereo “head/eye” platform which facilitates the rapid redirection of gaze in response to visual input. It details the mechanical device, which is based around geared DC motors, and describes hardware aspects of the controller and vision system, which are implemented on a reconfigurable network of general-purpose parallel processors. The servo-controller is described in detail, and higher-level gaze and vision constructs are outlined. The paper gives performance figures gained both from mechanical tests on the platform alone and from closed-loop tests on the entire system using visual feedback from a feature detector.
Abstract:
This paper assesses the relationship between the amount of climate forcing, as indexed by global mean temperature change, and hydrological response in a sample of UK catchments. It constructs climate scenarios representing different changes in global mean temperature from an ensemble of 21 climate models assessed in the IPCC AR4. The results show a considerable range in impact between the 21 climate models; for example, change in summer runoff at a 2 °C increase in global mean temperature varies between -40% and +20%. There is evidence of clustering in the results, particularly in projected changes in summer runoff and indicators of low flows, implying that the ensemble mean is not an appropriate generalised indicator of impact and that the standard deviation of responses does not adequately characterise uncertainty. The uncertainty in hydrological impact is therefore best characterised by considering the shape of the distribution of responses across multiple climate scenarios. For some climate model patterns, and some catchments, there is also evidence that linear climate change forcings produce non-linear hydrological impacts. For most variables and catchments, the effects of climate change are apparent above the effects of natural multi-decadal variability with an increase in global mean temperature above 1 °C, but there are differences between catchments. Based on the scenarios represented in the ensemble, the effect of climate change in northern upland catchments will be seen soonest in indicators of high flows, but in southern catchments effects will be apparent soonest in measures of summer and low flows. The uncertainty in response between different climate model patterns is considerably greater than the range due to uncertainty in hydrological model parameterisation.
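The abstract's point that an ensemble mean and standard deviation can misrepresent a clustered response distribution can be illustrated with a short sketch; the runoff-change values below are invented for illustration and are not the paper's results:

```python
import numpy as np

# Hypothetical percent changes in summer runoff at +2 °C for 21 climate
# models (illustrative values only). Note the clustering into two groups,
# which a mean and standard deviation alone fail to convey.
runoff_change = np.array([
    -38, -35, -33, -30, -29, -27, -25, -24, -22, -20, -18,
      5,   7,   9,  10,  12,  14,  15,  17,  18,  20,
])

mean = runoff_change.mean()       # falls between the two clusters
sd = runoff_change.std(ddof=1)    # inflated by the bimodal spread
# Percentiles describe the shape of the response distribution directly
p10, p50, p90 = np.percentile(runoff_change, [10, 50, 90])
```

Here the ensemble mean sits in a gap where no model actually lies, while the percentiles expose the two clusters, which is the abstract's argument for characterising uncertainty by distribution shape.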
Abstract:
Research has been conducted on methodological issues concerning the Theory of Planned Behaviour (TPB) by determining an appropriate measurement (direct and indirect) of constructs and selecting a plausible scaling technique (unipolar and bipolar) for the constructs attitude, subjective norm, perceived behavioural control and intention, which are important in explaining farm-level tree planting in Pakistan. Unipolar scoring of beliefs showed higher correlation among the constructs of the TPB than the bipolar scaling technique. Both direct and indirect methods yielded significant results in explaining intention to perform farm forestry, except the belief-based measure of perceived behavioural control, which was found to be statistically non-significant. A need to examine more carefully the scoring of perceived behavioural control (PBC) has been expressed.
Abstract:
The UK private indirect real estate market has seen rapid growth in the last seven years. The gross asset value (GAV) of the private property vehicle (PPV) market has roughly tripled, from £22.6bn in 1998 to £67.1bn at the end of 2005 (OPC, 2006). Although this trend of growing syndication of real estate is not only a UK phenomenon, the rate of growth has been significantly faster in the UK. For example, German open-ended funds have grown over the same period from €50.4bn to €85.1bn (BVI, 2006). In the US, the market capitalisation of equity real estate investment trusts (REITs) has grown 155% since 1999 to US$301bn (NAREIT, 2006). Each jurisdiction offers different formats for investing indirectly in real estate, but at their core all these vehicles are the same in that they provide a different route for investors to access real estate. In the UK, although the range of ‘products’ is now quite diverse, all structures have in common the ‘wrapping’ of property assets into a multi-investor vehicle. This paper examines the nature, pattern and process of market growth in PPVs and constructs a series of associations between causes and effects to explain this market shift.
Abstract:
This paper will present a conceptual framework for the examination of land redevelopment based on a complex systems/networks approach. As Alvin Toffler insightfully noted, modern scientific enquiry has become exceptionally good at splitting problems into pieces but has forgotten how to put the pieces back together. Twenty-five years after his remarks, governments and corporations faced with the requirements of sustainability are struggling to promote an ‘integrated’ or ‘holistic’ approach to tackling problems. Despite the talk, both practice and research provide few platforms that allow for ‘joined up’ thinking and action. With socio-economic phenomena, such as land redevelopment, promising prospects open up when we assume that their constituents can make up complex systems whose emergent properties are more than the sum of the parts and whose behaviour is inherently difficult to predict. A review of previous research shows that it has mainly focused on idealised, ‘mechanical’ views of property development processes that fail to recognise in full the relationships between actors, the structures created and their emergent qualities. When reality failed to live up to the expectations of these theoretical constructs, somebody had to be blamed for it: planners, developers, politicians. However, from a ‘synthetic’ point of view the agents and networks involved in property development can be seen as constituents of structures that perform complex processes. These structures interact, forming new, more complex structures and networks. Redevelopment then can be conceptualised as a process of transformation: a complex system, a ‘dissipative’ structure involving developers, planners, landowners, state agencies etc., unlocks the potential of previously used sites, transforms space towards a higher order of complexity and ‘consumes’ but also ‘creates’ different forms of capital in the process.
Analysis of network relations points toward the ‘dualism’ of structure and agency in these processes of system transformation and change. Insights from actor network theory can be conjoined with notions of complexity and chaos to build an understanding of the ways in which actors actively seek to shape these structures and systems, whilst at the same time being recursively shaped by them in their strategies and actions. This approach transcends the blame game and allows for inter-disciplinary inputs to be placed within a broader explanatory framework that does away with many past dichotomies. Better understanding of the interactions between actors and the emergent qualities of the networks they form can improve our comprehension of the complex socio-spatial phenomena that redevelopment comprises. The insights that this framework provides when applied to UK institutional investment into redevelopment are considered to be significant.
Abstract:
Background: Biases in the interpretation of ambiguous material are central to cognitive models of anxiety; however, understanding of the association between interpretation and anxiety in childhood is limited. To address this, a prospective investigation of the stability and specificity of anxious cognitions and anxiety and the relationship between these factors was conducted. Method: Sixty-five children (10–11 years) from a community sample completed measures of self-reported anxiety, depression, and conduct problems, and responded to ambiguous stories at three time points over one year. Results: Individual differences in biases in interpretation of ambiguity (specifically “anticipated distress” and “threat interpretation”) were stable over time. Furthermore, anticipated distress and threat interpretation were specifically associated with anxiety symptoms. Distress anticipation predicted change in anxiety symptoms over time. In contrast, anxiety scores predicted change in threat interpretation over time. Conclusions: The results suggest that different cognitive constructs may show different longitudinal links with anxiety. These preliminary findings extend research and theory on anxious cognitions and their link with anxiety in children, and suggest that these cognitive processes may be valuable targets for assessment and intervention.
Abstract:
Many older adults wish to gain competence in using a computer, but many application interfaces are perceived as complex and difficult to use, deterring potential users from investing the time to learn them. Hence, this study looks at the potential of ‘familiar’ interface design which builds upon users’ knowledge of real world interactions, and applies existing skills to a new domain. Tools are provided in the form of familiar visual objects, and manipulated like real-world counterparts, rather than with buttons, icons and menus found in classic WIMP interfaces. This paper describes the formative evaluation of computer interactions that are based upon familiar real world tasks, which supports multitouch interaction, involves few buttons and icons, no menus, no right-clicks or double-clicks and no dialogs. Using an example of an email client to test the principles of using “familiarity”, the initial feedback was very encouraging, with 3 of the 4 participants being able to undertake some of the basic email tasks with no prior training and little or no help. The feedback has informed a number of refinements of the design principles, such as providing clearer affordance for visual objects. A full study is currently underway.
Abstract:
This article presents and assesses an algorithm that constructs 3D distributions of cloud from passive satellite imagery and collocated 2D nadir profiles of cloud properties inferred synergistically from lidar, cloud radar and imager data. It effectively widens the active–passive retrieved cross-section (RXS) of cloud properties, thereby enabling computation of radiative fluxes and radiances that can be compared with measured values in an attempt to perform radiative closure experiments that aim to assess the RXS. For this introductory study, A-train data were used to verify the scene-construction algorithm and only 1D radiative transfer calculations were performed. The construction algorithm fills off-RXS recipient pixels by computing sums of squared differences (a cost function F) between their spectral radiances and those of potential donor pixels/columns on the RXS. Of the RXS pixels with F lower than a certain value, the one with the smallest Euclidean distance to the recipient pixel is designated as the donor, and its retrieved cloud properties and other attributes such as 1D radiative heating rates are consigned to the recipient. It is shown that both the RXS itself and Moderate Resolution Imaging Spectroradiometer (MODIS) imagery can be reconstructed extremely well using just visible and thermal infrared channels. Suitable donors usually lie within 10 km of the recipient. RXSs and their associated radiative heating profiles are reconstructed best for extensive planar clouds and less reliably for broken convective clouds. 
Domain-average 1D broadband radiative fluxes at the top of the atmosphere (TOA) for (21 km)² domains constructed from MODIS, CloudSat and Cloud–Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) data agree well with coincident values derived from Clouds and the Earth’s Radiant Energy System (CERES) radiances: differences between modelled and measured reflected shortwave fluxes are within ±10 W m⁻² for ~35% of the several hundred domains constructed for eight orbits. Correspondingly, for outgoing longwave radiation ~65% are within ±10 W m⁻².
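The donor-selection step of the scene-construction algorithm, as summarised above, can be sketched as follows; the function name, array layout and threshold `f_max` are our assumptions, not taken from the paper:

```python
import numpy as np

def select_donor(recipient_rad, recipient_xy, rxs_rad, rxs_xy, f_max):
    """Pick a donor column from the retrieved cross-section (RXS).

    recipient_rad : (n_channels,) spectral radiances of the off-RXS pixel
    recipient_xy  : (2,) position of the off-RXS pixel
    rxs_rad       : (n_rxs, n_channels) radiances of candidate RXS pixels
    rxs_xy        : (n_rxs, 2) positions of candidate RXS pixels
    f_max         : cost-function threshold (an assumed tuning parameter)
    Returns the index of the donor RXS pixel, or None if no candidate
    has cost F below f_max.
    """
    # Cost function F: sum of squared spectral-radiance differences
    f = ((rxs_rad - recipient_rad) ** 2).sum(axis=1)
    candidates = np.flatnonzero(f < f_max)
    if candidates.size == 0:
        return None
    # Of the acceptable candidates, take the nearest in Euclidean distance
    dist = np.linalg.norm(rxs_xy[candidates] - recipient_xy, axis=1)
    return int(candidates[np.argmin(dist)])

# Toy example: pixels 0 and 2 are spectrally close to the recipient;
# pixel 2 is nearer, so it donates its retrieved cloud properties.
rxs_rad = np.array([[1.0, 2.0], [5.0, 5.0], [1.1, 2.1]])
rxs_xy = np.array([[0.0, 0.0], [1.0, 0.0], [5.0, 0.0]])
donor = select_donor(np.array([1.0, 2.0]), np.array([4.0, 0.0]),
                     rxs_rad, rxs_xy, f_max=1.0)
```

In the full algorithm the donor's retrieved cloud properties and attributes such as 1D radiative heating rates would then be consigned to the recipient pixel.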
Abstract:
Models used in neoclassical economics assume human behaviour to be purely rational. On the other hand, models adopted in social and behavioural psychology are founded on the ‘black box’ of human cognition. In view of these observations, this paper aims at bridging this gap by introducing psychological constructs into the well-established microeconomic framework of choice behaviour based on random utility theory. In particular, it combines constructs developed employing Ajzen’s theory of planned behaviour with Lancaster’s theory of consumer demand for product characteristics to explain stated preferences over certified animal-friendly foods. To reach this objective a web survey was administered in the five largest EU-25 countries: France, Germany, Italy, Spain and the UK. Findings identify some salient cross-cultural differences between northern and southern Europe and suggest that psychological constructs developed using the Ajzen model are useful in explaining heterogeneity of preferences. Implications for policy makers and marketers involved with certified animal-friendly foods are discussed.
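The idea of embedding a psychological construct in a random-utility framework can be sketched with a simple multinomial-logit model; the coefficients, product attributes and construct scores below are invented for illustration and are not the paper's estimates:

```python
import numpy as np

def choice_probabilities(attributes, construct, beta, gamma):
    """Multinomial-logit choice probabilities over alternatives.

    attributes : (n_alts, n_attrs) product characteristics (Lancaster)
    construct  : (n_alts,) psychological construct score per alternative
                 (e.g. attitude toward animal-friendly certification, TPB)
    beta, gamma: taste weights on attributes and on the construct
    """
    utility = attributes @ beta + gamma * construct
    exp_u = np.exp(utility - utility.max())  # numerically stable softmax
    return exp_u / exp_u.sum()

# Two hypothetical food alternatives described by two attributes
attrs = np.array([[1.0, 0.0],   # alt 0: certified animal-friendly
                  [0.0, 1.0]])  # alt 1: conventional
probs = choice_probabilities(attrs, construct=np.array([1.0, 0.0]),
                             beta=np.array([0.5, 0.3]), gamma=0.8)
```

A positive `gamma` shifts probability toward the alternative favoured by the psychological construct, which is how such models let attitude measures explain heterogeneity of stated preferences.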