77 results for knowledge asymmetry
in CentAUR: Central Archive University of Reading - UK
Abstract:
The construction industry, with its project-based mode of delivery, is highly fragmented across the processes of design, construction, facilities and assets management. Facilities managers are at the forefront of delivering sustainable assets management and hence of furthering mitigation of and adaptation to climate change. A questionnaire survey was conducted to establish perceptions, level of commitment and the knowledge chasm in practising sustainable facilities management (FM). This has significant implications for sustainable design management, especially in a fragmented industry. The majority of questionnaire respondents indicated the importance of sustainability for their organization. Many stated that they reported on sustainability as part of their organization's annual reporting, with energy efficiency, recycling and waste reduction as their main concerns. The overwhelming barrier to implementing sound, sustainable FM is the lack of consensual understanding and focus among individuals and organizations about sustainability. There is a knowledge chasm regarding practical information on delivering sustainable FM. Sustainability information asymmetry across design, construction and FM processes renders any sustainable design a mere sentiment and design aspiration. Skills and training provision, traditionally offered separately to designers and facilities managers, needs to be re-evaluated. Sustainability education and training should be developed to provide effective structures and processes for applying sustainability throughout the construction and FM industries coherently and as common practice.
Abstract:
Calls for understanding the interface between L2 linguistic knowledge and development (Gregg 1996; Carroll 2001; Towell 2003) provide a context for analysing the role of memory (Paradis 2004), specifically working memory (WM; Baddeley 1986, 2003), in L2 development. Miyake and Friedman (1998) have claimed that WM may be the key to L2 acquisition, especially in explaining individual variation. Recent studies have found a robust connection between greater WM capacity and rapid, successful acquisition of L2 vocabulary, reading and oral fluency (Service 1992; Harrington and Sawyer 1992; Fortkamp 1999). This study adds to the growing body of research by investigating correlations between WM and variation in grammatical development, focusing on asymmetries in processing L2 English wh-constructions in an immersion setting.
Abstract:
Debate about the definition of “small state” has produced more fragmentation than consensus, even as the literature has demonstrated its subjects' roles in joining international organizations, propagating norms, executing creative diplomacy, influencing allies, avoiding and joining conflicts, and building peace. However, work on small states has struggled to identify commonalities in these states' international relations, to cumulate knowledge, or to influence broader IR theory. This paper advocates a changed conceptual and definitional framework. Analysis of “small states” should pivot to examine the dynamics of the asymmetrical relationships in which these states are engaged. Instead of seeking an overall metric for size as the relevant variable (falling victim, in a different way, to Dahl's “lump-of-power fallacy”), we can recognize the multifaceted, variegated nature of power, whether in war or peacetime.
Abstract:
Climate model simulations of past and future climate invariably contain prescribed zonal-mean stratospheric ozone. While the effects of zonal asymmetry in ozone have been examined in the Northern Hemisphere, much greater zonal asymmetry occurs in the Southern Hemisphere during the break-up of the Antarctic ozone hole. We prescribe a realistic three-dimensional distribution of ozone in a high-vertical-resolution atmospheric model and compare results with a simulation containing zonal-mean ozone. Prescribing the three-dimensional ozone distribution results in a cooling of the stratosphere and upper troposphere comparable to that caused by ozone depletion itself. Our results suggest that changes in the zonal asymmetry of ozone have had important impacts on Southern Hemisphere climate, and will continue to do so in the future.
Abstract:
Recent analysis of the Arctic Oscillation (AO) in the stratosphere and troposphere has suggested that predictability of the state of the tropospheric AO may be obtained from the state of the stratospheric AO. However, much of this research has been of a purely qualitative nature. We present a more thorough statistical analysis of a long AO amplitude dataset which seeks to establish the magnitude of such a link. A relationship between the AO in the lower stratosphere and on the 1000 hPa surface on a 10-45 day time-scale is revealed. The relationship accounts for 5% of the variance of the 1000 hPa time series at its peak value and is significant at the 5% level. Over a similar time-scale the 1000 hPa time series accounts for only 1% of its own variance and is not significant at the 5% level. Further investigation reveals that the relationship is only present during the winter season, and in particular during February and March. It is also demonstrated that using stratospheric AO amplitude data as a predictor in a simple statistical model results in a gain of skill of 5% over a troposphere-only statistical model. This gain in skill is not repeated if an unrelated time series is included as a predictor in the model. Copyright © 2003 Royal Meteorological Society
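The "variance accounted for" figures in this abstract correspond to the squared correlation between a lagged predictor series and the target series. The sketch below illustrates that calculation on synthetic data; the series, lag, and coupling strength are invented stand-ins, not the study's AO dataset.

```python
import random

def lagged_r2(predictor, target, lag):
    """Fraction of target variance explained by the predictor `lag`
    steps earlier (the squared Pearson correlation of the lagged pair)."""
    x = predictor[:-lag] if lag > 0 else predictor
    y = target[lag:] if lag > 0 else target
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return (cov * cov) / (vx * vy)

# Synthetic stand-ins for the stratospheric and 1000 hPa series:
# the "tropospheric" series weakly echoes the "stratospheric" one 20 steps later.
random.seed(1)
strat = [random.gauss(0, 1) for _ in range(2000)]
trop = [0.25 * strat[i - 20] + random.gauss(0, 1) if i >= 20 else random.gauss(0, 1)
        for i in range(2000)]
r2 = lagged_r2(strat, trop, lag=20)  # a few percent of variance explained
```

A small r2 of this kind is what the abstract reports: statistically detectable, but leaving most of the tropospheric variance unexplained.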
Abstract:
Matheron's usual variogram estimator can result in unreliable variograms when data are strongly asymmetric or skewed. Asymmetry in a distribution can arise from a long tail of values in the underlying process or from outliers that belong to another population that contaminates the primary process. This paper examines the effects of underlying asymmetry on the variogram and on the accuracy of prediction; a companion paper examines the effects arising from outliers. Standard geostatistical texts suggest ways of dealing with underlying asymmetry; however, this advice is based on informed intuition rather than detailed investigation. To determine whether the methods generally used to deal with underlying asymmetry are appropriate, the effects of different coefficients of skewness on the shape of the experimental variogram and on the model parameters were investigated. Simulated annealing was used to create normally distributed random fields of different size from variograms with different nugget:sill ratios. These data were then modified to give different degrees of asymmetry, and the experimental variogram was computed in each case. The effects of standard data transformations on the form of the variogram were also investigated. Cross-validation was used to assess quantitatively the performance of the different variogram models for kriging. The results showed that the shape of the variogram was affected by the degree of asymmetry, and that the effect increased as the size of the data set decreased. Transformations of the data were more effective in reducing the skewness coefficient in the larger sets of data. Cross-validation confirmed that variogram models from transformed data were more suitable for kriging than were those from the raw asymmetric data. The results of this study have implications for the 'standard best practice' in dealing with asymmetry in data for geostatistical analyses. (C) 2007 Elsevier Ltd. All rights reserved.
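Matheron's classical method-of-moments estimator, around which the abstract revolves, is simple enough to sketch directly. The following is a minimal illustration on a regular 1-D transect; the lognormal field and the log transform are assumptions chosen to mimic the positive skew the paper studies, not the paper's simulated-annealing fields.

```python
import math
import random

def matheron_variogram(z, max_lag):
    """Matheron's estimator: gamma(h) = sum of squared differences of all
    pairs separated by lag h, divided by 2 * N(h) (regular 1-D spacing)."""
    gammas = []
    n = len(z)
    for h in range(1, max_lag + 1):
        sq_diffs = [(z[i + h] - z[i]) ** 2 for i in range(n - h)]
        gammas.append(sum(sq_diffs) / (2 * len(sq_diffs)))
    return gammas

# A positively skewed field (lognormal) versus its log transform,
# illustrating the standard transformation the paper evaluates.
random.seed(0)
z = [math.exp(random.gauss(0.0, 1.0)) for _ in range(500)]
gamma_raw = matheron_variogram(z, max_lag=10)
gamma_log = matheron_variogram([math.log(v) for v in z], max_lag=10)
```

Comparing `gamma_raw` with `gamma_log` shows how a simple transformation changes the scale and stability of the experimental variogram, which is the behaviour the cross-validation in the paper quantifies.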
Abstract:
This article reports on an exploratory investigation into the listening strategies of lower-intermediate learners of French as an L2, including the sources of knowledge they employed in order to comprehend spoken French. Data from 14 learners were analysed to investigate whether employment of strategies in general, and sources of knowledge in particular, varied according to the underlying linguistic knowledge of the student. While low linguistic knowledge learners were less likely to deploy certain strategies or strategy clusters effectively, high linguistic knowledge levels were not always associated with effective strategy use. Similarly, while there was an association between linguistic knowledge and learners' ability to draw on more than one source of knowledge in a facilitative manner, there was also evidence that learners tended to over-rely on linguistic knowledge where other sources, such as world knowledge, would have proved facilitative. We conclude by arguing for a fresh approach to listening pedagogy and research, including strategy instruction, bottom-up skill development and a consideration of the role of linguistic knowledge in strategy use.
Abstract:
Seventeen-month-old infants were presented with pairs of images, in silence or with the non-directive auditory stimulus 'look!'. The images had been chosen so that one image depicted an item whose name was known to the infant, and the other depicted an item whose name was not known to the infant. Infants looked longer at images for which they had names than at images for which they did not have names, despite the absence of any referential input. The experiment controlled for the familiarity of the objects depicted: in each trial, image pairs presented to infants had previously been judged by caregivers to be of roughly equal familiarity. From a theoretical perspective, the results indicate that objects with names are of intrinsic interest to the infant. The possible causal direction for this linkage is discussed, and it is concluded that the results are consistent with Whorfian linguistic determinism, although other construals are possible. From a methodological perspective, the results have implications for the use of preferential looking as an index of early word comprehension.
Abstract:
Negative correlations between task performance in dynamic control tasks and verbalizable knowledge, as assessed by a post-task questionnaire, have been interpreted as dissociations that indicate two antagonistic modes of learning, one being “explicit”, the other “implicit”. This paper views the control tasks as finite-state automata and offers an alternative interpretation of these negative correlations. It is argued that “good controllers” observe fewer different state transitions and, consequently, can answer fewer post-task questions about system transitions than can “bad controllers”. Two experiments demonstrate the validity of the argument by showing the predicted negative relationship between control performance and the number of explored state transitions, and the predicted positive relationship between the number of explored state transitions and questionnaire scores. However, the experiments also elucidate important boundary conditions for the critical effects. We discuss the implications of these findings, and of other problems arising from the process control paradigm, for conclusions about implicit versus explicit learning processes.
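The abstract's core argument, that a "good" controller visits fewer distinct state transitions of the finite-state automaton than a "bad" one and therefore has less to verbalize, can be made concrete with a toy simulation. The five-state dynamics and the two policies below are illustrative assumptions, not the control tasks used in the experiments.

```python
import random

# A toy finite-state control task: 5 system states, 3 operator inputs.
# The transition table (s, a) -> next state is an invented example.
TRANSITIONS = {(s, a): (s + a) % 5 for s in range(5) for a in range(3)}

def run_trial(policy, steps=100, seed=0):
    """Run one control session and return the set of distinct
    (state, action, next_state) transitions the controller observed."""
    rng = random.Random(seed)
    state, explored = 0, set()
    for _ in range(steps):
        action = policy(state, rng)
        nxt = TRANSITIONS[(state, action)]
        explored.add((state, action, nxt))
        state = nxt
    return explored

# A "good" controller holds the system at the target state 0 with the
# null input; a "bad" controller intervenes at random and so wanders
# through many more transitions of the automaton.
good = run_trial(lambda s, rng: 0)
bad = run_trial(lambda s, rng: rng.randrange(3), seed=1)
```

Here `len(good)` is far smaller than `len(bad)`: the successful controller simply has fewer observed transitions available to report in a post-task questionnaire, which is the confound the paper proposes.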
Abstract:
This paper describes the user modeling component of EPIAIM, a consultation system for data analysis in epidemiology. The component is aimed at representing knowledge of concepts in the domain, so that their explanations can be adapted to user needs. The first part of the paper describes two studies aimed at analysing user requirements. The first is a questionnaire study which examines the respondents' familiarity with concepts. The second is an analysis of concept descriptions in textbooks and from expert epidemiologists, which examines how discourse strategies are tailored to the level of experience of the expected audience. The second part of the paper describes how the results of these studies have been used to design the user modeling component of EPIAIM. This module operates in two steps. In the first step, a few trigger questions activate a stereotype that includes a "body" and an "inference component". The body represents the body of knowledge that a class of users is expected to know, along with the probability that each item of knowledge is known. In the inference component, the learning process of concepts is represented as a belief network. Hence, in the second step the belief network is used to refine the initial default information in the stereotype's body. This is done by asking a few questions on those concepts where it is uncertain whether or not they are known to the user, and propagating this new evidence to revise the whole situation. The system has been implemented on a workstation under UNIX. An example session is presented, and advantages and limitations of the approach are discussed.
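The refinement step the abstract describes, revising a stereotype's default probabilities in the light of answers to a few questions, can be illustrated with a single Bayesian update over two linked concepts. The concept names, priors and conditional probabilities below are invented for illustration and are not EPIAIM's actual belief-network parameters.

```python
# Stereotype step: default probability that a user of this class knows
# the concept "odds ratio" (illustrative value, not from EPIAIM).
prior_knows = 0.6
# Link in the belief network: knowing "odds ratio" makes it likelier
# that the user also knows "confounding" (illustrative values).
p_conf_given_knows = 0.7
p_conf_given_not = 0.2

def refine(knows_confounding: bool) -> float:
    """Bayes' rule: revise the belief that the user knows 'odds ratio'
    after one trigger question about 'confounding'."""
    if knows_confounding:
        num = p_conf_given_knows * prior_knows
        den = num + p_conf_given_not * (1 - prior_knows)
    else:
        num = (1 - p_conf_given_knows) * prior_knows
        den = num + (1 - p_conf_given_not) * (1 - prior_knows)
    return num / den

posterior_yes = refine(True)   # answering "yes" raises the default belief
posterior_no = refine(False)   # answering "no" lowers it
```

Propagating a handful of such answers through the full network is what lets the system replace the stereotype's uncertain defaults with user-specific estimates.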
Abstract:
Two experiments examined the claim for distinct implicit and explicit learning modes in the artificial grammar-learning task (Reber, 1967, 1989). Subjects initially attempted to memorize strings of letters generated by a finite-state grammar and then classified new grammatical and nongrammatical strings. Experiment 1 showed that subjects' assessment of isolated parts of strings was sufficient to account for their classification performance but that the rules elicited in free report were not sufficient. Experiment 2 showed that performing a concurrent random number generation task under different priorities interfered with free report and classification performance equally. Furthermore, giving different groups of subjects incidental or intentional learning instructions did not affect classification or free report.