43 results for Not ludic way of playing


Relevance: 100.00%

Abstract:

Imagined intergroup contact (Crisp & R. Turner, 2009) is a new indirect contact strategy for promoting tolerance and more positive intergroup relations. In this chapter, we review existing research on imagined contact and propose two routes, cognitive and affective, through which it can exert a positive influence on contact-related attitudes and intentions. We first review research that has established the beneficial impacts of imagined contact on intergroup attitudes via reduced intergroup anxiety, supporting its efficacy as an intervention where there exists little or no opportunity for direct contact. We then review more recent research showing that imagined contact not only improves attitudes, but can also enhance intentions to engage in future contact. These studies suggest that contact imagery provides a behavioural script that forms the cognitive basis for subsequent judgements about future contact intentions. Collectively, the findings from this research programme support the idea that imagined contact can complement more direct forms of contact, providing a way of initially encouraging an interest in engaging positively with outgroups before introducing face-to-face encounters. We discuss the implications of these findings for future theory and research, and how they can inform prejudice-reduction interventions seeking to capitalise on the beneficial effects of mental imagery.

Relevance: 100.00%

Abstract:

Although a substantial corpus of digital materials is now available to scholarship across the disciplines, objective evidence of their use, impact, and value, based on a robust assessment, is sparse. Traditional methods of assessment of impact in the humanities, notably citation in scholarly publications, are not an effective way of assessing impact of digital content. These issues are problematic in the field of Digital Humanities where there is a need to effectively assess impact to justify its continued funding and existence. A number of qualitative and quantitative methods exist that can be used to monitor the use of digital resources in various contexts although they have yet to be applied widely. These have been made available to the creators, managers, and funders of digital content in an accessible form through the TIDSR (Toolkit for the Impact of Digital Scholarly Resources) developed by the Oxford Internet Institute. In 2011, the authors of this article developed the SPHERE project (Stormont Parliamentary Hansards: Embedded in Research and Education) specifically to use TIDSR to evaluate the use and impact of The Stormont Papers, a digital collection of the Hansards of the Stormont Northern Irish Parliament from 1921 to 1972. This article presents the methodology, findings, and analysis of the project. The authors argue that TIDSR is a useful and, critically, transferrable method to understand and increase the impact of digital resources. The findings of the project are modified into a series of wider recommendations on protecting the investment in digital resources by increasing their use, value, and impact. It is reasonable to suggest that effectively showing the impact of Digital Humanities is critical to its survival.

Relevance: 100.00%

Abstract:

This article explores the dynamics of the space of exception at the borders of Europe in the Spanish enclave of Melilla, and the neighboring Moroccan city of Oujda. Building upon field research conducted in the spring of 2008, I ask how we can understand the political space of migration not simply as exceptional, but as shaped by the mobility of the irregular migrants moving outside of the frameworks, policies, and practices of the state. By privileging the migrant narrative and making use of Rancière's conception of politics as shaped by the demands of those who “have no part,” I suggest an alternative way of understanding the politics of exception and agency of non-citizens—that is, one of disruption and demands to open up powerful potentials for change in an otherwise rigid regime.

Relevance: 100.00%

Abstract:

Poverty research has increasingly focused on persistent income poverty, both as a crucial social indicator and as a target for policy intervention. Such an approach can lead to an identification of a sub-set of poor individuals facing particularly adverse circumstances and/or distinctive problems in escaping from poverty. Here we seek to establish whether, in comparison with cross-sectional measures, persistent poverty measures also provide a better measure of exclusion from a minimally acceptable way of life and relate to other important variables in a logical fashion. Our analysis draws upon the first three waves of the ECHP and shows that a persistent poverty measure does constitute a significant improvement over its cross-sectional counterpart in the explanation of levels of deprivation. Persistent poverty is related to life-style deprivation in a manner that comes close to being uniform across countries. The measure of persistence also conforms to our expectations of how a poverty measure should behave in that, unlike relative income poverty lines, defining the threshold level more stringently enables us to progressively identify groups of increasingly deprived respondents. Overall the persistent poverty measure constitutes a significant advance on cross-sectional income measures. However, there is clearly a great deal relating to the process of accumulation and of erosion of resources which is not fully captured in the persistent poverty measure. In the absence of such information, there is a great deal to be said for making use of both types of indicators in formulating and evaluating policies while we continue to improve our understanding of longer-term processes.
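To make the contrast between the two measures concrete, the sketch below shows one common way of operationalising them from panel data such as the ECHP: a person is cross-sectionally poor if their equivalised income falls below 60% of the wave median, and persistently poor if that holds in every wave. The column names, the 60% line, and the toy data are illustrative assumptions, not the article's actual variables or definitions.

```python
import pandas as pd

# Illustrative panel: one row per person per wave, with equivalised income.
# Column names are hypothetical, not ECHP variable names.
panel = pd.DataFrame({
    "person_id": [1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4],
    "wave":      [1, 2, 3, 1, 2, 3, 1, 2, 3, 1, 2, 3],
    "eq_income": [300, 320, 310, 1500, 1480, 400, 900, 1600, 950, 1000, 1100, 1050],
})

# Cross-sectional poverty: below 60% of the median income in a given wave.
medians = panel.groupby("wave")["eq_income"].transform("median")
panel["poor"] = panel["eq_income"] < 0.6 * medians

# Compare the cross-sectional rate in the last wave with persistent poverty
# (poor in all three waves).
cross_sectional = panel.loc[panel["wave"] == 3, "poor"].mean()
persistent = panel.groupby("person_id")["poor"].all().mean()

print(f"Cross-sectional poverty rate (wave 3): {cross_sectional:.0%}")
print(f"Persistent poverty rate (all 3 waves): {persistent:.0%}")
```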

Relevance: 100.00%

Abstract:

As awareness of the limitations of relying solely on income to measure poverty has become more widespread, attention has been increasingly focused on multi-dimensional approaches, to the point where the EU has adopted a multidimensional poverty and social exclusion target for 2020. The rationale advanced is that the computation of a multidimensional poverty index is an effective way of communicating in a political environment, and a necessary tool in order to monitor 27 different national situations. By contrast with the rather ad hoc way in which the EU 2020 poverty target has been framed and rationalised, the adjusted head count ratio applied here has a number of desirable axiomatic properties. It constitutes a significant improvement on union and intersection approaches and allows for the decomposition of multidimensional poverty in terms of dimensions of deprivation and socio-economic attributes. Since understanding poverty as multidimensional does not necessarily require constructing a multidimensional poverty index, on the basis of our analysis we provide a more general consideration of the value of developing a multidimensional index of poverty for the European Union.
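As a rough illustration of the adjusted head count ratio mentioned above (the Alkire-Foster counting approach), the sketch below computes it for a small synthetic deprivation matrix and checks its decomposition by dimension; the indicators, equal weights, and cutoff k are placeholder assumptions, not those of the EU analysis.

```python
import numpy as np

# Rows = individuals, columns = deprivation indicators (1 = deprived).
# Data and dimensions are purely illustrative.
deprivations = np.array([
    [1, 1, 1, 0],
    [0, 1, 0, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
])
n, d = deprivations.shape
k = 2  # poverty cutoff: deprived in at least k dimensions

counts = deprivations.sum(axis=1)                     # deprivations per person
poor = counts >= k                                    # identification step
censored = np.where(poor[:, None], deprivations, 0)   # drop deprivations of the non-poor

H = poor.mean()                                               # headcount ratio
A = censored.sum() / (poor.sum() * d) if poor.any() else 0.0  # average intensity among the poor
M0 = H * A                                                    # adjusted headcount ratio

# M0 decomposes by dimension: each indicator's censored headcount divided by d.
dim_contrib = censored.mean(axis=0) / d
print(f"H={H:.2f}, A={A:.2f}, M0={M0:.3f}, per-dimension contributions={dim_contrib}")
assert np.isclose(M0, dim_contrib.sum())
```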

Relevance: 100.00%

Abstract:

The key requirement for quantum networking is the distribution of entanglement between nodes. Surprisingly, entanglement can be generated across a network without direct transfer, or communication, of entanglement. In contrast to information gain, which cannot exceed the communicated information, the entanglement gain is bounded by the communicated quantum discord, a more general measure of quantum correlation that includes but is not limited to entanglement. Here, we experimentally entangle two communicating parties sharing three initially separable photonic qubits by exchange of a carrier photon that is unentangled with either party at all times. We show that distributing entanglement with separable carriers is resilient to noise and in some cases becomes the only way of distributing entanglement through noisy environments.
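The bound alluded to above can be written schematically as follows; this is a paraphrase of the standard discord-bounds-entanglement-distribution result rather than the exact statement or notation of the paper, and the partition convention (the carrier C counted with whichever laboratory currently holds it) is our simplification.

```latex
% Schematic form of the bound: exchanging a carrier C that is itself
% unentangled can still raise the entanglement between the two laboratories
% A and B, but only by at most the quantum discord carried by C.
\[
  E^{\text{after}}_{A|B} - E^{\text{before}}_{A|B} \;\le\; D_{C|AB}
\]
```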

Relevance: 100.00%

Abstract:

Self-determination and decision-making are acknowledged internationally as key rights of persons with disabilities and should play an important role in the development of educational plans and procedures. Not only is the chance for individuals with developmental disabilities to select their own tasks, leisure activities or reinforcers a valuable way of enhancing rights-based education and personal dignity, but choice-making opportunities may also function as a useful clinical or educational tool if they actually improve the efficacy of programmes aimed at the acquisition of socially relevant behaviours and life skills or the reduction of challenging behaviours.

The study reported here assessed whether or not choice affected the effectiveness of an educational procedure for three children on the autism spectrum. Following a preference assessment, a number of discrete teaching trials were conducted with each child and, contingent upon targeted responses, either the child or the therapist selected one of three preferred reinforcer items. Reinforcer choice did not affect intervention effectiveness for two of the children; however, performance and motivation improved for the third child. Results re-affirmed the importance of thorough preference assessments prior to intervention and showed that additional stimulus choice contingent on the target response may improve motivation and outcomes for some children.

Relevance: 100.00%

Abstract:

Time features in two key ways in cognition, each of which is discussed in turn in this chapter: time is processed as a dimension of stimuli or events, and time is represented as a framework in which events can be located. Section 1 examines the first of these from a developmental perspective, by reviewing research on age-related changes in the accuracy of duration processing. The Piagetian approach linked changes in duration processing to the development of a concept of time as a dimension of events separable from other event dimensions. This is contrasted with recent research conducted within the framework of Scalar Expectancy Theory, which models development in terms of changes in components of specialized timing mechanisms. Section 2 discusses developmental changes in the temporal frameworks that children use to represent the locations of events. Although, as adults, we represent times as locations on a linear framework stretching from the past, to the present, and into the future, this way of representing time is not developmentally basic. A model is proposed of developmental stages in the acquisition of a mature temporal framework. The chapter concludes by considering two themes that cut across Sections 1 and 2: the issue of whether there are both qualitative and quantitative changes in children’s temporal abilities, and the link between temporal and spatial cognition.

Relevance: 100.00%

Abstract:

The technological constraints of early British television encouraged drama productions which emphasised the immediate, the enclosed and the close-up, an approach which Jason Jacobs described in the title of his seminal study as 'the intimate screen'. While Jacobs showed that this conception of early British television drama was only part of the reality, he did not focus on the role that special effects played in expanding the scope of the early television screen. This article will focus upon this role, showing that special effects were not only of use in expanding the temporal and spatial scope of television, but were also considered to be of interest to the audience as a way of exploring the new medium, receiving coverage in the popular press. These effects included pre-recorded film inserts, pre-recorded narration, multiple sets, model work and animation, combined with the live studio performances. Drawing upon archival research into television production files and scripts as well as audience responses and periodical coverage of television at the time of broadcast, this article will focus on telefantasy. This genre offered particular opportunities for utilising effects in ways that seemed appropriate for the experimentation with the form of television and for the drama narratives. This period also saw a variety of shifts within television as the BBC sought to determine a specific identity and understand the possibilities for the new medium.
This research also incorporates the BBC's own research and internal dialogue concerning audiences and how their tastes should best be met, at a time when the television audience was not only growing in number but was also expanding geographically and socially beyond the moneyed Londoners who could afford the first television sets and were within range of the Alexandra Palace transmissions. The primary case study for this article will be the 1949 production of H. G. Wells’ The Time Machine, which incorporated pre-recorded audio and film inserts that expanded the narrative out of the live studio performance both temporally and spatially, with the effects work receiving coverage in the popular magazine Illustrated. Other productions considered will be the 1938 and 1948 productions of RUR, the 1948 production of Blithe Spirit, and the 1950 adaptation of The Strange Case of Dr Jekyll and Mr Hyde. Despite the focus on telefantasy, this article will also include examples from other genres, both dramatic and factual, showing how the BBC's response to the changing television audience was to restrict drama to a more 'realistic' aesthetic and to move experimentation with televisual form to non-drama productions such as variety performances.

Relevance: 100.00%

Abstract:

A novel way of cooking rice to maximize the removal of the carcinogen inorganic arsenic (Asi) is presented here. In conventional rice cooking, water and grain are in continuous contact, and it is known that the larger the water:rice cooking ratio, the more Asi is removed by cooking, suggesting that the Asi in the grain is mobile in water. Experiments were designed where rice is cooked in a continual stream of percolating near-boiling water, either low in Asi or Asi-free. This has the advantage not only of exposing the grain to large volumes of cooking water, but also of physically removing any Asi leached from the grain into the water-receiving vessel. The relationship between cooking water volume and Asi removal in conventional rice cooking was demonstrated for the rice types under study. At a water-to-rice cooking ratio of 12:1, 57±5% of Asi could be removed (average of 6 wholegrain and 6 polished rice samples). Two types of percolating technology were tested: one where the cooking water was recycled by condensing boiling-water steam and passing the freshly distilled hot water through the grain in a laboratory setting, and one where tap water was used to cook the rice held in an off-the-shelf coffee percolator in a domestic setting. Both approaches proved highly effective in removing Asi from the cooking rice, with up to 85% of Asi removed from individual rice types. For the recycled-water experiment, 59±8% and 69±10% of Asi was removed on average, compared to uncooked rice, for polished (n=27) and wholegrain (n=13) rice, respectively. For coffee percolation there was no difference between wholegrain and polished rice, and the effectiveness of Asi removal was 49±7% across 6 wholegrain and 6 polished rice samples. The manuscript explores the potential applications and further optimization of this percolating-cooking, high-Asi-removal discovery.
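For orientation, the removal percentages quoted above are presumably of the form sketched below, i.e. the fractional drop in inorganic arsenic concentration of cooked relative to uncooked grain on a dry-mass basis; the function name and the example concentrations are hypothetical, not data from the study.

```python
def asi_removal_percent(asi_uncooked_ug_per_kg: float, asi_cooked_ug_per_kg: float) -> float:
    """Percentage of inorganic arsenic removed by cooking, comparing
    dry-mass-normalised concentrations of cooked and uncooked grain."""
    return 100.0 * (1.0 - asi_cooked_ug_per_kg / asi_uncooked_ug_per_kg)

# Illustrative values only: a grain at 200 µg/kg Asi uncooked and 86 µg/kg
# after percolating cooking corresponds to 57% removal.
print(f"{asi_removal_percent(200.0, 86.0):.0f}% Asi removed")
```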

Relevance: 100.00%

Abstract:

OBJECTIVES: To survey the outcomes used in Cochrane Reviews, as part of our work within the Core Outcome Measures in Effectiveness Trials Initiative.

STUDY DESIGN AND SETTING: A descriptive survey of Cochrane Reviews, divided by Cochrane Review Group (CRG), published in full for the first time in 2007 and 2011. Outcomes specified in the methods section of each review and outcomes reported in the results section of each review were of interest in this exploration of the common use of outcomes and core outcome sets (COS).

RESULTS: Seven hundred eighty-eight reviews, specifying 6,127 outcomes, were included. When we excluded specified outcomes from the 86 reviews that did not include any studies, we found that 1,996 (37%) specified outcomes were not reported. Of the 361 new reviews with studies from 2011, 113 (31%) had a "summary of findings" table (SoF). Fifteen broad outcome categories were identified and used to manage the outcome data. We found consistency in the use of these categories across CRGs but inconsistency in outcomes within these categories.

CONCLUSION: COS have been used rarely in Cochrane Reviews, but the introduction of SoF makes the development and application of COS timelier than ever.

Relevance: 100.00%

Abstract:

This review describes an approach to the prevention of graft-versus-host disease (GVHD) and graft rejection following allogeneic BMT that differs from conventional methods. Ultraviolet (UV) irradiation inhibits the proliferative responses of lymphoid cells to mitogens and alloantigens by inactivation of T lymphocytes and dendritic cells, and in animal models this can prevent both GVHD and graft rejection. It is important that the marrow repopulating capacity of haemopoietic stem cells is not damaged by the irradiation process. We have found that polymorphic microsatellite markers are a sensitive way of assessing the impact of UV irradiation on chimerism after BMT in rodents.

Relevance: 100.00%

Abstract:

Peak power consumption is the first-order design constraint of data centers. Though peak power consumption is rarely, if ever, observed, the entire data center facility must prepare for it, leading to inefficient usage of its resources. The most prominent way of addressing this issue is to limit the power consumption of the data center IT facility far below its theoretical peak value. Many approaches have been proposed to achieve that, based on the same small set of enforcement mechanisms, but there has been no corresponding work on systematically examining the advantages and disadvantages of each such mechanism. In the absence of such a study, it is unclear which mechanism is optimal for a given computing environment, which can lead to unnecessarily poor performance if an inappropriate scheme is used. This paper fills this gap by comparing for the first time five widely used power capping mechanisms under the same hardware/software setting. We also explore possible alternative power capping mechanisms beyond those previously proposed and evaluate them under the same setup. We systematically analyze the strengths and weaknesses of each mechanism, in terms of energy efficiency, overhead, and predictable behavior. We show how these mechanisms can be combined in order to implement an optimal power capping mechanism which reduces the slowdown compared to the most widely used mechanism by up to 88%. Our results provide interesting insights regarding the different trade-offs of power capping techniques, which will be useful for designing and implementing highly efficient power capping in the future.
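As a concrete example of the kind of enforcement mechanism being compared, the sketch below caps CPU package power through the Linux powercap (Intel RAPL) sysfs interface; the sysfs path, the 95 W limit, and the assumption that RAPL-style capping is among the five mechanisms studied are ours, not details taken from the paper.

```python
from pathlib import Path

# Hypothetical example: set a package-level power limit via the Linux
# powercap/RAPL sysfs interface (requires root and a CPU with RAPL support).
RAPL_ZONE = Path("/sys/class/powercap/intel-rapl:0")

def set_package_power_limit(limit_watts: float) -> None:
    """Write a long-term power limit for RAPL package domain 0."""
    limit_uw = int(limit_watts * 1_000_000)  # the sysfs file expects microwatts
    (RAPL_ZONE / "constraint_0_power_limit_uw").write_text(str(limit_uw))

def read_package_power_limit() -> float:
    """Read back the currently enforced long-term limit, in watts."""
    raw = (RAPL_ZONE / "constraint_0_power_limit_uw").read_text().strip()
    return int(raw) / 1_000_000

if __name__ == "__main__":
    set_package_power_limit(95.0)  # cap the package at 95 W
    print(f"Package power limit: {read_package_power_limit():.1f} W")
```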