2 results for distributed cognition theory

in CORA - Cork Open Research Archive - University College Cork - Ireland


Relevance:

90.00%

Abstract:

The purpose of this study is to explore the nature and practice of leadership in Irish post-primary schools. It considers school leadership within the context of contemporary Distributed Leadership theory; associated concepts such as Distributed Cognition and Activity Theory are used to frame the study. From a distributed perspective, it is now widely accepted that other agents (e.g. teachers) have a leadership role as part of collaborative, participative and supportive learning communities. Thus, this study considers how principals interact and build leadership capacity throughout the school.

The study draws on two main sources of evidence. First, in analysing the implications of accountability agendas for school leadership, it explores the conceptualisations of school leadership that are foregrounded in 21 WSE reports; elements of Critical Discourse Analysis are employed as an investigative tool to decipher how the construction of leadership practice is produced. Second, the study explores leadership in 3 case-study post-primary schools.

Leadership is a complex phenomenon and not easy to describe. The findings clarify, however, that school leadership is a construct beyond the scope of the principal alone. While there is widespread support for a distributed model of leadership, the concept does not explicitly form part of the discourse in the case-study schools. It is also evident that any attempt to understand leadership practice must connect local interpretations with broader discourses; the understanding and practice of leadership is best understood in its sociohistorical context. The study reveals that, in the Irish post-primary school, the historical dimension is very influential, while the situational setting, involving a particular set of agents and agendas, strongly shapes thinking and practices.

This study is novel in that it synthesises two key sources of evidence. It is of great value in that it teases out the various historical and situational aspects to enhance understandings of school leadership in contemporary Ireland, and it raises important questions for policy, practice and further research.

Relevance:

30.00%

Abstract:

It is estimated that the quantity of digital data being transferred, processed or stored at any one time currently stands at 4.4 zettabytes (4.4 × 2^70 bytes), and this figure is expected to grow by a factor of 10, to 44 zettabytes, by 2020. Exploiting this data is, and will remain, a significant challenge. At present there is the capacity to store 33% of the digital data in existence at any one time; by 2020 this capacity is expected to fall to 15%. These statistics suggest that, in the era of Big Data, the identification of important, exploitable data will need to be done in a timely manner.

Systems for the monitoring and analysis of data, e.g. stock markets, smart grids and sensor networks, can be made up of massive numbers of individual components. These components can be geographically distributed, yet may interact with one another via continuous data streams, which in turn may affect the state of the sender or receiver. This introduces a dynamic causality, which further complicates the overall system by introducing a temporal constraint that is difficult to accommodate. Practical approaches to realising such systems have led to a multiplicity of analysis techniques, each of which concentrates on specific characteristics of the system being analysed and treats those characteristics as the dominant component affecting the results being sought. This multiplicity of analysis techniques introduces another layer of heterogeneity, that is, heterogeneity of approach, partitioning the field to the extent that results from one domain are difficult to exploit in another. The question is therefore asked: can a generic solution for the monitoring and analysis of data be identified that accommodates temporal constraints, bridges the gap between expert knowledge and raw data, and enables data to be effectively interpreted and exploited in a transparent manner?
The approach proposed in this dissertation acquires, analyses and processes data in a manner that is free of the constraints of any particular analysis technique, while at the same time facilitating those techniques where appropriate. Constraints are applied by defining a workflow based on the production, interpretation and consumption of data. This supports the application of different analysis techniques to the same raw data without the danger of incorporating hidden bias. To illustrate and realise this approach, a software platform has been created that allows for the transparent analysis of data, combining analysis techniques with a maintainable record of provenance so that independent third-party analysis can be applied to verify any derived conclusions.

To demonstrate these concepts, a complex real-world example involving the near real-time capture and analysis of neurophysiological data from a neonatal intensive care unit (NICU) was chosen. A system was engineered to gather raw data, analyse that data using different analysis techniques, uncover information, incorporate that information into the system and curate the evolution of the discovered knowledge. The application domain was chosen for three reasons: firstly, because it is complex and no comprehensive solution exists; secondly, because it requires tight interaction with domain experts, and thus the handling of subjective knowledge and inference; and thirdly, because, given the dearth of neurophysiologists, there is a real-world need to provide a solution for this domain.
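The production/interpretation/consumption workflow with a maintainable provenance record, as described in the abstract, can be sketched roughly as follows. This is an illustrative sketch only; all names (`Workflow`, `ProvenanceRecord`, the sample sensor feed and techniques) are assumptions, not APIs from the dissertation's platform:

```python
from dataclasses import dataclass, field
from typing import Any, Callable, List


@dataclass
class ProvenanceRecord:
    """Append-only log of every transformation applied to the data."""
    entries: List[str] = field(default_factory=list)

    def log(self, stage: str, technique: str) -> None:
        self.entries.append(f"{stage}: {technique}")


@dataclass
class Workflow:
    """Produce raw data once, then interpret it with any number of
    independent analysis techniques, keeping a provenance trail so a
    third party can verify how each result was derived."""
    provenance: ProvenanceRecord = field(default_factory=ProvenanceRecord)

    def produce(self, source: Callable[[], Any]) -> Any:
        raw = source()
        self.provenance.log("production", source.__name__)
        return raw

    def interpret(self, raw: Any, technique: Callable[[Any], Any]) -> Any:
        result = technique(raw)
        self.provenance.log("interpretation", technique.__name__)
        return result


# Illustrative stages: a toy sensor feed and two independent techniques
# applied to the same raw data, each leaving its own provenance entry.
def sensor_feed() -> List[float]:
    return [1.0, 2.0, 6.0]


def mean_analysis(data: List[float]) -> float:
    return sum(data) / len(data)


def peak_analysis(data: List[float]) -> float:
    return max(data)


wf = Workflow()
raw = wf.produce(sensor_feed)
print(wf.interpret(raw, mean_analysis))  # 3.0
print(wf.interpret(raw, peak_analysis))  # 6.0
print(wf.provenance.entries)
```

Because each technique consumes the same raw data and only appends to the provenance log, a second technique can be added without biasing the first, and the log alone is enough to reconstruct how any result was obtained.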