3 results for Causality
in CORA - Cork Open Research Archive - University College Cork - Ireland
Abstract:
Organizations that leverage lessons learned from their experience in the practice of complex real-world activities face five difficult problems. First, how to represent the learning situation in a recognizable way. Second, how to represent what was actually done in terms of repeatable actions. Third, how to assess performance taking account of the particular circumstances. Fourth, how to abstract lessons learned that are re-usable on future occasions. Fifth, how to determine whether to pursue practice maturity or strategic relevance of activities. Here, organizational learning and performance improvement are investigated in a field study using the Context-based Intelligent Assistant Support (CIAS) approach. A new conceptual framework for practice-based organizational learning and performance improvement is presented that supports researchers and practitioners in addressing the problems raised, and contributes to a practice-based approach to activity management. The novelty of the research lies in the simultaneous study of the different levels involved in the activity. Route selection in light rail infrastructure projects involves practices at both the strategic and operational levels; it is part managerial/political and part engineering. Aspectual comparison of practices represented in Contextual Graphs constitutes a new approach to the selection of Key Performance Indicators (KPIs). This approach is free from causality assumptions and forms the basis of a new approach to practice-based organizational learning and performance improvement. The evolution of practices in contextual graphs is shown to be an objective and measurable expression of organizational learning. This diachronic representation is interpreted using a practice-based organizational learning novelty typology. This dissertation shows how lessons learned, when effectively leveraged by an organization, lead to practice maturity.
The practice maturity level of an activity, in combination with an assessment of its strategic relevance, can be used by management to prioritize improvement effort.
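The aspectual comparison of practices described above can be pictured with a small sketch. This is not the CIAS implementation: the `Practice` class, the context keys and the route-selection values are hypothetical, chosen only to show how two recorded practices might be compared aspect by aspect, without any causality assumption linking context to outcome.

```python
# Minimal sketch of comparing two recorded practices, assuming a
# simplified Contextual Graph: each practice is the path of actions
# actually taken, plus the contextual values under which it was taken.
from dataclasses import dataclass

@dataclass
class Practice:
    """One recorded way of carrying out an activity."""
    name: str
    context: dict   # contextual elements and their instantiated values
    actions: list   # ordered actions actually performed

def aspectual_comparison(a, b):
    """Compare two practices aspect by aspect (context keys, actions),
    reporting differences without inferring any causal link."""
    shared = {k for k in a.context if b.context.get(k) == a.context[k]}
    differing = (set(a.context) | set(b.context)) - shared
    return {
        "shared_context": sorted(shared),
        "differing_context": sorted(differing),
        "common_actions": [x for x in a.actions if x in b.actions],
        "novel_in_b": [x for x in b.actions if x not in a.actions],
    }

# Hypothetical route-selection practices at two points in time; the
# diachronic difference is a measurable trace of organizational learning.
route_v1 = Practice("route selection v1",
                    {"ground": "soft", "stakeholders": "few"},
                    ["desk study", "site survey", "select corridor"])
route_v2 = Practice("route selection v2",
                    {"ground": "soft", "stakeholders": "many"},
                    ["desk study", "public consultation",
                     "site survey", "select corridor"])

diff = aspectual_comparison(route_v1, route_v2)
```

Under this reading, the newly appearing action ("public consultation") and the contextual element that accompanies it ("stakeholders") are exactly the kind of aspect-level difference the typology would classify.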
Abstract:
Childhood asthma, allergic rhinitis and eczema are complex, heterogeneous chronic inflammatory allergic disorders which constitute a major burden to children and their families. The prevalence of childhood allergic disorders is increasing worldwide, and only a rudimentary understanding exists regarding causality or the influence of the environment on disease expression. Phase Three of the International Study of Asthma and Allergy in Childhood (ISAAC) reported that Irish adolescents had the 4th highest eczema and rhinoconjunctivitis prevalence and the 3rd highest asthma prevalence in the world. There are no ISAAC data pertaining to young Irish children. In 2002, Sturley reported a high prevalence of current asthma in Cork primary school children aged 6-9 years. This thesis comprises three cross-sectional studies which examined the prevalence of and associations with childhood allergy, and a quasi-retrospective cohort study which observed the natural history of allergy from 6-9 until 11-13 years. Although not part of ISAAC, data were obtained by parentally completed ISAAC-based questionnaires, using the ISAAC protocol. The prevalence, natural history and risk factors of childhood allergy in Ireland, as described in this thesis, echo those in worldwide allergy research. The variations in prevalence in different populations worldwide, and the recurring themes of associations between childhood allergy and microbial exposures from farming environments and/or gastrointestinal infections, as shown in this thesis, strengthen the mounting evidence that microbial exposure acting on gut-associated lymphoid tissue (GALT) may hold the key to the mechanisms of allergy development. In this regard, probiotics may be an area of particular interest in allergy modification. Although their effects in relation to allergy have been investigated for several years, our knowledge of their diversity, complex functions and interactions with gut microflora remains rudimentary.
Birth cohort studies that include genomic and microbiomic research are recommended in order to examine the underlying mechanisms and the natural course of allergic diseases.
Abstract:
It is estimated that the quantity of digital data being transferred, processed or stored at any one time currently stands at 4.4 zettabytes (4.4 × 2^70 bytes), and this figure is expected to have grown by a factor of 10, to 44 zettabytes, by 2020. Exploiting this data is, and will remain, a significant challenge. At present there is the capacity to store 33% of the digital data in existence at any one time; by 2020 this capacity is expected to fall to 15%. These statistics suggest that, in the era of Big Data, the identification of important, exploitable data will need to be done in a timely manner. Systems for the monitoring and analysis of data, e.g. stock markets, smart grids and sensor networks, can be made up of massive numbers of individual components. These components can be geographically distributed yet may interact with one another via continuous data streams, which in turn may affect the state of the sender or receiver. This introduces a dynamic causality, which further complicates the overall system by introducing a temporal constraint that is difficult to accommodate. Practical approaches to realising the system described above have led to a multiplicity of analysis techniques, each of which concentrates on specific characteristics of the system being analysed and treats these characteristics as the dominant component affecting the results being sought. This multiplicity of analysis techniques introduces another layer of heterogeneity, that is, heterogeneity of approach, partitioning the field to the extent that results from one domain are difficult to exploit in another. The question asked is: can a generic solution for the monitoring and analysis of data be identified that accommodates temporal constraints, bridges the gap between expert knowledge and raw data, and enables data to be effectively interpreted and exploited in a transparent manner?
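The dynamic causality described above can be illustrated with a small sketch: components exchange timestamped stream events, and a delivery may change the state of the receiver and, through feedback, the sender. The component names, feedback rule and threshold are invented for illustration; the one point carried over from the text is the temporal constraint, namely that events must be processed in timestamp order for the states to be meaningful.

```python
# Illustrative sketch (not from the dissertation) of dynamic causality
# in a stream-connected system: receiving an event mutates the receiver
# and may feed back into the sender, so delivery order matters.
import heapq

class Component:
    def __init__(self, name):
        self.name = name
        self.state = 0
        self.log = []          # (timestamp, sender, value) history

    def receive(self, t, value, sender):
        # Receiving an event updates this component's state...
        self.state += value
        self.log.append((t, sender.name, value))
        # ...and, past a (hypothetical) threshold, feeds back and
        # changes the sender's state as well.
        if self.state > 3:
            sender.state -= 1

def run(events):
    """Deliver events strictly in timestamp order: the temporal
    constraint without which the causal chain is scrambled."""
    heapq.heapify(events)
    while events:
        t, value, src, dst = heapq.heappop(events)
        dst.receive(t, value, src)

a, b = Component("sensor"), Component("aggregator")
run([(0, 2, a, b), (1, 2, a, b), (2, 1, a, b)])
```

Replaying the same three events in a different order would leave the two components in different states, which is why out-of-order delivery is so difficult to accommodate at scale.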
The approach proposed in this dissertation acquires, analyses and processes data in a manner that is free of the constraints of any particular analysis technique, while at the same time facilitating these techniques where appropriate. Constraints are applied by defining a workflow based on the production, interpretation and consumption of data. This supports the application of different analysis techniques to the same raw data without the danger of incorporating hidden bias. To illustrate and to realise this approach, a software platform has been created that allows for the transparent analysis of data, combining analysis techniques with a maintainable record of provenance so that independent third-party analysis can be applied to verify any derived conclusions. To demonstrate these concepts, a complex real-world example involving the near real-time capture and analysis of neurophysiological data from a neonatal intensive care unit (NICU) was chosen. A system was engineered to gather raw data, analyse that data using different analysis techniques, uncover information, incorporate that information into the system and curate the evolution of the discovered knowledge. The application domain was chosen for three reasons: firstly, because it is complex and no comprehensive solution exists; secondly, because it requires tight interaction with domain experts, and thus the handling of subjective knowledge and inference; and thirdly, given the dearth of neurophysiologists, because there is a real-world need to provide a solution for this domain.
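One way to picture the production/interpretation/consumption workflow with a provenance record is the following minimal sketch. The `Workflow` class and its `interpret` method are hypothetical, not the platform's actual API; the sketch only shows several independent techniques applied to the same untouched raw data, each step recorded against a hash of its input so that a third party could replay and verify the derived result.

```python
# Minimal sketch of a provenance-tracked analysis workflow, assuming
# (hypothetically) that each analysis technique is a function over the
# raw data and every application is logged for later verification.
import hashlib
import json

class Workflow:
    def __init__(self, raw_data):
        self.raw = raw_data        # produced data is kept untouched
        self.provenance = []       # auditable record of every step

    def interpret(self, name, technique):
        """Apply an analysis technique to the *raw* data (never to
        another technique's output), so no hidden bias is carried
        from one technique into another."""
        result = technique(self.raw)
        self.provenance.append({
            "step": name,
            "input_sha256": hashlib.sha256(
                json.dumps(self.raw).encode()).hexdigest(),
            "result": result,
        })
        return result

# Two independent techniques consuming the same raw stream sample.
raw = [3, 1, 4, 1, 5, 9]
wf = Workflow(raw)
mean = wf.interpret("mean", lambda xs: sum(xs) / len(xs))
peak = wf.interpret("peak", lambda xs: max(xs))
```

Because every provenance entry pins the exact input the technique saw, an independent reviewer can re-run the named step on the hashed data and confirm the recorded result, which is the transparency property the abstract describes.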