10 results for ethics in the real world
in CORA - Cork Open Research Archive - University College Cork - Ireland
Abstract:
This study assesses regional health patterns in early medieval Ireland and Britain by analysing and interpreting palaeopathological indicators of stress. This was achieved by incorporating the results of demographic and palaeopathological study into their specific historical contexts. Although relatively small islands, both are home to unique and diverse cultural, physical, and political landscapes, which could potentially affect the general health of the population in different ways. To answer the research question accurately, a bioarchaeological survey of six regions across the two islands was carried out, analysing and comparing the demographic profile and general health trends of each region with one another. The analysis demonstrated statistically significant differences within and between the islands, inferring that even the more subtle differences observed within cultural, physical, and political landscapes, as in the case of Ireland and Britain, can and do affect general health trends. The health of early medieval Ireland and Britain appears to have been significantly affected by the physical landscape, specifically a north/south divide: the most northerly regions, Scotland South and Ireland North, manifested higher levels of stress indicators than the more southerly positioned regions. Although it can only be hypothesised which factors within these regions are causing, enhancing or buffering stress, the study has established the potential and necessity of continued regional work when interpreting the historical past of these two islands.
Abstract:
Since the emergence of the European Landscape Convention (ELC) in 2000, the link between landscape and planning has greatly intensified in importance. Now, more than ever, the fundamental role of the planning system in delivering the ELC’s requirements is recognised. This has been further substantiated within Ireland’s recently published National Landscape Strategy. However, it has continually been suggested that decision-making processes need to adapt better to the holistic, value-laden and multidimensional approaches underpinning the ELC. In light of these milestones for the preservation, management and planning of landscape, this research sets out to establish synergies and disparities in the existing relationship between landscape and planning. It investigates detailed evidence of the presence and manifestations of landscape in key processes of day-to-day planning practice in Ireland, from individual planning appeals and ‘special’ cases to the major strategic instruments that inform the making of landscape policies within development plans. This is set within wider theoretical and policy contexts, where the compatibility of landscape and planning is subjected to critical scrutiny and then explored through these practical case studies. Driving this research is the intention to make a case for the planning domain to be an ideal ‘home’ for landscape – in all its deep, multidimensional meaning – and for enhancing landscape arguments and objectives in the face of conflict, competing values and power-plays in the real world. Emerging from this research is a set of recommendations for how, at a national level, new approaches to decision making for and about landscape can be made more effective and meaningful.
Abstract:
Wind energy is the energy source that contributes most to the renewable energy mix of European countries. While there are good wind resources throughout Europe, the intermittency of the wind represents a major problem for the deployment of wind energy into the electricity networks. To ensure grid security, a Transmission System Operator today needs, for each kilowatt of wind energy, either an equal amount of spinning reserve or a forecasting system that can predict the amount of energy that will be produced from wind over a period of 1 to 48 hours. In the range from 5 m/s to 15 m/s, a wind turbine’s production increases with the cube of the wind speed. For this reason, a Transmission System Operator requires an accuracy of 1 m/s for wind speed forecasts in this range. Forecasting wind energy with a numerical weather prediction model in this context forms the background of this work. The author’s goal was to present a pragmatic solution to this specific problem in the “real world”. This work therefore has to be seen in a technical context, and hence neither provides nor intends to provide a general overview of the benefits and drawbacks of wind energy as a renewable energy source. In the first part of this work, the accuracy requirements of the energy sector for wind speed predictions from numerical weather prediction models are described and analysed. A unique set of numerical experiments was carried out in collaboration with the Danish Meteorological Institute to investigate the forecast quality of an operational numerical weather prediction model for this purpose. The results of this investigation revealed that the accuracy requirements for wind speed and wind power forecasts from today’s numerical weather prediction models can only be met at certain times. This means that the uncertainty of the forecast quality becomes a parameter as important as the wind speed and wind power themselves.
Quantifying the uncertainty of a forecast valid for tomorrow requires an ensemble of forecasts. In the second part of this work, such an ensemble of forecasts was designed and verified for its ability to quantify the forecast error. This was accomplished by correlating the measured error and the forecast uncertainty for area-integrated wind speed and wind power in Denmark and Ireland. A correlation of 93% was achieved in these areas. This method cannot by itself meet the accuracy requirements of the energy sector. By knowing the uncertainty of the forecasts, however, the focus can be put on the accuracy requirements at times when it is possible to accurately predict the weather. Thus, this result presents a major step forward in making wind energy a compatible energy source in the future.
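As a rough illustration of why the 1 m/s accuracy requirement cited above is so demanding, the cubic relationship between wind speed and turbine output can be sketched as follows. This is a minimal example only; the air density, rotor area and power coefficient values are illustrative assumptions, not figures from the thesis.

```python
def wind_power(v, rho=1.225, area=5000.0, cp=0.4):
    """Idealised turbine power (W) in the cubic region: P = 0.5 * rho * A * Cp * v**3.

    rho: air density (kg/m^3), area: rotor swept area (m^2),
    cp: power coefficient -- all hypothetical values for illustration."""
    return 0.5 * rho * area * cp * v ** 3

# A 1 m/s over-forecast around 10 m/s changes predicted power by roughly a third.
p10 = wind_power(10.0)
p11 = wind_power(11.0)
relative_error = (p11 - p10) / p10
print(f"{relative_error:.0%}")  # prints 33%
```

The cubic law means a seemingly small wind speed error is amplified into a large power error, which is exactly why the Transmission System Operator's 1 m/s tolerance is hard to meet.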
Abstract:
The past two decades have seen a dramatic upheaval in the international world order: the end of the Cold War, the 9/11 attacks and the subsequent 'War on Terror', increased Jihadist activity, the accelerated pace of globalization, climate change and the 2008 global financial crisis have contributed to fear, uncertainty, poverty, conflict, massive displacements of asylum seekers and refugees globally, and a proliferation of Protracted Refugee Situations (PRS), defined as situations in which refugees have been in exile 'for 5 years or more after their initial displacement, without immediate prospects for implementation of durable solutions'. More than 7.2 million refugees are now trapped in these PRS, with a further 16 million internally displaced persons (IDPs) trapped in camps within their own countries. The Dadaab refugee complex in Kenya, which as of March 2012 holds over 463,000 refugees, is the most significant and extreme example in recent times of a PRS. It was established in 1991 following the collapse of the government of the Somali dictator Siad Barre and the disintegration of Somalia into the chaos that still exists today. PRS such as Dadaab raise particular issues about humanitarianism in terms of aid, protection, security, human rights and the actions (or inaction) of the various stakeholders at international, national and local levels. This thesis investigates these issues through a case study of Dadaab as a PRS, framed in the context of humanitarianism and, in particular, the issues that arise in how the international community, the UN system and individual states provide assistance and protection to vulnerable populations.
Although the refugee camps have been in existence (as of 2012) for over 20 years, no study of Dadaab (or of any other PRS) in comparable detail has been undertaken to date. The findings will be of interest to academics in the areas of international relations, refugee/migration studies and global governance, as well as to practitioners in both humanitarian response and development.
Abstract:
A wireless sensor network can become partitioned due to node failure, requiring the deployment of additional relay nodes in order to restore network connectivity. This introduces an optimisation problem involving a tradeoff between the number of additional nodes that are required and the costs of moving through the sensor field for the purpose of node placement. This tradeoff is application-dependent, influenced for example by the relative urgency of network restoration. In addition, minimising the number of relay nodes might lead to long routing paths to the sink, which may cause problems of data latency. This latency is extremely important in wireless sensor network applications such as battlefield surveillance, intrusion detection, disaster rescue and highway traffic coordination, where real-time constraints must not be violated. Therefore, we also consider the problem of deploying multiple sinks in order to improve network performance. Previous research has considered only parts of this problem in isolation, and has not properly addressed the problems of moving through a constrained environment, of discovering changes to that environment during the repair, or of network quality after the restoration. In this thesis, we first consider a base problem in which we assume the exploration tasks have already been completed, so that our aim is to optimise our use of resources in the static, fully observed problem. In the real world, we would not know the radio and physical environments after damage, and this creates a dynamic problem in which the damage must be discovered. We therefore extend to the dynamic problem, in which network repair involves both exploration and restoration. We then add a hop-count constraint for network quality, requiring that the desired locations can talk to a sink within a hop-count limit after the network is restored.
For each new variant of the network repair problem, we propose different solutions (heuristics and/or complete algorithms) which prioritise different objectives. We evaluate our solutions in simulation, assessing solution quality (node cost, movement cost, computation time, and total restoration time) while varying the problem types and the capability of the agent that makes the repair. We show that the relative importance of the objectives influences the choice of algorithm, and that different movement speeds for the repairing agent have a significant impact on performance and must be taken into account when selecting the algorithm. In particular, the node-based approaches are best in terms of node cost, and the path-based approaches are best in terms of mobility cost. For total restoration time, the node-based approaches are best with a fast-moving agent, while the path-based approaches are best with a slow-moving agent; for an agent moving at medium speed, the total restoration times of the two families of approaches are almost balanced.
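The hop-count constraint described above can be verified with a simple multi-source breadth-first search outward from the sinks. The sketch below is only an illustration of that check, not one of the thesis's repair algorithms; the network topology and hop limit are made up for the example.

```python
from collections import deque

def within_hop_limit(adj, sinks, targets, hop_limit):
    """Return True if every target node can reach some sink within hop_limit hops.

    adj: dict mapping node -> iterable of neighbouring nodes (radio links).
    A multi-source BFS from all sinks computes the hop distance to each node."""
    dist = {s: 0 for s in sinks}
    queue = deque(sinks)
    while queue:
        node = queue.popleft()
        if dist[node] == hop_limit:
            continue  # neighbours of this node would exceed the limit
        for nb in adj.get(node, ()):
            if nb not in dist:
                dist[nb] = dist[node] + 1
                queue.append(nb)
    return all(t in dist for t in targets)

# Hypothetical restored network: two sensors reach the sink through one relay.
adj = {"sink": ["relay"], "relay": ["sink", "s1", "s2"],
       "s1": ["relay"], "s2": ["relay"]}
print(within_hop_limit(adj, ["sink"], ["s1", "s2"], hop_limit=2))  # prints True
```

With `hop_limit=1` the same network fails the check, since both sensors sit two hops from the sink; a repair algorithm would then need either extra relays closer to the sensors or an additional sink.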
Abstract:
The International Energy Agency has repeatedly identified increased end-use energy efficiency as the quickest, least costly method of greenhouse gas mitigation, most recently in the 2012 World Energy Outlook, and urges all governing bodies to increase efforts to promote energy efficiency policies and technologies. The residential sector is recognised as a major potential source of cost-effective energy efficiency gains. Within the EU, this relative importance can be seen from a review of the National Energy Efficiency Action Plans (NEEAP) submitted by member states, which in all cases place a large emphasis on the residential sector. This is particularly true for Ireland, whose residential sector has historically had higher energy consumption and CO2 emissions than the EU average, and whose first NEEAP targeted 44% of the energy savings to be achieved in 2020 from this sector. This thesis develops a bottom-up engineering archetype modelling approach to analyse the Irish residential sector and to estimate the technical energy savings potential of a number of policy measures. First, a model of space and water heating energy demand for new dwellings is built and used to estimate the technical energy savings potential due to the introduction of the 2008 and 2010 changes to Part L of the building regulations governing energy efficiency in new dwellings. Next, the author makes use of a valuable new dataset of Building Energy Rating (BER) survey results, first to characterise the highly heterogeneous stock of existing dwellings, and then to estimate the technical energy savings potential of an ambitious national retrofit programme targeting up to 1 million residential dwellings. This thesis also presents work carried out by the author as part of a collaboration to produce a bottom-up, multi-sector LEAP model for Ireland. Overall, this work highlights the challenges faced in successfully implementing both sets of policy measures.
It points to the wide range of final savings potentially achievable from particular policy measures, and the resulting high degree of uncertainty as to whether particular targets will be met, and identifies the key factors on which the success of these policies will depend. It makes recommendations on further modelling work and on the improvements necessary in the data available to researchers and policy makers alike, in order to develop increasingly sophisticated residential energy demand models and better inform policy.
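At its core, the bottom-up archetype approach described above multiplies a per-dwelling energy demand by the number of dwellings in each archetype and applies a savings fraction for the retrofit measure. The toy sketch below illustrates only that aggregation step; every archetype, dwelling count, demand figure and savings fraction is invented for the example and does not come from the thesis.

```python
# Each archetype maps to: (number of dwellings,
#                          annual heat demand per dwelling in kWh,
#                          fractional saving assumed for the retrofit measure).
# All figures below are hypothetical.
archetypes = {
    "pre-1980 detached":   (100_000, 25_000, 0.35),
    "1980-2005 semi-d":    (150_000, 18_000, 0.25),
    "post-2005 apartment": ( 50_000,  8_000, 0.10),
}

def technical_savings_gwh(stock):
    """Aggregate technical energy savings potential across the stock (GWh/yr)."""
    return sum(n * demand * saving for n, demand, saving in stock.values()) / 1e6

print(f"{technical_savings_gwh(archetypes):.0f} GWh/yr")  # prints 1590 GWh/yr
```

The uncertainty the thesis emphasises enters through the savings fractions and take-up rates, which in practice span wide ranges rather than the single point values used here.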
Abstract:
Malaria is still one of the biggest health threats in the developing world, with an estimated 300 million episodes and one million deaths per year, most of them in sub-Saharan Africa. Although the efficacy and cost-effectiveness of treated bed nets have been widely reported, little is known about the range, strength, or interaction of the different factors that influence their demand at the household level. This study modeled the determinants of bed net ownership as well as the factors that influence the number of bed nets purchased. Data were collected from 1,700 randomly selected households in the Farafenni region of The Gambia. Interviews were also held with 129 community spokespersons to explore the extent to which community-level factors, such as the quality of roads and access to market centers, also influence demand for bed nets. The results of each demand model and their policy implications are discussed.
Abstract:
The electroencephalogram (EEG) is a medical technology that is used in the monitoring of the brain and in the diagnosis of many neurological illnesses. Although coarse in its precision, the EEG is a non-invasive tool that requires minimal set-up time, and is suitably unobtrusive and mobile to allow continuous monitoring of the patient, either in clinical or domestic environments. Consequently, the EEG is the current tool-of-choice with which to continuously monitor the brain where temporal resolution, ease-of-use and mobility are important. Traditionally, EEG data are examined by a trained clinician who identifies neurological events of interest. However, recent advances in signal processing and machine learning techniques have allowed the automated detection of neurological events for many medical applications. In doing so, the burden of work on the clinician has been significantly reduced, improving the response time to illness and allowing the relevant medical treatment to be administered within minutes rather than hours. However, as typical EEG signals are of the order of microvolts (μV), contamination by signals arising from sources other than the brain is frequent. These extra-cerebral sources, known as artefacts, can significantly distort the EEG signal, making its interpretation difficult, and can dramatically degrade the classification performance of automatic neurological event detection. This thesis therefore contributes to the further improvement of automated neurological event detection systems by identifying some of the major obstacles to deploying these EEG systems in ambulatory and clinical environments, so that EEG technologies can emerge from the laboratory towards real-world settings, where they can have a real impact on the lives of patients.
In this context, the thesis tackles three major problems in EEG monitoring, namely: (i) the problem of head-movement artefacts in ambulatory EEG, (ii) the high numbers of false detections in state-of-the-art, automated, epileptiform activity detection systems, and (iii) false detections in state-of-the-art, automated neonatal seizure detection systems. To accomplish this, the thesis employs a wide range of statistical, signal processing and machine learning techniques drawn from mathematics, engineering and computer science. The first body of work outlined in this thesis proposes a system to automatically detect head-movement artefacts in ambulatory EEG, using supervised machine learning classifiers to do so. The resulting head-movement artefact detection system is the first of its kind and offers accurate detection of head-movement artefacts in ambulatory EEG. Subsequently, additional sensor signals, in the form of gyroscopes, are used to detect head movements, bringing additional information to the head-movement artefact detection task. A framework for combining EEG and gyroscope signals is then developed, offering improved head-movement artefact detection. The artefact detection methods developed for ambulatory EEG are subsequently adapted for use in an automated epileptiform activity detection system. Information from support vector machine classifiers used to detect epileptiform activity is fused with information from artefact-specific detection classifiers in order to significantly reduce the number of false detections in the epileptiform activity detection system. By this means, epileptiform activity detection which compares favourably with other state-of-the-art systems is achieved.
Finally, the problem of false detections in automated neonatal seizure detection is approached in an alternative manner: blind source separation techniques, complemented with information from additional physiological signals, are used to remove respiration artefacts from the EEG. In utilising these methods, some encouraging advances have been made in detecting and removing respiration artefacts from the neonatal EEG, and in doing so the performance of the underlying diagnostic technology is improved, bringing its deployment in the real-world clinical domain one step closer.
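One simple way to realise the kind of classifier fusion the abstract describes is to veto an event detection whenever an artefact classifier fires confidently on the same epoch. The sketch below illustrates only that general idea; the scores, thresholds and veto rule are made up for the example and are not the fusion scheme used in the thesis.

```python
def fuse_detections(event_scores, artefact_scores,
                    event_thresh=0.5, artefact_thresh=0.8):
    """Per-epoch fusion: keep an event detection only when the artefact
    classifier is not confidently firing on the same epoch.
    Thresholds are hypothetical; real systems would tune them on held-out data."""
    return [e >= event_thresh and a < artefact_thresh
            for e, a in zip(event_scores, artefact_scores)]

event_scores = [0.9, 0.7, 0.2, 0.6]      # e.g. epileptiform activity classifier
artefact_scores = [0.1, 0.95, 0.3, 0.4]  # e.g. head-movement artefact classifier
print(fuse_detections(event_scores, artefact_scores))
# prints [True, False, False, True]; the second detection is vetoed as artefact
```

The effect is to trade a small loss in sensitivity for a large reduction in false detections, which is the balance the thesis's fusion of SVM and artefact-specific classifiers aims for.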
Abstract:
The original solution to the high failure rate of software development projects was the imposition of an engineering approach to software development, with processes aimed at providing a repeatable structure to maintain consistency in the ‘production process’. Despite these attempts at addressing the crisis in software development, others have argued that the rigid processes of an engineering approach did not provide the solution. The Agile approach to software development strives to change how software is developed. It does this primarily by relying on empowered teams of developers who are trusted to manage the necessary tasks, and who accept that change is a necessary part of a development project. The use of, and interest in, Agile methods in software development projects has expanded greatly, yet this has been predominantly practitioner-driven. There is a paucity of scientific research on Agile methods and how they are adopted and managed. This study aims to address this paucity by examining the adoption of Agile through a theoretical lens. The lens used in this research is that of double loop learning theory. The behaviours required in an Agile team are the same behaviours required in double loop learning; therefore, a transition to double loop learning is required for a successful Agile adoption. The theory of triple loop learning highlights that power factors (or power mechanisms in this research) can inhibit the attainment of double loop learning. This study identifies the negative behaviours - potential power mechanisms - that can inhibit the double loop learning inherent in an Agile adoption, to determine how the Agile processes and behaviours can create these power mechanisms, and how these power mechanisms impact on double loop learning and the Agile adoption. This is a critical realist study, which acknowledges that the real world is a complex one, hierarchically structured into layers.
An a priori framework is created to represent these layers, which are categorised as: the Agile context, the power mechanisms, and double loop learning. The aim of the framework is to explain how the Agile processes and behaviours, through the teams of developers and project managers, can ultimately impact on the double loop learning behaviours required in an Agile adoption. Four case studies provide further refinement to the framework, with changes required due to observations that were often different from what the existing literature would have predicted. The study concludes by explaining how the teams of developers, the individual developers, and the project managers, working with the Agile processes and required behaviours, can inhibit the double loop learning required in an Agile adoption. A solution is then proposed to mitigate these negative impacts. Additionally, two new research processes are introduced to add to the Information Systems research toolkit.
Abstract:
Process guidance supports users to increase their process model understanding, process execution effectiveness as well as efficiency, and process compliance performance. This paper presents a research in progress encompassing our ongoing DSR project on Process Guidance Systems and a field evaluation of the resulting artifact in cooperation with a company. Building on three theory-grounded design principles, a Process Guidance System artifact for the company’s IT service ticketing process is developed, deployed and used. Fol-lowing a multi-method approach, we plan to evaluate the artifact in a longitudinal field study. Thereby, we will not only gather self-reported but also real usage data. This article describes the development of the artifact and discusses an innovative evaluation approach.