6 results for maintenance planning

in CORA - Cork Open Research Archive - University College Cork - Ireland


Relevance: 20.00%

Abstract:

Since Wireless Sensor Networks (WSNs) are subject to failures, fault-tolerance is an important requirement for many WSN applications. Fault-tolerance can be enabled in different areas of WSN design and operation, including the Medium Access Control (MAC) layer and the initial topology design. To be robust to failures, a MAC protocol must be able to adapt to traffic fluctuations and topology dynamics. We design ER-MAC, which can switch from energy-efficient operation during normal monitoring to reliable, fast delivery during emergency monitoring, and vice versa. It can also prioritise high-priority packets and guarantee fair packet delivery from all sensor nodes.

Topology design supports fault-tolerance by ensuring that alternative acceptable routes to the data sinks exist when failures occur. We provide solutions for four topology planning problems: Additional Relay Placement (ARP), Additional Backup Placement (ABP), Multiple Sink Placement (MSP), and Multiple Sink and Relay Placement (MSRP). Our solutions use a local search technique based on Greedy Randomized Adaptive Search Procedures (GRASP). GRASP-ARP deploys relays for (k,l)-sink-connectivity, where each sensor node must have k vertex-disjoint paths of length ≤ l. To count how many disjoint paths a node has, we propose Counting-Paths. GRASP-ABP deploys fewer relays than GRASP-ARP by focusing only on the most important nodes, those whose failure has the worst effect. To identify such nodes, we define Length-constrained Connectivity and Rerouting Centrality (l-CRC). Greedy-MSP and GRASP-MSP place minimal-cost sinks to ensure that each sensor node in the network is double-covered, i.e. has two length-bounded paths to two sinks. Greedy-MSRP and GRASP-MSRP deploy sinks and relays at minimal cost to make the network both double-covered and non-critical, i.e. all sensor nodes must retain length-bounded alternative paths to sinks when an arbitrary sensor node fails. We then evaluate the fault-tolerance of each topology in data-gathering simulations using ER-MAC.
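The (k,l)-sink-connectivity requirement, k vertex-disjoint paths of length at most l from each sensor node, can be illustrated with a short greedy sketch. This is not the thesis's Counting-Paths algorithm; it is only a lower-bound approximation, assuming the network is given as an adjacency-list dict:

```python
from collections import deque

def shortest_path(adj, src, dst, blocked):
    """BFS shortest path from src to dst, avoiding blocked nodes."""
    prev = {src: None}
    q = deque([src])
    while q:
        u = q.popleft()
        if u == dst:  # reconstruct the path by walking predecessors
            path = []
            while u is not None:
                path.append(u)
                u = prev[u]
            return path[::-1]
        for v in adj.get(u, []):
            if v not in prev and v not in blocked:
                prev[v] = u
                q.append(v)
    return None

def greedy_disjoint_paths(adj, src, dst, max_len):
    """Greedily count vertex-disjoint src->dst paths of length <= max_len.
    A lower bound only; the thesis proposes Counting-Paths for this task."""
    blocked, count = set(), 0
    while True:
        p = shortest_path(adj, src, dst, blocked)
        if p is None or len(p) - 1 > max_len:
            return count
        count += 1
        blocked.update(p[1:-1])  # block internal nodes so later paths are disjoint
```

Because the greedy choice of shortest paths is not optimal, this sketch can undershoot the true count; it only shows the shape of the (k,l) acceptance test that a placement heuristic like GRASP-ARP must perform for every sensor node.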

Relevance: 20.00%

Abstract:

Case-Based Reasoning (CBR) uses past experiences to solve new problems. The quality of the past experiences, which are stored as cases in a case base, is a major factor in the performance of a CBR system. The system's competence may be improved by adding problems to the case base after they have been solved and their solutions verified to be correct. However, from time to time, the case base may have to be refined to reduce redundancy and to remove any noisy cases that may have been introduced. Many case base maintenance algorithms have been developed to delete noisy and redundant cases. However, different algorithms work well in different situations, and it may be difficult for a knowledge engineer to know which one is best to use for a particular case base. In this thesis, we investigate ways to combine algorithms to produce better deletion decisions than those made by individual algorithms, and ways to choose which algorithm is best for a given case base at a given time. We analyse five of the most commonly used maintenance algorithms in detail and show how the different algorithms perform better on different datasets. This motivates us to develop a new approach: maintenance by a committee of experts (MACE). MACE allows us to combine maintenance algorithms into a composite algorithm that exploits the merits of each of its constituent algorithms. By combining different algorithms in different ways, we can also define algorithms with different trade-offs between accuracy and deletion. While MACE allows us to define an infinite number of new composite algorithms, we still face the problem of choosing which algorithm to use. To make this choice, we need to identify properties of a case base that are predictive of which maintenance algorithm is best. We examine a number of measures of dataset complexity for this purpose. These provide a numerical way to describe a case base at a given time.

We use this numerical description to develop a meta-case-based classification system. The system uses previous experience about which maintenance algorithm was best for other case bases to predict which algorithm to use for a new case base. Finally, we give the knowledge engineer more control over the deletion process by creating incremental versions of the maintenance algorithms. These incremental algorithms suggest one case at a time for deletion rather than a group of cases, allowing the knowledge engineer to decide, case by case, whether each should be deleted or kept. We also develop incremental versions of the complexity measures, allowing us to create an incremental version of our meta-case-based classification system. Since the case base changes after each deletion, the best algorithm to use may also change; the incremental system allows us to choose the best algorithm at each point in the deletion process.
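As a minimal illustration of the committee idea, the sketch below combines the deletion sets proposed by several maintenance algorithms with a majority vote. The vote rule, the representation of algorithms as functions returning the cases they would delete, and the `quorum` parameter are all assumptions for illustration; the thesis's MACE framework defines its own combination operators:

```python
def committee_delete(case_base, algorithms, quorum=None):
    """Combine deletion decisions from several maintenance algorithms.

    Each algorithm is a callable that returns the set of cases it would
    delete from the case base. A case is actually deleted only when at
    least `quorum` algorithms agree (strict majority by default). This
    is one plausible MACE-style combination, not the thesis's exact rule.
    """
    if quorum is None:
        quorum = len(algorithms) // 2 + 1  # strict majority
    votes = {}
    for alg in algorithms:
        for case in alg(case_base):
            votes[case] = votes.get(case, 0) + 1
    return {case for case, n in votes.items() if n >= quorum}
```

Varying `quorum` changes the trade-off the abstract mentions: a low quorum deletes aggressively (any single expert suffices), while a unanimous quorum deletes conservatively, keeping accuracy high at the cost of less redundancy removal.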

Relevance: 20.00%

Abstract:

Using a classic grounded theory (CGT) methodology, this study explores the phenomenon of moral shielding within mental health multidisciplinary teams (MDTs). The study was located within three catchment areas engaged in acute mental health service practice. The main concern identified was the maintenance of a sense of personal integrity during situational binds. Through theoretical sampling, thirty-two practitioners, including doctors, nurses, social workers, occupational therapists, counsellors and psychologists, were interviewed face to face. In addition, emergent concepts were identified through observation of MDTs in clinical and research practice. Following classic grounded theory methodology, data collection and analysis occurred simultaneously. A constant comparative approach was adopted and resulted in the emergence of three sub-core categories: moral abdication, moral gauging and pseudo-compliance. Moral abdication involves re-positioning within an event in order to avoid or deflect the initial obligation to act; it is a strategy used to remove or reduce moral ownership. Moral gauging represents the monitoring of an event with the goal of judging the congruence of personal principles and commitments with those of other practitioners. This strategy is enacted in a bid to seek allies for the support of a given moral position. Pseudo-compliance represents behaviour that hides desired principles and commitments in order to shield them from challenge. This strategy portrays agreement with the dominant position within the MDT whilst holding a contrary position. It seeks to preserve the reservoir of emotional energy required to maintain a sense of personal integrity.

Practitioners who were successful in enacting moral shielding were found not to experience the significant emotional distress associated with the phenomenon of moral distress, suggesting that these practitioners had found mechanisms to manage situational binds that threatened their sense of personal integrity.

Relevance: 20.00%

Abstract:

Background: The Early Development Instrument (EDI) is a population-level measure of five developmental domains at school-entry age. The overall aim of this thesis was to explore the potential of the EDI as an indicator of early development in Ireland.

Methods: A cross-sectional study was conducted in 47 primary schools in 2011 using the EDI and a linked parental questionnaire. EDI (teacher-completed) scores were calculated for 1,344 children in their first year of full-time education. Those scoring in the lowest 10% of the sample population in one or more domains were deemed 'developmentally vulnerable'. Scores were correlated with contextual data from the parental questionnaire and with indicators of area- and school-level deprivation. Rasch analysis was used to determine the validity of the EDI.

Results: Over one quarter (27.5%) of all children in the study were developmentally vulnerable. Individual characteristics associated with increased risk of vulnerability were being male, being under 5 years old, and having English as a second language. After adjusting for these demographics, low birth weight, poor parent/child interaction and lower maternal education showed the most significant odds ratios for developmental vulnerability. Vulnerability did not follow the area-level deprivation gradient as measured by a composite index of material deprivation. Children considered by the teacher to be in need of assessment also had lower scores, which were not significantly different from those of children with a clinical diagnosis of special needs. All domains showed at least reasonable fit to the Rasch model, supporting the validity of the instrument. However, further refinement of the instrument is needed in the Irish context.

Conclusion: This thesis provides a unique snapshot of early development in Ireland. The EDI and linked parental questionnaire are promising indicators of the extent, distribution and determinants of developmental vulnerability.
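The "lowest 10% in one or more domains" cutoff rule can be sketched as follows. The domain names, the percentile computation, and the tie handling at the cutoff are illustrative simplifications, not the EDI's official scoring procedure:

```python
def vulnerability_flags(scores, pct=10):
    """Flag children scoring in the lowest `pct`% of any domain.

    `scores` is a list of dicts mapping domain name -> score, one dict
    per child. A child is flagged as 'developmentally vulnerable' if any
    of their domain scores falls at or below that domain's cutoff. Ties
    at the cutoff are all flagged, a simplification of the real rule.
    """
    domains = list(scores[0].keys()) if scores else []
    cutoffs = {}
    for d in domains:
        vals = sorted(child[d] for child in scores)
        k = max(0, int(len(vals) * pct / 100) - 1)  # index of the pct-percentile value
        cutoffs[d] = vals[k]
    return [any(child[d] <= cutoffs[d] for d in domains) for child in scores]
```

Because vulnerability is defined as falling below the cutoff in one *or more* domains, the population-level vulnerability rate (27.5% in this study) can exceed 10% even though each individual domain flags roughly its lowest decile.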

Relevance: 20.00%

Abstract:

Submission on behalf of UCC to the Government Consultation on the White Paper on Irish Aid

Relevance: 20.00%

Abstract:

A growing number of software development projects successfully exhibit a mix of agile and traditional software development methodologies. Many of these mixed methodologies are organization-specific and tailored to a particular project. Our objective in this research-in-progress paper is to develop an artifact that can guide the development of such a mixed methodology. Using control theory, we design a process model that provides theoretical guidance for building a portfolio of controls to support the development of a mixed methodology for software development. Controls, embedded in methods, provide a generalizable and adaptable framework for project managers to develop a mixed methodology specific to the demands of the project. A research methodology is proposed to test the model. Finally, future directions and contributions are discussed.
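One way to picture a portfolio of controls is as a simple container over control modes drawn from IS control theory (formal outcome and behaviour controls; informal clan and self controls). The sketch below is an illustrative data structure only, not the paper's process model; the mode names follow the standard taxonomy and the mechanisms are hypothetical examples:

```python
from dataclasses import dataclass, field

@dataclass
class Control:
    mode: str          # "outcome" | "behaviour" | "clan" | "self"
    mechanism: str     # e.g. "sprint review", "stage-gate sign-off"

@dataclass
class ControlPortfolio:
    """A project's mix of controls; a mixed methodology draws modes
    from both the agile and the plan-driven tradition."""
    controls: list = field(default_factory=list)

    def add(self, mode, mechanism):
        self.controls.append(Control(mode, mechanism))

    def modes(self):
        return {c.mode for c in self.controls}

# A hybrid portfolio mixing an agile (clan) and a traditional (outcome) control
portfolio = ControlPortfolio()
portfolio.add("clan", "daily stand-up")            # agile practice
portfolio.add("outcome", "milestone deliverable")  # plan-driven practice
```

Inspecting `portfolio.modes()` makes the mix explicit, which mirrors the paper's point: the methodology is characterised by the controls it embeds rather than by an agile/traditional label.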