833 results for fault accommodation
Abstract:
This paper presents an architecture (Multi-μ) being implemented to study and develop software-based fault-tolerant mechanisms for real-time systems, using the Ada language (Ada 95) and Commercial Off-The-Shelf (COTS) components. Several issues regarding fault tolerance are presented, and mechanisms to achieve fault tolerance through software active replication in Ada 95 are discussed. The Multi-μ architecture, built around a purpose-designed Fault Tolerance Manager (FTManager), is then described. Finally, some considerations are made about the work done so far and essential future developments.
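Though the paper works in Ada 95, the core idea of software active replication is language-neutral: run the same deterministic task on several replicas and mask a faulty result by majority voting. Below is a minimal Python sketch of that voting step; the `run_replicas`/`vote` names and the sequential three-replica setup are illustrative assumptions, not the Multi-μ FTManager API.

```python
from collections import Counter

def run_replicas(task, inputs, n_replicas=3):
    """Run the same deterministic task on every replica.

    In a real active-replication scheme each call would execute on a
    separate node or partition; here they run sequentially for illustration.
    """
    return [task(inputs) for _ in range(n_replicas)]

def vote(results):
    """Majority voter: return the value agreed on by more than half of
    the replicas, masking any single faulty result."""
    value, count = Counter(results).most_common(1)[0]
    if count * 2 <= len(results):
        raise RuntimeError("no majority: unrecoverable replica disagreement")
    return value

if __name__ == "__main__":
    # A fault in one replica is outvoted by the other two.
    outputs = [42, 42, 41]   # pretend replica 3 suffered a fault
    print(vote(outputs))     # -> 42
```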
Abstract:
On-chip debug (OCD) features are frequently available in modern microprocessors. Their contribution to shortening time-to-market justifies the industry investment in this area, where a number of competing or complementary proposals are available or under development, e.g. NEXUS, CJTAG and IJTAG. The controllability and observability offered by OCD infrastructures constitute a valuable toolbox that can be used well beyond the debugging arena, improving the return on investment by diluting the cost across a wider spectrum of application areas. This paper discusses the use of OCD features for validating fault-tolerant architectures, and in particular the efficiency of various fault injection methods provided by enhanced OCD infrastructures. The reference data for our comparative study was captured on a workbench comprising the 32-bit Freescale MPC-565 microprocessor, an iSYSTEM IC3000 debugger (iTracePro version) and the Winidea 2005 debugging package. All enhanced OCD infrastructures were implemented in VHDL, and the results were obtained by simulation within the same fault injection environment. The focus of this paper is the comparative analysis of the experimental results obtained for various OCD configurations and debugging scenarios.
Abstract:
Dependability is a critical factor in computer systems, requiring high-quality validation and verification procedures during development. At the same time, digital devices are getting smaller, and access to their internal signals and registers is increasingly complex, requiring innovative debugging methodologies. To address this issue, most recent microprocessors include an on-chip debug (OCD) infrastructure to facilitate common debugging operations. This paper proposes an enhanced OCD infrastructure aimed at supporting the verification of fault-tolerant mechanisms through fault injection campaigns. This upgraded on-chip debug and fault injection (OCD-FI) infrastructure provides an efficient fault injection mechanism with improved capabilities and dynamic behavior. Preliminary results show that this solution provides flexibility in fault triggering and allows high-speed real-time fault injection in memory elements.
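At its core, an OCD-driven fault injection experiment is a halt/read/modify/resume cycle on the target's memory, performed through the debug port. The sketch below illustrates a single-bit flip under that model; `DebugPort` is a hypothetical stand-in that simulates target memory so the example runs, not the interface of OCD-FI or of any real debugger toolchain.

```python
import random

class DebugPort:
    """Hypothetical stand-in for an OCD driver (e.g. a JTAG/Nexus debug
    connection). A real implementation would talk to the target; this
    one holds a fake memory map so the sketch is runnable."""
    def __init__(self):
        self.memory = {0x2000_0000: 0x0000_00FF}
    def halt(self): pass
    def resume(self): pass
    def read_word(self, addr): return self.memory[addr]
    def write_word(self, addr, value): self.memory[addr] = value

def inject_bit_flip(port, addr, bit=None):
    """Flip one bit of the word at `addr` while the target is halted,
    emulating a soft error (SEU) in a memory element."""
    bit = random.randrange(32) if bit is None else bit
    port.halt()
    original = port.read_word(addr)
    port.write_word(addr, original ^ (1 << bit))
    port.resume()
    return original, bit

if __name__ == "__main__":
    port = DebugPort()
    before, bit = inject_bit_flip(port, 0x2000_0000, bit=3)
    after = port.read_word(0x2000_0000)
    print(f"flipped bit {bit}: {before:#010x} -> {after:#010x}")
```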
Abstract:
Fault injection is frequently used for the verification and validation of dependable systems. When targeting real-time microprocessor-based systems, the process becomes significantly more complex. This paper proposes two complementary solutions to improve the execution of real-time fault injection campaigns, both in terms of performance and capabilities. The methodology is based on the use of the on-chip debug mechanisms present in modern electronic devices. The main objective is the injection of faults into microprocessor memory elements with minimum delay and intrusiveness. Different configurations were implemented and compared in terms of performance gain and logic overhead.
Abstract:
The rapid increase in the use of microprocessor-based systems in critical areas, where failures imply risks to human lives, the environment or expensive equipment, has significantly increased the need for dependable systems, able to detect, tolerate and possibly correct faults. The verification and validation of such systems is frequently performed via fault injection, using various forms and techniques. However, as electronic devices get smaller and more complex, controllability and observability issues, and sometimes real-time constraints, make it harder to apply most conventional fault injection techniques. This paper proposes a fault injection environment and a scalable methodology to assist the execution of real-time fault injection campaigns, providing enhanced performance and capabilities. Our proposed solutions are based on the use of common and customized on-chip debug (OCD) mechanisms, present in many modern electronic devices, with the main objective of enabling the insertion of faults into microprocessor memory elements with minimum delay and intrusiveness. Different configurations were implemented, starting from basic Commercial Off-The-Shelf (COTS) microprocessors equipped with real-time OCD infrastructures, up to improved solutions based on modified interfaces and dedicated OCD circuitry that enhance fault injection capabilities and performance. All methodologies and configurations were evaluated and compared concerning performance gain and silicon overhead.
Abstract:
To increase the amount of logic available to users in SRAM-based FPGAs, manufacturers are using nanometric technologies to boost logic density and reduce costs, making their use more attractive. However, these technological improvements also make FPGAs particularly vulnerable to configuration memory bit-flips caused by power fluctuations, strong electromagnetic fields and radiation. This issue is particularly sensitive because of the increasing number of configuration memory cells needed to define their functionality. A short survey of the most recent publications is presented to support the options assumed during the definition of a framework for implementing circuits immune to bit-flip induction mechanisms in memory cells, based on a customized redundant infrastructure and on a detection-and-fix controller.
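Such a "customized redundant infrastructure and detection-and-fix controller" is commonly realized as triple modular redundancy (TMR) plus scrubbing: vote three copies bitwise and rewrite whichever copy disagrees with the majority. A minimal Python sketch of that logic follows, with the FPGA configuration-memory specifics abstracted away; the framework's actual design is not detailed in the abstract, so this is only an assumed illustration of the general technique.

```python
def tmr_vote(a, b, c):
    """Bitwise 2-of-3 majority: a single bit-flip in any one copy
    is outvoted by the other two."""
    return (a & b) | (a & c) | (b & c)

def detect_and_fix(copies):
    """Detection-and-fix controller: vote, then report any copy that
    disagrees with the majority so it can be rewritten ('scrubbed')."""
    majority = tmr_vote(*copies)
    repaired = [i for i, word in enumerate(copies) if word != majority]
    return majority, repaired

if __name__ == "__main__":
    # Copy 1 suffered a bit-flip in bit 4; voting masks and locates it.
    copies = [0b1010_0110, 0b1011_0110, 0b1010_0110]
    value, fixed = detect_and_fix(copies)
    print(f"voted value {value:#010b}, repaired copies {fixed}")
```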
Abstract:
Fault injection is frequently used for the verification and validation of the fault tolerant features of microprocessors. This paper proposes the modification of a common on-chip debugging (OCD) infrastructure to add fault injection capabilities and improve performance. The proposed solution imposes a very low logic overhead and provides a flexible and efficient mechanism for the execution of fault injection campaigns, being applicable to different target system architectures.
Abstract:
The purpose of this study was to identify factors related to successful university course completion for students with disAbilities, including the knowledge that faculty members and students with disAbilities have about accommodation issues; the accommodations that students with disAbilities and faculty use and find effective in the university setting; faculty members' perceptions of and attitudes toward students with disAbilities; and the attitudes that students with disAbilities have toward faculty. Fifty-seven participants were involved in the research: eight students with disAbilities and forty-nine faculty members. The main objective of the research was to describe how the skills, knowledge and attitudes of students and faculty members, and organizational supports, interact to support students' academic success. The utilization and effectiveness of accommodations to overcome barriers associated with disAbility in a post-secondary setting are described in relation to students' and faculty members' perceptions of academic success.
Abstract:
Three studies comprised the current research program, whose major goals were to propose and empirically validate a two-level (universal and culture-specific) model of both autonomy and relatedness, and to develop reliable and valid measures for these two constructs. In Study 1, 143 mainland Chinese adolescents were asked open-ended questions about their understanding of autonomy and relatedness in three social contexts (peer, family, school). Chinese youth's responses captured universal and culturally distinctive forms of autonomy (personal vs. social) and relatedness (accommodation vs. distinctiveness), according to a priori criteria based on the theoretical frameworks. Scenarios designed to reflect culture-specific forms of autonomy and relatedness also suggested their relevance to Chinese adolescents. In Study 2, with a second sample of 201 mainland Chinese youth, the obtained autonomy and relatedness descriptors were formulated into scale items, which were subjected to refinement analyses to examine their psychometric properties and centrality to Chinese youth. The findings of the Study 1 scenarios were replicated in Study 2. The primary goal of Study 3 was to empirically test the proposed two-level (universal and culture-specific) models of both autonomy and relatedness, using the measures derived from Studies 1 and 2. A third sample of 465 mainland Chinese youth completed a questionnaire booklet consisting of autonomy and relatedness scales and scenarios, and measures of achievement motivation orientations. A series of confirmatory factor analysis (CFA) measurement models of autonomy and relatedness (first-order and second-order), as well as structural models linking culture-specific forms of autonomy and relatedness to achievement motivation orientations, was estimated. The first-order measurement models based on scale and scenario scores consistently confirmed the distinction between personal autonomy and social autonomy, and between accommodation and distinctiveness. Although the construct validity of the two culture-specific forms of autonomy gained additional support from the structural models, the associations between the two culture-specific forms of relatedness and achievement motivation orientations were relatively weak. In general, the two-level models of autonomy and relatedness were supported in two ways: through conceptual analysis of scale items and through second-order measurement models. In addition, across the three studies, I explored potential contextual and sex differences in Chinese youth's endorsement of the diverse forms of autonomy and relatedness; overall, no substantial contextual variability or sex differences were found. The current research makes an important theoretical contribution to the field of developmental psychology in general, and to autonomy and relatedness in particular, by proposing and empirically testing both universal and culture-specific components of autonomy and relatedness. The findings have implications for the measurement of autonomy and relatedness across social contexts, as well as for socialization and educational practice.
Abstract:
Dossier: In Memoriam, Iris Marion Young (1949-2006)
Abstract:
Thesis (Doctorate in Electrical Engineering), UANL, 2013.
Abstract:
For years, Kenya gave the impression of being a relatively stable country in sub-Saharan Africa, a region regularly shaken by conflict, and a "centre" around which the international community coordinates its missions to certain African countries, such as those of the Great Lakes Region (Burundi, Rwanda, Uganda, Democratic Republic of the Congo, Kenya and Tanzania) and those of the Horn of Africa (Kenya, Somalia, Ethiopia, Djibouti and Uganda). However, the hotly contested presidential elections of 2007 and the conflicts that followed raised serious concerns about Kenya's stability in an era of global insecurity. While peacebuilding continues, coexistence between groups remains delicate, as Kenya has at least forty-two ethnic groups, each distinct from the others. Moreover, the opening of a judicial investigation by the International Criminal Court (ICC) against four of the six people presumed to be the main perpetrators of the 2007/08 post-election violence adds to the problems of peaceful coexistence between the different groups ahead of the next elections. This thesis examines the politics of accommodating different groups through vernacular radio stations and how those politics influenced intergroup relations during the 2007/08 conflicts in Kenya. Starting from the premise that conflict is a communicative process, it integrates the concept of media framing with the theory of Protracted Social Conflict (PSC) defined by Azar (1990), not only to trace changes in the discourses framing these conflicts, but also to illustrate shifts in attitudes toward intergroup relations before, during and after the conflicts. The study relies primarily on qualitative methods to gather data from three ethnically and linguistically divergent regions of Kenya: Nyeri (majority Kikuyu), Kisumu (majority Luo) and Eldoret (majority Kalenjin). The central argument of the thesis is that the framing of intergroup relations, particularly during conflicts, is either differentiated or concerted, depending on the stage at which a conflict manifests itself. Whereas in differentiated framing media discourses are articulated in ways likely to drive polarization between groups, concerted framing describes media discourses negotiated so as to reflect values shared across the different groups, and thus likely to foster intergroup cooperation. I argue that changes in vernacular radio discourse take effect when new elements are added to the discourses characterizing an already existing conflict, together with the "new meanings" those elements bring to the understanding of the conflict in question. I also argue that the shift from differentiated to concerted framing (and vice versa) depends on the degree to which these discourses resonate with the target population. Overall, this study suggests that broadcast language and cultural proximity shape intergroup framing through vernacular radio in Kenya.
The strength of this thesis therefore lies in the analytical perspectives it proposes for locating shifting discourses during conflicts, particularly in multi-ethnic states where policies of accommodation between the different groups remain fragile and conditional.
Abstract:
In 2007, the Premier of Québec, Jean Charest, established the Commission de consultation sur les pratiques d'accommodement reliées aux différences culturelles in response to conflicts arising from ethnic and cultural differences. The commission's mandate was to take stock of accommodation practices in Québec, analyze the underlying issues, consult the population, and formulate recommendations to the government to ensure that accommodation practices conform to the values of Québec society. First, this thesis shows that two factors, the evolving identity of the francophone majority and the changing countries of origin of immigrants, contributed to unease over the management of diversity and consequently made the establishment of the commission pertinent. Second, drawing on a review of the commission's methodology, conclusions and recommendations, as well as the response of the Ministère de l'Immigration et des Communautés culturelles, I show that, despite a pertinent and completed mandate, the government's response was inadequate. Finally, I show that the diversity-management models endorsed by the commission's report, inclusive secularism and interculturalism, are necessary aspects of managing diversity. However, they derive from the political philosophies of neutralism and pluralism, whose respective goals are force and compromise. I believe that Québec can manage its diversity better and achieve genuine reconciliation by promoting conversation: a patriotic approach to the management of diversity.
Abstract:
The present research problem is to study existing encryption methods and to develop a new technique that outperforms existing techniques and, at the same time, can be incorporated into the communication channels of fault-tolerant hard real-time systems alongside existing error-checking/error-correcting codes, so that attempts at eavesdropping can be defeated. Many encryption methods are available today, each with its own merits and demerits. Similarly, many cryptanalysis techniques used by adversaries are also available.
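One practical consequence of combining encryption with channel error-checking/error-correcting codes is ordering: encrypt first, then append the channel code, so the receiver can detect corruption before decrypting. The toy Python sketch below illustrates that pipeline; the XOR keystream cipher and single-byte XOR checksum are deliberately simple placeholders, not the technique developed in the thesis.

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    """Toy keystream (placeholder cipher): repeated SHA-256 of
    key||counter. Not secure; illustration only."""
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(key: bytes, data: bytes) -> bytes:
    """XOR stream cipher; applying it twice recovers the plaintext."""
    return bytes(d ^ k for d, k in zip(data, keystream(key, len(data))))

def add_check(frame: bytes) -> bytes:
    """Append a one-byte XOR checksum, standing in for the channel's
    error-checking code; applied AFTER encryption so the receiver can
    validate the frame before decrypting."""
    parity = 0
    for b in frame:
        parity ^= b
    return frame + bytes([parity])

def verify_check(frame: bytes):
    body = frame[:-1]
    return add_check(body)[-1] == frame[-1], body

if __name__ == "__main__":
    key = b"secret"
    frame = add_check(encrypt(key, b"sensor reading 42"))
    ok, body = verify_check(frame)   # channel-side integrity check
    assert ok
    print(encrypt(key, body))        # -> b'sensor reading 42'
```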
Abstract:
The hazards associated with major accident hazard (MAH) industries are fire, explosion and toxic gas releases. Of these, toxic gas release is the worst, as it has the potential to cause extensive fatalities. Qualitative and quantitative hazard analyses are essential for the identification and quantification of the hazards associated with chemical industries. This research work presents the results of a consequence analysis carried out to assess the damage potential of the hazardous material storages in an industrial area of central Kerala, India. A survey carried out in the major accident hazard (MAH) units in the industrial belt revealed that the major hazardous chemicals stored by the various industrial units are ammonia, chlorine, benzene, naphtha, cyclohexane, cyclohexanone and LPG. The damage potential of the above chemicals is assessed using consequence modelling. Modelling of pool fires for naphtha, cyclohexane, cyclohexanone, benzene and ammonia is carried out using the TNO model. Vapor cloud explosion (VCE) modelling of LPG, cyclohexane and benzene is carried out using the TNT equivalent model. Boiling liquid expanding vapor explosion (BLEVE) modelling of LPG is also carried out. Dispersion modelling of toxic chemicals such as chlorine, ammonia and benzene is carried out using the ALOHA air quality model. Threat zones for the different hazardous storages are estimated based on the consequence modelling. The distance covered by the threat zone was found to be maximum for a chlorine release from a chlor-alkali industry located in the area. The results of the consequence modelling are useful for the estimation of individual and societal risk in the industrial area.
Vulnerability assessment is carried out using probit functions for toxic, thermal and pressure loads. Individual and societal risks are also estimated at different locations. Mapping of threat zones due to the different incident outcome cases from the various MAH industries is done with the help of ArcGIS.
Fault tree analysis (FTA) is an established technique for hazard evaluation. It has the advantage of being both qualitative and quantitative, if the probabilities and frequencies of the basic events are known. However, it is often difficult to estimate precisely the failure probability of the components, due to insufficient data or the vague characteristics of the basic events. It has been reported that the availability of failure probability data pertaining to local conditions is surprisingly limited in India. This thesis outlines the generation of failure probability values for the basic events that lead to the release of chlorine from the storage and filling facility of a major chlor-alkali industry located in the area, using expert elicitation and proven fuzzy logic. Sensitivity analysis has been done to evaluate the percentage contribution of each basic event that could lead to chlorine release. Two-dimensional fuzzy fault tree analysis (TDFFTA) has been proposed for balancing the hesitation factor involved in expert elicitation.
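For reference, quantitative FTA propagates basic-event probabilities through AND gates (the product of the input probabilities) and OR gates (one minus the product of the complements); with fuzzy probabilities, the same arithmetic is applied componentwise to the fuzzy number's parameters. The minimal Python sketch below uses triangular fuzzy numbers and invented event values purely for illustration; the thesis's TDFFTA treatment of the hesitation factor is beyond this simple example.

```python
from math import prod

# A triangular fuzzy probability is a (low, mode, high) triple, a common
# way to encode an expert-elicited, uncertain basic-event probability.

def and_gate(*events):
    """Output fails only if all inputs fail: componentwise product
    (the standard triangular approximation)."""
    return tuple(prod(e[i] for e in events) for i in range(3))

def or_gate(*events):
    """Output fails if any input fails: 1 - prod(1 - p), componentwise."""
    return tuple(1 - prod(1 - e[i] for e in events) for i in range(3))

if __name__ == "__main__":
    valve_fail   = (1e-3, 2e-3, 4e-3)   # illustrative basic events,
    sensor_fail  = (5e-4, 1e-3, 2e-3)   # not data from the thesis
    operator_err = (1e-2, 2e-2, 5e-2)
    # Release occurs if the valve fails AND (sensor fails OR operator errs).
    top = and_gate(valve_fail, or_gate(sensor_fail, operator_err))
    print("top event (low, mode, high):", top)
```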