871 results for Dependent Failures, Interactive Failures, Interactive Coefficients, Reliability, Complex System
Abstract:
The carotid bodies from adult spontaneous insulin-dependent diabetic rats (strain BB/S) were perfusion-fixed at normal arterial blood pressure with 3% phosphate-buffered glutaraldehyde and compared with the organs from control rats (strain BB/Sc) prepared in the same way. Serial 5-µm sections were cut, stained, and analysed with an interactive image analysis system to determine the volumes of the carotid body and its vascular and extravascular compartments. There was no evidence of systemic arterial disease in the carotid stem arteries in either group of animals, and the microvasculature of the organs appeared normal by light microscopy. The volume of the carotid body was unchanged 3 months after the onset of diabetes but was increased at 6 months. The total vascular volume of the organ was unchanged, but the volume of the small vessels (5-12 µm) was increased. In the control group the small vessels comprised 5% of the total volume of the carotid body, or about 44% of the vascular compartment. The percentage of small vessels increased at 3 months in the diabetic group, but had returned to normal at 6 months. The extravascular volume followed the same pattern as the total carotid body volume and so did not change appreciably when expressed as a percentage of the total volume of the organ. The increase in size of the carotid body in diabetic rats is due, therefore, to an augmented extravascular volume. In one diabetic specimen the carotid sinus nerve showed signs of diabetic neuropathy: axonal swelling and intramyelinic oedema. The clinical implications of these results are discussed.
Abstract:
Television and the ways it invites the audience to take part have been changing over the last decade. Today's interaction, or rather participation, comes from multiplatform formats, such as TV spectacles that combine TV and web platforms to create a wider TV experience. Multiplatform phenomena have spread television consumption and traditional coffee-table discussions across several different devices and environments. Television has become part of a bigger puzzle of interconnected devices operating on several platforms instead of just one. This thesis examines Finnish television (2004–2014) through the notion of audience participation, introducing its technical, thematic, and social linkages as three different phases (interactive, participatory, and social) along with their most characteristic features in terms of audience participation. The study also addresses the possible, and subtler, variations that have taken place under the concept of digital television. Firstly, Finnish television history has gone through numerous trials exploring the interactive potential of television formats. Finnish SMS-based iTV had its golden era around 2005, when nearly 50% of television formats were to some extent interactive. Interactive television formats have since vanished due to their negative reputation, and this important part of recent history has largely been neglected in academic work. The dissertation also examines the present situation and the ways television content invites the audience to take part. "TV meets the Internet" is a global expression that characterises digital TV, and the use of the Web combined with television content is also examined, as are the linkages between television and social media. Since television can nowadays be described as multifaceted, the research approaches are also versatile.
The research is based on qualitative content analysis, media observation, and Internet inquiry. The research material also varies: it consists of primary data (taped iTV formats, website material, and social media traces from both Twitter and Facebook) and secondary data (discussion forums, observations from the media, and Internet inquiry data). To sum up the results, the iTV phase represented, through its content, a new possibility for audiences to take part in a TV show in real time, through gameful and textual features. In the participatory phase, the most characteristic feature of TV-related content is that online platforms were used to immerse the audience in additional material and thereby extend the enjoyment of TV watching beyond the actual broadcast. During the social (media) phase, both of these features, real-timeness and extended enjoyment through additional material, are combined: Facebook and Twitter, for example, are used to immerse people in live events in real time via broadcast-related tweets and extra material offered on a Facebook page. This thesis fills a gap in Finnish television research by examining the rapid changes that have taken place in the field within the last ten years. The main result is that the development of Finnish digital television has been much more diverse and subtle than would be anticipated by following only the news, the media, and contemporary discourses on the subject of television. The results will benefit both practitioners and academics by documenting the recent history of Finnish television.
Abstract:
Resilience is the property of a system to remain trustworthy despite changes. Changes of a different nature, whether due to failures of system components or varying operational conditions, significantly increase the complexity of system development. Therefore, advanced development technologies are required to build robust and flexible system architectures capable of adapting to such changes. Moreover, powerful quantitative techniques are needed to assess the impact of these changes on various system characteristics. Architectural flexibility is achieved by embedding into the system design the mechanisms for identifying changes and reacting to them. Hence a resilient system should have both advanced monitoring and error detection capabilities to recognise changes as well as sophisticated reconfiguration mechanisms to adapt to them. The aim of such reconfiguration is to ensure that the system stays operational, i.e., remains capable of achieving its goals. Design, verification, and assessment of the system reconfiguration mechanisms are a challenging and error-prone engineering task. In this thesis, we propose and validate a formal framework for development and assessment of resilient systems. Such a framework provides us with the means to specify and verify complex component interactions, model their cooperative behaviour in achieving system goals, and analyse the chosen reconfiguration strategies. Due to the variety of properties to be analysed, such a framework should have an integrated nature. To ensure the system's functional correctness, it should rely on formal modelling and verification, while, to assess the impact of changes on such properties as performance and reliability, it should be combined with quantitative analysis. To ensure scalability of the proposed framework, we choose Event-B as the basis for reasoning about functional correctness.
Event-B is a state-based formal approach that promotes the correct-by-construction development paradigm and formal verification by theorem proving. Event-B has mature industrial-strength tool support, the Rodin platform. Proof-based verification, together with the reliance on abstraction and decomposition adopted in Event-B, provides designers with powerful support for the development of complex systems. Moreover, top-down system development by refinement allows developers to explicitly express and verify critical system-level properties. Besides ensuring functional correctness, achieving resilience also requires analysing a number of non-functional characteristics, such as reliability and performance. Therefore, in this thesis we also demonstrate how formal development in Event-B can be combined with quantitative analysis. Namely, we experiment with integrating such techniques as probabilistic model checking in PRISM and discrete-event simulation in SimPy with formal development in Event-B. Such an integration allows us to assess how changes and different reconfiguration strategies affect overall system resilience. The approach proposed in this thesis is validated by a number of case studies from areas such as robotics, space, healthcare, and the cloud domain.
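As a toy illustration of the kind of discrete-event analysis mentioned above (a minimal pure-Python sketch, not the thesis's actual SimPy or PRISM models; the function name and parameters are hypothetical): a single server processes jobs FIFO, fails at a known time, and a reconfiguration strategy restores service after a repair delay, so completion times can be compared under different repair delays.

```python
def simulate_outage(arrivals, service_time, failure_at, repair_at):
    """Toy single-server simulation (hypothetical example).

    Jobs arrive at the given times and are served FIFO, each taking
    `service_time`. The server is down during [failure_at, repair_at):
    a job that would start in that window waits until repair_at, and a
    job interrupted by the failure restarts from scratch at repair_at.
    Returns the completion time of each job, in arrival order.
    """
    completions = []
    free = 0  # time at which the server next becomes available
    for arrival in arrivals:
        start = max(arrival, free)
        if failure_at <= start < repair_at:
            start = repair_at  # server down: wait for reconfiguration
        elif start < failure_at < start + service_time:
            start = repair_at  # interrupted mid-service: restart after repair
        end = start + service_time
        completions.append(end)
        free = end
    return completions
```

Shrinking the window `repair_at - failure_at` models a faster reconfiguration strategy; comparing the resulting completion times is the kind of quantitative question the thesis addresses at much larger scale.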
Abstract:
User experience is a crucial element in interactive storytelling, and as such it is important to recognize the different aspects of a positive user experience in an interactive story. Towards that goal, the first half of this thesis goes through the different elements that make up the user experience, with a strong focus on agency. Agency can be understood as the user's ability to affect the story, or the world in which the story is told, through interesting and satisfying choices. The freedoms granted by agency are not completely compatible with traditional storytelling, so we also go through some of the issues of agency-centric design philosophies and explore alternative schools of thought. The core purpose of this thesis is to determine the most important aspects of agency with regard to a positive user experience and to find ways for authors to improve the overall quality of user experience in interactive stories. The latter half of the thesis deals with the research conducted on this matter. This research was carried out by analyzing data from an online survey coupled with data gathered by an interactive storytelling system made specifically for this research (Regicide). The research concentrates on influencing perceived agency and facilitating an illusion of agency in different ways, and on comparing user experiences across these different test environments. The most important findings include the importance of context-controlled and focused agency, of the settings in which the agency takes place, and of ensuring user competency within an interactive storytelling system. Another essential conclusion boils down to communication between the user and the system: the goal of influencing perceived agency should primarily be to ensure that the user is aware of all the theoretical agency they possess.
Abstract:
In much of the previous research in the field of interactive storytelling, the focus has been on creating complete systems and then evaluating their performance based on user experience. Less focus has been placed on finding general solutions to problems that manifest in many different types of interactive storytelling systems. The goal of this thesis was to identify candidate metrics that a system could use to predict player behavior or how players experience the story they are presented with, and to put these metrics to an empirical test. The three metrics used were morality, relationships, and conflict. The game used for user testing of the metrics, Regicide, is an interactive storytelling experience created in conjunction with Eero Itkonen. Data collected through user testing, in the form of internal system data and survey answers, were used to evaluate hypotheses for each metric. Of the three chosen metrics, morality performed the best in this study. Though further research and refinement may be required, the results were promising and point to the conclusion that user responses to questions of morality are a strong predictor of their choices in similar situations later in the course of an interactive story. A similar examination of user relationships with other characters in the story did not produce promising results, but several methodological problems were recognized, and further research with a better-optimized system may yield different results. On the subject of conflict, several aspects proposed by Ware et al. (2012) were evaluated separately; results were inconclusive, with the aspect of directness showing the most promise.
Abstract:
The purpose of this study was to determine the relative contributions of psychopathy and self-monitoring to the prediction of self-presentation tactics (behaviours that individuals use to manipulate their self-image). Psychopathy is composed of two main factors: Factor 1, which includes manipulativeness and shallow affect, and Factor 2, which includes irresponsibility and anti-social behaviours. Self-monitoring is a personality trait that distinguishes between those who adapt their behaviour to fit different social situations (high self-monitors) and those who behave as they feel regardless of social expectations (low self-monitors). It was hypothesized that self-monitoring would moderate the relationship between psychopathy and self-presentation tactics. One hundred and forty-nine university students completed the Self-Monitoring Scale (Snyder, 1974), the Self-Report Psychopathy Scale - Version III (Paulhus et al., in press), the Self-Presentation Tactics scale (Lee, S., et al., 1999), the HEXACO-PI (a measure of the six major factors of personality; Lee, K., & Ashton, 2004), and six scenarios that were created as a supplementary measure of the self-presentation tactics. Results of the hierarchical multiple regression analyses showed that self-monitoring did moderate the relationship between psychopathy and three of the self-presentation tactics: apologies, disclaimers, and exemplification. Further, significant interactions were observed between Factor 1 and self-monitoring on apologies and the defensive tactics subscale, between Factor 2 and self-monitoring on self-handicapping, and between Factor 1 and Factor 2 on exemplification. Contrary to expectations, the main effect of self-monitoring was significant for the prediction of nine tactics, while psychopathy was significant for the prediction of seven tactics.
This indicates that the role of these two personality traits in the explanation of self-presentation tactics tends to be additive in nature rather than interactive. In addition, Factor 2 alone did not account for a significant amount of variance in any of the tactics, while Factor 1 significantly predicted nine tactics. Results are discussed with regard to implications and possible directions for future research.
Abstract:
This study was undertaken in order to determine the effects of playing computer-based text adventure games on the reading comprehension gains of students. Forty-five grade five students from one elementary school were randomly assigned to experimental and control groups, and were tested with regard to ability, achievement, and reading skills. An experimental treatment, consisting of playing computer-based interactive fiction games of the student's choice for fifteen minutes each day over an eight-week period, was administered. A comparison treatment engaged the control group in sustained silent reading of materials of the student's choice for an equal period of time. Following the experimental period all students were post-tested with an alternate form of the pre-test in reading skills, and gain scores were analysed. It was found that there were no significant differences in the gain scores of the experimental and control groups for overall reading comprehension, but the experimental group showed greater gains than the control group in the structural analysis reading sub-skill. Extreme variance in the data made generalization very difficult, but the findings indicated a potential for computer-based interactive fiction as a useful tool for developing reading skills.
Abstract:
There is a great deal of evidence to support the examination of an interactive relationship between the medium and the viewer in the interpretation of mainstream media. The exact nature of this relationship, however, is not well understood. The current study was carried out to assess the variables that may help explain why certain people interpret media, such as music videos, differently than others. Jensen's concept of reception analysis describes the relationship between the medium and the audience, and thus remains a strong focus within this study. Differences in the interpretation of music videos were investigated as a function of Absorption, gender role, screen size, age and viewing experience. Multiple regression analyses uncovered independent predictions of sexuality and violence scores by absorption and experience, as well as an interaction between absorption and screen size in the sexuality rating of the music videos.
Abstract:
Dynamic logic is an extension of modal logic originally intended for reasoning about computer programs. The method of proving correctness properties of a computer program using the well-known Hoare logic can be implemented by utilizing the expressive power of dynamic logic. For a very broad range of languages and applications in program verification, a theorem prover named KIV (Karlsruhe Interactive Verifier) has already been developed, but its high degree of automation and its complexity make it difficult to use for educational purposes. My research work is aimed at the design and implementation of a similar interactive theorem prover with educational use as its main design criterion. As the key purpose of this system is to serve as an educational tool, it is a self-explanatory system that explains every step of creating a derivation, i.e., proving a theorem. This deductive system is implemented in the platform-independent programming language Java. In addition, a very popular combination of a lexical analyzer generator, JFlex, and the parser generator BYacc/J has been used for parsing formulas and programs.
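As a brief illustration of the Hoare-logic connection mentioned above (standard dynamic-logic material, not a construction specific to this thesis), a partial-correctness Hoare triple can be expressed with the box modality of dynamic logic:

```latex
% Partial correctness: if P holds, then after every terminating
% execution of program \alpha, Q holds.
\{P\}\;\alpha\;\{Q\} \quad\Longleftrightarrow\quad P \rightarrow [\alpha]\,Q
```

The dual diamond modality, $\langle\alpha\rangle Q$, asserts that some terminating run of $\alpha$ ends in a state satisfying $Q$, which yields total-correctness-style statements.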
Abstract:
Affiliation: Margaret Cargo : Département de médecine sociale et préventive, Faculté de médecine, Université de Montréal
Abstract:
The E-Inclusion Research Network aims to give all Canadians access to the informational content of audiovisual documents. Theme 3 of the project, Interactive and Adaptable Audiovision, aimed to provide guidelines for film and television producers on the content of audiovision (audio description) texts, and to measure the potential usefulness, for producing audiovision texts, of keywords generated in other contexts.
Abstract:
Economic transformations aimed at creating a single global market, technological progress, and the availability of a skilled workforce in low-wage countries are leading business executives to re-examine the organization and location of their productive capacities so as to increase their flexibility, which many regard as the sole guarantee of an organization's survival (Atkinson, 1987; Patry, 1994; Purcell and Purcell, 1998; Kennedy, 2002; Kallaberg, Reynolds and Marsden, 2003; Berger, 2006). One strategy firms deploy to this end is offshoring (Kennedy, 2002; Amiti and Wei, 2004; Barthélemy, 2004; Trudeau and Martin, 2006; Olsen, 2006). Technology, open markets, and access to new pools of skilled labour make possible a far greater fragmentation of the production chain than before, and each link in that chain becomes the object of an optimal location choice (Hertveldt et al., 2005). Under these conditions, any activity that requires no complex or physical interaction between colleagues, or between an employee and a client, is liable to be transferred to a subcontractor, at home or abroad (Farrell, 2005). Most research on outsourcing and offshoring focuses primarily on employers' motivations for resorting to them (Lauzon-Duguay, 2005) or on the successes and failures of firms that have implemented such a strategy (Logan, Faught and Ganster, 2004). The impacts of such practices on employees, however, have rarely been considered systematically (Benson, 1998; Kessler, Coyle-Shapiro and Purcell, 1999; Logan et al., 2004). Yet the human aspects must be taken seriously, for they can cause these processes to fail or to succeed.
The management of the human factors surrounding the offshoring process appears to play a role in the impact of outsourcing on employees. According to Kessler et al. (1999), how employees perceive the transfer is influenced by three factors: how they were managed by their former employer (context), what their new employer offers them (pull factor), and how they are treated after the transfer (landing). This research seeks to understand the impact of offshoring a firm's activities on the employees transferred to the supplier. More precisely, we want to understand the effects that offshoring from a "source" firm (the one that cedes the activities and the employees) to a "destination" firm (the one that takes over the ceded activities and the workforce) can have on the transferred employees' quality of working life and working conditions. Several questions arise. What is a successful transfer from the employees' point of view? Are working conditions or quality of working life affected? To what extent do human aspects influence the effects of offshoring on employees? How should such a transfer be managed optimally from the new employer's point of view? The analytical model comprises four variables. The first dependent variable (DV1) is the quality of working life of the transferred employees. The second dependent variable (DV2) is their working conditions. The third variable, the independent variable (IV), is the offshoring of activities, which has two dimensions: (1) the offshoring decision and (2) the implementation process.
The fourth variable, the moderating variable (MV), is the human aspects, examined along three dimensions: (1) the context in the "source" firm (context), (2) the attractiveness of the new employer (pull factor), and (3) the reality at the new employer (landing). Three research hypotheses follow from the model. The first two state that offshoring leads to a deterioration of quality of working life (H1) and of working conditions (H2). The third states that the human aspects moderate the impact of offshoring on the transferred employees (H3). The research consists of a case study of a financial institution (the "source" firm) that offshored its technology activities to a firm specializing in information technology (the "destination" firm). Eleven semi-structured interviews were conducted with key actors (transferred employees and managers from both firms). The results indicate that offshoring generally has a negative impact on the transferred employees. This finding, however, does not generalize to all of the indicators of quality of working life and working conditions studied. The results show negative consequences for intrinsic work motivation, organizational commitment, and satisfaction with the relational aspect of work. Offshoring also led to a deterioration of the transferred employees' working conditions with respect to job security, task content and evaluation, occupational health and safety, and working hours. According to the interviewees, however, the most important consequences are without doubt those affecting pay and benefits.
The consequences of offshoring prove positive, by contrast, with respect to professional accomplishment and satisfaction with the technical aspect of the work. For interpersonal trust at work, work organization, professional training, and the physical conditions of employment, the effects do not appear significant according to the interviews. Finally, the results highlight the significant moderating effect of the human aspects on the consequences of offshoring for the transferred employees. The "source" firm tried to soften the impact of the offshoring, but this was not enough. Because the employees were strongly attached to the "source" firm and did not want to leave it for a firm with a different corporate culture that they found unattractive, these dimensions of the human aspects in fact amplified the negative impacts of the offshoring, particularly on the transferred employees' quality of working life. Keywords: (1) offshoring, (2) outsourcing, (3) employee transfer, (4) quality of working life, (5) working conditions, (6) information technology, (7) enterprise, (8) human resource management.