990 results for Normal Accident Theory


Relevance:

30.00%

Publisher:

Abstract:

A new approach is developed to analyze the thermodynamic properties of a sub-critical fluid adsorbed in a slit pore of activated carbon. The approach is based on a representation that an adsorbed fluid forms an ordered structure close to a smoothed solid surface. This ordered structure is modelled as a collection of parallel molecular layers. Such a structure allows us to express the Helmholtz free energy of a molecular layer as the sum of the intrinsic Helmholtz free energy specific to that layer and the potential energy of interaction of that layer with all other layers and the solid surface. The intrinsic Helmholtz free energy of a molecular layer is a function (at given temperature) of its two-dimensional density, and it can be readily obtained from bulk-phase properties, while the interlayer interaction energy is determined by using the 10-4 Lennard-Jones potential. The positions of all layers close to the graphite surface or in a slit pore are considered to correspond to the minimum of the potential energy of the system. This model has led to accurate predictions of nitrogen and argon adsorption on carbon black at their normal boiling points. In the case of adsorption in slit pores, local isotherms are determined from the minimization of the grand potential. The model provides a reasonable description of the 0-1 monolayer transition, phase transitions and packing effects. The adsorption of nitrogen at 77.35 K and argon at 87.29 K on activated carbons is analyzed to illustrate the potential of this theory, and the derived pore-size distribution compares favourably with that obtained by density functional theory (DFT). The model is less time-consuming than methods such as DFT and Monte Carlo simulation, and, most importantly, it can be readily extended to the adsorption of mixtures and to capillary condensation phenomena.
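As an illustration of the interlayer interaction used here, below is a minimal sketch of the standard 10-4 Lennard-Jones potential between a single molecule and a structureless solid plane. The functional form is the conventional one; the reduced units and the surface-density parameter are illustrative choices, not values taken from the paper.

```python
import math

def lj_10_4(z, epsilon, sigma, rho_s):
    """Standard 10-4 Lennard-Jones potential for a molecule at distance z
    from a structureless solid plane with surface atom density rho_s."""
    sz = sigma / z
    return 2.0 * math.pi * rho_s * epsilon * sigma**2 * (0.4 * sz**10 - sz**4)

# Layer positions are taken at potential-energy minima; for this
# single-plane form the minimum sits at z = sigma, which a coarse
# scan in reduced units (epsilon = sigma = rho_s = 1) recovers.
z_values = [0.80 + 0.01 * k for k in range(120)]
z_min = min(z_values, key=lambda z: lj_10_4(z, 1.0, 1.0, 1.0))
```

The same idea, summed over layers and both pore walls, gives the layer-layer and layer-wall energy terms described in the abstract.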

Relevance:

30.00%

Publisher:

Abstract:

Objective: To validate the unidimensionality of the Action Research Arm Test (ARAT) using Mokken analysis and to examine whether scores of the ARAT can be transformed into interval scores using Rasch analysis. Subjects and methods: A total of 351 patients with stroke were recruited from 5 rehabilitation departments located in 4 regions of Taiwan. The 19-item ARAT was administered to all the subjects by a physical therapist. The data were analysed using item response theory by non-parametric Mokken analysis followed by Rasch analysis. Results: The results supported a unidimensional scale of the 19-item ARAT by Mokken analysis, with the scalability coefficient H = 0.95. Except for the item "pinch ball bearing 3rd finger and thumb", the remaining 18 items had a consistently hierarchical order along the continuum of upper extremity function. In contrast, the Rasch analysis, with a stepwise deletion of misfit items, showed that only 4 items ("grasp ball", "grasp block 5 cm³", "grasp block 2.5 cm³", and "grip tube 1 cm³") fit the Rasch rating scale model's expectations. Conclusion: Our findings indicated that the 19-item ARAT constitutes a unidimensional construct measuring upper extremity function in stroke patients. However, the results did not support the premise that the raw sum scores of the ARAT can be transformed into interval Rasch scores. Thus, the raw sum scores of the ARAT can provide information only about the ordering of patients on their upper extremity functional abilities, not each patient's exact level of functioning.
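For readers unfamiliar with the model being fitted, here is a minimal sketch of the dichotomous Rasch model; the study itself fits the polytomous rating scale variant, so this is a simplification. The probability of success depends only on the difference between person ability and item difficulty, both on a logit (interval) scale, which is what would license the raw-to-interval transformation tested above.

```python
import math

def rasch_prob(theta, b):
    """Dichotomous Rasch model: probability that a person with ability
    theta passes an item of difficulty b (both in logits)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def expected_raw_score(theta, difficulties):
    """Expected raw sum score over a set of items. Under the Rasch model
    this mapping from theta (interval) to raw score is monotone but
    non-linear, which is why raw sums are only ordinal unless the model fits."""
    return sum(rasch_prob(theta, b) for b in difficulties)
```

When items misfit the model, as most ARAT items did here, only the ordinal interpretation of raw sums survives.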

Relevance:

30.00%

Publisher:

Abstract:

This thesis has two aims. First, it sets out to develop an alternative methodology for the investigation of risk homeostasis theory (RHT). It is argued that the current methodologies of the pseudo-experimental design and post hoc analysis of road-traffic accident data both have their limitations, and that the newer 'game' type simulation exercises are also, but for different reasons, incapable of testing RHT predictions. The alternative methodology described here is based on the simulation of physical risk with intrinsic reward rather than a 'points pay-off'. The second aim of the thesis is to examine a number of predictions made by RHT through the use of this alternative methodology. Since the pseudo-experimental design and post hoc analysis of road-traffic data are both ill-suited to the investigation of that part of RHT which deals with the role of utility in determining risk-taking behaviour in response to a change in environmental risk, and since the concept of utility is critical to RHT, the methodology reported here is applied to the specific investigation of utility. Attention too is given to the question of which behavioural pathways carry the homeostasis effect, and whether those pathways are 'local' to the nature of the change in environmental risk. It is suggested that investigating RHT through this new methodology holds a number of advantages and should be developed further in an attempt to answer the RHT question. It is suggested too that the methodology allows RHT to be seen in a psychological context, rather than the statistical context that has so far characterised its investigation. The experimental findings reported here are in support of hypotheses derived from RHT and would therefore seem to argue for the importance of the individual and collective target level of risk, as opposed to the level of environmental risk, as the major determinant of accident loss.

Relevance:

30.00%

Publisher:

Abstract:

Given evidence of effects of mobile phone use on driving, and also legislation, many careful drivers refrain from answering their phones when driving. However, the distracting influence of a call on driving, even in the context of not answering, has not been examined. Furthermore, given that not answering may be contrary to an individual's normal habits, this study examined whether distraction caused by the ignored call varies according to normal intention to answer whilst driving. That is, determining whether the effect is more than a simple matter of noise distraction. Participants were 27 young drivers (18-29 years), all regular mobile users. A Theory of Planned Behaviour questionnaire examined predictors of intention to refrain from answering calls whilst driving. Participants provided their mobile phone number and were instructed not to answer their phone if it were to ring during a driving simulation. The simulation scenario had seven hazards (e.g. car pulling out, pedestrian crossing), with three being immediately preceded by a call. Infractions (e.g. pedestrian collisions, vehicle collisions, speed exceedances) were significantly greater when distracted by call tones than with no distraction. Lower intention to ignore calls whilst driving correlated with a larger effect of distraction, as did feeling unable to control whether one answered whilst driving (Perceived Behavioural Control). The study suggests that even an ignored call can cause significantly increased infractions in simulator driving, with pedestrian collisions and speed exceedances being striking examples. Results are discussed in relation to the cognitive demands of inhibiting normal behaviour and to drivers being advised to switch phones off whilst driving.

Relevance:

30.00%

Publisher:

Abstract:

We examine similarities and differences between high-power parabolic pulse generation in an active medium and in tapered fiber with decreasing normal dispersion. Using a realistic tapered fiber design, we demonstrate the possibility of parabolic pulse generation without an external pump and determine the limitations of this approach. © 2007 Optical Society of America.

Relevance:

30.00%

Publisher:

Abstract:

In the specific area of software engineering (SE) for self-adaptive systems (SASs) there is a growing research awareness about the synergy between SE and artificial intelligence (AI). However, just a few significant results have been published so far. In this paper, we propose a novel and formal Bayesian definition of surprise as the basis for quantitative analysis to measure degrees of uncertainty and deviations of self-adaptive systems from normal behavior. Surprise measures how observed data affect the models or assumptions of the world during runtime. The key idea is that a "surprising" event can be defined as one that causes a large divergence between the belief distributions prior to and posterior to the event occurring. In such a case the system may decide either to adapt accordingly or to flag that an abnormal situation is happening. In this paper, we discuss possible applications of the Bayesian theory of surprise for the case of self-adaptive systems using Bayesian dynamic decision networks. Copyright © 2014 ACM.
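A minimal sketch of the surprise measure described above, assuming a discrete hypothesis space; measuring surprise as the KL divergence from prior to posterior is one common formalization of "divergence between belief distributions", not necessarily the paper's exact definition.

```python
import math

def bayes_update(prior, likelihood):
    """Posterior belief over hypotheses, given the likelihood each
    hypothesis assigns to the observed event."""
    unnorm = [p * l for p, l in zip(prior, likelihood)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

def surprise(prior, posterior):
    """Bayesian surprise as KL(posterior || prior), in nats: large when
    the observation forces a big shift in belief."""
    return sum(q * math.log(q / p) for p, q in zip(prior, posterior) if q > 0)

prior = [0.5, 0.5]
# An observation that discriminates strongly between hypotheses is surprising;
# one equally likely under both leaves the belief, and the surprise, at zero.
s_informative = surprise(prior, bayes_update(prior, [0.9, 0.1]))
s_uninformative = surprise(prior, bayes_update(prior, [0.5, 0.5]))
```

A self-adaptive system would compare such a surprise value against a threshold at runtime to decide whether to adapt or to flag an abnormal situation.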

Relevance:

30.00%

Publisher:

Abstract:

2010 Mathematics Subject Classification: 53A07, 53A35, 53A10.

Relevance:

30.00%

Publisher:

Abstract:

In a superconductor, pair occupancy probabilities are doubly defined, with conflicting values, when normal and umklapp scattering coexist with the same destination momentum. To resolve this issue a new pairing scheme is introduced to assert normal–umklapp frustration under such circumstances. Superconductivity then arises solely from residual umklapp scattering to destination momenta not reached by normal scattering. Consequent Tc calculations from first principles for niobium, tantalum, lead and aluminum turn out to be accurate within a few percent. A new perspective is revealed to support Matthias' rule. New light is also shed on questions relevant to the future study of metallic hydrogen.

Relevance:

30.00%

Publisher:

Abstract:

Preschool-aged children (≤ 5 years) are at greater risk of sustaining a traumatic brain injury (TBI) than older children, and 90% of these TBIs are of mild severity (mTBI). Numerous studies published over the past two decades show that pediatric mTBI can lead to cognitive, behavioural and psychiatric difficulties in the acute phase which, in some children, can persist over the long term. There is a flourishing literature on the impact of mTBI on social functioning and on social cognition (the cognitive processes underlying socialization) in school-aged children and adolescents. However, only two studies have examined the impact of mTBI at preschool age on social development, and no study has addressed the socio-cognitive repercussions of early mTBI (at preschool age). The aim of the present thesis was therefore to study the consequences of mTBI at a young age on social cognition. To do so, we examined an aspect of social cognition that develops rapidly at this age, namely theory of mind (ToM), which refers to the ability to put oneself in another's place and understand their perspective. The first article aimed to study two subcomponents of ToM, namely false-belief understanding and reasoning about others' desires and emotions, six months post-mTBI. The results indicate that preschool children (18 to 60 months) who sustain an mTBI have significantly poorer ToM 6 months post-mTBI compared with a control group of uninjured children. The second article aimed to clarify the origin of the ToM decrement following early mTBI. This objective stems from a debate currently ongoing in the literature.
Indeed, several scientists hold that an effect can be attributed to the brain injury itself only when children who sustained an mTBI are compared with children who sustained an injury not involving the head (e.g., an orthopedic injury). This argument is based on studies showing that, in general, children who are more likely to sustain an injury, whatever its nature, have pre-existing cognitive characteristics (e.g., impulsivity, attentional difficulties). It is therefore possible that the difficulties we believe attributable to the brain injury were present even before the child sustained an mTBI. In this second study, we therefore compared performance on ToM tasks of children who had sustained an mTBI with that of children in two control groups: uninjured children and peers who had sustained an orthopedic injury. Overall, children who had sustained an mTBI performed significantly worse on the task assessing reasoning about others' desires and emotions, 6 months post-injury, compared with both control groups. This study also aimed to examine the evolution of ToM following mTBI, from 6 to 18 months post-injury. The results show that the poorer performance was maintained 18 months post-mTBI. Finally, the third aim of this study was to investigate whether there is a link between performance on ToM tasks and social skills, as assessed using a parent-completed questionnaire. Interestingly, ToM was associated with social skills only in children who had sustained an mTBI. Taken together, these two studies highlight specific, long-lasting repercussions of early mTBI on ToM, and poorer ToM was associated with poorer social skills.
This thesis demonstrates that mTBI at a young age can hinder socio-cognitive development through its repercussions on ToM. These results support the theory that the young, immature brain is especially vulnerable to brain injury. Finally, these studies highlight the need to study this age group, rather than extrapolating from results obtained with older children, since the developmental stakes are different and potentially have a major influence on the repercussions of a brain injury on socio-cognitive functioning.

Relevance:

30.00%

Publisher:

Abstract:

Verbal fluency is the ability to produce a satisfying sequence of spoken words during a given time interval. The core of verbal fluency lies in the capacity to manage the executive aspects of language. The standard scores of the semantic verbal fluency test are broadly used in the neuropsychological assessment of the elderly, and different analytical methods are likely to extract even more information from the data generated in this test. Graph theory, a mathematical approach to analyzing relations between items, represents a promising tool to understand a variety of neuropsychological states. This study reports a graph analysis of data generated by the semantic verbal fluency test by cognitively healthy elderly (NC), patients with Mild Cognitive Impairment (MCI), subtypes amnestic (aMCI) and amnestic multiple domain (a+mdMCI), and patients with Alzheimer's disease (AD). Sequences of words were represented as a speech graph in which every word corresponded to a node and temporal links between words were represented by directed edges. To characterize the structure of the data we calculated 13 speech graph attributes (SGAs). The individuals were compared when divided into three (NC – MCI – AD) and four (NC – aMCI – a+mdMCI – AD) groups. When the three groups were compared, significant differences were found in the standard measure of correct words produced and in three SGAs: diameter, average shortest path, and network density. SGAs sorted the elderly groups with good specificity and sensitivity. When the four groups were compared, the groups differed significantly in network density, except between the two MCI subtypes and between NC and aMCI. The diameter of the network and the average shortest path were significantly different between NC and AD, and between aMCI and AD. SGAs sorted the elderly into their groups with good specificity and sensitivity, performing better than the standard score of the task.
These findings provide support for a new methodological frame to assess the strength of semantic memory through the verbal fluency task, with potential to amplify the predictive power of this test. Graph analysis is likely to become clinically relevant in neurology and psychiatry, and may be particularly useful for the differential diagnosis of the elderly.
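A minimal sketch of the speech-graph construction and three of the attributes reported above (network density, diameter, average shortest path), in pure Python; the study's exact SGA definitions (e.g., the treatment of immediate word repetitions or of unreachable node pairs) may differ from the choices made here.

```python
from collections import deque

def speech_graph(words):
    """Nodes are the unique words; a directed edge links each word to the
    word spoken immediately after it (immediate self-repetitions dropped)."""
    nodes = sorted(set(words))
    edges = {(a, b) for a, b in zip(words, words[1:]) if a != b}
    return nodes, edges

def graph_attributes(nodes, edges):
    n = len(nodes)
    adj = {v: [] for v in nodes}
    for a, b in edges:
        adj[a].append(b)
    density = len(edges) / (n * (n - 1))  # fraction of possible directed edges
    dists = []                            # BFS shortest paths from every node
    for src in nodes:
        seen, queue = {src: 0}, deque([src])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in seen:
                    seen[v] = seen[u] + 1
                    queue.append(v)
        dists += [d for v, d in seen.items() if v != src]  # reachable pairs only
    return density, max(dists), sum(dists) / len(dists)

nodes, edges = speech_graph("cat dog bird cat fish".split())
density, diameter, avg_path = graph_attributes(nodes, edges)
```

Applied to a fluency transcript, denser graphs with shorter paths reflect richer semantic traversal, which is the intuition behind the group differences reported above.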

Relevance:

30.00%

Publisher:

Abstract:

Value and reasons for action are often cited by rationalists and moral realists as providing a desire-independent foundation for normativity. Those maintaining instead that normativity is dependent upon motivation often deny that anything called "value" or "reasons" exists. According to the interest-relational theory, something has value relative to some perspective of desire just in case it satisfies those desires, and a consideration is a reason for some action just in case it indicates that something of value will be accomplished by that action. Value judgements therefore describe real properties of objects and actions, but have no normative significance independent of desires. It is argued that only the interest-relational theory can account for the practical significance of value and reasons for action. Against the Kantian hypothesis of prescriptive rational norms, I attack the alleged instrumental norm or hypothetical imperative, showing that the normative force for taking the means to our ends is explicable in terms of our desire for the end, and not as a command of reason. This analysis also provides a solution to the puzzle concerning the connection between value judgement and motivation. While it is possible to hold value judgements without motivation, the connection is more than accidental. This is because value judgements are usually but not always made from the perspective of desires that actually motivate the speaker. In the normal case judgement entails motivation. But often we conversationally borrow external perspectives of desire, and subsequent judgements do not entail motivation. This analysis drives a critique of a common practice as a misuse of normative language. The "absolutist" attempts to use and, as philosopher, analyze normative language in such a way as to justify the imposition of certain interests over others.
But these uses and analyses are incoherent - in denying relativity to particular desires they conflict with the actual meaning of these utterances, which is always indexed to some particular set of desires.

Relevance:

30.00%

Publisher:

Abstract:

Abstract We present ideas about creating a next generation Intrusion Detection System (IDS) based on the latest immunological theories. The central challenge with computer security is determining the difference between normal and potentially harmful activity. For half a century, developers have protected their systems by coding rules that identify and block specific events. However, the nature of current and future threats in conjunction with ever larger IT systems urgently requires the development of automated and adaptive defensive tools. A promising solution is emerging in the form of Artificial Immune Systems (AIS): The Human Immune System (HIS) can detect and defend against harmful and previously unseen invaders, so can we not build a similar Intrusion Detection System (IDS) for our computers? Presumably, those systems would then have the same beneficial properties as HIS, like error tolerance, adaptation and self-monitoring. Current AIS have been successful on test systems, but the algorithms rely on self-nonself discrimination, as stipulated in classical immunology. However, immunologists are increasingly finding fault with traditional self-nonself thinking and a new 'Danger Theory' (DT) is emerging. This new theory suggests that the immune system reacts to threats based on the correlation of various (danger) signals and it provides a method of 'grounding' the immune response, i.e. linking it directly to the attacker. Little is currently understood of the precise nature and correlation of these signals and the theory is a topic of hot debate. It is the aim of this research to investigate this correlation and to translate the DT into the realms of computer security, thereby creating AIS that are no longer limited by self-nonself discrimination. It should be noted that we do not intend to defend this controversial theory per se, although as a deliverable this project will add to the body of knowledge in this area.
Rather we are interested in its merits for scaling up AIS applications by overcoming self-nonself discrimination problems.
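To make the danger-signal idea concrete, here is a deliberately simplified sketch that is entirely our own illustration, not anything from the project itself: danger signals are correlated within a time window, and an alert is 'grounded' by attributing it to the source whose signals co-occurred, with no reference to a self-nonself profile.

```python
def correlate_danger(signals, window=5.0, threshold=2):
    """Illustrative danger-signal correlation. `signals` is a list of
    (timestamp, signal_name, source) tuples, e.g. (12.0, 'cpu_spike',
    'procA'). An alert is raised, attributed to a source, when at least
    `threshold` distinct danger signals from that source co-occur
    within `window` seconds."""
    alerts = []
    for t0, _, src in sorted(signals):
        names = {name for t, name, s in signals
                 if s == src and t0 <= t < t0 + window}
        if len(names) >= threshold:
            alerts.append((t0, src, frozenset(names)))
    return alerts

# Two distinct danger signals from procA within 5 s yield one grounded
# alert; the isolated signal from procB is ignored.
alerts = correlate_danger([(0.0, 'cpu_spike', 'procA'),
                           (1.0, 'mem_fault', 'procA'),
                           (20.0, 'cpu_spike', 'procB')])
```

The signal names, sources, and thresholds here are hypothetical placeholders; the research question above is precisely which real signals correlate and how.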

Relevance:

30.00%

Publisher:

Abstract:

Introduction: Oesophageal adenocarcinoma has increased dramatically in incidence over the past three decades with a particularly high burden of disease at the gastro-oesophageal junction. Many cases occur in individuals without known gastro-oesophageal reflux disease and in the absence of Barrett’s oesophagus suggesting that mechanisms other than traditional reflux may be important. Distal squamous mucosa may be prone to acid damage even in the absence of traditional reflux by the mechanism of distal opening of the lower oesophageal sphincter. This is splaying of the distal segment of lower oesophageal sphincter allowing acid ingress without traditional reflux. It has been suggested that the cardiac mucosa at the gastro-oesophageal junction, separating oesophageal squamous mucosa and acid secreting columnar mucosa of the stomach may be an abnormal mucosa arising as a consequence of acid damage. By this theory the cardiac mucosa is metaplastic and akin to ultra-short Barrett’s oesophagus. Obesity is a known risk factor for adenocarcinoma at the gastro-oesophageal junction and its rise has paralleled that of oesophageal cancer. Some of this excess risk undoubtedly operates through stress on the gastro-oesophageal junction and a predisposition to reflux. However we sought to explore the impact of obesity on the gastro-oesophageal junction in healthy volunteers without reflux and in particular to determine the characteristics of the cardiac mucosa and mechanisms of reflux in this group. Methods: 61 healthy volunteers with normal and increased waist circumference were recruited. 15 were found to have a hiatus hernia during the study protocol and were analysed separately. 
Volunteers had comprehensive pathological, physiological and anatomical assessments of the gastro-oesophageal junction including endoscopy with biopsies, MRI scanning before and after a standardised meal, prolonged recording of pH and manometry before and after a meal and screening by fluoroscopy to identify the squamo-columnar junction. In the course of the early manometric assessments a potential error associated with the manometry system recordings was identified. We therefore also sought to document and address this on the benchtop and in vivo. Key Findings: 1. In documenting the behaviour of the manometry system we described an immediate effect of temperature change on the pressure recorded by the sensors ('thermal effect') and an ongoing drift of the recorded pressure with time ('baseline drift'). Thermal effect was well compensated within the standard operation of the system, but baseline drift was not addressed. Applying a linear correction to recorded data substantially reduced the error associated with baseline drift. 2. In asymptomatic healthy volunteers there was lengthening of the cardiac mucosa in association with central obesity and age. Furthermore, the cardiac mucosa in healthy volunteers demonstrated an almost identical immunophenotype to non-IM Barrett's mucosa, which is considered to arise by metaplasia of oesophageal squamous mucosa. These findings support the hypothesis that the cardia is metaplastic in origin. 3. We have demonstrated a plausible mechanism of damage to distal squamous mucosa in association with obesity: in those with a large waist circumference we observed increased ingress of acid within but not across the lower oesophageal sphincter ('intrasphincteric reflux'). 4. The 15 healthy volunteers with a hiatus hernia were compared to 15 controls matched for age, gender and waist circumference.
Those with a hiatus hernia had a longer cardiac mucosa and, although they did not have excess traditional reflux, they had excess distal acid exposure through short-segment acid reflux and intrasphincteric acid reflux. Conclusions: These findings are likely to be relevant to adenocarcinoma of the gastro-oesophageal junction.
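As an illustration of the kind of linear correction described in the first key finding, here is a sketch, our own reconstruction rather than the thesis's actual procedure: if the drift is assumed to grow linearly from zero at the start of the recording to a value estimated at the end (for example from the sensor reading after withdrawal, when the true pressure is known), it can be subtracted from each sample.

```python
def correct_baseline_drift(samples, t_start, t_end, drift_at_end):
    """Subtract a baseline drift assumed to grow linearly from 0 at
    t_start to drift_at_end at t_end. `samples` is a list of
    (time, recorded_pressure) tuples; returns corrected samples."""
    span = t_end - t_start
    return [(t, p - drift_at_end * (t - t_start) / span) for t, p in samples]

# A constant true pressure of 10 mmHg, recorded with 5 mmHg of linear
# drift accumulating over 10 minutes, is restored to a flat trace.
corrected = correct_baseline_drift([(0, 10.0), (5, 12.5), (10, 15.0)],
                                   t_start=0, t_end=10, drift_at_end=5.0)
```

The units and the end-of-study drift estimate here are hypothetical; the point is only that a linear model of the drift makes the correction a single subtraction per sample.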
