938 results for "Nexo causal"


Relevance: 10.00%

Abstract:

Postgraduate project/dissertation presented to Universidade Fernando Pessoa in partial fulfilment of the requirements for the degree of Master in Pharmaceutical Sciences.

Relevance: 10.00%

Abstract:

Many kinds of human states of consciousness have been distinguished, including colourful or anomalous experiences that are felt to have spiritual significance by most people who have them. The neurosciences have isolated brain-state correlates for some of these colourful states of consciousness, thereby strengthening the hypothesis that these experiences are mediated by the brain. This result both challenges metaphysically dualist accounts of human nature and suggests that any adequate causal explanation of colourful experiences would have to make detailed reference to the evolutionary and genetic conditions that give rise to brains capable of such conscious phenomena. This paper quickly surveys types of conscious states and neurological interpretations of them. In order to deal with the question of the significance of such experiences, the paper then attempts to identify evolutionary and genetic constraints on proposals for causal explanations of such experiences. The conclusion is that a properly sensitive evolutionary account of human consciousness supports a rebuttal of the argument that the cognitive content of colourful experiences is pure delusion, but that this evolutionary account also heavily constrains what might be inferred theologically from such experiences. They are not necessarily delusory, therefore, but they are often highly misleading. Their significance must be construed consistently with this conclusion.

Relevance: 10.00%

Abstract:

Predictability -- the ability to foretell that an implementation will not violate a set of specified reliability and timeliness requirements -- is a crucial, highly desirable property of responsive embedded systems. This paper overviews a development methodology for responsive systems, which enhances predictability by eliminating potential hazards resulting from physically unsound specifications. The backbone of our methodology is the Time-constrained Reactive Automaton (TRA) formalism, which adopts a fundamental notion of space and time that restricts expressiveness in a way that allows the specification of only reactive, spontaneous, and causal computation. Using the TRA model, unrealistic systems -- possessing properties such as clairvoyance, caprice, infinite capacity, or perfect timing -- cannot even be specified. We argue that this "ounce of prevention" at the specification level is likely to spare a lot of time and energy in the development cycle of responsive systems, not to mention the elimination of potential hazards that would otherwise have gone unnoticed. The TRA model is presented to system developers through the CLEOPATRA programming language. CLEOPATRA features a C-like imperative syntax for the description of computation, which makes it easier to incorporate in applications already using C. It is event-driven, and thus appropriate for embedded process control applications. It is object-oriented and compositional, thus advocating modularity and reusability. CLEOPATRA is semantically sound; its objects can be transformed, mechanically and unambiguously, into formal TRA automata for verification purposes, which can be pursued using model-checking or theorem-proving techniques. Since 1989, an ancestor of CLEOPATRA has been in use as a specification and simulation language for embedded time-critical robotic processes.

Relevance: 10.00%

Abstract:

Predictability -- the ability to foretell that an implementation will not violate a set of specified reliability and timeliness requirements -- is a crucial, highly desirable property of responsive embedded systems. This paper overviews a development methodology for responsive systems, which enhances predictability by eliminating potential hazards resulting from physically unsound specifications. The backbone of our methodology is a formalism that restricts expressiveness in a way that allows the specification of only reactive, spontaneous, and causal computation. Unrealistic systems -- possessing properties such as clairvoyance, caprice, infinite capacity, or perfect timing -- cannot even be specified. We argue that this "ounce of prevention" at the specification level is likely to spare a lot of time and energy in the development cycle of responsive systems, not to mention the elimination of potential hazards that would otherwise have gone unnoticed.

Relevance: 10.00%

Abstract:

Predictability -- the ability to foretell that an implementation will not violate a set of specified reliability and timeliness requirements -- is a crucial, highly desirable property of responsive embedded systems. This paper overviews a development methodology for responsive systems, which enhances predictability by eliminating potential hazards resulting from physically unsound specifications. The backbone of our methodology is the Time-constrained Reactive Automaton (TRA) formalism, which adopts a fundamental notion of space and time that restricts expressiveness in a way that allows the specification of only reactive, spontaneous, and causal computation. Using the TRA model, unrealistic systems -- possessing properties such as clairvoyance, caprice, infinite capacity, or perfect timing -- cannot even be specified. We argue that this "ounce of prevention" at the specification level is likely to spare a lot of time and energy in the development cycle of responsive systems, not to mention the elimination of potential hazards that would otherwise have gone unnoticed. The TRA model is presented to system developers through the Cleopatra programming language. Cleopatra features a C-like imperative syntax for the description of computation, which makes it easier to incorporate in applications already using C. It is event-driven, and thus appropriate for embedded process control applications. It is object-oriented and compositional, thus advocating modularity and reusability. Cleopatra is semantically sound; its objects can be transformed, mechanically and unambiguously, into formal TRA automata for verification purposes, which can be pursued using model-checking or theorem-proving techniques. Since 1989, an ancestor of Cleopatra has been in use as a specification and simulation language for embedded time-critical robotic processes.
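The causality restriction at the heart of this abstract can be illustrated with a small sketch (a Python toy, not the TRA formalism or Cleopatra itself; all names here are invented): reactions fire only on already-delivered events, and scheduling into the past is rejected by construction, so "clairvoyant" behaviour cannot even be expressed.

```python
import heapq

class CausalAutomaton:
    """Toy event-driven automaton: reactions are triggered only by events
    that have already been delivered, so the model admits only causal
    computation (illustrative sketch, not the TRA model)."""
    def __init__(self):
        self.now = 0.0
        self.queue = []      # (timestamp, event) min-heap
        self.handlers = {}   # event name -> reaction
        self.log = []

    def on(self, event, reaction):
        self.handlers[event] = reaction

    def schedule(self, delay, event):
        # reactions may only schedule events into the future: delay >= 0
        assert delay >= 0, "causality violation: cannot schedule into the past"
        heapq.heappush(self.queue, (self.now + delay, event))

    def run(self, until=10.0):
        while self.queue and self.queue[0][0] <= until:
            self.now, event = heapq.heappop(self.queue)
            self.log.append((self.now, event))
            if event in self.handlers:
                self.handlers[event](self)

a = CausalAutomaton()
a.on("sensor", lambda a: a.schedule(1.0, "actuate"))  # react 1 time unit later
a.schedule(0.0, "sensor")
a.run()
print(a.log)  # [(0.0, 'sensor'), (1.0, 'actuate')]
```

The restriction is enforced structurally: a handler receives only the automaton's past (its log and current time), never the queue of future events.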

Relevance: 10.00%

Abstract:

Recent measurements of local-area and wide-area traffic have shown that network traffic exhibits variability at a wide range of scales -- self-similarity. In this paper, we examine a mechanism that gives rise to self-similar network traffic and present some of its performance implications. The mechanism we study is the transfer of files or messages whose size is drawn from a heavy-tailed distribution. We examine its effects through detailed transport-level simulations of multiple TCP streams in an internetwork. First, we show that in a "realistic" client/server network environment (i.e., one with bounded resources and coupling among traffic sources competing for resources), the degree to which file sizes are heavy-tailed can directly determine the degree of traffic self-similarity at the link level. We show that this causal relationship is not significantly affected by changes in network resources (bottleneck bandwidth and buffer capacity), network topology, the influence of cross-traffic, or the distribution of interarrival times. Second, we show that properties of the transport layer play an important role in preserving and modulating this relationship. In particular, the reliable transmission and flow control mechanisms of TCP (Reno, Tahoe, or Vegas) serve to maintain the long-range dependency structure induced by heavy-tailed file size distributions. In contrast, if a non-flow-controlled and unreliable (UDP-based) transport protocol is used, the resulting traffic shows few self-similar characteristics: although still bursty at short time scales, it has little long-range dependence. If flow-controlled, unreliable transport is employed, the degree of traffic self-similarity is positively correlated with the degree of throttling at the source. Third, in exploring the relationship between file sizes, transport protocols, and self-similarity, we are also able to show some of the performance implications of self-similarity.
We present data on the relationship between traffic self-similarity and network performance as captured by performance measures including packet loss rate, retransmission rate, and queueing delay. Increased self-similarity, as expected, results in degradation of performance. Queueing delay, in particular, exhibits a drastic increase with increasing self-similarity. Throughput-related measures such as packet loss and retransmission rate, however, increase only gradually with increasing traffic self-similarity as long as a reliable, flow-controlled transport protocol is used.
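The mechanism described above (heavy-tailed transfer sizes inducing burstiness that persists across time scales) can be sketched with a toy ON/OFF superposition. This is an illustration only: the source counts, distributions, and parameters below are invented, not the authors' simulation setup.

```python
import random

random.seed(1)

def pareto_size(alpha=1.2, xm=1.0):
    """Heavy-tailed file size via inverse-transform sampling of a Pareto law."""
    return xm / (random.random() ** (1.0 / alpha))

def exp_size(mean=5.0):
    """Light-tailed file size, for comparison."""
    return random.expovariate(1.0 / mean)

def aggregate_traffic(size_draw, n_sources=50, horizon=10_000):
    """Superpose ON/OFF sources: each source transfers a file (ON, one unit
    of work per tick), then idles (OFF). Returns per-tick aggregate load."""
    load = [0] * horizon
    for _ in range(n_sources):
        t = 0
        while t < horizon:
            on = int(size_draw()) + 1                     # ON period ~ file size
            for k in range(t, min(t + on, horizon)):
                load[k] += 1
            t += on + int(random.expovariate(0.1)) + 1    # OFF period
    return load

def variance_at_scale(load, m):
    """Variance of the series after averaging over blocks of m ticks."""
    blocks = [sum(load[i:i + m]) / m for i in range(0, len(load) - m + 1, m)]
    mean = sum(blocks) / len(blocks)
    return sum((b - mean) ** 2 for b in blocks) / len(blocks)

heavy = aggregate_traffic(pareto_size)
light = aggregate_traffic(exp_size)

# For self-similar traffic, block variance decays more slowly than 1/m
# as the aggregation scale m grows.
for m in (1, 10, 100):
    print(m, variance_at_scale(heavy, m), variance_at_scale(light, m))
```

Plotting log variance against log m for the two series gives the usual variance-time diagnostic: a slope shallower than -1 for the heavy-tailed case indicates long-range dependence.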

Relevance: 10.00%

Abstract:

An aim of proactive risk management strategies is the timely identification of safety-related risks. One way to achieve this is by deploying early warning systems. Early warning systems aim to provide useful information, in a timely manner, on the presence of potential threats to the system, the level of vulnerability of a system, or both. This information can then be used to take proactive safety measures. The United Nations has recommended that any early warning system have four essential elements: risk knowledge, a monitoring and warning service, dissemination and communication, and a response capability. This research deals with the risk knowledge element of an early warning system. The risk knowledge element of an early warning system contains models of possible accident scenarios. These accident scenarios are created by using hazard analysis techniques, which are categorised as traditional and contemporary. Traditional hazard analysis techniques assume that accidents occur as the result of a sequence of events, whereas contemporary hazard analysis techniques assume that safety is an emergent property of complex systems. The problem is that no software editor is available that analysts can use both to create models of accident scenarios based on contemporary hazard analysis techniques and to generate computer code representing those models. This research aims to enhance the process of generating computer code from graphical models that associate early warning signs and causal factors with a hazard, based on contemporary hazard analysis techniques. For this purpose, the thesis investigates the use of Domain Specific Modeling (DSM) technologies.
The contribution of this thesis is the design and development of a set of three graphical Domain Specific Modeling Languages (DSMLs) that, when combined, provide all of the constructs necessary for safety experts and practitioners to conduct hazard and early warning analysis based on a contemporary hazard analysis approach. The languages represent the elements and relations necessary to define accident scenarios and their associated early warning signs. The three DSMLs were incorporated into a prototype software editor that enables safety scientists and practitioners to create and edit hazard and early warning analysis models in a usable manner and, as a result, to generate executable code automatically. This research shows that DSM technologies can be used to develop a set of three DSMLs that allow users to conduct hazard and early warning analysis in a more usable manner. Furthermore, the three DSMLs and their dedicated editor, which are presented in this thesis, may provide a significant enhancement to the process of creating the risk knowledge element of computer-based early warning systems.
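A rough flavour of the model-to-code idea can be given in a few lines: a declarative model linking early warning signs to a hazard is "compiled" into an executable check. All names, thresholds, and fields below are hypothetical illustrations, not the thesis's DSMLs or editor.

```python
# Hypothetical mini-model: early warning signs and causal factors for one hazard.
model = {
    "hazard": "tank_overpressure",
    "causal_factors": ["relief_valve_stuck", "controller_fault"],
    "early_warnings": {
        "pressure_rising_fast": lambda s: s["dp_dt"] > 5.0,
        "valve_response_slow":  lambda s: s["valve_ms"] > 200,
    },
}

def generate_monitor(model):
    """'Code generation' step: turn the declarative model into an executable
    early-warning check, mirroring the editor's model-to-code idea."""
    def monitor(sensor_state):
        fired = [name for name, predicate in model["early_warnings"].items()
                 if predicate(sensor_state)]
        return {"hazard": model["hazard"], "warnings": fired}
    return monitor

check = generate_monitor(model)
print(check({"dp_dt": 7.2, "valve_ms": 150}))
# {'hazard': 'tank_overpressure', 'warnings': ['pressure_rising_fast']}
```

In the thesis's setting the models are graphical rather than dictionaries, but the pipeline shape is the same: analysts edit a model, and executable monitoring code is derived from it mechanically.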

Relevance: 10.00%

Abstract:

This research study investigates the image of mathematics held by 5th-year post-primary students in Ireland. For this study, "image of mathematics" is conceptualized as a mental representation or view of mathematics, presumably constructed as a result of past experiences, mediated through school, parents, peers or society. It is also understood to include attitudes, beliefs, emotions, self-concept and motivation in relation to mathematics. This study explores the image of mathematics held by a sample of 356 5th-year students studying ordinary level mathematics. Students were aged between 15 and 18 years. In addition, this study examines the factors influencing students' images of mathematics and the possible reasons for students choosing not to study higher level mathematics for the Leaving Certificate. The design for this study is chiefly exploratory. A questionnaire survey was created containing both quantitative and qualitative methods to investigate the research interest. The quantitative aspect incorporated eight pre-established scales to examine students' attitudes, beliefs, emotions, self-concept and motivation regarding mathematics. The qualitative element explored students' past experiences of mathematics, their causal attributions for success or failure in mathematics and their influences in mathematics. The quantitative and qualitative data were analysed for all students and also for students grouped by gender, prior achievement, type of post-primary school attended, co-educational status of the post-primary school and the attendance of a Project Maths pilot school. Students' images of mathematics were seen to be strongly indicated by their attitudes (enjoyment and value), beliefs, motivation, self-concept and anxiety, with each of these elements strongly correlated with each other, particularly self-concept and anxiety.
Students' current images of mathematics were found to be influenced by their past experiences of mathematics, by their mathematics teachers, parents and peers, and by their prior mathematical achievement. Gender differences occur for students in their images of mathematics, with males having more positive images of mathematics than females, and this is most noticeable with regard to anxiety about mathematics. Mathematics anxiety was identified as a possible reason for the low number of students continuing with higher level mathematics for the Leaving Certificate. Some students also expressed low mathematical self-concept with regard to higher level mathematics specifically. Students with low prior achievement in mathematics tended to believe that mathematics requires a natural ability which they do not possess. Rote learning was found to be common among many students in the sample. The most positive image of mathematics held by students was the "problem-solving image", with resulting implications for the new Project Maths syllabus in post-primary education. Findings from this research study provide important insights into the image of mathematics held by the sample of Irish post-primary students and make an innovative contribution to mathematics education research. In particular, findings contribute to the current national interest in Ireland in post-primary mathematics education, highlighting issues regarding the low uptake of higher level mathematics for the Leaving Certificate and also making a preliminary comparison between students who took part in the piloting of Project Maths and students who were more recently introduced to the new syllabus. This research study also holds implications for mathematics teachers, parents and the mathematics education community in Ireland, with some suggestions made on improving students' images of mathematics.

Relevance: 10.00%

Abstract:

This dissertation sets out to provide immanent critique and deconstruction of ecological modernisation, or ecomodernism. It does so from a critical social theory approach, in order to correctly address the essential issues at the heart of the environmental crisis that ecomodernism purports to address. This critical approach argues that the solution to the environmental crisis can only be concretely achieved by recognising its root cause as being foremost the issue of material interaction between classes in society, and not simply between society and nature in any structurally meaningful way. Based on a metaphysic of false dualism, ecological modernisation attributes a materiality of exchange value relations to issues of society, while simultaneously offering a non-material ontology to issues of nature. Thus ecomodernism serves asymmetrical relations of power whereby, as a polysemic policy discourse, it serves the material interests of those who have the power to impose abstract interpretations on the materiality of actual phenomena. The research of this dissertation is conducted by the critical evaluation of the empirical data from two exemplary Irish case studies. Discovery of the causal processes of the various public issues in the case studies, and thereafter the revelation of the meaning structures underpinning such causal processes, is a theoretically driven task requiring analysis of those social practices found in the cognitive, cultural and structural constitutions respectively of actors, mediations and systems. Therefore, the immanent critique of the case study paradigms serves as a research strategy for comprehending Ireland's nature-society relations as influenced essentially by a systems (techno-corporatist) ecomodernist discourse.
Moreover, the deconstruction of this systems ideological discourse serves not only to demonstrate how weak ecomodernism practically undermines its declared ecological objectives, but also to indicate how such objectives intervene as systemic contradictions at the cultural heart of Ireland's late modernisation.

Relevance: 10.00%

Abstract:

Poor oxygenation (hypoxia) is a common characteristic of human solid tumours, and is associated with cell survival, metastasis and resistance to radio- and chemotherapies. Hypoxia-induced stabilisation of hypoxia-inducible factor-1α (HIF-1α) leads to changes in expression of various genes associated with growth, vascularisation and metabolism. However, whether HIF-1α plays a causal role in promoting hypoxic resistance to antitumour therapies remains unclear. In this study we used pharmacological and genetic methods to investigate the HIF-1α contribution to radio- and chemoresistance in four cancer cell lines derived from cervical, breast, prostate and melanoma human tumours. Under normoxia or hypoxia (<0.2% or 0.5% oxygen) the cells were exposed to either a standard irradiation dose (6.2 Gy) or a chemotherapeutic drug (cisplatin), and subsequent cell proliferation (after 7 days) was measured in terms of resazurin reduction. Oxygen-dependent radio- and chemosensitivity was evident in all wild-type cells, whereas it was reduced or abolished in HIF-1α (siRNA) knockdown cells. The effects of HIF-1α-modulating drugs (EDHB, CoCl2 and deferoxamine to stabilise it, and R59949 to destabilise it) reflected both HIF-1α-dependent and independent mechanisms. Collectively the data show that HIF-1α played a causal role in our in vitro model of hypoxia-induced radioresistance, whereas its contribution to oxygen-dependent sensitivity to cisplatin was less clear-cut. Although this behaviour is likely to be conditioned by further biological and physical factors operating in vivo, it is consistent with the hypothesis that interventions directed at HIF-1α may improve the clinical effectiveness of tumour treatments.

Relevance: 10.00%

Abstract:

We study the implications of the effectuation concept for socio-technical artifact design as part of the design science research (DSR) process in information systems (IS). Effectuation logic is the opposite of causal logic. Effectuation does not focus on causes to achieve a particular effect, but on the possibilities that can be achieved with extant means and resources. Viewing socio-technical IS DSR through an effectuation lens highlights the possibility to design the future even without set goals. We suggest that effectuation may be a useful perspective for design in dynamic social contexts, leading to a more differentiated view on the instantiation of mid-range artifacts for specific local application contexts. Design science researchers can draw on this paper's conclusions to view their DSR projects through a fresh lens and to reexamine their research design and execution. The paper also offers avenues for future research to develop more concrete application possibilities of effectuation in socio-technical IS DSR and, thus, enrich the discourse.

Relevance: 10.00%

Abstract:

This thesis is an investigation into the US response to the Khmer Rouge regime in Cambodia between 1974 and 1981. It argues that the US experience in the Vietnam War acted as a causal factor in the formulation of its Cambodian policy during the presidencies of Gerald Ford and Jimmy Carter. From taking power in April 1975 to their removal by the Vietnamese in January 1979, the Khmer Rouge initiated a revolution unrivalled in the 20th century for its brutality and for the total eradication of modern society. This thesis demonstrates that the Ford administration viewed Cambodia only as it pertained to their strategy in Vietnam and, following US disengagement from Indochina, all but ignored the atrocities occurring there as they instead pursued informal relations with the Khmer Rouge as a means of punishing the Vietnamese. The Carter administration formulated a foreign policy based on human rights, yet failed to adequately address the genocide that occurred in Cambodia due to its temporal and regional proximity to Vietnam. Instead, this collective reluctance to reengage with the region and the resulting anti-Vietnamese attitude reinforced Brzezinski's broader global strategy that allied the US with China in support of an independent Cambodia to further isolate Hanoi. Thus, this thesis argues that the distorting impact of the Vietnam War, as well as global Cold War calculations, undermined any appreciation of the Cambodian conflict and caused both administrations to pursue policies in Cambodia that ultimately supported the Khmer Rouge regime. This project incorporates declassified material from the Ford and Carter Presidential Libraries, supplemented by material from the National Archives and Library of Congress, and relevant newspapers and periodicals. It demonstrates that the limitations placed upon US foreign policy by the experience in the Vietnam War may be used to reveal unexplored elements in US-Cambodian relations.

Relevance: 10.00%

Abstract:

Objective: To characterize a subpopulation of complicated cases of ovarian hyperstimulation syndrome (OHSS). Method: Descriptive retrospective study. Results: 0.75% of our IVF-ET population suffered from OHSS. Among this group, 33% did not exhibit any recognized risk criteria for OHSS in terms of infertility characteristics and ovarian response to exogenous gonadotrophins. Only severe (ascites) OHSS cases (n = 5) were considered in this study. Previous IVF-ET attempts had been uneventful and, during the complicated cycle, estradiol peak levels and numbers of oocytes retrieved remained below 2,500 pg/mL (conversion factor to SI unit, 3.671) and 10, respectively. In all cases, the luteal phase was supplemented by hCG and all patients became pregnant. A threshold level of exogenous and/or endogenous hCG seems to be responsible for the occurrence of OHSS. Conclusion: One-third of the patients developing an ovarian hyperstimulation syndrome after IVF-ET had not previously shown risk criteria. A causal role of exogenous and/or endogenous hCG is suggested.

Relevance: 10.00%

Abstract:

BACKGROUND: Serotonin is a neurotransmitter that has been linked to a wide variety of behaviors including feeding and body-weight regulation, social hierarchies, aggression and suicidality, obsessive compulsive disorder, alcoholism, anxiety, and affective disorders. Full understanding of serotonergic systems in the central nervous system involves genomics, neurochemistry, electrophysiology, and behavior. Though associations have been found between functions at these different levels, in most cases the causal mechanisms are unknown. The scientific issues are daunting but important for human health because of the use of selective serotonin reuptake inhibitors and other pharmacological agents to treat disorders in the serotonergic signaling system. METHODS: We construct a mathematical model of serotonin synthesis, release, and reuptake in a single serotonergic neuron terminal. The model includes the effects of autoreceptors, the transport of tryptophan into the terminal, and the metabolism of serotonin, as well as the dependence of release on the firing rate. The model is based on real physiology determined experimentally and is compared to experimental data. RESULTS: We compare the variations in serotonin and dopamine synthesis due to meals and find that dopamine synthesis is insensitive to the availability of tyrosine but serotonin synthesis is sensitive to the availability of tryptophan. We conduct in silico experiments on the clearance of extracellular serotonin, normally and in the presence of fluoxetine, and compare to experimental data. We study the effects of various polymorphisms in the genes for the serotonin transporter and for tryptophan hydroxylase on synthesis, release, and reuptake. We find that, because of the homeostatic feedback mechanisms of the autoreceptors, the polymorphisms have smaller effects than one expects. We compute the expected steady-state concentrations in serotonin transporter knockout mice and compare to experimental data.
Finally, we study how the properties of the serotonin transporter and the autoreceptors give rise to the time courses of extracellular serotonin in various projection regions after a dose of fluoxetine. CONCLUSIONS: Serotonergic systems must respond robustly to important biological signals, while at the same time maintaining homeostasis in the face of normal biological fluctuations in inputs, expression levels, and firing rates. This is accomplished through the cooperative effect of many different homeostatic mechanisms, including special properties of the serotonin transporters and the serotonin autoreceptors. Many difficult questions remain in order to fully understand how serotonin biochemistry affects serotonin electrophysiology and vice versa, and how both are changed in the presence of selective serotonin reuptake inhibitors. Mathematical models are useful tools for investigating some of these questions.
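The kind of model the abstract describes can be caricatured with a two-variable sketch: vesicular serotonin driven by tryptophan-dependent synthesis, release damped by autoreceptor feedback, and Michaelis-Menten reuptake. All rate constants here are invented for illustration; they are not the authors' equations.

```python
def simulate_terminal(trp=20.0, sert_vmax=1.0, t_end=20.0, dt=0.01):
    """Toy single-terminal model (illustrative only): s = vesicular serotonin,
    e = extracellular serotonin. Synthesis saturates in tryptophan (trp) and
    is end-product inhibited; release is damped by autoreceptor feedback in e;
    reuptake is a Michaelis-Menten transporter (SERT) term."""
    s, e = 1.0, 0.05
    for _ in range(int(t_end / dt)):          # forward-Euler integration
        synthesis = 2.0 * trp / (40.0 + trp) / (1.0 + s)
        release = 0.5 * s / (1.0 + 5.0 * e)
        reuptake = sert_vmax * e / (0.1 + e)
        s += dt * (synthesis - release)
        e += dt * (release - reuptake)
    return s, e

s_ctrl, e_ctrl = simulate_terminal()
s_ssri, e_ssri = simulate_terminal(sert_vmax=0.3)  # crude stand-in for SSRI blockade
print(e_ctrl, e_ssri)  # extracellular serotonin is higher when reuptake is reduced
```

Even in this caricature the autoreceptor term shows the homeostatic effect the abstract emphasises: because release falls as extracellular serotonin rises, blocking reuptake raises e by less than the reduction in transporter capacity alone would suggest.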

Relevance: 10.00%

Abstract:

Genome-wide association studies (GWAS) have now identified at least 2,000 common variants that appear associated with common diseases or related traits (http://www.genome.gov/gwastudies), hundreds of which have been convincingly replicated. It is generally thought that the associated markers reflect the effect of a nearby common (minor allele frequency >0.05) causal site, which is associated with the marker, leading to extensive resequencing efforts to find causal sites. We propose as an alternative explanation that variants much less common than the associated one may create "synthetic associations" by occurring, stochastically, more often in association with one of the alleles at the common site versus the other allele. Although synthetic associations are an obvious theoretical possibility, they have never been systematically explored as a possible explanation for GWAS findings. Here, we use simple computer simulations to show the conditions under which such synthetic associations will arise and how they may be recognized. We show that they are not only possible, but inevitable, and that under simple but reasonable genetic models, they are likely to account for or contribute to many of the recently identified signals reported in genome-wide association studies. We also illustrate the behavior of synthetic associations in real datasets by showing that rare causal mutations responsible for both hearing loss and sickle cell anemia create genome-wide significant synthetic associations, in the latter case extending over a 2.5-Mb interval encompassing scores of "blocks" of associated variants. In conclusion, uncommon or rare genetic variants can easily create synthetic associations that are credited to common variants, and this possibility requires careful consideration in the interpretation and follow-up of GWAS signals.