19 results for eventuality
Abstract:
In this thesis we propose a new approach to deduction methods for temporal logic. Our proposal is based on an inductive definition of eventualities that differs from the usual one. On the basis of this non-customary inductive definition of eventualities, we first provide dual systems of tableaux and sequents for Propositional Linear-time Temporal Logic (PLTL). Then, we adapt the deductive approach introduced by means of these dual tableau and sequent systems to the resolution framework and present a clausal temporal resolution method for PLTL. Finally, we use this new clausal temporal resolution method to establish logical foundations for declarative temporal logic programming languages. The key issue in deduction systems for temporal logic is dealing with eventualities and with the hidden invariants that may prevent their fulfillment. Different ways of addressing this issue can be found in the literature on deduction systems for temporal logic. Traditional tableau systems for temporal logic generate an auxiliary graph in a first pass. Then, in a second pass, unsatisfiable nodes are pruned; in particular, the second pass must check whether the eventualities are fulfilled. The one-pass tableau calculus introduced by S. Schwendimann requires additional bookkeeping to detect cyclic branches that contain unfulfilled eventualities. In traditional sequent calculi for temporal logic, the issue of eventualities and hidden invariants is tackled with kinds of inference rules (mainly, invariant-based rules or infinitary rules) that complicate their automation.
A remarkable consequence of using either a two-pass approach based on auxiliary graphs or a one-pass approach that requires additional bookkeeping in the tableau framework, and either invariant-based rules or infinitary rules in the sequent framework, is that temporal logic fails to preserve the classical correspondence between tableaux and sequents. In this thesis, we first provide a one-pass tableau method, TTM, that builds a cyclic tree instead of a graph to decide whether a set of PLTL-formulas is satisfiable. In TTM, tableaux are classical-like. For unsatisfiable sets of formulas, TTM produces tableaux whose leaves contain a formula and its negation. For satisfiable sets of formulas, TTM builds tableaux where each fully expanded open branch characterizes a collection of models for the set of formulas in the root. The tableau method TTM is complete and yields a decision procedure for PLTL. It is directly associated with a one-sided sequent calculus called TTC. Since TTM is free from all the structural rules that hinder the mechanization of deduction, e.g. weakening and contraction, the resulting sequent calculus TTC is also free from such structural rules. In particular, TTC is free of any kind of cut, including invariant-based cut. From the deduction system TTC, we obtain a two-sided sequent calculus GTC that preserves all these freeness properties and is finitary, sound and complete for PLTL. We thereby show that the classical correspondence between tableaux and sequent calculi can be extended to temporal logic. The most fruitful approach in the literature on resolution methods for temporal logic, which began with the seminal paper of M. Fisher, deals with PLTL and requires generating invariants for performing resolution on eventualities. In this thesis, we present a new approach to resolution for PLTL.
The main novelty of our approach is that we do not generate invariants to perform resolution on eventualities. Our method is based on the dual methods of tableaux and sequents for PLTL mentioned above. It involves translation into a clausal normal form that is a direct extension of classical CNF. We first show that any PLTL-formula can be transformed into this clausal normal form. Then, we present our temporal resolution method, called TRS-resolution, which extends classical propositional resolution. Finally, we prove that TRS-resolution is sound and complete. In fact, it terminates for any input formula, deciding its satisfiability, and hence gives rise to a new decision procedure for PLTL. In the field of temporal logic programming, the declarative proposals that provide a completeness result do not allow eventualities, whereas the proposals that follow the imperative future approach either restrict the use of eventualities or deal with them by calculating an upper bound based on the small model property of PLTL. In the latter, when the length of a derivation reaches the upper bound, the derivation is given up and backtracking is used to try another possible derivation. In this thesis we present a declarative propositional temporal logic programming language, called TeDiLog, that combines the temporal and disjunctive paradigms in logic programming. We establish the logical foundations of our proposal by formally defining operational and logical semantics for TeDiLog and by proving their equivalence. Since TeDiLog is, syntactically, a sublanguage of PLTL, its logical semantics is supported by PLTL logical consequence. Its operational semantics is based on TRS-resolution. TeDiLog allows both eventualities and always-formulas to occur in clause heads as well as in clause bodies.
To the best of our knowledge, TeDiLog is the first declarative temporal logic programming language that achieves this degree of expressiveness. Since the tableau method presented in this thesis detects that the fulfillment of an eventuality is prevented by a hidden invariant without checking for it in an extra pass, since our finitary sequent calculi do not include invariant-based rules, and since our resolution method dispenses with invariant generation, we say that our deduction methods are invariant-free.
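As background for the resolution-based part of this abstract: TRS-resolution is described as an extension of classical propositional resolution. The following is a minimal sketch of that classical base procedure only, not of the thesis's temporal extension. Clauses are sets of literals, and a clause set is unsatisfiable exactly when saturation under resolution derives the empty clause.

```python
from itertools import combinations

def resolve(c1, c2):
    """Return all resolvents of two clauses (frozensets of literals).
    A literal is a string; negation is written with a '~' prefix."""
    resolvents = []
    for lit in c1:
        neg = lit[1:] if lit.startswith('~') else '~' + lit
        if neg in c2:
            # Resolve on the complementary pair (lit, neg).
            resolvents.append((c1 - {lit}) | (c2 - {neg}))
    return resolvents

def unsatisfiable(clauses):
    """Saturate a clause set under resolution.
    Returns True iff the empty clause is derived."""
    clauses = set(map(frozenset, clauses))
    while True:
        new = set()
        for c1, c2 in combinations(clauses, 2):
            for r in resolve(c1, c2):
                if not r:
                    return True  # empty clause: contradiction found
                new.add(frozenset(r))
        if new <= clauses:
            return False  # saturated without deriving the empty clause
        clauses |= new
```

For instance, `{p, ~p}` saturates to the empty clause, while `{p v q, ~p}` saturates without one. The thesis's contribution lies precisely in what this sketch omits: handling eventualities without generating invariants.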
Abstract:
The epidemic of HIV/AIDS in the United States is constantly changing and evolving, from patient zero to the estimated 650,000 to 900,000 Americans now infected. The nature and course of HIV changed dramatically with the introduction of antiretrovirals. This discourse examines many different facets of HIV, from the beginning, when there was no treatment for HIV, to the present era of highly active antiretroviral therapy (HAART). Using statistical analysis of clinical data, this paper examines where we were, where we are, and projections as to where the treatment of HIV/AIDS is headed.
Chapter Two describes the datasets used for the analyses. The primary database was collected by me from an outpatient HIV clinic and includes data from 1984 until the present. The second database is the Multicenter AIDS Cohort Study (MACS) public dataset, which covers the period between 1984 and October 1992. Comparisons are made between the two datasets.
Chapter Three discusses where we were. Before the first anti-HIV drugs (called antiretrovirals) were approved, there was no treatment to slow the progression of HIV. The first generation of antiretrovirals, reverse transcriptase inhibitors such as AZT (zidovudine), DDI (didanosine), DDC (zalcitabine), and D4T (stavudine), provided the first treatment for HIV. The first clinical trials showed that these antiretrovirals had a significant impact on increasing patient survival, and that patients on these drugs had increased CD4+ T cell counts. Chapter Three examines the distributions of CD4 T cell counts. The results show that the estimated distributions of CD4 T cell counts are distinctly non-Gaussian; thus distributional assumptions regarding CD4 T cell counts must be taken into account when performing analyses with this marker. The results also show that the estimated CD4 T cell distributions for each disease stage (asymptomatic, symptomatic, and AIDS) are non-Gaussian. Interestingly, the distribution of CD4 T cell counts for the asymptomatic period is significantly below the CD4 T cell distribution for the uninfected population, suggesting that even in patients with no outward symptoms of HIV infection, there exist high levels of immunosuppression.
Chapter Four discusses where we are at present. HIV quickly grew resistant to reverse transcriptase inhibitors, which were given sequentially as mono or dual therapy. As resistance grew, the positive effects of the reverse transcriptase inhibitors on CD4 T cell counts and survival dissipated. As the old era faded, a new era characterized by a new class of drugs and new technology changed the way we treat HIV-infected patients. Viral load assays were able to quantify the levels of HIV RNA in the blood; by quantifying the viral load, one now had a faster, more direct way to test the efficacy of an antiretroviral regimen. Protease inhibitors, which attack a different region of HIV than reverse transcriptase inhibitors, were found, when used in combination with other antiretroviral agents, to dramatically and significantly reduce HIV RNA levels in the blood. Patients also experienced significant increases in CD4 T cell counts. For the first time in the epidemic, there was hope. It was hypothesized that with HAART, viral levels could be kept so low that the immune system, as measured by CD4 T cell counts, would be able to recover; if viral levels could be kept low enough, it would be possible for the immune system to eradicate the virus. The hypothesis of immune reconstitution, that is, bringing CD4 T cell counts up to levels seen in uninfected patients, is tested in Chapter Four. It was found that for these patients there was not enough of a CD4 T cell increase to be consistent with the hypothesis of immune reconstitution.
In Chapter Five, the effectiveness of long-term HAART is analyzed. Survival analysis was conducted on 213 patients on long-term HAART. The primary endpoint was presence of an AIDS defining illness. A high level of clinical failure, or progression to an endpoint, was found.
Chapter Six yields insights into where we are going. New technology such as viral genotypic testing, which looks at the genetic structure of HIV and determines where mutations have occurred, has shown that HIV is capable of producing resistance mutations that confer multiple drug resistance. This section looks at resistance issues and speculates, ceteris paribus, on where the state of HIV is going. It first addresses viral genotype and the correlates of viral load and disease progression. A second analysis looks at patients who have failed their primary attempts at HAART and subsequent salvage therapy. It was found that salvage regimens, efforts to control viral replication through the administration of different combinations of antiretrovirals, were not effective in controlling viral replication in 90 percent of the population. Thus, primary attempts at therapy offer the best chance of viral suppression and delay of disease progression. Documentation of the transmission of drug-resistant virus suggests that the public health crisis of HIV is far from over. Drug-resistant HIV can sustain the epidemic and hamper our efforts to treat HIV infection. The data presented suggest that the decrease in morbidity and mortality due to HIV/AIDS is transient. Deaths due to HIV will increase, and public health officials must prepare for this eventuality unless new treatments become available. These results also underscore the importance of the vaccine effort.
The final chapter looks at the economic issues related to HIV. The direct and indirect costs of treating HIV/AIDS are very high. For the first time in the epidemic, there exists treatment that can actually slow disease progression. The direct costs of HAART are estimated: the direct lifetime cost of treating each HIV-infected patient with HAART is estimated at between $353,000 and $598,000, depending on how long HAART prolongs life. The incremental cost per year of life saved is only $101,000, comparable with the incremental cost per year of life saved by coronary artery bypass surgery.
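The incremental cost-effectiveness arithmetic summarized above is simply incremental cost divided by life-years gained. A minimal sketch follows; the inputs are hypothetical placeholders chosen only to illustrate the order of magnitude quoted in the abstract, not figures taken from the study.

```python
def cost_per_life_year(incremental_cost, years_gained):
    """Incremental cost-effectiveness ratio:
    extra lifetime cost divided by extra years of life."""
    if years_gained <= 0:
        raise ValueError("years_gained must be positive")
    return incremental_cost / years_gained

# Hypothetical illustration only: an incremental lifetime cost of
# $404,000 spread over 4 extra years of life gives $101,000 per
# life-year, the order of magnitude the abstract reports for HAART.
print(cost_per_life_year(404_000, 4))
```

The same ratio is what the abstract compares against coronary artery bypass surgery: a common way to judge whether a costly therapy is in line with other accepted interventions.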
Policy makers need to be aware that although HAART can delay disease progression, it is not a cure and HIV is not over. The results presented here suggest that the decreases in the morbidity and mortality due to HIV are transient. Policymakers need to be prepared for the eventual increase in AIDS incidence and mortality. Costs associated with HIV/AIDS are also projected to increase. The cost savings seen recently have been from the dramatic decreases in the incidence of AIDS defining opportunistic infections. As patients who have been on HAART the longest start to progress to AIDS, policymakers and insurance companies will find that the cost of treating HIV/AIDS will increase.
Abstract:
27 p.
Abstract:
About 20 years ago, a new form of urban work began in Brazil: the motoboy (motorcycle courier) service, which has become common in the delivery of products to customers. In parallel with the growth of this occupation, the number of motorcyclists seriously or fatally injured on Brazilian roads has been rising steadily, which is why motoboys have drawn the attention of public health authorities. Seeking to understand the questions underlying and surrounding this phenomenon, a small but consistent body of academic work on the profession has begun to appear in the country. However, there are still few studies that try to understand the work activity of motoboys, and fewer still that investigate the collective dimensions produced by these professionals (such as the work collective or the professional genre), as well as their effects on the constitution of knowledge, discourses, values, and strategies for coping with the various constraints of the activity, in particular the dimensions of the risk of work accidents. This study was proposed specifically to answer that question. To that end, exploratory research was developed along two methodological lines: on the one hand, a quantitative survey of various aspects of motoboys' work, such as worker profile, work organization, and some effects on the worker; this stage was carried out by administering 189 questionnaires to a sample proportional to the identified worker population on the main road corridors of the municipality of Vitória.
On the other hand, a study was undertaken, based on the principles of Ergology and on different clinical approaches to work, especially the Activity Clinic (Clínica da Atividade) and the Psychodynamics of Work, through which we sought to analyze the work activity in partnership with the workers, identifying the collective dimensions produced through work, as well as the individual and collective knowledge and strategies for dealing with the demands, pressures, contradictions, and contingencies of everyday life. This qualitative stage featured an ethnographic approach to the workers and the use of the techniques of self-confrontation and instructions to the double, both drawn from the Activity Clinic. As a result, we observed numerous forms of knowledge produced and/or shared by the collective, such as evaluating jobs, managing time, planning routes, mobilizing the solidarity network, managing transgressions, and ways of riding, as well as collective strategies for dealing with risk, among which stand out the positive exploitation of risk, strategies based on the power of virility, the capacity for anticipation, and self-protection by signaling the worker's presence while riding on the roads. From this analysis we conclude that a work collective, and more particularly a professional genre, is clearly taking shape. This genre, however, is riddled with numerous contradictions and conflicts that may be preventing the work collective from manifesting itself in its full power.
Abstract:
The main motivation for this work was the assessment of the seismic vulnerability of the primary and secondary schools that make up the school building stock of mainland Portugal. Although until now only the vulnerability of a few schools in particular areas of the country had been studied, this research set a far more ambitious goal, and correspondingly required a far greater effort: the assessment of the entire school stock. The stock comprises around three hundred schools across the national territory, and its rehabilitation project was scheduled to run from 2007 to 2015; in 2011, owing to the economic crisis, the whole project was frozen, with only about one third of the schools rehabilitated to date. That one third is the set of schools we assessed across the country. Schools are public buildings of fundamental importance: for the high concentration of young people they host, for their essential role as learning centres for future generations, for the threat they pose in the event of an earthquake because of the sheer density of users, and because in such a disaster scenario their structural importance exceeds that of most ordinary buildings. Consequently, schools can serve as civil-protection facilities after a seismic catastrophe to support the affected surrounding populations.
For each of the schools kindly provided by Parque Escolar, E.P.E., an exhaustive and individual study was carried out. Each analysis was developed with a simplified methodology, always performed individually and never applied in series, a factor that substantially improves the effectiveness of the assessment in quantifying vulnerabilities and determining the damage grade and loss fractions for the fundamental requirements of damage limitation, no collapse, and near collapse, which correspond to seismic actions with return periods of 95, 475 and 975 years. This work is essential for the competent authorities to become aware of the vulnerability of secondary schools, so that they can act at the structural level and thus reduce seismic vulnerability; and even if, for economic reasons, the government does not intervene, it can, and above all should, draw up emergency plans both with qualified civil engineers and with the full collaboration of the fire brigades that form part of the rescue and operations forces of the National Civil Protection Authority (ANPC).
Abstract:
This thesis examines the university/industry relationship in Mexico after 1990. It is a case study of the National Autonomous University of Mexico (UNAM), the largest Mexican university and the country's most important producer of scientific knowledge. Beginning in 1988, the introduction of a market economy in Mexico was the starting point of many political and economic changes that modified the operating conditions of the country's organizations and institutions. Thus, since 1990, Mexico's new political and economic context has reshaped government policies toward public institutions, including those of health and education. For Mexican public universities, these policies reduced funding and demanded more active participation in the national economy, through the production of knowledge that could translate into innovation in the productive sector. These new economic and political conditions are contingencies that academics face in various ways, including establishing relations with firms, as prescribed by the federal government's policies drawn up on the basis of OECD recommendations. To contribute to knowledge of the university/industry relations developed in Mexico, we carried out our case study using an exploratory qualitative methodological approach that collected data from documentary and perceptual sources. We framed our research, from the organizational standpoint, with contingency theory, and analyzed knowledge production on the basis of the Triple Helix and Mode 2 models. Various documents from diverse sources, including the Internet, were consulted to frame university/industry relations in Mexico and at UNAM.
The perceptual sources were 51 semi-structured interviews with full-time professors and researchers who had established relations with firms (in the fields of biomedicine, biotechnology, chemistry and engineering) and with people holding management roles in the institution's relations with firms. The data collected showed that UNAM's policy on university/industry relations has been as unsettled as the organizational structure supporting their creation and formalization. All kinds of firms, public and private, collaborate with UNAM researchers, but parastatal and government enterprises predominate. Because most firms in Mexico lack scientific and technological infrastructure, the main demands addressed to UNAM are for technical or professional services that help firms solve specific, one-off problems. Knowledge production at UNAM remains of the Mode 1, or traditional, type. Nevertheless, particularly in biotechnology, we identified some cases of closer collaboration pointing toward the non-linear innovation proposed by Mode 2 and the Triple Helix. Among the main benefits of relations with firms, interviewees cited obtaining additional resources for research, including equipment and funds for student scholarships, but they often observed that one of the greatest benefits was the knowledge they gained from contacts with the firms and the sense of the real world they could integrate into student training. The CONACYT government programs for science, technology and innovation do not appear to succeed in strengthening relations between knowledge-generating institutions and Mexico's productive sector.
Abstract:
For around twenty years, the Saguenay CMA appears to have undergone population decline and important economic transformations, which have confronted citizens and local actors with the prospect of decline. In a context of population ageing generalized across Quebec, the Saguenay CMA can be seen as a precursor territory for the phenomenon of population decline in a medium-sized city; it is the scale and extent of the phenomenon that seem to have grown. In this context, is it possible to shift from urban planning based on growth to planning that takes into account the possibility of population decrease and ageing, as well as the reorganization of economic activities? Analysis of the discourse of the actors involved in planning, economic development and politics raises the difficulty of conceiving population decrease and economic transformation not as an occasional phenomenon but as a possibly structural one that may last over time. The subject of decline seems to generate a form of discomfort among the actors, going as far as outright rejection of the situation as a possible reality. For several of them, the eventuality of generalized decline is inconceivable, and decrease can be perceived as a political failure. It appears that most of the strategies put in place to correct the situation are based on the goal of a return to growth. From signs in the built environment, through territorial-marketing strategies and municipal interventionism, to the appearance of urban brownfields, the impacts of population decrease and economic transformation, though for the most part very subtle, are present on the territory of the CMA. The shrinking-cities phenomenon is observed in this study through a new approach that confronts the actors' discourse, the reality of the territory, and the analysis of economic and demographic dynamics.
It is thus exploratory research that seeks to question the current way of thinking about urban growth.
Abstract:
This master's thesis belongs to an emerging approach in urban planning that seeks to bring the theories and practices of planning into dialogue with the thought of the French philosopher Gilles Deleuze. In recent years, Deleuze's thought (especially his works co-written with Félix Guattari, 1972; 1980) has begun to enter contemporary debates in urban planning. The works of Kim Dovey (2010; 2012), Jean Hillier (2005; 2007; 2011) and Colin McFarlane (2011a; 2011b) are the most accomplished examples of Deleuzian reflection on planning. To varying degrees, these authors mobilize this thought above all for its capacity to grasp complexity, change and instability (assemblage thinking). Yet this mobilization of Deleuzian thought in planning leaves largely untouched the ethical and political project at the heart of Deleuze's philosophy. The project animating this thesis is to explore what a Deleuzian ethics can bring to the theories and practices of urban planning. This ethics implies, in particular, a radical questioning of the state framework within which planning operates, what this thesis calls the "becoming-imperceptible" of planning. Empirical work, consisting of 14 city narratives focused on the territory of the Jacques-Cartier Bridge in Montreal, accompanies and extends this theoretical reflection. These narratives reveal the Jacques-Cartier Bridge as a place that produces territories, memories and affects. This field approach combines elements of assemblage thinking with a Deleuzian professional ethics, and explores the possibility of a genuinely sensitive relationship between the territory, the planner, and the people concerned by a planning enterprise.
Abstract:
This paper discusses the creation of a European Banking Union. First, we discuss questions of design. We highlight seven fundamental choices that decision makers will need to make: Which EU countries should participate in the banking union? To which categories of banks should it apply? Which institution should be tasked with supervision? Which one should deal with resolution? How centralized should the deposit insurance system be? What kind of fiscal backing would be required? What governance framework and political institutions would be needed? In terms of geographical scope, we see coverage of the euro area by the banking union as necessary, and coverage of additional countries as desirable, even though this would entail important additional economic difficulties. The system should ideally cover all banks within the countries included, in order to prevent major competitive and distributional distortions. Supervisory authority should be granted either to both the ECB and a new agency, or to a new agency alone. National supervisors, acting under the authority of the European supervisor, would be tasked with the supervision of smaller banks in accordance with the subsidiarity principle. A European resolution authority should be established, with the possibility of drawing on ESM resources. A fully centralized deposit insurance system would eventually be desirable, but a system of partial reinsurance may also be envisaged, at least in a first phase. A banking union would require at least implicit European fiscal backing, with significant political authority and legitimacy. Thus, banking union cannot be considered entirely separately from fiscal union and political union. The most difficult challenge of creating a European banking union lies with the short-term steps towards its eventual implementation.
Many banks in the euro area, and especially in the crisis countries, are currently under stress, and the move towards banking union almost certainly has significant distributional implications. Yet it is precisely because banks are under such stress that early and concrete action is needed. An overarching principle for such action is to minimize the cost to taxpayers. The first step should be to create a European supervisor that will anchor the development of the future banking union. In parallel, a capability to quickly assess the true capital position of the system's most important banks should be created, for which we suggest establishing a temporary European Banking Sector Task Force working together with the European supervisor and other authorities. Ideally, problems identified by this process should be resolved by national authorities; should fiscal capacities prove insufficient, the European level would take over in the country concerned with some national financial participation, or, in an even less likely adverse scenario, in all participating countries at once. This approach would require the passing of emergency legislation in the countries concerned that would give the Task Force the required access to information and, if necessary, further intervention rights. Thus, the principle of fiscal responsibility of the respective member states for legacy costs would be preserved to the maximum extent possible, and, at the same time, market participants and the public would be reassured that adequate tools are in place to address any eventuality.
Abstract:
Ashby was a keen observer of the world around him, as his technological and psychiatric work shows. Over the years, he drew numerous philosophical conclusions on the nature of human intelligence and the operation of the brain, on artificial intelligence and the thinking ability of computers, and even on science in general. In this paper, the quite profound philosophy espoused by Ashby is considered as a whole, in particular in terms of its relationship with the world as it stands now and even in terms of scientific predictions of where things might lead. A meaningful comparison is made between Ashby's comments and the science-fiction concept of 'The Matrix', and serious consideration is given to how far Ashby's ideas lay open the possibility of the Matrix becoming a real-world eventuality.
Abstract:
In this paper we show how to extend clausal temporal resolution to the ground eventuality fragment of monodic first-order temporal logic, which has recently been introduced by Hodkinson, Wolter and Zakharyaschev. While a finite Hilbert-like axiomatization of complete monodic first-order temporal logic was developed by Wolter and Zakharyaschev, we propose a temporal resolution-based proof system which reduces the satisfiability problem for ground eventuality monodic first-order temporal formulae to the satisfiability problem for formulae of classical first-order logic.
Abstract:
Graduate Program in Electrical Engineering - FEIS
Abstract:
The topic of this study is surprise, regarded as an evolutionarily complex process with manifold implications in different fields: neurological, since specific correlates of surprise exist at virtually every level of neuronal processing (e.g. Rao and Ballard, 1999), and behavioural, inasmuch as our ability to quickly assess, recognize and learn from surprising events can be regarded as pivotal for survival (e.g. Ranganath and Rainer, 2003). In particular, starting from the belief that surprise is a psycho-evolutionary mechanism of primary relevance, this work aims to investigate whether there may be a substantial connection between the development of the emotion of surprise and specific developmental problems; or whether, in subjects with pervasive developmental disorders, surprise may represent an essential mechanism of emotional tuning, and consequently whether abnormalities in this process may underlie at least part of the cognitive and behavioural problems that characterize this pathology. Theoretical reasons led us to consider this particular pathological condition, recalling a broad area of research concerning the comprehension of belief as a marker of the ability to reason about the mental states of others (i.e. Theory of Mind), and, in addition, the detection of subjects' specific difficulties in this field. On the experimental side, within the limits of this work, we compared the comprehension and expression of surprise in a sample of 21 children with pervasive developmental disorders (PDD) and a sample of 35 children without developmental problems, aged 3-12. Method: after the customary approach to become friendly with the child, an experimenter and an accomplice showed three boxes of nuts, easy to distinguish from one another by their different colours, and, working together with the child, the contents of one of the boxes were replaced with a different material (macaroni, pebbles)
for the purpose of preparing a surprise for someone. At this stage, the accomplice excused himself/herself and left, and the experimenter suggested that the child prepare another surprise by replacing the contents of the second box. When the accomplice came back, the child was asked to prepare a surprise for him by picking out the box he thought was right for the purpose. Afterwards, without the child's knowledge, the accomplice replaced the contents of one of the boxes with candies and asked the child to open the box, in order to see whether he showed surprise. Results: The data obtained show a significant difference between the autistic group and the control group in all four tests. The expression of surprise, too, is present to a significantly lower degree in the autistic group than in the control group. Moreover, the autistic children did not provide appropriate metarepresentational explanations. Conclusion: Our results, acknowledging the experimental limits of our investigation (small sample size, no possibility of video recording to confirm the expressions), suggest considering the possibility that surprise may be a relevant, or even indicative, component in autistic spectrum disorders.
Abstract:
Abstract: This investigation of the concept of faith is divided into two parts. Part One evaluates a topical philosophical interpretation of faith as irreducibly disjunctive, collecting the best fragmented ideas as to what constitutes faith in a recent family resemblance exposition as an objective for an adequate essentialist analysis of the concept of faith to achieve. Part Two offers a more extended essentialist analysis of the concept of faith as unconditional patience in the eventuality of a positive future state, and a detailed reduction of six supposedly disparate family resemblance senses of faith to this single definition. Criteria for a satisfactory analysis of faithfulness are considered and defended. In contrast with what has become a standard doxastic-epistemic interpretation of faith as persistent unjustified or even unjustifiable belief, a concept of faith is advanced that appears to satisfy the necessary and sufficient criteria identified. Systematic comparison with a variety of usages of the word “faith” suggests that the analysis agrees with many and arguably most applications of this sometimes loosely understood term. Implications of the analysis of the concept of faith are considered and defended against anticipated objections. Pascal’s wager is critically examined in relation to matters of religious faith, along with positivist meaningfulness requirements that seem to conflict especially with epistemically ungrounded belief, the power of faith, and the metaphorical size of mustard seeds. The inquiry concludes with a synthesis of five aspects of six supposedly distinct senses of faith under the single essentialist reductive umbrella of unconditional patience in the eventuality of a positive future state.
Abstract:
Background: Consensus-based approaches provide an alternative to evidence-based decision making, especially in situations where high-level evidence is limited. Our aim was to demonstrate a novel source of information: objective consensus based on recommendations in decision tree format from multiple sources. Methods: Based on nine sample recommendations in decision tree format, a representative analysis was performed. The most common (mode) recommendation for each eventuality (each permutation of parameters) was determined. The same procedure was applied to real clinical recommendations for primary radiotherapy for prostate cancer. Data were collected from 16 radiation oncology centres, converted into decision tree format and analyzed in order to determine the objective consensus. Results: Based on information from multiple sources in decision tree format, treatment recommendations can be assessed for every parameter combination. An objective consensus can be determined by means of mode recommendations without compromise or confrontation among the parties. In the clinical example involving prostate cancer therapy, three parameters were used with two cut-off values each (Gleason score, PSA, T-stage); since two cut-off values yield three categories per parameter, this results in a total of 3³ = 27 possible combinations per decision tree. Despite significant variations among the recommendations, a mode recommendation could be found for specific combinations of parameters. Conclusion: Recommendations represented as decision trees can serve as a basis for objective consensus among multiple parties.
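The objective-consensus procedure described above can be sketched in a few lines: each source's decision tree maps a parameter combination to a recommendation, and the consensus is the mode of the recommendations over all sources for each combination. The sketch below is illustrative only; the source decision rules, parameter categories and recommendation labels are invented assumptions, not the trees collected from the 16 centres.

```python
from collections import Counter
from itertools import product

# Each source's decision tree is modeled as a function from a parameter
# combination (gleason, psa, t_stage category) to a treatment recommendation.
# These three rules are hypothetical examples.
def source_a(gleason, psa, t_stage):
    return "RT+ADT" if gleason == "high" or psa == "high" else "RT"

def source_b(gleason, psa, t_stage):
    return "RT+ADT" if gleason != "low" else "RT"

def source_c(gleason, psa, t_stage):
    return "RT" if t_stage == "low" else "RT+ADT"

sources = [source_a, source_b, source_c]

# Two cut-off values per parameter give three categories per parameter,
# hence 3**3 = 27 combinations, as in the clinical example.
levels = ["low", "mid", "high"]

consensus = {}
for combo in product(levels, repeat=3):
    votes = Counter(src(*combo) for src in sources)
    # The mode recommendation for this combination; ties could be
    # flagged separately instead of silently resolved.
    recommendation, count = votes.most_common(1)[0]
    consensus[combo] = recommendation

print(len(consensus))  # 27 combinations
```

The key design point matches the abstract: the mode is computed independently per parameter combination, so an objective consensus emerges for each eventuality without requiring the sources to negotiate a single shared tree.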