906 results for Abductive reasoning
Abstract:
The paper presents a foundation model for Marxian theories of the breakdown of capitalism based on a new falling rate of profit mechanism. All of these theories are based on one or more of "the historical tendencies": a rising capital-wage bill ratio, a rising capitalist share and a falling rate of profit. The model is a foundation in the sense that it generates these tendencies in the context of a model with a constant subsistence wage. The newly discovered generating mechanism is based on neo-classical reasoning for a model with land. It is non-Ricardian in that land-augmenting technical progress can be unboundedly rapid. Finally, since the model has no steady state, it is necessary to use a new technique, Chaplygin's method, to prove the result.
Abstract:
Report on the scientific sojourn carried out at the Model-based Systems and Qualitative Reasoning Group (Technical University of Munich) from September to December 2005. Constructed wetlands (CWs), or modified natural wetlands, are used all over the world as wastewater treatment systems for small communities because they can provide high treatment efficiency with low energy consumption and low construction, operation and maintenance costs. Their treatment process is very complex because it includes physical, chemical and biological mechanisms such as microorganism oxidation, microorganism reduction, filtration, sedimentation and chemical precipitation. Moreover, these processes can be influenced by different factors. In order to guarantee the performance of CWs, an operation and maintenance program must be defined for each Wastewater Treatment Plant (WWTP). The main objective of this project is to provide computer support for the definition of the most appropriate operation and maintenance protocols to guarantee the correct performance of CWs. To this end, the definition of models representing the knowledge about CWs has been proposed: the components involved in the sanitation process, the relations among these units, and the processes that remove pollutants. Horizontal Subsurface Flow CWs are chosen as a case study, and the filtration process is selected as the first modelling application. However, the goal is to represent the process knowledge in such a way that it can be reused for other types of WWTP.
Abstract:
This work investigates applying introspective reasoning to improve the performance of Case-Based Reasoning (CBR) systems, in both a reactive and a proactive fashion, by guiding learning to improve how a CBR system applies its cases and by identifying possible future system deficiencies. First we present our reactive approach, a new introspective reasoning model which enables CBR systems to autonomously learn to improve multiple facets of their reasoning processes in response to poor-quality solutions. We illustrate our model's benefits with experimental results from tests in an industrial design application. For our proactive approach, we introduce a novel method for identifying regions in a case base where the system gives low-confidence solutions to possible future problems. Experiments are reported for Zoology and Robo-Soccer domains, and we discuss how the identified regions of low confidence help to analyze the case bases of a given CBR system.
Abstract:
BACKGROUND: Nowadays, cognitive remediation is widely accepted as an effective treatment for patients with schizophrenia. In French-speaking countries, the techniques used in cognitive remediation for patients with schizophrenia have been adapted from those used for patients with cerebral injury. As cognitive impairment is a core feature of schizophrenia, the Département de psychiatrie du CHUV in Lausanne (DP-CHUV) set out to develop a cognitive remediation program for patients with a schizophrenia spectrum disorder (Recos-Vianin, 2007). Numerous studies show that specific cognitive deficits differ greatly from one patient to another. Consequently, Recos aims at providing individualized cognitive remediation therapy. In this feasibility trial, we measured the benefits of this individualized therapy for patients with schizophrenia. Before treatment, the patients were evaluated with a large battery of cognitive tests in order to determine which of the five specific training modules - verbal memory, visuospatial memory and attention, working memory, selective attention, reasoning - could provide the greatest benefit given their deficits. OBJECTIVES: The study was designed to evaluate the benefits of the Recos program by comparing cognitive functioning before and after treatment. METHOD: Twenty-eight patients with schizophrenia spectrum disorders (schizophrenia [n=18], schizoaffective disorder [n=5], schizotypal disorder [n=4], schizophreniform disorder [n=1], DSM-IV-TR) participated in between one and three of the cognitive modules. The choice of the training module was based on the results of the cognitive tests obtained during the first evaluation. The patients participated in 20 training sessions per module (one session per week). At the end of the training period, the cognitive functioning of each patient was reevaluated using the same neuropsychological battery.
RESULTS: The results showed a greater improvement in the cognitive functions, which were specifically trained, compared to the cognitive functions, which were not trained. However, an improvement was also observed in both types of cognitive functions, suggesting an indirect cognitive gain. CONCLUSION: In our view, the great heterogeneity of the observed cognitive deficits in schizophrenia necessitates a detailed neuropsychological investigation as well as an individualized cognitive remediation therapy. These preliminary results need to be confirmed with a more extended sample of patients.
Abstract:
I prove that as long as we allow the marginal utility for money (lambda) to vary between purchases (similarly to the budget), the quasi-linear and the ordinal budget-constrained models rationalize the same data. However, we know that lambda is approximately constant. I provide a simple constructive proof of the necessary and sufficient condition for constant-lambda rationalization, which I argue should replace the Generalized Axiom of Revealed Preference in empirical studies of consumer behavior. 'Go Cardinals!' It is the minimal requirement of any scientific theory that it be consistent with the data it is trying to explain. In the case of (Hicksian) consumer theory it was revealed preference - introduced by Samuelson (1938, 1948) - that provided an empirical test to satisfy this need. At that time most economic reasoning was done in terms of a competitive general equilibrium, a concept abstract enough that it can be built on ordinal preferences over baskets of goods - even if the extremely specialized ones of Arrow and Debreu. However, starting in the sixties, economics moved beyond the 'invisible hand' explanation of how markets - even competitive ones - operate. A seemingly unavoidable step of this 'revolution' was that ever since, most economic research has been carried out in a partial equilibrium context. Now, the partial equilibrium approach does not mean that the rest of the markets are ignored, but rather that they are held constant. In other words, there is a special commodity - call it money - that reflects the trade-offs of moving purchasing power across markets. As a result, the basic building block of consumer behavior in partial equilibrium is no longer the consumer's preferences over goods, but rather her valuation of them in terms of money. This new paradigm necessitates a new theory of revealed preference.
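The revealed-preference test mentioned in the abstract can be illustrated with a minimal sketch (not the paper's constructive proof): a check of Samuelson's weak axiom over directly revealed preferences only, where observation t consists of a price vector p[t] and a chosen bundle x[t]. All data below are illustrative assumptions.

```python
# Hedged sketch: detecting a WARP violation in observed choice data.
# x[t] is directly revealed preferred to x[s] when x[s] was affordable
# at the prices/expenditure of observation t, i.e. p[t].x[s] <= p[t].x[t].

def cost(price, bundle):
    return sum(a * b for a, b in zip(price, bundle))

def directly_revealed_preferred(p, x, t, s):
    return cost(p[t], x[s]) <= cost(p[t], x[t])

def violates_warp(p, x):
    """True if two distinct bundles are each revealed preferred to the other."""
    n = len(x)
    for t in range(n):
        for s in range(n):
            if t != s and x[t] != x[s] \
                    and directly_revealed_preferred(p, x, t, s) \
                    and directly_revealed_preferred(p, x, s, t):
                return True
    return False

# Illustrative violation: each chosen bundle costs 7 at its own prices,
# while the other bundle would have cost only 5 -- a classic inconsistency.
p = [(1.0, 2.0), (2.0, 1.0)]
x = [(1.0, 3.0), (3.0, 1.0)]
print(violates_warp(p, x))  # -> True
```

A full GARP test would additionally take the transitive closure of the revealed-preference relation; this sketch checks only direct comparisons.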
Abstract:
Continuing developments in science and technology mean that the amounts of information forensic scientists are able to provide for criminal investigations is ever increasing. The commensurate increase in complexity creates difficulties for scientists and lawyers with regard to evaluation and interpretation, notably with respect to issues of inference and decision. Probability theory, implemented through graphical methods, and specifically Bayesian networks, provides powerful methods to deal with this complexity. Extensions of these methods to elements of decision theory provide further support and assistance to the judicial system. Bayesian Networks for Probabilistic Inference and Decision Analysis in Forensic Science provides a unique and comprehensive introduction to the use of Bayesian decision networks for the evaluation and interpretation of scientific findings in forensic science, and for the support of decision-makers in their scientific and legal tasks. Includes self-contained introductions to probability and decision theory. Develops the characteristics of Bayesian networks, object-oriented Bayesian networks and their extension to decision models. Features implementation of the methodology with reference to commercial and academically available software. Presents standard networks and their extensions that can be easily implemented and that can assist in the reader's own analysis of real cases. Provides a technique for structuring problems and organizing data based on methods and principles of scientific reasoning. Contains a method for the construction of coherent and defensible arguments for the analysis and evaluation of scientific findings and for decisions based on them. Is written in a lucid style, suitable for forensic scientists and lawyers with minimal mathematical background. Includes a foreword by Ian Evett. 
The clear and accessible style of this second edition makes this book ideal for all forensic scientists, applied statisticians and graduate students wishing to evaluate forensic findings from the perspective of probability and decision analysis. It will also appeal to lawyers and other scientists and professionals interested in the evaluation and interpretation of forensic findings, including decision making based on scientific information.
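The core Bayesian mechanism behind such forensic evaluation can be shown in a minimal sketch (not taken from the book): updating the odds that a suspect is the source of a crime-scene stain after a DNA match, using Bayes' rule in odds form. The prior odds and random-match probability below are illustrative assumptions, not case data.

```python
# Hedged sketch: posterior odds = prior odds * likelihood ratio (LR),
# where LR = P(match | suspect is source) / P(match | suspect is not source).
# Here P(match | source) is taken as 1 and P(match | not source) as the
# random-match probability -- both numbers purely illustrative.

def posterior_odds(prior_odds: float, likelihood_ratio: float) -> float:
    """Bayes' rule in odds form."""
    return prior_odds * likelihood_ratio

def odds_to_prob(odds: float) -> float:
    """Convert odds o:1 into a probability."""
    return odds / (1.0 + odds)

prior = 1.0 / 1000.0          # assumed prior odds of 1:1000
lr = 1.0 / 1e-6               # assumed random-match probability of 1 in a million
post = posterior_odds(prior, lr)
print(round(odds_to_prob(post), 4))  # -> 0.999
```

A Bayesian network generalizes this two-node update to many interdependent propositions and findings, which is what the book develops.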
Abstract:
Clinical epidemiology is the most currently used name for a comparatively new branch of medicine covering a certain number of activities related to the practice of clinical medicine, but using epidemiological techniques and methods. Clinical epidemiology has only just begun to be known in Europe, whereas units are being increasingly developed and expanded in North America, particularly within the clinical departments of hospitals. The methods it offers are valid for both practicing physicians and hospital doctors (or those being trained in hospitals) and serve the purpose of promoting a better quality medical service, especially where a more adequate evaluation of the effectiveness of diagnostic methods, therapy and prognosis in medicine is concerned. Clinical epidemiology proposes a methodology of medical reasoning and of decision-making, as well as techniques intended to facilitate the indispensable task of keeping up with advances in medical knowledge.
Abstract:
There is an increasing awareness that the articulation of forensic science and criminal investigation is critical to the resolution of crimes. However, models and methods to support an effective collaboration between these partners are still poorly expressed or even lacking. Three propositions are borrowed from crime intelligence methods in order to bridge this gap: (a) the general intelligence process, (b) the analyses of investigative problems along principal perspectives: entities and their relationships, time and space, quantitative aspects and (c) visualisation methods as a mode of expression of a problem in these dimensions. Indeed, in a collaborative framework, different kinds of visualisations integrating forensic case data can play a central role for supporting decisions. Among them, link-charts are scrutinised for their abilities to structure and ease the analysis of a case by describing how relevant entities are connected. However, designing an informative chart that does not bias the reasoning process is not straightforward. Using visualisation as a catalyser for a collaborative approach integrating forensic data thus calls for better specifications.
Abstract:
OBJECTIVES: The aim of the study was to assess whether prospective follow-up data within the Swiss HIV Cohort Study can be used to predict patients who stop smoking or, among smokers who stop, those who start smoking again. METHODS: We built prediction models first using clinical reasoning ('clinical models') and then by selecting from numerous candidate predictors using advanced statistical methods ('statistical models'). Our clinical models were based on literature that suggests that motivation drives smoking cessation, while dependence drives relapse in those attempting to stop. Our statistical models were based on automatic variable selection using additive logistic regression with component-wise gradient boosting. RESULTS: Of 4833 smokers, 26% stopped smoking, at least temporarily; among those who stopped, 48% started smoking again. The predictive performance of our clinical and statistical models was modest. A basic clinical model for cessation, with patients classified into three motivational groups, was nearly as discriminatory as a constrained statistical model with just the most important predictors (the ratio of nonsmoking visits to total visits, alcohol or drug dependence, psychiatric comorbidities, recent hospitalization and age). A basic clinical model for relapse, based on the maximum number of cigarettes per day prior to stopping, was not as discriminatory as a constrained statistical model with just the ratio of nonsmoking visits to total visits. CONCLUSIONS: Predicting smoking cessation and relapse is difficult, and simple models are nearly as discriminatory as complex ones. Patients with a history of attempting to stop and those known to have stopped recently are the best candidates for an intervention.
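The statistical-model idea named above - additive logistic regression fitted by component-wise gradient boosting - can be sketched in a few lines: at each boosting step only the single predictor that best fits the current negative gradient of the logistic loss is updated, which performs variable selection implicitly. The data, step size and number of steps below are illustrative assumptions, not the study's.

```python
# Hedged toy sketch of component-wise (coordinate-wise) gradient boosting
# for an additive logistic model, on synthetic data.
import math
import random

def boost(X, y, n_steps=200, nu=0.1):
    n, p = len(X), len(X[0])
    coef = [0.0] * p
    for _ in range(n_steps):
        # Negative gradient of the logistic loss: y - sigmoid(f(x)).
        resid = []
        for i in range(n):
            f = sum(coef[j] * X[i][j] for j in range(p))
            resid.append(y[i] - 1.0 / (1.0 + math.exp(-f)))
        # Fit a single-variable least-squares learner per predictor;
        # update only the one with the smallest residual error.
        best_j, best_b, best_err = 0, 0.0, float("inf")
        for j in range(p):
            sxx = sum(X[i][j] ** 2 for i in range(n)) or 1e-12
            b = sum(X[i][j] * resid[i] for i in range(n)) / sxx
            err = sum((resid[i] - b * X[i][j]) ** 2 for i in range(n))
            if err < best_err:
                best_j, best_b, best_err = j, b, err
        coef[best_j] += nu * best_b  # damped component-wise update
    return coef

random.seed(0)
# Synthetic data: only feature 0 carries signal; feature 1 is pure noise.
X = [[random.gauss(0, 1), random.gauss(0, 1)] for _ in range(300)]
y = [1 if row[0] > 0 else 0 for row in X]
coef = boost(X, y)
print(abs(coef[0]) > abs(coef[1]))  # the informative feature dominates
```

In the study's setting the candidate predictors would be the cohort covariates (visit ratios, comorbidities, age, and so on) rather than synthetic features.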
Abstract:
The research reports the results of implementing a program aimed primarily at developing prison inmates' emotional awareness and the regulation of that awareness. The rationale is that the theoretical framework starts from the findings of many authors that the programs that work best in the rehabilitation and treatment of imprisoned offenders are those that take into account not only the offender's environment, family, schooling and vocational training, but also the study of their feelings and, crucially, their cognition, reasoning, understanding and values. For this reason, effective programs appear to be those that include techniques aimed at improving reasoning skills, empathy, the assessment of the consequences of one's behaviour for others and for oneself, reflection before action, problem-solving skills and, in many cases, underdeveloped social skills. The results analyse the data concerning the impact of the intervention on the inmates and, specifically, on their behaviour.
Abstract:
The case of an immunocompromised HIV patient with fever and lymphadenopathy is discussed in an anatomo-pathological round. This complex clinical case was used as an opportunity to discuss the broad differential diagnosis of fever in an immunocompromised individual with multiple lymphadenopathies. The clinical reasoning leading to the probable diagnosis based on clinical, biological and radiological information is not only a difficult task for the speaker but also a rich source of learning opportunities for our medical community.
Abstract:
The paper follows on from earlier work [Taroni F and Aitken CGG. Probabilistic reasoning in the law, Part 1: assessment of probabilities and explanation of the value of DNA evidence. Science & Justice 1998; 38: 165-177]. Different explanations of the value of DNA evidence were presented to students from two schools of forensic science and to members of fifteen laboratories around the world. The responses were divided into two groups: those from a school or laboratory identified as Bayesian and those from a school or laboratory identified as non-Bayesian. The paper analyses these responses using a likelihood approach. This approach is more consistent with a Bayesian analysis than one based on a frequentist approach, as was used in the earlier work by Taroni and Aitken cited above.
Abstract:
PURPOSE: Patients diagnosed with a specific neoplasm tend to have a subsequent excess risk of the same neoplasm. The age incidence of a second neoplasm at the same site is approximately constant with age, and consequently the relative risk is greater at younger age. It is unclear whether such a line of reasoning can be extended from a specific neoplasm to the incidence of all neoplasms in subjects diagnosed with a defined neoplasm. METHODS: We considered the age-specific incidence of all non-hormone-related epithelial neoplasms after a first primary colorectal cancer (n = 9542) in the Vaud Cancer Registry data set. RESULTS: In subjects with a previous colorectal cancer, the incidence rate of all other epithelial non-hormone-related cancers was stable around 800 per 100,000 between age 30 and 60 years, and rose only about twofold to reach 1685 at age 70 to 79 years and 1826 per 100,000 at age 80 years or older. After excluding synchronous cancers, the rise was only about 1.5-fold, that is, from about 700 to 1000. In the general population, the incidence rate of all epithelial non-hormone-related cancers was 29 per 100,000 at age 30 to 39 years, and rose 30-fold to 883 per 100,000 at age 70 to 79 years. Excluding colorectal cancers, the rise of all non-hormone-related cancers was from 360 per 100,000 at age 40 to 49 years to 940 at age 70 to 79 years after colorectal cancer, and from 90 to 636 per 100,000 in the general population (i.e., 2.6- vs. 7.1-fold). CONCLUSIONS: The rise of incidence with age of all epithelial non-hormone-related second cancers after colorectal cancer is much smaller than in the general population. This can possibly be related to the occurrence of a single mutational event in a population of susceptible individuals, although alternative models are plausible within the complexity of the process of carcinogenesis.
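The fold increases quoted in the results can be verified directly from the per-100,000 incidence rates given in the abstract; this short check reproduces the "2.6- vs. 7.1-fold" comparison for non-hormone-related cancers excluding colorectal.

```python
# Rates per 100,000, as quoted in the abstract (excluding colorectal cancers).
after_crc = {"40-49": 360, "70-79": 940}   # after a first colorectal cancer
general = {"40-49": 90, "70-79": 636}      # general population

fold_after = after_crc["70-79"] / after_crc["40-49"]
fold_general = general["70-79"] / general["40-49"]
print(round(fold_after, 1), round(fold_general, 1))  # -> 2.6 7.1
```

The much flatter rise after colorectal cancer is what motivates the single-mutational-event interpretation discussed in the conclusions.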
Abstract:
Research project carried out during a stay at the Università degli studi di Siena, Italy, between 2007 and 2009. The project consisted of a study of the logical formalization of reasoning in the presence of vagueness, using the methods of Algebraic Logic and Proof Theory. Work proceeded in four complementary directions. First, a new approach was proposed, more abstract than the hitherto dominant paradigm, for the study of fuzzy logic systems. Until now, the study of these systems had focused essentially on obtaining semantics based on continuous (or at least left-continuous) t-norms. At a first level of greater abstraction, we studied the completeness properties of fuzzy logics (both propositional and first-order) with respect to semantics defined over arbitrary chains of truth values, not necessarily only over the unit interval of the real numbers. Then, at an even more abstract level, the so-called Leibniz hierarchy of Abstract Algebraic Logic, which classifies all logical systems with good algebraic behaviour, was expanded into a new hierarchy (which we call implicational) that allows the definition of new classes of fuzzy logics encompassing almost all those known so far. Second, we continued a line of research begun in recent years on the study of partial truth as a syntactic notion (that is, as explicit truth constants in the proof systems of fuzzy logics). For the first time, rational semantics were considered for propositional logics, and real and rational semantics for first-order logics expanded with constants.
Third, the more fundamental problem of the meaning and usefulness of fuzzy logics as models of (part of) the phenomena of vagueness was addressed in a final article of a more philosophical and expository character, and in a more technical one in which we defend the need for, and present the state of the art of, the study of the algebraic structures associated with fuzzy logics. Finally, the last part of the project was devoted to the study of the arithmetical complexity of first-order fuzzy logics.