919 results for Logical reasoning
Abstract:
The paper presents a foundation model for Marxian theories of the breakdown of capitalism based on a new falling rate of profit mechanism. All of these theories are based on one or more of "the historical tendencies": a rising capital-wage bill ratio, a rising capitalist share and a falling rate of profit. The model is a foundation in the sense that it generates these tendencies in the context of a model with a constant subsistence wage. The newly discovered generating mechanism is based on neo-classical reasoning for a model with land. It is non-Ricardian in that land augmenting technical progress can be unboundedly rapid. Finally, since the model has no steady state, it is necessary to use a new technique, Chaplygin's method, to prove the result.
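As background for the three "historical tendencies", a minimal sketch in standard accounting notation (not the paper's land-augmented model) shows the arithmetic link between them:

```latex
% Standard notation, not the paper's: P = profits, W = wage bill,
% K = capital stock, Y = output.
\begin{align*}
  \text{capital--wage bill ratio:} &\quad K/W \\
  \text{capitalist (profit) share:} &\quad \pi = P/Y \\
  \text{rate of profit:} &\quad r = \frac{P}{K} = \frac{P/Y}{K/Y} = \frac{\pi}{K/Y},
\end{align*}
% so the rate of profit falls whenever the capital-output ratio K/Y
% rises faster than the profit share, which is the joint configuration
% the three tendencies describe.
```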
Abstract:
Report for the scientific sojourn carried out at the Model-based Systems and Qualitative Reasoning Group (Technical University of Munich), from September until December 2005. Constructed wetlands (CWs), or modified natural wetlands, are used all over the world as wastewater treatment systems for small communities because they can provide high treatment efficiency with low energy consumption and low construction, operation and maintenance costs. Their treatment process is very complex because it includes physical, chemical and biological mechanisms such as microorganism oxidation, microorganism reduction, filtration, sedimentation and chemical precipitation. In addition, these processes can be influenced by different factors. In order to guarantee the performance of CWs, an operation and maintenance program must be defined for each Wastewater Treatment Plant (WWTP). The main objective of this project is to provide computer support for defining the most appropriate operation and maintenance protocols to guarantee the correct performance of CWs. To this end, models representing knowledge about CWs have been proposed, covering the components involved in the sanitation process, the relations among these units, and the pollutant-removal processes. Horizontal Subsurface Flow CWs are chosen as a case study, and filtration is selected as the first process to be modelled. However, the goal is to represent the process knowledge in such a way that it can be reused for other types of WWTP.
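A minimal sketch of how such component/process knowledge might be encoded; the class names, attributes and example values below are illustrative assumptions, not the project's actual model:

```python
from dataclasses import dataclass, field

@dataclass
class Component:
    """A physical unit of the treatment plant, e.g. the granular filter bed."""
    name: str
    connects_to: list[str] = field(default_factory=list)  # downstream units

@dataclass
class Process:
    """A pollutant-removal mechanism acting on a component."""
    name: str
    acts_on: str              # component name
    removes: list[str]        # pollutants affected
    influenced_by: list[str]  # qualitative influencing factors

# Illustrative fragment for a Horizontal Subsurface Flow CW,
# with filtration as the first modelled process.
filter_bed = Component("gravel_bed", connects_to=["outlet_structure"])
filtration = Process(
    name="filtration",
    acts_on="gravel_bed",
    removes=["suspended_solids"],
    influenced_by=["hydraulic_load", "granulometry", "clogging_level"],
)
```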
Abstract:
This work analyses and develops improvements in the speed and scalability of a distributed fish-school simulator. These results were obtained by using a new communication strategy for the logical processes (LPs) and by changing the neighbour-selection algorithm applied to each fish at every simulation step. The proposed idea allows each logical process to anticipate its neighbours' future data needs, reducing communication time by limiting the number of messages exchanged between the LPs. The new neighbour-selection algorithm was developed with the aim of avoiding unnecessary work, reducing the number of instructions executed at each simulation step for each simulated fish and thereby significantly cutting simulation time.
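The abstract does not specify the neighbour-selection algorithm; a common way to cut per-fish neighbour-search work in schooling simulations is spatial grid binning, sketched below as an illustrative assumption rather than the thesis's actual method:

```python
from collections import defaultdict

def build_grid(positions, cell_size):
    """Bin fish indices by grid cell so neighbour search only scans nearby cells."""
    grid = defaultdict(list)
    for i, (x, y) in enumerate(positions):
        grid[(int(x // cell_size), int(y // cell_size))].append(i)
    return grid

def neighbours(i, positions, grid, cell_size, radius):
    """Return indices of fish within `radius` of fish i.

    Assumes cell_size >= radius so that the 3x3 block of adjacent cells
    covers the whole search disc.
    """
    x, y = positions[i]
    cx, cy = int(x // cell_size), int(y // cell_size)
    found = []
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            for j in grid.get((cx + dx, cy + dy), []):
                if j == i:
                    continue
                px, py = positions[j]
                if (px - x) ** 2 + (py - y) ** 2 <= radius ** 2:
                    found.append(j)
    return found
```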
Abstract:
In this project, a data-archiving system was implemented with SAP-IXOS for a company that had asked our consultancy for advice on reducing database space, since only 14% of the total capacity remained free, and thereby on improving system performance. A study of the database was carried out, covering occupancy in MB and the monthly growth of data in the tables. Archiving objects were parameterised, content repositories and logical files were created and defined, and additional reports were programmed to archive the data correctly. The result was a 22% reduction in the database, making room for new data and yielding a more responsive system.
Abstract:
Conflict among member states regarding the distribution of net financial burdens has been allowed to contaminate the entire design of the EU budget with very negative consequences in terms of equity, efficiency and transparency. To get around this problem and pave the way for a substantive budget reform, we propose to decouple distributional negotiations from the rest of the budget process by linking member state net balances in a rigid manner to relative prosperity. This would be achieved through the introduction of a system of compensating horizontal transfers that would take to its logical conclusion the Commission's proposal for a generalized compensation mechanism. We discuss the impact of the proposed scheme on member states' incentives and illustrate its financial implications using revenue and expenditure projections for 2013 that are based on the current Financial Perspectives and Own Resources Decision.
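The abstract does not spell out the linking rule; purely as an illustrative assumption, a "rigid" link of net balances to relative prosperity could take a linear form, with the compensating horizontal transfers closing the gap between actual and target balances:

```latex
% Illustrative notation, not taken from the paper:
% b_i   = net budgetary balance of member state i, as a share of its GNI
% y_i   = GNI per capita of state i relative to the EU average (y_i = 1 at the average)
% k > 0 = chosen redistribution intensity
\[
  b_i^{*} = -k\,(y_i - 1), \qquad t_i = b_i^{*} - b_i ,
\]
% where b_i^* is the target balance implied by relative prosperity and
% t_i is the compensating horizontal transfer that enforces it.
```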
Abstract:
This work investigates applying introspective reasoning to improve the performance of Case-Based Reasoning (CBR) systems, in both a reactive and a proactive fashion, by guiding learning to improve how a CBR system applies its cases and by identifying possible future system deficiencies. First we present our reactive approach, a new introspective reasoning model which enables CBR systems to autonomously learn to improve multiple facets of their reasoning processes in response to poor-quality solutions. We illustrate our model's benefits with experimental results from tests in an industrial design application. For our proactive approach, we introduce a novel method for identifying regions in a case base where the system gives low-confidence solutions to possible future problems. Experimental results are provided for the Zoology and Robo-Soccer domains, and we argue that the identified low-confidence regions help analyse the case base of a given CBR system.
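One plausible way to flag low-confidence regions of a case base, sketched here under the assumption of a user-supplied similarity function and labelled cases (this is not the paper's method, just an illustration of the idea):

```python
def low_confidence_regions(cases, labels, similarity, k=5, threshold=0.6):
    """Flag cases whose k nearest neighbours disagree about the solution.

    `cases` is a list of problem descriptions, `labels` their solutions,
    and `similarity(a, b)` a similarity function returning values in [0, 1].
    A case sits in a low-confidence region when fewer than `threshold`
    of its neighbours share its solution (leave-one-out retrieval).
    """
    flagged = []
    for i, case in enumerate(cases):
        ranked = sorted(
            (j for j in range(len(cases)) if j != i),
            key=lambda j: similarity(case, cases[j]),
            reverse=True,
        )[:k]
        agreement = sum(labels[j] == labels[i] for j in ranked) / k
        if agreement < threshold:
            flagged.append(i)
    return flagged
```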
Abstract:
This document explains the working methodology used during the development of this project. The result is a product that forms part of the Luxury Properties web portal. Luxury Properties is a commercial brand, created by the company Luxury Dreams, to identify a web portal specialising in luxury properties. To develop this concept, the company bought the domain luxuryproperties.es with the aim of building a system implementing all the business logic needed to advertise and market its products. The document closes with recommendations and considerations for the client to take into account in future extensions of the portal.
Abstract:
BACKGROUND: Multiple interventions were made to optimize the medication process in our intensive care unit (ICU): (1) transcriptions from the medical order form to the administration plan were eliminated by merging both into a single document; (2) the new form was built in a logical sequence and was highly structured to promote completeness and standardization of information; (3) frequently used drug names, approved units, and fixed routes were pre-printed; (4) physicians and nurses were trained in the correct use of the new form. This study was aimed at evaluating the impact of these interventions on clinically significant types of medication errors. METHODS: Eight types of medication errors were measured by a prospective chart review before and after the interventions in the ICU of a public tertiary care hospital. We used an interrupted time-series design to control for secular trends. RESULTS: Over 85 days, 9298 lines of drug prescription and/or administration to 294 patients, corresponding to 754 patient-days, were collected and analysed for the three series before and the three series following the intervention. The global error rate decreased from 4.95 to 2.14% (-56.8%, P < 0.001). CONCLUSIONS: The safety of the medication process in our ICU was improved by simple and inexpensive interventions. In addition to the optimization of the prescription writing process, the documentation of intravenous preparation, and the scheduling of administration, the elimination of transcription in combination with the training of users contributed to reducing errors and carried an interesting potential to increase safety.
Abstract:
BACKGROUND: Nowadays, cognitive remediation is widely accepted as an effective treatment for patients with schizophrenia. In French-speaking countries, the techniques used in cognitive remediation for patients with schizophrenia have been adapted from those used for patients with cerebral injury. As cognitive impairment is a core feature of schizophrenia, the Département de psychiatrie du CHUV in Lausanne (DP-CHUV) set out to develop a cognitive remediation program for patients with a schizophrenia spectrum disease (Recos-Vianin, 2007). Numerous studies show that the specific cognitive deficits differ greatly from one patient to another. Consequently, Recos aims at providing individualized cognitive remediation therapy. In this feasibility trial, we measured the benefits of this individualized therapy for patients with schizophrenia. Before treatment, the patients were evaluated with a large battery of cognitive tests in order to determine which of the five specific training modules - verbal memory, visuospatial memory and attention, working memory, selective attention, reasoning - could provide the greatest benefit given their deficits. OBJECTIVES: The study was designed to evaluate the benefits of the Recos program by comparing cognitive functioning before and after treatment. METHOD: Twenty-eight patients with schizophrenia spectrum disorders (schizophrenia [n=18], schizoaffective disorder [n=5], schizotypal disorder [n=4], schizophreniform disorder [n=1], DSM-IV-TR) participated in between one and three of the cognitive modules. The choice of training module was based on the results of the cognitive tests obtained during the first evaluation. The patients took part in 20 training sessions per module (one session per week). At the end of the training period, the cognitive functioning of each patient was reevaluated using the same neuropsychological battery. RESULTS: The results showed a greater improvement in the cognitive functions that were specifically trained than in those that were not. However, some improvement was observed in both, suggesting an indirect cognitive gain. CONCLUSION: In our view, the great heterogeneity of the cognitive deficits observed in schizophrenia calls for a detailed neuropsychological investigation as well as individualized cognitive remediation therapy. These preliminary results need to be confirmed with a larger sample of patients.
Abstract:
I prove that, as long as we allow the marginal utility for money (lambda) to vary between purchases (similarly to the budget), the quasi-linear and the ordinal budget-constrained models rationalize the same data. However, we know that lambda is approximately constant. I provide a simple constructive proof of the necessary and sufficient condition for constant-lambda rationalization, which I argue should replace the Generalized Axiom of Revealed Preference in empirical studies of consumer behavior. 'Go Cardinals!' It is the minimal requirement of any scientific theory that it is consistent with the data it is trying to explain. In the case of (Hicksian) consumer theory it was revealed preference - introduced by Samuelson (1938, 1948) - that provided an empirical test to satisfy this need. At that time most economic reasoning was done in terms of a competitive general equilibrium, a concept abstract enough that it can be built on ordinal preferences over baskets of goods - even if the extremely specialized ones of Arrow and Debreu. However, starting in the sixties, economics moved beyond the 'invisible hand' explanation of how - even competitive - markets operate. A seemingly unavoidable step of this 'revolution' was that ever since, most economic research has been carried out in a partial equilibrium context. Now, the partial equilibrium approach does not mean that the rest of the markets are ignored, but rather that they are held constant. In other words, there is a special commodity - call it money - that reflects the trade-offs of moving purchasing power across markets. As a result, the basic building block of consumer behavior in partial equilibrium is no longer the consumer's preferences over goods, but rather her valuation of them in terms of money. This new paradigm necessitates a new theory of revealed preference.
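For context, a minimal sketch of the standard GARP test that the abstract proposes to replace, assuming observed price vectors and chosen bundles (this is the textbook check, not the paper's constant-lambda condition):

```python
import numpy as np

def satisfies_garp(p, x):
    """Check the Generalized Axiom of Revealed Preference on T observations.

    p, x : arrays of shape (T, n) holding observed prices and chosen bundles.
    Bundle i is directly revealed preferred to bundle j when p_i.x_i >= p_i.x_j;
    GARP is violated when i is (transitively) revealed preferred to j while
    x_i was strictly cheaper than x_j at prices p_j, i.e. p_j.x_j > p_j.x_i.
    """
    expend = p @ x.T                          # expend[i, j] = p_i . x_j
    own = np.diag(expend)[:, None]            # own[i] = p_i . x_i
    reach = own >= expend                     # direct revealed-preference relation
    for k in range(len(x)):                   # Warshall: transitive closure
        reach |= reach[:, [k]] & reach[[k], :]
    strict = own > expend                     # strict[j, i]: p_j.x_j > p_j.x_i
    return not np.any(reach & strict.T)

# Tiny example: a two-observation cycle with one strict edge violates GARP.
p = np.array([[1.0, 1.0], [1.0, 3.0]])
x = np.array([[2.0, 0.0], [0.0, 2.0]])
print(satisfies_garp(p, x))                   # False
```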
Abstract:
This paper extends previous research and discussion on the use of multivariate continuous data, which are about to become more prevalent in forensic science. As an illustrative example, attention is drawn here to the area of comparative handwriting examinations. Multivariate continuous data can be obtained in this field by analysing the contour shape of loop characters through Fourier analysis. This methodology, based on existing research in this area, allows one to describe in detail the morphology of character contours through a set of variables. This paper uses data collected from female and male writers to conduct a comparative analysis of likelihood ratio based evidence assessment procedures in both evaluative and investigative proceedings. While the use of likelihood ratios in the former situation is now rather well established (typically, in order to discriminate between propositions of authorship by a given individual versus another, unknown individual), the investigative setting has so far received little consideration in practice. This paper seeks to highlight that investigative settings, too, can represent an area of application for which the likelihood ratio can offer logical support. As an example, the inference of the gender of the writer of an incriminated handwritten text is presented, analysed and discussed in this paper. The more general viewpoint according to which likelihood ratio analyses can be helpful for investigative proceedings is supported here through various simulations. These offer a characterisation of the robustness of the proposed likelihood ratio methodology.
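For reference, the likelihood ratio in the gender-inference setting takes the standard two-proposition form; the notation below is generic and is not taken from the paper:

```latex
\[
  \mathrm{LR} = \frac{p(\mathbf{y} \mid H_{\mathrm{f}})}{p(\mathbf{y} \mid H_{\mathrm{m}})} ,
\]
% where y is the vector of Fourier descriptors of the questioned loop
% characters, H_f the proposition that the writer is female and H_m that
% the writer is male; values above 1 support H_f, values below 1 support H_m.
```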
Abstract:
The purpose of this paper is to highlight the curiously circular course followed by mainstream macroeconomic thinking in recent times. Having broken from classical orthodoxy in the late 1930s via Keynes’s General Theory, over the last three or four decades the mainstream conventional wisdom, regressing rather than progressing, has now come to embrace a conception of the working of the macroeconomy which is again of a classical, essentially pre-Keynesian, character. At the core of the analysis presented in the typical contemporary macro textbook is the (neo)classical model of the labour market, which represents employment as determined (given conditions of productivity) by the terms of labour supply. While it is allowed that changes in aggregate demand may temporarily affect output and employment, the contention is that in due course employment will automatically return to its ‘natural’ (full employment) level. Unemployment is therefore identified as a merely frictional or voluntary phenomenon: involuntary unemployment - in other words persisting demand-deficient unemployment - is entirely absent from the picture. Variations in aggregate demand are understood to have a lasting impact only on the price level, not on output and employment. This in effect amounts to a return to a Pigouvian conception such as targeted by Keynes in the General Theory. We take the view that this reversion to ideas which should by now be obsolete reflects not the discovery of logical or empirical deficiencies in the Keynes analysis, but results rather from doctrinaire blindness and failure of scholarship on account of which essential features of the Keynes theory have been overlooked or misrepresented. There is an urgent need for a critical appraisal of the current conventional macroeconomic wisdom.
Abstract:
Continuing developments in science and technology mean that the amount of information forensic scientists are able to provide for criminal investigations is ever increasing. The commensurate increase in complexity creates difficulties for scientists and lawyers with regard to evaluation and interpretation, notably with respect to issues of inference and decision. Probability theory, implemented through graphical methods, and specifically Bayesian networks, provides powerful methods to deal with this complexity. Extensions of these methods to elements of decision theory provide further support and assistance to the judicial system. Bayesian Networks for Probabilistic Inference and Decision Analysis in Forensic Science provides a unique and comprehensive introduction to the use of Bayesian decision networks for the evaluation and interpretation of scientific findings in forensic science, and for the support of decision-makers in their scientific and legal tasks. It includes self-contained introductions to probability and decision theory; develops the characteristics of Bayesian networks, object-oriented Bayesian networks and their extension to decision models; features implementation of the methodology with reference to commercial and academically available software; presents standard networks and their extensions that can be easily implemented and that can assist in the reader's own analysis of real cases; provides a technique for structuring problems and organizing data based on methods and principles of scientific reasoning; contains a method for the construction of coherent and defensible arguments for the analysis and evaluation of scientific findings and for decisions based on them; is written in a lucid style, suitable for forensic scientists and lawyers with minimal mathematical background; and includes a foreword by Ian Evett. The clear and accessible style of this second edition makes this book ideal for all forensic scientists, applied statisticians and graduate students wishing to evaluate forensic findings from the perspective of probability and decision analysis. It will also appeal to lawyers and other scientists and professionals interested in the evaluation and interpretation of forensic findings, including decision making based on scientific information.
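The core relation that such Bayesian networks operationalise is Bayes' theorem in odds form, stated here as standard background rather than as material from the book:

```latex
\[
  \underbrace{\frac{\Pr(H_1 \mid E)}{\Pr(H_2 \mid E)}}_{\text{posterior odds}}
  \;=\;
  \underbrace{\frac{\Pr(E \mid H_1)}{\Pr(E \mid H_2)}}_{\text{likelihood ratio}}
  \times
  \underbrace{\frac{\Pr(H_1)}{\Pr(H_2)}}_{\text{prior odds}} ,
\]
% where H_1 and H_2 are the competing propositions (e.g. prosecution and
% defence) and E the scientific findings; the scientist reports the
% likelihood ratio, while the prior odds fall to the fact-finder.
```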
Abstract:
Research project carried out by a secondary-school student and awarded a CIRIT Prize in 2009 to foster the scientific spirit of young people. This research project is based on performing the experiment that creates Liesegang rings and then obtaining and analysing its results. The experiment, which consists of the precipitation of a compound in a gelled medium forming rings spaced logarithmically from one another, has been the object of investigation by a great many scientists for more than a century, none of whom has ever managed to extract a logical and reasonable explanation for this strange behaviour. The author set out to recreate these curious rings, attempting to form them with inhibitors and compounds different from those found in the literature. After carrying out more than thirty experiments, an exhaustive analysis of the results was performed. This part proved one of the most rewarding, as it yielded surprising comparisons and very curious findings, such as the similarity between Liesegang rings and Turing structures, which attempt to explain the patterns present in the ocelli of living beings, and the appearance of Liesegang rings as an effect of visual optics, an effect not reported in the extensive literature consulted. In addition, two further studies were carried out: one confirming the logarithmic distances between the rings and comparing the empirical data with the mathematical pattern, and another studying the behaviour of the rings when the factors governing the reaction rate are varied.
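The "logarithmic distances" mentioned above correspond to the classical spacing law for Liesegang rings, stated here for context (a standard empirical result; the project's own fit is not reproduced):

```latex
\[
  \lim_{n \to \infty} \frac{x_{n+1}}{x_n} = 1 + p , \qquad p > 0 ,
\]
% where x_n is the distance of the n-th ring from the diffusion front's
% starting point; a constant ratio of successive positions is equivalent
% to the ring positions being equally spaced on a logarithmic scale.
```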