966 results for Fuzzy Logic by the Extension Principle
Abstract:
According to the Taylor principle, a central bank should adjust the nominal interest rate by more than one-for-one in response to changes in current inflation. Most of the existing literature supports the view that, by following this simple recommendation, a central bank can avoid being a source of unnecessary fluctuations in economic activity. The present paper shows that this conclusion is not robust with respect to the modelling of capital accumulation. We use our insights to discuss the desirability of alternative interest rate rules. Our results suggest a reinterpretation of monetary policy under Volcker and Greenspan: the empirically plausible characterization of monetary policy can explain the stabilization of macroeconomic outcomes observed in the early eighties for the US economy. The Taylor principle in itself cannot.
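For reference (this equation is not taken from the paper), the Taylor principle is usually stated for a textbook interest rate rule of the following form, in conventional notation:

```latex
% Textbook Taylor-type rule (conventional notation, not the paper's):
%   i_t   : nominal interest rate set by the central bank
%   r^*   : equilibrium real interest rate
%   \pi_t : current inflation, \pi^* : inflation target
%   y_t   : output gap
\[
  i_t = r^* + \pi^* + \phi_\pi \,(\pi_t - \pi^*) + \phi_y \, y_t ,
  \qquad
  \frac{\partial i_t}{\partial \pi_t} = \phi_\pi > 1 .
\]
% The Taylor principle is the condition \phi_\pi > 1: the nominal rate moves
% more than one-for-one with inflation, so the real rate rises when inflation rises.
```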
Abstract:
A new statistical parallax method based on the Maximum Likelihood principle is presented, allowing the simultaneous determination of a luminosity calibration, the kinematic characteristics and the spatial distribution of a given sample. The method has been developed for the exploitation of the Hipparcos data and presents several improvements over previous ones: the effects of sample selection, observational errors, galactic rotation and interstellar absorption are taken into account as an intrinsic part of the formulation (rather than as external corrections). Furthermore, the method is able to identify and characterize physically distinct groups in inhomogeneous samples, thus avoiding biases due to unidentified components. Moreover, the implementation used by the authors relies extensively on numerical methods, avoiding the need to simplify the equations and thus the biases such simplifications could introduce. Several examples of application using simulated samples are presented, to be followed by applications to real samples in forthcoming articles.
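As a purely illustrative sketch (the actual likelihood in the paper models selection effects, observational errors, galactic rotation and absorption, none of which appear here), a minimal maximum-likelihood calibration of a mean absolute magnitude from trigonometric parallaxes might look as follows; all names and numbers are hypothetical:

```python
import numpy as np
from scipy.optimize import minimize

def negative_log_likelihood(params, plx_mas, plx_err_mas, m_app):
    """Toy single-population luminosity calibration.

    params = (M0, sigma_M): mean absolute magnitude and intrinsic scatter.
    Each star's 'observed' absolute magnitude follows from its parallax and
    apparent magnitude; its variance combines the propagated parallax error
    and the intrinsic scatter. Selection effects, kinematics and extinction
    are deliberately ignored in this sketch.
    """
    M0, sigma_M = params
    if sigma_M <= 0:
        return np.inf
    M_obs = m_app + 5.0 * np.log10(plx_mas) - 10.0            # parallax in mas
    sigma_plx = (5.0 / np.log(10.0)) * plx_err_mas / plx_mas  # error propagation
    var = sigma_M**2 + sigma_plx**2
    return 0.5 * np.sum((M_obs - M0) ** 2 / var + np.log(2.0 * np.pi * var))

# Simulated sample (hypothetical numbers).
rng = np.random.default_rng(0)
true_M, n = 1.0, 500
plx_true = rng.uniform(2.0, 20.0, n)          # mas
plx_err = np.full(n, 0.2)                     # mas
plx_obs = plx_true + rng.normal(0.0, plx_err)
m_app = true_M - 5.0 * np.log10(plx_true) + 10.0 + rng.normal(0.0, 0.2, n)

fit = minimize(negative_log_likelihood, x0=[0.0, 0.3],
               args=(plx_obs, plx_err, m_app), method="Nelder-Mead")
print("estimated M0 and sigma_M:", fit.x)
```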
Abstract:
The atomic force microscope is not only a very convenient tool for studying the topography of different samples, but it can also be used to measure specific binding forces between molecules. For this purpose, one type of molecule is attached to the tip and the other one to the substrate. Bringing the tip towards the substrate allows the molecules to bind together; retracting the tip breaks the newly formed bond. The rupture of a specific bond appears in the force-distance curve as a spike from which the binding force can be deduced. In this article we present an algorithm to automatically process force-distance curves in order to obtain bond strength histograms. The algorithm is based on a fuzzy logic approach that assigns a "quality" score to every event and makes the detection procedure much faster than manual selection. The software has been applied to measure the binding strength between tubulin and microtubule-associated proteins.
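The abstract does not describe the algorithm's internals; the sketch below only illustrates the general idea of fuzzy quality scoring of candidate rupture events in a retract curve. The membership shapes, thresholds and units are assumptions, not the authors' values:

```python
import numpy as np

def ramp(x, a, b):
    """Simple fuzzy membership: 0 below a, 1 above b, linear in between."""
    return np.clip((x - a) / (b - a), 0.0, 1.0)

def score_rupture_events(distance_nm, force_pN,
                         min_jump_pN=20.0, full_jump_pN=80.0,
                         min_sharpness=2.0, full_sharpness=10.0):
    """Assign a fuzzy quality in [0, 1] to each candidate rupture event.

    A candidate is any sample where the force jumps abruptly back toward zero
    (the cantilever relaxes after a bond breaks). Two fuzzy criteria, the size
    of the jump and its sharpness relative to the local noise, are combined
    with a product t-norm. All parameter values are illustrative.
    """
    jumps = np.diff(force_pN)                 # positive jump = candidate rupture
    noise = np.median(np.abs(jumps)) + 1e-12  # crude noise estimate
    events = []
    for i, jump in enumerate(jumps):
        if jump <= 0:
            continue
        quality = ramp(jump, min_jump_pN, full_jump_pN) * \
                  ramp(jump / noise, min_sharpness, full_sharpness)
        if quality > 0:
            events.append((distance_nm[i], jump, quality))
    return events

# Synthetic retract curve with a single rupture near 30 nm (toy data).
z = np.linspace(0.0, 60.0, 600)
f = -3.0 * z * (z < 30.0) + np.random.normal(0.0, 2.0, z.size)
for pos, jump, q in score_rupture_events(z, f):
    if q > 0.5:
        print(f"rupture near {pos:.1f} nm, jump {jump:.0f} pN, quality {q:.2f}")
```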
Abstract:
The signalling function of melanin-based colouration is debated. Sexual selection theory states that ornaments should be costly to produce, maintain, wear or display in order to signal quality honestly to potential mates or competitors. An increasing number of studies support the hypothesis that the degree of melanism covaries with aspects of body condition (e.g. body mass or immunity), which has contributed to changing the initial perception that melanin-based colour ornaments entail no costs. Indeed, the expression of many (but not all) melanin-based colour traits is weakly sensitive to the environment but strongly heritable, suggesting that these colour traits are relatively cheap to produce and maintain and raising the question of how they could signal quality honestly. Here I review the production, maintenance and wearing/displaying costs that can generate a correlation between melanin-based colouration and body condition, and consider other evolutionary mechanisms that can also lead to covariation between colour and body condition. Because genes controlling melanic traits can affect numerous phenotypic traits, pleiotropy could also explain a linkage between body condition and colouration. Pleiotropy may result in differently coloured individuals signalling different aspects of quality that are maintained by frequency-dependent selection or local adaptation. Colouration may therefore not signal absolute quality to potential mates or competitors (e.g. dark males may not achieve higher fitness than pale males); otherwise genetic variation would be rapidly depleted by directional selection. As a consequence, selection on heritable melanin-based colouration may not always be directional, and mate choice may be conditional on environmental conditions (i.e. context-dependent sexual selection). Despite the interest of evolutionary biologists in the adaptive value of melanin-based colouration, its actual role in sexual selection is still poorly understood.
Abstract:
In this article I present a possible solution to the classic problem of the apparent incompatibility between Mill's Greatest Happiness Principle and his Principle of Liberty. I argue that in the other-regarding sphere the judgments of experience and knowledge accumulated through history have moral and legal force, whereas in the self-regarding sphere the judgments of experienced people have only prudential value. The reason for this is the idea that each of us is a better judge than anyone else of what causes us pain and of which kinds of pleasure we prefer (the so-called epistemological argument). Since the Greatest Happiness Principle is nothing but the aggregate of each person's happiness, the epistemological claim implies that, by leaving people free even to cause harm to themselves, we would still be maximizing happiness; hence the two principles (the Greatest Happiness Principle and the Principle of Liberty) can be compatible.
Abstract:
This work deals with a hybrid PID+fuzzy logic controller applied to the motion control of a machine tool biaxial table. The non-linear model includes backlash and axis elasticity. Two PID controllers perform the primary control of the table axes. A third, PID+fuzzy controller has a cross-coupled structure whose function is to minimise the trajectory contour errors. Once the three controllers are tuned, the system is simulated with and without the third controller. The responses are plotted and compared to analyse the effectiveness of the hybrid controller on the system. They show that the proposed methodology reduces the contour error by a ratio of 70:1.
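The abstract does not detail the controller structure; the sketch below only illustrates the general idea of per-axis PID control plus a fuzzy-weighted cross-coupled correction of the contour error. The gains, membership shape, test trajectory and axis dynamics are all assumptions, not the authors' design:

```python
import numpy as np

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def fuzzy_gain(contour_error, small=0.01, large=0.1):
    """Fuzzy weight on the cross-coupling action: ~0 for small contour
    errors, ~1 for large ones, linear in between (illustrative shape)."""
    return float(np.clip((abs(contour_error) - small) / (large - small), 0.0, 1.0))

dt = 1e-3
pid_x, pid_y = PID(80, 20, 0.5, dt), PID(80, 20, 0.5, dt)
x = y = 0.0
for k in range(2000):
    t = k * dt
    x_ref, y_ref = t, t                      # 45-degree linear test path (hypothetical)
    ex, ey = x_ref - x, y_ref - y
    eps = (ey - ex) / np.sqrt(2.0)           # contour error for a 45-degree line
    correction = fuzzy_gain(eps) * 5.0 * eps # fuzzy-weighted cross-coupled action
    ux = pid_x.update(ex) - correction / np.sqrt(2.0)
    uy = pid_y.update(ey) + correction / np.sqrt(2.0)
    # Trivial integrator dynamics stand in for the real table model (with
    # backlash and elasticity) used in the paper.
    x += ux * dt
    y += uy * dt
print(f"final tracking errors: ex={x_ref - x:.4f}, ey={y_ref - y:.4f}")
```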
Abstract:
Permanent bilateral occlusion of the common carotid arteries (2VO) in the rat is an established experimental model for investigating the effects of chronic cerebral hypoperfusion on cognitive function and neurodegenerative processes. Our aim was to compare the cognitive and morphological outcomes of the standard 2VO procedure, in which both arteries are ligated concomitantly, with those of a modified protocol with a 1-week interval between artery occlusions to avoid an abrupt reduction of cerebral blood flow, as assessed by performance in the water maze and by the extent of damage to the hippocampus and striatum. Male Wistar rats (N = 47) aged 3 months were subjected to chronic hypoperfusion by permanent bilateral ligation of the common carotid arteries using either the standard or the modified protocol, with the right carotid being the first to be occluded. Three months after the surgical procedure, performance in the water maze was assessed to investigate long-term effects on spatial learning and memory, and the brains were processed to estimate hippocampal volume and striatal area. Both groups of hypoperfused rats showed deficits in reference memory (F(8,172) = 7.0951, P < 0.00001) and working spatial memory [2nd (F(2,44) = 7.6884, P < 0.001), 3rd (F(2,44) = 21.481, P < 0.00001) and 4th trials (F(2,44) = 28.620, P < 0.0001)]; however, no evidence of tissue atrophy was found in the brain structures studied. Despite similar behavioral and morphological outcomes, the rats subjected to the modified protocol showed a significantly higher survival rate during the 3 months of the experiment (P < 0.02).
Abstract:
Mixed Martial Arts (MMA) and the Ultimate Fighting Championship (UFC), founded in 1993, have been under scrutiny for the past two decades. Unlike that of boxing, the ethical status of MMA, and whether it is morally defensible, has rarely been analyzed in the academic literature. I argue that MMA requires such an analysis because it is inherently violent. The purpose of this study was to examine elite-level MMA by referring to the ethical concepts of autonomy, paternalism and the Harm Principle. Findings from interviews with MMA athletes, as well as my personal experience of MMA, were presented to establish a deeper understanding of the sport and of what it means to train and compete in a sport defined as violent. The conceptual analysis and the findings on MMA athletes' experiences in this investigation led to the conclusion that MMA is ethically defensible. Additional findings, implications and recommendations for further research were also discussed.
Abstract:
List of Employees to be involved in the extension of the Port Dalhousie and Thorold Railway (1 page, handwritten). This is signed by S.D. Woodruff, Nov. 25, 1856.
Abstract:
Students' section
Abstract:
This thesis contributes to a general theory of project design. Set against a demand shaped by the stakes of sustainable development, the main objective of this research is to contribute a theoretical model of design that makes it possible to better situate the use of tools and standards for assessing a project's sustainability. The fundamental principles of these normative instruments are analysed along four dimensions: ontological, methodological, epistemological and teleological. Indicators of certain counter-productive effects linked, in particular, to the application of these standards confirm the need for a theory of qualitative judgement. Our main hypothesis builds on the conceptual framework offered by the notion of the "precautionary principle", whose first formulations date back to the early 1970s and which aimed precisely at remedying the shortcomings of traditional scientific assessment tools and methods. The thesis is divided into five parts. Beginning with a historical review of the classical models of design thinking, it focuses on how the consideration of sustainability has evolved. From this perspective, we observe that the theories of "green design" dating from the early 1960s, as well as the theories of "ecological design" of the 1970s and 1980s, eventually converged with the more recent theories of "sustainable design" from the early 1990s onwards. The different approaches to the precautionary principle are then examined from the angle of project sustainability. Standard risk-assessment methods are compared with approaches based on the precautionary principle, revealing certain limits in the design of a project. A first theoretical model of design integrating the main dimensions of the precautionary principle is thereby sketched out. This model offers a global vision for judging a project that integrates principles of sustainable development, and presents itself as an alternative to traditional risk-assessment approaches, which are both deterministic and instrumental. The precautionary-principle hypothesis is then proposed and examined in the specific context of the architectural project. This exploration begins with a presentation of the classical notion of "prudence" as it was historically used to guide architectural judgement. What, then, of the challenges posed by the judgement of architectural projects amid the rise of standardized assessment methods (e.g. Leadership in Energy and Environmental Design, LEED)? The thesis proposes a reinterpretation of the theory of design as formulated by Donald A. Schön as a way of taking assessment tools such as LEED into account. This exercise nevertheless reveals an epistemological obstacle that must be addressed in a reformulation of the model. In line with constructivist epistemology, a new theoretical model is then confronted with the study and illustration of three contemporary Canadian architectural competitions that adopted the LEED standardized sustainability assessment method. A preliminary set of "tensions" is identified in the process of designing and judging the projects.
These tensions are then categorized into conceptual counterparts constructed at the intersection of the precautionary principle and design theories. They fall into four categories: (1) conceptualization - analogical/logical; (2) uncertainty - epistemological/methodological; (3) comparability - interpretive/analytical; and (4) proposition - universality/contextual relevance. These conceptual tensions are treated as vectors that correlate with and enrich the theoretical model, without constituting validations in the positivist sense of the term. These confrontations with real cases make it possible to define more precisely the epistemological obstacle identified earlier. The thesis thus highlights the generally underestimated impacts of environmental standards on the process of designing and judging projects, taking as a non-restrictive example Canadian architectural competitions for public buildings. The conclusion underlines the need for a new form of "reflective prudence" and for a more critical use of current sustainability assessment tools. It calls for an instrumentation based on global integration rather than on the opposition of environmental approaches.
Abstract:
MHCII molecules present an array of antigens that send survival or activation signals to T lymphocytes. The process of peptide binding to the MHC class II groove involves three accessory molecules: the invariant chain, DM and DO. The invariant chain folds the MHCII molecules and directs them to the endosomal pathway. DM then exchanges the CLIP peptide, a remnant of the degraded invariant chain, for peptides of better affinity. Expressed in highly specialized antigen-presenting cells, DO competes with MHCII molecules for DM binding and favors the presentation of receptor-internalized antigens. Altogether, these molecules exhibit potential immunomodulatory properties that can be exploited to increase the potency of peptide vaccines. DO requires DM for maturation and for exit from the ER. Interestingly, it is possible to monitor this interaction through a conformational change on DOβ that is recognized by the Mags.DO5 monoclonal antibody. Using Mags.DO5, we showed that DM stabilizes the interactions between the DO α1 and β1 chains and that DM influences DO folding in the ER. Thus, the Mags.DO5+ conformation correlates with DO egress from the ER. To further evaluate this conformational change, directed evolution was applied to DO. Of the 41 unique mutants obtained, 25% were localized at the DM-DO binding interface and 12% at the solvent-exposed β1 domain, which is thought to be the Mags.DO5 epitope. In addition, I used the library to test the ability of HLA-DO to inhibit HLA-DM, sorting for the amount of CLIP. Interestingly, most of the mutants showed a decreased inhibitory effect, supporting the notion that the intrinsic instability of DO is required for its function. Finally, these results support the model in which DO competes against classical MHCII molecules by sequestering DM's chaperone function. MHCII molecules are also characterized by their ability to present superantigens, a group of bacterial or viral toxins that coerce MHCII-TCR binding in a less promiscuous fashion than what is observed in a canonical setting. While the mechanism by which bacterial superantigens form trimeric complexes with the TCR and MHCII is well understood, the mouse mammary tumor virus superantigens (vSAG) are poorly defined. In the absence of a crystal structure, I chose a functional approach to examine the relation between vSAG, MHCII and TCR, with the goal of uncovering the overall trimolecular architecture. I showed that the TCR concomitantly binds both the MHCII α chain and the vSAG and that TCR-MHCII docking is almost canonical when coerced by vSAGs. Because many peptides may be tolerated in the MHCII groove, the pressure exerted by vSAG seems to tweak conventional TCR-MHCII interactions. Furthermore, my results demonstrate that vSAG binding to MHCII molecules is conformation-dependent and is abrogated by CLIP amino-terminal residues extending outside the peptide-binding groove. They also suggest that vSAGs cross-link adjacent MHCII molecules and activate T cells via a TGXY motif.
Abstract:
Active magnetic bearings make it possible to support rotating bodies without contact by means of magnetic fields. Owing to the nature of the system, the essential signals of actively magnetically levitated machines are available for diagnostic tasks without additional measurement hardware. This work develops a concept that uses these system-inherent signals to diagnose magnetically levitated rotating machines, enabling continuous condition monitoring as well as a rapid assessment of the machine state. Faults can be detected early, identified by cause, type and magnitude, and appropriate countermeasures can be initiated. From the acquired signals, features are extracted using signal-based and model-based methods. For the magnetic bearing control loop, model-based parameter identification methods are investigated and their applicability to the diagnosis of the controller and the power amplifier is demonstrated. Using simulation models and experiments on test rigs, the feature trajectories are recorded for the normal reference state and for occurring faults, and the results are stored in a knowledge base. This knowledge base serves as the foundation for defining limit values and rules for system monitoring and for building knowledge-based diagnostic models. During monitoring, the feature values are checked against these limits, information about detected faults and operating states is generated, and alarm messages are issued where necessary. Slowly developing faults can be detected by computing feature trends with regression analysis. Going beyond the limit-value monitoring customary for active magnetic bearings so far, the fault diagnosis combines the extracted features to identify and localize occurring faults. The diagnosis is performed with rule-based fuzzy logic, which allows linguistic statements in the form of expert knowledge to be incorporated and uncertainties to be taken into account, thereby enabling the diagnosis of complex systems. Diagnostic models are created and verified for actuator, sensor and controller faults in the magnetic bearing control loop as well as for faults caused by external forces and unbalance. It is demonstrated that the developed diagnostic concept delivers correct diagnostic statements with manageable computational effort. By cascading fuzzy logic modules, the transparency of the rule base is preserved and the evaluation of the rules is optimized. The end result is a novel hybrid diagnostic concept that combines signal-based and model-based feature extraction with knowledge-based fault diagnosis methods. The concept is designed to be adaptable to different requirements and applications in rotating machinery.
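The abstract gives no implementation details; the sketch below only illustrates the general pattern of a small rule-based fuzzy inference step mapping two monitored features to fault indications. The membership functions, thresholds, feature names and rules are entirely hypothetical, not taken from this work:

```python
import numpy as np

def mu_high(x, lo, hi):
    """Fuzzy membership 'high': 0 below lo, 1 above hi, linear in between."""
    return float(np.clip((x - lo) / (hi - lo), 0.0, 1.0))

def diagnose(control_current_rms_A, rotor_displacement_um):
    """Toy rule-based fuzzy diagnosis of a magnetic bearing (illustrative only).

    Two hypothetical rules:
      R1: IF control current is high AND displacement is high
          THEN an external-force/unbalance fault is indicated.
      R2: IF control current is high AND displacement is low
          THEN an actuator or amplifier fault is indicated.
    AND is the min operator, NOT is 1 - mu.
    """
    current_high = mu_high(control_current_rms_A, lo=1.0, hi=3.0)
    displacement_high = mu_high(rotor_displacement_um, lo=20.0, hi=100.0)

    unbalance_fault = min(current_high, displacement_high)        # R1
    actuator_fault = min(current_high, 1.0 - displacement_high)   # R2
    return {"external force / unbalance": unbalance_fault,
            "actuator / amplifier": actuator_fault}

# Example: elevated control current with moderate rotor displacement.
for fault, degree in diagnose(2.5, 60.0).items():
    print(f"{fault}: degree of fulfilment {degree:.2f}")
```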