818 results for fuzzy rule base models


Relevance:

30.00%

Publisher:

Abstract:

Background and aims: The boundaries between the categories of body composition provided by vectorial analysis of bioimpedance are not well defined. In this paper, fuzzy set theory was used to model this uncertainty. Methods: An Italian database with 179 cases (18-70 years) was divided randomly into development (n = 20) and testing (n = 159) samples. Of the 159 registries in the testing sample, 99 contributed an unequivocal diagnosis. Resistance/height and reactance/height were the input variables of the model; the output variables were the seven categories of body composition of vectorial analysis. For each case, the linguistic model estimated the membership degree of each impedance category. Kappa statistics were used to compare these results to the previously established diagnoses, which required singling out one category from the output set of seven membership degrees. This procedure (the defuzzification rule) established that the category with the highest membership degree is the most likely category for the case. Results: The fuzzy model showed a good fit to the development sample. Excellent agreement was achieved between the defuzzified impedance diagnoses and the clinical diagnoses in the testing sample (Kappa = 0.85, p < 0.001). Conclusions: The fuzzy linguistic model was in good agreement with clinical diagnoses. If the whole model output is considered, information on the extent to which each BIVA category is present better advises clinical practice, with an enlarged nosological framework and diverse therapeutic strategies. (C) 2012 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.
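
The defuzzification rule described above reduces to taking the maximum over the seven membership degrees. A minimal Python sketch, with category names and membership values as illustrative assumptions rather than the study's data:

```python
# Minimal sketch of the paper's defuzzification rule: pick the BIVA
# category with the highest membership degree. Category names and the
# example membership degrees are illustrative assumptions, not data
# from the study.

BIVA_CATEGORIES = [
    "obese", "athletic", "lean", "cachectic",
    "dehydrated", "edema", "normal",
]

def defuzzify(memberships: dict) -> str:
    """Return the category with the highest membership degree."""
    return max(memberships, key=memberships.get)

# Example: fuzzy output for one case (illustrative values).
case = {c: 0.0 for c in BIVA_CATEGORIES}
case.update({"normal": 0.72, "lean": 0.45, "dehydrated": 0.10})

print(defuzzify(case))  # -> "normal"
```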

Relevance:

30.00%

Publisher:

Abstract:

The concept behind a biodegradable ligament device is to temporarily replace the biomechanical functions of the ruptured ligament while it progressively regenerates its capacities. However, there is a lack of methods to predict how the mechanical behaviour of biodegradable devices evolves during degradation, which is an important aspect of the project. In this work, a hyperelastic constitutive model is used to predict the mechanical behaviour of a biodegradable rope made of aliphatic polyesters. A numerical approach using ABAQUS is presented, in which the material parameters of the proposed model are automatically updated according to the degradation time by means of a PYTHON script. A User Material subroutine (UMAT) is also used to apply a failure criterion based on a strength that decreases according to a first-order differential equation. The parameterization of the proposed material model for different degradation times was achieved by fitting the theoretical curves to experimental data from tensile tests on fibres. To model the behaviour of the whole rope, a homogenisation step was applied, accounting for the fibre architecture in an elementary volume. (C) 2012 Elsevier Ltd. All rights reserved.
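
The failure criterion rests on a strength that decays according to a first-order differential equation, dσ/dt = -kσ, i.e. σ(t) = σ0·exp(-kt). A minimal sketch, with σ0 and k as illustrative assumptions rather than the fitted values of the thesis:

```python
import math

# Strength decay per a first-order ODE, d(sigma)/dt = -k * sigma,
# giving sigma(t) = sigma0 * exp(-k * t). sigma0 and k below are
# illustrative assumptions, not the fitted values from the thesis.

SIGMA0 = 900.0   # initial tensile strength of the fibre, MPa (assumed)
K = 0.015        # degradation rate constant, 1/day (assumed)

def strength(t_days: float) -> float:
    """Residual strength after t_days of degradation."""
    return SIGMA0 * math.exp(-K * t_days)

def has_failed(stress: float, t_days: float) -> bool:
    """Failure criterion: current stress exceeds the degraded strength."""
    return stress >= strength(t_days)

for t in (0, 30, 90, 180):
    print(f"day {t:3d}: strength = {strength(t):7.1f} MPa")
```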

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: This study proposes a new approach that considers uncertainty in predicting and quantifying the presence and severity of diabetic peripheral neuropathy. METHODS: A rule-based fuzzy expert system was designed by four experts in diabetic neuropathy. The model variables were used to classify neuropathy in diabetic patients as mild, moderate, or severe. System performance was evaluated by means of the Kappa agreement measure, comparing the results of the model with those generated by the experts in an assessment of 50 patients. Accuracy was evaluated by an ROC curve analysis based on 50 other cases, whose clinical assessments were considered the gold standard. RESULTS: According to the Kappa analysis, the model was in moderate agreement with expert opinions. The ROC analysis (evaluation of accuracy) yielded an area under the curve of 0.91, demonstrating very good consistency in classifying patients with diabetic neuropathy. CONCLUSION: The model efficiently classified diabetic patients with different degrees of neuropathy severity. In addition, the model provides a way to quantify diabetic neuropathy severity and allows a more accurate assessment of the patient's condition.
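
The two evaluation steps described (Kappa agreement against expert ratings, and ROC accuracy against the clinical gold standard) can be sketched with scikit-learn, which is not the study's software; the label arrays below are synthetic stand-ins, not the study's patient data:

```python
# Sketch of the two evaluation steps described above, using
# scikit-learn. The label arrays are synthetic stand-ins, not the
# study's 50-patient data sets.
from sklearn.metrics import cohen_kappa_score, roc_auc_score

# Agreement between the fuzzy system and the experts (ordinal classes:
# 0 = none, 1 = mild, 2 = moderate, 3 = severe).
experts = [0, 1, 1, 2, 3, 2, 0, 1]
system  = [0, 1, 2, 2, 3, 2, 0, 0]
print("kappa:", cohen_kappa_score(experts, system))

# Accuracy against the clinical gold standard (binary: neuropathy
# present or absent), scored from the system's severity output.
gold   = [0, 0, 1, 1, 1, 0, 1, 0]
scores = [0.1, 0.3, 0.8, 0.7, 0.9, 0.4, 0.6, 0.2]
print("AUC:", roc_auc_score(gold, scores))
```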

Relevance:

30.00%

Publisher:

Abstract:

The research activity carried out during the PhD course focused on the development of mathematical models of cognitive processes and their validation against data in the literature, with a double aim: i) to achieve a better interpretation and explanation of the great amount of data obtained on these processes with different methodologies (electrophysiological recordings in animals; neuropsychological, psychophysical and neuroimaging studies in humans); ii) to exploit model predictions and results to guide future research and experiments. The research activity focused on two projects: 1) the development of networks of neural oscillators, to investigate the mechanisms of synchronization of neural oscillatory activity during cognitive processes such as object recognition, memory, language and attention; 2) the mathematical modelling of multisensory integration processes (e.g. visual-acoustic), which occur in several cortical and subcortical regions (in particular in a subcortical structure named the Superior Colliculus (SC)) and which are fundamental for orienting motor and attentive responses to external stimuli. This activity was realized in collaboration with the Center for Studies and Researches in Cognitive Neuroscience of the University of Bologna (in Cesena) and the Department of Neurobiology and Anatomy of the Wake Forest University School of Medicine (NC, USA).

PART 1. The representation of objects in a number of cognitive functions, like perception and recognition, involves distributed processes in different cortical areas. One of the main neurophysiological questions concerns how the correlation between these disparate areas is realized, so as to group together the characteristics of the same object (binding problem) and to keep segregated the properties belonging to different objects simultaneously present (segmentation problem). Different theories have been proposed to address these questions (Barlow, 1972). One of the most influential is the so-called "assembly coding" theory postulated by Singer (2003), according to which 1) an object is well described by a few fundamental properties, processed in different and distributed cortical areas; 2) recognition of the object is realized by the simultaneous activation of the cortical areas representing its different features; 3) groups of properties belonging to different objects are kept separated in the time domain. In Chapter 1.1 and Chapter 1.2 we present two neural network models for object recognition based on the "assembly coding" hypothesis. These models are networks of Wilson-Cowan oscillators which exploit: i) two high-level "Gestalt rules" (the similarity and previous-knowledge rules) to realize the functional link between elements of different cortical areas representing properties of the same object (binding problem); ii) the synchronization of neural oscillatory activity in the γ-band (30-100 Hz) to segregate in time the representations of different objects simultaneously present (segmentation problem). These models are able to recognize and reconstruct multiple simultaneous external objects, even in difficult cases (some wrong or missing features, shared features, superimposed noise). In Chapter 1.3 the previous models are extended to realize a semantic memory, in which sensory-motor representations of objects are linked with words. To this aim, the previously developed network, devoted to the representation of objects as collections of sensory-motor features, is reciprocally linked with a second network devoted to the representation of words (lexical network). Synapses linking the two networks are trained via a time-dependent Hebbian rule during a training period in which individual objects are presented together with the corresponding words. Simulation results demonstrate that, during the retrieval phase, the network can deal with the simultaneous presence of objects (from sensory-motor inputs) and words (from linguistic inputs), can correctly associate objects with words, and can segment objects even in the presence of incomplete information. Moreover, the network can realize some semantic links among words representing objects with shared features. These results support the idea that semantic memory can be described as an integrated process whose content is retrieved by the co-activation of different multimodal regions. In perspective, extended versions of this model may be used to test conceptual theories and to provide a quantitative assessment of existing data (for instance concerning patients with neural deficits).

PART 2. The ability of the brain to integrate information from different sensory channels is fundamental to perception of the external world (Stein et al., 1993). It is well documented that a number of extraprimary areas have neurons capable of such a task; one of the best known of these is the superior colliculus (SC). This midbrain structure receives auditory, visual and somatosensory inputs from different subcortical and cortical areas and is involved in the control of orientation to external events (Wallace et al., 1993). SC neurons respond to each of these sensory inputs separately, but are also capable of integrating them (Stein et al., 1993), so that the response to combined multisensory stimuli is greater than the response to the individual component stimuli (enhancement). This enhancement is proportionately greater when the modality-specific paired stimuli are weaker (the principle of inverse effectiveness). Several studies have shown that the capability of SC neurons to engage in multisensory integration requires inputs from cortex, primarily the anterior ectosylvian sulcus (AES), but also the rostral lateral suprasylvian sulcus (rLS). If these cortical inputs are deactivated, the response of SC neurons to cross-modal stimulation is no different from that evoked by the most effective of its individual component stimuli (Jiang et al., 2001). This phenomenon can be better understood through mathematical models: the use of mathematical models and neural networks can place the mass of data accumulated about this phenomenon and its underlying circuitry into a coherent theoretical structure. In Chapter 2.1 a simple neural network model of this structure is presented; this model is able to reproduce a large number of SC behaviours, such as multisensory enhancement, multisensory and unisensory depression, and inverse effectiveness. In Chapter 2.2 this model is improved by incorporating more neurophysiological knowledge about the neural circuitry underlying SC multisensory integration, in order to suggest possible physiological mechanisms through which it is effected. This endeavour was realized in collaboration with Professor B.E. Stein and Doctor B. Rowland during the six-month period spent at the Department of Neurobiology and Anatomy of the Wake Forest University School of Medicine (NC, USA), within the Marco Polo Project. The model includes four distinct unisensory areas devoted to a topological representation of external stimuli. Two of them represent subregions of the AES (FAES, an auditory area, and AEV, a visual area) and send descending inputs to the ipsilateral SC; the other two represent subcortical areas (one auditory and one visual) projecting ascending inputs to the same SC. Different competitive mechanisms, realized by means of populations of interneurons, are used in the model to reproduce the different behaviour of SC neurons under cortical activation and deactivation. The model, with a single set of parameters, is able to mimic the behaviour of SC multisensory neurons in response to very different stimulus conditions (multisensory enhancement, inverse effectiveness, within- and cross-modal suppression of spatially disparate stimuli), with cortex functional and cortex deactivated, and with a particular type of membrane receptor (NMDA receptors) active or inhibited. All these results agree with the data reported in Jiang et al. (2001) and Binns and Salt (1996). The model suggests that non-linearities in neural responses and in synaptic (excitatory and inhibitory) connections can explain the fundamental aspects of multisensory integration, and provides a biologically plausible hypothesis about the underlying circuitry.
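
A single excitatory-inhibitory unit of the Wilson-Cowan type used as the building block in Part 1 can be sketched directly; the parameters below are the classic Wilson-Cowan (1972) limit-cycle values, used as illustrative stand-ins for the thesis models, and only one unit is shown rather than a synchronized network:

```python
import math

# Single Wilson-Cowan excitatory (E) / inhibitory (I) unit, integrated
# with forward Euler. Parameters are the classic Wilson-Cowan (1972)
# limit-cycle values, illustrative stand-ins for the thesis models.

def s(x, a, theta):
    """Logistic response function with gain a and threshold theta."""
    return 1.0 / (1.0 + math.exp(-a * (x - theta)))

def simulate(p=1.25, q=0.0, dt=0.01, steps=20000):
    c1, c2, c3, c4 = 16.0, 12.0, 15.0, 3.0
    e, i, trace = 0.1, 0.05, []
    for _ in range(steps):
        de = -e + (1.0 - e) * s(c1 * e - c2 * i + p, a=1.3, theta=4.0)
        di = -i + (1.0 - i) * s(c3 * e - c4 * i + q, a=2.0, theta=3.7)
        e, i = e + dt * de, i + dt * di
        trace.append(e)
    return trace

trace = simulate()
tail = trace[len(trace) // 2:]  # discard the transient
print(f"E oscillates between {min(tail):.3f} and {max(tail):.3f}")
```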

Relevance:

30.00%

Publisher:

Abstract:

The "sustainability" concept relates to the prolonging of human economic systems with as little detrimental impact on ecological systems as possible. Construction that exhibits good environmental stewardship and practices that conserve resources in a manner that allow growth and development to be sustained for the long-term without degrading the environment are indispensable in a developed society. Past, current and future advancements in asphalt as an environmentally sustainable paving material are especially important because the quantities of asphalt used annually in Europe as well as in the U.S. are large. The asphalt industry is still developing technological improvements that will reduce the environmental impact without affecting the final mechanical performance. Warm mix asphalt (WMA) is a type of asphalt mix requiring lower production temperatures compared to hot mix asphalt (HMA), while aiming to maintain the desired post construction properties of traditional HMA. Lowering the production temperature reduce the fuel usage and the production of emissions therefore and that improve conditions for workers and supports the sustainable development. Even the crumb-rubber modifier (CRM), with shredded automobile tires and used in the United States since the mid 1980s, has proven to be an environmentally friendly alternative to conventional asphalt pavement. Furthermore, the use of waste tires is not only relevant in an environmental aspect but also for the engineering properties of asphalt [Pennisi E., 1992]. This research project is aimed to demonstrate the dual value of these Asphalt Mixes in regards to the environmental and mechanical performance and to suggest a low environmental impact design procedure. In fact, the use of eco-friendly materials is the first phase towards an eco-compatible design but it cannot be the only step. The eco-compatible approach should be extended also to the design method and material characterization because only with these phases is it possible to exploit the maximum potential properties of the used materials. Appropriate asphalt concrete characterization is essential and vital for realistic performance prediction of asphalt concrete pavements. Volumetric (Mix design) and mechanical (Permanent deformation and Fatigue performance) properties are important factors to consider. Moreover, an advanced and efficient design method is necessary in order to correctly use the material. A design method such as a Mechanistic-Empirical approach, consisting of a structural model capable of predicting the state of stresses and strains within the pavement structure under the different traffic and environmental conditions, was the application of choice. In particular this study focus on the CalME and its Incremental-Recursive (I-R) procedure, based on damage models for fatigue and permanent shear strain related to the surface cracking and to the rutting respectively. It works in increments of time and, using the output from one increment, recursively, as input to the next increment, predicts the pavement conditions in terms of layer moduli, fatigue cracking, rutting and roughness. This software procedure was adopted in order to verify the mechanical properties of the study mixes and the reciprocal relationship between surface layer and pavement structure in terms of fatigue and permanent deformation with defined traffic and environmental conditions. The asphalt mixes studied were used in a pavement structure as surface layer of 60 mm thickness. 
The performance of the pavement was compared to the performance of the same pavement structure where different kinds of asphalt concrete were used as surface layer. In comparison to a conventional asphalt concrete, three eco-friendly materials, two warm mix asphalt and a rubberized asphalt concrete, were analyzed. The First Two Chapters summarize the necessary steps aimed to satisfy the sustainable pavement design procedure. In Chapter I the problem of asphalt pavement eco-compatible design was introduced. The low environmental impact materials such as the Warm Mix Asphalt and the Rubberized Asphalt Concrete were described in detail. In addition the value of a rational asphalt pavement design method was discussed. Chapter II underlines the importance of a deep laboratory characterization based on appropriate materials selection and performance evaluation. In Chapter III, CalME is introduced trough a specific explanation of the different equipped design approaches and specifically explaining the I-R procedure. In Chapter IV, the experimental program is presented with a explanation of test laboratory devices adopted. The Fatigue and Rutting performances of the study mixes are shown respectively in Chapter V and VI. Through these laboratory test data the CalME I-R models parameters for Master Curve, fatigue damage and permanent shear strain were evaluated. Lastly, in Chapter VII, the results of the asphalt pavement structures simulations with different surface layers were reported. For each pavement structure, the total surface cracking, the total rutting, the fatigue damage and the rutting depth in each bound layer were analyzed.
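
The incremental-recursive logic can be caricatured in a few lines: each increment predicts damage from the current (already damaged) layer modulus, and the updated modulus becomes the input to the next increment. A toy sketch, with the damage law and all constants as illustrative assumptions rather than CalME's calibrated models:

```python
# Toy sketch of the Incremental-Recursive (I-R) idea: the output of one
# time increment (the damaged surface-layer modulus) is fed back,
# recursively, as input to the next increment. The damage law and all
# constants are illustrative assumptions, not CalME's calibrated models.

E0 = 8000.0            # undamaged surface-layer modulus, MPa (assumed)
E_MIN = 800.0          # modulus floor at full damage, MPa (assumed)
LOADS_PER_MONTH = 5e4  # traffic repetitions per increment (assumed)

def added_damage(strain: float, n_loads: float) -> float:
    """Fatigue damage contributed by n_loads at a given tensile strain."""
    n_allowable = 2.5e-5 * strain ** -3.0   # assumed fatigue law
    return n_loads / n_allowable

damage, modulus = 0.0, E0
for month in range(240):                      # 20-year analysis period
    strain = 1e-4 * (E0 / modulus) ** 0.5     # softer layer, higher strain
    damage = min(1.0, damage + added_damage(strain, LOADS_PER_MONTH))
    modulus = E0 - (E0 - E_MIN) * damage      # recursive modulus update

print(f"fatigue damage after 20 years: {damage:.2f}")
print(f"remaining surface modulus: {modulus:.0f} MPa")
```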

Relevance:

30.00%

Publisher:

Abstract:

Over the last 60 years, computers and software have enabled incredible advancements in every field. Nowadays, however, these systems are so complicated that it is difficult, if not impossible, to understand whether they meet some requirement or are able to show some desired behaviour or property. This dissertation introduces a Just-In-Time (JIT) a posteriori approach to conformance checking, which identifies any deviation from the desired behaviour as soon as possible and, where feasible, applies corrections. The declarative framework that implements our approach, developed entirely on the promising open-source forward-chaining Production Rule System (PRS) named Drools, consists of three components: 1. a monitoring module based on a novel, efficient implementation of Event Calculus (EC); 2. a general-purpose hybrid reasoning module (the first of its genre) merging temporal, semantic, fuzzy and rule-based reasoning; 3. a logic formalism based on the concept of expectations, introducing Event-Condition-Expectation rules (ECE-rules) to assess the global conformance of a system. The framework is accompanied by an optional module that provides Probabilistic Inductive Logic Programming (PILP). By shifting the conformance check from after execution to just in time, this approach combines the advantages of many a posteriori and a priori methods proposed in the literature. Quite remarkably, if corrective actions are explicitly given, the reactive nature of this methodology allows any deviation from the desired behaviour to be reconciled as soon as it is detected. In conclusion, the proposed methodology advances the state of conformance checking, helping to fill the gap between humans and increasingly complex technology.
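
The expectation-based conformance idea can be sketched in a few lines of Python (rather than Drools DRL, purely for illustration): an event whose condition holds raises an expectation, and conformance is judged by whether a matching event arrives before its deadline. All rule and event names are hypothetical:

```python
# Minimal sketch of the Event-Condition-Expectation (ECE) idea, in
# Python rather than Drools DRL, purely for illustration. All rule and
# event names are hypothetical.
from dataclasses import dataclass

@dataclass
class Event:
    name: str
    time: float

@dataclass
class Expectation:
    expected: str       # event name that should occur
    deadline: float     # latest acceptable time
    fulfilled: bool = False

pending: list[Expectation] = []
violations: list[Expectation] = []

def on_event(ev: Event) -> None:
    """Fulfil matching expectations, flag expired ones, raise new ones."""
    for exp in pending:
        if exp.expected == ev.name and ev.time <= exp.deadline:
            exp.fulfilled = True
    violations.extend(e for e in pending
                      if not e.fulfilled and ev.time > e.deadline)
    pending[:] = [e for e in pending
                  if not e.fulfilled and ev.time <= e.deadline]
    # ECE-rule: an "order" event raises the expectation of a "payment".
    if ev.name == "order":
        pending.append(Expectation("payment", deadline=ev.time + 10.0))

for ev in [Event("order", 0.0), Event("payment", 4.0),
           Event("order", 5.0), Event("tick", 20.0)]:
    on_event(ev)
print("violations:", violations)  # the second order's payment never arrived
```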

Relevance:

30.00%

Publisher:

Abstract:

The goal of the present research is to define a Semantic Web framework for precedent modelling, using knowledge extracted from text, metadata and rules while maintaining a strong text-to-knowledge morphism between legal text and legal concepts, in order to fill the gap between a legal document and its semantics. The framework is composed of four models that use standard languages from the Semantic Web stack of technologies: a document metadata structure, modelling the main parts of a judgement and creating a bridge between a text and its semantic annotations of legal concepts; a legal core ontology, modelling abstract legal concepts and institutions contained in a rule of law; a legal domain ontology, modelling the main legal concepts in a specific domain concerned by case-law; and an argumentation system, modelling the structure of argumentation. The input to the framework includes metadata associated with judicial concepts and an ontology library representing the structure of case-law. The research relies on previous efforts of the community in legal knowledge representation and rule interchange for applications in the legal domain, applying the theory to a set of real legal documents and stressing OWL axiom definitions as much as possible, so as to provide a semantically powerful representation of the legal document and a solid ground for an argumentation system using a defeasible subset of predicate logic. It appears that some new features of OWL 2 unlock useful reasoning features for legal knowledge, especially when combined with defeasible rules and argumentation schemes. The main task is thus to formalize the legal concepts and argumentation patterns contained in a judgement, with the following requirement: to check, validate and reuse the discourse of a judge, and the argumentation he produces, as expressed by the judicial text.
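
The text-to-knowledge bridge between a judgement's text and its semantic annotations can be sketched with rdflib; the namespaces, class names, and property names below are hypothetical illustrations, not the framework's actual ontologies:

```python
# Sketch of the text-to-knowledge bridge: linking a fragment of a
# judgement to legal-domain concepts with rdflib. The namespaces,
# class names, and property names are hypothetical, for illustration.
from rdflib import Graph, Literal, Namespace, RDF

DOC = Namespace("http://example.org/judgement/")      # assumed
ONTO = Namespace("http://example.org/legal-onto#")    # assumed

g = Graph()
g.bind("onto", ONTO)

# Document metadata layer: one paragraph of the judgement's text.
para = DOC["case42/para7"]
g.add((para, RDF.type, ONTO.JudgementParagraph))
g.add((para, ONTO.hasText, Literal("The court holds that ...")))

# Semantic annotation layer: the paragraph expresses a ratio decidendi
# applying a concept from the legal core/domain ontologies.
g.add((para, ONTO.expresses, ONTO.RatioDecidendi))
g.add((para, ONTO.applies, ONTO.DutyOfCare))

print(g.serialize(format="turtle"))
```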

Relevance:

30.00%

Publisher:

Abstract:

The thesis addresses the concept of occupational risk exposure; its purpose is to investigate the work environment and worker behaviour, with the goal of reducing the incidence rate of occupational accidents and achieving risk reduction. First, a new methodology named MIMOSA (Methodology for the Implementation and Monitoring of Occupational SAfety) is proposed, which quantifies the "health and safety" level of any company. Achieving this goal required a multidisciplinary approach in which engineering and psychology concepts were combined to develop a methodology for accident prediction and for the improvement of workplace safety. The results of testing MIMOSA prompted the use of Fuzzy Logic in the occupational safety field, both to improve the methodology itself and to overcome the problems encountered in the uncertainty of data collection. The literature shows that human factors, risk perception, and worker behaviour in relation to perceived risk play a very important role in the occurrence of accidents. This consideration led to a new approach and to a second methodology, which consists of preventing accidents on a basis other than the analysis of their past dynamics alone. The methodology considers the evaluation of an index based on workers' proactive behaviours and on the potential damage of the incident events that were avoided. The innovation lies in the application of Fuzzy Logic to account for the "indeterminacy" of human behaviour and its natural language. In particular, the application focuses on worker proactivity and aims to prevent the "injury" event through the generation of a kind of leading indicator. The procedure was tested on an Italian petrochemical company.
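
The fuzzy leading-indicator idea can be caricatured as follows: worker proactivity and the potential damage of an avoided event are fuzzified and combined by a handful of rules into an anticipatory index. The membership functions and rules below are illustrative assumptions, not MIMOSA's:

```python
# Toy sketch of the fuzzy leading-indicator idea: fuzzify worker
# proactivity and the potential damage of avoided events, combine them
# with min/max rule inference, and defuzzify to a single index.
# Membership functions and rules are illustrative, not MIMOSA's.

def tri(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def safety_index(proactivity: float, potential_damage: float) -> float:
    """Both inputs in [0, 1]; returns an anticipatory index in [0, 1]."""
    pro_low = tri(proactivity, -0.5, 0.0, 0.6)
    pro_high = tri(proactivity, 0.4, 1.0, 1.5)
    dmg_low = tri(potential_damage, -0.5, 0.0, 0.6)
    dmg_high = tri(potential_damage, 0.4, 1.0, 1.5)
    # Rules: high proactivity & low damage -> good; low proactivity &
    # high damage -> poor; the two mixed cases -> intermediate.
    good = min(pro_high, dmg_low)
    poor = min(pro_low, dmg_high)
    mid = max(min(pro_high, dmg_high), min(pro_low, dmg_low))
    # Weighted-average defuzzification over {poor: 0, mid: 0.5, good: 1}.
    total = good + mid + poor
    return 0.5 if total == 0 else (1.0 * good + 0.5 * mid) / total

print(f"index: {safety_index(0.8, 0.3):.2f}")
```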

Relevance:

30.00%

Publisher:

Abstract:

One of the most important research fields involving astrophysicists is the understanding of the Large Scale Structure of the universe. The principles of Structure Formation are now well established and form the basis of the so-called "Standard Cosmological Model". Until the early 2000s, the theory that successfully explained the statistical properties of the universe was "Standard Perturbation Theory". Numerical simulations and better-quality observations then revealed the limits of that theory in describing the behaviour of the power spectrum on scales beyond the linear regime. This drove theorists to look for a new perturbative approach capable of extending the validity of the analytical results. This thesis discusses the "Renormalized Perturbation Theory" and "Multipoint Propagator" approaches. These new perturbation theories are the theoretical basis of BisTeCca, an original numerical code that computes the power spectrum at 2 loops and the bispectrum at 1 loop in perturbative order. As an application example, we used BisTeCca to analyse bispectra in universe models beyond the standard LambdaCDM cosmology, introducing a massive neutrino component. Finally, the effects on the power spectrum and bispectrum obtained with our BisTeCca code are shown, and universe models with different neutrino masses are compared.
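
For context, the leading-order (tree-level) counterpart of the quantities BisTeCca computes at loop order can be written in a few lines: the standard-PT matter bispectrum is B(k1,k2,k3) = 2 F2(k1,k2) P(k1) P(k2) + cyclic permutations. A sketch with a placeholder power spectrum, not a real linear spectrum:

```python
# Tree-level standard-PT matter bispectrum,
#   B(k1,k2,k3) = 2 F2(k1,k2) P(k1) P(k2) + 2 cyclic terms,
# shown as the leading-order baseline for the 1-loop bispectra that
# BisTeCca computes. The power-law P(k) is a placeholder, not a real
# linear power spectrum.

def plin(k):
    return 1.0e4 * k ** -1.5        # placeholder linear power spectrum

def f2(k1, k2, mu):
    """Second-order SPT kernel; mu is the cosine between k1 and k2."""
    return 5.0 / 7.0 + 0.5 * mu * (k1 / k2 + k2 / k1) + 2.0 / 7.0 * mu ** 2

def bispectrum(k1, k2, k3):
    """Tree-level B for a closed triangle with side lengths k1, k2, k3."""
    def mu(a, b, c):
        # cosine of the angle between sides a and b, from c = -(a + b)
        return (c * c - a * a - b * b) / (2.0 * a * b)
    return (2.0 * f2(k1, k2, mu(k1, k2, k3)) * plin(k1) * plin(k2)
            + 2.0 * f2(k2, k3, mu(k2, k3, k1)) * plin(k2) * plin(k3)
            + 2.0 * f2(k3, k1, mu(k3, k1, k2)) * plin(k3) * plin(k1))

print(f"equilateral B(0.1, 0.1, 0.1) = {bispectrum(0.1, 0.1, 0.1):.3e}")
```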

Relevance:

30.00%

Publisher:

Abstract:

The ability of the PM3 semiempirical quantum mechanical method to reproduce hydrogen bonding in nucleotide base pairs was assessed. Results of PM3 calculations on the nucleotides 2′-deoxyadenosine 5′-monophosphate (pdA), 2′-deoxyguanosine 5′-monophosphate (pdG), 2′-deoxycytidine 5′-monophosphate (pdC), and 2′-deoxythymidine 5′-monophosphate (pdT) and the base pairs pdA–pdT, pdG–pdC, and pdG(syn)–pdC are presented and discussed. The PM3 method is the first of the parameterized NDDO quantum mechanical models with any ability to reproduce hydrogen bonding between nucleotide base pairs. Intermolecular hydrogen bond lengths between nucleotides displaying Watson–Crick base pairing are 0.1–0.2 Å shorter than experimental values. Nucleotide bond distances, bond angles, and torsion angles about the glycosyl bond (χ), the C4′-C5′ bond (γ), and the C5′-O5′ bond (β) agree with experimental results. There are many possible conformations of nucleotides; PM3 calculations reveal that many of the most stable conformations are stabilized by intramolecular CH···O hydrogen bonds. These interactions disrupt the usual sugar puckering. The stacking interactions of a dT–pdA duplex are examined at different levels of gradient optimization. The intramolecular hydrogen bonds found in the nucleotide base pairs disappear in the duplex, as a result of the additional constraints on the phosphate group when it is part of a DNA backbone. Sugar puckering is reproduced by the PM3 method for the four bases in the dT–pdA duplex. PM3 underestimates the attractive stacking interactions of base pairs in a B-DNA helical conformation. The performance of the PM3 method implemented in SPARTAN is contrasted with that implemented in MOPAC. At present, accurate ab initio calculations are too time-consuming to be of practical use, and molecular mechanics methods cannot be used for quantum mechanical tasks such as reaction-path calculations, transition-state structures, and activation energies. The PM3 method should be used with extreme caution for the examination of small DNA systems. Future parameterizations of semiempirical methods should incorporate base-stacking interactions into the parameterization data set to enhance the ability of these methods to describe such systems.

Relevance:

30.00%

Publisher:

Abstract:

With the advent of cheaper and faster DNA sequencing technologies, assembly methods have changed greatly. Instead of outputting reads that are thousands of base pairs long, new sequencers parallelize the task by producing read lengths between 35 and 400 base pairs. Reconstructing an organism's genome from these millions of reads is a computationally expensive task. Our algorithm solves this problem by organizing and indexing the reads using n-grams, which are short, fixed-length DNA sequences of length n. These n-grams are used to efficiently locate putative read joins, thereby eliminating the need to perform an exhaustive search over all possible read pairs. Our goal was to develop a novel n-gram method for the assembly of genomes from next-generation sequencers. Specifically, a probabilistic, iterative approach was used to determine the most likely reads to join, through the development of a new metric that models the probability of any two arbitrary reads being joined together. Tests were run using simulated short-read data based on randomly generated genomes ranging in length from 10,000 to 100,000 nucleotides with 16 to 20x coverage. We were able to successfully reassemble entire genomes up to 100,000 nucleotides in length.
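
The indexing idea is straightforward to sketch: map every n-gram to the reads containing it, then read candidate joins off the index instead of comparing all read pairs. Toy reads and a toy n below; the probabilistic join metric itself is not reproduced here:

```python
# Minimal sketch of the n-gram indexing idea: index each read by its
# constituent n-grams so that candidate joins are found by lookup
# rather than by an exhaustive all-pairs search. Toy reads, toy n.
from collections import defaultdict

N = 4  # n-gram length (real assemblers use larger n)

def ngrams(read):
    return {read[i:i + N] for i in range(len(read) - N + 1)}

def index_reads(reads):
    """Map each n-gram to the set of read ids containing it."""
    index = defaultdict(set)
    for rid, read in enumerate(reads):
        for g in ngrams(read):
            index[g].add(rid)
    return index

def candidate_joins(index):
    """Read pairs sharing at least one n-gram: putative joins."""
    pairs = set()
    for rids in index.values():
        for a in rids:
            for b in rids:
                if a < b:
                    pairs.add((a, b))
    return pairs

reads = ["ACGTACGG", "TACGGATC", "GGATCCAA"]
print(candidate_joins(index_reads(reads)))  # {(0, 1), (1, 2)}
```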

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Radio-frequency electromagnetic fields (RF EMF) of mobile communication systems are widespread in the living environment, yet their effects on humans are uncertain despite a growing body of literature. OBJECTIVES: We investigated the influence of a Universal Mobile Telecommunications System (UMTS) base station-like signal on well-being and cognitive performance in subjects with and without self-reported sensitivity to RF EMF. METHODS: We performed a controlled exposure experiment (45 min at an electric field strength of 0, 1, or 10 V/m, incident with a polarization of 45 degrees from the left back side of the subject, weekly intervals) in a randomized, double-blind crossover design. A total of 117 healthy subjects (33 self-reported sensitive, 84 nonsensitive subjects) participated in the study. We assessed well-being, perceived field strength, and cognitive performance with questionnaires and cognitive tasks and conducted statistical analyses using linear mixed models. Organ-specific and brain tissue-specific dosimetry including uncertainty and variation analysis was performed. RESULTS: In both groups, well-being and perceived field strength were not associated with actual exposure levels. We observed no consistent condition-induced changes in cognitive performance except for two marginal effects. At 10 V/m we observed a slight effect on speed in one of six tasks in the sensitive subjects and an effect on accuracy in another task in nonsensitive subjects. Both effects disappeared after multiple end point adjustment. CONCLUSIONS: In contrast to a recent Dutch study, we could not confirm a short-term effect of UMTS base station-like exposure on well-being. The reported effects on brain functioning were marginal and may have occurred by chance. Peak spatial absorption in brain tissue was considerably smaller than during use of a mobile phone. No conclusions can be drawn regarding short-term effects of cell phone exposure or the effects of long-term base station-like exposure on human health.

Relevance:

30.00%

Publisher:

Abstract:

We found a significant positive correlation between local summer air temperature (May-September) and the annual sediment mass accumulation rate (MAR) in Lake Silvaplana (46°N, 9°E, 1800 m a.s.l.) during the twentieth century (r = 0.69, p < 0.001 for decadally smoothed series). Sediment trap data (2001-2005) confirm this relation, with exceptionally high particle yields during the hottest summer of the last 140 years, in 2003. On this basis we developed a decadal-scale summer temperature reconstruction back to AD 1580. Surprisingly, the comparison of our reconstruction with two other independent regional summer temperature reconstructions (based on tree-rings and documentary data) revealed a significant negative correlation for the pre-1900 data (i.e., the late 'Little Ice Age'). This demonstrates that the correlation between MAR and summer temperature is not stable in time and that the actualistic principle does not apply in this case. We suggest that different climatic regimes (modern/'Little Ice Age') lead to changing state conditions in the catchment and thus to considerably different sediment transport mechanisms. Therefore, we calibrated our MAR data with gridded early instrumental temperature series from AD 1760-1880 (r = -0.48, p < 0.01 for decadally smoothed series) to properly reconstruct the late-LIA climatic conditions. We found exceptionally low temperatures between AD 1580 and 1610 (0.75°C below the twentieth-century mean) and during the late Maunder Minimum from AD 1680 to 1710 (0.5°C below the twentieth-century mean). In general, summer temperatures did not experience major negative departures from the twentieth-century mean during the late 'Little Ice Age'. This compares well with the two existing independent regional reconstructions, suggesting that the LIA in the Alps was mainly a phenomenon of the cold season.
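
The calibration step amounts to fitting a linear transfer function between decadally smoothed MAR and instrumental temperature over the calibration window, then applying it to the older MAR series. A sketch with synthetic stand-in data, not the Lake Silvaplana series:

```python
# Sketch of the calibration step: fit a linear transfer function
# between decadally smoothed MAR and instrumental summer temperature,
# then apply it to the pre-instrumental MAR series. All numbers are
# synthetic stand-ins, not the Lake Silvaplana data.
import numpy as np

rng = np.random.default_rng(0)
mar = rng.uniform(0.05, 0.25, 120)                  # g/cm^2/yr (synthetic)
temp = 10.0 - 8.0 * mar + rng.normal(0, 0.2, 120)   # degC (synthetic)

def smooth(x, window=10):
    """Decadal running mean."""
    return np.convolve(x, np.ones(window) / window, mode="valid")

mar_s, temp_s = smooth(mar), smooth(temp)
slope, intercept = np.polyfit(mar_s, temp_s, 1)     # transfer function
print(f"T = {slope:.2f} * MAR + {intercept:.2f}")

# Reconstruction: apply the fit to an older, temperature-free MAR value.
print(f"reconstructed T at MAR = 0.20: {slope * 0.20 + intercept:.2f} degC")
```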

Relevance:

30.00%

Publisher:

Abstract:

Protecting modern interconnected power systems from blackouts is an important and difficult challenge. Applying advanced power system protection techniques and increasing power system stability are ways to improve the reliability and security of power systems. Phasor-domain software packages such as the Power System Simulator for Engineers (PSS/E) can be used to study large power systems but cannot be used for transient analysis. In order to observe both power system stability and the transient behavior of the system during disturbances, modeling has to be done in the time domain. This work focuses on the modeling of power systems and various control systems in the Alternative Transients Program (ATP). ATP is a time-domain power system modeling software package in which all power system components can be modeled in detail. Models are implemented with attention to component representation and parameters. The synchronous machine model includes the saturation characteristics and a control interface. Transient Analysis of Control Systems (TACS) is used to model the excitation control system, the power system stabilizer, and the turbine governor of the synchronous machine. Several base cases of a single-machine system are modeled and benchmarked against PSS/E. A two-area system is modeled, and inter-area and intra-area oscillations are observed. The two-area system is then reduced to a two-machine system using dynamic equivalencing, and the original and reduced systems are benchmarked against PSS/E. This work also includes the simulation of single-pole tripping using one of the base-case models; the advantages of single-pole tripping and a comparison of system behavior against three-pole tripping are studied. Results indicate that the built-in control system models in PSS/E can be effectively reproduced in ATP. The benchmarked models correctly simulate the power system dynamics. The successful implementation of a dynamically reduced system in ATP shows promise for studying a small subsystem of a large system without losing its dynamic behavior. Other aspects, such as relaying, can be investigated using the benchmarked models. It is expected that this work will provide guidance in modeling different control systems for the synchronous machine and in representing dynamic equivalents of large power systems.
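
As a minimal illustration of the time-domain dynamics such models capture, the classical swing equation for a single machine against an infinite bus can be integrated directly; the parameters and fault scenario are illustrative, not the benchmarked ATP/PSS/E cases:

```python
import math

# Minimal time-domain sketch of the classical swing equation for a
# single machine against an infinite bus:
#   M * d2(delta)/dt2 = Pm - Pmax * sin(delta) - D * d(delta)/dt
# Parameters and the fault scenario are illustrative assumptions, not
# the benchmarked ATP/PSS/E cases.

M, D = 0.1, 0.02                  # inertia and damping, pu (assumed)
PM = 0.8                          # mechanical power input, pu (assumed)
PMAX_PRE, PMAX_FAULT = 2.0, 0.4   # transfer limit, pre/during fault

delta = math.asin(PM / PMAX_PRE)  # pre-fault equilibrium angle, rad
omega = 0.0                       # rotor speed deviation, rad/s
dt = 1e-3
for step in range(int(2.0 / dt)):                       # simulate 2 s
    t = step * dt
    pmax = PMAX_FAULT if 0.5 <= t < 0.6 else PMAX_PRE   # 100 ms fault
    domega = (PM - pmax * math.sin(delta) - D * omega) / M
    delta += dt * omega
    omega += dt * domega

print(f"rotor angle after the fault: {math.degrees(delta):.1f} deg")
```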