Abstract:
An electric system based on renewable energy faces challenges concerning the storage and utilization of energy due to the intermittent and seasonal nature of renewable energy sources. Wind and solar photovoltaic power production is variable and difficult to predict, so electricity storage will be needed to support base-load power production. Hydrogen's energetic potential lies in its versatility: it can store chemical energy, serve as an energy carrier, and act as feedstock for various industries; it is also used, e.g., in the production of biofuels. The amount of energy released during hydrogen combustion is higher on a mass basis than that of any other fuel, with a higher heating value of 39.4 kWh/kg. However, even though hydrogen is the most abundant element in the universe, on Earth most hydrogen exists bound in molecules such as water. Hydrogen must therefore be produced, and there are various methods to do so. Today, the majority of hydrogen comes from fossil fuels, mainly from steam methane reforming, and only about 4 % of global hydrogen comes from water electrolysis. The combination of electrolytic hydrogen production from water with a renewable energy supply is attracting growing interest because of the sustainability and increased flexibility of the resulting energy system. The preferred option for intermittent hydrogen storage is pressurization in tanks, since the volumetric energy density of hydrogen at ambient conditions is low, and pressurized tanks are efficient and affordable when the cycling rate is high. Pressurized hydrogen enables energy storage in larger capacities than battery technologies, and the energy can additionally be stored for longer periods, on a time scale of months. In this thesis, the thermodynamics and electrochemistry associated with water electrolysis are described. The main water electrolysis technologies are presented with state-of-the-art specifications. Finally, a Power-to-Hydrogen infrastructure design for Lappeenranta University of Technology is presented; the laboratory setup for water electrolysis is specified, and factors affecting its commissioning in Finland are discussed.
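To make the storage argument concrete, the following minimal Python sketch (not part of the thesis; the tank parameters are hypothetical) converts tank volume and pressure into stored chemical energy using the ideal gas law and the 39.4 kWh/kg higher heating value quoted above. Real hydrogen deviates from ideal-gas behavior at high pressures, so the result is an upper estimate.

```python
R = 8.314          # J/(mol*K), universal gas constant
M_H2 = 2.016e-3    # kg/mol, molar mass of H2
HHV = 39.4         # kWh/kg, higher heating value of hydrogen (from the abstract)

def stored_energy_kwh(volume_m3, pressure_bar, temperature_k=293.15):
    """Chemical energy (HHV basis) of hydrogen held in a rigid tank,
    assuming ideal-gas behavior (real H2 deviates above ~100 bar)."""
    pressure_pa = pressure_bar * 1e5
    n_mol = pressure_pa * volume_m3 / (R * temperature_k)  # ideal gas law
    mass_kg = n_mol * M_H2
    return mass_kg * HHV

# A hypothetical 1 m^3 tank at 200 bar holds roughly 650 kWh of chemical
# energy, far more than a battery rack of comparable size.
print(f"{stored_energy_kwh(1.0, 200):.0f} kWh")
```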
Abstract:
The growing population on Earth, together with diminishing fossil deposits and the climate change debate, calls for better utilization of renewable, bio-based materials. In a biorefinery perspective, renewable biomass is converted into many different products such as fuels, chemicals, and materials, quite similar to the petroleum refinery industry. Since forests cover about one third of the land surface on Earth, lignocellulosic biomass is the most abundant renewable resource available. The natural first step in a biorefinery is the separation and isolation of the different compounds the biomass is comprised of. The major components in wood are cellulose, hemicellulose, and lignin, all of which can be made into various end-products. Today, focus normally lies on utilizing only one component, e.g., the cellulose in the Kraft pulping process. It would be highly desirable to utilize all the different compounds, from both an economic and an environmental point of view, and the separation process should therefore be optimized. Hemicelluloses can partly be extracted with hot water prior to pulping. Depending on the severity of the extraction, the hemicelluloses are degraded to various degrees. In order to be able to choose from a variety of different end-products, the hemicelluloses should be as intact as possible after the extraction. The main focus of this work has been on preserving the hemicellulose molar mass throughout the extraction at a high yield by actively controlling the extraction pH at the high temperatures used. Since it has not been possible to measure pH during an extraction, owing to the high temperatures, the extraction pH has remained a "black box". Therefore, a high-temperature in-line pH measuring system was developed, validated, and tested for hot-water wood extractions. One crucial step in the measurements is calibration; extensive effort was therefore put into developing a reliable calibration procedure. Initial extractions with wood showed that the actual extraction pH was ~0.35 pH units higher than previously believed. The measuring system was also equipped with a controller connected to a pump, which made it possible to control the extraction to any desired pH set point. When the pH dropped below the set point, the controller started pumping in alkali, and the desired set point was thereby maintained very accurately (a minimal sketch of such a control loop follows this abstract). Analyses of the extracted hemicelluloses showed that less hemicellulose was extracted at higher pH, but with a higher molar mass. Monomer formation could, at a certain pH level, be completely inhibited. Increasing the temperature while maintaining a specific pH set point would speed up the extraction without degrading the molar mass of the hemicelluloses, thereby intensifying the extraction. The diffusion of the dissolved hemicelluloses out of the wood particle is a major part of the extraction process. Therefore, a particle-size study ranging from 0.5 mm wood particles to industrial-size wood chips was conducted to investigate the internal mass transfer of the hemicelluloses. Unsurprisingly, it showed that hemicelluloses were extracted faster from smaller wood particles than from larger ones, although particle size did not seem to have a substantial effect on the average molar mass of the extracted hemicelluloses. However, smaller particle sizes require more energy to manufacture and thus increase the economic cost. Since bark comprises 10-15 % of a tree, it is important to also consider it in a biorefinery concept.
Spruce inner and outer bark was hot-water extracted separately to investigate the possibility of isolating the bark hemicelluloses. It was shown that the bark hemicelluloses consisted mostly of pectic material and differed considerably from the wood hemicelluloses. The bark hemicelluloses, or pectins, could be extracted at lower temperatures than the wood hemicelluloses. A chemical characterization, done separately on inner and outer bark, showed that the inner bark contained over 10 % stilbene glucosides, which could be extracted already at 100 °C with aqueous acetone.
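The pH control strategy described in this abstract amounts to a simple on-off control loop. The following Python sketch is illustrative only: the set point, dosing step, and acid-formation rate are hypothetical, standing in for the real alkali pump and high-temperature in-line probe.

```python
import random

def pump_on(measured_ph, set_point, deadband=0.02):
    """Run the alkali dosing pump only when pH has fallen below the set point."""
    return measured_ph < set_point - deadband  # deadband avoids pump chatter

# Toy simulation: acids released by the wood pull the pH down each time step;
# each alkali dose pushes it back up towards the set point.
ph, set_point = 4.6, 4.0
for step in range(200):
    ph -= random.uniform(0.0, 0.01)   # acid formation lowers pH
    if pump_on(ph, set_point):
        ph += 0.05                    # one small dose of alkali
print(f"final pH = {ph:.2f} (held near set point {set_point})")
```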
Abstract:
This essay proposes that the ecologic association shown between the 20th-century coronary heart disease epidemic and the 1918 influenza pandemic could shed light on the mechanism behind the high lethality of the latter. It suggests that an autoimmune interference at the apoB-LDL interface could explain both hypercholesterolemia and inflammation (through interference with the cellular metabolism of arachidonic acid). Autoimmune inflammation would then explain both the acute coronary events of the 1950s-60s (coronary thrombosis upon influenza re-infection) and the respiratory failure seen among young adults in 1918. The hypothesis also argues that the lethality of the 1918 pandemic may not have depended so much on the 1918 virus itself as on an immune vulnerability to it, possibly resulting from an earlier priming of the cohorts born around 1890 by the 1890 influenza pandemic virus.
Abstract:
The brain is a complex system that produces emergent properties, such as those associated with activity-dependent plasticity in learning and memory. Understanding the integrated structures and functions of the brain is therefore well beyond the scope of either superficial or extremely reductionistic approaches. Although a combination of zoom-in and zoom-out strategies is desirable when studying the brain, constructing the appropriate interfaces to connect all levels of analysis is one of the most difficult challenges of contemporary neuroscience. Is it possible to build appropriate models of brain function and dysfunction with computational tools? Among the best-known brain dysfunctions, the epilepsies are neurological syndromes that involve a variety of networks, from widespread anatomical brain circuits to local molecular environments. One logical question is: are those complex brain networks always producing maladaptive emergent properties compatible with epileptogenic substrates? The present review deals with this question and tries to answer it by illustrating several points from the literature and from our laboratory data, with examples at the behavioral, electrophysiological, cellular, and molecular levels. We conclude that, because the brain is a complex system capable of producing emergent properties, including plasticity, its functions should be approached from an integrated view. Concepts such as brain networks, graph theory, neuroinformatics, and e-neuroscience are discussed as new transdisciplinary approaches for dealing with the continuous growth of information about brain physiology and its dysfunctions. The epilepsies are discussed as neurobiological models of complex systems displaying maladaptive plasticity.
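As a toy illustration of the graph-theory approaches mentioned above (not drawn from the review itself), the following Python sketch computes two standard network measures on a Watts-Strogatz small-world graph, a topology often used as a first model of brain connectivity: high local clustering combined with short average path lengths.

```python
import networkx as nx

# 100 nodes, each linked to 6 lattice neighbours, with 10% of edges rewired;
# the "connected" variant retries until the graph has a single component.
G = nx.connected_watts_strogatz_graph(n=100, k=6, p=0.1)
print("average clustering:        ", nx.average_clustering(G))
print("average shortest path len: ", nx.average_shortest_path_length(G))
```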
Abstract:
Classical Pavlovian fear conditioning to painful stimuli has provided the generally accepted view of a core system, centered in the central amygdala, that organizes fear responses. Ethologically based models using other sources of threat likely to be encountered in a natural environment, such as predators or aggressive dominant conspecifics, have challenged this concept of a unitary core circuit for fear processing. We discuss here what the ethologically based models have told us about the neural systems organizing fear responses. We explore the concept that parallel paths process different classes of threats and that these different paths influence distinct regions of the periaqueductal gray, a critical element for the organization of all kinds of fear responses. Despite this parallel processing of different kinds of threats, we also discuss an interesting emerging view that common cortical-hippocampal-amygdalar paths seem to be engaged in fear conditioning to painful stimuli, to predators and, perhaps, to aggressive dominant conspecifics as well. Overall, the aim of this review is to bring into focus a more global and comprehensive view of the systems organizing fear responses.
Abstract:
In recent years, technological advancements in microelectronics and sensor technologies have revolutionized the field of electrical engineering. New manufacturing techniques have enabled a higher level of integration, combining sensors and electronics into compact and inexpensive systems. Previously, the challenge in measurements was to understand the operation of the electronics and sensors, but this has now changed: the challenge in measurement instrumentation lies in mastering the whole system, not just the electronics. To address this issue, this doctoral dissertation studies whether it is beneficial to consider a measurement system as a whole, from the physical phenomenon to the digital recording device, where each part of the measurement system affects the system performance, rather than as a collection of small independent parts, such as a sensor or an amplifier, that could be designed separately. The objective of this doctoral dissertation is to describe in depth the development of such a measurement system, taking into account the challenges posed by the electrical and mechanical requirements and by the measurement environment. The work is done as an empirical case study of two example applications, both intended for scientific studies: a light-sensitive biological sensor used in imaging and a gas electron multiplier detector for particle physics. The study showed that in these two cases a number of different parts of the measurement system interacted with each other. Without considering these interactions, the reliability of the measurement may be compromised, which may lead to wrong conclusions being drawn from the measurement. For this reason it is beneficial to conceptualize the measurement system as a whole, from the physical phenomenon to the digital recording device, where each part of the measurement system affects the system performance. The results serve as examples of how a measurement system can be successfully constructed to support a study of sensors and electronics.
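To illustrate the system-level view the dissertation argues for, the following Python sketch (all numbers are hypothetical, not from the case studies) refers the noise of each stage of a simple measurement chain (sensor, amplifier, ADC) to a single point and combines them: no single stage determines the achievable signal-to-noise ratio on its own, which is why designing the parts in isolation can mislead.

```python
import math

signal_rms = 1e-3        # V RMS, signal at the sensor output (hypothetical)
sensor_noise = 2e-6      # V RMS, noise at the sensor output
amp_gain = 100.0         # amplifier voltage gain
amp_noise_in = 5e-6      # V RMS, amplifier noise referred to its input
adc_lsb = 5.0 / 2**12    # V, one LSB of a 12-bit ADC over a 5 V range
adc_noise = adc_lsb / math.sqrt(12)   # RMS quantization noise of an ideal ADC

# Refer every noise source to the sensor output and sum in quadrature,
# assuming the sources are uncorrelated. The ADC noise is divided by the
# amplifier gain, so the gain choice and the converter interact.
total_noise_in = math.sqrt(sensor_noise**2
                           + amp_noise_in**2
                           + (adc_noise / amp_gain)**2)
snr_db = 20 * math.log10(signal_rms / total_noise_in)
print(f"input-referred noise: {total_noise_in*1e6:.2f} uV RMS, SNR: {snr_db:.1f} dB")
```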
Abstract:
INTRODUCTION: Patients with chronic kidney disease (CKD) show a synergism between traditional risk factors for atherosclerosis and emerging factors derived from the uremic state. OBJECTIVE: To outline the epidemiological profile of a group of CKD patients undergoing cardiological evaluation. METHODS: Symptomatic patients - those with ischemia on myocardial scintigraphy and/or systolic dysfunction on Doppler echocardiography - those older than 50 years with diabetes mellitus (DM) as the cause of CKD, and those with two or more atherosclerotic risk factors underwent coronary angiography. Asymptomatic patients - non-diabetic and without risk factors - were investigated with Doppler echocardiography, and those with a single risk factor with Doppler echocardiography and scintigraphy. RESULTS: Forty-six patients were studied, 58.7% men, aged 50-70 ± 11.7 years, 91.3% on dialysis. Time on hemodialysis: 61.96 ± 55.1 months. Arterial hypertension was the cause of CKD in 56.5%. Of the 28 patients (60.9%) who underwent coronary angiography, 53.6% had coronary artery disease (CAD). The patients were divided into three groups: with CAD (A), without CAD (B), and not submitted to coronary angiography (C). A significant difference was found between Groups B and C in the frequency of an abnormal ankle-brachial index (ABI) (p = 0.026), with no abnormal ABI in Group C, and in mean age, which was higher in Group B (p = 0.045). In Group A, 53.3% of the patients were under pre-parathyroidectomy (PTX) evaluation. CONCLUSION: This study confirmed the high frequency of cardiovascular alterations, including CAD, in patients with CKD, especially those on dialysis.
Abstract:
Human beings have always strived to preserve their memories and spread their ideas. In the beginning this was done through human interpretation, such as telling stories and creating sculptures. Later, technological progress made it possible to create a recording of a phenomenon: first as an analogue recording onto a physical object, and later digitally, as a sequence of bits to be interpreted by a computer. By the end of the 20th century, technological advances had made it feasible to distribute media content over a computer network instead of on physical objects, thus enabling the concept of digital media distribution. Many digital media distribution systems already exist, and their continued, and in many cases increasing, usage indicates high interest in their future enhancement and enrichment. By looking at these systems, we have identified three main areas of possible improvement: network structure and coordination, transport of content over the network, and the encoding used for the content. In this thesis, our aim is to show that improvements in performance, efficiency, and availability can be made in conjunction with improvements in software quality and reliability through the use of formal methods: mathematical approaches to reasoning about software that let us prove its correctness, together with other desirable properties. We envision a complete media distribution system based on a distributed architecture, such as peer-to-peer networking, in which the different parts of the system have been formally modelled and verified. Starting with the network itself, we show how it can be formally constructed and modularised in the Event-B formalism, such that the modelling of one node can be separated from the modelling of the network itself. We also show how the piece-selection algorithm in the BitTorrent peer-to-peer transfer protocol can be adapted for on-demand media streaming, and how this can be modelled in Event-B. Furthermore, we show how modelling one peer in Event-B can give results similar to simulating an entire network of peers. Going further, we introduce a formal specification language for content-transfer algorithms and show that having such a language can make these algorithms easier to understand. We also show how generating Event-B code from this language can result in less complexity than creating the models from written specifications. We also consider the decoding part of a media distribution system by showing how video decoding can be done in parallel, based on formally defined dependencies between frames and blocks in a video sequence; we have shown that this step, too, can be performed in a way that is mathematically proven correct. Our modelling and proving in this thesis is, for the most part, tool-based. This demonstrates the maturity and reliability of formal methods tools, and thus advocates their more widespread use in the future.
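As an informal illustration of the kind of adaptation described above (the thesis models this in Event-B; the sketch below is a hypothetical Python rendering, not the thesis's model), a BitTorrent-style piece selector can be biased towards a playback window: pieces close to the play position are fetched in deadline order, while pieces outside the window are fetched rarest-first so that rare pieces keep circulating in the swarm.

```python
def select_piece(missing, availability, play_pos, window=8):
    """Pick the next piece to request from peers.

    missing      -- set of piece indices this peer still needs
    availability -- dict: piece index -> number of peers holding it
    play_pos     -- index of the piece currently being played back
    """
    urgent = [p for p in missing if play_pos <= p < play_pos + window]
    if urgent:
        return min(urgent)  # inside the playback window: earliest deadline first
    # outside the window: classic rarest-first keeps rare pieces circulating
    return min(missing, key=lambda p: availability.get(p, 0))

# Piece 3 is chosen although piece 40 is rarer, because 3 is needed soon.
print(select_piece({3, 7, 40}, {3: 5, 7: 2, 40: 1}, play_pos=2))  # -> 3
```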
Abstract:
The main purpose of this study was to describe and evaluate nursing students' learning about empowering discourse in patient education. In Phase 1, the purpose was to describe an empowering discourse between a nurse and a patient. In Phase 2, the purpose was first to create a computer simulation program of empowering discourse based on that description, and second to evaluate nursing students' learning of how to conduct an empowering discourse using the computer simulation program. The ultimate goal was to strengthen the knowledge base on empowering discourse and to develop nursing students' knowledge of how to conduct an empowering discourse for the development of patient education. In Phase 1, empowering discourse was described using a systematic literature review with a metasummary technique (n=15), with data collected covering the period from January 1995 to October 2005. In Phase 2, the computer simulation program of empowering discourse was created, based on the description, in 2006-2007. A descriptive comparative design was used to evaluate the students' (n=69) process of learning empowering discourse using the computer simulation program, and a pretest-posttest design without a control group was used to evaluate the students' (n=43) learning outcomes. Data were collected in 2007. Empowering discourse proved to be a structured process, and it could be simulated and learned with the computer simulation program. According to the students' prior knowledge, however, empowering discourse was an unstructured process. The process of learning empowering discourse using the computer simulation program was controlled by the students, and it changed their knowledge. The outcomes of learning appeared as changes in the students' knowledge towards a form that was both more holistic and better organized, or only one of the two. The study strengthened the knowledge base on empowering discourse and made the students more knowledgeable about empowering discourse.