Abstract:
Hermite interpolation is increasingly proving to be a powerful numerical tool for several kinds of second-order boundary value problems. In this work we present two Hermite finite element methods for solving viscous incompressible flow problems, in both two and three space dimensions. In the two-dimensional case we use the Zienkiewicz triangle to represent the velocity field, and in the three-dimensional case an extension of this element to tetrahedra, still called a Zienkiewicz element. Taking the Stokes system as a model, the pressure is approximated with continuous functions, either piecewise linear or piecewise quadratic, according to the version of the Zienkiewicz element in use, that is, with either incomplete or complete cubics. The methods employ either the standard Galerkin formulation or the Petrov–Galerkin formulation first proposed in Hughes et al. (1986) [18], based on the addition of a balance-of-force term. A priori error analyses point to optimal convergence rates for the Petrov–Galerkin approach, and for the Galerkin formulation too, at least in some particular cases. In terms of both accuracy and the global number of degrees of freedom, the new methods are shown to have a favorable cost-benefit ratio compared with Lagrange finite elements of the same order for the velocity, especially if the Galerkin approach is employed.
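For reference, the Stokes system taken as the model problem reads, in a standard strong form (a sketch; the thesis's exact setting and boundary conditions may differ):

\[
-\nu\,\Delta \mathbf{u} + \nabla p = \mathbf{f}, \qquad \nabla \cdot \mathbf{u} = 0 \quad \text{in } \Omega, \qquad \mathbf{u} = \mathbf{0} \quad \text{on } \partial\Omega,
\]

where u is the velocity, p the pressure, ν the viscosity and f the body force; the velocity is approximated with the Hermite (Zienkiewicz) elements and the pressure with continuous piecewise linear or quadratic functions, as described above.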
Abstract:
The purpose of this thesis is to analyse interactions between freshwater flows, terrestrial ecosystems and human well-being. Freshwater management and policy have mainly focused on the liquid-water part of the hydrological cycle (surface and groundwater runoff), including aquatic ecosystems. Although that part is of great significance, this thesis shows that such a focus will not be sufficient for coping with freshwater-related social-ecological vulnerability. The thesis illustrates that the terrestrial component of the hydrological cycle, reflected in vapour flows (evapotranspiration), serves multiple functions in the human life-support system. A broader understanding of the interactions between terrestrial systems and freshwater flows is particularly important in light of the present widespread land-cover change in terrestrial ecosystems. The water vapour flows from continental ecosystems were quantified at a global scale in Paper I of the thesis. It was estimated that sustaining the majority of the global terrestrial ecosystem services on which humanity depends requires an annual water vapour flow of 63,000 km3/yr, including 6,800 km3/yr for crop production. In comparison, annual human withdrawals of liquid water amount to roughly 4,000 km3/yr. A potential conflict between freshwater for future food production and freshwater for terrestrial ecosystem services was identified. Human redistribution of water vapour flows as a consequence of long-term land-cover change was addressed at both the continental (Australia, Paper II) and global scales (Paper III). It was estimated that the annual vapour flow in Australia has decreased by 10% during the last 200 years, owing to the clearing of woody vegetation for agricultural production. The reduction in vapour flows has caused severe salinity problems in soils and rivers. The human-induced alteration of vapour flows was estimated at more than 15 times the volume of the human-induced change in liquid water flows (Paper II).
Abstract:
Modern food production is a complex, globalized system in which what we eat and how it is produced are increasingly disconnected. This thesis examines some of the ways in which global trade has changed the mix of inputs to food and feed, and how this affects food security and our perceptions of sustainability. One useful indicator of the ecological impact of trade in food and feed products is Appropriated Ecosystem Areas (ArEAs), which estimate the terrestrial and aquatic areas needed to produce all the inputs to particular products. The method is introduced in Paper I and used to calculate and track changes in imported subsidies to Swedish agriculture over the period 1962–1994. In 1994, Swedish consumers needed agricultural areas outside their national borders to satisfy more than a third of their food consumption needs. The method is then applied to Swedish meat production in Paper II to show that the term “Made in Sweden” is often a misnomer. In 1999, almost 80% of the manufactured feed for Swedish pigs, cattle and chickens depended on imported inputs, mainly from Europe, Southeast Asia and South America. Paper III examines ecosystem subsidies to intensive aquaculture in two nations: shrimp production in Thailand and salmon production in Norway. In both countries, aquaculture was shown to rely increasingly on imported subsidies. The rapid expansion of aquaculture turned these countries from net exporters of fishmeal into net importers, increasingly drawing on inputs from the Southeastern Pacific Ocean. As the examined agricultural and aquacultural production systems became globalized, the level of dependence on other nations’ ecosystems, the number of external supply sources, and the distance to these sources steadily increased. Dependence on other nations is not problematic in itself, as long as we are able to acknowledge these links and sustainably manage resources both at home and abroad. However, ecosystem subsidies are seldom recognized or made explicit in national policy or economic accounts. Economic systems are generally not designed to receive feedbacks when the status of remote ecosystems changes, much less to respond in an ecologically sensitive manner. Papers IV and V discuss the problem of “masking” the true environmental costs of production for trade. One of our conclusions is that, while the ArEAs approach is a useful tool for illuminating environmentally based subsidies in the policy arena, it does not reflect all of the costs. Current agricultural and aquacultural production methods have generated substantial increases in production levels, but if policy continues to support a focus on yield and production increases alone, taking the work of ecosystems for granted, vulnerability can result. A challenge, then, is to develop a set of complementary tools for economic accounting at national and international scales that address ecosystem support and performance. We conclude that future resilience in food production systems will require more explicit links between consumers and the work of supporting ecosystems, locally and in other regions of the world, and that food security planning will require active management of the capacity of all the ecosystems involved to sustain food production.
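As a rough sketch of the ArEAs idea (not the authors' actual implementation; the input list, yields and all numbers below are invented for illustration), the area appropriated by a product can be approximated as the sum, over its inputs, of input quantity divided by areal yield.

# Hedged sketch of an ArEAs-style calculation: the ecosystem area
# appropriated by a product is approximated as the sum over inputs of
# (input quantity) / (areal yield of that input). All names and numbers
# are hypothetical placeholders, not values from the thesis.

def appropriated_area(inputs_kg, yields_kg_per_ha_yr):
    """Return hectare-years appropriated, given input masses and areal yields."""
    return sum(kg / yields_kg_per_ha_yr[name] for name, kg in inputs_kg.items())

# Example: feed inputs to one tonne of (hypothetical) pork.
feed = {"barley": 1800.0, "soymeal": 600.0}          # kg of each input
yields_ = {"barley": 4000.0, "soymeal": 2500.0}      # kg per hectare per year
print(f"{appropriated_area(feed, yields_):.2f} hectare-years appropriated")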
Abstract:
The diapycnal diffusivity of the ocean is one of the least well known parameters in current climate models. Its importance lies in the fact that it is one of the main factors controlling heat transport into the deeper layers of the ocean. Measurements of this diffusivity are sparse and insufficient for compiling a global map of its values. A broad review of the literature up to 2009 found that the climate system is extremely sensitive to the diapycnal diffusivity: the response of the South Pacific Ocean scales with the 0.63 power of the diffusivity coefficient kv, whereas that of the Atlantic Ocean scales with the 0.44 power, so the South Pacific is the more sensitive of the two. This highlights the need to clarify mixing schemes, closure schemes and their parameterisations in Global Circulation Models (GCMs) and Earth system Models of Intermediate Complexity (EMICs), within the context of possible climate change and global warming due to increased greenhouse gas emissions. The main objective of this work is therefore to understand the sensitivity of the climate system to the ocean's diapycnal diffusivity through GCMs and EMICs. This requires analysing the possible diapycnal mixing schemes with the ultimate goal of finding the optimal model for predicting the evolution of the climate system, studying all the variables that influence it, and simulating it correctly over long periods of time.
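Written as power laws (with Q the scaled response, e.g. overturning or deep heat uptake; the abstract does not pin down the exact diagnostic), the quoted sensitivities are

\[
Q_{\mathrm{South\ Pacific}} \propto k_v^{0.63}, \qquad Q_{\mathrm{Atlantic}} \propto k_v^{0.44},
\]

so a given fractional change in kv produces a larger fractional response in the South Pacific than in the Atlantic.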
Abstract:
[EN] Marine N2-fixing microorganisms, termed diazotrophs, are a key functional group in marine pelagic ecosystems. The biological fixation of dinitrogen (N2) to bioavailable nitrogen provides an important new source of nitrogen for pelagic marine ecosystems and influences primary productivity and organic matter export to the deep ocean. As one of a series of efforts to collect biomass and rates specific to different phytoplankton functional groups, we have constructed a database on diazotrophic organisms in the global pelagic upper ocean by compiling about 12,000 direct field measurements of cyanobacterial diazotroph abundances (based on microscopic cell counts or qPCR assays targeting the nifH genes) and N2 fixation rates. Biomass conversion factors are estimated based on cell sizes to convert abundance data to diazotrophic biomass. The database is limited spatially, lacking large regions of the ocean, especially the Indian Ocean. The data are approximately log-normally distributed, and large variances exist in most sub-databases, with non-zero values differing by 5 to 8 orders of magnitude. A lower mean N2 fixation rate was found in the North Atlantic Ocean than in the Pacific Ocean. Reporting the geometric mean and the range of one geometric standard error below and above it, the pelagic N2 fixation rate in the global ocean is estimated to be 62 (53–73) Tg N yr−1, and the pelagic diazotrophic biomass in the global ocean is estimated to be 4.7 (2.3–9.6) Tg C from cell counts and 89 (40–200) Tg C from nifH-based abundances. Uncertainties in the biomass conversion factors can change the estimated geometric mean pelagic diazotrophic biomass in the global ocean by about ±70%. This evolving database can be used to study spatial and temporal distributions and variations of marine N2 fixation, to validate geochemical estimates and to parameterize and validate biogeochemical models. The database is stored in PANGAEA (http://doi.pangaea.de/10.1594/PANGAEA.774851).
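The reporting convention used above (geometric mean, with the range one geometric standard error below and above it) can be reproduced in a few lines; this is a generic sketch on hypothetical numbers, not the database's own processing code.

import numpy as np

def geo_mean_range(x):
    """Geometric mean and the range one geometric standard error below
    and above it, for strictly positive data, computed on the log scale."""
    logs = np.log(np.asarray(x, dtype=float))
    gm = np.exp(logs.mean())
    gse = np.exp(logs.std(ddof=1) / np.sqrt(len(logs)))  # geometric SE
    return gm, gm / gse, gm * gse

rates = [12.0, 55.0, 140.0, 8.0, 300.0]  # hypothetical N2 fixation rates
gm, lo, hi = geo_mean_range(rates)
print(f"{gm:.1f} ({lo:.1f}-{hi:.1f})")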
Abstract:
Master's degree in Oceanography
Abstract:
The research activity carried out during the PhD course in Electrical Engineering belongs to the branch of electric and electronic measurements. The main subject of the present thesis is a distributed measurement system to be installed in medium-voltage power networks, together with the methods developed to analyze the data acquired by the measurement system itself and to monitor power quality. Chapter 2 illustrates the increasing interest in power quality in electrical systems, reporting the international research activity on the problem and the relevant standards and guidelines issued. The quality of the voltage provided by utilities and influenced by customers at the various points of a network has emerged as an issue only in recent years, in particular as a consequence of the liberalization of the energy market. Traditionally, the quality of the delivered energy has been associated mostly with its continuity, so reliability was the main characteristic to be ensured for power systems. Nowadays, the number and duration of interruptions are the "quality indicators" commonly perceived by most customers; for this reason, a short section is also dedicated to network reliability and its regulation. In this context it should be noted that although the measurement system developed during the research activity belongs to the field of power quality evaluation systems, the information registered in real time by its remote stations can be used to improve system reliability as well. Given the vast range of power-quality-degrading phenomena that can occur in distribution networks, the study has been focused on electromagnetic transients affecting line voltages. The outcome of the study has been the design and realization of a distributed measurement system which continuously monitors the phase signals at different points of a network, detects the occurrence of transients superposed on the fundamental steady-state component, and registers the time of occurrence of such events. The data set is finally used to locate the source of the transient disturbance propagating along the network lines. Most of the oscillatory transients affecting line voltages are due to faults occurring at any point of the distribution system, and they must be captured before the protection equipment intervenes. An important conclusion is that the method can improve the reliability of the monitored network, since knowing the location of a fault allows the energy manager to reduce as much as possible both the area of the network to be disconnected for protection purposes and the time spent by technical staff to recover from the abnormal condition and/or the damage. The part of the thesis presenting the results of this study and activity is structured as follows: Chapter 3 deals with the propagation of electromagnetic transients in power systems, defining the characteristics and causes of the phenomena and briefly reporting the theory and approaches used to study transient propagation. The state of the art concerning methods to detect and locate faults in distribution networks is then presented. Finally, attention is paid to the particular technique adopted for this purpose in the thesis, and to the methods developed on the basis of that approach. Chapter 4 reports the configuration of the distribution networks on which the fault location method has been applied by means of simulations, as well as the results obtained case by case. In this way the performance of the location procedure is tested first under ideal and then under realistic operating conditions. Chapter 5 presents the measurement system designed to implement the transient detection and fault location method. The hardware belonging to the measurement chain of every acquisition channel in the remote stations is described. The global measurement system is then characterized by considering the non-ideal aspects of each device that can contribute to the final combined uncertainty on the estimated position of the fault in the network under test. Finally, this parameter is computed according to the Guide to the Expression of Uncertainty in Measurement, by means of a numerical procedure. The last chapter describes a device designed and realized during the PhD activity with the aim of replacing the commercial capacitive voltage divider in the conditioning block of the measurement chain. This study was carried out with the aim of providing an alternative to the transducer in use that could offer equivalent performance at lower cost. In this way, the economic impact of the investment associated with the whole measurement system would be significantly reduced, making the method much more feasible to apply.
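As a hedged illustration of how registered arrival times can locate a disturbance (a textbook double-ended scheme consistent with, but not necessarily identical to, the method developed in the thesis; all numbers are placeholders): with two time-synchronized stations at the ends of a line of length L and a propagation speed v, the fault distance from station 1 follows from the arrival-time difference.

def locate_fault(L, v, t1, t2):
    """Distance (km) of the fault from station 1 on a line of length L km,
    given wave speed v (km/s) and transient arrival times t1, t2 (s)
    registered at the two line ends: d = (L + v*(t1 - t2)) / 2."""
    return 0.5 * (L + v * (t1 - t2))

# Placeholder example: 30 km line, ~0.98c propagation, 20 us arrival skew.
d = locate_fault(30.0, 2.94e5, 1.000010, 1.000030)
print(f"fault at {d:.2f} km from station 1")

The synchronization and transducer uncertainties characterized in Chapter 5 propagate directly into d, which is why the combined uncertainty on the estimated position is evaluated per the GUM.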
Abstract:
Providing support for multimedia applications on low-power mobile devices remains a significant research challenge. This is primarily due to two reasons:
• Portable mobile devices have modest sizes and weights, and therefore inadequate resources: low CPU processing power, reduced display capabilities, and limited memory and battery lifetimes compared to desktop and laptop systems.
• Multimedia applications, on the other hand, tend to have distinctive QoS and processing requirements which make them extremely resource-demanding.
This innate conflict introduces key research challenges in the design of multimedia applications and device-level power optimization. Energy efficiency on this kind of platform can be achieved only via a synergistic hardware and software approach. In fact, while Systems-on-Chip are more and more programmable, thus providing functional flexibility, hardware-only power reduction techniques cannot keep consumption within acceptable bounds. It is well understood both in research and industry that system configuration and management cannot be controlled efficiently relying only on low-level firmware and hardware drivers: at this level there is a lack of information about user application activity, and consequently about the impact of power management decisions on QoS. Even though operating system support and integration is a requirement for effective performance and energy management, more effective and QoS-sensitive power management is possible if power awareness and hardware configuration control strategies are tightly integrated with domain-specific middleware services. The main objective of this PhD research has been the exploration and integration of a middleware-centric energy management with applications and the operating system. We chose to focus on the CPU-memory and video subsystems, since they are the most power-hungry components of an embedded system. A second main objective has been the definition and implementation of software facilities (such as toolkits, APIs, and run-time engines) to improve the programmability and performance efficiency of such platforms.
Enhancing energy efficiency and programmability of modern Multi-Processor Systems-on-Chip (MPSoCs). Consumer applications are characterized by tight time-to-market constraints and extreme cost sensitivity. The software that runs on modern embedded systems must be high performance, real time and, even more important, low power. Although much progress has been made on these problems, much remains to be done. Multi-Processor Systems-on-Chip (MPSoCs) are increasingly popular platforms for high-performance embedded applications. This leads to interesting challenges in software development, since efficient software development is a major issue for MPSoC designers. An important step in deploying applications on multiprocessors is to allocate and schedule concurrent tasks on the processing and communication resources of the platform. The problem of allocating and scheduling precedence-constrained tasks on the processors of a distributed real-time system is NP-hard. There is a clear need for deployment technology that addresses these multiprocessing issues. This problem can be tackled by means of specific middleware which takes care of allocating and scheduling tasks on the different processing elements and which also tries to optimize the power consumption of the entire multiprocessor platform.
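As a toy illustration of what solving the allocation problem to optimality means (all task names and energy costs below are invented, and the thesis's models also handle scheduling, communication and many further non-idealities), a complete search enumerates every task-to-processor mapping and keeps the cheapest, so, unlike a heuristic, it leaves no optimality gap.

from itertools import product

tasks = ["t0", "t1", "t2"]                      # hypothetical task set
procs = ["cpu0", "cpu1"]                        # hypothetical processors
energy = {("t0", "cpu0"): 3, ("t0", "cpu1"): 5, # invented energy costs
          ("t1", "cpu0"): 4, ("t1", "cpu1"): 2,
          ("t2", "cpu0"): 6, ("t2", "cpu1"): 4}

# Complete search: enumerate every mapping and keep the minimum-energy one.
# This returns the true optimum, at the price of exponential growth in the
# number of tasks, which is why decomposition techniques matter.
best = min(product(procs, repeat=len(tasks)),
           key=lambda m: sum(energy[(t, p)] for t, p in zip(tasks, m)))
print(dict(zip(tasks, best)))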
This dissertation is an attempt to develop insight into efficient, flexible and optimal methods for allocating and scheduling concurrent applications on multiprocessor architectures. This is a well-known problem in the literature: optimization problems of this kind are very complex even in much simplified variants, so most authors propose simplified models and heuristic approaches to solve them in reasonable time. Model simplification is often achieved by abstracting away platform implementation "details". As a result, optimization problems become more tractable, even reaching polynomial time complexity. Unfortunately, this approach creates an abstraction gap between the optimization model and the real HW-SW platform. The main issue with heuristics or, more generally, with incomplete search is that they introduce an optimality gap of unknown size: they provide very limited or no information on the distance between the best computed solution and the optimal one. The goal of this work is to address both the abstraction and the optimality gap, formulating accurate models which account for a number of "non-idealities" of real-life hardware platforms, developing novel mapping algorithms that deterministically find optimal solutions, and implementing the software infrastructures required by developers to deploy applications on the target MPSoC platforms.
Energy-efficient LCD backlight autoregulation on a real-life multimedia application processor. Despite the ever increasing advances in Liquid Crystal Display (LCD) technology, LCD power consumption is still one of the major limitations to the battery life of mobile appliances such as smartphones, portable media players, and gaming and navigation devices. There is a clear trend towards increasing LCD sizes to exploit the multimedia capabilities of portable devices that can receive and render high-definition video and pictures. Multimedia applications running on these devices require LCD screen sizes of 2.2 to 3.5 inches and more to display video sequences and pictures with the required quality. LCD power consumption depends on the backlight and on the pixel matrix driving circuits, and is typically proportional to the panel area. As a result, its contribution is also likely to be considerable in future mobile appliances. To address this issue, companies are proposing low-power technologies suitable for mobile applications, supporting low-power states and image control techniques. On the research side, several power saving schemes and algorithms can be found in the literature. Some of them exploit software-only techniques that change the image content to reduce the power associated with crystal polarization; others aim at decreasing the backlight level while offsetting the luminance reduction, compensating the user-perceived quality degradation with pixel-by-pixel image processing algorithms. The major limitation of these techniques is that they rely on the CPU to perform pixel-based manipulations, and their impact on CPU utilization and power consumption has not been assessed. This PhD dissertation shows an alternative approach that exploits, in a smart and efficient way, the hardware image processing unit integrated in almost every current multimedia application processor to implement a hardware-assisted image compensation that allows dynamic scaling of the backlight with a negligible impact on QoS.
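The compensation idea can be sketched with a generic luminance model (an assumption for illustration, not the hardware pipeline developed in the thesis): dimming the backlight to a fraction b of full power while scaling pixel values by 1/b keeps perceived brightness roughly constant, with clipping of saturated pixels as the residual QoS cost.

import numpy as np

def compensate(frame, b):
    """Scale 8-bit pixel values by 1/b to offset a backlight dimmed to the
    fraction b of full brightness; values that would exceed 255 clip, which
    is the quality degradation the compensation must keep negligible."""
    out = np.clip(frame.astype(np.float32) / b, 0.0, 255.0)
    return out.astype(np.uint8)

frame = np.random.randint(0, 200, (4, 4), dtype=np.uint8)  # toy image
boosted = compensate(frame, b=0.7)  # backlight at 70%; pixels boosted ~1.43x

Performing this scaling in the application processor's image processing unit, rather than on the CPU, is what keeps the overhead negligible in the approach described above.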
The proposed approach overcomes CPU-intensive techniques by saving system power without requiring either a dedicated display technology or hardware modifications.
Thesis overview. The remainder of the thesis is organized as follows. The first part is focused on enhancing the energy efficiency and programmability of modern Multi-Processor Systems-on-Chip (MPSoCs). Chapter 2 gives an overview of architectural trends in embedded systems, illustrating the principal features of new technologies and the key challenges still open. Chapter 3 presents a QoS-driven methodology for optimal allocation and frequency selection for MPSoCs; the methodology is based on functional simulation and full-system power estimation. Chapter 4 targets the allocation and scheduling of pipelined stream-oriented applications on top of distributed-memory architectures with messaging support. We tackled the complexity of the problem by means of decomposition and no-good generation, and prove the increased computational efficiency of this approach with respect to traditional ones. Chapter 5 presents a cooperative framework to solve the allocation, scheduling and voltage/frequency selection problem to optimality for energy-efficient MPSoCs, while Chapter 6 takes applications with conditional task graphs into account. Finally, Chapter 7 proposes a complete framework, called Cellflow, to help programmers with efficient software implementation on a real architecture, the Cell Broadband Engine processor. The second part is focused on energy-efficient software techniques for LCD displays. Chapter 8 gives an overview of portable-device display technologies, illustrating the principal features of LCD video systems and the key challenges still open. Chapter 9 reviews several energy-efficient software techniques from the literature, while Chapter 10 illustrates in detail our method for saving significant power in an LCD panel. Finally, conclusions are drawn, reporting the main research contributions discussed throughout this dissertation.
Abstract:
[EN] Respiration by zooplanktonic organisms accounts for a significant fraction of the global carbon cycle. However, estimating it in order to obtain the data required in oceanography is still a problem. In this work, we studied respiration rates in laboratory and field experiments. Laboratory experiments using Daphnia spp. showed a significant decrease in respiration rates during starvation. In addition, we measured gut fluorescence and enzymatic activity (electron transfer system, ETS). The former did not show the expected decrease, probably owing to the volume of the incubators. The relationship between respiration and ETS showed the classical variability, ranging between 0.5 and 1, as observed in previous works. Copepod respiration rates were also measured during the RAPROCAN 1504 cruise around the Canary Islands.
Abstract:
In cases of severe osteoarthritis of the knee causing pain, deformity, and loss of stability and mobility, clinicians consider the substitution of the articular surfaces by means of joint prostheses. The objectives pursued by this surgery are complete pain elimination, restoration of normal physiological mobility and joint stability, and correction of all deformities and, thus, of limping. Knee surgical navigation systems have been developed in computer-aided surgery in order to improve the final surgical outcome in total knee arthroplasty. These systems provide the surgeon with quantitative and real-time information about each surgical action, like bone cut execution and prosthesis component alignment, by means of tracking tools rigidly fixed onto the femur and the tibia. Nevertheless, there is still a margin of error, due both to incorrect surgical procedures and to the still limited amount of kinematic information provided by current systems. In particular, patello-femoral joint kinematics is not considered in knee surgical navigation. It is also unclear, and thus a source of misunderstanding, what the most appropriate methodology is for studying patellar motion. In addition, the knee ligamentous apparatus is treated only superficially in navigated total knee arthroplasty, without taking into account how its physiological behavior is altered by this surgery. The aim of the present research work was to provide new functional and biomechanical assessments for the improvement of surgical navigation systems for joint replacement in the human lower limb. This was mainly realized through the identification and development of new techniques that allow a thorough comprehension of the functioning of the knee joint, with particular attention to the patello-femoral joint and to the main knee soft tissues. A knee surgical navigation system with active markers was used in all the research activities presented in this work. In particular, preliminary tests were performed in order to assess the system accuracy and the robustness of a number of navigation procedures. Four studies were performed in vivo on patients requiring total knee arthroplasty, randomly implanted by means of traditional or navigated procedures, in order to check the real efficacy of the latter with respect to the former. In order to assess patello-femoral joint kinematics in intact and replaced knees, twenty in-vitro tests were performed using a prototype tracking tool for the patella as well. In addition to standard anatomical and articular recommendations, original proposals for defining a patellar anatomy-based reference frame and for studying patello-femoral joint kinematics were reported and used in these tests. These definitions were applied in two further in-vitro tests in which, for the first time, the implant of the patellar component insert was also fully navigated. In addition, an original technique to analyze the main knee soft tissues by means of anatomically based fiber mappings was also reported and used in the same tests. The preliminary instrumental tests revealed a system accuracy within the millimeter and a good inter- and intra-observer repeatability in defining all the anatomical reference frames. In the in-vivo studies, the general alignment of the femoral and tibial prosthesis components and of the lower limb mechanical axis, as measured on radiographs, was more satisfactory, i.e. within ±3°, in those patients in whom total knee arthroplasty was performed with navigated procedures. As for the in-vitro tests, consistent patello-femoral joint kinematic patterns were observed across specimens throughout the knee flexion arc. Generally, the physiological patellar motion of the intact knee was not restored after the implant. This restoration was successfully achieved in the two further tests in which all component implants, including the patellar insert, were fully navigated, i.e. with intra-operative assessment of patellar component positioning as well, together with general tibio-femoral and patello-femoral joint assessment. The tests assessing the behavior of the main knee ligaments revealed their complexity and the different functional roles played by the several sub-bundles composing each ligament. In this case too, total knee arthroplasty altered the physiological behavior of these knee soft tissues. These results demonstrate in vitro the relevance and feasibility of applying new techniques for accurate knee soft tissue monitoring, patellar tracking assessment and navigated patellar resurfacing intra-operatively, in the context of the most modern operative techniques. The present research work contributes to the much debated knowledge of normal and replaced knee kinematics by testing the reported new methodologies. The consistency of these results provides fundamental information for the comprehension and improvement of knee orthopedic treatments. In the future, the reported new techniques can safely be applied in vivo and also adopted in other joint replacements.
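For illustration, an anatomical reference frame of the kind used for navigated tracking can be built from three digitized landmarks by orthonormalization via cross products; the sketch below uses hypothetical landmark values and generic axis conventions, not the thesis's specific patellar definition.

import numpy as np

def frame_from_landmarks(origin, p_axis, p_plane):
    """Orthonormal anatomical frame: x toward p_axis, z normal to the
    plane (origin, p_axis, p_plane), y completing a right-handed triad."""
    x = p_axis - origin
    x /= np.linalg.norm(x)
    z = np.cross(x, p_plane - origin)
    z /= np.linalg.norm(z)
    y = np.cross(z, x)
    return np.column_stack((x, y, z))  # rotation matrix, columns = axes

# Hypothetical digitized landmarks (mm, in the tracker's coordinates):
R = frame_from_landmarks(np.array([0.0, 0.0, 0.0]),
                         np.array([40.0, 2.0, 1.0]),
                         np.array([5.0, 30.0, 0.0]))

The inter- and intra-observer repeatability mentioned above is essentially the repeatability of such constructions when different operators digitize the landmarks.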
Abstract:
This PhD thesis addresses large-scale interactions between climate and marine biogeochemistry. To this end, centennial simulations are performed under present and projected future climate conditions with a coupled ocean-atmosphere model containing a complex marine biogeochemistry model. The role of marine biogeochemistry in the climate system is investigated first. Phytoplankton absorption of solar radiation in the upper ocean enhances sea surface temperatures and upper-ocean stratification. The associated increase in ocean latent heat losses raises atmospheric temperatures and water vapor. Atmospheric circulation is modified at tropical and extratropical latitudes, with impacts on precipitation, incoming solar radiation, and ocean circulation which cause upper-ocean heat content to decrease at tropical latitudes and to increase at middle latitudes. Marine biogeochemistry is tightly related to physical climate variability, which may vary in response to internal natural dynamics or to external forcing such as anthropogenic carbon emissions. Wind changes associated with the North Atlantic Oscillation (NAO), the dominant mode of climate variability in the North Atlantic, affect ocean properties by means of momentum, heat, and freshwater fluxes. Changes in upper-ocean temperature and mixing impact the spatial structure and seasonality of North Atlantic phytoplankton through light and nutrient limitation. These changes affect the capability of the North Atlantic Ocean to absorb atmospheric CO2 and to fix it in sinking particulate organic matter. Low-frequency NAO phases determine a delayed response of ocean circulation, temperature and salinity, which in turn affects stratification and marine biogeochemistry. In 20th- and 21st-century simulations, natural wind fluctuations in the North Pacific, related to the two dominant modes of atmospheric variability, affect the spatial structure and the magnitude of the phytoplankton spring bloom through changes in upper-ocean temperature and mixing. The impacts of human-induced emissions in the 21st century are generally larger than natural climate fluctuations, with the phytoplankton spring bloom starting one month earlier than in the 20th century and having ~50% lower magnitude. This PhD thesis advances the knowledge of bio-physical interactions within the global climate, highlighting the intrinsic coupling between physical climate and biosphere, and providing a framework on which future studies of Earth system change can be built.
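The radiation-absorption feedback described above is commonly parameterized through a chlorophyll-dependent penetration of shortwave radiation; a standard two-band form (given here as an illustrative assumption, not necessarily the scheme of this particular coupled model) is

\[
I(z) = I_0 \left[ R\, e^{-z/d_1} + (1 - R)\, e^{-z/d_2(\mathrm{Chl})} \right],
\]

where the attenuation depth d_2 shrinks as the chlorophyll concentration grows, so more heat is absorbed near the surface, raising SST and stratification.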
Abstract:
With life expectancies increasing around the world, populations are ageing and neurodegenerative diseases have become a global issue. For this reason we have focused our attention on the two most important neurodegenerative diseases: Parkinson's and Alzheimer's. Parkinson's disease (PD) is a chronic, progressive neurodegenerative movement disorder of multi-factorial origin. Environmental toxins as well as agricultural chemicals have been associated with PD. It has been observed that N/OFQ contributes to both the neurotoxicity and the symptoms associated with PD, and that pronociceptin gene expression is up-regulated in the rat substantia nigra (SN) in 6-OHDA- and MPP-induced experimental parkinsonism. First, we investigated the role of the N/OFQ-NOP system in the pathogenesis of PD in an animal model developed using PQ and/or MB. We then studied Alzheimer's disease (AD). This disorder is defined as a progressive neurologic disease of the brain leading to the irreversible loss of neurons and of intellectual abilities, including memory and reasoning, which become severe enough to impede social or occupational functioning. Effective biomarker tests could prevent such devastating damage from occurring. We used the peripheral blood cells of AD-discordant monozygotic twins in the search for peripheral markers that could reflect the pathology within the brain, and that could also support the hypothesis that PBMCs might be a useful model of epigenetic gene regulation in the brain. We investigated the mRNA levels of several genes involved in AD pathogenesis, as well as their DNA methylation, by MSP Real-Time PCR. Finally, by Western blotting, we assessed the immunoreactivity levels for histone modifications. Our results support the idea that epigenetic changes assessed in PBMCs can also be useful in neurodegenerative disorders, like AD and PD, enabling the identification of new biomarkers for the development of early diagnostic programs.
Abstract:
Global change is in the process of transforming tourism, and the interactions between tourism and climate change run in both directions. The present work shows possibilities for adaptation and for an adaptable tourism industry. An overview of the established tourism models presents the state of research. Tourism is massively shaped by three factors: demand and motivation, travel agents and tour operators, and the destination's offering. Regarding motivation, viewed from the perspective of motivational psychology, motive and incentive act on the travel decision, whose basis is processed information. Travel agents and tour operators have a large influence on decision processes, and new information and communication technologies have fundamentally changed their work. The tourism offering is strongly determined by the natural setting as well as by the political system. Vital for the survival of a destination is, in evolutionary terms, fitness maximization, that is, adaptation and change, in order to be able to adjust to altered framework conditions. Measures must be taken particularly with regard to climate change. Market saturation, especially in combination with the current financial crisis, also weighs heavily on destinations. A high capacity for innovation, trend scanning and cooperation in flexible network clusters can generate added value for customers. Fitness maximization is thus the survival goal of a destination and leads to customer satisfaction, which alone can generate growth in a saturated market.
Abstract:
Over the last 60 years, computers and software have enabled incredible advancements in every field. Nowadays, however, these systems are so complicated that it is difficult, if not impossible, to understand whether they meet some requirement or are able to show some desired behaviour or property. This dissertation introduces a Just-In-Time (JIT) a posteriori approach to conformance checking, which identifies any deviation from the desired behaviour as soon as possible so that corrections can be applied. The declarative framework that implements our approach, entirely developed on the promising open-source forward-chaining Production Rule System (PRS) named Drools, consists of three components: 1. a monitoring module based on a novel, efficient implementation of Event Calculus (EC); 2. a general-purpose hybrid reasoning module (the first of its kind) merging temporal, semantic, fuzzy and rule-based reasoning; 3. a logic formalism based on the concept of expectations, introducing Event-Condition-Expectation rules (ECE-rules) to assess the global conformance of a system. The framework is also accompanied by an optional module that provides Probabilistic Inductive Logic Programming (PILP). By shifting the conformance check from after execution to just in time, this approach combines the advantages of many a posteriori and a priori methods proposed in the literature. Quite remarkably, if the corrective actions are explicitly given, the reactive nature of this methodology makes it possible to reconcile a deviation from the desired behaviour as soon as it is detected. In conclusion, the proposed methodology advances the state of the art in conformance checking, helping to fill the gap between humans and increasingly complex technology.
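To convey the idea behind ECE-rules, here is a hedged, framework-free sketch (plain Python rather than Drools DRL, with invented event names): an event satisfying a condition raises an expectation with a deadline, and conformance is settled, just in time, as fulfilled, violated or still pending.

# Hedged sketch of Event-Condition-Expectation (ECE) semantics, outside
# any rule engine: an order event raises the expectation of a matching
# shipment within a deadline; the verdict is decided as events arrive.

from dataclasses import dataclass

@dataclass
class Event:
    kind: str   # e.g. "order" or "ship" (invented for this sketch)
    key: str    # correlation key between the two events
    t: float    # timestamp

def check_conformance(events, deadline=10.0):
    expectations = {}   # key -> time the expectation was raised
    verdicts = []
    for e in sorted(events, key=lambda e: e.t):
        if e.kind == "order":                  # Event + Condition
            expectations[e.key] = e.t          # raise Expectation
        elif e.kind == "ship" and e.key in expectations:
            ok = e.t - expectations.pop(e.key) <= deadline
            verdicts.append((e.key, "fulfilled" if ok else "violated"))
    verdicts += [(k, "pending") for k in expectations]
    return verdicts

print(check_conformance([Event("order", "A", 0.0), Event("ship", "A", 4.0),
                         Event("order", "B", 1.0)]))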
Abstract:
Volatile organic compounds (VOCs) are present in the atmosphere only in traces, but nevertheless play an important role in air chemistry: they influence tropospheric ozone, urban smog and the oxidation capacity of the atmosphere, and they have direct and indirect effects on global climate change. An important class of VOCs are the non-methane hydrocarbons (NMHCs), which come predominantly from anthropogenic sources. For this reason, atmospheric chemists need an instrument that measures VOCs, including the NMHCs, with high time resolution, especially for real-time measurements on board a research aircraft. To this end, the system for the fast observation of organic traces (FOTOS) was designed and built for deployment on a new scientific aircraft that flies at high altitudes and over long ranges, called HALO. FOTOS was subsequently tested in two ground-based measurement campaigns. FOTOS was designed and built with a custom-made, automated, cryogenic three-trap sampling system and an adapted, commercially acquired fast GC-MS. The aim of this design was to increase versatility and reduce the potential for interference, so no chemical drying agents or adsorbent materials were used. FOTOS achieved a sampling interval of 5.5 minutes while measuring at least 13 different C2- to C5-NMHCs. The three-sigma detection limits for n- and iso-pentane were determined to be 2.6 and 2.0 pptv, respectively. Laboratory tests confirmed that FOTOS is a versatile, robust, highly automated, precise, accurate and sensitive instrument, suitable for real-time measurements of VOCs at sampling frequencies appropriate for a research aircraft such as HALO. To confirm the performance of FOTOS, an intercomparison with the GC-FID system at the Meteorological Observatory Hohenpeißenberg, a WMO-GAW global station, was carried out from 26 January to 4 February 2010. Thirteen different NMHCs were analyzed and compared within the framework of the GAW Data Quality Objectives (DQOs). More than 80% of the measurements of six C3- to C5-NMHCs met these DQOs. This first field campaign highlighted the robustness and measurement accuracy of FOTOS, in addition to the advantage of its higher sampling frequency, even in a ground-based measurement. To demonstrate the capabilities of this instrument in the field, FOTOS measured selected light NMHCs during a measurement campaign in the boreal forest, HUMPPA-COPEC 2010. From 12 July to 12 August 2010, an international group of institutes and instruments took part in measurements of physical and chemical properties of the gas and particle phases of the air above the boreal forest at the SMEAR II station near Hyytiälä, Finland. Several main points of interest were identified in the alkane mixing ratios and in the pentane isomer ratio, in particular clearly distinct periods of low and high variability, three biomass-burning plumes from Russian forest fires, and two days of extremely clean air from the polar region. Comparisons of the NMHCs with other anthropogenic tracers identified several sources of anthropogenic influence at the site and allowed a distinction between local and more distant sources. A minimal natural contribution to the 24-hour cycle of NOx was inferred from the correlation of NOx with the alkanes. Estimates of the age of air masses from the pentane isomer ratio were complicated by changing source ratios and by the peculiarities of photochemistry during the high-latitude summer. These measurements demonstrated the value of measuring light NMHCs, even in remote regions, as an additional specific marker of anthropogenic influence.
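The air-mass age estimate mentioned above rests on the standard hydrocarbon-clock relation: since iso- and n-pentane are removed by OH at slightly different rate constants k_i and k_n, an (assumed) constant emission ratio decays with photochemical age t as

\[
t = \frac{1}{[\mathrm{OH}]\,(k_i - k_n)} \,\ln\!\left(\frac{\left([i\text{-pentane}]/[n\text{-pentane}]\right)_{0}}{[i\text{-pentane}]/[n\text{-pentane}]}\right),
\]

a textbook form given here for orientation only; it is precisely the assumptions of a fixed source ratio and simple OH chemistry that the changing source mixes and unusual high-latitude summer photochemistry noted above undermine.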