860 results for life-time contributions and retirement


Relevance:

100.00%

Publisher:

Abstract:

Distributions of free and bound n-alkanes, n-alkanoic acids, and n-alkanols were determined in order to compare the character of organic matter contained in organic-carbon-rich sediments from two sites sampled by the hydraulic piston corer. Two diatomaceous debris-flow samples of Pleistocene age were obtained from Hole 530B in the Angola Basin. A sample of bioturbated Pleistocene diatomaceous clay and another of bioturbated late Miocene nannofossil clay were collected from Hole 532 on the Walvis Ridge. Geolipid distributions of all samples contain large terrigenous contributions and lesser amounts of marine components. Similarities in organic matter contents of Hole 530B and Hole 532 sediments suggest that a common depositional setting, probably on the Walvis Ridge, was the original source of these sediments through Quaternary, and possibly late Neogene, times and that downslope relocation of these biogenic deposits has frequently occurred.

Relevance:

100.00%

Publisher:

Abstract:

In arctic populations of Macrothrix hirsuticornis, life cycles are mainly governed by temperature. This was found by using laboratory cultures in combination with the analysis of population samples from waters in Svalbard. In arctic waters, ex-ephippial females usually produce gamogenetic F1 females together with a high percentage of males, which have to fertilize the resting eggs. Temperatures around 14°C, which are very rare in the waters of Svalbard, will induce parthenogenetic females in the F1 and even the F2 generation, a mode of reproduction normally found in Macrothrix populations of Central Europe. This was found in laboratory cultures of M. hirsuticornis from Bear Island, and there was evidence that a similar cycle occurs in warm wells in Spitsbergen. The arctic distribution of M. hirsuticornis mainly depends on temperature, which regulates the speed of individual development; but this can only be understood together with the length of time during which suitable living conditions are available. Physiological adaptations to life in high-latitude waters could not be found, in spite of the extremely northern occurrence of M. hirsuticornis.

Relevance:

100.00%

Publisher:

Abstract:

Ocean acidification, caused by increased atmospheric carbon dioxide (CO2) concentrations, is currently an important environmental problem. It is therefore necessary to investigate the effects of ocean acidification on all life stages of a wide range of marine organisms. However, few studies have examined the effects of increased CO2 on the early life stages of organisms, including corals. Using a range of pH values (pH 7.3, 7.6, and 8.0) in duplicate manipulative aquarium experiments, we evaluated the effects of increased CO2 on early life stages (larval and polyp stages) of Acropora spp. with the aim of estimating CO2 tolerance thresholds at these stages. Larval survival rates did not differ significantly between the reduced-pH and control conditions. In contrast, polyp growth and algal infection rates were significantly decreased at reduced pH levels compared with control conditions. These results suggest that future ocean acidification may lead to reduced primary polyp growth and delayed establishment of symbiosis. Stress exposure experiments over longer experimental time scales and at lower CO2 concentrations than those used in this study are needed to establish the threshold of CO2 emissions required to sustain coral reef ecosystems.

Relevance:

100.00%

Publisher:

Abstract:

Pollen and organic-walled dinoflagellate cyst assemblages from core GeoB 9503-5 retrieved from the mud-belt (~50 m water depth) off the Senegal River mouth have been analyzed to reconstruct short-term palaeoceanographic and palaeoenvironmental changes in subtropical NW Africa during the time interval from ca. 4200 to 1200 cal yr BP. Our study emphasizes significant coeval changes in continental and oceanic environments in and off Senegal and shows that initial dry conditions were followed by a strong and rapid increase in humidity between ca. 2900 and 2500 cal yr BP. After ca. 2500 cal yr BP, the environment slowly became drier again as indicated by slight increases in Sahelian savannah and desert elements in the pollen record. Around ca. 2200 cal yr BP, this relatively dry period ended with periodic pulses of high terrigenous contributions and strong fluctuations in fern spore and river plume dinoflagellate cyst percentages as well as in the fluxes of pollen, dinoflagellate cysts, fresh-water algae and plant cuticles, suggesting "episodic flash flood" events of the Senegal River. The driest phase developed after about 2100 cal yr BP.

Relevance:

100.00%

Publisher:

Abstract:

Early life stages of marine crustaceans respond sensitively to elevated seawater PCO2. However, the underlying physiological mechanisms have not been studied well. We therefore investigated the effects of elevated seawater PCO2 on oxygen consumption, dry weight, elemental composition, median developmental time (MDT) and mortality in zoea I larvae of the spider crab Hyas araneus (Svalbard 79°N/11°E; collection, May 2009; hatch, December 2009). At the time of moulting, the oxygen consumption rate had reached a steady-state level under control conditions. In contrast, elevated seawater PCO2 caused the metabolic rate to rise continuously, leading to a maximum 1.5-fold increase beyond the control level a few days before moulting into the second stage (zoea II), followed by a pronounced decrease. Dry weight of larvae reared under high CO2 conditions was lower than in control larvae at the beginning of the moult cycle, yet this difference had disappeared by the time of moulting. MDT of zoea I varied between 45 ± 1 days under control conditions and 42 ± 2 days under the highest seawater CO2 concentration. The present study indicates that larval development under elevated seawater PCO2 levels results in higher metabolic costs during premoulting events in zoea I. However, H. araneus zoea I larvae seem to be able to compensate for the higher metabolic costs, as larval MDT and survival were not affected by elevated PCO2 levels.

Relevance:

100.00%

Publisher:

Abstract:

During Integrated Ocean Drilling Program Expedition 302 (Arctic Coring Expedition, ACEX) a more than 200 m thick sequence of Paleogene organic-carbon (OC)-rich (black shale type) sediments was drilled. Here we present new biomarker data determined in ACEX sediment samples to decipher the processes controlling OC accumulation and their paleoenvironmental significance during periods of Paleogene global warmth and proposed increased freshwater discharge in the early Cenozoic. Specific source-related biomarkers including n-alkanes, fatty acids, isoprenoids, carotenoids, hopanes/hopenes, hopanoic acids, aromatic terpenoids, and long-chain alkenones show a high variability of components derived from marine and terrestrial sources. The distribution of hopanoic acid isomers is dominated by compounds with the biological 17β(H),21β(H) configuration, indicating a low level of maturity. On the basis of the biomarker data, the terrestrial OC supply was significantly enriched during the late Paleocene and part of the earliest Eocene, whereas increased aquatic contributions and euxinic conditions of variable intensity were determined for the Paleocene-Eocene thermal maximum and Eocene thermal maximum 2 events as well as for the middle Eocene time interval. Furthermore, samples from the middle Eocene are characterized by the occurrence of long-chain alkenones, high proportions of lycopane, and high ratios (>0.6) of (n-C35 + lycopane)/n-C31. The occurrence of C37 alkenones, which were first detected toward the end of the Azolla freshwater event, indicates that the OC became more marine in origin during the middle Eocene. Preliminary UK'37-based sea surface temperature (SST) values display a long-term temperature decrease of about 15°C (from 25°C to 10°C) during the time interval 49-44.5 Ma, coinciding with the global benthic δ18O cooling trend after the early Eocene climatic optimum. At about 46 Ma, in parallel with the onset of ice-rafted debris, SSTs (interpreted as summer temperatures) decreased to values <15°C. For the late early Miocene an SST of 11°-15°C was determined. Most of the middle Eocene ACEX sediments are characterized by a smooth short-chain n-alkane distribution, which may point to natural oil-type hydrocarbons from leakage of petroleum reservoirs or erosion of related source rocks and redeposition.
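
The abstract reports UK'37-based SSTs without restating how the proxy works; for orientation, the alkenone unsaturation index and one widely used global core-top calibration (Müller et al., 1998) are given below. The calibration shown is an assumption for illustration; the abstract does not state which calibration was actually applied.

$$U^{K'}_{37} = \frac{[C_{37:2}]}{[C_{37:2}] + [C_{37:3}]}, \qquad U^{K'}_{37} \approx 0.033\,\mathrm{SST} + 0.044 \;\Rightarrow\; \mathrm{SST} \approx \frac{U^{K'}_{37} - 0.044}{0.033}\ (^{\circ}\mathrm{C})$$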

Relevance:

100.00%

Publisher:

Abstract:

Coral reefs are globally threatened by climate change-related ocean warming and ocean acidification (OA). To date, slow-response mechanisms such as genetic adaptation have been considered the major determinant of coral reef persistence, with little consideration of rapid-response acclimatization mechanisms. These rapid mechanisms, such as parental effects that can contribute to trans-generational acclimatization (e.g. epigenetics), have, however, been identified as important contributors to offspring response in other systems. We present the first evidence of parental effects in a cross-generational exposure to temperature and OA in reef-building corals. Here, we exposed adults to high (28.9°C, 805 µatm PCO2) or ambient (26.5°C, 417 µatm PCO2) temperature and OA treatments during the larval brooding period. Exposure to the high treatment negatively affected adult performance, but their larvae exhibited size differences and metabolic acclimation when subsequently re-exposed, unlike larvae from parents exposed to ambient conditions. Understanding the innate capacity corals possess to respond to current and future climatic conditions is essential to reef protection and maintenance. Our results indicate that parental effects may play an important role by (1) ameliorating the effects of stress through preconditioning and adaptive plasticity, and/or (2) amplifying the negative parental response through latent effects on future life stages. Whether the consequences of parental effects and the potential for trans-generational acclimatization are beneficial or maladaptive, our work identifies a critical need to expand currently proposed climate change outcomes for corals to further assess rapid-response mechanisms, including non-genetic inheritance through parental contributions and classical epigenetic mechanisms.

Relevance:

100.00%

Publisher:

Abstract:

It seems that a backward-bending labor supply function can be observed in Central Asian countries such as Uzbekistan and Kazakhstan. People's basic needs are satisfied, and they do not increase their labor supply even if wages increase. It is possible to find cases in which slowdowns increase when a manager in a firm enforces penalties on workers who engage in slowdowns. This phenomenon occurs because a worker always prefers an equilibrium position further up the labor supply curve. This article explains how penalties can increase free-riding and how to avoid it.
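
The backward-bending shape invoked above can be reproduced numerically. The sketch below is not from the article; it assumes a CRRA utility over consumption (financed by the wage plus some non-labor income) minus a convex disutility of hours, with purely illustrative parameter values, and shows optimal hours first rising and then falling as the wage grows.

```java
// Numerical illustration of a backward-bending labor supply curve (not from the
// article). Assumed utility: U(c, h) = (c^(1-sigma) - 1)/(1 - sigma)
//                                      - psi * h^(1+phi)/(1+phi), with c = w*h + y.
public class BackwardBendingSupply {

    static double utility(double w, double h, double y,
                          double sigma, double psi, double phi) {
        double c = w * h + y;   // consumption from wage income plus non-labor income
        return (Math.pow(c, 1 - sigma) - 1) / (1 - sigma)
             - psi * Math.pow(h, 1 + phi) / (1 + phi);
    }

    // Grid search for the utility-maximizing hours at a given wage.
    static double optimalHours(double w, double y,
                               double sigma, double psi, double phi) {
        double bestH = 0.0, bestU = Double.NEGATIVE_INFINITY;
        for (double h = 0.01; h <= 16.0; h += 0.01) {
            double u = utility(w, h, y, sigma, psi, phi);
            if (u > bestU) { bestU = u; bestH = h; }
        }
        return bestH;
    }

    public static void main(String[] args) {
        double y = 2.0, sigma = 2.0, psi = 0.1, phi = 1.0;   // illustrative parameters
        for (double w = 0.5; w <= 20.0; w *= 2) {
            System.out.printf("wage = %5.1f  optimal hours = %5.2f%n",
                              w, optimalHours(w, y, sigma, psi, phi));
        }
    }
}
```

With these parameters, hours rise up to a moderate wage and then decline: once basic consumption needs are covered, the income effect dominates the substitution effect, which is the mechanism the abstract appeals to.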

Relevance:

100.00%

Publisher:

Abstract:

Distributed real-time embedded systems are becoming increasingly important to society. More demands will be made on them and greater reliance will be placed on the delivery of their services. A relevant subset of them is high-integrity or hard real-time systems, where failure can cause loss of life, environmental harm, or significant financial loss. Additionally, the evolution of communication networks and paradigms, as well as the demand for processing power and fault tolerance, has motivated the interconnection of electronic devices; many of these communication mechanisms can transfer data at high speed. The concept of distributed systems emerged to describe systems whose different parts are executed on several nodes that interact with each other via a communication network. Java's popularity, facilities and platform independence have made it an interesting language for the real-time and embedded community. This was the motivation for the development of the RTSJ (Real-Time Specification for Java), a language extension intended to allow the development of real-time systems. The use of Java in the development of high-integrity systems requires strict development and testing techniques. However, the RTSJ includes a number of language features that are forbidden in such systems. In the context of the HIJA project, the HRTJ (Hard Real-Time Java) profile was developed to define a robust subset of the language that is amenable to static analysis for high-integrity system certification. Currently, a specification under the Java Community Process (JSR-302) is being developed; its purpose is to define the capabilities needed to create safety-critical applications with Java technology, called Safety Critical Java (SCJ). However, neither the RTSJ nor its profiles provide facilities to develop distributed real-time applications. This is an important issue, as most current and future systems will be distributed. The Distributed RTSJ (DRTSJ) Expert Group was created under the Java Community Process (JSR-50) in order to define appropriate abstractions to overcome this problem; currently there is no formal specification. The aim of this thesis is to develop a communication middleware that is suitable for the development of distributed hard real-time systems in Java, based on the integration of the RMI (Remote Method Invocation) model with the HRTJ profile. It has been designed and implemented keeping in mind the main requirements, such as predictability and reliability of the timing behavior and of the resource usage. The design starts with the definition of a computational model which identifies, among other things, the communication model, the most appropriate underlying network protocols, the analysis model, and a subset of Java for hard real-time systems. In the design, remote references are the basic means for building distributed applications; they are associated with all the non-functional parameters and resources needed to implement synchronous or asynchronous remote invocations with real-time attributes. The proposed middleware separates resource allocation from the execution itself by defining two phases and a specific threading mechanism that guarantees a suitable timing behavior. It also includes mechanisms to monitor the functional and the timing behavior, and it provides independence from the network protocol by defining a network interface and protocol-specific modules. The JRMP protocol was modified to include the two phases, non-functional parameters, and message size optimizations.
Although serialization is one of the fundamental operations needed to ensure proper data transmission, current implementations are not suitable for hard real-time systems and there are no alternatives. This thesis proposes a predictable serialization that introduces a new compiler to generate optimized code according to the computational model. The proposed solution has the advantage of allowing us to schedule the communications and to adjust the memory usage at compilation time. In order to validate the design and the implementation, a demanding validation process was carried out, with emphasis on the functional behavior, the memory usage, the processor usage (the end-to-end response time and the response time in each functional block) and the network usage (actual consumption compared with the calculated consumption). The results obtained in an industrial application developed by Thales Avionics (a Flight Management System) and in exhaustive tests show that the design and the prototype are reliable for industrial applications with strict timing requirements.

Distributed real-time embedded systems are increasingly important to society. Demand for them is growing and we depend ever more on the services they provide. High-integrity systems constitute a particularly important subset; they are characterized by the fact that a failure in their operation can cause loss of human life, damage to the environment or significant financial loss. The need to satisfy strict timing requirements makes their development more complex. As embedded systems continue to spread through our society, it is necessary to keep development costs under control through the use of suitable techniques in their design, maintenance and certification; in particular, a flexible, hardware-independent technology is required. The evolution of communication networks and paradigms, together with the need for greater processing power and fault tolerance, has motivated the interconnection of electronic devices, whose communication mechanisms allow data to be transferred at high speed. In this context, the concept of a distributed system has emerged: a system whose components execute in parallel on several nodes and interact with each other via communication networks. An interesting class is that of real-time systems that are neutral with respect to their execution platform, characterized by the fact that this platform is not known at design time. This property is relevant because such systems should run on the widest possible variety of architectures, have an average lifetime of more than ten years, and may change the place where they execute. The Java programming language is a good basis for the development of this kind of system; this is why the RTSJ (Real-Time Specification for Java) was created, a language extension that allows the development of real-time systems. However, the RTSJ does not provide facilities for the development of distributed real-time applications. This is an important limitation, since most current and future systems will be distributed. The DRTSJ (Distributed RTSJ) group was created under the Java Community Process (JSR-50) in order to define abstractions that address this limitation, but no formal specification yet exists.

The aim of this thesis is to develop a communication middleware for the development of distributed real-time systems in Java, based on the integration of the RMI (Remote Method Invocation) model with the HRTJ profile. It has been designed and implemented with the main requirements in mind, namely predictability and reliability of the timing behavior and of the resource usage. The design starts from the definition of a computational model which identifies, among other things, the communication model, the most appropriate underlying network protocols, the analysis model, and a subset of Java for hard real-time systems. In this design, remote references are the basic means for building distributed applications; they are associated with all the non-functional parameters and resources needed to execute synchronous or asynchronous remote invocations with real-time attributes. The proposed middleware separates resource allocation from execution itself by defining two phases and a specific threading mechanism that guarantees suitable timing behavior. Mechanisms to monitor functional and timing behavior have also been included. Independence from the network protocol has been sought by defining a network interface and specific modules, and the JRMP protocol has been modified to include the different phases, non-functional parameters and message size optimizations. Although serialization is one of the fundamental operations for ensuring correct data transmission, current implementations are not suitable for critical systems and there are no alternatives. This work proposes a predictable serialization that required the development of a new compiler to generate optimized code according to the computational model. The proposed solution has the advantage that it allows communications to be scheduled and memory usage to be adjusted at compile time. In order to validate the design and the implementation, a demanding validation process was carried out, with emphasis on the functional behavior, memory usage, processor usage (end-to-end response time and the response time of each functional block) and network usage (actual consumption compared with the estimated consumption). The good results obtained in an industrial application developed by Thales Avionics (a flight management system) and in exhaustive tests have shown that the design and the prototype are reliable for industrial applications with strict timing requirements.
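
The record above is a thesis abstract rather than code, so the sketch below is not part of the thesis middleware. It uses only the standard java.rmi API and a hypothetical FlightDataService interface to make the notion of a synchronous remote invocation with a deadline concrete; in the proposed middleware such real-time attributes would be attached to the remote reference itself and resources would be reserved in a separate phase.

```java
// Hypothetical sketch (standard RMI only, not the thesis middleware): a
// synchronous remote invocation with a client-side deadline check standing in
// for the real-time attributes the middleware associates with remote references.
import java.rmi.Naming;
import java.rmi.Remote;
import java.rmi.RemoteException;

interface FlightDataService extends Remote {           // hypothetical service
    double currentAltitude() throws RemoteException;   // synchronous remote method
}

public class DeadlineAwareClient {
    public static void main(String[] args) throws Exception {
        // In the proposed middleware, resource allocation happens in a separate
        // phase; here we only look up the remote reference.
        FlightDataService svc =
            (FlightDataService) Naming.lookup("rmi://localhost/FlightDataService");

        long deadlineNanos = 2_000_000;                 // 2 ms deadline (illustrative)
        long start = System.nanoTime();
        double altitude = svc.currentAltitude();        // synchronous remote invocation
        long elapsed = System.nanoTime() - start;

        if (elapsed > deadlineNanos) {
            System.err.println("Deadline missed: " + elapsed + " ns");
        } else {
            System.out.println("Altitude " + altitude + " read within " + elapsed + " ns");
        }
    }
}
```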

Relevance:

100.00%

Publisher:

Abstract:

A particle accelerator is any device that, using electromagnetic fields, is able to transfer energy to charged particles (typically electrons or ionized atoms), accelerating them up to the level required for its purpose. The applications of particle accelerators are countless, ranging from the common TV CRT, through medical X-ray devices, to the large ion colliders used to probe the finest details of matter. Other engineering applications include ion implantation devices used to obtain better semiconductors and materials with remarkable properties. Materials that must withstand irradiation in future nuclear fusion plants also benefit from particle accelerators. Many devices are required in a particle accelerator for its correct operation. The most important are the particle sources, the guiding, focusing and correcting magnets, the radiofrequency accelerating cavities, the fast deflection devices, the beam diagnostic mechanisms and the particle detectors. Historically, most fast particle deflection devices have been built using copper coils and ferrite cores, which can produce a relatively fast magnetic deflection but need large voltages and currents to counteract the high coil inductance, limiting the response to the microsecond range. Beam stability considerations and the new range of energies and sizes of present-day accelerators and their rings require new devices featuring improved wakefield behaviour and faster response (in the nanosecond range). This can only be achieved with an electromagnetic deflection device based on a transmission line. The electromagnetic deflection device (strip-line kicker) produces a transverse displacement of the particle beam travelling close to the speed of light, in order to extract the particles to another experiment or to inject them into a different accelerator. The deflection is carried out by means of two short, opposite-phase pulses; the particles are diverted by the integrated Lorentz force of the electromagnetic field travelling along the kicker. This Thesis presents a detailed calculation, manufacturing and test methodology for strip-line kicker devices. The methodology is then applied to two real cases which are fully designed, built, tested and finally installed in the CTF3 accelerator facility at CERN (Geneva). Analytical and numerical calculations, both in 2D and 3D, are detailed, starting from the basic specifications, in order to obtain a conceptual design. Time-domain and frequency-domain calculations are carried out in the process using different FDM and FEM codes; among other concepts, scattering parameters, resonant higher-order modes and wakefields are analyzed. Several contributions are presented in the calculation process dealing specifically with strip-line kicker devices fed by electromagnetic pulses. Materials and components typically used for the fabrication of these devices are analyzed in the manufacturing section. Mechanical supports and connections of the electrodes are also detailed, presenting some interesting contributions on these concepts. The electromagnetic and vacuum tests, required to ensure that the manufactured devices fulfil the specifications, are then analyzed. Finally, and only from the analytical point of view, the strip-line kickers are studied together with a pulsed power supply based on solid-state power switches (MOSFETs). The solid-state technology applied to pulsed power supplies is introduced, and several circuit topologies are modelled and simulated to obtain fast pulses with good flat-tops.
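
The deflection mechanism described above can be summarized with a simplified estimate (an illustration, not a result from the thesis): for an ultra-relativistic beam counter-propagating with respect to a matched TEM pulse of voltage V applied across plates separated by a gap d over an active length L, the electric and magnetic contributions to the Lorentz force add, and the kick angle is approximately

$$\theta \;=\; \frac{\Delta p_{\perp}}{p} \;\approx\; \frac{e\,\big(E_{\perp} + c\,B_{\perp}\big)\,L}{p\,c} \;\approx\; \frac{2\,e\,V\,L}{d\,p\,c},$$

neglecting fringe fields and the geometric coverage factor of the electrodes.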

Relevance:

100.00%

Publisher:

Abstract:

The purpose of this paper is to add to the empirical evidence on the relevance of real options for explaining firm investment decisions in oligopolistic markets. We study an actual investment case in the Spanish mobile telephony industry: the entry into the market of a new operator, Yoigo. We analyze the option to abandon in order to show the relevance of the possibility of selling the company in an oligopolistic market where competitors are not allowed free entry. The NPV (net present value) of the new entrant is calculated as a starting point. Then, based on the general approach proposed by Copeland and Antikarov (2001), a binomial tree is used to model managerial flexibility in discrete time periods and to value the option to abandon. The strike price of the option is calculated based on incremental EBITDA margins resulting from selling the customer base or merging with a competitor.
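
To make the valuation approach concrete, the sketch below (not taken from the paper, and with purely illustrative figures rather than Yoigo's actual cash flows) values an option to abandon on a binomial tree in the spirit of Copeland and Antikarov (2001): at every node the firm keeps the larger of the continuation value and the salvage (resale/merger) value.

```java
// Illustrative binomial valuation of an option to abandon (all figures assumed).
public class AbandonmentOption {
    public static void main(String[] args) {
        double v0 = 100.0;      // present value of operating cash flows (assumed)
        double salvage = 80.0;  // strike: resale / merger value (assumed)
        double sigma = 0.35;    // annual volatility of project value (assumed)
        double r = 0.04;        // risk-free rate (assumed)
        int steps = 4;          // yearly steps
        double dt = 1.0;

        double u = Math.exp(sigma * Math.sqrt(dt));
        double d = 1.0 / u;
        double p = (Math.exp(r * dt) - d) / (u - d);    // risk-neutral probability
        double disc = Math.exp(-r * dt);

        // Terminal nodes: keep the project or abandon for the salvage value.
        double[] value = new double[steps + 1];
        for (int i = 0; i <= steps; i++) {
            double vT = v0 * Math.pow(u, steps - i) * Math.pow(d, i);
            value[i] = Math.max(vT, salvage);
        }
        // Backward induction with the abandonment decision at every node.
        for (int step = steps - 1; step >= 0; step--) {
            for (int i = 0; i <= step; i++) {
                double cont = disc * (p * value[i] + (1 - p) * value[i + 1]);
                value[i] = Math.max(cont, salvage);
            }
        }
        System.out.printf("Expanded NPV (with option to abandon): %.2f%n", value[0]);
        System.out.printf("Value of the abandonment option:       %.2f%n", value[0] - v0);
    }
}
```

The difference between the expanded NPV and the static value of the project (here v0) is the value added by the flexibility to abandon, which is the quantity the paper sets out to measure.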

Relevance:

100.00%

Publisher:

Abstract:

Although several profiling techniques for identifying performance bottlenecks in logic programs have been developed, they are generally not automatic and in most cases they do not provide enough information for identifying the root causes of such bottlenecks. This complicates using their results for guiding performance improvement. We present a profiling method and tool that provides such explanations. Our profiler associates cost centers with certain program elements and can measure different types of resource-related properties that affect performance, preserving the precedence of cost centers in the call graph. It includes an automatic method for detecting procedures that are performance bottlenecks. The profiling tool has been integrated in a previously developed run-time checking framework to allow verification of certain properties when they cannot be verified statically. The approach allows checking global computational properties which require complex instrumentation tracking information about previous execution states, such as, e.g., that the execution time accumulated by a given procedure is not greater than a given bound. We have built a prototype implementation, integrated it in the Ciao/CiaoPP system and successfully applied it to performance improvement, automatic optimization (e.g., resource-aware specialization of programs), run-time checking, and debugging of global computational properties (e.g., resource usage) in Prolog programs.
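
The profiler described above targets Prolog programs in the Ciao/CiaoPP system, so the sketch below is not that tool; it is a minimal Java illustration of the underlying idea of cost centers and of run-time checking of a global computational property such as an accumulated-time bound.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Supplier;

// Minimal sketch (not the Ciao/CiaoPP profiler): each cost center accumulates
// execution time, and a run-time check verifies a global property such as
// "the time accumulated by procedure X never exceeds a given bound".
public class CostCenterProfiler {
    private final Map<String, Long> accumulatedNanos = new HashMap<>();

    public <T> T measure(String costCenter, Supplier<T> body) {
        long start = System.nanoTime();
        try {
            return body.get();
        } finally {
            accumulatedNanos.merge(costCenter, System.nanoTime() - start, Long::sum);
        }
    }

    // Run-time check of a global computational property.
    public void checkBound(String costCenter, long boundMillis) {
        long usedMillis = accumulatedNanos.getOrDefault(costCenter, 0L) / 1_000_000;
        if (usedMillis > boundMillis) {
            throw new IllegalStateException(costCenter + " exceeded its time bound: "
                    + usedMillis + " ms > " + boundMillis + " ms");
        }
    }

    public static void main(String[] args) {
        CostCenterProfiler profiler = new CostCenterProfiler();
        int n = profiler.measure("fib", () -> fib(32));   // attribute cost to "fib"
        profiler.checkBound("fib", 500);                  // assert accumulated time <= 500 ms
        System.out.println("fib(32) = " + n);
    }

    private static int fib(int k) { return k < 2 ? k : fib(k - 1) + fib(k - 2); }
}
```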

Relevance:

100.00%

Publisher:

Abstract:

At present, monitoring the dynamics of environmental processes is considered a topic of great interest in the environmental field. The spatio-temporal coverage of remote sensing data provides continuous information with a high temporal frequency, allowing the evolution of ecosystems to be analyzed at different spatio-temporal scales. Although the value of remote sensing has been widely proven, only a small number of methodologies currently allow its analysis in a quantitative way. This thesis proposes a working framework to exploit remote sensing time series, based on the combination of statistical time series analysis and phenometrics. The main objective is to demonstrate the use of remote sensing time series to analyze the dynamics of environmental variables in a quantitative way. The specific objectives are: (1) to assess these environmental variables and (2) to develop empirical models to predict their future behavior. These objectives are realized in four applications whose specific aims are: (1) to assess and map phenological stages of the cotton crop using spectral analysis and phenometrics, (2) to assess and model forest fire seasonality in two bioclimatic regions using dynamic models, (3) to predict forest fire risk at the pixel level using dynamic models, and (4) to assess vegetation functioning on the basis of temporal autocorrelation and phenometrics. The results of this thesis show the usefulness of function fitting for modelling the spectral indices AS1 and AS2. The phenological parameters derived from function fitting allow different phenological stages of the cotton crop to be identified. Spectral analysis has demonstrated, in a quantitative way, the presence of one cycle in the AS2 index and of two cycles in AS1, as well as the unimodal and bimodal behavior of fire seasonality in the Mediterranean and temperate regions respectively. Autoregressive models have been used to characterize the dynamics of fire seasonality and to predict forest fire risk at the pixel level very accurately. The usefulness of temporal autocorrelation for defining and characterizing vegetation functioning at the pixel level has been demonstrated. Finally, the concept of "Optical Functional Type" has been defined, whereby it is proposed that pixels should be considered as temporal units and analyzed according to their temporal dynamics.

A good understanding of land surface processes is considered a key subject in environmental sciences. The spatial-temporal coverage of remote sensing data provides continuous observations with a high temporal frequency, allowing the assessment of ecosystem evolution at different temporal and spatial scales. Although the value of remote sensing time series has been firmly proved, only a few time series methods have been developed for analyzing these data in a quantitative and continuous manner. In the present dissertation, a working framework to exploit remote sensing time series is proposed, based on the combination of time series analysis and a phenometric approach. The main goal is to demonstrate the use of remote sensing time series to analyze environmental variable dynamics quantitatively. The specific objectives are (1) to assess environmental variables based on remote sensing time series and (2) to develop empirical models to forecast environmental variables. These objectives have been achieved in four applications whose specific objectives are (1) assessing and mapping cotton crop phenological stages using spectral and phenometric analyses, (2) assessing and modeling fire seasonality in two different ecoregions with dynamic models, (3) forecasting forest fire risk on a pixel basis with dynamic models, and (4) assessing vegetation functioning based on temporal autocorrelation and phenometric analysis. The results of this dissertation show the usefulness of function-fitting procedures to model AS1 and AS2. Phenometrics derived from the function-fitting procedure make it possible to identify cotton crop phenological stages. Spectral analysis has demonstrated quantitatively the presence of one cycle in AS2 and two in AS1, as well as the unimodal and bimodal behaviour of fire seasonality in the Mediterranean and temperate ecoregions respectively. Autoregressive models have been used to characterize the dynamics of fire seasonality in the two ecoregions and to forecast fire risk accurately on a pixel basis. The usefulness of temporal autocorrelation to define and characterize land surface functioning has been demonstrated. Finally, the "Optical Functional Types" concept has been proposed, in which pixels are treated as temporal units and analyzed on the basis of their temporal dynamics or functioning.
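
As a concrete illustration of the autoregressive modelling mentioned above (not code from the thesis), the sketch below fits a simple AR(1) model by ordinary least squares to a synthetic per-pixel series and produces a one-step-ahead forecast; in the thesis' setting the series would be a fire-seasonality or spectral-index time series.

```java
// Minimal AR(1) sketch: fit x_t = c + phi * x_{t-1} + e_t by ordinary least
// squares and produce a one-step-ahead forecast. The series is synthetic.
public class Ar1Forecast {
    public static void main(String[] args) {
        double[] x = {0.21, 0.34, 0.30, 0.45, 0.52, 0.49, 0.61, 0.58, 0.66, 0.71};

        // Regress x_t on x_{t-1}.
        int n = x.length - 1;
        double meanY = 0, meanX = 0;
        for (int t = 1; t < x.length; t++) { meanY += x[t]; meanX += x[t - 1]; }
        meanY /= n; meanX /= n;

        double num = 0, den = 0;
        for (int t = 1; t < x.length; t++) {
            num += (x[t - 1] - meanX) * (x[t] - meanY);
            den += (x[t - 1] - meanX) * (x[t - 1] - meanX);
        }
        double phi = num / den;           // autoregressive coefficient
        double c = meanY - phi * meanX;   // intercept

        double forecast = c + phi * x[x.length - 1];   // one-step-ahead forecast
        System.out.printf("phi = %.3f, c = %.3f, next value = %.3f%n", phi, c, forecast);
    }
}
```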

Relevance:

100.00%

Publisher:

Abstract:

Corrosion of reinforcing steel in concrete due to chloride ingress is one of the main causes of the deterioration of reinforced concrete structures. The structures most affected by such corrosion are buildings in marine zones and structures exposed to de-icing salts, such as highways and bridges. The process is accompanied by an increase in volume of the corrosion products at the rebar-concrete interface. Depending on the level of oxidation, iron can expand to as much as six times its original volume. This increase in volume exerts tensile stresses in the surrounding concrete which result in cracking and spalling of the concrete cover if the concrete tensile strength is exceeded. The mechanism by which steel embedded in concrete corrodes in the presence of chloride is the local breakdown of the passive layer formed in the highly alkaline environment of the concrete. It is assumed that corrosion initiates when a critical chloride content reaches the rebar surface. The mathematical formulation idealizes the corrosion sequence as a two-stage process: an initiation stage, during which chloride ions penetrate to the reinforcing steel surface and depassivate it, and a propagation stage, in which active corrosion takes place until cracking of the concrete cover has occurred.

The aim of this research is to develop computer tools to evaluate the duration of the service life of reinforced concrete structures, considering both the initiation and propagation periods. Such tools must offer a friendly interface to facilitate their use by researchers whose background is not in numerical simulation. For the evaluation of the initiation period, different tools have been developed. Program TavProbabilidade provides the means to carry out a probability analysis of a chloride ingress model; such a tool is necessary because of the lack of data and the general uncertainties associated with the phenomenon of chloride diffusion. It differs from the deterministic approach because it computes not just a chloride profile at a certain age, but a range of chloride profiles, each with its probability of occurrence. Program TavProbabilidade_Fiabilidade carries out reliability analyses of the initiation period. It takes into account the critical value of the chloride concentration on the steel that causes breakdown of the passive layer and the beginning of the propagation stage. It differs from the deterministic analysis in that it does not predict whether corrosion is going to begin or not, but quantifies the probability of corrosion initiation. Program TavDif_1D was created to perform a one-dimensional deterministic analysis of the chloride diffusion process by the finite element method (FEM), which numerically solves Fick's second law. Despite the different FEM solvers already developed in one dimension, the decision to create a new code (TavDif_1D) was taken because of the need for a solver with a friendly interface for pre- and post-processing according to the needs of IETCC. An innovative tool was also developed, with a systematic method devised to compare the ability of the different 1D models to predict the actual evolution of chloride ingress based on experimental measurements, and also to quantify the degree of agreement of the models with each other.

For the evaluation of the entire service life of the structure, a computer program has been developed using the finite element method to couple both service-life periods, initiation and propagation. The program for 2D (TavDif_2D) allows the complementary use of two external programs in a single friendly interface: GMSH, a finite element mesh generator and post-processing viewer, and OOFEM, a finite element solver. TavDif_2D is responsible for deciding, at each time step, when and where to start applying the boundary conditions of the fracture mechanics module as a function of the chloride concentration and the corrosion parameters (Icorr, etc.). It is also responsible for verifying the presence and the degree of fracture in each element in order to feed back the variation of the diffusion coefficient with the crack width. The advantages of the FEM approach with the interface provided by the tool are: the flexibility to input data such as material properties and boundary conditions as time-dependent functions; the flexibility to predict the chloride concentration profile for different geometries; and the possibility to couple chloride diffusion (initiation stage) with chemical and mechanical behavior (propagation stage). The OOFEM code had to be modified to accept temperature, humidity and time-dependent values for the material properties, which is necessary to adequately describe the environmental variations. A 3-D simulation has been performed to simulate the behavior of a beam under both the external load and the internal load caused by the corrosion products, using embedded-fracture elements, in order to plot the deflection of the central region of the beam versus the external load and compare it with the experimental data.
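
The initiation period above is governed by Fick's second law. The sketch below is not the TavDif_1D code (which uses finite elements); it is a minimal explicit finite-difference illustration of the same 1D diffusion problem, with assumed values for the diffusion coefficient, surface chloride content and cover depth.

```java
// Minimal sketch (explicit finite differences, not the FEM used by TavDif_1D):
// dC/dt = D * d2C/dx2 with a fixed surface concentration Cs and an initially
// chloride-free cover. All parameter values are illustrative assumptions.
public class ChlorideDiffusion1D {
    public static void main(String[] args) {
        double D = 1.0e-12;      // diffusion coefficient [m^2/s] (assumed)
        double cs = 0.5;         // surface chloride content [% wt. of binder] (assumed)
        double cover = 0.05;     // cover depth [m] (assumed)
        int nodes = 51;
        double dx = cover / (nodes - 1);
        double dt = 0.4 * dx * dx / D;                  // stable explicit time step
        double years = 25.0;
        long steps = (long) (years * 365.25 * 24 * 3600 / dt);

        double[] c = new double[nodes];
        double[] next = new double[nodes];
        c[0] = cs;                                      // exposed surface boundary
        for (long s = 0; s < steps; s++) {
            next[0] = cs;
            for (int i = 1; i < nodes - 1; i++) {
                next[i] = c[i] + D * dt / (dx * dx) * (c[i + 1] - 2 * c[i] + c[i - 1]);
            }
            next[nodes - 1] = next[nodes - 2];          // zero-flux inner boundary
            double[] tmp = c; c = next; next = tmp;
        }
        System.out.printf("Chloride content at the rebar depth (%.0f mm) after %.0f years: %.3f%n",
                          cover * 1000, years, c[nodes - 1]);
    }
}
```

Comparing this profile with the critical chloride content would give the deterministic initiation time that the probabilistic tools above treat as a random variable.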