919 results for "recent 300 years"
Abstract:
Chemical analyses were carried out on 40 samples from the sediment surface and 210 samples from cores taken from the edge of the African continental block in the Arabian Sea (coasts of Somalia and Kenya, from Cape Guardafui to Mombasa) during the Indian Ocean Expedition of the German research vessel "Meteor" in 1964/65. The carbonate content reaches its maximum on the northern part of the African continental shelf, where fossil reef debris supplies the detrital carbonate fraction. On the southern part of the shelf the carbonate fraction is low, because it is strongly diluted by non-carbonate detritus. In the deep sea, too, the carbonate content is reduced below the calcite compensation depth. Trace elements in the carbonates: on the shelf and in its vicinity, Sr and Mg are enriched. This enrichment is caused by the reef-debris fraction, which contains aragonite (enriching Sr) as well as high-magnesium calcite. Over most of the slope the carbonates are poor in trace elements and consist mainly of foraminifera (and coccoliths). Below the carbonate compensation depth a further enrichment of Mg occurs in the carbonates, probably due to selective dissolution of calcite relative to dolomite. The iron and manganese contents of the carbonates are high (iron higher near the coast, manganese higher at depth), but they are not genuine carbonate signals: they arise during extraction of the carbonates through dissolution of authigenic Mn-Fe minerals. Non-carbonate fraction of the sediments: near the coast, quartz is enriched. Within this quartz-rich zone, the elements V, Cr, Fe, Ti, and B are enriched in the non-carbonate components. This enrichment must be attributed to an elevated content of heavy minerals; for Ti and Fe, pre-enrichment by lateritisation processes on the continent also plays a role. Toward the deep sea, the elements Mn, Ni, Cu, and Zn become enriched; these enrichments are explained by authigenic Mn-Fe minerals. Within the Mn-rich zone, a belt running parallel to the coast stands out with an additional Mn enrichment that does not extend to Ni, Cu, and Zn. This additional enrichment probably results from migration of manganese to the sediment surface: reducing conditions prevail within the sediments, under which Mn is mobile, whereas it precipitates under the oxidizing conditions at the sediment surface. The amount of organic matter depends mainly on grain size and on the sedimentation rate. On the shelf, organic matter is depleted because the sediments are coarse-grained; at depth, the depletion is explained by a low sedimentation rate. Between these two zones organic matter is enriched. P and N are enriched relative to Corg, and the more so the smaller the absolute amount of Corg; this enrichment arises during the diagenesis of the organic matter.
A comparison with the sediments from the Indian and Pakistani continental margin in the Arabian Sea shows the following: on the African continental margin the coarse detrital material has been transported farther out toward the deep sea, which is related to the steeper inclination of the depositional surface. Carbonate is more abundant on the African side, and its chemical composition is influenced by reef debris, which is absent on the Indian-Pakistani side. The content of organic matter is lower on the African side. By contrast, the enrichments of N and P relative to organic matter are of the same order of magnitude on both sides of the Arabian Sea.
Abstract:
A high-resolution, accelerator radiocarbon dated climate record of the interval 8,000-18,000 years B.P. from Deep Sea Drilling Project site 480 (Guaymas Basin, Gulf of California) shows geochemical and lithological oscillations of oceanographic and climatic significance during deglaciation. Nonlaminated sediments are associated with cooler climatic conditions during the late glacial (up to 13,000 years B.P.), and from 10,300 to 10,800 years B.P., equivalent to the Younger Dryas event of the North Atlantic region. We propose that the changes from laminated (varved) to nonlaminated sediments resulted from increased oxygen content in Pacific intermediate waters during the glacial and the Younger Dryas episodes, and that the forcing for the latter event was global in scope. Prominent events of low delta18O are recorded in benthic foraminifera from 8,000 to 10,000 and at 12,000 years B.P.; evidence for an earlier event between 13,500 and 15,000 years B.P. is weaker. Maximum delta18O is found to have occurred 10,500, 13,500, and 15,000 years ago (and beyond). Oxygen isotopic variability most likely reflects changing temperature and salinity characteristics of Pacific waters of intermediate depth during deglaciation or environmental changes within the Gulf of California region. Several lines of evidence suggest that during deglaciation the climate of the American southwest was marked by increased precipitation that could have lowered salinity in the Gulf of California. Recent modelling studies show that cooling of the Gulf of Mexico due to glacial meltwater injection, which is believed to have occurred at least twice during deglaciation, would have resulted in increased precipitation with respect to evaporation in the American southwest during summertime. The timing of deglacial events in the Gulf of Mexico and the Gulf of California supports such an atmospheric teleconnection.
Abstract:
For many years the Torino Cosmogeophysics group has been studying sediment cores drilled from the Gallipoli Terrace in the Gulf of Taranto (Ionian Sea) and deposited in the last millennia. The gravity core GT90-3, in which the 18O series was measured, was drilled from the Gallipoli Terrace in the Gulf of Taranto (Ionian Sea) at 39°45'53''N, 17°53'33''E. It was extracted at a depth of 178 m and its length is 3.57 m. Thanks to its geographical location, the Gallipoli Terrace is a favourable site for climatic studies based on marine sediments, because of its closeness to the volcanically active Campanian area, a region that is unique in the world for its detailed historical documentation of volcanic eruptions. Tephra layers corresponding to historical eruptions were identified along the cores, thus allowing for accurate dating and determination of the sedimentation rate. The measurements performed in different cores from the same area showed that the sedimentation rate is uniform across the whole Gallipoli Terrace. We measured the oxygen isotope composition d18O of planktonic foraminifera. These measurements provided a high-resolution 2,200-year-long record. We sampled the core using a spacing of 2.5 mm corresponding to 3.87 years. Each sample of sediment (5 g) was soaked in 5% calgon solution overnight, then treated in 10% H2O2 to remove any residual organic material. Subsequently it was washed with a distilled-water jet through a sieve with a 150 µm mesh. The fraction > 150 µm was kept and oven-dried at 5°C. The planktonic foraminifera Globigerinoides ruber were picked out of the samples under a microscope. For each sample, 20-30 specimens were selected from the fraction comprised between 150 µm and 300 µm. The use of a relatively large number of specimens for each sample reduces the isotopic variability of individual organisms, giving a more representative d18O value. The stable isotope measurements were performed using a VG-PRISM mass spectrometer fitted with an automated ISO-CARB preparation device. Analytical precision based on internal standards was better than 0.1 per mil. Calibration of the mass spectrometer to VPDB scale was done using NBS19 and NBS18 carbonate standards. The strategic location of the drilling area makes this record a unique tool for climate and oceanographic studies of the Central Mediterranean.
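Under the stated uniform sedimentation rate, the 2.5 mm sampling step and its 3.87-year temporal spacing fix a simple depth-to-age conversion. A minimal illustrative sketch follows; the core-top calendar year and the variable names are assumptions for illustration, not values taken from the study.

```python
# Illustrative sketch: convert sample depths in core GT90-3 to approximate ages,
# assuming the uniform sedimentation rate implied by the stated sampling step
# (2.5 mm per sample ~ 3.87 years per sample). The core-top calendar year used
# here (1990, the nominal coring year) is an assumption for illustration only.

SAMPLE_STEP_MM = 2.5        # sampling interval along the core
YEARS_PER_STEP = 3.87       # stated temporal spacing of consecutive samples
CORE_TOP_YEAR_AD = 1990     # assumed age of the sediment surface (hypothetical)

def depth_to_age_ad(depth_mm: float) -> float:
    """Approximate calendar year (AD) of a sample at a given depth below the core top."""
    sedimentation_rate_mm_per_yr = SAMPLE_STEP_MM / YEARS_PER_STEP  # ~0.65 mm/yr
    return CORE_TOP_YEAR_AD - depth_mm / sedimentation_rate_mm_per_yr

if __name__ == "__main__":
    for depth in (0.0, 250.0, 1000.0):   # depths in mm
        print(f"{depth:7.1f} mm  ->  ~{depth_to_age_ad(depth):7.1f} AD")
```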
Abstract:
Ocean observations carried out in the framework of the Collaborative Research Center 754 (SFB 754) "Climate-Biogeochemistry Interactions in the Tropical Ocean" are used to study (1) the structure of tropical oxygen minimum zones (OMZs), (2) the processes that contribute to the oxygen budget, and (3) long-term changes in the oxygen distribution. The OMZ of the eastern tropical North Atlantic (ETNA), located between the well-ventilated subtropical gyre and the equatorial oxygen maximum, is composed of a deep OMZ at about 400 m depth with its core region centred at about 20° W, 10° N and a shallow OMZ at about 100 m depth with lowest oxygen concentrations in proximity to the coastal upwelling region off Mauritania and Senegal. The oxygen budget of the deep OMZ is given by oxygen consumption mainly balanced by the oxygen supply due to meridional eddy fluxes (about 60%) and vertical mixing (about 20%, locally up to 30%). Advection by zonal jets is crucial for the establishment of the equatorial oxygen maximum. In the latitude range of the deep OMZ, it dominates the oxygen supply in the upper 300 to 400 m and generates the intermediate oxygen maximum between deep and shallow OMZs. Water mass ages from transient tracers indicate substantially older water masses in the core of the deep OMZ (about 120-180 years) compared to regions north and south of it. The deoxygenation of the ETNA OMZ during recent decades suggests a substantial imbalance in the oxygen budget: about 10% of the oxygen consumption during that period was not balanced by ventilation. Long-term oxygen observations show variability on interannual, decadal and multidecadal time scales that can partly be attributed to circulation changes. In comparison to the ETNA OMZ the eastern tropical South Pacific OMZ shows a similar structure including an equatorial oxygen maximum driven by zonal advection, but overall much lower oxygen concentrations approaching zero in extended regions. As the shape of the OMZs is set by ocean circulation, the widespread misrepresentation of the intermediate circulation in ocean circulation models substantially contributes to their oxygen bias, which might have significant impacts on predictions of future oxygen levels.
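The quoted fractions amount to a simple bookkeeping exercise: roughly 60% of the consumption in the deep OMZ core is resupplied by meridional eddy fluxes, about 20% by vertical mixing, about 10% went unbalanced during the observed deoxygenation, and the remainder is left to other processes such as advection. A back-of-the-envelope sketch of that bookkeeping, with an arbitrary consumption value chosen purely for illustration:

```python
# Back-of-the-envelope oxygen budget for the deep OMZ core, using the fractions
# quoted in the abstract. The absolute consumption rate is a placeholder value,
# not a number taken from the study.

consumption = 10.0  # assumed oxygen consumption, arbitrary units (e.g. umol kg^-1 yr^-1)

supply = {
    "meridional eddy fluxes": 0.60 * consumption,  # ~60% of consumption
    "vertical mixing":        0.20 * consumption,  # ~20% (locally up to 30%)
}
unbalanced = 0.10 * consumption                    # ~10% not balanced -> deoxygenation

# Whatever remains is supplied by other processes (e.g. zonal advection).
other = consumption - sum(supply.values()) - unbalanced

terms = {**supply, "other processes": other, "net oxygen loss": unbalanced}
for name, value in terms.items():
    print(f"{name:>25s}: {value:5.2f} ({100 * value / consumption:4.1f}% of consumption)")
```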
Abstract:
This tutorial will show how results from various Stata commands can be processed efficiently for inclusion in customized reports. A two-step procedure is proposed in which results are gathered and archived in the first step and then tabulated in the second step. Such an approach disentangles the tasks of computing results (which may take long) and preparing results for inclusion in presentations, papers, and reports (which you may have to do over and over). Examples using results from model estimation commands and also various other Stata commands such as tabulate, summarize, or correlate are presented. Furthermore, this tutorial shows how to dynamically link results into word processors or into LaTeX documents.
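The two-step idea is independent of the particular software: compute and archive results once, then format them as often as the report changes. A minimal Python analogue of that workflow is sketched below; the file name, toy data and LaTeX table are illustrative, and the sketch does not reproduce the Stata commands the tutorial itself uses.

```python
# Illustrative Python analogue of the two-step workflow described above:
# step 1 computes and archives results; step 2 formats them for a report.
# The file name and statistics are made up for the example (the tutorial
# itself works with stored Stata estimation results).

import json
import statistics

RESULTS_FILE = "results.json"  # hypothetical archive file

def step1_compute_and_archive() -> None:
    """Expensive computation: run once, store results on disk."""
    data = {"x": [1.0, 2.0, 3.0, 4.0], "y": [2.1, 3.9, 6.2, 7.8]}
    results = {
        name: {"mean": statistics.mean(values), "sd": statistics.stdev(values)}
        for name, values in data.items()
    }
    with open(RESULTS_FILE, "w") as fh:
        json.dump(results, fh)

def step2_tabulate() -> str:
    """Cheap formatting: rerun as often as the report changes."""
    with open(RESULTS_FILE) as fh:
        results = json.load(fh)
    rows = [f"{name} & {r['mean']:.2f} & {r['sd']:.2f} \\\\" for name, r in results.items()]
    return "\n".join([r"\begin{tabular}{lrr}", r"Variable & Mean & SD \\", *rows, r"\end{tabular}"])

if __name__ == "__main__":
    step1_compute_and_archive()
    print(step2_tabulate())
```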
Abstract:
In recent years, applications in domains such as telecommunications, network security or large-scale sensor networks have shown the limits of the traditional store-then-process paradigm. In this context, Stream Processing Engines emerged as a candidate solution for all these applications demanding high processing capacity with low processing latency guarantees. With Stream Processing Engines, data streams are not persisted but rather processed on the fly, producing results continuously. Current Stream Processing Engines, either centralized or distributed, do not scale with the input load due to single-node bottlenecks. Moreover, they are based on static configurations that lead to either under- or over-provisioning. This Ph.D. thesis discusses StreamCloud, an elastic parallel-distributed stream processing engine that enables the processing of large data stream volumes. StreamCloud minimizes the distribution and parallelization overhead by introducing novel techniques that split queries into parallel subqueries and allocate them to independent sets of nodes. Moreover, StreamCloud's elastic and dynamic load balancing protocols enable effective adjustment of resources depending on the incoming load. Together with the parallelization and elasticity techniques, StreamCloud defines a novel fault tolerance protocol that introduces minimal overhead while providing fast recovery. StreamCloud has been fully implemented and evaluated using several real-world applications such as fraud detection and network analysis applications. The evaluation, conducted on a cluster with more than 300 cores, demonstrates the scalability, elasticity and fault tolerance effectiveness of StreamCloud. Resumen En los últimos años, aplicaciones en dominios tales como telecomunicaciones, seguridad de redes y redes de sensores de gran escala se han encontrado con múltiples limitaciones en el paradigma tradicional de bases de datos. En este contexto, los sistemas de procesamiento de flujos de datos han emergido como solución a estas aplicaciones que demandan una alta capacidad de procesamiento con una baja latencia. En los sistemas de procesamiento de flujos de datos, los datos no se persisten y luego se procesan, en su lugar los datos son procesados al vuelo en memoria produciendo resultados de forma continua. Los actuales sistemas de procesamiento de flujos de datos, tanto los centralizados, como los distribuidos, no escalan respecto a la carga de entrada del sistema debido a un cuello de botella producido por la concentración de flujos de datos completos en nodos individuales. Por otra parte, éstos están basados en configuraciones estáticas lo que conduce a un sobre o bajo aprovisionamiento. Esta tesis doctoral presenta StreamCloud, un sistema elástico paralelo-distribuido para el procesamiento de flujos de datos que es capaz de procesar grandes volúmenes de datos. StreamCloud minimiza el coste de distribución y paralelización por medio de una técnica novedosa la cual particiona las queries en subqueries paralelas repartiéndolas en subconjuntos de nodos independientes. Además, StreamCloud posee protocolos de elasticidad y equilibrado de carga que permiten una optimización de los recursos dependiendo de la carga del sistema. Unidos a los protocolos de paralelización y elasticidad, StreamCloud define un protocolo de tolerancia a fallos que introduce un coste mínimo mientras que proporciona una rápida recuperación.
StreamCloud ha sido implementado y evaluado mediante varias aplicaciones del mundo real tales como aplicaciones de detección de fraude o aplicaciones de análisis del tráfico de red. La evaluación ha sido realizada en un cluster con más de 300 núcleos, demostrando la alta escalabilidad y la efectividad tanto de la elasticidad, como de la tolerancia a fallos de StreamCloud.
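The core parallelization idea described above can be illustrated independently of StreamCloud's actual protocols: tuples are partitioned on a key so that independent nodes can each run the same subquery on a disjoint substream. A minimal Python sketch, with an illustrative key, aggregation and node count (this is not StreamCloud's implementation):

```python
# Minimal sketch of the general idea behind query parallelization in a stream
# processing engine: tuples are hash-partitioned on a key so that independent
# "nodes" can run the same subquery in parallel. The key, the aggregation and
# the node count are illustrative; this is not StreamCloud's actual protocol.

from collections import defaultdict
from typing import Dict, Iterable, List, Tuple

NUM_NODES = 4  # illustrative number of independent processing nodes

def partition(tuples: Iterable[Tuple[str, float]]) -> Dict[int, List[Tuple[str, float]]]:
    """Split the incoming stream into NUM_NODES disjoint substreams by key."""
    substreams: Dict[int, List[Tuple[str, float]]] = defaultdict(list)
    for key, value in tuples:
        substreams[hash(key) % NUM_NODES].append((key, value))
    return substreams

def subquery(substream: Iterable[Tuple[str, float]]) -> Dict[str, float]:
    """The parallel subquery each node runs, here a per-key running sum."""
    totals: Dict[str, float] = defaultdict(float)
    for key, value in substream:
        totals[key] += value
    return dict(totals)

if __name__ == "__main__":
    stream = [("alice", 10.0), ("bob", 5.0), ("alice", 7.5), ("carol", 1.0)]
    partial_results = [subquery(s) for s in partition(stream).values()]
    print(partial_results)  # keyed partitioning means no key is split across nodes
```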
Abstract:
The study of atmospheric propagation impairments at submillimeter and THz frequencies is becoming increasingly relevant due to the strong effects caused by the composition of the troposphere and the phenomena occurring in it. The present paper is devoted to the estimation of total attenuation at 100 GHz and 300 GHz under non-rainy scenarios. For this purpose, 4 years of meteorological data from Madrid have been collected, including radiosoundings from Madrid-Barajas Airport and co-located SYNOP observations. This volume of data has been analyzed with the additional aim of introducing a method for detecting rain conditions, which cannot be easily identified in radiosounding profiles. Finally, the method has been used to discard several probable events that would be responsible for scattering conditions, and hence yearly CDFs of total attenuation have been obtained. The resulting statistics are expected to be close to those obtained by experimental techniques under similar atmospheric conditions.
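Once rain-affected events have been discarded, the yearly statistics reduce to an exceedance (complementary CDF) computation over the remaining total-attenuation estimates. A minimal sketch of that step follows; the attenuation samples and threshold grid are placeholders, not the Madrid data.

```python
# Illustrative sketch of the last processing step described above: once
# rain-affected events have been discarded, build the yearly complementary CDF
# (probability that total attenuation exceeds a given level). The attenuation
# values and threshold levels below are placeholders, not measured data.

import numpy as np

def exceedance_cdf(attenuation_db: np.ndarray, levels_db: np.ndarray) -> np.ndarray:
    """Fraction of time each attenuation level is exceeded."""
    return np.array([(attenuation_db > level).mean() for level in levels_db])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    samples = rng.gamma(shape=2.0, scale=0.5, size=10_000)   # fake non-rainy attenuation, dB
    levels = np.arange(0.0, 5.0, 0.5)                        # attenuation thresholds, dB
    for level, p in zip(levels, exceedance_cdf(samples, levels)):
        print(f"A > {level:3.1f} dB exceeded {100 * p:5.2f}% of the time")
```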
Abstract:
The city of Lorca (Spain) was hit on May 11th, 2011, by two consecutive earthquakes of moment magnitude (Mw) 4.6 and 5.2, respectively, causing casualties and significant damage to buildings. Lorca is located in the south-east of Spain and is settled on the trace of the Murcia-Totana-Lorca fault. Although the magnitudes of these ground motions were not severe, considerable damage was observed over a large number of buildings. More than 300 of them have been demolished and many others are being retrofitted. This paper reports a field study of the damage caused by these earthquakes. The observed damage is related to the structural typology. Furthermore, prototypes of the damaged buildings are idealized with nonlinear numerical models, and their seismic behavior and proneness to damage concentration are further investigated through dynamic response analyses.
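As a rough illustration of the dynamic response analyses mentioned above, the sketch below integrates a linear single-degree-of-freedom idealization of a building under a ground-acceleration record using the explicit central-difference scheme. The period, damping ratio and synthetic record are placeholders; the study itself uses nonlinear models of the actual damaged buildings.

```python
# Minimal sketch of a dynamic response analysis for a linear single-degree-of-
# freedom idealization of a building under a ground-acceleration record, using
# the explicit central-difference scheme. Period, damping ratio and the
# synthetic record are placeholders, not data or models from the study.

import numpy as np

def sdof_response(ag: np.ndarray, dt: float, period: float, damping: float) -> np.ndarray:
    """Relative displacement u(t) for m*u'' + c*u' + k*u = -m*ag(t), with m = 1.

    'damping' is the viscous damping ratio (e.g. 0.05 for 5%).
    """
    omega = 2.0 * np.pi / period
    k, c, m = omega**2, 2.0 * damping * omega, 1.0
    p = -m * ag
    u = np.zeros_like(ag)
    a0 = (p[0] - k * u[0]) / m           # initial acceleration, with u(0) = v(0) = 0
    u_prev = 0.5 * dt**2 * a0            # fictitious displacement at t = -dt
    lhs = m / dt**2 + c / (2.0 * dt)
    for i in range(len(ag) - 1):
        rhs = p[i] - (k - 2.0 * m / dt**2) * u[i] - (m / dt**2 - c / (2.0 * dt)) * u_prev
        u_prev, u[i + 1] = u[i], rhs / lhs
    return u

if __name__ == "__main__":
    dt = 0.005
    t = np.arange(0.0, 10.0, dt)
    ag = 0.3 * 9.81 * np.sin(2.0 * np.pi * 2.0 * t) * np.exp(-0.3 * t)  # synthetic record, m/s^2
    u = sdof_response(ag, dt, period=0.4, damping=0.05)
    print(f"peak relative displacement: {np.max(np.abs(u)) * 1000:.1f} mm")
```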
Abstract:
La diabetes mellitus es el conjunto de alteraciones provocadas por un defecto en la cantidad de insulina secretada o por un aprovechamiento deficiente de la misma. Es causa directa de complicaciones a corto, medio y largo plazo que disminuyen la calidad y las expectativas de vida de las personas con diabetes. La diabetes mellitus es en la actualidad uno de los problemas más importantes de salud. Ha triplicado su prevalencia en los últimos 20 anos y para el año 2025 se espera que existan casi 300 millones de personas con diabetes. Este aumento de la prevalencia junto con la morbi-mortalidad asociada a sus complicaciones micro y macro-vasculares convierten la diabetes en una carga para los sistemas sanitarios, sus recursos económicos y sus profesionales, haciendo de la enfermedad un problema individual y de salud pública de enormes proporciones. De momento no existe cura a esta enfermedad, de modo que el objetivo terapéutico del tratamiento de la diabetes se centra en la normalización de la glucemia intentando minimizar los eventos de hiper e hipoglucemia y evitando la aparición o al menos retrasando la evolución de las complicaciones vasculares, que constituyen la principal causa de morbi-mortalidad de las personas con diabetes. Un adecuado control diabetológico implica un tratamiento individualizado que considere multitud de factores para cada paciente (edad, actividad física, hábitos alimentarios, presencia de complicaciones asociadas o no a la diabetes, factores culturales, etc.). Sin embargo, a corto plazo, las dos variables más influyentes que el paciente ha de manejar para intervenir sobre su nivel glucémico son la insulina administrada y la dieta. Ambas presentan un retardo entre el momento de su aplicación y el comienzo de su acción, asociado a la absorción de los mismos. Por este motivo la capacidad de predecir la evolución del perfil glucémico en un futuro cercano, ayudara al paciente a tomar las decisiones adecuadas para mantener un buen control de su enfermedad y evitar situaciones de riesgo. Este es el objetivo de la predicción en diabetes: adelantar la evolución del perfil glucémico en un futuro cercano para ayudar al paciente a adaptar su estilo de vida y sus acciones correctoras, con el propósito de que sus niveles de glucemia se aproximen a los de una persona sana, evitando así los síntomas y complicaciones de un mal control. La aparición reciente de los sistemas de monitorización continua de glucosa ha proporcionado nuevas alternativas. La disponibilidad de un registro exhaustivo de las variaciones del perfil glucémico, con un periodo de muestreo de entre uno y cinco minutos, ha favorecido el planteamiento de nuevos modelos que tratan de predecir la glucemia utilizando tan solo las medidas anteriores de glucemia o al menos reduciendo significativamente la información de entrada a los algoritmos. El hecho de requerir menor intervención por parte del paciente, abre nuevas posibilidades de aplicación de los predictores de glucemia, haciéndose viable su uso en tiempo real, como sistemas de ayuda a la decisión, como detectores de situaciones de riesgo o integrados en algoritmos automáticos de control. En esta tesis doctoral se proponen diferentes algoritmos de predicción de glucemia para pacientes con diabetes, basados en la información registrada por un sistema de monitorización continua de glucosa así como incorporando la información de la insulina administrada y la ingesta de carbohidratos. 
Los algoritmos propuestos han sido evaluados en simulación y utilizando datos de pacientes registrados en diferentes estudios clínicos. Para ello se ha desarrollado una amplia metodología, que trata de caracterizar las prestaciones de los modelos de predicción desde todos los puntos de vista: precisión, retardo, ruido y capacidad de detección de situaciones de riesgo. Se han desarrollado las herramientas de simulación necesarias y se han analizado y preparado las bases de datos de pacientes. También se ha probado uno de los algoritmos propuestos para comprobar la validez de la predicción en tiempo real en un escenario clínico. Se han desarrollado las herramientas que han permitido llevar a cabo el protocolo experimental definido, en el que el paciente consulta la predicción bajo demanda y tiene el control sobre las variables metabólicas. Este experimento ha permitido valorar el impacto sobre el control glucémico del uso de la predicción de glucosa. ABSTRACT Diabetes mellitus is the set of alterations caused by a defect in the amount of secreted insulin or a suboptimal use of insulin. It causes complications in the short, medium and long term that affect the quality of life and reduce the life expectancy of people with diabetes. Diabetes mellitus is currently one of the most important health problems. Prevalence has tripled in the past 20 years and estimations point out that it will affect almost 300 million people by 2025. Due to this increased prevalence, as well as to morbidity and mortality associated with micro- and macrovascular complications, diabetes has become a burden on health systems, their financial resources and their professionals, thus making the disease a major individual and a public health problem. There is currently no cure for this disease, so that the therapeutic goal of diabetes treatment focuses on normalizing blood glucose events. The aim is to minimize hyper- and hypoglycemia and to avoid, or at least to delay, the appearance and development of vascular complications, which are the main cause of morbidity and mortality among people with diabetes. A suitable, individualized and controlled treatment for diabetes involves many factors that need to be considered for each patient: age, physical activity, eating habits, presence of complications related or unrelated to diabetes, cultural factors, etc. However, in the short term, the two most influential variables that the patient has available in order to manage his/her glycemic levels are administered insulin doses and diet. Both suffer from a delay between their time of application and the onset of the action associated with their absorption. Therefore, the ability to predict the evolution of the glycemic profile in the near future could help the patient to make appropriate decisions on how to maintain good control of his/her disease and to avoid risky situations. Hence, the main goal of glucose prediction in diabetes consists of advancing the evolution of glycemic profiles in the near future. This would assist the patient in adapting his/her lifestyle and in taking corrective actions in a way that blood glucose levels approach those of a healthy person, consequently avoiding the symptoms and complications of a poor glucose control. The recent emergence of continuous glucose monitoring systems has provided new alternatives in this field. 
The availability of continuous records of changes in glycemic profiles (with a sampling period of between one and five minutes) has enabled the design of new models that seek to predict blood glucose using automatically read glucose measurements only (or at least significantly reducing the data entered manually into the algorithms). By requiring less intervention by the patient, new possibilities are opened for the application of glucose predictors, making their use feasible in real-time applications such as decision support systems, hypo- and hyperglycemia detectors, or integration into automated control algorithms. In this thesis, different glucose prediction algorithms are proposed for patients with diabetes. These are based on information recorded by a continuous glucose monitoring system and incorporate information on the administered insulin and carbohydrate intake. The proposed algorithms have been evaluated in silico and using patients' data recorded in different clinical trials. A complete methodology has been developed to characterize the performance of the predictive models from all points of view: accuracy, delay, noise and ability to detect hypo- and hyperglycemia. In addition, simulation tools and patient databases have been deployed. One of the proposed algorithms has additionally been evaluated in terms of real-time prediction performance in a clinical scenario in which the patient checked his/her glucose predictions on demand and had control over his/her metabolic variables. This has allowed assessing the impact of using glucose prediction on glycemic control. The tools to carry out the defined experimental protocols were also developed in this thesis.
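One simple member of the CGM-only family of predictors alluded to above is an autoregressive model fitted to recent glucose samples and extrapolated over the prediction horizon. The sketch below illustrates that idea; the model order, horizon and sample trace are placeholders, and this is not one of the algorithms actually proposed in the thesis.

```python
# Illustrative sketch of a simple CGM-only glucose predictor: a low-order
# autoregressive (AR) model fitted to recent continuous-glucose-monitoring
# samples by least squares and iterated over a prediction horizon. Model order,
# horizon and the sample trace are placeholders, not the thesis' algorithms.

import numpy as np

def fit_ar(history: np.ndarray, order: int) -> np.ndarray:
    """Least-squares AR coefficients so that g[t] ~ sum_j a[j] * g[t-1-j]."""
    rows = [history[t - order:t][::-1] for t in range(order, len(history))]
    targets = history[order:]
    coeffs, *_ = np.linalg.lstsq(np.asarray(rows), targets, rcond=None)
    return coeffs

def predict(history: np.ndarray, coeffs: np.ndarray, steps: int) -> np.ndarray:
    """Iterate the AR model 'steps' samples beyond the end of the history."""
    buf = list(history[-len(coeffs):])
    out = []
    for _ in range(steps):
        nxt = float(np.dot(coeffs, buf[::-1][: len(coeffs)]))  # most recent sample first
        out.append(nxt)
        buf.append(nxt)
    return np.array(out)

if __name__ == "__main__":
    cgm = np.array([110, 112, 115, 119, 124, 130, 137, 144, 150, 155], dtype=float)  # mg/dL, 5-min samples
    a = fit_ar(cgm, order=3)
    print(predict(cgm, a, steps=6))  # 30-minute-ahead forecast at 5-min resolution
```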
Abstract:
Glutens, the storage proteins in wheat grains, are a major source of protein in human nutrition. The protein composition of wheat has therefore been an important focus of cereal research. Proteomic tools have been used to describe the genetic diversity of wheat germplasms from different origins at the level of polymorphisms in alleles encoding glutenin and gliadin, the two main proteins of gluten. More recently, proteomics has been used to understand the impact of specific gluten proteins on wheat quality. Here we review the impact of proteomics on the study of gluten proteins as it has evolved from fractionation and electrophoretic techniques to advanced mass spectrometry. In the postgenome era, proteomics is proving to be essential in the effort to identify and understand the interactions between different gluten proteins. This is helping to fill in gaps in our knowledge of how the technological quality of wheat is determined by the interaction between genotype and environment. We also collate information on the various storage protein alleles identified and their prevalence, which makes it possible to infer the effects of wheat selection on grain protein content. We conclude by reviewing the more recent use of transgenesis aimed at improving the quality of gluten.
Abstract:
En los últimos años, ha crecido de forma significativa el interés por la utilización de dispositivos capaces de reconocer gestos humanos. En este trabajo, se pretenden reconocer gestos manuales colocando sensores en la mano de una persona. El reconocimiento de gestos manuales puede ser implementado para diversos usos y bajo diversas plataformas: juegos (Wii), control de brazos robóticos, etc. Como primer paso, se realizará un estudio de las actuales técnicas de reconocimiento de gestos que utilizan acelerómetros como sensor de medida. En un segundo paso, se estudiará como los acelerómetros pueden utilizarse para intentar reconocer los gestos que puedan realizar una persona (mover el brazo hacia un lado, girar la mano, dibujar un cuadrado, etc.) y los problemas que de su utilización puedan derivarse. Se ha utilizado una IMU (Inertial Measurement Unit) como sensor de medida. Está compuesta por tres acelerómetros y tres giróscopos (MTi-300 de Xsens). Con las medidas que proporcionan estos sensores se realiza el cálculo de la posición y orientación de la mano, representando esta última en función de los ángulos de Euler. Un aspecto importante a destacar será el efecto de la gravedad en las medidas de las aceleraciones. A través de diversos cálculos y mediante la ayuda de los giróscopos se podrá corregir dicho efecto. Por último, se desarrollará un sistema que identifique la posición y orientación de la mano como gestos reconocidos utilizando lógica difusa. Tanto para la adquisición de las muestras, como para los cálculos de posicionamiento, se ha desarrollado un código con el programa Matlab. También, con este mismo software, se ha implementado un sistema de lógica difusa con la que se realizará el reconocimiento de los gestos, utilizando la herramienta FIS Editor. Las pruebas realizadas han consistido en la ejecución de nueve gestos por diferentes personas teniendo una tasa de reconocimiento comprendida entre el 90 % y 100 % dependiendo del gesto a identificar. ABSTRACT In recent years, interest in devices capable of recognizing human gestures has grown significantly. In this work, we aim to recognize hand gestures by placing sensors on the hand of a person. Hand gesture recognition can be implemented for different applications on different platforms: games (Wii), control of robotic arms, etc. As a first step, a study of current gesture recognition techniques that use accelerometers as measurement sensors is performed. In a second step, we study how accelerometers can be used to try to recognize the gestures a person may make (moving the arm to one side, rotating the hand, drawing a square, etc.) and the problems that may arise from their use. An IMU (Inertial Measurement Unit) has been used as the measurement sensor. It comprises three accelerometers and three gyroscopes (Xsens MTi-300). The measurements provided by these sensors are used to calculate the position and orientation of the hand, with the latter expressed in terms of the Euler angles. An important aspect to note is the effect of gravity on the acceleration measurements; through various calculations and with the help of the gyroscopes, this effect can be corrected. Finally, a system that identifies the position and orientation of the hand as recognized gestures was developed using fuzzy logic. Code was developed in Matlab both for the acquisition of the samples and for the positioning calculations. With the same software, a fuzzy logic system was implemented to perform the gesture recognition, using the FIS Editor tool.
Tests involved the execution of nine gestures by different people, with a recognition rate between 90% and 100% depending on the gesture to be identified.
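The gravity-compensation step described above can be illustrated as follows: the gravity vector is expressed in the sensor frame using the orientation estimated from the gyroscopes (here via ZYX Euler angles) and removed from the accelerometer reading. The rotation convention and the values below are assumptions made for illustration; the thesis itself implements its calculations in Matlab with Xsens MTi-300 data.

```python
# Illustrative sketch of gravity compensation for accelerometer readings using
# Euler angles. The ZYX (yaw-pitch-roll) convention and the sample values are
# assumptions for illustration, not the thesis' Matlab implementation.

import numpy as np

G = 9.81  # m/s^2

def rot_body_to_world(roll: float, pitch: float, yaw: float) -> np.ndarray:
    """ZYX Euler rotation matrix (body -> world); one common convention, assumed here."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def linear_acceleration_world(accel_body: np.ndarray, roll: float, pitch: float, yaw: float) -> np.ndarray:
    """Remove gravity from an accelerometer (specific-force) reading.

    The accelerometer measures f_b = R_world_to_body @ (a_world - g_world); inverting
    gives a_world = R_body_to_world @ f_b + g_world, with g_world = (0, 0, -G).
    """
    R = rot_body_to_world(roll, pitch, yaw)
    g_world = np.array([0.0, 0.0, -G])
    return R @ accel_body + g_world

if __name__ == "__main__":
    # A sensor at rest with its z axis pointing up reads ~(0, 0, +9.81) m/s^2;
    # after compensation the estimated linear acceleration is ~zero.
    print(linear_acceleration_world(np.array([0.0, 0.0, G]), roll=0.0, pitch=0.0, yaw=0.0))
```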
Abstract:
El gran esfuerzo realizado durante la última década con el fin de integrar los diferentes materiales superconductores en el campo de los sistemas eléctricos y en otras aplicaciones tecnológicas ha dado lugar a un campo de investigación amplio y prometedor. El comportamiento eléctrico de los Superconductores de Alta Temperatura (SAT) crítica (masivo y cintas) depende de diferentes parámetros desde su fabricación hasta la aplicación final con imanes o cables. Sin embargo, las aplicaciones prácticas de estos materiales están fuertemente vinculadas con su comportamiento mecánico tanto a temperatura ambiente (manipulación durante fabricación o instalación) como a temperaturas criogénicas (condiciones de servicio). En esta tesis se ha estudiado el comportamiento mecánico de materiales masivos y cintas de alta temperatura crítica a 300 y 77 K (utilizando nitrógeno líquido). Se han obtenido la resistencia en flexión, la tenacidad de fractura y la resistencia a tracción a la temperatura de servicio y a 300 K. Adicionalmente, se ha medido la dureza mediante el ensayo Vickers y nanoindentación. El módulo Young se midió mediante tres métodos diferentes: 1) nanoindentación, 2) ensayos de flexión en tres puntos y 3) resonancia vibracional mediante grindosonic. Para cada condición de ensayo, se han analizado detalladamente las superficies de fractura y los micromecanismos de fallo. Las propiedades mecánicas de los materiales se han comparado con el fin de entender la influencia de las técnicas de procesado y de las características microestructurales de los monocristales en su comportamiento mecánico. Se ha estudiado el comportamiento electromecánico de cintas comerciales superconductoras de YBCO mediante ensayos de tracción y fatiga a 77 y 300 K. El campo completo de deformaciones en la superficie del material se ha obtenido utilizando Correlación Digital de Imágenes (DIC, por sus siglas en inglés) a 300 K. Además, se realizaron ensayos de fragmentación in situ dentro de un microscopio electrónico con el fin de estudiar la fractura de la capa superconductora y determinar la resistencia a cortante de la intercara entre el substrato y la capa cerámica. Se ha conseguido ver el proceso de la fragmentación aplicando tensión axial y finalmente, se han implementado simulaciones mediante elementos finitos para reproducir la delaminación y el fenómeno de la fragmentación. Por último, se han preparado uniones soldadas entre las capas de cobre de dos cintas superconductoras. Se ha medido la resistencia eléctrica de las uniones con el fin de evaluar el metal de soldadura y el proceso. Asimismo, se ha llevado a cabo la caracterización mecánica de las uniones mediante ensayos "single lap shear" a 300 y 77 K. El efecto del campo magnético se ha estudiado aplicando campo externo hasta 1 T perpendicular o paralelo a la cinta-unión a la temperatura de servicio (77 K). Finalmente, la distribución de tensiones en cada una de las capas de la cinta se estudió mediante simulaciones de elementos finitos, teniendo en cuenta las capas de la cinta mecánicamente más representativas (Cu-Hastelloy-Cu) que influyen en su comportamiento mecánico. The strong effort that has been made in the last years to integrate the different superconducting materials in the field of electrical power systems and other technological applications led to a wide and promising research field. 
The electrical behavior of High Temperature Superconducting (HTS) materials (bulk and coated conductors) depends on different parameters, from their processing to their final application as magnets or cables. However, practical applications of such materials are strongly related to their mechanical performance at room temperature (handling) as well as at cryogenic temperatures (service conditions). In this thesis, the mechanical behavior of HTS bulk and coated conductors was investigated at 300 and 77 K (by immersion in liquid nitrogen). The flexural strength, the fracture toughness and the tensile strength were obtained at service temperature as well as at 300 K. Furthermore, their hardness was determined by Vickers measurements and nanoindentation, and the Young's modulus was measured by three different techniques: 1) nanoindentation, 2) three-point bending tests and 3) vibrational resonance with a grindosonic device. The fracture and deformation micromechanics have also been carefully analyzed for each testing condition. The comparison between the studied materials has been performed in order to understand the influence of the main sintering methods and the microstructural characteristics of the single grains on the macroscopic mechanical behavior. The electromechanical behavior of commercial YBCO coated conductors was also studied. The mechanical behavior of the tapes was studied under tensile and fatigue tests at 77 and 300 K. The complete strain field on the surface of the sample was obtained by applying Digital Image Correlation (DIC) at 300 K. Additionally, in situ fragmentation tests inside a Scanning Electron Microscope (SEM) were carried out in order to study the fragmentation of the superconducting layer and determine the interfacial shear strength between substrate and ceramic layer. The fragmentation process upon loading of the YBCO layer has been observed and, finally, finite element simulations were employed to reproduce the delamination and fragmentation phenomena. Finally, joints between the stabilizing Cu sides of two coated conductors have been prepared. The electrical resistivity of the joints was measured for the purpose of qualifying the soldering material and evaluating the soldering process. Additionally, mechanical characterization under single lap shear tests at 300 and 77 K has been carried out. The effect of the applied magnetic field has been studied by applying an external magnetic field of up to 1 T, perpendicular and parallel to the tape-joint, at service temperature (77 K). Finally, finite element simulations were employed to study the distribution of the stresses in each layer, taking into account the three mechanically relevant layers of the coated conductor (Cu-Hastelloy-Cu) that affect its mechanical behavior.
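For the three-point bending tests mentioned above, flexural strength and flexural (Young's) modulus of a rectangular bar follow from the standard beam formulas sigma_f = 3FL/(2bd²) and E = L³m/(4bd³), where m is the initial slope of the load-deflection curve. A minimal sketch with illustrative specimen dimensions and loads (not measured values from this work):

```python
# Minimal sketch of how flexural strength and flexural (Young's) modulus are
# obtained from a three-point bending test on a rectangular bar, using the
# standard beam formulas sigma_f = 3*F*L / (2*b*d^2) and E = L^3 * m / (4*b*d^3),
# where m is the initial slope of the load-deflection curve. Dimensions and
# loads below are illustrative placeholders, not data from this work.

def flexural_strength(F_max_N: float, span_mm: float, width_mm: float, thickness_mm: float) -> float:
    """Flexural strength in MPa (N/mm^2)."""
    return 3.0 * F_max_N * span_mm / (2.0 * width_mm * thickness_mm**2)

def flexural_modulus(slope_N_per_mm: float, span_mm: float, width_mm: float, thickness_mm: float) -> float:
    """Flexural (Young's) modulus in MPa, from the load-deflection slope."""
    return span_mm**3 * slope_N_per_mm / (4.0 * width_mm * thickness_mm**3)

if __name__ == "__main__":
    span, width, thickness = 20.0, 4.0, 3.0          # mm (illustrative bar geometry)
    print(f"sigma_f = {flexural_strength(180.0, span, width, thickness):8.1f} MPa")
    print(f"E       = {flexural_modulus(8100.0, span, width, thickness) / 1000:8.1f} GPa")
```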
Abstract:
Extensive spatial and temporal surveys, over 15 years, have been conducted on soils from urban parks and on street dusts in one of the most polluted cities in western Europe, Avilés (NW Spain). The first survey was carried out in 1996, and since then monitoring has been undertaken every five years. Although the sampled city is a relatively small town, industrial activities (mainly the steel industry and Zn and Al metallurgy) and other less significant urban sources, such as traffic, strongly affect the load of heavy metals in the urban aerosol. Elemental tracers have been used to characterise the influence of these sources on the composition of soil and dust. Although PM10 has decreased over these years as a result of environmental measures undertaken in the city, some of the “industrial” elements still remain in concentrations of concern: for example, up to 4.6% and 0.5% Zn in dust and soil, respectively. Spatial trends in metals such as Zn and Cd clearly reflect sources from the processing industries. The concentrations of these elements across Europe have decreased over time; however, the most recent results from Avilés revealed an upward trend in concentration for Zn, Cd, Hg and As. A risk assessment of the soil highlighted As as an element of concern, since its cancer risk in adults was more than double the value above which regulatory agencies deem it to be unacceptable. If children are considered as the receptors, the risk from this element nearly doubles.
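Risk assessments of this kind typically combine a chronic daily intake estimate with a cancer slope factor, e.g. for the soil-ingestion pathway ILCR = CDI × SF with CDI = C × IngR × EF × ED / (BW × AT). The sketch below uses generic USEPA-style default exposure parameters, a commonly cited arsenic slope factor, and a placeholder soil concentration; it is not the exact parameter set or model used in the study.

```python
# Illustrative sketch of an incremental lifetime cancer risk (ILCR) calculation
# of the kind used in soil risk assessments, for the ingestion pathway:
#   CDI  = C * IngR * EF * ED / (BW * AT)      (chronic daily intake, mg/kg/day)
#   ILCR = CDI * SF                            (SF = oral cancer slope factor)
# Exposure parameters are generic USEPA-style defaults and the soil
# concentration is a placeholder; they are not the values used in the study.

def ilcr_ingestion(conc_mg_per_kg: float, slope_factor: float, *, child: bool = False) -> float:
    ing_rate_mg_day = 200.0 if child else 100.0   # soil ingestion rate
    exp_freq_days = 350.0                         # exposure frequency, days/year
    exp_dur_years = 6.0 if child else 24.0        # exposure duration, years
    body_weight_kg = 15.0 if child else 70.0
    averaging_days = 70.0 * 365.0                 # lifetime averaging time, days
    cdi = (conc_mg_per_kg * 1e-6 * ing_rate_mg_day * exp_freq_days * exp_dur_years) / (
        body_weight_kg * averaging_days
    )  # 1e-6 converts ingested soil from mg to kg
    return cdi * slope_factor

if __name__ == "__main__":
    AS_SLOPE_FACTOR = 1.5      # commonly cited oral slope factor for arsenic, (mg/kg/day)^-1
    soil_as = 30.0             # placeholder arsenic concentration in soil, mg/kg
    print(f"adult ILCR: {ilcr_ingestion(soil_as, AS_SLOPE_FACTOR):.2e}")
    print(f"child ILCR: {ilcr_ingestion(soil_as, AS_SLOPE_FACTOR, child=True):.2e}")
```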
Abstract:
Spanish coastal legislation has changed in response to changing circumstances. The objective of the 1969 Spanish Coastal Law was to assign responsibilities in the Public Domain to the authorities. The 1980 Spanish Coastal Law addressed infractions and sanctions. The 1988 Spanish Coastal Law completed the responsibilities and sanctions aspects and added others related to the delimitation of the Public Domain, private properties close to the Public Domain, and limitations on land use in this area. The 1988 Spanish Coastal Law has been controversial since its publication. The “European Parliament Report on the impact of extensive urbanization in Spain on individual rights of European citizens, on the environment and on the application of EU law, based upon petitions received”, published in 2009, recommended that the Spanish authorities urgently revise the Coastal Law, with the main objective of protecting property owners whose buildings do not have negative effects on the coastal environment. The recommended revision has been carried out in the new Spanish Coastal Law, “Ley 2/2013, de 29 de mayo, de protección y uso sostenible del litoral y de modificación de la Ley 22/1988, de 28 de julio, de Costas”, published in May 2013. This is the first major change in the 25 years since the 1988 Spanish Coastal Law. This paper compares the 1988 and 2013 Spanish Coastal Law documents, highlighting the most important issues, such as the description of the Public Domain, limitations on private properties close to the Public Domain boundary, the influence of climate change, the duration of authorizations, etc. The paper also includes proposals for further improvements.