Resumo:
Deep-sea pore fluids are potential archives of ancient seawater chemistry. However, the primary signal recorded in pore fluids is often overprinted by diagenetic processes. Recent studies have suggested that depth profiles of Mg concentration in deep-sea carbonate pore fluids are best explained by a rapid rise in seawater Mg over the last 10-20 Myr. To explore this possibility we measured the Mg isotopic composition of pore fluids and carbonate sediments from Ocean Drilling Program (ODP) site 807. Whereas the concentration of Mg in the pore fluid declines with depth, the isotopic composition of Mg in the pore fluid increases from -0.78 per mil near the sediment-water interface to -0.15 per mil at 778 mbsf. The Mg isotopic composition of the sediment, with a few important exceptions, does not change with depth and has an average d26Mg value of -4.72 per mil. We reproduce the observed changes in sediment and pore-fluid Mg isotope values using a numerical model that incorporates Mg, Ca and Sr cycling and satisfies existing pore-fluid Ca isotope and Sr data. Our model shows that the observed trends in magnesium concentrations and isotopes are best explained as a combination of two processes: a secular rise in seawater Mg over the Neogene and the recrystallization of low-Mg biogenic carbonate to a higher-Mg diagenetic calcite. These results indicate that burial recrystallization will add Mg to pelagic carbonate sediments, leading to an overestimation of paleo-temperatures from measured Mg/Ca ratios. The Mg isotopic composition of foraminiferal calcite appears to be only slightly altered by recrystallization, making it possible to reconstruct the Mg isotopic composition of seawater through time.
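The declining pore-fluid Mg profile the abstract describes can be illustrated with a minimal one-dimensional diffusion model in which recrystallization to higher-Mg calcite acts as a constant Mg sink. This is only a sketch of the qualitative mechanism: all parameter values below are illustrative, not the values fitted at ODP site 807, and the authors' actual model also tracks Ca, Sr and isotope ratios.

```python
import numpy as np

# Hypothetical parameters -- illustrative only, not fitted to ODP 807 data.
nz = 80                  # grid cells over the modelled sediment column
dz = 10.0                # cell thickness (m), ~800 m column
D = 3e-10 * 3.15e7       # Mg diffusivity, ~3e-10 m^2/s converted to m^2/yr
R = 1e-6                 # Mg uptake by diagenetic calcite (mmol/L per yr)
C_sw = 53.0              # modern seawater Mg (mmol/kg), fixed top boundary

C = np.full(nz, C_sw)    # start with seawater composition everywhere
dt = 0.2 * dz**2 / D     # explicit time step well inside the stability limit
for _ in range(100_000): # relax toward steady state
    lap = (C[2:] - 2*C[1:-1] + C[:-2]) / dz**2  # second derivative in depth
    C[1:-1] += dt * (D * lap - R)               # diffusion minus uptake sink
    C[0] = C_sw          # sediment-water interface pinned to seawater
    C[-1] = C[-2]        # no-flux bottom boundary
# C now decreases monotonically with depth, as observed in the pore fluid.
```

Diffusive exchange with overlying seawater resupplies Mg at the top while recrystallization consumes it at depth, producing the observed downward decline without any change in seawater chemistry; the paper's point is that matching concentrations *and* isotopes requires the secular seawater rise as well.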
Resumo:
Oxygen- and carbon-isotope analyses have been performed on the Quaternary planktonic foraminifers of Sites 548 and 549 (DSDP Leg 80) to investigate major water mass changes that occurred in the northeastern Atlantic at different glacial-interglacial cycles and to compare them with the well-defined picture of 18,000 yr ago. Oxygen-isotope stratigraphy also provides a chronological framework for the more important data on the fauna and flora. Although bioturbation and sedimentary gaps obliterate the climatic and stratigraphic record, general trends in the oceanographic history can be deduced from the isotopic data. Isotopic stratigraphy has tentatively been delineated down to isotopic Stage 16 at Site 548 and in Hole 549A. This stratigraphy fits well with that deduced from benthic foraminiferal d18O changes and with bioclimatic zonations based on foraminiferal associations at Site 549. Variations in the geographic extension and in the flux of the Gulf Stream subtropical waters are inferred from both d18O and d13C changes. Maximal fluxes occurred during the late Pliocene. Northward extension of subtropical waters increased through the various interglacial phases of the early Pleistocene and decreased through the late Pleistocene interglacial phases. Conversely, glacial maxima were more intense after Stage 16. Isotopic Stages 12 and 16 mark times of important change in water mass circulation.
Resumo:
Acceleration of Greenland's three largest outlet glaciers, Helheim, Kangerdlugssuaq and Jakobshavn Isbræ, accounted for a substantial portion of the ice sheet's mass loss over the past decade. Rapid changes in their discharge, however, make their cumulative mass-change uncertain. We derive monthly mass balance rates and cumulative balance from discharge and surface mass balance (SMB) rates for these glaciers from 2000 through 2010. Despite the dramatic changes observed at Helheim, the glacier gained mass over the period, due primarily to the short duration of acceleration and a likely longer-term positive balance. In contrast, Jakobshavn Isbræ lost an equivalent of over 11 times the average annual SMB and loss continues to accelerate. Kangerdlugssuaq lost over 7 times its annual average SMB, but loss has returned to the 2000 rate. These differences point to contrasts in the long-term evolution of these glaciers and the danger in basing predictions on extrapolations of recent changes.
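The bookkeeping behind the cumulative balances quoted above is simply a running sum of monthly surface mass balance minus discharge. A minimal sketch, with made-up numbers rather than the Gt values reported for Helheim, Kangerdlugssuaq or Jakobshavn Isbræ:

```python
def cumulative_balance(smb, discharge):
    """Running sum of monthly (SMB - discharge); inputs in Gt/month."""
    total, out = 0.0, []
    for s, d in zip(smb, discharge):
        total += s - d        # positive month = mass gain, negative = loss
        out.append(total)
    return out

# Hypothetical monthly series (Gt): discharge exceeding SMB gives net loss.
smb = [2.0, 2.1, 1.9, 2.2]
discharge = [2.5, 2.6, 2.4, 2.3]
print(cumulative_balance(smb, discharge))  # cumulative mass change to date
```

Because discharge can change on monthly timescales while SMB varies seasonally, integrating both at monthly resolution (rather than extrapolating a single recent rate) is what lets the study separate short accelerations from longer-term balance, which is exactly the paper's caution about extrapolation.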
Resumo:
Infrared Thermography (IRT) is a safe, non-invasive and low-cost technique that allows the rapid, non-contact recording of the energy irradiated by the human body (Akimov & Son'kin, 2011; Merla et al., 2005; Ng et al., 2009; Costello et al., 2012; Hildebrandt et al., 2010). It has been used since the early 1960s but, owing to poor results as a diagnostic tool and a lack of methodological standards and quality assurance (Head & Elliot, 2002), it was abandoned by the medical field in favour of more precise diagnostic techniques. Nevertheless, technological improvements to IRT in recent years have made a resurgence of the technique possible (Jiang et al., 2005; Vainer et al., 2005; Cheng et al., 2009; Spalding et al., 2011; Skala et al., 2012), paving the way to new applications no longer focused solely on diagnostic use.
Among the new applications, we highlight those in the physical activity and sport fields, where it has recently been shown that high-resolution thermal images can provide interesting information about the body's complex thermoregulation system (Hildebrandt et al., 2010). This information can be used for training workload quantification (Čoh & Širok, 2007), assessment of fitness and performance condition (Chudecka et al., 2010, 2012; Akimov et al., 2009, 2011; Merla et al., 2010; Arfaoui et al., 2012), prevention and monitoring of injuries (Hildebrandt et al., 2010, 2012; Badža et al., 2012; Gómez Carmona, 2012) and even detection of Delayed Onset Muscle Soreness (DOMS) (Al-Nakhli et al., 2012). In this context, there is a clear need to broaden knowledge about the factors influencing the application of IRT to humans, and to better explore and describe the thermal response of Skin Temperature (Tsk) under normal conditions and under the influence of different types of exercise. Consequently, this study first presents a literature review of the factors affecting the application of IRT to human beings, together with a proposed classification of those factors. We then analysed the reliability of the software Termotracker®, as well as the reproducibility of Tsk measurements in young, healthy, normal-weight subjects. Finally, we examined the thermal response of Tsk before an endurance, speed or strength training session, immediately afterwards and during an 8-hour recovery period. Concerning the literature review, we proposed a classification organising the factors into three main groups: environmental, individual and technical factors. Better exploring and describing these influencing factors should form the basis of further investigations aiming to use IRT under optimal conditions and to improve its accuracy and results.
Regarding reproducibility, the outcomes showed excellent values for consecutive images, although the reproducibility of Tsk decreased slightly for images separated by 24 hours, above all in the colder Regions of Interest (ROI) (i.e. distal and joint areas). Side-to-side differences (ΔT), normally used to follow the evolution of injured or overloaded ROI, also showed highly accurate results, in this case with better values for joints and central ROI (i.e. knee, ankle, dorsal and pectoral) than for the warmer muscle ROI (such as thigh or hamstrings). The reliability results for the Termotracker® software were excellent under all conditions and parameters. In the part of the study on the effects of endurance, speed and strength training on Tsk, the results demonstrated specific responses depending on the type of training, the ROI, the moment of assessment and the function of the ROI considered. Most muscular ROI remained significantly warmer 8 hours after training, indicating that the effect of exercise on Tsk lasts at least 8 hours in most ROI, and that IRT could help quantify an athlete's recovery status as an indicator of workload assimilation. These results could be very useful for better understanding the complex thermoregulatory behaviour of the skin and, therefore, for using IRT in a more objective, accurate and professional way, improving the new IRT applications in the physical activity and sport sector.
Resumo:
This work focuses on the analysis of a structural element of the MetOp-A satellite. Given the special interest in the influence of equipment installed on structural elements, the paper studies one of the lateral faces on which the Advanced SCATterometer (ASCAT) is installed. The work is oriented towards the modal characterization of the specimen, describing the experimental set-up and the application of results to the development of a Finite Element Method (FEM) model to study the vibro-acoustic response. For the high frequency range, characterized by a high modal density, a Statistical Energy Analysis (SEA) model is considered, and the FEM model is used when modal density is low. The methodology for developing the SEA model and a compound FEM and Boundary Element Method (BEM) model to provide continuity in the medium frequency range is presented, as well as the necessary updating, characterization and coupling between models required to achieve numerical models that match experimental results.
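The frequency-band partitioning described above (FEM where modal density is low, SEA where it is high, a coupled FEM-BEM model in between) can be sketched as a simple selection rule. The mode-count thresholds and band values below are hypothetical, chosen only to illustrate the idea, not taken from the paper:

```python
def pick_method(modes_per_band):
    """Choose a vibro-acoustic modelling approach from modal density.

    Thresholds are illustrative: deterministic FEM when individual modes
    dominate, SEA when the spectrum is statistically dense, and a coupled
    FEM-BEM model in the transition (mid-frequency) range.
    """
    if modes_per_band < 5:
        return "FEM"
    if modes_per_band > 30:
        return "SEA"
    return "FEM-BEM"

# Hypothetical octave bands (Hz) mapped to mode counts for the panel.
bands = {125: 2, 500: 12, 2000: 80}
models = {f: pick_method(n) for f, n in bands.items()}
```

The point of the coupled FEM-BEM model in the paper is precisely to provide continuity across the mid-frequency gap where neither pure FEM nor pure SEA assumptions hold.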
Resumo:
Resource analysis aims at inferring the cost of executing programs for any possible input, in terms of a given resource, such as the traditional execution steps, time or memory and, more recently, energy consumption or user-defined resources (e.g., number of bits sent over a socket, number of database accesses, number of calls to particular procedures, etc.). This is performed statically, i.e., without actually running the programs. Resource usage information is useful for a variety of optimization and verification applications, as well as for guiding software design. For example, programmers can use such information to choose between different algorithmic solutions to a problem; program transformation systems can use cost information to choose between alternative transformations; parallelizing compilers can use cost estimates for granularity control, which tries to balance the overheads of task creation and manipulation against the benefits of parallelization. In this thesis we have significantly improved an existing prototype implementation for resource usage analysis based on abstract interpretation, addressing a number of relevant challenges and overcoming many of the limitations it presented. The goal of that prototype was to show the viability of casting resource analysis as an abstract domain, and how doing so could overcome important limitations of state-of-the-art resource usage analysis tools. For this purpose, it was implemented as an abstract domain in PLAI, the abstract interpretation framework of the CiaoPP system. We have improved both the design and the implementation of the prototype, eventually allowing the tool to evolve towards the industrial application level. The abstract operations of such a tool depend heavily on setting up, and finding closed-form solutions of, recurrence relations representing the resource usage behavior of program components and of the whole program.
While there exist many tools able to find closed-form solutions for some types of recurrences, such as Computer Algebra Systems (CAS) and specialized libraries, none of them alone is able to handle all the types of recurrences arising during program analysis. In addition, there are some types of recurrences that cannot be solved by any existing tool. This clearly constitutes a bottleneck for this kind of resource usage analysis. Thus, one of the major challenges we have addressed in this thesis is the design and development of a novel modular framework for solving recurrence relations, able to combine and take advantage of the results of existing solvers. Additionally, we have developed and integrated into our novel solver a technique for finding upper-bound closed-form solutions for a special class of recurrence relations that arise during the analysis of programs with accumulating parameters. Finally, we have integrated the improved resource analysis into the CiaoPP general framework for resource usage verification, and specialized that framework for verifying energy consumption specifications of embedded imperative programs in a real application, showing the usefulness and practicality of the resulting tool.
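The closed-form solving the thesis automates can be illustrated on the simplest case a cost analysis produces: a linear recurrence C(n) = a·C(n-1) + b (one recursive call plus constant work per step), whose closed form is C(n) = aⁿ·C(0) + b·(aⁿ − 1)/(a − 1) for a ≠ 1. The function names below are illustrative, not part of CiaoPP's actual API:

```python
def cost_iterative(n, a, b, c0):
    """Unfold the recurrence C(k) = a*C(k-1) + b starting from C(0) = c0."""
    c = c0
    for _ in range(n):
        c = a * c + b
    return c

def cost_closed_form(n, a, b, c0):
    """Closed form C(n) = a^n*c0 + b*(a^n - 1)/(a - 1), valid for a != 1.

    Integer division is exact here because (a^n - 1) is divisible by (a - 1).
    """
    return a**n * c0 + b * (a**n - 1) // (a - 1)

# The two agree for every n -- this equality check is, in miniature, what a
# recurrence solver must establish symbolically rather than by testing.
for n in range(10):
    assert cost_iterative(n, 2, 3, 1) == cost_closed_form(n, 2, 3, 1)
```

Real program analysis produces far harder recurrences (multiple arguments, max/min, accumulating parameters), which is why the thesis combines several back-end solvers in a modular framework instead of relying on any single one.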
Resumo:
Objective: To establish the mental health needs of homeless children and families before and after rehousing.
Resumo:
Recent evidence emerging from several laboratories, integrated with new data obtained by searching the genome databases, suggests that the area code hypothesis provides a good heuristic model for explaining the remarkable specificity of cell migration and tissue assembly that occurs throughout embryogenesis. The area code hypothesis proposes that cells assemble organisms, including their brains and nervous systems, with the aid of a molecular-addressing code that functions much like the country, area, regional, and local portions of the telephone dialing system. The complexity of the information required to code cells for the construction of entire organisms is so enormous that we assume that the code must make combinatorial use of members of large multigene families. Such a system would reuse the same receptors as molecular digits in various regions of the embryo, thus greatly reducing the total number of genes required. We present the hypothesis that members of the very large families of olfactory receptors and vomeronasal receptors fulfill the criteria proposed for area code molecules and could serve as the last digits in such a code. We discuss our evidence indicating that receptors of these families are expressed in many parts of developing embryos and suggest that they play a key functional role in cell recognition and targeting not only in the olfactory system but also throughout the brain and numerous other organs as they are assembled.
Resumo:
Two major pathways of recombination-dependent DNA replication, “join-copy” and “join-cut-copy,” can be distinguished in phage T4: join-copy requires only early and middle genes, but two late proteins, endonuclease VII and terminase, are uniquely important in the join-cut-copy pathway. In wild-type T4, timing of these pathways is integrated with the developmental program and related to transcription and packaging of DNA. In primase mutants, which are defective in origin-dependent lagging-strand DNA synthesis, the late pathway can bypass the lack of primers for lagging-strand DNA synthesis. The exquisitely regulated synthesis of endo VII, and of two proteins from its gene, explains the delay of recombination-dependent DNA replication in primase (as well as topoisomerase) mutants, and the temperature-dependence of the delay. Other proteins (e.g., the single-stranded DNA binding protein and the products of genes 46 and 47) are important in all recombination pathways, but they interact differently with other proteins in different pathways. These homologous recombination pathways contribute to evolution because they facilitate acquisition of any foreign DNA with limited sequence homology during horizontal gene transfer, without requiring transposition or site-specific recombination functions. Partial heteroduplex repair can generate what appears to be multiple mutations from a single recombinational intermediate. The resulting sequence divergence generates barriers to formation of viable recombinants. The multiple sequence changes can also lead to erroneous estimates in phylogenetic analyses.
Resumo:
Anti-viral drug treatment of human immunodeficiency virus type I (HIV-1) and hepatitis B virus (HBV) infections causes rapid reduction in plasma virus load. Viral decline occurs in several phases and provides information on important kinetic constants of virus replication in vivo and on pharmacodynamic properties. We develop a mathematical model that takes into account the intracellular phase of the viral life-cycle, defined as the time between infection of a cell and production of new virus particles. We derive analytic solutions for the dynamics following treatment with reverse transcriptase inhibitors, protease inhibitors, or a combination of both. For HIV-1, our results show that the phase of rapid decay in plasma virus (days 2-7) allows precise estimates for the turnover rate of productively infected cells. The initial quasi-stationary phase (days 0-1) and the transition phase (days 1-2) are explained by the combined effects of pharmacological and intracellular delays, the clearance of free virus particles, and the decay of infected cells. Reliable estimates of the first three quantities are not possible from data on virus load only; such estimates require additional measurements. In contrast with HIV-1, for HBV our model predicts that frequent early sampling of plasma virus will lead to reliable estimates of the free virus half-life and the pharmacological properties of the administered drug. On the other hand, for HBV the half-life of infected cells cannot be estimated from plasma virus decay.
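The multiphase decline can be seen in the standard two-compartment decay that underlies models of this kind: after a fully effective reverse transcriptase inhibitor, productively infected cells y(t) decay at rate δ while free virus v(t) is produced by y and cleared at rate c, giving v(t)/v₀ = (c·e^(−δt) − δ·e^(−ct))/(c − δ). This is a generic sketch of that textbook solution, without the intracellular delay that is the paper's actual contribution; the parameter values are illustrative, not the paper's estimates:

```python
import math

delta, c = 0.5, 3.0   # per-day decay of infected cells / clearance of virus (hypothetical)
y0, v0 = 1.0, 1.0     # normalised pre-treatment steady-state levels

def virus_analytic(t):
    """v(t)/v0 for dy/dt = -delta*y, dv/dt = p*y - c*v with p*y0 = c*v0."""
    return (c*math.exp(-delta*t) - delta*math.exp(-c*t)) / (c - delta)

# Cross-check the formula by Euler integration of the same ODE system.
dt, t = 1e-3, 0.0
y, v = y0, v0
while t < 5.0:
    y, v = y + dt*(-delta*y), v + dt*(c*y - c*v)  # p = c since v0 = y0
    t += dt
```

Note that v'(0) = 0 at the pre-treatment steady state, which is why the model produces an initial quasi-stationary shoulder before the rapid decay phase from which δ is estimated.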
Resumo:
In PCR, DNA polymerases from thermophilic bacteria catalyze the extension of primers annealed to templates as well as the structure-specific cleavage of the products of primer extension. Here we show that cleavage by Thermus aquaticus and Thermus thermophilus DNA polymerases can be precise and substantial: it occurs at the base of the stem-loop structure assumed by the single strand products of primer extension using as template a common genetic element, the promoter-operator of the Escherichia coli lactose operon, and may involve up to 30% of the products. The cleavage is independent of primer, template, and triphosphates, is dependent on substrate length and temperature, requires free ends and Mg2+, and is absent in DNA polymerases lacking the 5'-->3' exonuclease, such as the Stoffel fragment and the T7 DNA polymerase. Heterogeneity of the extension products results also from premature detachment of the enzyme approaching the 5' end of the template.
Resumo:
This article analyses the role of the figure of Saint Vincent Ferrer in the policy of consolidation and expansion of the Crown of Aragon in Italy from the mid-fifteenth century up to the eighteenth century. Saint Vincent Ferrer, an undisputed classic of the culture of the Crown of Aragon, exerted an extraordinary influence on Catholic thought, preaching and orthodoxy, as well as on the chessboard of high politics in the late fourteenth and early fifteenth centuries. His influence was practically ubiquitous and all-embracing: he preached to crowds across most of western Europe, stirred by his oratorical gifts and his command of the arts of preaching, while at the same time serving as a most trusted adviser to popes, kings and rulers, writing dense treatises on theology and moral philosophy, and working miracles (more than 900 recorded in his cause for canonisation). He also left a deep imprint, after his death and for centuries, throughout Italy, then the great stage of politics and of humanist and Renaissance culture. The Crown of Aragon exploited this influence in its expansion in Italy, from the conquest of Naples by Alfonso the Magnanimous onwards. All of this is analysed in the present study through works of art (chapels, paintings, altarpieces, frescoes and mosaics) never before considered in this light but which, as is demonstrated, attest to both the religious and the political influence of the "figure" of this Valencian saint. It is, in short, an analysis of the many-sided function of the classics.
Resumo:
Prepared for the U.S. Dept. of Labor, Employment and Training Administration under contract no. F-5532-5-00-80-30.
Resumo:
Description based on: No. 3 (1958-1962).