973 results for correlated data


Relevance: 30.00%

Publisher:

Abstract:

We analyzed strontium/calcium ratios (Sr/Ca) in four colonies of the Atlantic coral genus Montastrea with growth rates ranging from 2.3 to 12.6 mm/a. Derived Sr/Ca-sea surface temperature (SST) calibrations exhibit significant differences among the four colonies that cannot be explained by variations in SST or seawater Sr/Ca. For a single coral Sr/Ca ratio of 8.8 mmol/mol, the four calibrations predict SSTs ranging from 24.0° to 30.9°C. We find that differences in the Sr/Ca-SST relationships are correlated systematically with the average annual extension rate (ext) of each colony such that Sr/Ca (mmol/mol) = 11.82 (±0.13) - 0.058 (±0.004) * ext (mm/a) - 0.092 (±0.005) * SST (°C). This observation is consistent with previous reports of a link between coral Sr/Ca and growth rate. Verification of our growth-dependent Sr/Ca-SST calibration using a coral excluded from the calibration reconstructs the mean and seasonal amplitude of the actual recorded SST to within 0.3°C. Applying a traditional, nongrowth-dependent Sr/Ca-SST calibration derived from a modern Montastrea to the Sr/Ca ratios of a conspecific coral that grew during the early Little Ice Age (LIA) (400 years B.P.) suggests that Caribbean SSTs were >5°C cooler than today. Conversely, application of our growth-dependent Sr/Ca-SST calibration to Sr/Ca ratios derived from the LIA coral indicates that SSTs during the 5-year period analyzed were within error (±1.4°C) of modern values.
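As a worked illustration (not part of the original abstract), the minimal sketch below simply inverts the growth-dependent calibration quoted above, Sr/Ca = 11.82 - 0.058*ext - 0.092*SST, to recover SST from a measured Sr/Ca ratio and a colony's mean annual extension rate; the function name is illustrative, and the example values (8.8 mmol/mol, extension rates of 2.3 and 12.6 mm/a) are taken from the abstract.

```python
# Minimal sketch: invert Sr/Ca = 11.82 - 0.058*ext - 0.092*SST to solve for SST (degC).
# Coefficients are those quoted in the abstract; the function name is illustrative.
def sst_from_sr_ca(sr_ca_mmol_mol, extension_mm_per_a):
    return (11.82 - 0.058 * extension_mm_per_a - sr_ca_mmol_mol) / 0.092

# The same coral Sr/Ca ratio read from a slow- and a fast-growing colony
for ext in (2.3, 12.6):
    print(f"ext = {ext:5.1f} mm/a -> SST = {sst_from_sr_ca(8.8, ext):.1f} degC")
```

This makes explicit why ignoring extension rate biases reconstructed SST: the same Sr/Ca ratio maps to temperatures several degrees apart depending on growth rate.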

Relevance: 30.00%

Publisher:

Abstract:

We studied the relationship between flower size and nectar properties of hummingbird-visited flowers in the Brazilian Atlantic Forest. We analysed the nectar volume and concentration as a function of corolla length and the average bill size of visitors for 150 plant species, using the phylogenetic generalized least squares (PGLS) to control for phylogenetic signals in the data. We found that nectar volume is positively correlated with corolla length due to phylogenetic allometry. We also demonstrated that larger flowers provide better rewards for long-billed hummingbirds. Regardless of the causal mechanisms, our results support the hypothesis that morphological floral traits that drive partitioning among hummingbirds correspond to the quantity of resources produced by the flowers in the Atlantic Forest. We demonstrate that the relationship between nectar properties and flower size is affected by phylogenetic constraints and thus future studies assessing the interaction between floral traits need to control for phylogenetic signals in the data.
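As a sketch of the kind of analysis described (PGLS amounts to generalized least squares with the error covariance set to a phylogenetic covariance matrix), the following minimal, hypothetical example uses statsmodels; the variable names, the synthetic data, and the identity placeholder for the phylogenetic covariance are assumptions, since the study's trait data and phylogeny are not reproduced here.

```python
# Minimal PGLS sketch: GLS with a phylogenetic covariance matrix (placeholder data).
import numpy as np
import statsmodels.api as sm

n = 150                                    # number of plant species, as in the study
rng = np.random.default_rng(0)
log_corolla = rng.normal(size=n)           # log corolla length (synthetic predictor)
phylo_cov = np.eye(n)                      # placeholder; a real analysis derives this from the tree
log_nectar = 0.5 * log_corolla + rng.multivariate_normal(np.zeros(n), phylo_cov)

X = sm.add_constant(log_corolla)
fit = sm.GLS(log_nectar, X, sigma=phylo_cov).fit()
print(fit.params, fit.pvalues)
```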

Relevance: 30.00%

Publisher:

Abstract:

The Pacific Decadal Oscillation (PDO), the leading mode of sea surface temperature (SST) anomalies in the extratropical North Pacific Ocean, has widespread impacts on precipitation in the Americas and marine fisheries in the North Pacific. However, marine proxy records with a temporal resolution that resolves interannual to interdecadal SST variability in the extratropical North Pacific are extremely rare. Here we demonstrate that the winter Sr/Ca and U/Ca records of an annually-banded reef coral from the Ogasawara Islands in the western subtropical North Pacific are significantly correlated with the instrumental winter PDO index over the last century. The reconstruction of the PDO is further improved by combining the coral data with an existing eastern mid-latitude North Pacific growth ring record of geoduck clams. The spatial correlations of this combined index with global climate fields suggest that SST proxy records from these locations provide potential for PDO reconstructions further back in time.
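As a rough illustration of the compositing step implied above (an assumption about the simplest possible approach, not the authors' actual calibration), one might z-score each winter proxy series and average them before correlating with the instrumental PDO index:

```python
# Toy sketch: combine two standardized proxy series and correlate with the PDO index.
# The arrays are synthetic placeholders; real series would be the winter coral Sr/Ca
# (or U/Ca) record and the geoduck growth-ring chronology on a common time axis.
import numpy as np

rng = np.random.default_rng(1)
pdo_index = rng.normal(size=100)                        # instrumental winter PDO (placeholder)
coral_proxy = pdo_index + rng.normal(scale=0.8, size=100)
geoduck_proxy = pdo_index + rng.normal(scale=0.8, size=100)

def zscore(x):
    return (x - x.mean()) / x.std(ddof=1)

combined = (zscore(coral_proxy) + zscore(geoduck_proxy)) / 2.0
print(f"r with instrumental PDO: {np.corrcoef(combined, pdo_index)[0, 1]:.2f}")
```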

Relevance: 30.00%

Publisher:

Abstract:

Large scale patterns of ecologically relevant traits may help identify drivers of their variability and conditions beneficial or adverse to the expression of these traits. Antimicrofouling defenses in scleractinian corals regulate the establishment of the associated biofilm as well as the risks of infection. The Saudi Arabian Red Sea coast features a pronounced thermal and nutritional gradient including regions and seasons with potentially stressful conditions to corals. Assessing the patterns of antimicrofouling defenses across the Red Sea may hint at the susceptibility of corals to global change. We investigated microfouling pressure as well as the relative strength of 2 alternative antimicrofouling defenses (chemical antisettlement activity, mucus release) along the pronounced environmental gradient along the Saudi Arabian Red Sea coast in 2 successive years. Microfouling pressure was exceptionally low along most of the coast but sharply increased at the southernmost sites. Mucus release correlated with temperature. Chemical defense tended to anti-correlate with mucus release. As a result, the combined action of mucus release and chemical antimicrofouling defense seemed to warrant sufficient defense against microbes along the entire coast. In the future, however, we expect enhanced energetic strain on corals when warming and/or eutrophication lead to higher bacterial fouling pressure and a shift towards putatively more costly defense by mucus release.

Relevance: 30.00%

Publisher:

Abstract:

Compile-time program analysis techniques can be applied to Web service orchestrations to prove or check various properties. In particular, service orchestrations can be subjected to resource analysis, in which safe approximations of upper and lower resource usage bounds are deduced. A uniform analysis can be performed simultaneously for different generalized resources that can be directly correlated with cost- and performance-related quality attributes, such as invocations of partners, network traffic, number of activities, iterations, and data accesses. The resulting safe upper and lower bounds do not depend on probabilistic assumptions, and are expressed as functions of the size or length of data components in an initiating message, using a fine-grained structured data model that corresponds to the XML style of information structuring. The analysis is performed by transforming a BPEL-like representation of an orchestration into an equivalent program in another programming language for which the appropriate analysis tools already exist.
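To make the idea concrete, here is a deliberately tiny, hypothetical sketch (not the paper's BPEL-to-intermediate-language toolchain): an orchestration that calls a partner once per item in the incoming message plus one final billing call, together with the kind of size-dependent bound such an analysis would report.

```python
# Toy orchestration: one partner invocation per order item, plus one billing call.
# A resource analysis of the kind described would infer invocations(n) = n + 1,
# a safe upper (and here also lower) bound as a function of the input size n.
def orchestration(order_items):
    invocations = 0
    for item in order_items:   # invoke the inventory partner for each item
        invocations += 1
    invocations += 1           # one final invocation of the billing partner
    return invocations

def inferred_bound(n):         # bound as a function of the size of the initiating message
    return n + 1

assert all(orchestration(list(range(n))) == inferred_bound(n) for n in range(20))
```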

Relevance: 30.00%

Publisher:

Abstract:

Highly correlated ab initio calculations (CCSD(T)) are used to compute gas-phase spectroscopic parameters of three isotopologues of methyl acetate (CH3COOCH3, CD3COOCH3, and CH3COOCD3), with the aim of aiding experimental assignments and astrophysical detection. The molecule exhibits two conformers, cis and trans, separated by a barrier of 4457 cm−1. The potential energy surface presents 18 minima that interconvert through three internal rotation motions. To analyze the far-infrared spectrum at low temperatures, a three-dimensional Hamiltonian is solved variationally. The two methyl torsion barriers are calculated to be 99.2 cm−1 (C–CH3) and 413.1 cm−1 (O–CH3) for the cis-conformer. The three fundamental torsional band centers of CH3COOCH3 are predicted to lie at 63.7 cm−1 (C–CH3), 136.1 cm−1 (O–CH3), and 175.8 cm−1 (C–O torsion), and the corresponding torsional state separations are provided. For the 27 vibrational modes, anharmonic fundamentals and rovibrational parameters are provided. The computed parameters are compared with those fitted using experimental data.
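For readers more used to thermochemical units, the quoted barriers convert from wavenumbers to kJ/mol as sketched below (standard physical constants; the conversion itself is not part of the abstract).

```python
# Convert the quoted torsional barriers from cm^-1 to kJ/mol: E = h * c * nu * N_A.
H = 6.62607015e-34     # Planck constant, J s
C = 2.99792458e10      # speed of light, cm/s
NA = 6.02214076e23     # Avogadro constant, 1/mol

def cm1_to_kj_per_mol(nu_cm1):
    return H * C * nu_cm1 * NA / 1e3

for label, nu in [("C-CH3 torsion barrier", 99.2),
                  ("O-CH3 torsion barrier", 413.1),
                  ("cis/trans barrier", 4457.0)]:
    print(f"{label}: {nu} cm^-1 ~= {cm1_to_kj_per_mol(nu):.1f} kJ/mol")
```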

Relevance: 30.00%

Publisher:

Abstract:

In the last few years there has been heightened interest in data treatment and analysis with the aim of discovering hidden knowledge and eliciting relationships and patterns within the data. Data mining techniques (also known as Knowledge Discovery in Databases) have been applied to a wide range of fields such as marketing, investment, fraud detection, manufacturing, telecommunications and health. In this study, well-known data mining techniques such as artificial neural networks (ANN), genetic programming (GP), forward-selection linear regression (LR) and k-means clustering are proposed to the health and sports community as aids for resistance training prescription. Appropriate resistance training prescription is effective for developing fitness and health and for enhancing general quality of life. Resistance exercise intensity is commonly prescribed as a percentage of the one repetition maximum (1RM). The 1RM, also referred to as dynamic muscular strength or one execution maximum, is operationally defined as the heaviest load that can be moved once over a specific range of motion with correct technique. The safety of the 1RM assessment has been questioned, as such a maximal effort may lead to muscular injury. Prediction equations can help to estimate the 1RM from submaximal loads in order to avoid, or at least reduce, the associated risks. We built different models from data on 30 men who performed up to five sets to exhaustion at different percentages of the 1RM in the bench press, until reaching their actual 1RM. A comparison with existing prediction equations is also carried out. The LR model appears to outperform the ANN and GP models for 1RM prediction in the range of 1 to 10 repetitions. At 75% of the 1RM, some subjects (n = 5) could perform 13 repetitions with proper technique in the bench press, whilst other subjects (n = 20) performed significantly more repetitions (p < 0.05) at 70% than at 75% of their actual 1RM. Rating of perceived exertion (RPE) does not seem to be a good predictor of the 1RM when all sets are performed to exhaustion, as no significant differences were found in the RPE at 75%, 80% and 90% of the 1RM. In addition, years of experience and weekly hours of strength training are better correlated with the 1RM (p < 0.05) than body weight. The O'Connor et al. prediction equation emerges from the data gathered and appears to be the most accurate of the 1RM prediction equations proposed in the literature and used in this study. Epley's 1RM prediction equation is reproduced by means of data simulation from 1RM literature equations. Finally, future lines of research on 1RM prediction by means of genetic algorithms, neural networks and clustering techniques are proposed.
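For reference, the two named equations are usually written as below; these are the textbook forms from the 1RM literature (Epley: 1RM = w·(1 + reps/30); O'Connor et al.: 1RM = w·(1 + 0.025·reps)), not coefficients fitted in this study.

```python
# Standard submaximal 1RM prediction equations as commonly cited in the literature.
def epley_1rm(load_kg, reps):
    return load_kg * (1 + reps / 30.0)

def oconnor_1rm(load_kg, reps):
    return load_kg * (1 + 0.025 * reps)

# e.g. 10 repetitions to exhaustion with 80 kg in the bench press
print(f"Epley:    {epley_1rm(80, 10):.1f} kg")    # ~106.7 kg
print(f"O'Connor: {oconnor_1rm(80, 10):.1f} kg")  # 100.0 kg
```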

Relevance: 30.00%

Publisher:

Abstract:

For an adequate assessment of the safety margins of nuclear facilities, e.g. nuclear power plants, it is necessary to consider all possible uncertainties that affect their design, performance and response to accidents. Nuclear data are a source of uncertainty involved in neutronics, fuel depletion and activation calculations. These calculations predict critical response functions during operation and in the event of an accident, such as the decay heat and the neutron multiplication factor. Thus, the impact of nuclear data uncertainties on these response functions needs to be addressed for a proper evaluation of the safety margins. Methodologies for performing uncertainty propagation calculations need to be implemented in order to analyse this impact, and it is also necessary to understand the current status of nuclear data and their uncertainties in order to be able to handle this type of data. Great efforts are underway to enhance the European capability to analyse, process and produce covariance data, especially for isotopes of importance for advanced reactors. At the same time, new methodologies and codes are being developed and implemented for using these data and evaluating their impact. These were the objectives of the European ANDES (Accurate Nuclear Data for nuclear Energy Sustainability) project, which provided the framework for the development of this PhD thesis. Accordingly, a review of the state of the art of nuclear data and their uncertainties is first conducted, focusing on three kinds of data: decay data, fission yields and cross sections. A review of the current methodologies for propagating nuclear data uncertainties is also performed.
The Nuclear Engineering Department of UPM has proposed a methodology for propagating uncertainties in depletion calculations, the Hybrid Method, which has been taken as the starting point of this thesis. This methodology has been implemented, developed and extended, and its advantages, drawbacks and limitations have been analysed. It is used in conjunction with the ACAB depletion code, and is based on Monte Carlo sampling of the nuclear data with uncertainties. Different approaches are presented depending on the cross-section energy-group structure: one-group, one-group with correlated sampling, and multi-group; their differences and applicability criteria are discussed. Sequences have been developed for using nuclear data libraries stored in different formats: ENDF-6 (for evaluated libraries), COVERX (for the multi-group libraries of SCALE) and EAF (for activation libraries). A review of the state of the art of fission yield data shows gaps in the uncertainty information, specifically the lack of complete covariance matrices. Furthermore, the international community has expressed renewed interest in the issue through the Working Party on International Nuclear Data Evaluation Co-operation (WPEC) and its Subgroup 37 (SG37), which is dedicated to assessing nuclear data improvement needs. This motivated a review of the state of the art of methodologies for generating covariance data for fission yields, from which a Bayesian/generalised least squares (GLS) updating sequence has been selected and implemented to address this need. Once the Hybrid Method has been implemented, developed and extended, along with the fission yield covariance generation capability, different applications are studied. The fission pulse decay heat problem is tackled first because of its importance for any event after reactor shutdown and because it is a clean exercise for showing the impact of decay and fission yield data uncertainties in conjunction with the new covariance data. Two fuel cycles of advanced reactors are then studied, those of the European Facility for Industrial Transmutation (EFIT) and the European Sodium Fast Reactor (ESFR), and the uncertainties in response functions such as isotopic composition, decay heat and radiotoxicity are addressed. Different nuclear data libraries are used and compared. These applications serve as frameworks for comparing the different approaches of the Hybrid Method, and also for comparing it with other methodologies: Total Monte Carlo (TMC), developed at NRG by A.J. Koning and D. Rochman, and NUDUNA, developed at AREVA GmbH by O. Buss and A. Hoefer. These comparisons reveal the advantages, limitations and range of application of the Hybrid Method.
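As a schematic of the Monte Carlo sampling step at the core of the Hybrid Method (the parameters, the 2x2 covariance matrix and the linear response below are placeholders, not data from the thesis), uncertainty propagation reduces to drawing correlated samples of the nuclear data and re-evaluating the response for each draw:

```python
# Minimal sketch of Monte Carlo uncertainty propagation: sample nuclear-data
# parameters from a covariance matrix and push each sample through a response
# function. All numbers below are illustrative placeholders.
import numpy as np

mean = np.array([1.0e-2, 5.0e-3])          # e.g. two fission yields (placeholder values)
cov = np.array([[1.0e-8, 4.0e-9],
                [4.0e-9, 2.5e-9]])         # their covariance matrix (placeholder)

rng = np.random.default_rng(42)
samples = rng.multivariate_normal(mean, cov, size=10_000)

def response(y):                           # stand-in for a depletion/decay-heat calculation
    return 120.0 * y[..., 0] + 80.0 * y[..., 1]

r = response(samples)
print(f"response = {r.mean():.4f} +/- {r.std(ddof=1):.4f}")
```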

Relevance: 30.00%

Publisher:

Abstract:

In this contribution, a novel iterative bit- and power-allocation (IBPA) approach is developed for transmitting a given data rate (in bit/s/Hz) over a correlated, frequency non-selective (4×4) Multiple-Input Multiple-Output (MIMO) channel. The iterative resource allocation algorithm developed in this investigation aims to achieve the minimum bit-error rate (BER) in a correlated MIMO communication system. To this end, the available bits are iteratively allocated to the active MIMO layers requiring the minimum transmit power per time slot.
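As a minimal sketch in the spirit of the iterative allocation described above (a generic greedy, margin-adaptive bit-loading loop, not the paper's exact algorithm), each bit is assigned to the layer whose incremental transmit power is smallest; the layer gains and the QAM power model are illustrative assumptions.

```python
# Greedy bit-loading sketch: each added bit goes to the layer with the smallest
# incremental transmit power (placeholder gains and power model).
def qam_power(bits, gain):
    """Transmit power needed for `bits` bits/symbol on a layer with power gain `gain`
    (proportional to 2^b - 1 for square QAM at a fixed target error rate)."""
    return (2 ** bits - 1) / gain

def greedy_bit_allocation(layer_gains, total_bits):
    bits = [0] * len(layer_gains)
    for _ in range(total_bits):
        increments = [qam_power(b + 1, g) - qam_power(b, g)
                      for b, g in zip(bits, layer_gains)]
        best = increments.index(min(increments))
        bits[best] += 1
    return bits

# e.g. 8 bits per time slot over a 4-layer (4x4 MIMO) channel
print(greedy_bit_allocation([2.0, 1.2, 0.6, 0.1], total_bits=8))
```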

Relevance: 30.00%

Publisher:

Abstract:

H3 phosphorylation has been correlated with mitosis temporally in mammalian cells and spatially in ciliated protozoa. In logarithmically growing Tetrahymena thermophila cells, for example, H3 phosphorylation can be detected in germline micronuclei that divide mitotically but not in somatic macronuclei that divide amitotically. Here, we demonstrate that micronuclear H3 phosphorylation occurs at a single site (Ser-10) in the amino-terminal domain of histone H3, the same site phosphorylated during mitosis in mammalian cells. Using an antibody specific for Ser-10 phosphorylated H3, we show that, in Tetrahymena, this modification is correlated with mitotic and meiotic divisions of micronuclei in a fashion that closely coincides with chromosome condensation. Our data suggest that H3 phosphorylation at Ser-10 is a highly conserved event among eukaryotes and is likely involved in both mitotic and meiotic chromosome condensation.

Relevance: 30.00%

Publisher:

Abstract:

The retroviral oncogene qin codes for a protein that belongs to the family of the winged helix transcription factors. The viral Qin protein, v-Qin, differs from its cellular counterpart, c-Qin, by functioning as a stronger transcriptional repressor and a more efficient inducer of tumors. This observation suggests that repression may be important in tumorigenesis. To test this possibility, chimeric proteins were constructed in which the Qin DNA-binding domain was fused to either a strong repressor domain (derived from the Drosophila Engrailed protein) or a strong activator domain (from the herpes simplex virus VP16 protein). The chimeric transcriptional repressor, Qin–Engrailed, transformed chicken embryo fibroblasts in culture and induced sarcomas in young chickens. The chimeric activator, Qin–VP16, failed to transform cells in vitro or in vivo and caused cellular resistance to oncogenic transformation by Qin. These data support the conclusion that the Qin protein induces oncogenic transformation by repressing the transcription of genes which function as negative growth regulators or tumor suppressors.

Relevance: 30.00%

Publisher:

Abstract:

Polymorphic regions consisting of a variable number of tandem repeats within intron 2 of the gene coding for the serotonin transporter protein 5-HTT have been associated with susceptibility to affective disorders. We have cloned two of these intronic polymorphisms, Stin2.10 and Stin2.12, into an expression vector containing a heterologous minimal promoter and the bacterial LacZ reporter gene. These constructs were then used to produce transgenic mice. In embryonic day 10.5 embryos, both Stin2.10 and Stin2.12 produced consistent β-galactosidase expression in the embryonic midbrain, hindbrain, and spinal cord floor plate. However, we observed that the levels of β-galactosidase expression produced by both the Stin2.10 and Stin2.12 within the rostral hindbrain differed significantly at embryonic day 10.5. Our data suggest that these polymorphic variable number of tandem repeats regions act as transcriptional regulators and have allele-dependent differential enhancer-like properties within an area of the hindbrain where the 5-HTT gene is known to be transcribed at this stage of development.