956 results for Biological analysis


Relevance:

30.00%

Publisher:

Abstract:

Oxygen isotope measurements have been made in foraminifera from over 60 deep-sea sediment cores. Taken together with the oxygen isotope measurements published by Emiliani from Caribbean and Equatorial Atlantic cores, these comprise a unique body of stratigraphic data covering most of the important areas of calcareous sediment over the whole world ocean. The oxygen isotopic composition of foraminifera from cores of Late Pleistocene sediment varies in a similar manner in nearly all areas; the variations reflect changes in the oxygen isotopic composition of the ocean. The oceans mix in about 1 ka, so ocean isotopic changes, resulting from fluctuations in the quantity of ice stored on the continents, must have occurred almost synchronously in all regions. Thus the oxygen isotope record provides an excellent means of stratigraphic correlation. Cores that accumulated at rates above about 5 cm/ka provide records of oxygen isotopic composition change that are almost unaffected by post-depositional mixing of the sediment. They therefore preserve a detailed record of the advance and retreat of the ice masses in the northern hemisphere, and provide a unique source of information for the study of ice-sheet dynamics.

Relevance:

30.00%

Publisher:

Abstract:

Increasing atmospheric CO2 concentration affects calcification in most planktonic calcifiers. Both reduced and stimulated calcification under high CO2 have been reported in the widespread coccolithophore Emiliania huxleyi. This might affect the response of cells to photosynthetically active radiation (PAR; 400-700 nm) and ultraviolet radiation (UVR; 280-400 nm) by altering the thickness of the coccolith layer. Here we show that in the absence of UVR, calcification rates in E. huxleyi decreased under lowered pH levels (pH on the NBS scale of 7.9 and 7.6; pCO2 of 81 and 178 Pa, or 804 and 1759 ppmv, respectively), leading to thinned coccolith layers, whereas photosynthetic carbon fixation was slightly enhanced at pH 7.9 and remained unaffected at pH 7.6. Exposure to UVR (UV-A 19.5 W m^-2, UV-B 0.67 W m^-2) in addition to PAR (88.5 W m^-2), however, resulted in significant inhibition of both photosynthesis and calcification, and these rates were further inhibited with increasing acidification. The combined effects of UVR and seawater acidification inhibited calcification by 96% and 99% and photosynthesis by 6% and 15%, at pH 7.9 and 7.6, respectively. This differential inhibition of calcification and photosynthesis led to a significant reduction in the ratio of calcification to photosynthesis. Seawater acidification enhanced the transmission of harmful UVR by about 26% through a 31% reduction of the coccolith layer. Our data indicate that in a high-CO2, low-pH ocean, reduced calcification associated with changes in the carbonate system enhances the detrimental effects of UVR on this main pelagic calcifier.

Relevance:

30.00%

Publisher:

Abstract:

The combined effects of ocean warming and acidification were compared in larvae from two populations of the cold-eurythermal spider crab Hyas araneus: one of its southernmost populations (around Helgoland, southern North Sea, 54°N, habitat temperature 3-18°C; collection: January 2008, hatch: January-February 2008) and one of its northernmost populations (Svalbard, North Atlantic, 79°N, habitat temperature 0-6°C; collection: July 2008, hatch: February-April 2009). Larvae were exposed to temperatures of 3, 9 and 15°C combined with present-day normocapnic (380 ppm CO2) and projected future CO2 concentrations (710 and 3,000 ppm CO2). Calcium content of whole larvae was measured in freshly hatched Zoea I and after 3, 7 and 14 days during the Megalopa stage. Significant differences between Helgoland and Svalbard Megalopae were observed at all investigated temperatures and CO2 conditions. Under 380 ppm CO2, calcium content increased with rising temperature and age of the larvae. At 3 and 9°C, Helgoland Megalopae accumulated more calcium than Svalbard Megalopae. Elevated CO2 levels, especially 3,000 ppm, reduced larval calcium content at 3 and 9°C in both populations. In Svalbard Megalopae at 9°C, this effect set in already at 710 ppm CO2. Furthermore, at 3 and 9°C, Helgoland Megalopae replenished their calcium content to normocapnic levels more rapidly than Svalbard Megalopae. However, Svalbard Megalopae displayed higher calcium content under 3,000 ppm CO2 at 15°C. The finding of a lower capacity for calcium incorporation in crab larvae living at the cold end of the species' distribution range suggests that they might be more sensitive to ocean acidification than those in temperate regions.

Relevance:

30.00%

Publisher:

Abstract:

Sedimentary sequences in ancient or long-lived lakes can reach several thousand meters in thickness and often provide an unrivalled perspective on a lake's regional climatic, environmental, and biological history. Over the last few years, deep-drilling projects in ancient lakes have become increasingly multi- and interdisciplinary, as seismological, sedimentological, biogeochemical, climatic, environmental, paleontological, and evolutionary information, among others, can be obtained from sediment cores. However, these multi- and interdisciplinary projects pose several challenges. The scientists involved typically approach problems from different scientific perspectives and backgrounds, and setting up the program requires clear communication and the alignment of interests. One of the most challenging tasks, besides the actual drilling operation, is to link diverse datasets with varying resolution, data quality, and age uncertainties in order to answer interdisciplinary questions synthetically and coherently. These problems are especially relevant when secondary data, i.e., datasets obtained independently of the drilling operation, are incorporated in analyses. Nonetheless, the inclusion of secondary information, such as isotopic data from fossils found in outcrops or genetic data from extant species, may help to achieve synthetic answers. Recent technological and methodological advances in paleolimnology are likely to increase the possibilities of integrating secondary information. Some of the new approaches have started to revolutionize scientific drilling in ancient lakes, but at the same time they also add a new layer of complexity to the generation and analysis of sediment-core data. The enhanced opportunities presented by new scientific approaches to study the paleolimnological history of these lakes therefore come at the cost of greater logistical, communication, and analytical effort. Here we review the types of data that can be obtained in ancient-lake drilling projects and the analytical approaches that can be applied to empirically and statistically link diverse datasets, creating an integrative perspective on geological and biological data. In doing so, we highlight the strengths and potential weaknesses of new methods and analyses, and provide recommendations for future interdisciplinary deep-drilling projects.
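One concrete instance of the data-linking problem described above is placing records of differing resolution on a common age scale before comparing them. The following minimal sketch (an illustration, not a method from the review; the records and ages are invented, and age-model uncertainty is ignored) uses numpy to resample two proxy records onto a shared age axis and then compute their correlation:

import numpy as np

# Two hypothetical proxy records sampled at different, irregular resolutions.
age_a = np.array([0.0, 1.2, 2.1, 3.5, 4.8, 6.0])        # ages in ka
proxy_a = np.array([1.0, 1.4, 0.9, 1.8, 2.2, 2.0])
age_b = np.array([0.0, 0.8, 1.9, 3.0, 4.1, 5.2, 6.0])
proxy_b = np.array([0.2, 0.5, 0.4, 0.9, 1.1, 1.3, 1.2])

# Resample both records onto a common, evenly spaced age axis.
common_age = np.linspace(0.0, 6.0, 61)
a_common = np.interp(common_age, age_a, proxy_a)
b_common = np.interp(common_age, age_b, proxy_b)

# A simple measure of co-variation once the records are aligned.
r = np.corrcoef(a_common, b_common)[0, 1]
print(f"correlation after alignment: r = {r:.2f}")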

Relevance:

30.00%

Publisher:

Abstract:

In this paper, the presynaptic rule, a classical rule for Hebbian learning, is revisited. It is shown that the presynaptic rule exhibits relevant synaptic properties such as synaptic directionality and metaplasticity of the long-term potentiation (LTP) threshold. With slight modifications, the presynaptic model also exhibits metaplasticity of the long-term depression (LTD) threshold, making it consistent with Artola, Brocher and Singer's (ABS) influential model. Two asymptotically equivalent versions of the presynaptic rule were adopted for this analysis: the first uses an incremental equation, while the second uses conditional probabilities. Despite their simplicity, both types of presynaptic rules exhibit sophisticated biological properties, especially the probabilistic version.
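A minimal sketch may help make the claimed asymptotic equivalence concrete. The snippet below assumes, purely for illustration, a presynaptically gated incremental update of the form Δw = η·pre·(post − w) (the paper's exact equation may differ) and compares its fixed point with the direct conditional-probability estimate P(post = 1 | pre = 1):

import numpy as np

rng = np.random.default_rng(0)

# Binary pre-/postsynaptic activity: post fires with probability 0.8
# when pre is active, 0.1 otherwise (synthetic data for illustration).
n = 50000
pre = rng.random(n) < 0.5
post = np.where(pre, rng.random(n) < 0.8, rng.random(n) < 0.1)

# Incremental version: the weight is only updated when pre fires.
eta, w = 0.01, 0.0
for u, v in zip(pre, post):
    w += eta * u * (v - w)  # assumed presynaptic rule (illustrative)

# Conditional-probability version: estimate P(post = 1 | pre = 1) directly.
w_prob = post[pre].mean()

print(f"incremental w = {w:.3f}, P(post|pre) = {w_prob:.3f}")
# Both approach ~0.8, illustrating the asymptotic equivalence.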

Relevance:

30.00%

Publisher:

Abstract:

Pragmatism is the leading motivation for regularization. We can understand regularization as a modification of the maximum-likelihood estimator so that a reasonable answer can be given in an unstable or ill-posed situation. To mention some typical examples, this happens when fitting parametric or non-parametric models with more parameters than data, or when estimating large covariance matrices. Regularization is also commonly used to improve the bias-variance tradeoff of an estimation. The definition of regularization is therefore quite general, and, although the introduction of a penalty is probably the most popular type, it is just one of multiple forms of regularization. In this dissertation, we focus on the applications of regularization for obtaining sparse or parsimonious representations, where only a subset of the inputs is used. A particular form of regularization, L1-regularization, plays a key role in reaching sparsity. Most of the contributions presented here revolve around L1-regularization, although other forms of regularization are explored (also pursuing sparsity in some sense). In addition to presenting a compact review of L1-regularization and its applications in statistics and machine learning, we devise methodology for regression, supervised classification and structure induction of graphical models. Within the regression paradigm, we focus on kernel smoothing (local regression), proposing techniques for kernel design that are suitable for high-dimensional settings and sparse regression functions. We also present an application of regularized regression techniques for modeling the response of biological neurons. The supervised classification advances deal, on the one hand, with the application of regularization for obtaining a naïve Bayes classifier and, on the other hand, with a novel algorithm for brain-computer interface design that uses group regularization in an efficient manner. Finally, we present a heuristic for inducing the structure of Gaussian Bayesian networks using L1-regularization as a filter.
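As an illustration of how an L1 penalty induces the sparsity pursued throughout the dissertation, the following minimal sketch fits scikit-learn's Lasso to synthetic data in which only 5 of 100 inputs carry signal (the data and the alpha value are invented for illustration):

import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Synthetic regression problem: 100 inputs, only the first 5 are relevant.
n_samples, n_features = 200, 100
X = rng.standard_normal((n_samples, n_features))
true_coef = np.zeros(n_features)
true_coef[:5] = [3.0, -2.0, 1.5, -1.0, 0.5]
y = X @ true_coef + 0.1 * rng.standard_normal(n_samples)

# The L1 penalty drives most coefficients exactly to zero (sparsity).
model = Lasso(alpha=0.1).fit(X, y)
print(f"non-zero coefficients: {np.sum(model.coef_ != 0)} of {n_features}")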

Relevance:

30.00%

Publisher:

Abstract:

Digital atlases of animal development provide a quantitative description of morphogenesis, opening the path toward process modeling. Prototypic atlases offer a data-integration framework in which to gather information from cohorts of individuals with phenotypic variability. Relevant information for further theoretical reconstruction includes measurements in time and space of cell behaviors and gene expression. The latter, as well as data integration in a prototypic model, relies on image-processing strategies. Developing the tools to integrate and analyze multidimensional biological data is highly relevant for assessing chemical toxicity or performing preclinical drug testing. This article surveys some of the most prominent efforts to assemble these prototypes, categorizes them according to salient criteria, and discusses the key questions in the field and the future challenges toward the reconstruction of multiscale dynamics in model organisms.

Relevance:

30.00%

Publisher:

Abstract:

The present work covers the first validation efforts of the EVA Tracking System for the assessment of minimally invasive surgery (MIS) psychomotor skills. Instrument movements were recorded for 42 surgeons (4 experts, 22 residents, 16 novice medical students) and analyzed for a box-trainer peg transfer task. Construct validity was established for 7/9 motion analysis parameters (MAPs). Concurrent validity was determined for 8/9 MAPs against the TrEndo Tracking System. Finally, automatic determination of surgical proficiency based on the MAPs was sought with 3 different approaches to supervised classification (LDA, SVM, ANFIS), with accuracy results of 61.9%, 83.3% and 80.9%, respectively. The results reflect not only on the validation of EVA for skills assessment, but also on the relevance of instrument motion analysis in determining surgical competence.
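To make the classification setup concrete, here is a minimal sketch of how two of the three cited approaches (LDA and SVM; ANFIS has no standard scikit-learn implementation) could be cross-validated on a matrix of motion analysis parameters. The feature matrix and labels below are synthetic placeholders, not the study's data:

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Placeholder data: 42 surgeons x 9 motion analysis parameters (MAPs),
# labeled 0 = novice, 1 = resident, 2 = expert (synthetic, for illustration).
X = rng.standard_normal((42, 9))
y = np.repeat([0, 1, 2], [16, 22, 4])

# cv=3 keeps every fold stratifiable given only 4 experts.
for name, clf in [
    ("LDA", LinearDiscriminantAnalysis()),
    ("SVM", make_pipeline(StandardScaler(), SVC(kernel="rbf"))),
]:
    scores = cross_val_score(clf, X, y, cv=3)
    print(f"{name}: mean accuracy = {scores.mean():.3f}")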