154 results for "Fantôme de calibration"


Relevance:

10.00%

Abstract:

Calibrated BOLD fMRI is a promising alternative to the classic BOLD contrast due to its reduced venous sensitivity and greater physiological specificity. The delayed adoption of this technique for cognitive studies may stem partly from a lack of information on the reproducibility of these measures in the context of cognitive tasks. In this study we explored the applicability and reproducibility of a state-of-the-art calibrated BOLD technique using a complex functional task at 7 tesla. Reproducibility measures of BOLD, CBF, CMRO2, the flow-metabolism coupling parameter n and the calibration parameter M were compared and interpreted for three ROIs. We found an average intra-subject variation in CMRO2 of 8% across runs and 33% across days. BOLD (46% across runs, 36% across days), CBF (33% across runs, 46% across days) and M (41% across days) showed significantly higher intra-subject variability. Inter-subject variability was high for all quantities, though CMRO2 was the most consistent across brain regions. The results of this study provide evidence that calibrated BOLD may be a viable alternative for longitudinal and cognitive MRI studies.
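The quantities compared above are linked, in calibrated BOLD, by the Davis model; the sketch below shows how relative CMRO2 is commonly recovered from measured BOLD and CBF changes once M is known. The parameter values used here (M, alpha, beta) are illustrative assumptions, not values from the study.

```python
# Hedged sketch of the Davis model used in calibrated BOLD:
#   dBOLD/BOLD0 = M * (1 - (CBF/CBF0)**(alpha - beta) * (CMRO2/CMRO2_0)**beta)
# Inverting it yields relative CMRO2 from measured BOLD and CBF changes.
# The alpha and beta defaults are typical literature values, not study values.
def cmro2_ratio(bold_frac_change, cbf_ratio, m, alpha=0.2, beta=1.3):
    """Return CMRO2/CMRO2_0 given dBOLD/BOLD0, CBF/CBF0 and calibration M."""
    return ((1.0 - bold_frac_change / m)
            * cbf_ratio ** (beta - alpha)) ** (1.0 / beta)
```

With no BOLD or CBF change the inversion returns a CMRO2 ratio of 1, as expected.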

Relevance:

10.00%

Abstract:

OBJECTIVE: To describe a method to obtain a profile of the duration and intensity (speed) of walking periods over 24 hours in women under free-living conditions. DESIGN: A new method based on accelerometry was designed for analyzing walking activity. To take into account the inter-individual variability of acceleration, an individual calibration process was used. Different experiments were performed to highlight the variability of the acceleration vs walking speed relationship, to analyze the speed prediction accuracy of the method, and to test the assessment of walking distance and duration over 24 h. SUBJECTS: Twenty-eight women were studied (mean ± s.d.): age 39.3 ± 8.9 y; body mass 79.7 ± 11.1 kg; body height 162.9 ± 5.4 cm; and body mass index (BMI) 30.0 ± 3.8 kg/m². RESULTS: Accelerometer output was significantly correlated with speed during treadmill walking (r=0.95, P<0.01) and short unconstrained walks (r=0.86, P<0.01), although with a large inter-individual variation of the regression parameters. By using individual calibration, it was possible to predict walking speed on a standard urban circuit (predicted vs measured r=0.93, P<0.01, s.e.e.=0.51 km/h). In the free-living experiment, women spent on average 79.9 ± 36.0 (range: 31.7-168.2) min/day in displacement activities, of which discontinuous short walking activities represented about two thirds and continuous ones one third. Total walking distance averaged 2.1 ± 1.2 (range: 0.4-4.7) km/day, performed at an average speed of 5.0 ± 0.5 (range: 4.1-6.0) km/h. CONCLUSION: An accelerometer measuring the anteroposterior acceleration of the body can estimate walking speed together with the pattern, intensity and duration of daily walking activity.
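The individual calibration step described above can be pictured as a per-subject linear fit of speed against accelerometer output, later used to predict speed from new readings. This is only a sketch of the idea; the data points and the linear form are assumptions for illustration.

```python
# Minimal sketch of per-subject accelerometer calibration: fit
# speed = a * acc + b by ordinary least squares on treadmill data,
# then predict walking speed from free-living accelerometer output.
# All data values below are invented.
def fit_line(acc, speed):
    n = len(acc)
    mx, my = sum(acc) / n, sum(speed) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(acc, speed))
         / sum((x - mx) ** 2 for x in acc))
    return a, my - a * mx  # slope, intercept

def predict_speed(model, acc_reading):
    a, b = model
    return a * acc_reading + b

# One subject's treadmill calibration (accelerometer output vs km/h):
model = fit_line([0.5, 1.0, 1.5, 2.0], [3.0, 4.0, 5.0, 6.0])
```

The large inter-individual variation reported above is exactly why the slope and intercept are re-fitted for each subject rather than pooled.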

Relevance:

10.00%

Abstract:

In forensic science, there is a strong interest in determining the post-mortem interval (PMI) of human skeletal remains up to 50 years after death. Currently, there are no reliable methods to resolve PMI, the determination of which relies almost exclusively on the experience of the investigating expert. Here we measured (90)Sr and (210)Pb ((210)Po) incorporated into bones through a biogenic process as indicators of the time elapsed since death. We hypothesised that the activity of radionuclides incorporated into trabecular bone will more accurately match the activity in the environment and the food chain at the time of death than the activity in cortical bone because of a higher remodelling rate. We found that determining (90)Sr can yield reliable PMI estimates as long as a calibration curve exists for (90)Sr covering the studied area and the last 50 years. We also found that adding the activity of (210)Po, a proxy for naturally occurring (210)Pb incorporated through ingestion, to the (90)Sr dating increases the reliability of the PMI value. Our results also show that trabecular bone is subject to both (90)Sr and (210)Po diagenesis. Accordingly, we used a solubility profile method to determine the biogenic radionuclide only, and we propose a new method of bone decontamination to be used prior to (90)Sr and (210)Pb dating.
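Dating against a (90)Sr calibration curve amounts to mapping a measured bone activity back onto the environmental activity history. The sketch below linearly interpolates an invented, decaying curve; a real curve is region-specific and the measurement would also need decay correction.

```python
# Illustrative PMI lookup against a (90)Sr calibration curve.
# curve: (year, activity) pairs sorted by year; the values are
# invented, not real environmental data for any region.
def interpolate_year(curve, activity):
    """Return the year at which the interpolated curve matches the
    measured activity, or None if the activity is off the curve."""
    for (y0, a0), (y1, a1) in zip(curve, curve[1:]):
        lo, hi = sorted((a0, a1))
        if lo <= activity <= hi:
            return y0 + (y1 - y0) * (activity - a0) / (a1 - a0)
    return None

fallout_curve = [(1960, 100.0), (1980, 60.0), (2000, 20.0)]
```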

Relevance:

10.00%

Abstract:

The research reported in this series of articles aimed at (1) automating the search of questioned ink specimens in ink reference collections and (2) evaluating the strength of ink evidence in a transparent and balanced manner. These aims require that ink samples are analysed in an accurate and reproducible way and that they are compared in an objective and automated way. This latter requirement is due to the large number of comparisons that are necessary in both scenarios. A research programme was designed to (a) develop a standard methodology for analysing ink samples in a reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in forensic contexts. This report focuses on the last of the three stages of the research programme. The calibration and acquisition process and the mathematical comparison algorithms were described in previous papers [C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science-Part I: Development of a quality assurance process for forensic ink analysis by HPTLC, Forensic Sci. Int. 185 (2009) 29-37; C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science-Part II: Development and testing of mathematical algorithms for the automatic comparison of ink samples analysed by HPTLC, Forensic Sci. Int. 185 (2009) 38-50]. In this paper, the benefits and challenges of the proposed concepts are tested in two forensic contexts: (1) ink identification and (2) ink evidential value assessment. The results show that different algorithms are better suited for different tasks. This research shows that it is possible to build digital ink libraries using the most commonly used ink analytical technique, i.e. high-performance thin layer chromatography, despite its reputation of lacking reproducibility. More importantly, it is possible to assign evidential value to ink evidence in a transparent way using a probabilistic model.
It is therefore possible to move away from the traditional subjective approach, which is entirely based on experts' opinion and is usually not very informative. While there is room for improvement, this report demonstrates the significant gains over the traditional subjective approach for the search of ink specimens in ink databases and the interpretation of their evidential value.
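As a toy stand-in for the comparison algorithms evaluated in this series, the sketch below scores two digitised HPTLC intensity profiles with a Pearson correlation; the published metrics are more elaborate, so this only illustrates the automated-comparison idea.

```python
import math

# Pearson correlation between two ink profiles of equal length
# (e.g. densitometric scans). A score near 1 suggests similar inks
# under this toy metric; the study's own algorithms differ.
def profile_similarity(p, q):
    n = len(p)
    mp, mq = sum(p) / n, sum(q) / n
    num = sum((a - mp) * (b - mq) for a, b in zip(p, q))
    den = math.sqrt(sum((a - mp) ** 2 for a in p)
                    * sum((b - mq) ** 2 for b in q))
    return num / den
```

A metric like this can be computed for every pair in an ink library, which is what makes the large-scale automated comparisons described above feasible.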

Relevance:

10.00%

Abstract:

A simple and sensitive liquid chromatography-electrospray ionization mass spectrometry method was developed for the simultaneous quantification in human plasma of all selective serotonin reuptake inhibitors (citalopram, fluoxetine, fluvoxamine, paroxetine and sertraline) and their main active metabolites (desmethyl-citalopram and norfluoxetine). A stable isotope-labeled internal standard was used for each analyte to compensate for the global method variability, including extraction and ionization variations. After sample (250 μl) pre-treatment with acetonitrile (500 μl) to precipitate proteins, a fast solid-phase extraction procedure was performed using a mixed-mode Oasis MCX 96-well plate. Chromatographic separation was achieved in less than 9.0 min on an XBridge C18 column (2.1 × 100 mm; 3.5 μm) using a gradient of ammonium acetate (pH 8.1; 50 mM) and acetonitrile as mobile phase at a flow rate of 0.3 ml/min. The method was fully validated according to Société Française des Sciences et Techniques Pharmaceutiques protocols and the latest Food and Drug Administration guidelines. Six-point calibration curves were used to cover a large concentration range: 1-500 ng/ml for citalopram, desmethyl-citalopram, paroxetine and sertraline; 1-1000 ng/ml for fluoxetine and fluvoxamine; and 2-1000 ng/ml for norfluoxetine. Good quantitative performances were achieved in terms of trueness (84.2-109.6%), repeatability (0.9-14.6%) and intermediate precision (1.8-18.0%) over the entire assay range, including the lower limit of quantification. Internal standard-normalized matrix effects were lower than 13%. The accuracy profiles (total error) were mainly included in the acceptance limits of ±30% for biological samples. The method was successfully applied to routine therapeutic drug monitoring of more than 1600 patient plasma samples over 9 months.
The β-expectation tolerance intervals determined during the validation phase were coherent with the results of quality control samples analyzed during routine use. This method is therefore precise and suitable both for therapeutic drug monitoring and pharmacokinetic studies in most clinical laboratories.
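The validation figures quoted above (trueness, repeatability) reduce to simple statistics over replicate QC measurements; the sketch below shows that arithmetic with invented QC values.

```python
import statistics

# Trueness: mean back-calculated concentration over nominal, in percent.
# Repeatability: coefficient of variation of replicates, in percent.
# The QC values used below are invented for illustration.
def trueness_pct(measured, nominal):
    return 100.0 * statistics.mean(measured) / nominal

def repeatability_cv_pct(measured):
    return 100.0 * statistics.stdev(measured) / statistics.mean(measured)
```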

Relevance:

10.00%

Abstract:

Whole-body counting is a technique of choice for assessing the intake of gamma-emitting radionuclides. An appropriate calibration is necessary, which is done either by experimental measurement or by Monte Carlo (MC) calculation. The aim of this work was to validate an MC model for calibrating whole-body counters (WBCs) by comparing the results of computations with measurements performed on an anthropomorphic phantom, and to investigate the effect of a change in the phantom's position on the WBC counting sensitivity. The GEANT MC code was used for the calculations, and an IGOR phantom loaded with several types of radionuclides was used for the experimental measurements. The results show a reasonable agreement between measurements and MC computation. A 1-cm error in phantom positioning changes the activity estimation by >2%. Considering that a 5-cm deviation in the positioning of the phantom may occur in a realistic counting scenario, the uncertainty of the activity measured by a WBC is ∼10-20%.
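The link between positioning error and activity uncertainty follows from the basic counting equation A = N / (t · ε · y): any relative error in the calibrated efficiency ε maps directly onto the reported activity. The numbers below are invented.

```python
# Activity from net counts, live time, counting efficiency and gamma
# emission yield. If a positioning shift changes the true efficiency
# by 2% while the nominal calibration is still used, the activity is
# biased by the same relative amount. All values are invented.
def activity_bq(net_counts, live_time_s, efficiency, gamma_yield):
    return net_counts / (live_time_s * efficiency * gamma_yield)

a_nominal = activity_bq(12000, 600.0, 0.020, 0.85)
a_shifted = activity_bq(12000, 600.0, 0.020 * 1.02, 0.85)  # efficiency +2%
```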

Relevance:

10.00%

Abstract:

Genetically constructed microbial biosensors for measuring organic pollutants are mostly applied in aqueous samples. Unfortunately, the detection limit of most biosensors is insufficient to detect pollutants at low but environmentally relevant concentrations. However, organic pollutants with low levels of water solubility often have significant gas-water partitioning coefficients, which in principle makes it possible to measure such compounds in the gas rather than the aqueous phase. Here we describe the first use of a microbial biosensor for measuring organic pollutants directly in the gas phase. For this purpose, we reconstructed a bioluminescent Pseudomonas putida naphthalene biosensor strain to carry the NAH7 plasmid and a chromosomally inserted gene fusion between the sal promoter and the luxAB genes. Specific calibration studies were performed with suspended and filter-immobilized biosensor cells, in aqueous solution and in the gas phase. Gas phase measurements with filter-immobilized biosensor cells in closed flasks, with a naphthalene-contaminated aqueous phase, showed that the biosensor cells can measure naphthalene effectively. The biosensor cells on the filter responded with increasing light output proportional to the naphthalene concentration added to the water phase, even though only a small proportion of the naphthalene was present in the gas phase. In fact, the biosensor cells could concentrate a larger proportion of naphthalene through the gas phase than in the aqueous suspension, probably due to faster transport of naphthalene to the cells in the gas phase. This led to a 10-fold lower detectable aqueous naphthalene concentration (50 nM instead of 0.5 μM). Thus, the use of bacterial biosensors for measuring organic pollutants in the gas phase is a valid method for increasing the sensitivity of these valuable biological devices.
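The headspace measurement works because a closed flask partitions the analyte between water and gas according to its dimensionless Henry constant H = C_gas/C_water. The sketch below computes the equilibrium gas-phase fraction; the H value for naphthalene (roughly 0.017 near room temperature) and the volumes are assumptions for illustration.

```python
# Equilibrium fraction of analyte mass in the headspace of a closed
# flask, from the dimensionless Henry constant H = Cgas/Cwater:
#   f_gas = H*Vgas / (H*Vgas + Vwater)
# H ~ 0.017 for naphthalene near room temperature (approximate value).
def gas_fraction(h_dimensionless, v_gas_l, v_water_l):
    return (h_dimensionless * v_gas_l
            / (h_dimensionless * v_gas_l + v_water_l))

f_naph = gas_fraction(0.017, 0.5, 0.1)  # 0.5 l headspace over 0.1 l water
```

Even with most of the naphthalene remaining in the water, a measurable share reaches the gas phase, consistent with the sensitivity gain reported above.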

Relevance:

10.00%

Abstract:

Evidence is accumulating that biodiversity is facing the effects of global change. The most influential drivers of change in ecosystems are land-use change, alien species invasions and climate change impacts. Accurate projections of species' responses to these changes are needed to propose mitigation measures to slow down the on-going erosion of biodiversity. Niche-based models (NBM) currently represent one of the only tools for such projections. However, their application in the context of global changes relies on restrictive assumptions, calling for cautious interpretations. In this thesis I aim to assess the effectiveness and shortcomings of niche-based models for the study of global change impacts on biodiversity, through the investigation of specific, unsolved limitations and the suggestion of new approaches. Two studies investigating threats to rare and endangered plants are presented. I review the ecogeographic characteristics of 118 endangered plants with high conservation priority in Switzerland. The prevalence of rarity types among plant species is analyzed in relation to IUCN extinction risks. The review underlines the importance of regional vs. global conservation and shows that a global assessment of rarity might be misleading for some species because it can fail to account for different degrees of rarity at a variety of spatial scales. The second study tests a modeling framework including iterative steps of modeling and field surveys to improve the sampling of rare species. The approach is illustrated with a rare alpine plant, Eryngium alpinum, and shows promise for complementing conservation practices and reducing sampling costs. Two studies illustrate the impacts of climate change on African taxa.
The first study assesses the sensitivity of 277 mammals at the African scale to climate change by 2050, in terms of species richness and turnover. It shows that a substantial number of species could be critically endangered in the future, and that national parks situated in xeric ecosystems are not expected to meet their mandate of protecting current species diversity in the future. The second study models the distribution in 2050 of 975 endemic plant species in southern Africa. The study proposes the inclusion of new methodological insights improving the accuracy and ecological realism of predictions in global change studies. It also investigates the possibility of estimating a priori the sensitivity of a species to climate change from its geographical distribution and ecological properties. Three studies illustrate the application of NBM in the study of biological invasions. The first investigates the northward expansion of Lactuca serriola L. in Europe during the last 250 years in relation to climate change; in the last two decades, the species could not track climate change due to non-climatic influences. A second study analyses the potential invasion extent of spotted knapweed (Centaurea maculosa), a European weed first introduced into North America in the 1890s. The study provides some of the first empirical evidence that an invasive species can occupy a climatically distinct niche space following its introduction into a new area. Models fail to predict the current full extent of the invasion, but correctly predict areas of introduction. An alternative approach, involving the calibration of models with pooled data from both ranges, is proposed to improve predictions of the extent of invasion over models based solely on the native range. I finally present a review of the dynamic nature of ecological niches in space and time.
It synthesizes recent theoretical developments on the niche conservatism issue and proposes solutions to improve confidence in NBM predictions of the impacts of climate change and species invasions on species distributions.

Relevance:

10.00%

Abstract:

Purpose: To assess the global cardiovascular (CV) risk of an individual, several scores have been developed. However, their accuracy and comparability need to be evaluated in populations other than those from which they were derived. The aim of this study was to compare the predictive accuracy of 4 CV risk scores using data from a large population-based cohort. Methods: Prospective cohort study including 4980 participants (2698 women; mean age ± SD: 52.7 ± 10.8 years) in Lausanne, Switzerland, followed for an average of 5.5 years (range 0.2-8.5). Two end points were assessed: 1) coronary heart disease (CHD), and 2) CV diseases (CVD). Four risk scores were compared: the original and recalibrated Framingham coronary heart disease scores (1998 and 2001); the original PROCAM score (2002) and its recalibrated version for Switzerland (IAS-AGLA); and the Reynolds risk score. Discrimination was assessed using Harrell's C statistic, model fitness using Akaike's information criterion (AIC) and calibration using a pseudo Hosmer-Lemeshow test. The sensitivity, specificity and corresponding 95% confidence intervals were assessed for each risk score using the highest risk category (≥20% at 10 years) as the "positive" test. Results: The recalibrated and original 1998 and original 2001 Framingham scores show better discrimination (>0.720) and model fitness (low AIC) for CHD and CVD. All 4 scores are correctly calibrated (Chi2 < 20). The recalibrated Framingham 1998 score has the best sensitivities, 37.8% and 40.4%, for CHD and CVD, respectively. All scores present specificities >90%. The Framingham 1998, PROCAM and IAS-AGLA scores include the greatest proportion of subjects (>200) in the high-risk category, whereas the recalibrated Framingham 2001 and Reynolds scores include ≤44 subjects. Conclusion: In this cohort, we see variations in accuracy between risk scores, with the original Framingham 2001 score demonstrating the best compromise between accuracy and a limited selection of subjects in the highest risk category.
We advocate that national guidelines, based on independently validated data, take into account CV risk scores calibrated for their respective countries.
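The sensitivity and specificity reported above treat membership in the highest risk category as a positive test against observed events. The sketch below shows that computation on invented data.

```python
# Sensitivity/specificity of a risk score, using the highest risk
# category (>=20% 10-year risk) as the "positive" test. Inputs are
# parallel lists of booleans; the example data are invented.
def sens_spec(predicted_high, event):
    tp = sum(p and e for p, e in zip(predicted_high, event))
    fn = sum((not p) and e for p, e in zip(predicted_high, event))
    tn = sum((not p) and (not e) for p, e in zip(predicted_high, event))
    fp = sum(p and (not e) for p, e in zip(predicted_high, event))
    return tp / (tp + fn), tn / (tn + fp)
```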

Relevance:

10.00%

Abstract:

The physical disector is a method of choice for estimating unbiased neuron numbers; nevertheless, calibration is needed to evaluate each counting method. The validity of the method can be assessed by comparing the estimated cell number with the true number determined by direct counting in serial sections. We reconstructed one fifth of rat lumbar dorsal root ganglia taken from two experimental conditions. From each ganglion, images of 200 adjacent semi-thin sections were used to reconstruct a volumetric dataset (stack of voxels). On these stacks, the number of sensory neurons was estimated by the physical disector method and counted by the direct counting method. In addition, using the coordinates of nuclei from the direct counting, we simulated disector pairs separated by increasing distances in a ganglion model with a Matlab program. The comparison between the results of these approaches clearly demonstrates that the physical disector method provides a valid and reliable estimate of the number of sensory neurons only when the distance between consecutive disector pairs is 60 μm or smaller. In these conditions the error between the results of the physical disector and direct counting does not exceed 6%. In contrast, when the distance between two pairs is larger than 60 μm (70-200 μm) the error increases rapidly, up to 27%. We conclude that the physical disector method provides a reliable estimate of the number of rat sensory neurons only when the separating distance between consecutive disector pairs is no larger than 60 μm.
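The estimation side of this comparison rests on simple fractionator-style scaling: neurons counted in the sampled disector pairs are divided by the sampling fraction to estimate the total. The counts and fraction below are invented.

```python
# Minimal sketch of the scaling behind a physical disector estimate:
# total = (neurons counted in sampled pairs) / (sampling fraction).
# The counts and sampling fraction are invented for illustration.
def disector_estimate(counts_per_pair, sampling_fraction):
    return sum(counts_per_pair) / sampling_fraction

est = disector_estimate([4, 6, 5, 5], 0.1)  # every 10th disector pair used
```

Increasing the distance between consecutive pairs lowers the sampling fraction, which is one way to picture why the estimates above degrade beyond 60 μm.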

Relevance:

10.00%

Abstract:

A sensitive and selective ultra-high performance liquid chromatography (UHPLC) tandem mass spectrometry (MS/MS) method was developed for the fast quantification of ten psychotropic drugs and metabolites in human plasma for the needs of our laboratory (amisulpride, asenapine, desmethyl-mirtazapine, iloperidone, mirtazapine, norquetiapine, olanzapine, paliperidone, quetiapine and risperidone). Stable isotope-labeled internal standards were used for all analytes to compensate for the global method variability, including extraction and ionization variations. Sample preparation was performed by generic protein precipitation with acetonitrile. Chromatographic separation was achieved in less than 3.0 min on an Acquity UPLC BEH Shield RP18 column (2.1 mm × 50 mm; 1.7 μm), using gradient elution with 10 mM ammonium formate buffer pH 3.0 and acetonitrile at a flow rate of 0.4 ml/min. The compounds were quantified on a tandem quadrupole mass spectrometer operating in positive electrospray ionization mode, using multiple reaction monitoring. The method was fully validated according to the latest recommendations of international guidelines. Eight-point calibration curves were used to cover a large concentration range: 0.5-200 ng/ml for asenapine, desmethyl-mirtazapine, iloperidone, mirtazapine, olanzapine, paliperidone and risperidone, and 1-1500 ng/ml for amisulpride, norquetiapine and quetiapine. Good quantitative performances were achieved in terms of trueness (93.1-111.2%), repeatability (1.3-8.6%) and intermediate precision (1.8-11.5%). Internal standard-normalized matrix effects ranged between 95 and 105%, with a variability never exceeding 6%. The accuracy profiles (total error) were included in the acceptance limits of ±30% for biological samples. This method is therefore suitable for both therapeutic drug monitoring and pharmacokinetic studies.
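The internal-standard-normalised matrix effect quoted above is conventionally the analyte/IS response ratio in post-extraction spiked matrix divided by the same ratio in neat solution, expressed in percent. The peak areas below are invented.

```python
# IS-normalised matrix effect in percent: values near 100% indicate
# that ion suppression or enhancement is compensated by the co-eluting
# stable isotope-labeled internal standard. Peak areas are invented.
def is_normalised_matrix_effect(analyte_matrix, istd_matrix,
                                analyte_neat, istd_neat):
    return (100.0 * (analyte_matrix / istd_matrix)
            / (analyte_neat / istd_neat))
```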

Relevance:

10.00%

Abstract:

We examined phylogenetic relationships among six species representing three subfamilies, Glirinae, Graphiurinae and Leithiinae, with sequences from three nuclear protein-coding genes (apolipoprotein B, APOB; interphotoreceptor retinoid-binding protein, IRBP; recombination-activating gene 1, RAG1). Phylogenetic trees reconstructed from maximum-parsimony (MP), maximum-likelihood (ML) and Bayesian-inference (BI) analyses showed the monophyly of Glirinae (Glis and Glirulus) and Leithiinae (Dryomys, Eliomys and Muscardinus) with strong support, although the branch length supporting this relationship was very short, implying rapid diversification among the three subfamilies. Divergence times were estimated with ML (local clock model) and a Bayesian dating method, using calibration points of 25 Myr (million years) ago for the divergence between Glis and Glirulus and 55 Myr ago for the split between the Gliridae and Sciuridae lineages, on the basis of fossil records. The results showed that each lineage of Graphiurus, Glis, Glirulus and Muscardinus dates from the Late Oligocene to the Early Miocene, mostly in agreement with fossil records. Given that a warm climate harbouring glirid-favoured forests dominated from Europe to Asia during this period, this warm environment likely triggered the prosperity of the glirid species through rapid diversification. Glirulus japonicus is suggested to be a relict of this ancient diversification during the warm period.
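The fossil calibration converts branch depths (substitutions per site) into ages: the calibration node fixes a rate, which then rescales every other depth. The sketch below shows this local-clock arithmetic with invented depths.

```python
# Toy molecular-clock dating: a calibration node of known age fixes
# the substitution rate, which converts other node depths to ages in
# Myr. The node depths below are invented, not values from the study.
def ages_from_calibration(node_depths, calib_depth, calib_age_myr):
    rate = calib_depth / calib_age_myr  # substitutions/site per Myr
    return {node: depth / rate for node, depth in node_depths.items()}

ages = ages_from_calibration({"glis_glirulus": 0.10, "other_node": 0.05},
                             calib_depth=0.10, calib_age_myr=25.0)
```

Real Bayesian dating additionally propagates rate and calibration uncertainty, which a single-rate rescaling like this ignores.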

Relevance:

10.00%

Abstract:

New isotopic results on bulk carbonate and mollusc (gastropod and bivalve) samples from Lake Geneva (Switzerland), spanning the period from the Oldest Dryas to the present day, are compared with pre-existing stable isotope data. According to a preliminary calibration of modern samples, Lake Geneva endogenic calcite precipitates at or near oxygen isotopic equilibrium with ambient water, confirming the potential of this large lake to record paleoenvironmental and paleoclimatic changes. The onset of endogenic calcite precipitation at the beginning of the Allerod biozone is clearly indicated by the oxygen isotopic signature of bulk carbonate. A large change in delta(13)C values occurs during the Preboreal. This carbon shift is likely due to a change in bioproductivity and/or to a 'catchment effect', the contribution of biogenic CO2 from the catchment area to the dissolved inorganic carbon reservoir of the lake water becoming significant only during the Preboreal. Gastropods are confirmed as valuable for studies of changes in paleotemperature and in paleowater isotopic composition, despite the presence of a vital effect. Mineralogical evidence indicates an increased detrital influence on sedimentation since the Subboreal time period. On the other hand, stable isotope measurements of Subatlantic carbonate sediments show values comparable to those of pure endogenic calcite and of gastropods (taking the vital effect into account). This apparent disagreement remains difficult to explain.
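Recovering paleotemperature from such records relies on a calcite-water oxygen-isotope fractionation equation; one commonly quoted form is sketched below (coefficients differ slightly between published calibrations). The delta values are invented, and real use requires knowing the water's isotopic composition.

```python
# One commonly quoted calcite-water oxygen-isotope paleotemperature
# equation (coefficients vary slightly between calibrations):
#   T(degC) = 16.9 - 4.2*(dc - dw) + 0.13*(dc - dw)**2
# dc: delta18O of calcite, dw: delta18O of water, on a common scale.
def calcite_temperature_c(delta18o_calcite, delta18o_water):
    d = delta18o_calcite - delta18o_water
    return 16.9 - 4.2 * d + 0.13 * d * d
```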