978 results for Must -- Analysis


Relevance:

30.00%

Publisher:

Abstract:

The nematode Caenorhabditis elegans is a well-known model organism used to investigate fundamental questions in biology. Motility assays of this small roundworm are designed to study the relationships between genes and behavior. Commonly, motility analysis is used to classify nematode movements and characterize them quantitatively. Over the past years, C. elegans' motility has been studied across a wide range of environments, including crawling on substrates, swimming in fluids, and locomoting through microfluidic substrates. However, each environment often requires customized image processing tools relying on heuristic parameter tuning. In the present study, we propose a novel Multi-Environment Model Estimation (MEME) framework for automated image segmentation that is versatile across various environments. The MEME platform is constructed around the concept of Mixture of Gaussians (MOG) models, where statistical models for both the background environment and the nematode appearance are explicitly learned and used to accurately segment a target nematode. Our method is designed to reduce the burden often imposed on users; here, only a single image which includes a nematode in its environment must be provided for model learning. In addition, our platform enables the extraction of nematode ‘skeletons’ for straightforward motility quantification. We test our algorithm on various locomotive environments and compare performances with an intensity-based thresholding method. Overall, MEME outperforms the threshold-based approach for the overwhelming majority of cases examined. Ultimately, MEME provides researchers with an attractive platform for C. elegans' segmentation and ‘skeletonizing’ across a wide range of motility assays.
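The MOG idea at the heart of MEME can be illustrated with a minimal sketch: fit a two-component Gaussian mixture to pixel intensities and keep the darker component as the worm. This is a toy illustration on synthetic data, not the authors' MEME implementation; all names and parameters here are my own assumptions.

```python
# A minimal sketch of MOG-based segmentation, assuming a dark worm on a
# bright background; synthetic data, NOT the authors' MEME implementation.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic "frame": bright background with a dark, worm-like stripe.
frame = rng.normal(200.0, 10.0, size=(64, 64))
frame[30:34, 10:50] = rng.normal(60.0, 8.0, size=(4, 40))

# Fit a two-component Gaussian mixture to the pixel intensities.
gmm = GaussianMixture(n_components=2, random_state=0)
labels = gmm.fit_predict(frame.reshape(-1, 1)).reshape(frame.shape)

# The darker component is taken to be the nematode.
worm_label = int(np.argmin(gmm.means_.ravel()))
mask = labels == worm_label
print(mask.sum())  # pixels assigned to the worm region
```

A real pipeline would fit the models to image patches rather than single intensities and then thin the resulting mask to a skeleton for motility quantification.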

Relevance:

30.00%

Publisher:

Abstract:

When a firearm projectile hits a biological target, a spray of biological material (e.g., blood and tissue fragments) can be propelled from the entrance wound back towards the firearm. This phenomenon has become known as "backspatter"; in contact shots or shots fired from short distances, traces of backspatter may reach, consolidate on, and be recovered from the inside surfaces of the firearm. Thus, a comprehensive investigation of firearm-related crimes must comprise not only wound ballistic assessment but also backspatter analysis, and may even take into account potential correlations between these phenomena. The aim of the present study was to evaluate and expand the applicability of the "triple contrast" method by probing its compatibility with forensic analysis of nuclear and mitochondrial DNA and the simultaneous investigation of co-extracted mRNA and miRNA from backspatter collected from internal components of different types of firearms after experimental shootings. We demonstrate that "triple contrast" stained biological samples collected from the inside surfaces of firearms are amenable to forensic co-analysis of DNA and RNA and permit sequence analysis of the entire mtDNA displacement loop, even for "low template" DNA amounts that preclude standard short tandem repeat DNA analysis. Our findings underscore the "triple contrast" method's usefulness as a research tool in experimental forensic ballistics.

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE To assess whether palliative primary tumor resection in colorectal cancer patients with incurable stage IV disease is associated with improved survival. BACKGROUND There is a heated debate regarding whether or not an asymptomatic primary tumor should be removed in patients with incurable stage IV colorectal disease. METHODS Stage IV colorectal cancer patients were identified in the Surveillance, Epidemiology, and End Results database between 1998 and 2009. Patients undergoing surgery to metastatic sites were excluded. Overall survival and cancer-specific survival were compared between patients with and without palliative primary tumor resection using risk-adjusted Cox proportional hazard regression models and stratified propensity score methods. RESULTS Overall, 37,793 stage IV colorectal cancer patients were identified. Of those, 23,004 (60.9%) underwent palliative primary tumor resection. The rate of patients undergoing palliative primary cancer resection decreased from 68.4% in 1998 to 50.7% in 2009 (P < 0.001). In Cox regression analysis after propensity score matching, primary cancer resection was associated with significantly improved overall survival [hazard ratio (HR) of death = 0.40, 95% confidence interval (CI) = 0.39-0.42, P < 0.001] and cancer-specific survival (HR of death = 0.39, 95% CI = 0.38-0.40, P < 0.001). The benefit of palliative primary cancer resection persisted during the time period 1998 to 2009, with HRs equal to or less than 0.47 for both overall and cancer-specific survival. CONCLUSIONS On the basis of this population-based cohort of stage IV colorectal cancer patients, palliative primary tumor resection was associated with improved overall and cancer-specific survival. Therefore, the dogma that an asymptomatic primary tumor should never be resected in patients with unresectable colorectal cancer metastases must be questioned.
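The survival comparison behind these hazard ratios rests on standard survival-analysis machinery. As a hedged illustration with toy follow-up numbers (not the SEER data), a Kaplan-Meier estimator can be sketched in a few lines:

```python
# Minimal Kaplan-Meier sketch of how overall survival can be compared
# between two groups; toy follow-up data, NOT the SEER cohort.
def kaplan_meier(times, events):
    """Return (event_times, survival_probs); events: 1=death, 0=censored."""
    pairs = sorted(zip(times, events))
    n = len(pairs)
    at_risk, surv = n, 1.0
    out_t, out_s = [], []
    i = 0
    while i < n:
        t = pairs[i][0]
        deaths = censored = 0
        while i < n and pairs[i][0] == t:
            if pairs[i][1]:
                deaths += 1
            else:
                censored += 1
            i += 1
        if deaths:
            surv *= (at_risk - deaths) / at_risk
            out_t.append(t)
            out_s.append(surv)
        at_risk -= deaths + censored
    return out_t, out_s

# Hypothetical months of follow-up (event=1 means death observed).
resected = ([5, 9, 14, 20, 26, 30], [1, 1, 0, 1, 0, 1])
times, survival = kaplan_meier(*resected)
print(times, [round(s, 3) for s in survival])
```

A Cox model generalizes this comparison by adjusting for covariates; the study additionally used propensity-score stratification to address confounding by indication.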

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND Panic disorder is characterised by the presence of recurrent unexpected panic attacks, discrete periods of fear or anxiety that have a rapid onset and include symptoms such as racing heart, chest pain, sweating and shaking. Panic disorder is common in the general population, with a lifetime prevalence of 1% to 4%. A previous Cochrane meta-analysis suggested that psychological therapy (either alone or combined with pharmacotherapy) can be chosen as a first-line treatment for panic disorder with or without agoraphobia. However, it is not yet clear whether certain psychological therapies can be considered superior to others. In order to answer this question, in this review we performed a network meta-analysis (NMA), in which we compared eight different forms of psychological therapy and three forms of a control condition. OBJECTIVES To assess the comparative efficacy and acceptability of different psychological therapies and different control conditions for panic disorder, with or without agoraphobia, in adults. SEARCH METHODS We conducted the main searches in the CCDANCTR electronic databases (studies and references registers), all years to 16 March 2015. We conducted complementary searches in PubMed and trials registries. Supplementary searches included reference lists of included studies, citation indexes, personal communication to the authors of all included studies and grey literature searches in OpenSIGLE. We applied no restrictions on date, language or publication status. SELECTION CRITERIA We included all relevant randomised controlled trials (RCTs) focusing on adults with a formal diagnosis of panic disorder with or without agoraphobia. We considered the following psychological therapies: psychoeducation (PE), supportive psychotherapy (SP), physiological therapies (PT), behaviour therapy (BT), cognitive therapy (CT), cognitive behaviour therapy (CBT), third-wave CBT (3W) and psychodynamic therapies (PD). 
We included both individual and group formats. Therapies had to be administered face-to-face. The comparator interventions considered for this review were: no treatment (NT), wait list (WL) and attention/psychological placebo (APP). For this review we considered four short-term (ST) outcomes (ST-remission, ST-response, ST-dropouts, ST-improvement on a continuous scale) and one long-term (LT) outcome (LT-remission/response). DATA COLLECTION AND ANALYSIS As a first step, we conducted a systematic search of all relevant papers according to the inclusion criteria. For each outcome, we then constructed a treatment network in order to clarify the extent to which each type of therapy and each comparison had been investigated in the available literature. Then, for each available comparison, we conducted a random-effects meta-analysis. Subsequently, we performed a network meta-analysis in order to synthesise the available direct evidence with indirect evidence, and to obtain an overall effect size estimate for each possible pair of therapies in the network. Finally, we calculated a probabilistic ranking of the different psychological therapies and control conditions for each outcome. MAIN RESULTS We identified 1432 references; after screening, we included 60 studies in the final qualitative analyses. Among these, 54 (including 3021 patients) were also included in the quantitative analyses. With respect to the analyses for the first of our primary outcomes (short-term remission), the most studied of the included psychological therapies was CBT (32 studies), followed by BT (12 studies), PT (10 studies), CT (three studies), SP (three studies) and PD (two studies). The quality of the evidence for the entire network was found to be low for all outcomes. The quality of the evidence for CBT vs NT, CBT vs SP and CBT vs PD was low to very low, depending on the outcome. The majority of the included studies were at unclear risk of bias with regard to the randomisation process.
We found almost half of the included studies to be at high risk of attrition bias and detection bias. We also found selective outcome reporting bias to be present and we strongly suspected publication bias. Finally, we found almost half of the included studies to be at high risk of researcher allegiance bias. Overall, the networks appeared to be well connected, but were generally underpowered to detect any important disagreement between direct and indirect evidence. The results showed the superiority of psychological therapies over the WL condition, although this finding was amplified by evident small study effects (SSE). The NMAs for ST-remission, ST-response and ST-improvement on a continuous scale showed well-replicated evidence in favour of CBT, as well as some sparse but relevant evidence in favour of PD and SP, over other therapies. In terms of ST-dropouts, PD and 3W showed better tolerability than other psychological therapies in the short term. In the long term, CBT and PD showed the highest level of remission/response, suggesting that the effects of these two treatments may be more stable with respect to other psychological therapies. However, all the mentioned differences among active treatments must be interpreted while taking into account that in most cases the effect sizes were small and/or results were imprecise. AUTHORS' CONCLUSIONS There is no high-quality, unequivocal evidence to support one psychological therapy over the others for the treatment of panic disorder with or without agoraphobia in adults. However, the results show that CBT - the most extensively studied among the included psychological therapies - was often superior to other therapies, although the effect size was small and the level of precision was often insufficient or clinically irrelevant.
In the only two studies available that explored PD, this treatment showed promising results, although further research is needed in order to better explore the relative efficacy of PD with respect to CBT. Furthermore, PD appeared to be the best tolerated (in terms of ST-dropouts) among psychological treatments. Unexpectedly, we found some evidence in support of the possible viability of non-specific supportive psychotherapy for the treatment of panic disorder; however, the results concerning SP should be interpreted cautiously because of the sparsity of evidence regarding this treatment and, as in the case of PD, further research is needed to explore this issue. Behaviour therapy did not appear to be a valid alternative to CBT as a first-line treatment for patients with panic disorder with or without agoraphobia.
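The pairwise random-effects step that precedes the network synthesis can be sketched as DerSimonian-Laird pooling. The effect sizes and variances below are invented for illustration and do not come from the review:

```python
# DerSimonian-Laird random-effects pooling, the pairwise step that
# precedes the network synthesis. Effect sizes and variances below are
# invented for illustration and do not come from the review.
import math

def dersimonian_laird(effects, variances):
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)  # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, se, tau2

# Hypothetical log odds ratios of short-term remission (CBT vs wait list).
effects = [0.8, 1.1, 0.5, 0.9]
variances = [0.10, 0.15, 0.08, 0.12]
pooled, se, tau2 = dersimonian_laird(effects, variances)
print(round(pooled, 3), round(se, 3), round(tau2, 3))
```

The network stage then combines such direct estimates with indirect ones (e.g., comparing CBT and PD through a shared comparator such as WL) to obtain an effect estimate for every pair of therapies.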

Relevance:

30.00%

Publisher:

Abstract:

Diamonds are known for both their beauty and their durability. Jefferson National Lab in Newport News, VA has found a way to utilize the diamond's strength to view the beauty of the inside of the atomic nucleus with the hopes of finding exotic forms of matter. By firing very fast electrons at a diamond sheet no thicker than a human hair, high energy particles of light known as photons are produced with a high degree of polarization that can illuminate the constituents of the nucleus known as quarks. The University of Connecticut Nuclear Physics group has responsibility for crafting these extremely thin, high quality diamond wafers. These wafers must be cut from larger stones that are about the size of a human finger, and then carefully machined down to the final thickness. The thinning of these diamonds is extremely challenging, as the diamond's greatest strength also becomes its greatest weakness. The Connecticut Nuclear Physics group has developed a novel technique to assist industrial partners in assessing the quality of the final machining steps, using a technique based on laser interferometry. The images of the diamond surface produced by the interferometer encode the thickness and shape of the diamond surface in a complex way that requires detailed analysis to extract. We have developed a novel software application to analyze these images based on the method of simulated annealing. Being able to image the surface of these diamonds without requiring costly X-ray diffraction measurements allows rapid feedback to the industrial partners as they refine their thinning techniques. Thus, by utilizing a material found to be beautiful by many, the beauty of nature can be brought more clearly into view.
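The simulated-annealing idea used in the image-analysis software can be sketched generically: propose random perturbations, always accept improvements, and accept occasional uphill moves with a temperature-controlled probability. The toy objective below stands in for the group's actual interferogram surface model, which is not described in detail here:

```python
# Toy simulated-annealing minimiser illustrating the optimisation idea;
# the objective is a synthetic rugged function, not the group's actual
# interferogram surface model.
import math
import random

def anneal(objective, x0, steps=20000, t0=1.0, seed=1):
    rng = random.Random(seed)
    x = best = x0
    fx = fbest = objective(x0)
    for k in range(steps):
        t = t0 * (1 - k / steps) + 1e-9    # linear cooling schedule
        cand = x + rng.uniform(-0.5, 0.5)  # random local perturbation
        fc = objective(cand)
        # Always accept improvements; accept uphill moves with a
        # temperature-dependent (Metropolis) probability.
        if fc < fx or rng.random() < math.exp((fx - fc) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x, fx
    return best, fbest

# Rugged objective whose global minimum sits near x = 2.
def f(x):
    return (x - 2.0) ** 2 + 0.3 * math.sin(15.0 * x)

best, fbest = anneal(f, x0=-5.0)
print(round(best, 2), round(fbest, 3))
```

The appeal of annealing for fitting fringe images is exactly this tolerance of rugged, many-minima objectives, where gradient methods tend to stall in the nearest local dip.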

Relevance:

30.00%

Publisher:

Abstract:

Cancer is the second leading cause of death in the United States. With the advent of new technologies, changes in health care delivery, and multiplicity of provider types that patients must see, cancer care management has become increasingly complex. The availability of cancer health information has been shown to help cancer patients cope with the management and effects of their cancers. As a result, more cancer patients are using the internet to find resources that can aid in decision-making and recovery.

The Health Information National Trends Survey (HINTS) is a nationally representative survey designed to collect information about the experiences of cancer and non-cancer adults with health information sources. The HINTS survey focused on both conventional sources as well as newer technologies, particularly the internet. This study is a descriptive analysis of the HINTS 2003 and HINTS 2005 survey data. The purpose of the research is to explore the general trends in health information seeking and use by US adults, and especially by cancer patients.

From 2003 to 2005, internet use for various health-related activities appears to have increased among adults with and without cancer. Differences were found between the groups in the general trust in information media, particularly the internet. Non-cancer respondents tended to have greater trust in information media than cancer respondents.

The latter portion of this work examined characteristics of HINTS respondents that were thought to be relevant to how much trust individuals placed in the internet as a source of health information. Trust in health information from the internet was significantly greater among younger adults, higher-earning households, internet users, online seekers of health or cancer information, and those who found online cancer information useful.

Relevance:

30.00%

Publisher:

Abstract:

Currently, the barriers to appropriate infant feeding practices are largely unknown in the Central River Division of the Gambia. A questionnaire was developed and implemented by a local Non-Governmental Organization (NGO), the Gambia Food and Nutrition Agency, in order to gain more information and ultimately to improve the child mortality rate of the country. There were two participant groups: 88 Doers, women who had adopted the appropriate complementary feeding practice guidelines as defined by the World Health Organization, and 87 Non Doers, women who had in some way strayed from the appropriate complementary feeding practice guidelines. The questionnaire included aspects of the Health Belief Model which can be used in the development of a future intervention. The Yes/No questions were analyzed using the Chi-square statistical method and the open-ended questions used a descriptive analysis method of evaluation. The constructs for perceived susceptibility, perceived action efficacy, perceived self efficacy, cues for action and perception of divine will showed significant differences between the Doers and the Non Doers (p<0.05). The descriptive analysis revealed that both participant groups had a limited understanding of the preventative qualities of the adoption of appropriate complementary feeding practices. The women in both groups also showed a strong perception of divine will. Women in the Central River Division perceive their husband and in-laws to be the most influential in the decision-making process regarding infant feeding practices. Recommendations for future interventions must acknowledge the importance and influence of the community surrounding the women in their adoption of the appropriate infant feeding practices. It would also be important to educate women about the specific guidelines of the appropriate complementary feeding practices, specifically the delay in early initiation of complementary feeding. The results of this barrier analysis provide useful information to plan and implement an effective intervention to improve the child mortality rate in the Gambia.
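The chi-square comparison of Doers and Non Doers can be sketched by hand for a single 2x2 question. The counts below are hypothetical; only the group sizes (88 Doers, 87 Non Doers) come from the study:

```python
# Hand-computed 2x2 chi-square test of independence, sketching the
# Doer / Non Doer comparison. Counts are hypothetical; only the group
# sizes (88 Doers, 87 Non Doers) come from the study.
def chi_square_2x2(a, b, c, d):
    """Rows: group; columns: Yes/No. Returns the chi-square statistic."""
    n = a + b + c + d
    observed = [a, b, c, d]
    expected = [
        (a + b) * (a + c) / n, (a + b) * (b + d) / n,
        (c + d) * (a + c) / n, (c + d) * (b + d) / n,
    ]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Hypothetical: 60/88 Doers vs 35/87 Non Doers answered "Yes".
stat = chi_square_2x2(60, 28, 35, 52)
print(round(stat, 2), stat > 3.841)  # 3.841 = critical value, df=1, p=0.05
```

With one degree of freedom, a statistic above 3.841 corresponds to p < 0.05, the significance threshold the study reports.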

Relevance:

30.00%

Publisher:

Abstract:

The Americans with Disabilities Act (ADA) of 1990 was created to prohibit discrimination against disabled persons in our society. The goal of the ADA as a comprehensive civil rights law is to "ensure equal opportunity and complete participation, independent living and economic self-sufficiency" for disabled persons (U.S. Department of Justice, 2008). As part of Title II and III of the ADA, states and local governments are required to provide people with disabilities the same chance to engage in and benefit from all programs and services including recreational facilities and activities as every other citizen. Recreational facilities and related structures must comply with accessibility standards when creating new structures or renovating existing ones. Through a systematic literature review of articles accessed through online databases, articles relating to children with disabilities, their quality of life and their experience gained through play were reviewed, analyzed and synthesized. Additionally, the ADA's Final Rule regarding accessible playgrounds was evaluated through a descriptive analysis which yielded the following five components relating the importance of barrier-free playgrounds to children with disabilities: appropriate dimensions for children, integration of the play area, variety of activity and stimulation, availability of accessible play structures to communities, and financial feasibility. These components were used as evaluation criteria to investigate the degree to which the ADA's Final Rule document met these criteria. 
An evaluation of two federal funding sources, the Urban Parks and Recreation Renewal Program (UPARR) and the Land and Water Conservation Fund (LWCF), was also conducted, which revealed three components relating to the two programs' ability to support the realization of the ADA's Final Rule: current budget for the program, ability of local communities to attain funds, and level of ADA compliance required to receive funding. The evaluation largely concluded that the Final Rule is adequate for the development of barrier-free playgrounds, although some portions of the guidelines would benefit from further elucidation. Neither funding program was found to adequately support the development of barrier-free playgrounds; it was therefore recommended that their funding be re-instated or increased as necessary.

Relevance:

30.00%

Publisher:

Abstract:

Background. Childhood immunization programs have dramatically reduced the morbidity and mortality associated with vaccine-preventable diseases. Proper documentation of immunizations that have been administered is essential to prevent duplicate immunization of children. To help improve documentation, immunization information systems (IISs) have been developed. IISs are comprehensive repositories of immunization information for children residing within a geographic region. The two models for participation in an IIS are voluntary inclusion, or "opt-in," and voluntary exclusion, or "opt-out." In an opt-in system, consent must be obtained for each participant; conversely, in an opt-out IIS, all children are included unless procedures to exclude the child are completed. Consent requirements for participation vary by state; the Texas IIS, ImmTrac, is an opt-in system.

Objectives. The specific objectives are to: (1) Evaluate the variance among the time and costs associated with collecting ImmTrac consent at public and private birthing hospitals in the Greater Houston area; (2) Estimate the total costs associated with collecting ImmTrac consent at selected public and private birthing hospitals in the Greater Houston area; (3) Describe the alternative opt-out process for collecting ImmTrac consent at birth and discuss the associated cost savings relative to an opt-in system.

Methods. Existing time-motion studies (n=281) conducted between October 2006 and August 2007 at 8 birthing hospitals in the Greater Houston area were used to assess the time and costs associated with obtaining ImmTrac consent at birth. All data analyzed are deidentified and contain no personal information. Variations in time and costs at each location were assessed and total costs per child and costs per year were estimated. The cost of an alternative opt-out system was also calculated.

Results. The median time required by birth registrars to complete consent procedures varied from 72-285 seconds per child. The annual costs associated with obtaining consent for 388,285 newborns in ImmTrac's opt-in consent process were estimated at $702,000. The corresponding costs of the proposed opt-out system were estimated to total $194,000 per year.

Conclusions. Substantial variation in the time and costs associated with completion of ImmTrac consent procedures was observed. Changing to an opt-out system for participation could represent significant cost savings.
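The way consent-collection costs scale with registrar time per child can be sketched with simple arithmetic. The wage rate below is a hypothetical assumption of mine; only the birth count and the median times come from the study:

```python
# Back-of-the-envelope sketch of consent-collection cost as a function of
# registrar time per child. The wage rate is a hypothetical assumption,
# not a figure from the study; the birth count and times are from it.
def annual_cost(seconds_per_child, births_per_year, hourly_wage):
    hours = seconds_per_child * births_per_year / 3600.0
    return hours * hourly_wage

births = 388_285   # newborns per year (from the study)
wage = 20.0        # assumed fully loaded registrar wage, USD/hour
low = annual_cost(72, births, wage)    # fastest observed median time
high = annual_cost(285, births, wage)  # slowest observed median time
print(round(low), round(high))
```

The study's own estimates were derived from the observed time-motion data rather than a single assumed wage, but the sketch shows why the per-child time range translates into a severalfold spread in annual cost.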

Relevance:

30.00%

Publisher:

Abstract:

This dissertation develops and tests through path analysis a theoretical model to explain how socioeconomic, socioenvironmental, and biologic risk factors simultaneously influence each other to produce short-term, depressed growth in preschoolers. Three areas of risk factors were identified: the child's proximal environment, maturational stage, and biological vulnerability. The theoretical model represented both the conceptual framework and the nature and direction of the hypotheses. Original research completed in 1978-80 and in 1982 provided the background data. It was analyzed first by nested analysis of variance, followed by path analysis. The study provided evidence of mild iron deficiency and gastrointestinal symptomatology in the etiology of depressed, short-term weight gain. Also, there was evidence suggesting that family resources for material and social survival significantly contribute to the variability of short-term, age-adjusted growth velocity. These results challenge current views of unifocal intervention, whether for prevention or control. For policy formulation, though, the mechanisms underlying any set of interlaced relationships must be decoded. The theoretical formulations proposed here should be reassessed under a more extensive research design. It is suggested that studies should be undertaken where social changes are actually in progress; otherwise, nutritional epidemiology in developing countries operates somewhere between social reality and research concepts, with little grasp of its real potential. The study stresses that there is a connection between substantive theory, empirical observation, and policy issues.
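The logic of path analysis, decomposing an association into direct and indirect paths estimated by a series of regressions, can be sketched on simulated data. The variable names and coefficients below are illustrative only, not the dissertation's model:

```python
# Minimal path-analysis sketch: direct and indirect effects in a simple
# mediation chain (resources -> morbidity -> growth), estimated by OLS.
# Data are simulated; names and coefficients are illustrative only.
import numpy as np

rng = np.random.default_rng(42)
n = 500
resources = rng.normal(size=n)
morbidity = -0.6 * resources + rng.normal(scale=0.8, size=n)
growth = 0.3 * resources - 0.5 * morbidity + rng.normal(scale=0.7, size=n)

def ols(y, *xs):
    """Least-squares slopes of y on the given predictors (plus intercept)."""
    X = np.column_stack([np.ones(len(y)), *xs])
    return np.linalg.lstsq(X, y, rcond=None)[0][1:]

a = ols(morbidity, resources)[0]          # resources -> morbidity path
coefs = ols(growth, morbidity, resources)
b, direct = coefs[0], coefs[1]            # morbidity -> growth; direct path
indirect = a * b                          # effect routed through morbidity
print(round(direct, 2), round(indirect, 2))
```

A full path model extends this to many simultaneous equations, one per endogenous variable, with the path diagram fixing which coefficients are estimated.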

Relevance:

30.00%

Publisher:

Abstract:

Electronic waste is a fairly new and largely unknown phenomenon. Accordingly, governments have only recently acknowledged electronic waste as a threat to the environment and public health. In attempting to mitigate the hazards associated with this rapidly growing toxic waste stream, governments at all levels have started to implement e-waste management programs. The legislation enacted to create these programs is based on extended producer responsibility, or EPR, policy.

EPR shifts the burden of final disposal of e-waste from the consumer or municipal solid waste system to the manufacturer of electronic equipment. Applying an EPR policy is intended to send signals up the production chain to the manufacturer. The desired outcome is to change the methods of production in order to reduce production outputs/inputs, with the ultimate goal of changing product design. This thesis performs a policy analysis of the current e-waste policies at the federal and state level of government, focusing specifically on Texas e-waste policies.

The Texas e-waste law, known as HB 2714 or the Texas Computer TakeBack Law, requires manufacturers to provide individual consumers with a free and convenient method for returning their used computers to manufacturers. The law is based on individual producer responsibility and shared responsibility among consumers, retailers, recyclers, and the TCEQ.

Using a set of evaluation criteria created by the Organization for Economic Co-operation and Development, the Texas e-waste law was examined to determine its effectiveness at reducing the threat of e-waste in Texas. Based on the outcomes of the analysis, certain recommendations were made for the legislature to incorporate into HB 2714.

The results of the policy analysis show that HB 2714 is a poorly constructed law and does not provide the desired results seen in other states with EPR policies. The TakeBack Law does little to change the collection methods of manufacturers and even less to change their production habits. If the e-waste problem is to be taken seriously, HB 2714 must be amended to reflect the proposed changes in this thesis.

Relevance:

30.00%

Publisher:

Abstract:

The first manuscript, entitled "Time-Series Analysis as Input for Clinical Predictive Modeling: Modeling Cardiac Arrest in a Pediatric ICU" lays out the theoretical background for the project. There are several core concepts presented in this paper. First, traditional multivariate models (where each variable is represented by only one value) provide single point-in-time snapshots of patient status: they are incapable of characterizing deterioration. Since deterioration is consistently identified as a precursor to cardiac arrests, we maintain that the traditional multivariate paradigm is insufficient for predicting arrests. We identify time series analysis as a method capable of characterizing deterioration in an objective, mathematical fashion, and describe how to build a general foundation for predictive modeling using time series analysis results as latent variables. Building a solid foundation for any given modeling task involves addressing a number of issues during the design phase. These include selecting the proper candidate features on which to base the model, and selecting the most appropriate tool to measure them. We also identified several unique design issues that are introduced when time series data elements are added to the set of candidate features. One such issue is in defining the duration and resolution of time series elements required to sufficiently characterize the time series phenomena being considered as candidate features for the predictive model. Once the duration and resolution are established, there must also be explicit mathematical or statistical operations that produce the time series analysis result to be used as a latent candidate feature. In synthesizing the comprehensive framework for building a predictive model based on time series data elements, we identified at least four classes of data that can be used in the model design. 
The first two classes are shared with traditional multivariate models: multivariate data and clinical latent features. Multivariate data is represented by the standard one value per variable paradigm and is widely employed in a host of clinical models and tools. These are often represented by a number present in a given cell of a table. Clinical latent features are derived, rather than directly measured, data elements that more accurately represent a particular clinical phenomenon than any of the directly measured data elements in isolation. The second two classes are unique to the time series data elements. The first of these is the raw data elements. These are represented by multiple values per variable, and constitute the measured observations that are typically available to end users when they review time series data. These are often represented as dots on a graph. The final class of data results from performing time series analysis. This class of data represents the fundamental concept on which our hypothesis is based. The specific statistical or mathematical operations are up to the modeler to determine, but we generally recommend that a variety of analyses be performed in order to maximize the likelihood that a representation of the time series data elements is produced that is able to distinguish between two or more classes of outcomes. The second manuscript, entitled "Building Clinical Prediction Models Using Time Series Data: Modeling Cardiac Arrest in a Pediatric ICU" provides a detailed description, start to finish, of the methods required to prepare the data, build, and validate a predictive model that uses the time series data elements determined in the first paper. One of the fundamental tenets of the second paper is that manual implementations of time series based models are infeasible due to the relatively large number of data elements and the complexity of preprocessing that must occur before data can be presented to the model.
Each of the seventeen steps is analyzed from the perspective of how it may be automated, when necessary. We identify the general objectives and available strategies of each of the steps, and we present our rationale for choosing a specific strategy for each step in the case of predicting cardiac arrest in a pediatric intensive care unit. Another issue brought to light by the second paper is that the individual steps required to use time series data for predictive modeling are more numerous and more complex than those used for modeling with traditional multivariate data. Even after complexities attributable to the design phase (addressed in our first paper) have been accounted for, the management and manipulation of the time series elements (the preprocessing steps in particular) are issues that are not present in a traditional multivariate modeling paradigm. In our methods, we present the issues that arise from the time series data elements: defining a reference time; imputing and reducing time series data in order to conform to a predefined structure that was specified during the design phase; and normalizing variable families rather than individual variable instances. The final manuscript, entitled: "Using Time-Series Analysis to Predict Cardiac Arrest in a Pediatric Intensive Care Unit" presents the results that were obtained by applying the theoretical construct and its associated methods (detailed in the first two papers) to the case of cardiac arrest prediction in a pediatric intensive care unit. Our results showed that utilizing the trend analysis from the time series data elements reduced the number of classification errors by 73%. The area under the Receiver Operating Characteristic curve increased from a baseline of 87% to 98% by including the trend analysis. 
In addition to the performance measures, we were also able to demonstrate that adding raw time series data elements without their associated trend analyses improved classification accuracy as compared to the baseline multivariate model, but diminished classification accuracy as compared to when just the trend analysis features were added (i.e., without adding the raw time series data elements). We believe this phenomenon was largely attributable to overfitting, which is known to increase as the ratio of candidate features to class examples rises. Furthermore, although we employed several feature reduction strategies to counteract the overfitting problem, they failed to improve the performance beyond that which was achieved by exclusion of the raw time series elements. Finally, our data demonstrated that pulse oximetry and systolic blood pressure readings tend to start diminishing about 10-20 minutes before an arrest, whereas heart rates tend to diminish rapidly less than 5 minutes before an arrest.
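A trend-analysis feature of the kind described above can be as simple as a least-squares slope over the observation window. The abstract does not specify which operations were used, so the statistics below (slope, mean, spread, last value) are a hypothetical sketch:

```python
import numpy as np

def trend_features(times, values):
    """Summarize one vital-sign time series with simple trend statistics.

    times  : observation times in minutes relative to a reference time
    values : measurements (e.g., heart rate) at those times
    """
    times = np.asarray(times, dtype=float)
    values = np.asarray(values, dtype=float)
    # Degree-1 least-squares fit: the slope captures direction and
    # speed of change, the kind of signal raw dots on a graph hide.
    slope, intercept = np.polyfit(times, values, 1)
    return {
        "slope": slope,      # units per minute
        "mean": values.mean(),
        "std": values.std(),
        "last": values[-1],  # most recent raw observation
    }

# A heart rate falling sharply in the minutes before the reference time:
feats = trend_features([-20, -15, -10, -5, 0], [120, 118, 110, 95, 80])
```

Feeding the model a handful of such summaries per variable, instead of every raw observation, also keeps the candidate-feature count low, which is consistent with the overfitting observation above.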

Relevância:

30.00% 30.00%

Publicador:

Resumo:

Oxygen isotope measurements have been made in foraminifera from over 60 deep-sea sediment cores. Taken together with the oxygen isotope measurements published by Emiliani from Caribbean and Equatorial Atlantic cores, this comprises a unique body of stratigraphic data covering most of the important areas of calcareous sediment over the whole world ocean. The oxygen isotopic composition of foraminifera from cores of Late Pleistocene sediment varies in a similar manner in nearly all areas; the variations reflect changes in the oxygen isotopic composition of the ocean. The oceans are mixed in about 1 ka so that ocean isotopic changes, resulting from fluctuations in the quantity of ice stored on the continents, must have occurred almost synchronously in all regions. Thus the oxygen isotope record provides an excellent means of stratigraphic correlation. Cores accumulated at rates of over about 5 cm/ka provide records of oxygen isotopic composition change that are almost unaffected by post-depositional mixing of the sediment. Thus they preserve a detailed record of the advance and retreat of the ice masses in the northern hemisphere, and provide a unique source of information for the study of ice-sheet dynamics.

Relevância:

30.00% 30.00%

Publicador:

Resumo:

The Rieseberger Moor is a fen, 145 hectares in size, situated about 20 km east of Brunswick (Braunschweig), Lower Saxony, Germany. Peat was dug in the fen - with changing intensity - from the mid-18th century until around AD 1955. According to Schneekloth & Schneider (1971) the remaining peat (fen and wood peat) is predominantly 1.5 to 2 m thick (maximum 2.7 m). Part of the fen - now a nature reserve (NSG BR 005) - is wooded (Betula, Salix, Alnus). For more information on the Rieseberger Moor see http://de.wikipedia.org/wiki/Rieseberger_Moor. Willi Selle was the first to publish pollen diagrams from this site (Selle 1935, profiles Rieseberger Torfmoor I and II). This report deals with a 2.2 m long profile from the wooded south-eastern part of the fen, consisting of strongly decomposed fen peat, taken in AD 1965 and studied by pollen analysis in the same year. The peat below 1.45 m contained silt and clay, and samples 1.48 and 1.58 m even fine sand. These samples had to be treated with HF (hydrofluoric acid) in addition to the treatment with hot caustic potash solution. The coring ended in sandy material. The new pollen data reflect the early part of the known postglacial development of the vegetation of this area: the change from a birch-dominated forest to a pine forest and the later spreading of Corylus and of the thermophilous deciduous tree genera Quercus, Ulmus, Tilia and Fraxinus, followed by the expansion of Alnus. The new data are in agreement with Selle's results, except for Alnus, which in Selle's pollen diagram II shows high values (up to 42% of the arboreal pollen sum) even in samples deposited before Corylus and Quercus started to spread. By contrast, the new pollen diagram shows that alder pollen - although present in all samples - is frequent in the three youngest pollen spectra only. A period with dominating Alnus, as seen in the uppermost part of Selle's pollen diagrams, is missing.
The latter is most likely the result of peat cutting at the later coring site, whereas the early, unusually high alder values of Selle's pollen study were probably caused by contamination of the pollen samples with younger peat. Selle took peat samples usually with a "Torfbohrer" (= Hiller sampler). This side-filling type of sampler, with an inner chamber and an outer loose jacket, offers - if not handled with appropriate care - ample opportunities to contaminate older peat with younger material carried down the borehole. Pollen grains of Fagus (2% of the arboreal pollen sum) were found in two samples only, namely in the uppermost samples of the new profile (0.18 m) and of Selle's profile I (0.25 m). If this pollen is autochthonous, in other words, if this near-surface peat was not disturbed by human activities, the Fagus pollen indicates an Early Subboreal age for this part of the profile. The accumulation of the Rieseberg peat started during the Preboreal. Increased values of Corylus, Quercus and Ulmus indicate that sample 0.78 m of the new profile is the oldest Boreal sample. The high Alnus values prove the Atlantic age of the younger peat. Whether Early Subboreal peat exists at the site is questionable, but evidently none of the three profiles reaches Late Subboreal time, when Fagus spread in the region. Did peat growth end during the Subboreal? Did younger peat exist but get lost through peat cutting, or has younger peat simply not yet been found in the Rieseberg fen? These questions cannot be answered with this study. The temporary decline of the curve of Pinus in favor of Betula during the Preboreal, unusual for this period, is contemporaneous with the deposition of sand (Rieseberger Moor II, 1.33 - 1.41 m; samples 1.48 and 1.58 m of the new profile) and must be considered a local phenomenon. Literature: Schneekloth, Heinrich & Schneider, Siegfried (1971). Die Moore in Niedersachsen. 2. Teil.
Bereich des Blattes Braunschweig der Geologischen Karte der Bundesrepublik Deutschland (1:200000). - Schriften der wirtschaftswissenschaftlichen Gesellschaft zum Studium Niedersachsens e.V. Reihe A I., Band 96, Heft 2, 83 Seiten, Göttingen. Selle, Willi (1935) Das Torfmoor bei Rieseberg. - Jahresbericht des Vereins für Naturwissenschaft zu Braunschweig, 23, 46-58, Braunschweig.

Relevância:

30.00% 30.00%

Publicador:

Resumo:

The stability of gypsum in marine sediments has been investigated through the calculation of its saturation index at the sediment in situ temperature and pressure, using the entire ODP/IODP porewater composition database (14416 samples recovered from sediments collected during 95 ODP and IODP Legs). Saturation is reached in sediment porewaters of 26 boreholes drilled at 23 different sites, during 12 ODP/IODP Legs. As ocean bottom seawater is largely undersaturated with respect to gypsum, the porewater Ca content or its SO4 concentration, or both, must increase in order to reach equilibrium. At several sites equilibrium is reached either through the presence of evaporitic gypsum layers found in the sedimentary sequence, and/or through a salinity increase due to the presence of evaporitic brines with high concentrations of Ca and SO4. Saturation can also be reached in porewaters of seawater-like salinity (~ 35 per mil), provided sulfate reduction is limited. In this case, saturation is due to the alteration of volcanogenic material, which releases large amounts of Ca to the porewaters, where the Ca concentration can reach 55 times its seawater value, as for example at ODP Leg 134 Site 833. At a few sites, saturation is reached in hydrothermal environments, or as a consequence of the alteration of the basaltic basement. In addition to the well-known influence of brines on the formation of gypsum, these results indicate that the alteration of sediments rich in volcanogenic material is a major process leading to gypsum saturation in marine sediment porewaters. Therefore, the presence of gypsum in ancient and recent marine sediments should not be systematically interpreted as due to hypersaline waters, especially if volcanogenic material is present.
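The saturation index underlying this analysis is SI = log10(IAP/Ksp), with IAP = a(Ca2+) * a(SO4 2-) for gypsum. A minimal sketch follows; the function name, the 25 degC Ksp value, and the round activity coefficients are illustrative assumptions (the study evaluates Ksp at in situ temperature and pressure and uses the full porewater compositions):

```python
import math

def gypsum_saturation_index(ca_activity, so4_activity, log_ksp=-4.58):
    """Saturation index for gypsum (CaSO4 . 2H2O):
    SI = log10(IAP) - log10(Ksp), IAP = a(Ca2+) * a(SO4 2-).
    SI < 0: undersaturated; SI = 0: equilibrium; SI > 0: supersaturated.
    log_ksp = -4.58 is a common 25 degC, 1 atm value (assumed here).
    """
    return math.log10(ca_activity * so4_activity) - log_ksp

# Bottom seawater: ~10.3 mM Ca, ~28 mM SO4. Activity coefficients
# (roughly 0.2 for Ca2+ and 0.12 for SO4 2-; assumed round values)
# matter: with them, seawater comes out clearly undersaturated.
si_seawater = gypsum_saturation_index(0.0103 * 0.2, 0.028 * 0.12)

# A porewater whose Ca is enriched ~55-fold by alteration of
# volcanogenic material, with sulfate retained: SI rises above zero.
si_porewater = gypsum_saturation_index(0.0103 * 55 * 0.2, 0.028 * 0.12)
```

The contrast between the two results mirrors the abstract's argument: Ca release alone, without any brine, is enough to drive porewaters from undersaturation to gypsum saturation.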