943 results for Records as Topic


Relevance: 20.00%

Publisher:

Abstract:

The comparison of palaeoclimate records on their own independent timescales is central to the work of the INTIMATE (INTegrating Ice core, MArine and TErrestrial records) network. For the North Atlantic region, an event stratigraphy has been established from the high-precision Greenland ice-core records and the integrated GICC05 chronology. This stratotype provides a palaeoclimate signal against which the timing and nature of palaeoenvironmental change recorded in marine and terrestrial archives can be compared. To facilitate this wider comparison without assuming synchroneity of climatic change or proxy response, INTIMATE has also focussed on developing tools for correlation, in particular the use of time-parallel marker horizons such as tephra layers (volcanic ash). As part of this special issue, and coupled with the recent temporal extension of the Greenland stratotype, we present an updated INTIMATE event stratigraphy highlighting key tephra horizons used for correlation across Europe and the North Atlantic. We discuss the advantages of such an approach and the key challenges for the further integration of terrestrial palaeoenvironmental records with those from ice cores and the marine realm.
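As a purely illustrative sketch (not taken from the paper), the mechanics of a tephra tie-point can be shown as transferring the ice-core (GICC05) age of a shared ash layer onto a sediment core's own age-depth model and quantifying the offset between the two timescales at that horizon. All depths, ages and the offset handling below are hypothetical.

```python
# Illustrative sketch only: aligning a sediment record to the Greenland
# (GICC05) timescale via a shared tephra horizon. Depths and ages are
# invented, not values from the paper.
import numpy as np

# Local age-depth model for the sediment core (depth in cm, age in yr b2k)
core_depth = np.array([0, 50, 100, 150, 200, 250])
core_age   = np.array([200, 2100, 4300, 7800, 10900, 12800])

def age_at_depth(depth_cm):
    """Linear interpolation of the core's own age-depth model."""
    return np.interp(depth_cm, core_depth, core_age)

# Shared tephra layer: its depth in this core and its GICC05 age in the ice core
tephra_depth_cm = 215.0        # hypothetical depth of the ash layer
tephra_gicc05_age = 11620.0    # hypothetical GICC05 age (yr b2k)

# Offset between the core's own chronology and GICC05 at the tie-point
offset = tephra_gicc05_age - age_at_depth(tephra_depth_cm)
print(f"Chronological offset at tephra horizon: {offset:+.0f} yr")

# Ages elsewhere in the core can then be reported on both timescales,
# without assuming that the proxy changes themselves are synchronous.
for d in (180, 200, 220):
    local = age_at_depth(d)
    print(f"{d} cm: {local:.0f} yr (core model) vs {local + offset:.0f} yr (GICC05-anchored)")
```

A single tie-point only constrains the chronology at that depth, so applying one constant offset to the whole core, as done here for brevity, is a simplification; in practice multiple tephras and full age-model uncertainties would be used.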

Relevance: 20.00%

Publisher:

Abstract:

Congenital anomalies (CA) are the paradigm example of rare diseases amenable to primary prevention, because many have a multifactorial etiology involving a number of environmental factors together with genetic predispositions. Despite this preventive potential, the lack of an integrated preventive strategy has left the prevalence of CA relatively stable in recent decades. Two European projects, EUROCAT and EUROPLAN, have joined efforts to provide the first science-based and comprehensive set of recommendations for the primary prevention of CA in the European Union. The resulting EUROCAT-EUROPLAN 'Recommendations on Policies to Be Considered for the Primary Prevention of Congenital Anomalies in National Plans and Strategies on Rare Diseases' were issued in 2012 and endorsed by EUCERD (European Union Committee of Experts on Rare Diseases) in 2013. The recommendations draw on interdisciplinary expertise encompassing drugs, diet, lifestyles, maternal health status, and the environment. They include evidence-based actions aimed at reducing risk factors and increasing protective factors and behaviors at both the individual and population levels. Moreover, consideration is given to topics specifically related to CA (e.g. folate status, teratogens) as well as to topics of broad public health impact (e.g. obesity, smoking), which call for specific attention to their relevance in the pre- and periconceptional period. The recommendations, reported in full in this paper, are a comprehensive tool for implementing primary prevention in national policies on rare diseases in Europe.

Relevance: 20.00%

Publisher:

Abstract:

Age-depth modeling using Bayesian statistics requires well-informed prior information about the behavior of sediment accumulation. Here we present average sediment accumulation rates (expressed as deposition times, DT, in yr/cm) for lakes in an Arctic setting, and we examine their variability across space (intra- and inter-lake) and time (late Holocene). The dataset includes over 100 radiocarbon dates, primarily on bulk sediment, from 22 sediment cores obtained from 18 lakes spanning the boreal to tundra ecotone gradients in subarctic Canada. There are four to twenty-five radiocarbon dates per core, depending on the length and character of the sediment records. Deposition times were calculated at 100-year intervals from age-depth models constructed using the ‘classical’ age-depth modeling software Clam. Lakes in boreal settings have the most rapid accumulation (mean DT 20 ± 10 yr/cm), whereas lakes in tundra settings accumulate at moderate (mean DT 70 ± 10 yr/cm) to very slow rates (>100 yr/cm). Many of the age-depth models show fluctuations in accumulation that coincide with lake evolution and post-glacial climate change. Ten of our sediment cores yielded sediments as old as c. 9,000 cal BP (BP = years before AD 1950). Between c. 9,000 and c. 6,000 cal BP, sediment accumulation was relatively rapid (DT of 20 to 60 yr/cm). Accumulation slowed between c. 5,500 and c. 4,000 cal BP as vegetation expanded northward in response to warming. A short period of rapid accumulation occurred near 1,200 cal BP at three lakes. Our research will help inform priors in Bayesian age modeling.
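The calculation behind deposition times can be illustrated with a short, self-contained sketch (not the authors' code): invert a toy age-depth model at 100-year steps and take the years elapsed per centimetre accumulated. The depth and age values below are invented, standing in for output from Clam or similar software.

```python
# Minimal sketch: deriving deposition times (DT, yr/cm) at 100-year intervals
# from an age-depth model. The age-depth curve here is a toy piecewise-linear
# example, not data from the study.
import numpy as np

# Toy age-depth model: depth (cm) vs. calibrated age (cal yr BP)
depth = np.array([0, 100, 200, 300, 400])         # cm
age   = np.array([-60, 2500, 5600, 8200, 9000])   # cal yr BP

# Evaluate depth at regular 100-year steps by inverting the age-depth relation
ages_100yr = np.arange(0, 9000 + 1, 100)
depths_100yr = np.interp(ages_100yr, age, depth)

# Deposition time over each 100-year interval: years elapsed per cm deposited
d_age = np.diff(ages_100yr)          # 100 yr per step
d_depth = np.diff(depths_100yr)      # cm accumulated per step
dt_yr_per_cm = d_age / d_depth

for a, dt in zip(ages_100yr[1:], dt_yr_per_cm):
    print(f"{int(a):5d} cal BP  DT = {dt:5.1f} yr/cm")
```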

Relevance: 20.00%

Publisher:

Abstract:

Tin, as a constituent of bronze, was central to the technological development of early societies, but cassiterite (SnO2) deposits were scarce and located far from the centres of Mediterranean civilizations. Britain had the largest workable ore deposits in the ancient Western world, which has led to much historical speculation and myth regarding the long-distance trading of tin from the Bronze Age onwards. Here we establish the first detailed chronology of tin, lead and copper deposition into undisturbed ombrotrophic (rain-fed) peat bogs located at Bodmin Moor and Dartmoor, in the centre of the British tin ore fields. Sustained elevated tin deposition is clearly demonstrated, with peaks at AD 100-400 and AD 700-1000, contemporaneous with the Roman and Anglo-Saxon periods respectively. While pre-Roman Iron Age tin exploitation undoubtedly took place, it was on a scale that did not result in convincingly enhanced deposition of the metal. The deposition of lead in the peat record provides evidence of a pre-Roman metal-based economy in southwest Britain. Emerging in the 4th century BC, this was centred on copper and lead ore processing that expanded exponentially and then collapsed upon Roman colonization during the 1st century AD.

Relevance: 20.00%

Publisher:

Abstract:

The North Atlantic has played a key role in abrupt climate changes because of the sensitivity of the Atlantic Meridional Overturning Circulation (AMOC) to the location and strength of deep water formation. Understanding the role of the AMOC in the rapid warming and gradual cooling cycles known as Dansgaard-Oeschger (DO) events, which are recorded in the Greenland ice cores, is crucial for modelling future climate change. However, palaeoceanographic research into DO events has been hampered by uncertainty in timing, due largely to the lack of a precise chronological framework for marine records. While tephrochronology provides links to the Greenland ice-core records at a few points, radiocarbon remains the primary dating method for most marine cores. Because of variations in atmospheric and oceanic 14C concentration, radiocarbon ages must be calibrated to provide calendar ages. The IntCal Working Group provides a global estimate of ocean 14C ages for calibration of marine radiocarbon dates, but the variability of the surface marine reservoir age in the North Atlantic, particularly during Heinrich or DO events, makes calibration uncertain. In addition, the current Marine09 radiocarbon calibration beyond around 15 ka BP is largely based on 'tuning' to the Hulu Cave isotope record, so the timing of events may not be entirely synchronous with the Greenland ice cores. Event stratigraphy and independent chronological markers such as tephra provide the scope to improve marine radiocarbon reservoir age estimates, particularly in the North Atlantic, where a number of tephra horizons have been identified in both marine sediments and the Greenland ice cores. Quantification of timescale uncertainties is critical, and statistical techniques that take account of the differential dating between events can improve precision. Such techniques should make it possible to develop specific marine calibration curves for selected regions.
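To make the reservoir-age issue concrete, here is a minimal, purely illustrative sketch of the arithmetic involved: a marine 14C age is corrected by a reservoir age and a regional offset, with uncertainties combined in quadrature, before being compared with a calibration curve. The measurement, reservoir values and the tiny curve segment below are all invented; a real calibration (e.g. with OxCal or Calib against the IntCal/Marine curves) works with full probability distributions rather than the point interpolation used here.

```python
# Illustrative sketch only: applying a marine reservoir correction before
# calibration. All numbers below are invented, not IntCal/Marine data.
import numpy as np

measured_14c, measured_err = 13200, 60      # marine sample, 14C yr BP +/- 1 sigma
reservoir_r, reservoir_err = 400, 50        # assumed average reservoir age
delta_r, delta_r_err = 150, 100             # assumed regional deviation (Delta-R)

# Subtract the total reservoir offset; combine uncertainties in quadrature
corrected_14c = measured_14c - reservoir_r - delta_r
corrected_err = np.sqrt(measured_err**2 + reservoir_err**2 + delta_r_err**2)
print(f"Reservoir-corrected age: {corrected_14c} +/- {corrected_err:.0f} 14C yr BP")

# Toy atmospheric calibration curve segment: (cal yr BP, 14C yr BP)
cal_age = np.array([14500, 14800, 15100, 15400, 15700])
c14_age = np.array([12450, 12650, 12800, 13050, 13300])

# Crude point estimate of the calendar age by inverse interpolation
cal_estimate = np.interp(corrected_14c, c14_age, cal_age)
print(f"Approximate calendar age: {cal_estimate:.0f} cal yr BP")
```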

Relevance: 20.00%

Publisher:

Abstract:

Background: High-risk medications are commonly prescribed to older US patients. Currently, less is known about high-risk medication prescribing in other Western countries, including the UK. We measured trends and correlates of high-risk medication prescribing in a subset of the older UK population (community and institutionalized) to inform harm minimization efforts. Methods: Three cross-sectional samples were drawn from primary care electronic clinical records (UK Clinical Practice Research Datalink, CPRD) for fiscal years 2003/04, 2007/08 and 2011/12. This yielded a sample of 13,900 people aged 65 years or over from 504 UK general practices. High-risk medications were defined by the 2012 Beers Criteria adapted for the UK. Using descriptive statistical methods and regression modelling, we determined the prevalence and correlates of ‘any’ (drugs prescribed at least once per year) and ‘long-term’ (drugs prescribed in all quarters of the year) high-risk medication prescribing. Results: While polypharmacy rates have risen sharply, high-risk medication prevalence has remained stable across a decade. A third of older (65+) people were exposed to high-risk medications, but only half of the total prevalence was long-term (any = 38.4 % [95 % CI: 36.3, 40.5]; long-term = 17.4 % [15.9, 19.9] in 2011/12). Long-term, but not any, high-risk medication exposure was associated with older age (85 years or over). Women and people with a higher polypharmacy burden were at greater risk of exposure; lower socio-economic status was not associated. Ten drugs or drug classes accounted for most high-risk medication prescribing in 2011/12. Conclusions: High-risk medication prescribing has not increased over time against a background of increasing polypharmacy in the UK. Half of patients receiving high-risk medications do so for less than a year. Reducing or optimising the use of a limited number of drugs could dramatically reduce high-risk medication use in older people. Further research is needed to investigate why the oldest old and women are at greater risk. Interventions to reduce high-risk medications may need to target short-term and long-term use separately.
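As a hedged illustration of the reported figures, the sketch below computes a prevalence estimate with a 95% confidence interval using the Wilson score method. Only the sample size of 13,900 comes from the abstract; the count of exposed patients is hypothetical, chosen to reproduce the quoted 38.4%, and the paper's own interval may have been calculated differently (for example, accounting for clustering by practice).

```python
# Hedged sketch: prevalence with a 95% CI, in the style of the figures quoted
# in the abstract. The exposed-patient count is invented for illustration.
from math import sqrt

def prevalence_ci(events: int, n: int, z: float = 1.96):
    """Prevalence with a Wilson score 95% confidence interval."""
    p = events / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return p, centre - half, centre + half

n_patients = 13_900      # sample size reported in the abstract
n_exposed = 5_340        # hypothetical count of exposed patients (~38.4%)

p, lo, hi = prevalence_ci(n_exposed, n_patients)
print(f"Prevalence: {p:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```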

Relevance: 20.00%

Publisher:

Abstract:

Molecular testing is becoming an important part of the diagnosis of any patient with cancer. The challenge for laboratories is to meet this need, using reliable methods and processes to ensure that patients receive a timely and accurate report on which their treatment will be based. The aim of this paper is to provide minimum requirements for the management of molecular pathology laboratories. This general guidance should be augmented by the specific guidance available for different tumour types and tests. Preanalytical considerations are important, and careful consideration of the way in which specimens are obtained and reach the laboratory is necessary. Sample receipt and handling follow standard operating procedures, but some alterations may be necessary if molecular testing is to be performed, for instance to control tissue fixation. DNA and RNA extraction can be standardised and should be checked regularly for quality and quantity of output. The choice of analytical method(s) depends on clinical requirements, desired turnaround time, and the expertise available. Internal quality control, regular internal audit of the whole testing process, laboratory accreditation, and continual participation in external quality assessment schemes are prerequisites for the delivery of a reliable service. A molecular pathology report should accurately convey the information the clinician needs to treat the patient, with sufficient detail to allow correct interpretation of the result. Molecular pathology is developing rapidly, and further detailed evidence-based recommendations are required for many of the topics covered here.
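As a purely illustrative sketch of the kind of extraction quality check the guidance calls for (not taken from the paper), the snippet below applies a simple pass/fail gate using common rule-of-thumb thresholds such as an A260/A280 ratio near 1.8 for DNA; real acceptance limits would be defined per assay in each laboratory's own SOPs, and the sample data are invented.

```python
# Illustrative sketch only: a simple pass/fail gate for nucleic acid
# extraction QC. Thresholds are generic rules of thumb, not criteria from
# this guidance; sample IDs and values are hypothetical.
from dataclasses import dataclass

@dataclass
class ExtractionQC:
    sample_id: str
    concentration_ng_ul: float   # e.g. from fluorometric quantification
    a260_a280: float             # spectrophotometric purity ratio

def passes_qc(qc: ExtractionQC,
              min_conc: float = 10.0,
              ratio_range: tuple = (1.7, 2.0)) -> bool:
    """Return True if the extract meets the (illustrative) acceptance limits."""
    return (qc.concentration_ng_ul >= min_conc
            and ratio_range[0] <= qc.a260_a280 <= ratio_range[1])

batch = [
    ExtractionQC("FFPE-001", 42.0, 1.84),
    ExtractionQC("FFPE-002", 4.5, 1.65),   # fails: low yield, poor purity
]
for qc in batch:
    print(qc.sample_id, "PASS" if passes_qc(qc) else "FAIL - review or repeat extraction")
```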

Relevance: 20.00%

Publisher:

Abstract:

Background: Search filters are combinations of words and phrases designed to retrieve an optimal set of records on a particular topic (subject filters) or study design (methodological filters). Information specialists are increasingly turning to reusable filters to focus their searches. However, the extent of the academic literature on search filters is unknown. We provide a broad overview of the academic literature on search filters.
Objectives: To map the academic literature on search filters from 2004 to 2015 using a novel form of content analysis.
Methods: We conducted a comprehensive search for literature between 2004 and 2015 across eight databases using a subjectively derived search strategy. We identified key words from titles, grouped them into categories, and examined their frequency and co-occurrences.
Results: The majority of records were housed in Embase (n = 178) and MEDLINE (n = 154). Over the last decade, both databases appeared to exhibit a bimodal distribution, with the number of publications on search filters rising until 2006, dipping in 2007, and then steadily increasing until 2012. Few articles appeared in social science databases over the same time frame (e.g. Social Services Abstracts, n = 3).
Unsurprisingly, the term ‘search’ appeared in most titles and, quite often, was used as a noun adjunct for the words ‘filter’ and ‘strategy’. Across the papers, the purpose of searches as a means of ‘identifying’ information and gathering ‘evidence’ from ‘databases’ emerged quite strongly. Other terms relating to the methodological assessment of search filters, such as precision and validation, also appeared, albeit less frequently.
Conclusions: Our findings show surprising commonality across the literature on search filters. Much of the literature focuses on developing search filters to identify and retrieve information, rather than on testing or validating such filters. Furthermore, the literature is mostly housed in health-related databases, namely MEDLINE, CINAHL, and Embase, implying that it is medically driven. Relatively few papers focus on the use of search filters in the social sciences.
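As a minimal, hypothetical sketch of the kind of title-based content analysis described (not the authors' code), the snippet below extracts keywords from a handful of invented titles, maps them to broader categories, and counts category frequencies and pairwise co-occurrences.

```python
# Minimal sketch of title keyword frequency and co-occurrence analysis.
# Titles and the keyword-to-category mapping are invented examples.
from collections import Counter
from itertools import combinations
import re

titles = [
    "Development of a search filter to identify qualitative evidence in MEDLINE",
    "Validation of a methodological search filter for diagnostic studies",
    "A search strategy for retrieving economic evaluations from databases",
]

# Map raw keywords to analysis categories (illustrative grouping)
categories = {
    "filter": "filter", "filters": "filter",
    "search": "search", "strategy": "search",
    "identify": "identifying", "retrieving": "identifying",
    "evidence": "evidence", "databases": "databases", "medline": "databases",
    "validation": "assessment", "methodological": "assessment",
}

def keywords(title: str):
    """Return the set of categories whose keywords appear in the title."""
    words = re.findall(r"[a-z]+", title.lower())
    return {categories[w] for w in words if w in categories}

freq = Counter()
cooc = Counter()
for t in titles:
    ks = keywords(t)
    freq.update(ks)                               # how often each category appears
    cooc.update(combinations(sorted(ks), 2))      # how often pairs co-occur

print("Category frequencies:", freq.most_common())
print("Top co-occurrences:", cooc.most_common(3))
```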