64 results for multiple data sources


Relevance: 80.00%

Abstract:

OBJECTIVE To investigate whether revascularisation improves prognosis compared with medical treatment among patients with stable coronary artery disease. DESIGN Bayesian network meta-analyses to combine direct within trial comparisons between treatments with indirect evidence from other trials while maintaining randomisation. ELIGIBILITY CRITERIA FOR SELECTING STUDIES A strategy of initial medical treatment compared with revascularisation by coronary artery bypass grafting or Food and Drug Administration approved techniques for percutaneous revascularisation: balloon angioplasty, bare metal stent, early generation paclitaxel eluting stent, sirolimus eluting stent, and zotarolimus eluting (Endeavor) stent, and new generation everolimus eluting stent and zotarolimus eluting (Resolute) stent, among patients with stable coronary artery disease. DATA SOURCES Medline and Embase from 1980 to 2013 for randomised trials comparing medical treatment with revascularisation. MAIN OUTCOME MEASURE All cause mortality. RESULTS 100 trials in 93 553 patients with 262 090 patient years of follow-up were included. Coronary artery bypass grafting was associated with a survival benefit (rate ratio 0.80, 95% credibility interval 0.70 to 0.91) compared with medical treatment. New generation drug eluting stents (everolimus: 0.75, 0.59 to 0.96; zotarolimus (Resolute): 0.65, 0.42 to 1.00) but not balloon angioplasty (0.85, 0.68 to 1.04), bare metal stents (0.92, 0.79 to 1.05), or early generation drug eluting stents (paclitaxel: 0.92, 0.75 to 1.12; sirolimus: 0.91, 0.75 to 1.10; zotarolimus (Endeavor): 0.88, 0.69 to 1.10) were associated with improved survival compared with medical treatment. Coronary artery bypass grafting reduced the risk of myocardial infarction compared with medical treatment (0.79, 0.63 to 0.99), and everolimus eluting stents showed a trend towards a reduced risk of myocardial infarction (0.75, 0.55 to 1.01). 
The risk of subsequent revascularisation was noticeably reduced by coronary artery bypass grafting (0.16, 0.13 to 0.20) followed by new generation drug eluting stents (zotarolimus (Resolute): 0.26, 0.17 to 0.40; everolimus: 0.27, 0.21 to 0.35), early generation drug eluting stents (zotarolimus (Endeavor): 0.37, 0.28 to 0.50; sirolimus: 0.29, 0.24 to 0.36; paclitaxel: 0.44, 0.35 to 0.54), and bare metal stents (0.69, 0.59 to 0.81) compared with medical treatment. CONCLUSION Among patients with stable coronary artery disease, coronary artery bypass grafting reduces the risk of death, myocardial infarction, and subsequent revascularisation compared with medical treatment. All stent based coronary revascularisation technologies reduce the need for revascularisation to a variable degree. Our results provide evidence for improved survival with new generation drug eluting stents but no other percutaneous revascularisation technology compared with medical treatment.

Relevance: 80.00%

Abstract:

The near-real-time retrieval of low stratiform cloud (LSC) coverage is of vital interest for disciplines such as meteorology, transport safety, economy, and air quality. Within this scope, a novel methodology is proposed which provides LSC occurrence probability estimates for a satellite scene. The algorithm is suited for the 1 × 1 km Advanced Very High Resolution Radiometer (AVHRR) data and was trained and validated against collocated SYNOP observations. Utilisation of these two combined data sources requires a formulation of constraints in order to discriminate cases where the LSC is overlaid by higher clouds. The LSC classification process is based on six features which are first converted to integer form by step functions and then combined by means of bitwise operations. Consequently, a set of values reflecting a unique combination of those features is derived, which is further employed to extract the LSC occurrence probability estimates from precomputed look-up vectors (LUV). Although the validation analyses confirmed good performance of the algorithm, some inevitable misclassifications with other optically thick clouds were reported. Moreover, the comparison against the Polar Platform System (PPS) cloud-type product revealed superior classification accuracy of the proposed method. From the temporal perspective, the acquired results revealed the presence of diurnal and annual LSC probability cycles over Europe.
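
The classification chain described above (step-function discretisation of several features, bitwise combination into a single index, probability look-up) can be sketched roughly as follows. The feature count, thresholds, bit widths, and look-up values are invented placeholders, not the paper's actual parameters.

```python
# Sketch of the LSC classification idea: each feature is discretised by a
# step function, the integer codes are packed with bitwise shifts into a
# single index, and that index selects a probability from a precomputed
# look-up vector (LUV). All thresholds and probabilities are hypothetical.

def step_code(value, thresholds):
    """Convert a continuous feature to an integer via a step function."""
    code = 0
    for t in thresholds:
        if value >= t:
            code += 1
    return code

def pack_codes(codes, bits_per_feature=2):
    """Combine per-feature integer codes into one index by bit shifting."""
    index = 0
    for code in codes:
        index = (index << bits_per_feature) | code
    return index

def lsc_probability(features, thresholds_per_feature, luv):
    """Look up the LSC occurrence probability for one pixel."""
    codes = [step_code(v, t) for v, t in zip(features, thresholds_per_feature)]
    return luv[pack_codes(codes)]

# Two toy features with 2-bit codes -> 16 possible indices.
thresholds = [[0.2, 0.5, 0.8], [270.0, 280.0, 290.0]]
luv = [i / 15.0 for i in range(16)]  # placeholder probabilities
p = lsc_probability([0.6, 285.0], thresholds, luv)
```

With six real AVHRR features the index space is larger, but the bit-packing idea is the same.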

Relevance: 80.00%

Abstract:

Many observed time series of the global radiosonde or PILOT networks exist as fragments distributed over different archives. Identifying and merging these fragments can enhance their value for studies on the three-dimensional spatial structure of climate change. The Comprehensive Historical Upper-Air Network (CHUAN version 1.7), which was substantially extended in 2013, and the Integrated Global Radiosonde Archive (IGRA) are the most important collections of upper-air measurements taken before 1958. CHUAN (tracked) balloon data start in 1900, with higher numbers from the late 1920s onward, whereas IGRA data start in 1937. However, a substantial fraction of those measurements were not taken at synoptic times (preferably 00:00 or 12:00 GMT) and were recorded on altitude levels instead of standard pressure levels. To make them comparable with more recent data, the records have been brought to synoptic times and standard pressure levels using state-of-the-art interpolation techniques, employing geopotential information from the National Oceanic and Atmospheric Administration (NOAA) 20th Century Reanalysis (NOAA 20CR). From 1958 onward the European Re-Analysis archives (ERA-40 and ERA-Interim) available at the European Centre for Medium-Range Weather Forecasts (ECMWF) are the main data sources. These are easier to use, but pilot data still have to be interpolated to standard pressure levels. Fractions of the same records distributed over different archives have been merged, if necessary, taking care that the data remain traceable back to their original sources. If possible, station IDs assigned by the World Meteorological Organization (WMO) have been allocated to the station records. For some records which have never been identified by a WMO ID, a local ID above 100 000 has been assigned. The merged data set contains 37 wind records longer than 70 years and 139 temperature records longer than 60 years. 
It can be seen as a useful basis for further data processing steps, most notably homogenization and gridding, after which it should be a valuable resource for climatological studies. Homogeneity adjustments for wind using the NOAA-20CR as a reference are described in Ramella Pralungo and Haimberger (2014). Reliable homogeneity adjustments for temperature beyond 1958 using a surface-data-only reanalysis such as NOAA-20CR as a reference have yet to be created. All the archives and metadata files are available in ASCII and netCDF format in the PANGAEA archive.
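
The interpolation of altitude-level observations to standard pressure levels can be illustrated with a minimal sketch: a reanalysis supplies the geopotential height of a standard pressure level, and the observed profile is linearly interpolated to that height. All heights and values below are invented, and the actual processing uses more sophisticated, state-of-the-art techniques.

```python
# Sketch of bringing an altitude-level sounding onto a standard pressure
# level. The reanalysis geopotential height of the level is assumed known;
# the observed profile is then linearly interpolated to that height.
# Heights and values are made up for illustration.

import bisect

def interp_to_height(obs_heights, obs_values, target_height):
    """Linear interpolation of an observed profile to one target height.

    obs_heights must be strictly increasing; returns None outside range
    (no extrapolation beyond the sounding).
    """
    if not (obs_heights[0] <= target_height <= obs_heights[-1]):
        return None
    i = bisect.bisect_left(obs_heights, target_height)
    if obs_heights[i] == target_height:
        return obs_values[i]
    h0, h1 = obs_heights[i - 1], obs_heights[i]
    v0, v1 = obs_values[i - 1], obs_values[i]
    w = (target_height - h0) / (h1 - h0)
    return v0 + w * (v1 - v0)

# Observed wind speed (m/s) on altitude levels (m).
heights = [500.0, 1500.0, 3000.0, 5500.0]
winds = [8.0, 12.0, 20.0, 30.0]
# Hypothetical reanalysis geopotential height of the 700 hPa level.
z700 = 3100.0
wind_700 = interp_to_height(heights, winds, z700)
```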

Relevance: 80.00%

Abstract:

BACKGROUND Current guidelines for evaluating cleft palate treatments are mostly based on two-dimensional (2D) evaluation, but three-dimensional (3D) imaging methods to assess treatment outcome are steadily rising. OBJECTIVE To identify 3D imaging methods for quantitative assessment of soft tissue and skeletal morphology in patients with cleft lip and palate. DATA SOURCES Literature was searched using PubMed (1948-2012), EMBASE (1980-2012), Scopus (2004-2012), Web of Science (1945-2012), and the Cochrane Library. The last search was performed September 30, 2012. Reference lists were hand searched for potentially eligible studies. There was no language restriction. STUDY SELECTION We included publications using 3D imaging techniques to assess facial soft tissue or skeletal morphology in patients older than 5 years with a cleft lip with or without cleft palate. We reviewed studies involving the facial region when at least 10 subjects in the sample had at least one cleft type. Only primary publications were included. DATA EXTRACTION Independent extraction of data and quality assessments were performed by two observers. RESULTS Five hundred full text publications were retrieved; 144 met the inclusion criteria, 63 of which were high quality studies. There were differences in study designs, topics studied, patient characteristics, and success measurements; therefore, only a systematic review could be conducted. The main 3D techniques used in cleft lip and palate patients are CT, CBCT, MRI, stereophotogrammetry, and laser surface scanning. These techniques are mainly used for soft tissue analysis, evaluation of bone grafting, and changes in the craniofacial skeleton. Digital dental casts are used to evaluate treatment and changes over time. CONCLUSION Available evidence implies that 3D imaging methods can be used for documentation of CLP patients. No data are available yet showing that 3D methods are more informative than conventional 2D methods. 
Further research is warranted to elucidate this question.

Relevance: 80.00%

Abstract:

OBJECTIVE To systematically analyze the regenerative effect of the available biomaterials either alone or in various combinations for the treatment of periodontal intrabony defects as evaluated in preclinical histologic studies. DATA SOURCES A protocol covered all aspects of the systematic review methodology. A literature search was performed in Medline, including hand searching. Combinations of searching terms and several criteria were applied for study identification, selection, and inclusion. The primary outcome variable was periodontal regeneration after reconstructive surgery obtained with the various regenerative materials, as demonstrated through histologic/histomorphometric analysis. New periodontal ligament, new cementum, and new bone formation as a linear measurement in mm or as a percentage of the instrumented root length were recorded. Data were extracted based on the general characteristics, study characteristics, methodologic characteristics, and conclusions. Study selection was limited to preclinical studies involving histologic analysis, evaluating the use of potential regenerative materials (ie, barrier membranes, grafting materials, or growth factors/proteins) for the treatment of periodontal intrabony defects. Any type of biomaterial alone or in various combinations was considered. All studies reporting histologic outcome measures with a healing period of at least 6 weeks were included. A meta-analysis was not possible due to the heterogeneity of the data. CONCLUSION Flap surgery in conjunction with most of the evaluated biomaterials used either alone or in various combinations has been shown to promote periodontal regeneration to a greater extent than control therapy (flap surgery without biomaterials). Among the used biomaterials, autografts revealed the most favorable outcomes, whereas the use of most biologic factors showed inferior results compared to flap surgery.

Relevance: 80.00%

Abstract:

OBJECTIVE Over 15 years have passed since an enamel matrix derivative (EMD) was introduced as a biologic agent capable of periodontal regeneration. Histologic and controlled clinical studies have provided evidence for periodontal regeneration and substantial clinical improvements following its use. The purpose of this review article was to perform a systematic review comparing the effect of EMD when used alone or in combination with various types of bone grafting material. DATA SOURCES A literature search was conducted on several medical databases including Medline, EMBASE, LILACS, and CENTRAL. For study inclusion, all studies that used EMD in combination with a bone graft were included. In the initial search, a total of 820 articles were found, 71 of which were selected for this review article. Studies were divided into in vitro, in vivo, and clinical studies. The clinical studies were subdivided into four subgroups to determine the effect of EMD in combination with autogenous bone, allografts, xenografts, and alloplasts. RESULTS The analysis from the present study demonstrates that while EMD in combination with certain bone grafts is able to improve the regeneration of periodontal intrabony and furcation defects, direct evidence supporting the combination approach is still missing. CONCLUSION Further controlled clinical trials are required to explain the large variability that exists amongst the conducted studies.

Relevance: 80.00%

Abstract:

BACKGROUND Data on the association between subclinical thyroid dysfunction and fractures conflict. PURPOSE To assess the risk for hip and nonspine fractures associated with subclinical thyroid dysfunction among prospective cohorts. DATA SOURCES Search of MEDLINE and EMBASE (1946 to 16 March 2014) and reference lists of retrieved articles without language restriction. STUDY SELECTION Two physicians screened and identified prospective cohorts that measured thyroid function and followed participants to assess fracture outcomes. DATA EXTRACTION One reviewer extracted data using a standardized protocol, and another verified data. Both reviewers independently assessed methodological quality of the studies. DATA SYNTHESIS The 7 population-based cohorts of heterogeneous quality included 50,245 participants with 1966 hip and 3281 nonspine fractures. In random-effects models that included the 5 higher-quality studies, the pooled adjusted hazard ratios (HRs) of participants with subclinical hyperthyroidism versus euthyroidism were 1.38 (95% CI, 0.92 to 2.07) for hip fractures and 1.20 (CI, 0.83 to 1.72) for nonspine fractures without statistical heterogeneity (P = 0.82 and 0.52, respectively; I2 = 0%). Pooled estimates for the 7 cohorts were 1.26 (CI, 0.96 to 1.65) for hip fractures and 1.16 (CI, 0.95 to 1.42) for nonspine fractures. When thyroxine recipients were excluded, the HRs for participants with subclinical hyperthyroidism were 2.16 (CI, 0.87 to 5.37) for hip fractures and 1.43 (CI, 0.73 to 2.78) for nonspine fractures. For participants with subclinical hypothyroidism, HRs from higher-quality studies were 1.12 (CI, 0.83 to 1.51) for hip fractures and 1.04 (CI, 0.76 to 1.42) for nonspine fractures (P for heterogeneity = 0.69 and 0.88, respectively; I2 = 0%). LIMITATIONS Selective reporting cannot be excluded. Adjustment for potential common confounders varied and was not adequately done across all studies. 
CONCLUSION Subclinical hyperthyroidism might be associated with an increased risk for hip and nonspine fractures, but additional large, high-quality studies are needed. PRIMARY FUNDING SOURCE Swiss National Science Foundation.
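
The pooled hazard ratios above come from random-effects models. A minimal sketch of the standard DerSimonian-Laird inverse-variance approach, applied to hazard ratios reported with 95% CIs, looks like this; the input numbers are illustrative, not the review's study data, and the published analysis used more elaborate adjusted models.

```python
# Inverse-variance random-effects pooling (DerSimonian-Laird) of hazard
# ratios reported with 95% CIs. Input numbers are illustrative only.

import math

def pooled_hr(hrs_with_cis):
    """hrs_with_cis: list of (HR, lower95, upper95), two or more studies.

    Returns the random-effects pooled HR.
    """
    logs, variances = [], []
    for hr, lo, hi in hrs_with_cis:
        logs.append(math.log(hr))
        # SE of log HR recovered from the CI width (CI = log HR +/- 1.96 SE)
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        variances.append(se ** 2)
    w = [1 / v for v in variances]
    fixed = sum(wi * li for wi, li in zip(w, logs)) / sum(w)
    # DerSimonian-Laird between-study variance tau^2
    q = sum(wi * (li - fixed) ** 2 for wi, li in zip(w, logs))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(logs) - 1)) / c)
    w_re = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * li for wi, li in zip(w_re, logs)) / sum(w_re)
    return math.exp(pooled)

# Three hypothetical cohort estimates
estimate = pooled_hr([(1.4, 0.9, 2.2), (1.2, 0.8, 1.8), (1.3, 0.7, 2.4)])
```

A quick sanity check: with identical inputs the pooled estimate reduces to the common HR.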

Relevance: 80.00%

Abstract:

The field of animal syndromic surveillance (SyS) is growing, with many systems being developed worldwide. Now is an appropriate time to share ideas and lessons learned from early SyS design and implementation. Based on our practical experience in animal health SyS, with additions from the public health and animal health SyS literature, we put forward for discussion a 6-step approach to designing SyS systems for livestock and poultry. The first step is to formalise policy and surveillance goals that take account of stakeholder expectations and reflect priority issues (1). Next, it is important to find consensus on national priority diseases and identify current surveillance gaps; the geographic, demographic, and temporal coverage of the system must be carefully assessed (2). A minimum dataset for SyS should be defined that includes the essential data to achieve all surveillance objectives while minimizing the amount of data collected. One can then compile an inventory of the data sources available and evaluate each using the criteria developed (3). A list of syndromes should then be produced for all data sources; cases can be classified into syndrome classes and the data converted into time series (4). Based on the characteristics of the syndrome time series, the length of historic data available, and the type of outbreaks the system must detect, different aberration detection algorithms can be tested (5). Finally, it is essential to develop a minimally acceptable response protocol for each statistical signal produced (6). Important outcomes of this pre-operational phase should be the building of a national network of experts and collective action and evaluation plans. While some of the more applied steps (4 and 5) are currently receiving consideration, more emphasis should be put on the earlier conceptual steps by decision makers and surveillance developers (1-3).
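
The aberration-detection step on syndrome time series can be sketched with one of the simplest detectors, an exponentially weighted moving average (EWMA) baseline with a fixed-multiple alarm threshold. The smoothing constant, threshold, and counts below are illustrative choices, not those of any operational system.

```python
# Minimal sketch of aberration detection on a syndrome time series: an
# EWMA baseline updated day by day, with an alarm whenever a count
# exceeds a fixed multiple of the baseline. Parameters are hypothetical.

def ewma_alarms(counts, alpha=0.3, threshold=2.0):
    """Flag days whose count exceeds `threshold` times the EWMA baseline."""
    alarms = []
    baseline = float(counts[0])
    for day, count in enumerate(counts[1:], start=1):
        if count > threshold * baseline:
            alarms.append(day)
        baseline = alpha * count + (1 - alpha) * baseline
    return alarms

# Daily counts of a hypothetical respiratory syndrome, spike on day 6.
daily = [4, 5, 4, 6, 5, 5, 18, 6, 5]
signals = ewma_alarms(daily)
```

Operational systems typically use more robust baselines (regression on historic data, day-of-week effects), but the signal/response loop is the same.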

Relevance: 80.00%

Abstract:

OBJECTIVE To investigate the planning of subgroup analyses in protocols of randomised controlled trials and the agreement with corresponding full journal publications. DESIGN Cohort of protocols of randomised controlled trials and subsequent full journal publications. SETTING Six research ethics committees in Switzerland, Germany, and Canada. DATA SOURCES 894 protocols of randomised controlled trials involving patients, approved by participating research ethics committees between 2000 and 2003, and 515 subsequent full journal publications. RESULTS Of 894 protocols of randomised controlled trials, 252 (28.2%) included one or more planned subgroup analyses. Of those, 17 (6.7%) provided a clear hypothesis for at least one subgroup analysis, 10 (4.0%) anticipated the direction of a subgroup effect, and 87 (34.5%) planned a statistical test for interaction. Industry sponsored trials more often planned subgroup analyses compared with investigator sponsored trials (195/551 (35.4%) v 57/343 (16.6%), P<0.001). Of 515 identified journal publications, 246 (47.8%) reported at least one subgroup analysis. In 81 (32.9%) of the 246 publications reporting subgroup analyses, authors stated that subgroup analyses were prespecified, but this was not supported by 28 (34.6%) corresponding protocols. In 86 publications, authors claimed a subgroup effect, but only 36 (41.9%) corresponding protocols reported a planned subgroup analysis. CONCLUSIONS Subgroup analyses are insufficiently described in the protocols of randomised controlled trials submitted to research ethics committees, and investigators rarely specify the anticipated direction of subgroup effects. More than one third of statements in publications of randomised controlled trials about subgroup prespecification had no documentation in the corresponding protocols. Definitive judgments regarding credibility of claimed subgroup effects are not possible without access to protocols and analysis plans of randomised controlled trials.

Relevance: 80.00%

Abstract:

The consistency of an existing reconstructed annual (December–November) temperature series for the Lisbon region (Portugal) from 1600 onwards, based on a European-wide reconstruction, with (1) five local borehole temperature–depth profiles; (2) synthetic temperature–depth profiles, generated from both reconstructed temperatures and two regional paleoclimate simulations in Portugal; (3) instrumental data sources over the twentieth century; and (4) temperature indices from documentary sources during the late Maunder Minimum (1675–1715) is assessed. The low-frequency variability in the reconstructed temperature in Portugal is not entirely consistent with local borehole temperature–depth profiles or with the simulated response of temperature in two regional paleoclimate simulations driven by reconstructions of various climate forcings. Therefore, the existing reconstructed series is calibrated by adjusting its low-frequency variability to the simulations (first-stage adjustment). The annual reconstructed series is then calibrated in its location and scale parameters, using the instrumental series and a linear regression between them (second-stage adjustment). This calibrated series shows clear footprints of the Maunder and Dalton minima, commonly related to changes in solar activity and explosive volcanic eruptions, and a strong recent-past warming, commonly related to human-driven forcing. Lastly, it is also in overall agreement with annual temperature indices over the late Maunder Minimum in Portugal. The series resulting from this post-reconstruction adjustment can be highly relevant for improving the current understanding of the driving mechanisms of climate variability in Portugal.
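
The second-stage adjustment (calibrating the location and scale of the reconstruction against the instrumental series via linear regression) can be sketched as follows; the series are synthetic and the actual study's regression setup may differ in detail.

```python
# Sketch of a location/scale calibration: ordinary least squares
# regression of the instrumental series on the reconstructed series over
# their overlap period, then mapping the full reconstruction through the
# fitted line. All series below are synthetic.

def ols_fit(x, y):
    """Slope and intercept of y = a*x + b by least squares."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    return a, my - a * mx

def calibrate(reconstruction, overlap_pairs):
    """overlap_pairs: (reconstructed, instrumental) values for the years
    where both series exist; returns the recalibrated reconstruction."""
    a, b = ols_fit([r for r, _ in overlap_pairs],
                   [t for _, t in overlap_pairs])
    return [a * r + b for r in reconstruction]

# Synthetic overlap: instrumental temps have twice the scale of the proxy.
overlap = [(14.0, 15.0), (14.5, 16.0), (15.0, 17.0), (15.5, 18.0)]
adjusted = calibrate([13.0, 14.0, 15.0, 16.0], overlap)
```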

Relevance: 80.00%

Abstract:

The Swiss Consultant Trust Fund (CTF) support covered the period from July to December 2007 and comprised four main tasks: (1) analysis of historic land degradation trends in the four watersheds of Zerafshan, Surkhob, Toirsu, and Vanj; (2) translation of standard CDE GIS training materials into Russian and Tajik to enable local government staff and other specialists to use geospatial data and tools; (3) demonstration of geospatial tools that show land degradation trends associated with land use and vegetative cover data in the project areas; and (4) preliminary training of government staff in using appropriate data, including existing information, global datasets, inexpensive satellite imagery and other datasets, and web-based visualization tools such as spatial data viewers. The project allowed building of local awareness of, and skills in, up-to-date, inexpensive, easy-to-use GIS technologies, data sources, and applications relevant to natural resource management and especially to sustainable land management. In addition to supporting the implementation of the World Bank technical assistance activity to build capacity in the use of geospatial tools for natural resource management, the Swiss CTF support also aimed at complementing the Bank's supervision work on the ongoing Community Agriculture and Watershed Management Project (CAWMP).

Relevance: 80.00%

Abstract:

The new computing paradigm known as cognitive computing attempts to imitate the human capabilities of learning, problem solving, and considering things in context. To do so, an application (a cognitive system) must learn from its environment (e.g., by interacting with various interfaces). These interfaces can run the gamut from sensors to humans to databases. Accessing data through such interfaces allows the system to conduct cognitive tasks that can support humans in decision-making or problem-solving processes. Cognitive systems can be integrated into various domains (e.g., medicine or insurance). For example, a cognitive system in cities can collect data, learn from various data sources, and then attempt to connect these sources to provide real-time optimizations of subsystems within the city (e.g., the transportation system). In this study, we provide a methodology for integrating a cognitive system that allows data to be verbalized, making the causalities and hypotheses generated from the cognitive system more understandable to humans. We abstract a city subsystem—passenger flow for a taxi company—by applying fuzzy cognitive maps (FCMs). FCMs can be used as a mathematical tool for modeling complex systems built by directed graphs with concepts (e.g., policies, events, and/or domains) as nodes and causalities as edges. As a verbalization technique we introduce the restriction-centered theory of reasoning (RCT). RCT addresses the imprecision inherent in language by introducing restrictions. Using this underlying combinatorial design, our approach can handle large data sets from complex systems and make the output understandable to humans.
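
The FCM formalism can be illustrated with a toy map: concepts as nodes, signed causal weights as edges, and an iterative state update through a squashing function. The concepts and weights below are invented for illustration and are not taken from the paper's taxi case study.

```python
# Toy fuzzy cognitive map (FCM): the state vector holds concept
# activations, W[i][j] is the causal weight of the edge i -> j, and the
# state is updated synchronously through a sigmoid until it stabilises.

import math

def fcm_step(state, weights):
    """One synchronous FCM update with a sigmoid squashing function."""
    new_state = []
    for j in range(len(state)):
        activation = sum(state[i] * weights[i][j] for i in range(len(state)))
        new_state.append(1 / (1 + math.exp(-activation)))
    return new_state

def fcm_run(state, weights, steps=50, tol=1e-6):
    """Iterate until the state changes by less than tol, or steps expire."""
    for _ in range(steps):
        nxt = fcm_step(state, weights)
        if max(abs(a - b) for a, b in zip(nxt, state)) < tol:
            return nxt
        state = nxt
    return state

# Hypothetical concepts: 0 = rain, 1 = taxi demand, 2 = waiting time.
W = [[0.0, 0.8, 0.2],   # rain increases demand and waiting time
     [0.0, 0.0, 0.7],   # demand increases waiting time
     [0.0, -0.4, 0.0]]  # long waits suppress demand slightly
final = fcm_run([1.0, 0.5, 0.5], W)
```

A verbalization layer such as RCT would then translate the converged activations and the causal edges into restricted natural-language statements.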

Relevance: 80.00%

Abstract:

High-resolution records of calibrated proxy data for the past millennium are fundamental to place current changes into the context of pre-industrial natural forced and unforced variability. Although the need for regional spatially explicit comprehensive reconstructions is widely recognized, proxy data sources are still scarce, particularly for the Southern Hemisphere and especially for South America. We present a 600-year-long warm season temperature record from varved sediments of Lago Plomo, a proglacial lake of the Northern Patagonian Ice Field in Southern Chile (46°59′S, 72°52′W, 203 m a.s.l.). The thickness of the bright summer sediment layer relative to the dark winter layer (measured as total brightness; % reflectance 400–730 nm) is calibrated against warm season SONDJF temperature (1900–2009; r = 0.58, p(aut) = 0.056, RE = 0.52, CE = 0.15, RMSEP = 0.28 °C; five-year triangular filtered data). In Lago Plomo, warm summer temperatures lead to enhanced glacier melt and suspended sediment transport, which results in thicker light summer layers and brighter sediments. Although Patagonia shows pronounced regional differences in decadal temperature trends and variability, the 600-year temperature reconstruction from Lago Plomo compares favourably with other regional and continental temperature records, but also emphasizes significant regional differences for which no data and information existed so far. These regional differences seem to be real, as they are also reflected in modern climate data sets (1900–2010). The reconstruction shows pronounced subdecadal to multidecadal variability with cold phases during parts of the Little Ice Age (16th and 18th centuries) and in the beginning of the 20th century. The most prominent warm phase is the 19th century, which is as warm as the second half of the 20th century. The exceptional summer warmth AD 1780–1810 is also found in other archives of Northern Patagonia and Central Chile. 
Our record shows the delayed 20th century warming in the Southern Hemisphere. The comparison between winter precipitation and summer temperature (inter-seasonal coupling) from Lago Plomo reveals alternating phases with parallel and contrasting decadal trends of winter precipitation and summer temperature (positive and negative running correlations R(winter PP; summer TT)). This observation from the sediment proxy data is also confirmed by two sets of reanalysis data for the 20th century. Reanalysis data show that phases with negative correlations between winter precipitation and summer temperature (e.g., dry winters and warm summers) at Lago Plomo are characteristic for periods when the circumpolar Westerly flow is displaced southward and enhanced around 60°S.
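
The calibration skill scores quoted above (RE, CE, RMSEP) can be computed as sketched below: RE is evaluated against the calibration-period mean and CE against the verification-period mean. The series here are synthetic, not the Lago Plomo data.

```python
# Standard verification statistics for a proxy calibration: reduction of
# error (RE), coefficient of efficiency (CE), and root mean square error
# of prediction (RMSEP). Input series are synthetic anomalies.

import math

def verification_stats(obs, rec, calibration_mean):
    """RE, CE, RMSEP of reconstruction `rec` against observations `obs`.

    calibration_mean is the observed mean over the calibration period;
    CE uses the verification-period mean instead.
    """
    sq_err = sum((o - r) ** 2 for o, r in zip(obs, rec))
    ver_mean = sum(obs) / len(obs)
    re = 1 - sq_err / sum((o - calibration_mean) ** 2 for o in obs)
    ce = 1 - sq_err / sum((o - ver_mean) ** 2 for o in obs)
    rmsep = math.sqrt(sq_err / len(obs))
    return re, ce, rmsep

obs = [0.1, -0.3, 0.4, 0.0, -0.2]   # synthetic observed anomalies
rec = [0.2, -0.2, 0.3, -0.1, -0.1]  # synthetic reconstruction
re, ce, rmsep = verification_stats(obs, rec, calibration_mean=0.3)
```

CE is the stricter score (it is never above RE when the calibration mean differs from the verification mean), which is why both are usually reported together.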

Relevance: 80.00%

Abstract:

OBJECTIVES: To (1) review the development and medical applications of hydroxyethyl starch (HES) solutions with particular emphasis on their physicochemical properties; (2) critically appraise the available evidence in human and veterinary medicine; and (3) evaluate the potential risks and benefits associated with their use in critically ill small animals. DATA SOURCES: Human and veterinary original research articles, scientific reviews, and textbook sources from 1950 to the present. HUMAN DATA SYNTHESIS: HES solutions have been used extensively in people for over 30 years, and ever since their introduction there has been a great deal of debate over their safety and efficacy. Recently, results of seminal trials and meta-analyses showing increased risks of kidney dysfunction and mortality in septic and critically ill patients have led to the restriction of HES use in these patient populations by European regulatory authorities. Although the initial ban on the use of HES in Europe has been eased, proof regarding the benefits and safety profile of HES in trauma and surgical patient populations has been requested by these same European regulatory authorities. VETERINARY DATA SYNTHESIS: The veterinary literature is limited mostly to experimental studies and clinical investigations with small populations of patients and short-term end points, and there is insufficient evidence to generate recommendations. CONCLUSIONS: Currently, there are no consensus recommendations regarding the use of HES in veterinary medicine. Veterinarians and institutions affected by the HES restrictions have had to critically reassess the risks and benefits related to HES usage based on the available information and sometimes adapt their procedures and policies accordingly. Meanwhile, large, prospective, randomized veterinary studies evaluating HES use are needed to achieve relevant levels of evidence to enable formulation of specific veterinary guidelines.

Relevance: 80.00%

Abstract:

Various flavours of a new research field on (socio-)physical or personal analytics have emerged, with the goal of deriving semantically rich insights from people's low-level physical sensing combined with their (online) social interactions. In this paper, we argue for more comprehensive data sources, including environmental (e.g. weather, infrastructure) and application-specific data, to better capture the interactions between users and their context, in addition to those among users. To illustrate our proposed concept of synergistic user <-> context analytics, we first provide some example use cases. Then, we present our ongoing work towards a synergistic analytics platform: a testbed based on mobile crowdsensing and the Internet of Things (IoT), a data model for representing the different sources of data and their connections, and a prediction engine for analyzing the data and producing insights.