963 results for "Scientific data"


Abstract:

BACKGROUND: There is uncertain evidence of the effectiveness of 5-aminosalicylates (5-ASA) for inducing and maintaining response and remission in active Crohn's disease (CD), and weak evidence to support their use in post-operative CD. AIM: To assess the frequency and determinants of 5-ASA use in CD patients and to evaluate physicians' perception of clinical response and side effects of 5-ASA. METHODS: Data from the Swiss Inflammatory Bowel Disease Cohort, which has collected data on a large sample of IBD patients since 2006, were analysed. Information from questionnaires regarding the utilisation of treatments and the perception of response to 5-ASA was evaluated. Logistic regression modelling was performed to identify factors associated with 5-ASA use. RESULTS: Of 1420 CD patients, 835 (59%) were treated with 5-ASA at some point between diagnosis and latest follow-up. Disease duration >10 years and colonic location were both significantly associated with 5-ASA use. 5-ASA treatment was judged successful in 46% (378/825) of treatment episodes (physician global assessment). Side effects prompting discontinuation of therapy were recorded in 12% (98/825) of treatment episodes. CONCLUSIONS: 5-Aminosalicylates were frequently prescribed to patients with Crohn's disease in the Swiss IBD cohort. This observation stands in contrast to the scientific evidence demonstrating a very limited role for 5-ASA compounds in the treatment of Crohn's disease.
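As an illustration of the logistic regression step described in METHODS, a minimal sketch in Python using statsmodels, with synthetic data and hypothetical variable names (the cohort data themselves are not public):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 1420  # cohort size from the abstract

# Synthetic stand-in for the cohort: binary covariates for the two
# factors the abstract reports as significant.
df = pd.DataFrame({
    "duration_gt10y": rng.integers(0, 2, n),  # disease duration > 10 years
    "colonic": rng.integers(0, 2, n),         # colonic disease location
})
# Simulate 5-ASA use with both factors raising the odds (invented effect sizes).
lin = -0.5 + 0.8 * df["duration_gt10y"] + 0.6 * df["colonic"]
df["used_5asa"] = (rng.random(n) < 1 / (1 + np.exp(-lin))).astype(int)

model = smf.logit("used_5asa ~ duration_gt10y + colonic", data=df).fit()
print(np.exp(model.params))  # odds ratios for each factor
```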

Abstract:

The pigments and plasters of the Roman frescoes discovered at the House of Diana (Cosa, Grosseto, Italy) were analysed using non-destructive and destructive mineralogical and chemical techniques. Both pigments and plasters were characterized through optical microscopy, scanning electron microscopy and electron microprobe analysis. The pigments were identified by Raman spectroscopy and submitted to stable isotope analysis. The results were integrated with the archaeological data in order to reconstruct the provenance, trade patterns and use of the raw materials employed in producing the frescoes.

Abstract:

Compositional data naturally arise from the scientific analysis of the chemical composition of archaeological material such as ceramic and glass artefacts. Data of this type can be explored using a variety of techniques, from standard multivariate methods such as principal components analysis and cluster analysis to methods based upon the use of log-ratios. The general aim is to identify groups of chemically similar artefacts that could potentially be used to answer questions of provenance. This paper will demonstrate work in progress on the development of a documented library of methods, implemented using the statistical package R, for the analysis of compositional data. R is an open source package that makes very powerful statistical facilities available at no cost. We aim to show how, with the aid of statistical software such as R, traditional exploratory multivariate analysis can easily be used alongside, or in combination with, specialist techniques of compositional data analysis. The library has been developed from a core of basic R functionality, together with purpose-written routines arising from our own research (for example that reported at CoDaWork'03). In addition, we have included other appropriate publicly available techniques and libraries that have been implemented in R by other authors. Available functions range from standard multivariate techniques through to various approaches to log-ratio analysis and zero replacement. We also discuss and demonstrate a small selection of relatively new techniques that have hitherto been little used in archaeometric applications involving compositional data. The application of the library to the analysis of data arising in archaeometry will be demonstrated; results from different analyses will be compared; and the utility of the various methods discussed.
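The paper's library is written in R and is not reproduced here; as a loose sketch of the core idea it combines (a log-ratio transform followed by standard multivariate analysis), here is a minimal Python version using a centred log-ratio (clr) transform and PCA via SVD, on made-up ceramic compositions:

```python
import numpy as np

def clr(X):
    """Centred log-ratio transform for compositional data.
    Each row of X is a composition of strictly positive parts."""
    logX = np.log(X)
    return logX - logX.mean(axis=1, keepdims=True)

# Hypothetical compositions: 5 ceramic samples, 4 oxide concentrations.
X = np.array([
    [55.0, 20.0, 15.0, 10.0],
    [52.0, 22.0, 16.0, 10.0],
    [60.0, 18.0, 12.0, 10.0],
    [30.0, 40.0, 20.0, 10.0],
    [28.0, 42.0, 21.0,  9.0],
])
X = X / X.sum(axis=1, keepdims=True)  # close the data to a unit sum

Z = clr(X)
# PCA on the clr-transformed data via SVD of the centred matrix.
Zc = Z - Z.mean(axis=0)
U, s, Vt = np.linalg.svd(Zc, full_matrices=False)
scores = U * s  # sample scores; chemically similar artefacts plot close together
print(scores[:, :2])
```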

Abstract:

In their safety evaluations of bisphenol A (BPA), the U.S. Food and Drug Administration (FDA) and its counterpart in Europe, the European Food Safety Authority (EFSA), have given special prominence to two industry-funded studies that adhered to standards defined by Good Laboratory Practices (GLP). These same agencies have given much less weight in risk assessments to a large number of independently replicated non-GLP studies conducted with government funding by leading experts in various fields of science from around the world. OBJECTIVES: We reviewed differences between industry-funded GLP studies of BPA conducted by commercial laboratories for regulatory purposes and non-GLP studies conducted in academic and government laboratories to identify hazards and molecular mechanisms mediating adverse effects. We examined the methods and results in the GLP studies that were pivotal in the draft decision of the U.S. FDA declaring BPA safe in relation to findings from studies that were competitive for U.S. National Institutes of Health (NIH) funding, peer-reviewed for publication in leading journals, subject to independent replication, but rejected by the U.S. FDA for regulatory purposes. DISCUSSION: Although the U.S. FDA and EFSA have deemed two industry-funded GLP studies of BPA to be superior to hundreds of studies funded by the U.S. NIH and NIH counterparts in other countries, the GLP studies on which the agencies based their decisions have serious conceptual and methodologic flaws. In addition, the U.S. FDA and EFSA have mistakenly assumed that GLP yields valid and reliable scientific findings (i.e., "good science"). Their rationale for favoring GLP studies over hundreds of publicly funded studies ignores the central factors in determining the reliability and validity of scientific findings, namely, independent replication and use of the most appropriate and sensitive state-of-the-art assays, neither of which is an expectation of industry-funded GLP research. CONCLUSIONS: Public health decisions should be based on studies using appropriate protocols with appropriate controls and the most sensitive assays, not GLP. Relevant NIH-funded research using state-of-the-art techniques should play a prominent role in safety evaluations of chemicals.

Abstract:

Introduction: The Andalusian Public Health System Virtual Library (Biblioteca Virtual del Sistema Sanitario Público de Andalucía, BV-SSPA) was set up in June 2006. It is a regional government initiative aimed at democratizing health professionals' access to quality scientific information, regardless of their workplace. Andalusia is a region of more than 8 million inhabitants, with 100,000 health professionals across 41 hospitals, 1,500 primary healthcare centres, and 28 non-clinical centres (research, management and educational centres). Objectives: The Department of Research, Development and Innovation (R+D+i) of the Andalusian Regional Government has, among its duties, the task of evaluating the hospitals and centres of the Andalusian Public Health System (SSPA) in order to distribute its funding. Among the criteria used is the evaluation of scientific output, which is measured using bibliometrics. It is well known that bibliometrics has a series of limitations and problems that should be taken into account, especially when it is used for purposes beyond information science, such as career or funding decisions. A few years ago, bibliometric reports were produced separately in each centre, but without preset, well-defined criteria, which are essential when the results of the reports must be compared. Some hospitals included meeting abstracts in their figures while others did not, and the same happened with errata and many other differences. Therefore, the main problem the Department of R+D+i faced when evaluating the health system was that the bibliometric data were not accurate and the reports were not comparable. In order to establish unified criteria for the whole system, the Department of R+D+i commissioned the BV-SSPA to carry out the annual analysis of the system's scientific output, using well-defined criteria and indicators, foremost among them the Impact Factor. Materials and Methods: As the Impact Factor is the bibliometric indicator the virtual library is asked to consider, it is necessary to use the Web of Science (WoS) database, since its owner and editor publishes the Impact Factor. The WoS includes the databases Science Citation Index (SCI), Social Sciences Citation Index (SSCI) and Arts & Humanities Citation Index. SCI and SSCI are used to gather the documents; the Journal Citation Reports (JCR) is used to obtain Impact Factors and quartiles. Unlike other bibliographic databases, such as MEDLINE, the bibliometric database WoS records the addresses of all authors. In order to retrieve the entire scientific output of the SSPA, we run general searches, which are afterwards processed by a tool developed by our library. We run nine different searches on the 'address' field: eight combining 'Spain' with each of the eight Andalusian provinces, and one combining 'Spain' with all the cities where there are health centres, since we have detected that some authors do not include the province in their affiliations. These are some of the search strategies: AD=Malaga and AD=Spain; AD=Sevill* and AD=Spain; AD=SPAIN AND (AD=GUADIX OR AD=BAZA OR AD=MOTRIL). Furthermore, the 'year' field is used to delimit the period. To exploit the data, the BV-SSPA has developed a tool called Impactia: a web application that uses a database to store information on the documents generated by the SSPA.
Impactia allows the user to automatically process the retrieved documents, assigning each to its corresponding centre. In order to classify documents automatically, it was necessary to handle the huge variability in the centre names that authors use in their affiliations. Impactia therefore knows that an author signing as "Hospital Universitario Virgen Macarena", "HVM" or "Hosp. Virgin Macarena" belongs to the same centre. The attached figure shows the variability found for the Empresa Publica Hospital de Poniente. Besides the documents from WoS, Impactia includes the documents indexed in Scopus and in other databases, where we run bibliographic searches with strategies similar to those above. Aware that health centres and hospitals produce a great deal of grey literature that is not gathered in databases, Impactia allows the centres to feed these documents into the application, so that the entire SSPA scientific output is gathered and organised in one centralized place. The librarians of each centre are responsible for locating this grey literature; they can also annotate the documents and the indicators that Impactia collects and calculates. The bulk upload of documents from WoS and Scopus into Impactia is done monthly. One of the main issues we faced during the development of Impactia was the need to deal with duplicate documents obtained from different sources. Since titles are sometimes written differently, with slashes, commas and so on, Impactia detects duplicates using the 'DOI' field when it is available, or otherwise by comparing the fields page start, page end and ISSN. This makes it possible to guarantee the absence of duplicates. Results: The data gathered in Impactia are made available to administrative teams and hospital managers through a simple web page that lets them see at any moment, with just one click, detailed information on the scientific output of their hospitals, including useful graphs such as the percentage of each document type, the journals where their scientists usually publish, annual comparisons, bibliometric indicators and so on. They can also compare the different centres of the SSPA. Impactia allows users to download the data from the application, so that they can work with the information or include it in their centres' reports. This application saves the health system many working hours: the work previously done manually by forty-one librarians is now done by a single person in the BV-SSPA in two days a month. To sum up, the benefits of Impactia are: it has shown its effectiveness in the automatic classification, treatment and analysis of the data; it has become an essential tool for managers to evaluate quickly and easily the scientific production of their centres; it optimizes the human resources of the SSPA, saving time and money; and it is the reference point for the Department of R+D+i in evaluating scientific health staff.
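The deduplication rule described above (DOI when present, otherwise the page range plus ISSN) is straightforward to sketch. A minimal illustration in Python with hypothetical record fields; Impactia's actual implementation is not public:

```python
def dedup_key(rec):
    """Key for duplicate detection: prefer the DOI; otherwise fall back
    to the (page start, page end, ISSN) triple, since titles are unreliable."""
    if rec.get("doi"):
        return ("doi", rec["doi"].strip().lower())
    return ("pages", rec.get("page_start"), rec.get("page_end"), rec.get("issn"))

def merge_sources(*sources):
    """Merge records from several databases (e.g. WoS, Scopus),
    keeping the first occurrence of each duplicate group."""
    seen, merged = set(), []
    for source in sources:
        for rec in source:
            key = dedup_key(rec)
            if key not in seen:
                seen.add(key)
                merged.append(rec)
    return merged

wos = [{"doi": "10.1000/XYZ123", "page_start": "10", "page_end": "18", "issn": "1234-5678"}]
scopus = [{"doi": "10.1000/xyz123", "page_start": "10", "page_end": "18", "issn": "1234-5678"}]
print(len(merge_sources(wos, scopus)))  # 1: the same DOI up to case
```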

Abstract:

In this project, research was carried out both on finding predictors via clustering techniques and on reviewing free data mining software. The research is based on a case study in which, in addition to the KDD free software used by the scientific community, a new free tool for pre-processing the data is presented. The predictors are intended for the e-learning domain, as the data from which they are to be inferred are student marks from different e-learning environments. Through our case study, not only are clustering algorithms tested, but additional goals are also proposed.
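As an illustration of the clustering step, a minimal sketch with scikit-learn on synthetic grade data (the project's own datasets and pre-processing tool are not reproduced here):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Synthetic stand-in: rows are students, columns are assignment marks (0-10).
grades = np.vstack([
    rng.normal(8.5, 0.5, (30, 4)),  # consistently strong students
    rng.normal(5.0, 0.8, (30, 4)),  # average students
    rng.normal(2.5, 0.7, (30, 4)),  # students at risk
]).clip(0, 10)

X = StandardScaler().fit_transform(grades)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
# Cluster means per assignment can then be inspected as candidate predictors.
for k in range(3):
    print(k, grades[labels == k].mean(axis=0).round(2))
```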

Abstract:

The first discussion of compositional data analysis is attributable to Karl Pearson, in 1897. However, notwithstanding the recent developments in the algebraic structure of the simplex, more than twenty years after Aitchison's idea of log-transformations of closed data, the scientific literature is again full of statistical treatments of this type of data using traditional methodologies. This is particularly true in environmental geochemistry, where besides the problem of closure, the spatial structure (dependence) of the data has to be considered. In this work we propose the use of log-contrast values, obtained by a simplicial principal component analysis, as indicators of given environmental conditions. The investigation of the log-contrast frequency distributions allows us to point out the statistical laws able to generate the values and to govern their variability. The changes, if compared, for example, with the mean values of the random variables assumed as models, or other reference parameters, allow us to define monitors to be used to assess the extent of possible environmental contamination. A case study on running and ground waters from the Chiavenna Valley (Northern Italy), using Na+, K+, Ca2+, Mg2+, HCO3-, SO42- and Cl- concentrations, will be illustrated.
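A log-contrast is a linear combination of log-parts whose coefficients sum to zero, which makes it invariant to closure. A hedged sketch of the monitoring idea in Python, with made-up ion concentrations and an arbitrary balance (the paper's actual log-contrasts come from its simplicial principal component analysis):

```python
import numpy as np

# Made-up water samples: columns are (Na, K, Ca, Mg, HCO3, SO4, Cl) in mg/L.
samples = np.array([
    [12.0, 1.5, 48.0, 9.0, 180.0, 22.0,  8.0],
    [11.0, 1.4, 50.0, 9.5, 175.0, 25.0,  9.0],
    [40.0, 3.0, 30.0, 6.0, 120.0, 90.0, 60.0],  # anomalous sample
])

# A log-contrast: coefficients sum to zero, so the value is unchanged by
# closure. Here: alkaline-earth ions contrasted with SO4 and Cl.
coeffs = np.array([0.0, 0.0, 0.5, 0.5, 0.0, -0.5, -0.5])
assert abs(coeffs.sum()) < 1e-12
values = np.log(samples) @ coeffs

# Monitoring: flag values far from reference parameters (assumed known here).
ref_mean, ref_sd = 1.0, 0.5
flags = np.abs(values - ref_mean) > 2 * ref_sd
print(values.round(2), flags)
```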

Abstract:

NovoTTF-100A (TTF) is a portable device delivering low-intensity, intermediate-frequency, alternating electric fields using noninvasive, disposable scalp electrodes. TTF interferes with tumor cell division, and it has been approved by the US Food and Drug Administration (FDA) for the treatment of recurrent glioblastoma (rGBM) based on data from a phase III trial. This presentation describes the updated survival data 2 years after completing recruitment. Adults with rGBM (KPS ≥ 70) were randomized (stratified by surgery and center) to either continuous TTF (20-24 h/day, 7 days/week) or efficacious chemotherapy based on best physician choice (BPC). The primary endpoint was overall survival (OS), and secondary endpoints were PFS6, 1-year survival, and QOL. Patients were randomized (28 US and European centers) to either TTF alone (n = 120) or BPC (n = 117). Patient characteristics were balanced, median age was 54 years (range, 23-80 years), and median KPS was 80 (range, 50-100). One quarter of the patients had debulking surgery, and over half of the patients were at their second or later recurrence. OS in the intent-to-treat (ITT) population was equivalent in TTF versus BPC patients (median OS, 6.6 vs. 6.0 months; n = 237; p = 0.26; HR = 0.86). With a median follow-up of 33.6 months, long-term survival in the TTF group was higher than that in the BPC group at 2, 3, and 4 years of follow-up (9.3% vs. 6.6%; 8.4% vs. 1.4%; 8.4% vs. 0.0%, respectively). Analysis of patients who received at least one treatment course demonstrated a survival benefit for TTF patients compared to BPC patients (median OS, 7.8 vs. 6.0 months; n = 93 vs. n = 117; p = 0.012; HR = 0.69). In this group, 1-year survival was 28% vs. 20%, and PFS6 was 26.2% vs. 15.2% (p = 0.034). TTF, a noninvasive, novel cancer treatment modality, shows significant therapeutic efficacy with promising long-term survival results. The impact of TTF was more pronounced when comparing only patients who received the minimal treatment course. A large-scale phase III trial in newly diagnosed GBM is ongoing.

Abstract:

OBJECTIVE: To present a methodology for transferring knowledge to improve maternal outcomes in natural delivery based on scientific evidence. METHOD: An intervention study conducted in the maternity hospital of Itapecerica da Serra, SP, with 50 puerperal women and 102 medical records, from July to November 2014. The PACES tool from the Joanna Briggs Institute was used, consisting of a pre-clinical audit (phase 1), implementation of best practice (phase 2) and a follow-up clinical audit (phase 3). Data were analysed by comparing the results of phases 1 and 3 with Fisher's exact test at a significance level of 5%. RESULTS: The vertical position was adopted by the majority of puerperal women, with a statistically significant difference between phases 1 and 3. A significant increase in bathing/showering, walking and massage for pain relief was found in the medical records. No statistical difference was found in other practices and outcomes. Barriers and difficulties in the implementation of evidence-based practices were identified. Variables were refined, techniques and data collection instruments were verified, and an intervention proposal was made. CONCLUSION: The study found possibilities for implementing a methodology of practices based on scientific evidence for assistance in natural delivery.
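The phase comparison relies on Fisher's exact test on 2×2 counts. A minimal sketch with SciPy, using invented counts (the study's actual frequencies are not given in the abstract):

```python
from scipy.stats import fisher_exact

# Invented 2x2 table: rows are phases 1 and 3, columns are
# (adopted the vertical position, did not adopt).
table = [[12, 38],
         [31, 19]]
odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, p = {p_value:.4f}")  # significant at 5% if p < 0.05
```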

Abstract:

This user manual for the "Ocean Data View" (ODV) data visualization software describes the exploration, analysis and visualization of oceanographic data in the format of the World Ocean Database (WOD). The manual comprises six practical exercises describing, step by step, the creation of metavariables, the import of data, and their visualization through latitude/longitude maps, scatter plots, vertical sections and time series. Extensive use of ODV for the visualization of oceanographic data by IMARPE scientific staff is recommended.

Abstract:

BACKGROUND: Synthesizing research evidence using systematic and rigorous methods has become a key feature of evidence-based medicine and knowledge translation. Systematic reviews (SRs) may or may not include a meta-analysis, depending on the suitability of the available data. They are often criticised as 'secondary research' and denied the status of original research. Scientific journals play an important role in the publication process: how they appraise a given type of research influences the status of that research in the scientific community. We investigated the attitudes of editors of core clinical journals towards SRs and their value for publication. METHODS: We identified the 118 journals labelled as "core clinical journals" by the National Library of Medicine, USA in April 2009. The journals' editors were surveyed by email in 2009 and asked whether they considered SRs to be original research projects; whether they published SRs; and for which section of the journal they would consider an SR manuscript. RESULTS: The editors of 65 journals (55%) responded. Most respondents considered SRs to be original research (71%), and almost all journals (93%) published SRs. Several editors regarded the use of Cochrane methodology or a meta-analysis as quality criteria; for some respondents these criteria were prerequisites for considering SRs original research. Journals placed SRs in various sections, such as "Review" or "Feature article". Characterization of the non-responding journals showed that about two thirds do publish systematic reviews. DISCUSSION: Currently, the editors of most core clinical journals consider SRs original research. Our findings are limited by a non-response rate of 45%. Individual comments suggest that this is a grey area and that attitudes differ widely. A debate about the definition of 'original research' in the context of SRs is warranted.

Abstract:

As a rigorous combination of probability theory and graph theory, Bayesian networks currently enjoy widespread interest as a means for studying factors that affect the coherent evaluation of scientific evidence in forensic science. Paper I of this series intends to contribute to the discussion of Bayesian networks as a framework that is helpful for both illustrating and implementing statistical procedures commonly employed for the study of uncertainties (e.g. the estimation of unknown quantities). While the respective statistical procedures are widely described in the literature, the primary aim of this paper is to offer an essentially non-technical introduction to how interested readers may use these analytical approaches, with the help of Bayesian networks, for processing their own forensic science data. Attention is mainly drawn to the structure and underlying rationale of a series of basic and context-independent network fragments that users may incorporate as building blocks when constructing larger inference models. As an example of how this may be done, the proposed concepts will be used in a second paper (Part II) to specify graphical probability networks whose purpose is to assist forensic scientists in the evaluation of scientific evidence encountered in the context of forensic document examination (i.e. results of the analysis of black toners present on printed or copied documents).
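As an informal illustration of the kind of basic, context-independent fragment the paper describes, here is a minimal two-node sketch in Python with invented probabilities (hypothesis H, evidence E), computing the posterior by Bayes' theorem:

```python
# A two-node network fragment H -> E with invented probabilities.
p_h = 0.5                 # prior P(H = true)
p_e_given_h = 0.95        # P(E = true | H = true), e.g. a method's sensitivity
p_e_given_not_h = 0.10    # P(E = true | H = false), e.g. a false-positive rate

def posterior(prior, likelihood_h, likelihood_not_h):
    """P(H | E) by Bayes' theorem, marginalising E over both states of H."""
    p_e = likelihood_h * prior + likelihood_not_h * (1 - prior)
    return likelihood_h * prior / p_e

print(posterior(p_h, p_e_given_h, p_e_given_not_h))  # ~0.905

# The likelihood ratio, a common measure of evidential value in forensic science:
print(p_e_given_h / p_e_given_not_h)  # 9.5
```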

Abstract:

The present study explores the statistical properties of a randomization test based on the random assignment of the intervention point in a two-phase (AB) single-case design. The focus is on randomization distributions constructed from the values of the test statistic for all possible random assignments and used to obtain p-values. The shape of these distributions is investigated for each specific data division defined by the moment at which the intervention is introduced. A further aim of the study was to test the detection of nonexistent effects (i.e., the production of false alarms) in autocorrelated data series, in which the assumption of exchangeability between observations may be untenable. In this way, it was possible to compare nominal and empirical Type I error rates in order to obtain evidence on the statistical validity of the randomization test for each individual data division. The results suggest that when either of the two phases has considerably fewer measurement times, Type I errors may be too probable and, hence, the decision-making process carried out by applied researchers may be jeopardized.
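A sketch of the randomization test described above, in Python with synthetic data. The test statistic here is the absolute difference in phase means, one common choice; the study itself examines the full randomization distribution for each possible intervention point:

```python
import numpy as np

def randomization_test(y, intervention, min_phase=3):
    """Two-phase (AB) randomization test. The observed statistic is the
    absolute difference between phase means; the p-value is the share of
    admissible intervention points whose statistic is at least as large."""
    def stat(k):
        return abs(y[k:].mean() - y[:k].mean())
    observed = stat(intervention)
    # All admissible random assignments of the intervention point.
    candidates = range(min_phase, len(y) - min_phase + 1)
    dist = np.array([stat(k) for k in candidates])
    return observed, (dist >= observed).mean()

rng = np.random.default_rng(1)
# Synthetic AB series with a genuine level shift at the 10th observation.
y = np.concatenate([rng.normal(0, 1, 10), rng.normal(2, 1, 10)])
print(randomization_test(y, intervention=10))
```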

Abstract:

This study looks at how increased memory utilisation affects throughput and energy consumption in scientific computing, especially in high-energy physics. Our aim is to minimise the energy consumed by a set of jobs without increasing the processing time. Earlier tests indicated that, especially in data analysis, throughput can increase by over 100% and energy consumption can decrease by 50% when multiple jobs are processed in parallel per CPU core. Since jobs are heterogeneous, it is not possible to find a single optimum value for the number of parallel jobs. A better solution is based on memory utilisation, but finding an optimum memory threshold is not straightforward. Therefore, a fuzzy logic-based algorithm was developed that can dynamically adapt the memory threshold based on the overall load. In this way, it is possible to keep memory consumption stable under different workloads while achieving significantly higher throughput and energy efficiency than traditional approaches based on a fixed number of jobs or a fixed memory threshold.
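As a loose illustration of the idea (the paper's actual rule base and membership functions are not reproduced here), a sketch in Python of a fuzzy controller that nudges a job-admission memory threshold up or down depending on the overall load, with invented membership functions:

```python
def membership_low(load):
    """Degree to which overall memory load is 'low' (load in [0, 1])."""
    return max(0.0, min(1.0, (0.6 - load) / 0.3))

def membership_high(load):
    """Degree to which overall memory load is 'high'."""
    return max(0.0, min(1.0, (load - 0.6) / 0.3))

def adapt_threshold(threshold, load, step=0.5):
    """One fuzzy update step (GiB): raise the memory threshold when load
    is low (admit more jobs), lower it when load is high. The two rule
    strengths are combined as a weighted difference (defuzzification)."""
    return threshold + step * (membership_low(load) - membership_high(load))

threshold = 8.0  # GiB; invented starting point
for load in [0.3, 0.5, 0.8, 0.95]:
    threshold = adapt_threshold(threshold, load)
    print(f"load={load:.2f} -> threshold={threshold:.2f} GiB")
```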