870 results for international new ventures
Abstract:
Introduction: Biological therapy has dramatically changed the management of Crohn's disease (CD). New data have confirmed the benefit and relative long-term safety of anti-TNF alpha inhibition as part of a regularly scheduled administration programme. The EPACT appropriateness criteria for maintenance treatment after medically-induced remission (MIR) or surgically-induced remission (SIR) of CD thus required updating. Methods: A multidisciplinary international expert panel (EPACT II, Geneva, Switzerland) discussed and anonymously rated detailed, explicit clinical indications based on evidence in the literature and personal expertise. Median ratings (on a 9-point scale) were stratified into three assessment categories: appropriate (7-9), uncertain (4-6 and/or disagreement) and inappropriate (1-3). Experts ranked appropriate medications according to their own clinical practice, without any consideration of cost. Results: Three hundred and ninety-two specific indications for maintenance treatment of CD were rated (200 for MIR and 192 for SIR). Azathioprine, methotrexate and/or anti-TNF alpha antibodies were considered appropriate in 42 indications, corresponding to 68% of all appropriate interventions (97% of MIR and 39% of SIR). The remaining appropriate interventions consisted of mesalazine and a "wait-and-see" strategy. Factors that influenced the panel's voting were patient characteristics and the outcome of previous treatment. The results favour use of anti-TNF alpha agents after failure of any immunosuppressive therapy, while earlier primary use remains controversial. Conclusion: Detailed explicit appropriateness criteria (EPACT) have been updated for maintenance treatment of CD. New expert recommendations for the use of the classic immunosuppressors as well as anti-TNF alpha agents are now freely available online (www.epact.ch). The validity of these criteria should now be tested by prospective evaluation. (C) 2009 European Crohn's and Colitis Organisation. Published by Elsevier B.V. All rights reserved.
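The panel's stratification rule reduces to a simple threshold mapping. A minimal sketch, using only the cut-offs stated in the abstract (appropriate 7-9, uncertain 4-6 and/or disagreement, inappropriate 1-3); the function and parameter names are illustrative, not part of the EPACT tooling:

```python
def stratify(median_rating: float, disagreement: bool = False) -> str:
    """Map a panel's median 9-point rating to an EPACT assessment category.

    Thresholds follow the abstract: appropriate (7-9),
    uncertain (4-6 and/or disagreement), inappropriate (1-3).
    """
    if disagreement:
        # Disagreement among experts forces the "uncertain" category
        # regardless of the median value.
        return "uncertain"
    if median_rating >= 7:
        return "appropriate"
    if median_rating >= 4:
        return "uncertain"
    return "inappropriate"

print(stratify(8))                      # appropriate
print(stratify(8, disagreement=True))   # uncertain
print(stratify(2))                      # inappropriate
```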
Abstract:
Although paraphrasing is the linguistic mechanism underlying many plagiarism cases, little attention has been paid to its analysis in the framework of automatic plagiarism detection. Therefore, state-of-the-art plagiarism detectors find it difficult to detect cases of paraphrase plagiarism. In this article, we analyse the relationship between paraphrasing and plagiarism, paying special attention to which paraphrase phenomena underlie acts of plagiarism and which of them are detected by plagiarism detection systems. With this aim in mind, we created the P4P corpus, a new resource which uses a paraphrase typology to annotate a subset of the PAN-PC-10 corpus for automatic plagiarism detection. The results of the Second International Competition on Plagiarism Detection were analysed in the light of this annotation. The experiments presented show that (i) more complex paraphrase phenomena and a high density of paraphrase mechanisms make plagiarism detection more difficult, (ii) lexical substitutions are the paraphrase mechanisms used most when plagiarising, and (iii) paraphrase mechanisms tend to shorten the plagiarised text. For the first time, the paraphrase mechanisms behind plagiarism have been analysed, providing critical insights for the improvement of automatic plagiarism detection systems.
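Why heavy lexical substitution defeats overlap-based detectors can be illustrated with a toy similarity measure. The sketch below uses Jaccard overlap of word sets; the example sentences and the 0.2 cut-off are invented for illustration and are not taken from the PAN-PC-10 systems:

```python
def jaccard(a: str, b: str) -> float:
    """Jaccard similarity between the word sets of two texts."""
    sa, sb = set(a.lower().split()), set(b.lower().split())
    return len(sa & sb) / len(sa | sb)

source = "the experiment confirmed the initial hypothesis"
copy   = "the experiment confirmed the initial hypothesis"
parap  = "the trial supported the original conjecture"

# A verbatim copy is trivially detected by word overlap.
assert jaccard(source, copy) == 1.0

# After lexical substitution the overlap collapses, so a purely
# overlap-based detector is likely to miss the reuse.
print(jaccard(source, parap))  # only "the" is shared
```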
Abstract:
This paper analyses the adoption of new information and communication technologies (ICTs) by Spanish journalists specialising in science. Applying an ethnographic research model, this study was based on a wide sample of professionals, aiming to evaluate the extent to which science journalists have adopted the new media and changed the way they use information sources. In addition, interviewees were asked whether, in their opinion, the Web 2.0 has had an impact on the quality of the news. The integration of formats certainly raises a number of issues for today's newsrooms. Finally, with the purpose of improving the practice of science information dissemination, the authors put forward a few proposals, namely: increasing the training of Spanish science journalists in the field of new technologies; emphasising the accuracy of information and the validation of sources; and rethinking the mandates and tasks of information professionals.
Abstract:
The research reported in this series of articles aimed (1) to automate the search of questioned ink specimens in ink reference collections and (2) to evaluate the strength of ink evidence in a transparent and balanced manner. These aims require that ink samples are analysed in an accurate and reproducible way and compared in an objective and automated way. The latter requirement is due to the large number of comparisons that are necessary in both scenarios. A research programme was designed to (a) develop a standard methodology for analysing ink samples in a reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in forensic contexts. This report focuses on the last of the three stages of the research programme. The calibration and acquisition process and the mathematical comparison algorithms were described in previous papers [C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science-Part I: Development of a quality assurance process for forensic ink analysis by HPTLC, Forensic Sci. Int. 185 (2009) 29-37; C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science-Part II: Development and testing of mathematical algorithms for the automatic comparison of ink samples analysed by HPTLC, Forensic Sci. Int. 185 (2009) 38-50]. In this paper, the benefits and challenges of the proposed concepts are tested in two forensic contexts: (1) ink identification and (2) ink evidential value assessment. The results show that different algorithms are better suited to different tasks. This research shows that it is possible to build digital ink libraries using the most commonly used ink analytical technique, i.e. high-performance thin-layer chromatography, despite its reputation for lacking reproducibility. More importantly, it is possible to assign evidential value to ink evidence in a transparent way using a probabilistic model. It is therefore possible to move away from the traditional subjective approach, which is entirely based on experts' opinions and is usually not very informative. While there is room for improvement, this report demonstrates the significant gains over the traditional subjective approach in both the search of ink specimens in ink databases and the interpretation of their evidential value.
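Probabilistic assignment of evidential value is commonly expressed as a likelihood ratio, LR = P(score | same source) / P(score | different sources), with LR > 1 supporting the same-source proposition. The following toy sketch illustrates the idea with invented Gaussian score distributions; it is not the authors' actual model, and all parameter values are hypothetical:

```python
import math

def normal_pdf(x: float, mu: float, sigma: float) -> float:
    """Density of a normal distribution at x."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))

def likelihood_ratio(score: float) -> float:
    """Toy likelihood ratio for an ink-comparison score.

    Hypothetical assumption: same-source comparison scores are
    ~N(0.9, 0.05) and different-source scores ~N(0.4, 0.15).
    LR > 1 supports the same-source proposition.
    """
    return normal_pdf(score, 0.9, 0.05) / normal_pdf(score, 0.4, 0.15)

print(likelihood_ratio(0.88))  # high score: LR >> 1, supports same source
print(likelihood_ratio(0.40))  # low score: LR << 1, supports different sources
```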
Abstract:
Cataract surgery is a common ocular surgical procedure consisting of the implantation of an artificial intraocular lens (IOL) to replace the ageing, dystrophic or damaged natural one. The management of postoperative ocular inflammation is a major challenge, especially in the context of pre-existing uveitis. The association of the implanted IOL with a drug delivery system (DDS) allows the prolonged intraocular release of anti-inflammatory agents after surgery. Thus the IOL-DDS represents an "all in one" strategy that simultaneously addresses both cataract and inflammation issues. Polymeric DDS loaded with two model anti-inflammatory drugs, triamcinolone acetonide (TA) and cyclosporine A (CsA), were manufactured in a novel way and tested for their efficiency in the management of intraocular inflammation during the 3 months following surgery. The study involved experimentally induced uveitis in rabbits. Experimental results showed that medicated DDS efficiently reduced ocular inflammation (decreases in protein concentration and inflammatory cells in the aqueous humour, and in clinical score). Additionally, more than 60% of the loading dose remained in the DDS at the end of the experiment, suggesting that the system could potentially cover longer inflammatory episodes. Thus, IOL-DDS were demonstrated to inhibit intraocular inflammation for at least 3 months after cataract surgery, representing a potential novel approach to cataract surgery in eyes with pre-existing uveitis.
Abstract:
Although prosthetic joint infection (PJI) is a rare event after arthroplasty, it represents a significant complication that is associated with high morbidity, the need for complex treatment, and substantial healthcare costs. An accurate and rapid diagnosis of PJI is crucial for treatment success. Current diagnostic methods in PJI are insufficient, with 10-30% false-negative cultures. Consequently, there is a need for research and development into new methods aimed at improving diagnostic accuracy and speed of detection. In this article, we review available conventional diagnostic methods for the diagnosis of PJI (laboratory markers, histopathology, synovial fluid and periprosthetic tissue cultures), new diagnostic methods (sonication of implants, specific and multiplex PCR, mass spectrometry) and innovative techniques under development (new laboratory markers, microcalorimetry, electrical methods, reverse transcription [RT]-PCR, fluorescence in situ hybridization [FISH], biofilm microscopy, microarray identification, and serological tests). The results of highly sensitive diagnostic techniques with unknown specificity should be interpreted with caution. An organism identified by a new method may represent a real pathogen that was unrecognized by conventional diagnostic methods, or contamination during specimen sampling, transportation, or processing. For accurate interpretation, additional studies are needed that evaluate the long-term outcome (usually >2 years) with or without antimicrobial treatment. It is expected that new rapid, accurate, and fully automated diagnostic tests will be developed soon.
Abstract:
Synthesis report. 1. Laboratory part: This first study describes the development and validation, according to international standards, of two techniques for measuring blood concentrations of voriconazole, a new broad-spectrum antifungal agent: 1) high-pressure liquid chromatography and 2) a bioassay using a mutant Candida strain hypersensitive to voriconazole. This work also revealed substantial and unpredictable inter- and intra-individual variability of voriconazole blood concentrations despite the use of the doses recommended by the manufacturer. This work was published in a peer-reviewed journal: "Variability of voriconazole plasma levels measured by new high-performance liquid chromatography and bioassay methods" by A. Pascual, V. Nieth, T. Calandra, J. Bille, S. Bolay, L.A. Decosterd, T. Buclin, P.A. Majcherczyk, D. Sanglard, O. Marchetti. Antimicrobial Agents and Chemotherapy, 2007; 51:137-43. 2. Clinical part: This second study prospectively evaluated the clinical impact of voriconazole blood concentrations on therapeutic efficacy and safety in patients with fungal infections. High blood concentrations were significantly associated with the occurrence of neurological toxicity (encephalopathy with confusion, hallucinations and myoclonus), and low blood concentrations with an insufficient response to antifungal treatment (persistence or progression of clinical and radiological signs of infection). In the majority of cases, adjusting the voriconazole dose on the basis of the measured concentrations led to complete neurological recovery or to resolution of the infection, respectively. This work was published in a peer-reviewed journal: "Voriconazole Therapeutic Drug Monitoring in Patients with Invasive Mycoses Improves Efficacy and Safety Outcomes" by A. Pascual, T. Calandra, S. Bolay, T. Buclin, J. Bille, and O. Marchetti. Clinical Infectious Diseases, 2008 January 15; 46(2): 201-11. These two studies, jointly funded by an international grant from the Swiss Society for Infectious Diseases and the International Society for Infectious Diseases and by the Foundation for the Advancement of Medical Microbiology and Infectious Diseases (FAMMID, Lausanne), were carried out in the Infectious Diseases Service, Department of Medicine, CHUV, in close collaboration with the Division of Clinical Pharmacology, Department of Medicine, CHUV, and the Institute of Microbiology of the CHUV and the University of Lausanne.
Abstract:
This paper is a preliminary report on the petrographic and geochemical characteristics of a new sulphate mineral associated with lacustrine glauberite layers. This mineral is present in two boreholes recently drilled in the Emet borate district (Miocene; western Anatolia, Turkey). The evaporitic succession in these boreholes is mainly formed of a glauberite-probertite alternation. We suggest the name "emetite", after the town of Emet, for the new sulphate mineral, although the fine crystal size hinders the appropriate chemical and crystallographic characterization required to propose it as a new mineral to the International Mineralogical Association.
Abstract:
Quantitative approaches in ceramology are gaining ground in excavation reports, archaeological publications and thematic studies. Hence, a wide variety of methods are being used, depending on the researchers' theoretical premises, the type of material examined, the context of discovery and the questions being addressed. The round table that took place in Athens in November 2008 was intended to offer the participants the opportunity to present a selection of case studies on the basis of which methodological approaches were discussed. The aim was to define a set of guidelines for quantification that would prove to be of use to all researchers. Contents: 1) Introduction (Samuel Verdan); 2) Isthmia and beyond. How can quantification help the analysis of EIA sanctuary deposits? (Catherine Morgan); 3) Approaching aspects of cult practice and ethnicity in Early Iron Age Ephesos using quantitative analysis of a Protogeometric deposit from the Artemision (Michael Kerschner); 4) Development of a ceramic cultic assemblage: Analyzing pottery from Late Helladic IIIC through Late Geometric Kalapodi (Ivonne Kaiser, Laura-Concetta Rizzotto, Sara Strack); 5) 'Erfahrungsbericht' on the application of different quantitative methods at Kalapodi (Sara Strack); 6) The Early Iron Age sanctuary at Olympia: counting sherds from the Pelopion excavations (1987-1996) (Birgitta Eder); 7) L'aire du pilier des Rhodiens à Delphes: Essai de quantification du mobilier (Jean-Marc Luce); 8) A new approach in ceramic statistical analyses: Pit 13 on Xeropolis at Lefkandi (David A. Mitchell, Irene S. Lemos); 9) Households and workshops at Early Iron Age Oropos: A quantitative approach to the fine, wheel-made pottery (Vicky Vlachou); 10) Counting sherds at Sindos: Pottery consumption and construction of identities in the Iron Age (Stefanos Gimatzidis); 11) Analyse quantitative du mobilier céramique des fouilles de Xombourgo à Ténos et le cas des supports de caisson (Jean-Sébastien Gros); 12) Defining a typology of pottery from Gortyn: The material from a pottery workshop pit (Emanuela Santaniello); 13) Quantification of ceramics from Early Iron Age tombs (Antonis Kotsonas); 14) Quantitative analysis of the pottery from the Early Iron Age necropolis of Tsikalario on Naxos (Xenia Charalambidou); 15) Finding the Early Iron Age in field survey: Two case studies from Boeotia and Magnesia (Vladimir Stissi); 16) Pottery quantification: Some guidelines (Samuel Verdan).
Abstract:
Introduction: New evidence from randomized controlled studies and etiology-of-fever studies, the availability of reliable rapid diagnostic tests (RDT) for malaria, and novel technologies call for revision of the IMCI strategy. We developed a new algorithm based on (i) a systematic review of published studies assessing the safety and appropriateness of RDT and antibiotic prescription, (ii) results from a clinical and microbiological investigation of febrile children aged <5 years, and (iii) international IMCI expert opinions. The aim of this study was to assess the safety of the new algorithm among patients in urban and rural areas of Tanzania. Materials and Methods: The design was a controlled noninferiority study. Enrolled children aged 2-59 months with any illness were managed either by a study clinician using the new Almanach algorithm (two intervention health facilities) or by clinicians using standard practice, including RDT (two control health facilities). At day 7 and day 14, all patients were reassessed. Patients who were ill in between or not cured at day 14 were followed until recovery or death. The primary outcome was the rate of complications; the secondary outcome was the rate of antibiotic prescriptions. Results: 1062 children were recruited. The main diagnoses were URTI (26%), pneumonia (19%) and gastroenteritis (9.4%). 98% (531/541) were cured at day 14 in the Almanach arm and 99.6% (519/521) in controls. The rate of secondary hospitalization was 0.2% in each arm. One death occurred in controls. None of the complications was due to withdrawal of antibiotics or antimalarials at day 0. The rate of antibiotic use was 19% in the Almanach arm and 84% in controls. Conclusion: Evidence suggests that the new algorithm, primarily aimed at the rational use of drugs, is as safe as standard practice and leads to a drastic reduction in antibiotic use. The Almanach is currently being tested for clinician adherence to the proposed procedures when used on paper or on a mobile phone.
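The headline cure rates can be reproduced directly from the counts reported in the abstract. The helper below is a generic sketch of that arithmetic, not the trial's actual statistical analysis, and the noninferiority-margin comment is illustrative:

```python
def rate(events: int, total: int) -> float:
    """Proportion of events, as a percentage."""
    return 100.0 * events / total

cured_almanach = rate(531, 541)   # ~98.2% cured at day 14, Almanach arm
cured_control  = rate(519, 521)   # ~99.6% cured at day 14, control arm

# Noninferiority reasoning compares the difference in cure rates
# against a pre-specified margin (the margin itself is not given here).
difference = cured_control - cured_almanach   # ~1.5 percentage points

print(round(cured_almanach, 1), round(cured_control, 1), round(difference, 1))
```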
Abstract:
Obesity is heritable and predisposes to many diseases. To understand the genetic basis of obesity better, here we conduct a genome-wide association study and Metabochip meta-analysis of body mass index (BMI), a measure commonly used to define obesity and assess adiposity, in up to 339,224 individuals. This analysis identifies 97 BMI-associated loci (P < 5 × 10(-8)), 56 of which are novel. Five loci demonstrate clear evidence of several independent association signals, and many loci have significant effects on other metabolic phenotypes. The 97 loci account for ∼2.7% of BMI variation, and genome-wide estimates suggest that common variation accounts for >20% of BMI variation. Pathway analyses provide strong support for a role of the central nervous system in obesity susceptibility and implicate new genes and pathways, including those related to synaptic function, glutamate signalling, insulin secretion/action, energy metabolism, lipid biology and adipogenesis.
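The locus-discovery step described above applies the conventional genome-wide significance filter (P < 5 × 10⁻⁸) to each tested variant. A minimal sketch of that filter; the variant names and p-values below are invented for illustration:

```python
GENOME_WIDE_ALPHA = 5e-8  # conventional genome-wide significance threshold

def significant_loci(pvalues: dict) -> list:
    """Return (sorted) names of variants whose p-value passes the threshold."""
    return sorted(name for name, p in pvalues.items() if p < GENOME_WIDE_ALPHA)

# Hypothetical example variants (names and p-values are invented):
example = {"rs0001": 3e-9, "rs0002": 1e-6, "rs0003": 4.9e-8}
print(significant_loci(example))  # rs0001 and rs0003 pass; rs0002 does not
```

Note the strict inequality: a variant at exactly 5e-8 would not be called significant under this convention.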
Abstract:
This paper discusses the qualitative comparative evaluation performed on the results of two machine translation systems with different approaches to the processing of multi-word units. It proposes a solution for overcoming the difficulties multi-word units present to machine translation by adopting a methodology that combines the lexicon-grammar approach with the OpenLogos ontology and semantico-syntactic rules. The paper also discusses the importance of a qualitative evaluation metric for correctly evaluating the performance of machine translation engines with regard to multi-word units.
Abstract:
BACKGROUND: Selective publication of studies, commonly called publication bias, is widely recognized. Over the years a new nomenclature has developed for other types of bias related to non-publication or to distortion in the dissemination of research findings. However, several of these different biases are often still summarized by the term 'publication bias'. METHODS/DESIGN: As part of the OPEN Project (To Overcome failure to Publish nEgative fiNdings) we will conduct a systematic review with the following objectives:
- To systematically review highly cited articles that focus on non-publication of studies and to present the various definitions of biases related to the dissemination of research findings contained in the articles identified.
- To develop and discuss, with an international group of experts in the context of the OPEN Project, a new framework on the nomenclature of the various aspects of distortion in the dissemination process that leads to public availability of research findings.
We will systematically search Web of Knowledge for highly cited articles that provide a definition of biases related to the dissemination of research findings. A specifically designed data extraction form will be developed and pilot-tested. Working in teams of two, we will independently extract relevant information from each eligible article. For the development of the new framework we will construct an initial table listing different levels of, and different hazards en route to, making research findings public. An international group of experts will iteratively review the table and reflect on its content until no new insights emerge and consensus has been reached. DISCUSSION: Results are expected to be publicly available in mid-2013. This systematic review, together with the results of other systematic reviews of the OPEN project, will serve as a basis for the development of future policies and guidelines regarding the assessment and prevention of publication bias.