53 results for Optimisation of methods
Abstract:
Over the past 30 years, benzimidazoles have increasingly been used to treat cystic echinococcosis (CE). The efficacy of benzimidazoles, however, remains unclear. We systematically searched MEDLINE, EMBASE, SIGLE, and CCTR to identify studies on benzimidazole treatment outcome. A large heterogeneity of methods in 23 reports precluded a meta-analysis of published results. Specialist centres were contacted to provide individual patient data. We conducted survival analyses for cyst response defined as inactive (CE4 or CE5 by the ultrasound-based World Health Organisation [WHO] classification scheme) or as disappeared. We collected data from 711 treated patients with 1,308 cysts from six centres (five countries). Analysis was restricted to 1,159 liver and peritoneal cysts. Overall, 1-2 y after initiation of benzimidazole treatment, 50%-75% of active CE1 cysts were classified as inactive/disappeared, compared to 30%-55% of CE2 and CE3 cysts. Further, in analyzing the rate of inactivation/disappearance with regard to cyst size, 50%-60% of cysts <6 cm responded to treatment after 1-2 y, compared to 25%-50% of cysts >6 cm. However, 25% of cysts reverted to active status within 1.5 to 2 y after having initially responded, and multiple relapses were observed; after the second and third treatments, 60% of cysts relapsed within 2 y. We estimated that 2 y after treatment initiation 40% of cysts are still active or become active again. The overall efficacy of benzimidazoles has been overstated in the past. There is an urgent need for a pragmatic randomised controlled trial that compares standardized benzimidazole therapy for responsive cyst stages with other treatment modalities.
Abstract:
Cell death is essential for a plethora of physiological processes, and its deregulation characterizes numerous human diseases. Thus, the in-depth investigation of cell death and its mechanisms constitutes a formidable challenge for fundamental and applied biomedical research, and has tremendous implications for the development of novel therapeutic strategies. It is, therefore, of utmost importance to standardize the experimental procedures that identify dying and dead cells in cell cultures and/or in tissues, from model organisms and/or humans, in healthy and/or pathological scenarios. Thus far, dozens of methods have been proposed to quantify cell death-related parameters. However, no guidelines exist regarding their use and interpretation, and nobody has thoroughly annotated the experimental settings for which each of these techniques is most appropriate. Here, we provide a nonexhaustive comparison of methods to detect cell death with apoptotic or nonapoptotic morphologies, their advantages and pitfalls. These guidelines are intended for investigators who study cell death, as well as for reviewers who need to constructively critique scientific reports that deal with cellular demise. Given the difficulties in determining the exact number of cells that have passed the point-of-no-return of the signaling cascades leading to cell death, we emphasize the importance of performing multiple, methodologically unrelated assays to quantify dying and dead cells.
Abstract:
Through the concerted evaluations of thousands of commercial substances for the qualities of persistence, bioaccumulation, and toxicity as a result of the United Nations Environment Program's Stockholm Convention, it has become apparent that fewer empirical data are available on bioaccumulation than other endpoints and that bioaccumulation models were not designed to accommodate all chemical classes. Due to the number of chemicals that may require further assessment, in vivo testing is cost-prohibitive and discouraged due to the large number of animals needed. Although in vitro systems are less developed and characterized for fish, multiple high-throughput in vitro assays have been used to explore the dietary uptake and elimination of pharmaceuticals and other xenobiotics by mammals. While similar processes determine bioaccumulation in mammalian species, a review of methods to measure chemical bioavailability in fish screening systems, such as chemical biotransformation or metabolism in tissue slices, perfused tissues, fish embryos, primary and immortalized cell lines, and subcellular fractions, suggests that quantitative and qualitative differences between fish and mammals exist. Using in vitro data in assessments for whole organisms or populations requires certain considerations and assumptions to scale data from a test tube to a fish, and across fish species. Also, different models may incorporate the predominant site of metabolism, such as the liver, and significant presystemic metabolism by the gill or gastrointestinal system to help accurately convert in vitro data into representative whole-animal metabolism and subsequent bioaccumulation potential. The development of animal alternative tests for fish bioaccumulation assessment is framed in the context of in vitro data requirements for regulatory assessments in Europe and Canada.
Abstract:
OBJECTIVES Abstracts of systematic reviews are of critical importance, as consumers of research often do not access the full text. This study aimed to assess the reporting quality of systematic review (SR) abstracts in leading oral implantology journals. METHODS Six specialty journals were screened for SRs between 2008 and 2012. A 16-item checklist, based on the PRISMA statement, was used to examine the completeness of abstract reporting. RESULTS Ninety-three SR abstracts were included in this study. The majority were published in Clinical Oral Implants Research (43%). The mean overall reporting quality score was 72.5% (95% CI: 70.8-74.2). Most abstracts were structured (97.9%), adequately reporting objectives (97.9%) and conclusions (93.6%). Conversely, inadequate reporting was observed for the study methods, background (79.6%), appraisal (65.6%), and data synthesis (65.6%). Registration of reviews was not reported in any of the included abstracts. Multivariate analysis revealed no difference in reporting quality with respect to continent, number of authors, or meta-analysis conduct. CONCLUSIONS The results of this study suggest that the reporting quality of systematic review abstracts in implantology journals requires further improvement. CLINICAL SIGNIFICANCE Better reporting of SR abstracts is particularly important in ensuring the reliability of research findings, ultimately promoting the practice of evidence-based dentistry. Optimal reporting of SR abstracts should be encouraged, preferably by endorsing the PRISMA for abstracts guidelines.
Abstract:
BACKGROUND Partner notification is essential to the comprehensive case management of sexually transmitted infections. Systematic reviews and mathematical modelling can be used to synthesise information about the effects of new interventions to enhance the outcomes of partner notification. OBJECTIVE To study the effectiveness and cost-effectiveness of traditional and new partner notification technologies for curable sexually transmitted infections (STIs). DESIGN Secondary data analysis of clinical audit data; systematic reviews of randomised controlled trials (MEDLINE, EMBASE and Cochrane Central Register of Controlled Trials) published from 1 January 1966 to 31 August 2012 and of studies of health-related quality of life (HRQL) [MEDLINE, EMBASE, ISI Web of Knowledge, NHS Economic Evaluation Database (NHS EED), Database of Abstracts of Reviews of Effects (DARE) and Health Technology Assessment (HTA)] published from 1 January 1980 to 31 December 2011; static models of clinical effectiveness and cost-effectiveness; and dynamic modelling studies to improve parameter estimation and examine effectiveness. SETTING General population and genitourinary medicine clinic attenders. PARTICIPANTS Heterosexual women and men. INTERVENTIONS Traditional partner notification by patient or provider referral, and new partner notification by expedited partner therapy (EPT) or its UK equivalent, accelerated partner therapy (APT). MAIN OUTCOME MEASURES Population prevalence; index case reinfection; and partners treated per index case. RESULTS Expedited partner therapy reduced reinfection in index cases with curable STIs more than simple patient referral [risk ratio (RR) 0.71; 95% confidence interval (CI) 0.56 to 0.89]. There are no randomised trials of APT. The median number of partners treated for chlamydia per index case in UK clinics was 0.60. The number of partners needed to treat to interrupt transmission of chlamydia was lower for casual than for regular partners.
In dynamic model simulations, > 10% of partners are chlamydia positive with look-back periods of up to 18 months. In the presence of a chlamydia screening programme that reduces population prevalence, treatment of current partners achieves most of the additional reduction in prevalence attributable to partner notification. Dynamic model simulations show that cotesting and treatment for chlamydia and gonorrhoea reduce the prevalence of both STIs. APT has a limited additional effect on prevalence but reduces the rate of index case reinfection. Published quality-adjusted life-year (QALY) weights were of insufficient quality to be used in a cost-effectiveness study of partner notification in this project. Using an intermediate outcome of cost per infection diagnosed, doubling the efficacy of partner notification from 0.4 to 0.8 partners treated per index case was more cost-effective than increasing chlamydia screening coverage. CONCLUSIONS There is evidence to support the improved clinical effectiveness of EPT in reducing index case reinfection. In a general heterosexual population, partner notification identifies new infected cases but the impact on chlamydia prevalence is limited. Partner notification of casual partners might have a greater impact than notification of regular partners in genitourinary clinic populations. Recommendations for future research are (1) to conduct randomised controlled trials using biological outcomes of the effectiveness of APT and of methods to increase testing for human immunodeficiency virus (HIV) and STIs after APT; (2) collection of HRQL data should be a priority to determine QALYs associated with the sequelae of curable STIs; and (3) standardised parameter sets for curable STIs should be developed for mathematical models of STI transmission that are used for policy-making. FUNDING The National Institute for Health Research Health Technology Assessment programme.
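The headline EPT effect above is a risk ratio with a 95% confidence interval. As a minimal sketch of how such a figure is derived from two-arm trial counts, the interval can be computed on the log scale; the counts below are hypothetical, not data from the review:

```python
import math

def risk_ratio_ci(events_a, n_a, events_b, n_b, z=1.96):
    """Risk ratio of arm A relative to arm B, with a Wald-type
    confidence interval computed on the log scale."""
    risk_a = events_a / n_a
    risk_b = events_b / n_b
    rr = risk_a / risk_b
    # Standard error of log(RR) via the delta method
    se = math.sqrt((1 - risk_a) / events_a + (1 - risk_b) / events_b)
    lower = math.exp(math.log(rr) - z * se)
    upper = math.exp(math.log(rr) + z * se)
    return rr, lower, upper

# Hypothetical counts: 30/300 reinfected with EPT vs 42/300 with patient referral
rr, lo, hi = risk_ratio_ci(30, 300, 42, 300)
```

A meta-analysis pools such log risk ratios across trials, weighting each by the inverse of its variance.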
Abstract:
The application of image-guided systems with or without support by surgical robots relies on the accuracy of the navigation process, including patient-to-image registration. The surgeon must carry out the procedure based on the information provided by the navigation system, usually without being able to verify its correctness beyond visual inspection. Misleading surrogate parameters such as the fiducial registration error are often used to describe the success of the registration process, while a lack of methods describing the effects of navigation errors, such as those caused by tracking or calibration, may prevent the application of image guidance in certain accuracy-critical interventions. During minimally invasive mastoidectomy for cochlear implantation, a direct tunnel is drilled from the outside of the mastoid to a target on the cochlea based on registration using landmarks solely on the surface of the skull. Using this methodology, it is impossible to detect whether the drill is advancing in the correct direction and whether injury to the facial nerve will be avoided. To overcome this problem, a tool localization method based on drilling process information is proposed. The algorithm estimates the pose of a robot-guided surgical tool during a drilling task based on the correlation of the observed axial drilling force and the heterogeneous bone density in the mastoid extracted from 3-D image data. We present here one possible implementation of this method tested on ten tunnels drilled into three human cadaver specimens where an average tool localization accuracy of 0.29 mm was observed.
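The localisation principle described above, matching the observed axial force profile against bone-density profiles extracted from the CT volume along candidate tool poses, can be sketched as follows. This is an illustrative simplification using plain Pearson correlation on 1-D profiles, not the authors' published algorithm:

```python
def pearson(x, y):
    """Pearson correlation coefficient of two equal-length profiles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

def most_likely_pose(force_profile, density_profiles):
    """Pick the candidate pose whose CT density profile best matches
    the axial drilling force observed along the tool path."""
    scores = [pearson(force_profile, d) for d in density_profiles]
    return max(range(len(scores)), key=scores.__getitem__)
```

In practice the candidate poses would be sampled around the nominal registration, and the matching would account for drill feed rate and force noise.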
Abstract:
The accuracy of Global Positioning System (GPS) time series is degraded by the presence of offsets. To assess the effectiveness of methods that detect and remove these offsets, we designed and managed the Detection of Offsets in GPS Experiment. We simulated time series that mimicked realistic GPS data consisting of a velocity component, offsets, and white and flicker noises (1/f spectrum noise) combined in an additive model. The data set was made available to the GPS analysis community without revealing the offsets, and several groups conducted blind tests with a range of detection approaches. The results show that, at present, manual methods (where offsets are hand-picked) almost always give better results than automated or semi-automated methods (two automated methods give velocity biases quite similar to the best manual solutions). For instance, the 5%-95% percentile range in velocity bias for automated approaches is 4.2 mm/yr (most commonly ±0.4 mm/yr from the truth), whereas it is 1.8 mm/yr for the manual solutions (most commonly 0.2 mm/yr from the truth). The magnitude of offsets detectable by manual solutions is smaller than for automated solutions, with the smallest detectable offset for the best manual and automatic solutions equal to 5 mm and 8 mm, respectively. Assuming the simulated time series noise levels are representative of real GPS time series, geophysical interpretation of individual site velocities lower than 0.2-0.4 mm/yr is therefore certainly not robust, and a limit nearer 1 mm/yr would be a more conservative choice. Further work to improve offset detection in GPS coordinate time series is required before we can routinely interpret sub-mm/yr velocities for single GPS stations.
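The additive simulation model above (velocity plus step offsets plus noise) and the velocity bias an undetected offset induces can be sketched as follows. For brevity only white noise is drawn here, whereas the experiment also added flicker noise; all names and values are illustrative:

```python
import random

def simulate_gps(n_days, vel_mm_yr, offsets, sigma_mm=1.0, seed=0):
    """Synthetic daily position series: velocity * t + step offsets + white noise.
    `offsets` is a list of (day_index, magnitude_mm) pairs."""
    rng = random.Random(seed)
    series = []
    for day in range(n_days):
        t_yr = day / 365.25
        steps = sum(mag for epoch, mag in offsets if day >= epoch)
        series.append(vel_mm_yr * t_yr + steps + rng.gauss(0.0, sigma_mm))
    return series

def ols_velocity(series, dt_yr=1 / 365.25):
    """Least-squares slope (mm/yr) of a daily series, ignoring any offsets."""
    n = len(series)
    xs = [i * dt_yr for i in range(n)]
    mx, my = sum(xs) / n, sum(series) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, series))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# An unmodelled +8 mm step at mid-series inflates the fitted velocity
clean = simulate_gps(730, 3.0, [], sigma_mm=0.0)
stepped = simulate_gps(730, 3.0, [(365, 8.0)], sigma_mm=0.0)
```

Fitting `ols_velocity` to `stepped` returns a slope well above the true 3.0 mm/yr; this is precisely the velocity bias that offset-detection methods aim to remove.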
Abstract:
Palaeoflood hydrology is an expanding field, as the damage potential of floods and flood-related processes increases with population density and the value of infrastructure. Assessing the risk of these hazards in mountainous terrain requires knowledge about the frequency and severity of such events in the past. A wide range of methods is employed, using diverse biological, geomorphic or geological evidence to track past flood events. Impacts of floods are studied and dated on alluvial fans and cones using, for example, the growth disturbance of trees (Stoffel and Bollschweiler 2008; Schneuwly-Bollschweiler and Stoffel 2012: this volume) or stratigraphic layers deposited by debris flows, allowing reconstruction of past flood frequencies (Bardou et al. 2003). Further downstream, the classical approach of palaeoflood hydrology (Kochel and Baker 1982) utilizes geomorphic indicators such as overbank sediments, silt lines and erosion features of floods along a river (e.g. Benito and Thorndycraft 2005). Fine-grained sediment settles out of the river suspension in eddies or backwater areas, where the flow velocity of the river is reduced. Records of these deposits at different elevations across a river’s profile can be used to assess the discharge of past floods. This approach of palaeoflood hydrology has been successfully applied in several river catchments (e.g. Ely et al. 1993; Macklin and Lewin 2003; O’Connor et al. 1994; Sheffer et al. 2003; Thorndycraft et al. 2005; Thorndycraft and Benito 2006). All these reconstruction methods have their own advantages and disadvantages, but such studies often have limited time coverage, and the records are potentially incomplete due to the lateral limits of depositional areas and the erosional power of fluvial processes that remove previously deposited flood evidence. Here, we present a method that follows the sediment particles transported by a flood event to their final sink: the lacustrine basin.
Abstract:
Attention deficit/hyperactivity disorder (ADHD) is an increasingly recognized comorbid condition in subjects with substance use disorders (SUDs). This paper describes the methods and study population of the International ADHD in Substance Use Disorders Prevalence (IASP) study. Objectives of the IASP are to determine the prevalence of ADHD in adult treatment-seeking patients with SUD in different countries and SUD populations, determine the reliability and validity of the Adult ADHD Self-report Scale V 1.1 (ASRS) as an ADHD screening instrument in SUD populations, investigate the comorbidity profile of SUD patients with and without ADHD, compare risk factors and protective factors in SUD patients with and without a comorbid diagnosis of ADHD, and increase our knowledge about the relationship between ADHD and the onset and course of SUD. In this cross-sectional, multi-centre, two-stage study, subjects were screened for ADHD with the ASRS, diagnosed with the Conners' Adult ADHD Diagnostic Interview for DSM-IV (CAADID), and evaluated for SUD, major depression, bipolar disorder, antisocial personality disorder and borderline personality disorder. Three thousand five hundred and fifty-eight subjects from 10 countries were included. Of these, 40.9% screened positive for ADHD. This is the largest international study on this population evaluating ADHD and comorbid disorders.
Abstract:
The Wetland and Wetland CH4 Intercomparison of Models Project (WETCHIMP) was created to evaluate our present ability to simulate large-scale wetland characteristics and corresponding methane (CH4) emissions. A multi-model comparison is essential to evaluate the key uncertainties in the mechanisms and parameters leading to methane emissions. Ten modelling groups joined WETCHIMP to run eight global and two regional models with a common experimental protocol using the same climate and atmospheric carbon dioxide (CO2) forcing datasets. We reported the main conclusions from the intercomparison effort in a companion paper (Melton et al., 2013). Here we provide technical details for the six experiments, which included an equilibrium, a transient, and an optimized run plus three sensitivity experiments (temperature, precipitation, and atmospheric CO2 concentration). The diversity of approaches used by the models is summarized through a series of conceptual figures, and is used to evaluate the wide range of wetland extent and CH4 fluxes predicted by the models in the equilibrium run. We discuss relationships among the various approaches and patterns in consistencies of these model predictions. Within this group of models, there are three broad classes of methods used to estimate wetland extent: prescribed based on wetland distribution maps, prognostic relationships between hydrological states based on satellite observations, and explicit hydrological mass balances. A larger variety of approaches was used to estimate the net CH4 fluxes from wetland systems. Even though modelling of wetland extent and CH4 emissions has progressed significantly over recent decades, large uncertainties still exist when estimating CH4 emissions: there is little consensus on model structure or complexity due to knowledge gaps, different aims of the models, and the range of temporal and spatial resolutions of the models.
Abstract:
The combination of scaled analogue experiments, material mechanics, X-ray computed tomography (XRCT) and digital volume correlation (DVC) techniques is a powerful new tool not only to examine the three-dimensional structure and kinematic evolution of complex deformation structures in scaled analogue experiments, but also to fully quantify their spatial strain distribution and complete strain history. Digital image correlation (DIC) is an important advance in quantitative physical modelling and helps to understand non-linear deformation processes. Optical, non-intrusive DIC techniques enable the quantification of localised and distributed deformation in analogue experiments based either on images taken through transparent sidewalls (2D DIC) or on surface views (3D DIC). XRCT analysis permits the non-destructive visualisation of the internal structure and kinematic evolution of scaled analogue experiments simulating the tectonic evolution of complex geological structures. The combination of XRCT sectional image data with 2D DIC only allows quantification of 2D displacement and strain components in the section direction, which completely omits the potential of CT experiments for full 3D strain analysis of complex, non-cylindrical deformation structures. In this study, we apply DVC techniques to XRCT scan data of “solid” analogue experiments to fully quantify the internal displacement and strain in three dimensions over time. Our first results indicate that the application of DVC techniques to XRCT volume data can successfully be used to quantify the 3D spatial and temporal strain patterns inside analogue experiments. We demonstrate the potential of combining DVC techniques and XRCT volume imaging for 3D strain analysis of a contractional experiment simulating the development of a non-cylindrical pop-up structure.
Furthermore, we discuss various options for optimisation of granular materials, pattern generation, and data acquisition for increased resolution and accuracy of the strain results. Three-dimensional strain analysis of analogue models is of particular interest for geological and seismic interpretations of complex, non-cylindrical geological structures. The volume strain data enable the analysis of the large-scale and small-scale strain history of geological structures.
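The core DVC operation, recovering the displacement of material between a reference and a deformed XRCT volume by maximising image correlation, can be sketched for integer voxel shifts as follows. This brute-force search is a minimal illustration, not the authors' implementation, which would also resolve sub-voxel displacements:

```python
import numpy as np

def dvc_integer_shift(ref, deformed, max_shift=2):
    """Brute-force digital volume correlation: find the integer 3-D shift
    that best aligns the deformed volume with the reference by maximising
    zero-normalised cross-correlation."""
    best_shift, best_score = (0, 0, 0), -np.inf
    r = ref - ref.mean()
    r_norm = np.linalg.norm(r)
    for dz in range(-max_shift, max_shift + 1):
        for dy in range(-max_shift, max_shift + 1):
            for dx in range(-max_shift, max_shift + 1):
                # Undo the candidate shift and score the alignment
                cand = np.roll(deformed, (-dz, -dy, -dx), axis=(0, 1, 2))
                c = cand - cand.mean()
                score = (r * c).sum() / (r_norm * np.linalg.norm(c) + 1e-12)
                if score > best_score:
                    best_score, best_shift = score, (dz, dy, dx)
    return best_shift
```

Repeating this per subvolume yields a displacement field, from which strain tensors can be derived by spatial differentiation.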
Abstract:
Self-administered online surveys provide a higher level of privacy protection to respondents than surveys administered by an interviewer. Yet, studies show that asking sensitive questions is problematic even in self-administered mode. Because respondents might not be willing to reveal the truth and may provide answers that are subject to social desirability bias, the validity of prevalence estimates of sensitive behaviors gained via online surveys can be challenged. A well-known method to combat these problems is the Randomized Response Technique (RRT). However, convincing evidence that the RRT provides more valid estimates than direct questioning in online mode is still lacking. Moreover, an alternative approach called the Crosswise Model (CM) has recently been suggested to overcome some of the deficiencies of the RRT. In the context of an online survey on plagiarism and cheating on exams among students of two Swiss universities (N = 6,494), we tested different implementations of the RRT and the CM and compared them to direct questioning using a randomized experimental design. Results reveal a poor performance of the RRT, which failed to elicit higher prevalence estimates than direct questioning. Using the CM, however, significantly higher prevalence estimates were obtained, making it a promising alternative to the conventional RRT.
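Both techniques recover the sensitive prevalence from an observed answer rate via a simple method-of-moments correction. A minimal sketch, assuming a forced-response RRT variant and a CM paired with an innocuous item of known prevalence; all parameter values are illustrative, not those used in the survey:

```python
def rrt_estimate(yes_rate, p_truth, p_forced_yes):
    """Forced-response RRT: with probability p_truth the respondent answers
    truthfully; with probability p_forced_yes a 'yes' is forced (and a 'no'
    with the remaining probability). Inverts E[yes] = p_truth*pi + p_forced_yes."""
    return (yes_rate - p_forced_yes) / p_truth

def cm_estimate(match_rate, p_innocuous):
    """Crosswise model: respondents report only whether their answers to the
    sensitive item and an innocuous item of known prevalence p coincide.
    Inverts E[match] = pi*p + (1 - pi)*(1 - p), valid for p != 0.5."""
    return (match_rate + p_innocuous - 1) / (2 * p_innocuous - 1)

# Forward-simulation check with a true prevalence of 0.30
pi = 0.30
yes_rate = 0.75 * pi + 0.15              # p_truth = 0.75, p_forced_yes = 0.15
match_rate = pi * 0.20 + (1 - pi) * 0.80  # p_innocuous = 0.20
```

Because no individual answer reveals the sensitive status, both designs protect privacy at the cost of inflated sampling variance relative to direct questioning.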
Abstract:
Over the last two decades, imaging of the aorta has undergone a clinically relevant change. As part of this change, non-invasive imaging techniques have replaced invasive intra-arterial digital subtraction angiography as the former imaging gold standard for aortic diseases. Computed tomography (CT) and magnetic resonance imaging (MRI) constitute the backbone of pre- and postoperative aortic imaging because they allow for imaging of the entire aorta and its branches. The first part of this review article describes the imaging principles of CT and MRI with regard to aortic disease and shows how both technologies can be applied in everyday clinical practice. Recent CT scanner generations deliver excellent image quality with high spatial and temporal resolution. Technical developments have made it possible to scan the entire aorta within a few seconds. Therefore, CT angiography (CTA) is the imaging technology of choice for evaluating acute aortic syndromes, for the diagnosis of most aortic pathologies, for preoperative planning and for postoperative follow-up after endovascular aortic repair. However, radiation dose and the risk of contrast-induced nephropathy are major downsides of CTA. Optimisation of scan protocols and contrast media administration can help to reduce the required radiation dose and contrast media volume. MR angiography (MRA) is an excellent alternative to CTA for both the diagnosis of aortic pathologies and postoperative follow-up. The lack of radiation is particularly beneficial for younger patients. A potential side effect of gadolinium contrast agents is nephrogenic systemic fibrosis (NSF). In patients with a high risk of NSF, unenhanced MRA can be performed with both ECG- and breath-gating techniques. Additionally, MRI provides the possibility to visualise and measure both dynamic and flow information.
Abstract:
Development of methods for rapid screening and stratification of subjects after exposure is an integral part of countermeasures against radiation. The potential demographic and exposure history-related heterogeneity of exposed populations warrants robust biomarkers that withstand and reflect such differences. In this study, the effect of aging and repeated exposure on the metabolic response to sublethal irradiation was examined in mice using UPLC-ESI-QTOF mass spectrometry. Aging attenuated postexposure elevation in excretions of DNA damage biomarkers as well as N(1)-acetylspermidine. Although N(1)-acetylspermidine and 2'-deoxyuridine elevation was highly correlated in all age groups, xanthine and N(1)-acetylspermidine elevation was poorly correlated in older mice. These results may reflect the established decline in DNA damage-repair efficiency associated with aging and indicate a novel role for polyamine metabolism in the process. Although repeated irradiation at long intervals did not affect the elevation of N(1)-acetylspermidine, 2'-deoxyuridine, and xanthine, it did significantly attenuate the elevation of 2'-deoxycytidine and thymidine compared to a single exposure. However, these biomarkers were found to identify exposed subjects with accuracy ranging from 82% (xanthosine) to 98% (2'-deoxyuridine), irrespective of their age and exposure history. This indicates that metabolic biomarkers can act as robust noninvasive signatures of sublethal radiation exposure.
Abstract:
INTRODUCTION Empirical evidence has indicated that only a subsample of studies conducted reach full-text publication, a phenomenon that has become known as publication bias. One form of publication bias is the selectively delayed full publication of conference abstracts. The objective of this article was to examine the publication status of oral abstracts and poster-presentation abstracts included in the scientific program of the 82nd and 83rd European Orthodontic Society (EOS) congresses, held in 2006 and 2007, and to identify factors associated with full-length publication. METHODS A systematic search of PubMed and Google Scholar databases was performed in April 2013 using author names and keywords from the abstract title to locate abstract and full-article publications. Information regarding mode of presentation, type of affiliation, geographical origin, statistical results, and publication details was collected and analyzed using univariable and multivariable logistic regression. RESULTS Approximately 51 per cent of the EOS 2006 and 55 per cent of the EOS 2007 abstracts appeared in print more than 5 years post congress. A mean period of 1.32 years elapsed between conference and publication date. Mode of presentation (oral or poster), use of statistical analysis, and research subject area were significant predictors of publication success. LIMITATIONS Inherent discrepancies of abstract reporting, mainly related to presentation of preliminary results and incomplete description of methods, may be considered in analogous studies. CONCLUSIONS On average, 52.2 per cent of the abstracts presented at the two EOS conferences reached full publication. Abstracts presented orally and abstracts including statistical analysis were more likely to be published.