819 results for Observational Methodology
Abstract:
Probabilistic inversion methods based on Markov chain Monte Carlo (MCMC) simulation are well suited to quantify parameter and model uncertainty of nonlinear inverse problems. Yet, application of such methods to CPU-intensive forward models can be a daunting task, particularly if the parameter space is high dimensional. Here, we present a 2-D pixel-based MCMC inversion of plane-wave electromagnetic (EM) data. Using synthetic data, we investigate how model parameter uncertainty depends on model structure constraints, using different norms for both the likelihood function and the model constraints, and study the added benefits of joint inversion of EM and electrical resistivity tomography (ERT) data. Our results demonstrate that model structure constraints are necessary to stabilize the MCMC inversion results of a highly discretized model. These constraints decrease model parameter uncertainty and facilitate model interpretation. A drawback is that these constraints may lead to posterior distributions that do not fully include the true underlying model, because some of its features exhibit a low sensitivity to the EM data and hence are difficult to resolve. This problem can be partly mitigated if the plane-wave EM data are augmented with ERT observations. The hierarchical Bayesian inverse formulation introduced and used herein successfully recovers the probabilistic properties of the measurement data errors and a model regularization weight. Application of the proposed inversion methodology to field data from an aquifer demonstrates that the posterior mean model realization is very similar to that derived from a deterministic inversion with similar model constraints.
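The following is a minimal Python sketch of the kind of sampler this abstract describes: single-pixel Metropolis updates under an L1 model-structure constraint, with the regularization weight itself sampled hierarchically. It is an illustration under stated assumptions, not the authors' implementation; forward_model is a hypothetical placeholder for the CPU-intensive plane-wave EM (or joint EM/ERT) solver, and all numerical settings are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward_model(m):
    # Hypothetical placeholder: a real implementation would solve the
    # plane-wave EM (and optionally ERT) forward problem for the 2-D
    # pixel model m of log-resistivities.
    return m.mean(axis=0)

def log_posterior(m, lam, d_obs, sigma):
    resid = (d_obs - forward_model(m)) / sigma
    log_like = -0.5 * np.sum(resid ** 2)  # L2 likelihood norm; an L1 norm is the natural variant
    # L1 model-structure constraint on vertical and horizontal pixel differences.
    structure = np.abs(np.diff(m, axis=0)).sum() + np.abs(np.diff(m, axis=1)).sum()
    return log_like - lam * structure

def mcmc(d_obs, shape, sigma, n_iter=10_000, step=0.05):
    m = np.zeros(shape)   # pixel grid of log-resistivities
    lam = 1.0             # regularization weight, treated as a hyperparameter
    lp = log_posterior(m, lam, d_obs, sigma)
    samples = []
    for _ in range(n_iter):
        # Single-pixel random-walk Metropolis update.
        prop = m.copy()
        i, j = rng.integers(shape[0]), rng.integers(shape[1])
        prop[i, j] += step * rng.standard_normal()
        lp_prop = log_posterior(prop, lam, d_obs, sigma)
        if np.log(rng.random()) < lp_prop - lp:
            m, lp = prop, lp_prop
        # Hierarchical step: random-walk update of lam on the log scale.
        # NB: a full treatment must include the lam-dependent normalization
        # of the structure prior and the log-scale proposal Jacobian; both
        # are omitted here for brevity.
        lam_prop = lam * np.exp(0.1 * rng.standard_normal())
        lp_prop = log_posterior(m, lam_prop, d_obs, sigma)
        if np.log(rng.random()) < lp_prop - lp:
            lam, lp = lam_prop, lp_prop
        samples.append(m.copy())
    return samples
```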
Abstract:
This study presents an innovative methodology for forensic science image analysis for event reconstruction. The methodology is based on experiences from real cases. It provides real added value to technical guidelines such as standard operating procedures (SOPs) and enriches the communities of practice engaged in this field. This bottom-up solution outlines the many facets of analysis and the complexity of the decision-making process. Additionally, the methodology provides a backbone for articulating more detailed and technical procedures and SOPs. It emerged from a grounded theory approach; data from individual and collective interviews with eight Swiss and nine European forensic image analysis experts were collected and interpreted in a continuous, circular and reflexive manner. Throughout the process of conducting interviews and panel discussions, similarities and discrepancies were discussed in detail to provide a comprehensive picture of practices and points of view and to ultimately formalise shared know-how. Our contribution sheds light on the complexity of the choices, actions and interactions along the path of data collection and analysis, enhancing both the researchers' and participants' reflexivity.
Abstract:
This paper describes an evaluation framework that allows a standardized and quantitative comparison of IVUS lumen and media segmentation algorithms. This framework was introduced at the MICCAI 2011 Computing and Visualization for (Intra)Vascular Imaging (CVII) workshop, comparing the results of the eight participating teams. We describe the available database, comprising multi-center, multi-vendor and multi-frequency IVUS datasets, their acquisition, the creation of the reference standard, and the evaluation measures. The approaches address segmentation of the lumen, the media, or both borders; semi- or fully-automatic operation; and 2-D versus 3-D methodology. Three performance measures for quantitative analysis have been proposed. The results of the evaluation indicate that segmentation of the vessel lumen and media is possible with an accuracy comparable to manual annotation when semi-automatic methods are used, and that encouraging results can also be obtained with fully-automatic segmentation. The analysis performed in this paper also highlights the challenges in IVUS segmentation that remain to be solved.
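The abstract does not reproduce the three performance measures themselves; as an illustration of the kind of quantitative comparison involved, the following minimal Python sketch computes one standard overlap measure (the Jaccard index) between a binary automatic segmentation mask and the reference standard. This is an assumed example, not necessarily one of the workshop's measures.

```python
import numpy as np

def jaccard(auto_mask: np.ndarray, ref_mask: np.ndarray) -> float:
    """Jaccard index |A & B| / |A | B| between two binary masks
    (1.0 means perfect agreement with the reference standard)."""
    inter = np.logical_and(auto_mask, ref_mask).sum()
    union = np.logical_or(auto_mask, ref_mask).sum()
    return float(inter) / float(union) if union else 1.0
```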
Abstract:
The GH-2000 and GH-2004 projects have developed a method for detecting GH misuse based on measuring insulin-like growth factor-I (IGF-I) and the amino-terminal pro-peptide of type III collagen (P-III-NP). The objectives were to analyze more samples from elite athletes to improve the reliability of the decision limit estimates, to evaluate whether the existing decision limits needed revision, and to validate further non-radioisotopic assays for these markers. The study included 998 male and 931 female elite athletes. Blood samples were collected according to World Anti-Doping Agency (WADA) guidelines at various sporting events, including the 2011 International Association of Athletics Federations (IAAF) World Athletics Championships in Daegu, South Korea. IGF-I was measured by the Immunotech A15729 IGF-I IRMA, the Immunodiagnostic Systems iSYS IGF-I assay and a recently developed mass spectrometry (LC-MS/MS) method. P-III-NP was measured by the Cisbio RIA-gnost P-III-P, Orion UniQ PIIINP RIA and Siemens ADVIA Centaur P-III-NP assays. The GH-2000 score decision limits were developed using existing statistical techniques. Decision limits were determined using a specificity of 99.99% and an allowance for uncertainty due to the finite sample size. The revised Immunotech IGF-I - Orion P-III-NP assay combination decision limit did not change significantly following the addition of the new samples. The new decision limits apply to currently available non-radioisotopic assays for measuring IGF-I and P-III-NP in elite athletes, which should allow wider flexibility in implementing the GH-2000 marker test for GH misuse while providing some resilience against manufacturer withdrawal or change of assays.
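As an illustration of how a decision limit at 99.99% specificity with a finite-sample allowance might be computed from marker scores in a reference (non-doped) population, here is a minimal Python sketch. It assumes approximately normal scores and uses a crude one-sided tolerance adjustment; the GH-2000/GH-2004 projects use their own published statistical techniques, which this code does not claim to reproduce.

```python
import numpy as np
from scipy import stats

def decision_limit(scores, specificity=0.9999, confidence=0.95):
    """Upper decision limit for a marker score, assuming normality.

    scores: GH-2000-style scores from a reference (non-doped) population.
    """
    n = len(scores)
    mean, sd = np.mean(scores), np.std(scores, ddof=1)
    z = stats.norm.ppf(specificity)  # one-sided 99.99% quantile
    # Approximate standard error of the estimated quantile mean + z*sd;
    # the resulting allowance widens the limit for small samples.
    se = sd * np.sqrt(1.0 / n + z ** 2 / (2.0 * (n - 1)))
    allowance = stats.norm.ppf(confidence) * se
    return mean + z * sd + allowance
```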
Abstract:
The EpiNet project has been established to facilitate investigator-initiated clinical research in epilepsy, to undertake epidemiological studies, and to simultaneously improve the care of patients who have records created within the EpiNet database. The EpiNet database has recently been adapted to collect detailed information regarding status epilepticus. An incidence study is now underway in Auckland, New Zealand in which the incidence of status epilepticus in the greater Auckland area (population: 1.5 million) will be calculated. The form that has been developed for this study can be used in the future to collect information for randomized controlled trials in status epilepticus. This article is part of a Special Issue entitled "Status Epilepticus".
Abstract:
Routinely collected health data, obtained for administrative and clinical purposes without specific a priori research goals, are increasingly used for research. The rapid evolution and availability of these data have revealed issues not addressed by existing reporting guidelines, such as Strengthening the Reporting of Observational Studies in Epidemiology (STROBE). The REporting of studies Conducted using Observational Routinely collected health Data (RECORD) statement was created to fill these gaps. RECORD was created as an extension to the STROBE statement to address reporting items specific to observational studies using routinely collected health data. RECORD consists of a checklist of 13 items related to the title, abstract, introduction, methods, results, and discussion sections of articles, and other information required for inclusion in such research reports. This document contains the checklist and explanation and elaboration information to enhance the use of the checklist. Examples of good reporting for each RECORD checklist item are also included herein. This document, as well as the accompanying website and message board (http://www.record-statement.org), will enhance the implementation and understanding of RECORD. Through implementation of RECORD, authors, journal editors, and peer reviewers can encourage transparency of research reporting.
Abstract:
BACKGROUND: Developing and updating high-quality guidelines requires substantial time and resources. To reduce duplication of effort and enhance efficiency, we developed a process for guideline adaptation and assessed initial perceptions of its feasibility and usefulness. METHODS: Based on preliminary developments and empirical studies, a series of meetings with guideline experts was organised to define a process for guideline adaptation (ADAPTE) and to develop a manual and a toolkit made available on a website (http://www.adapte.org). Potential users, namely guideline developers and implementers, were invited to register and to complete a questionnaire evaluating their perceptions of the proposed process.
Abstract:
Overall Equipment Effectiveness (OEE) is the key metric of operational excellence. OEE monitors the actual performance of equipment relative to its performance capabilities under optimal manufacturing conditions. It looks at the entire manufacturing environment, measuring, in addition to equipment availability, the production efficiency while the equipment is available to run products, as well as the efficiency loss that results from scrap, rework, and yield losses. The analysis of OEE reveals improvement opportunities for the operation. One of the tools used for OEE improvement is the Six Sigma DMAIC methodology, a set of practices originally developed to improve processes by eliminating defects. It asserts that continuous efforts to reduce variation in process outputs are key to business success, and that manufacturing and business processes can be measured, analyzed, improved, and controlled. In the case of the Bottomer line AD2378 in the Papsac Maghreb Casablanca plant, the OEE figure reached 48.65%, which is below the accepted OEE group performance. This required immediate action to improve OEE. This Master's thesis focuses on the application of the Six Sigma DMAIC methodology to OEE improvement on the Bottomer line AD2378 in the Papsac Maghreb Casablanca plant. First, the use of Six Sigma DMAIC and OEE in operations measurement is discussed. Afterwards, the DMAIC phases guide the identification of an improvement focus, the identification of the causes of low OEE performance, and the design of improvement solutions, which are then implemented to allow further tracking of their impact on plant operations.
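Since OEE is conventionally computed as the product of availability, performance, and quality, a short worked example helps fix ideas. The factor values below are illustrative assumptions chosen to land near the 48.65% reported for the Bottomer line; they are not taken from the thesis.

```python
def oee(availability: float, performance: float, quality: float) -> float:
    """Overall Equipment Effectiveness as a fraction in [0, 1]."""
    return availability * performance * quality

# Illustrative factors: 75% availability, 72% performance, 90% quality.
print(f"OEE = {oee(0.75, 0.72, 0.90):.2%}")  # OEE = 48.60%
```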
Abstract:
Ever since the inception of economics over two hundred years ago, the tools at the discipline's disposal have grown more and more sophisticated. This book provides a historical introduction to the methodology of economics through the eyes of economists. The story begins with John Stuart Mill's seminal essay from 1836 on the definition and method of political economy, which is then followed by an examination of how the actual practices of economists changed over time to such an extent that they not only altered their methods of enquiry, but also their self-perception as economists. Beginning as intellectuals and journalists operating to a large extent in the public sphere, they then transformed into experts who developed their tools of research increasingly behind the scenes. No longer did they try to influence policy agendas through public discourse; rather they targeted policymakers directly and with instruments that showed them as independent and objective policy advisors, the tools of the trade changing all the while. In order to shed light on this evolution of economic methodology, this book takes carefully selected snapshots from the discipline's history. It tracks the process of development through the nineteenth and twentieth centuries, analysing the growth of empirical and mathematical modelling. It also looks at the emergence of the experiment in economics, in addition to the similarities and differences between modelling and experimentation. This book will be relevant reading for students and academics in the fields of economic methodology, history of economics, and history and philosophy of the social sciences.
Abstract:
Objectives: Cerebrospinal fluid (CSF) biomarker assays are not part of the recommended diagnostic work-up for Alzheimer's disease (AD) in France. We aimed to analyze the contribution of these assays in daily clinical practice. Material and methods: Retrospective observational study of all CSF AD biomarker assays performed between 1 November 2010 and 30 September 2012 in the day hospital (HDJ) and the geriatric internal medicine department (SMIG) of the memory resources and research center (CMRR) of Strasbourg University Hospitals (Alsace, France). Results: Ninety-seven patients (women: 60.8%; mean age: 80 ± 6.5 years) were included. In the day hospital (n = 50), biomarkers were used for the positive diagnosis of AD (64.0%) or the differential diagnosis between dementias (36.0%). In the SMIG (n = 47), assays were performed to confirm AD (19.1%), to look for a cognitive pathology underlying a confusional syndrome (17.0%), or to diagnose dementia in patients with psychiatric disorders (29.8%). While 49.5% of patients had a confirmed diagnosis of AD, biomarkers helped rule out this etiology in 9.2% of cases. Doubt between AD and another etiology nevertheless persisted in 10 patients. Comparative analyses of biomarker levels showed that tau protein was significantly higher in AD than in vascular dementia (p = 0.003), with borderline significance versus Parkinson's disease (p = 0.06). The profile observed for Ptau was similar, but significance was reached versus Parkinson's disease dementia (p = 0.01). Regarding Aβ1-42, while mean levels were highest in vascular dementia and dementia with Lewy bodies (p < 0.0001 and p < 0.01), they were lower in Parkinson's disease dementia, without reaching significance (p = 0.12). Conclusion: This study analyzed the use of AD biomarkers in routine practice. While their main current value lies in the diagnosis of AD at a mild stage, these biomarkers prove useful in situations where clinical diagnosis is made difficult by a psychiatric disorder and/or confusion, an atypical presentation, or when cognitive testing is not feasible.
Abstract:
BACKGROUND: New generation transcatheter heart valves (THV) may improve clinical outcomes of transcatheter aortic valve implantation. METHODS AND RESULTS: In a nationwide, prospective, multicenter cohort study (Swiss Transcatheter Aortic Valve Implantation Registry, NCT01368250), outcomes of consecutive transfemoral transcatheter aortic valve implantation patients treated with the Sapien 3 THV (S3) versus the Sapien XT THV (XT) were investigated. A total of 153 consecutive S3 patients were compared with 445 consecutive XT patients. Postprocedural mean transprosthetic gradient did not differ between S3 and XT patients (6.5±3.0 versus 7.8±6.3 mm Hg, P=0.17). The rates of more than mild paravalvular regurgitation (1.3% versus 5.3%, P=0.04) and of vascular complications (5.3% versus 16.9%, P<0.01) were significantly lower in S3 patients. A higher rate of new permanent pacemaker implantation was observed in patients receiving the S3 valve (17.0% versus 11.0%, P=0.01). There were no significant differences in disabling stroke (S3 1.3% versus XT 3.1%, P=0.29) or all-cause mortality (S3 3.3% versus XT 4.5%, P=0.27). CONCLUSIONS: The use of the new generation S3 balloon-expandable THV reduced the risk of more than mild paravalvular regurgitation and of vascular complications, but was associated with an increased permanent pacemaker rate compared with the XT. Transcatheter aortic valve implantation using the newest generation balloon-expandable THV is associated with a low risk of stroke and favorable clinical outcomes. CLINICAL TRIAL REGISTRATION: URL: http://www.clinicaltrials.gov. Unique identifier: NCT01368250.
Abstract:
The Faculty of Business and Communication recently started an internationalization process that, in two years' time, will allow all undergraduate students (studying Journalism, Audiovisual Communication, Advertising and Public Relations, Business and Marketing) to take 25% of their subjects in English using CLIL methodology. Currently, Journalism is the degree course with the greatest percentage of CLIL subjects, for example Current Affairs Workshop, a subject dedicated to analyzing current news using opinion genres. Moreover, because of the lack of other subjects offered in English, ERASMUS students have to take some journalism subjects in order to complete their international passport, and one of the classes they choose is the Current Affairs Workshop. The aim of this paper is to explore how CLIL methodology can be useful for learning journalistic opinion genres (chat-shows, discussions and debates) in a subject where Catalan Communication students - with different levels of English - share their knowledge with European students of other social disciplines. Students work in multidisciplinary groups in which they develop real radio and TV programs, adopting all the roles (moderator, technician, producer and participants) and analyzing daily newspapers and other sources to create content based on current affairs. This paper is based on the participant observation of the lecturers of the subject, who have designed different activities related to journalistic genres in which students can develop their skills according to the role they play in each assignment. Examples of successful lessons are given, together with the positive and negative results of the course. Although the objective of the course is to examine professional routines related to opinion genres, and students are not directly graded on their level of English, the Catalan students come to appreciate how they finally overcome their fear of working in a foreign language. This is a key outcome of their experience.
Abstract:
BACKGROUND: Most available pharmacotherapies for alcohol-dependent patients target abstinence; however, reduced alcohol consumption may be a more realistic goal. Using randomized clinical trial (RCT) data, a previous microsimulation model evaluated the clinical relevance of reduced consumption in terms of avoided alcohol-attributable events. Using real-life observational data, the current analysis aimed to adapt the model and confirm previous findings about the clinical relevance of reduced alcohol consumption. METHODS: Based on the prospective observational CONTROL study, which evaluated daily alcohol consumption among alcohol-dependent patients, the model predicted the probability of drinking any alcohol during a given day. Predicted daily alcohol consumption was simulated in a hypothetical sample of 200,000 patients observed over a year. Individual total alcohol consumption (TAC) and number of heavy drinking days (HDD) were derived. Using published risk equations, the probabilities of alcohol-attributable adverse health events (e.g., hospitalizations or death) corresponding to the simulated consumption were computed and aggregated for categories of patients defined by HDDs and TAC (expressed per 100,000 patient-years). Sensitivity analyses tested model robustness. RESULTS: Shifting from >220 HDDs per year to 120-140 HDDs, and from 36,000-39,000 g TAC per year (120-130 g/day) to 15,000-18,000 g TAC per year (50-60 g/day), had a substantial impact on the incidence of events (14,588 and 6,148 events avoided per 100,000 patient-years, respectively). Results were robust to sensitivity analyses. CONCLUSIONS: This study corroborates the previous microsimulation modeling approach and, using real-life data, confirms RCT-based findings that reduced alcohol consumption is a relevant objective for consideration in alcohol dependence management to improve public health.
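A minimal Python sketch of this kind of microsimulation follows: each patient's daily drinking is simulated over a year, TAC and HDD are derived, and the simulated consumption is mapped to event probabilities. drink_probability and event_risk are hypothetical placeholders standing in for the CONTROL-based prediction model and the published risk equations, and the heavy-drinking threshold and distributional settings are assumptions, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(42)
HEAVY_G = 60.0  # assumed heavy-drinking threshold, grams of alcohol/day

def drink_probability(drank_yesterday: bool) -> float:
    # Placeholder: the real model predicts this probability from patient
    # covariates and drinking history estimated on CONTROL study data.
    return 0.8 if drank_yesterday else 0.4

def event_risk(tac: float, hdd: int) -> float:
    # Placeholder for the published risk equations linking TAC/HDD to
    # alcohol-attributable events (e.g., hospitalization or death).
    return min(1.0, 1e-6 * tac + 1e-4 * hdd)

def simulate_patient(days: int = 365) -> tuple[float, int]:
    tac, hdd, drank = 0.0, 0, True
    for _ in range(days):
        drank = rng.random() < drink_probability(drank)
        grams = rng.lognormal(mean=4.0, sigma=0.6) if drank else 0.0
        tac += grams
        hdd += grams >= HEAVY_G
    return tac, hdd

# Scale n up to 200,000 to mirror the hypothetical sample in the study.
results = [simulate_patient() for _ in range(1_000)]
events_per_100k = 1e5 * np.mean([event_risk(t, h) for t, h in results])
```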
Abstract:
BACKGROUND: In Switzerland, patients may undergo "blood tests" without being informed what these are screening for. Inadequate doctor-patient communication may result in patient misunderstanding. We examined what patients in the emergency department (ED) believed they had been screened for and explored their attitudes to routine (non-targeted) human immunodeficiency virus (HIV) screening. METHODS: Between 1st October 2012 and 28th February 2013, a questionnaire-based survey was conducted among patients aged 16-70 years presenting to the ED of Lausanne University Hospital. Patients were asked: (1) if they believed they had been screened for HIV; (2) if they agreed in principle to routine HIV screening; and (3) if they agreed to be HIV tested during their current ED visit. RESULTS: Of 466 eligible patients, 411 (88%) agreed to participate. Mean age was 46 ± 16 years; 192 patients (47%) were women; 366 (89%) were Swiss or European; 113 (27%) believed they had been screened for HIV, the proportion increasing with age (p ≤ 0.01); 297 (72%) agreed in principle with routine HIV testing in the ED; and 138 patients (34%) agreed to be HIV tested during their current ED visit. CONCLUSION: In this ED population, 27% incorrectly believed they had been screened for HIV. Over 70% agreed in principle with routine HIV testing and 34% agreed to be tested during their current visit. These results demonstrate willingness among patients regarding routine HIV testing in the ED and highlight a need for improved doctor-patient communication about what a blood test specifically screens for.