62 results for Supporting methodology
Abstract:
The 2010 Position Development Conference addressed four questions related to the impact of previous fractures on 10-year fracture risk as calculated by FRAX®. To address these questions, PubMed was searched on the keywords "fracture, epidemiology, osteoporosis." Titles of retrieved articles were reviewed for an indication that risk for future fracture was discussed. Abstracts of these articles were reviewed for an indication that one or more of the questions listed above was discussed. For those that did, the articles were reviewed in greater detail to extract the findings and to find additional past work and citing works that also bore on the questions. The official positions and the supporting literature review are presented here. FRAX® underestimates fracture probability in persons with a history of multiple fractures (good, A, W). FRAX® may underestimate fracture probability in individuals with prevalent severe vertebral fractures (good, A, W). While there is evidence that hip, vertebral, and humeral fractures appear to confer greater risk of subsequent fracture than fractures at other sites, quantification of this incremental risk in FRAX® is not possible (fair, B, W). FRAX® may underestimate fracture probability in individuals with a parental history of non-hip fragility fracture (fair, B, W). Limitations of the methodology include performance by a single reviewer, preliminary review of the literature being confined to titles, and secondary review being limited to abstracts. Limitations of the evidence base include publication bias, overrepresentation of persons of European descent in the published studies, and technical differences in the methods used to identify prevalent and incident fractures. Emerging topics for future research include fracture epidemiology in non-European populations and men, the impact of fractures in family members other than parents, and the genetic contribution to fracture risk.
Abstract:
In Switzerland, organ procurement is well organized at the national level, but transplant outcomes have not been systematically monitored so far. Therefore, a novel project, the Swiss Transplant Cohort Study (STCS), was established. The STCS is a prospective multicentre study, designed as a dynamic cohort, which enrolls all solid organ recipients at the national level. Its features include a flexible patient-case system that captures all transplant scenarios, together with the collection of patient-specific and allograft-specific data. Beyond comprehensive clinical data, specific focus is directed at psychosocial and behavioral factors, infectious disease development, and bio-banking. Between May 2008 and the end of 2011, the six Swiss transplant centers recruited 1,677 patients involving 1,721 transplantations, with a total of 1,800 organs implanted in 15 different transplantation scenarios. 10% of all patients underwent re-transplantation and 3% had a second transplantation, either in the past or during follow-up. 34% of all kidney allografts originated from living donation. Until the end of 2011 we observed 4,385 infection episodes in our patient population. The STCS demonstrated the operational capability to collect high-quality data and to adequately reflect the complexity of the post-transplantation process. The STCS represents a promising novel project for comparative effectiveness research in transplantation medicine.
Abstract:
BACKGROUND: Acute kidney injury (AKI) is common in patients undergoing cardiac surgery, among whom it is associated with poor outcomes, prolonged hospital stays and increased mortality. Statins exert several effects independent of their lipid-lowering action and may protect against kidney injury by inhibiting postoperative inflammatory responses. OBJECTIVES: This review examined the evidence for the benefits of perioperative statins in preventing AKI in hospitalised adults undergoing surgery requiring cardiac bypass. The main objectives were to determine 1) whether use of statins was associated with prevention of AKI development; 2) whether use of statins was associated with reductions in in-hospital mortality; 3) whether use of statins was associated with reduced need for renal replacement therapy (RRT); and 4) any adverse effects associated with the use of statins. SEARCH METHODS: We searched the Cochrane Renal Group's Specialised Register to 13 January 2015 through contact with the Trials' Search Co-ordinator using search terms relevant to this review. SELECTION CRITERIA: Randomised controlled trials (RCTs) that compared administration of statin therapy with placebo or standard clinical care in adult patients undergoing surgery requiring cardiopulmonary bypass, and that reported AKI, serum creatinine (SCr) or need for RRT as an outcome, were eligible for inclusion. All forms and dosages of statins, in conjunction with any duration of pre-operative therapy, were considered for inclusion in this review. DATA COLLECTION AND ANALYSIS: All authors extracted data independently and assessments were cross-checked by a second author. Likewise, assessment of study risk of bias was initially conducted by one author and then by a second author to ensure accuracy. Disagreements were arbitrated among authors until consensus was reached.
Authors from two of the included studies provided additional data on post-operative SCr as well as need for RRT. Meta-analyses were used to assess the outcomes of AKI, SCr and mortality rate. Data for the outcomes of RRT and adverse effects were not pooled. Adverse effects taken into account were those reported by the authors of the included studies. MAIN RESULTS: We included seven studies (662 participants) in this review. All except one study were assessed as being at high risk of bias. Three studies assessed atorvastatin, three assessed simvastatin and one investigated rosuvastatin. All studies collected data during the immediate perioperative period only; data collection to hospital discharge and postoperative biochemical data collection ranged from 24 hours to 7 days. Overall, pre-operative statin treatment was not associated with a reduction in postoperative AKI, need for RRT, or mortality. Only two studies (195 participants) reported postoperative SCr level. In those studies, patients allocated to receive statins had lower postoperative SCr concentrations than those allocated to no drug treatment/placebo (MD -21.2 µmol/L, 95% CI -31.1 to -11.1). Adverse effects were adequately reported in only one study; no difference was found between the statin group and placebo. AUTHORS' CONCLUSIONS: Analysis of currently available data did not suggest that preoperative statin use is associated with a decreased incidence of AKI in adults undergoing surgery requiring cardiac bypass. Although a significant reduction in SCr was seen postoperatively in people treated with statins, this result was driven by a single study, in which SCr was a secondary outcome. The results of the meta-analysis should be interpreted with caution; few studies were included in subgroup analyses, and significant differences in methodology exist among the included studies.
Large, high-quality RCTs are required to establish the safety and efficacy of statins for preventing AKI after cardiac surgery.
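The pooled mean difference and confidence interval quoted above come from standard inverse-variance meta-analysis. A minimal sketch of the fixed-effect version, using hypothetical per-study inputs rather than the review's actual data:

```python
import math

def pool_fixed_effect(estimates):
    """Fixed-effect inverse-variance pooling.
    estimates: list of (mean_difference, standard_error) pairs, one per study."""
    weights = [1.0 / se ** 2 for _, se in estimates]
    pooled = sum(w * md for (md, _), w in zip(estimates, weights)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    # 95% confidence interval under a normal approximation
    ci = (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)
    return pooled, ci

# Hypothetical per-study mean differences in SCr (µmol/L) and standard errors
pooled, (lo, hi) = pool_fixed_effect([(-25.0, 6.0), (-18.0, 7.5)])
```

A random-effects model would additionally estimate between-study heterogeneity and widen the interval accordingly, which matters when, as noted above, the included studies differ substantially in methodology.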
Abstract:
Water is often considered to be an ordinary substance since it is transparent, odourless, tasteless and it is very common in nature. As a matter of fact, it can be argued that it is the most remarkable of all substances. Without water, life on Earth would not exist.
Water is the major component of cells, typically forming 70 to 95% of cellular mass, and it provides an environment for innumerable organisms to live in, since it covers 75% of the Earth's surface. Water is a simple molecule made of two hydrogen atoms and one oxygen atom, H2O. The small size of the molecule stands in contrast with its unique physical and chemical properties. Among these, the fact that, at the triple point, liquid water is denser than ice is especially remarkable. Despite its special importance in life science, water is systematically removed from biological specimens investigated by electron microscopy. This is because the high vacuum of the electron microscope requires that the biological specimen be observed in dry conditions. For 50 years the science of electron microscopy has addressed this problem, resulting in numerous preparation techniques now in routine use. Typically these techniques consist of fixing the sample (chemically or by freezing) and replacing its water with a plastic that is transformed into a rigid block by polymerisation. The block is then cut into thin sections (c. 50 nm) with an ultramicrotome at room temperature. Usually, these techniques introduce several artefacts, most of them due to water removal. In order to avoid these artefacts, the specimen can be frozen, cut and observed at low temperature. However, liquid water crystallizes into ice upon freezing, causing severe damage. Ideally, liquid water is solidified into a vitreous state. Vitrification consists of solidifying water so rapidly that ice crystals have no time to form. A breakthrough took place when vitrification of pure water was discovered. Since this discovery, the thin-film vitrification method has been used with success for the observation of biological suspensions of small particles. Our work aimed to extend the method to bulk biological samples, which have to be vitrified, cryosectioned into vitreous sections and observed in a cryo-electron microscope.
This technique is called cryo-electron microscopy of vitreous sections (CEMOVIS). It is now believed to be the best way to preserve the ultrastructure of biological tissues and cells very close to the native state for electron microscopic observation. Recently, CEMOVIS has become a practical method achieving excellent results. It has, however, some severe limitations, the most important of which is certainly due to cutting artefacts. These are a consequence of the nature of the vitreous material and of the fact that vitreous sections cannot be floated on a liquid, as is the case for plastic sections cut at room temperature. The aim of the present work has been to improve our understanding of the cutting process and of cutting artefacts, and thus to find optimal conditions to minimise or prevent these artefacts. An improved model of the cutting process and redefinitions of cutting artefacts are proposed. Results obtained with CEMOVIS under these conditions are presented and compared with results obtained with conventional methods.
Abstract:
This paper aims at detecting spatio-temporal clustering in fire sequences using space-time scan statistics, a powerful statistical framework for the analysis of point processes. The methodology is applied to active fire detections in the state of Florida (US) identified by MODIS (Moderate Resolution Imaging Spectroradiometer) during the period 2003-06. Results of the present study show that statistically significant clusters can be detected and localized in specific areas and periods of the year. Three out of the five most likely clusters detected for the entire study period are localized in the north of the state, and they cover forest areas; the other two clusters cover a large zone in the south, corresponding to agricultural land and the prairies in the Everglades. In order to analyze whether the wildfires recur each year during the same period, the analyses were performed separately for the four years: it emerges that clusters of forest fires are more frequent in hot seasons (spring and summer), while in the southern areas they are present throughout the year. The recognition of overdensities of events and the ability to locate them in space and time can help support fire management and focus prevention measures.
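The space-time scan statistic evaluates a Poisson likelihood ratio over cylindrical windows (a spatial circle crossed with a time interval). A minimal sketch of the core computation, assuming event points on a unit square with time in [0, 1] and complete spatio-temporal randomness as the null hypothesis; this is illustrative only and omits the Monte Carlo significance testing and the search over multiple window sizes used in practice:

```python
import math

def poisson_llr(c, expected, total):
    """Kulldorff-style Poisson log-likelihood ratio for one window:
    higher values mean a larger excess of observed (c) over expected cases."""
    if c <= expected or expected <= 0:
        return 0.0
    llr = c * math.log(c / expected)
    if c < total:
        llr += (total - c) * math.log((total - c) / (total - expected))
    return llr

def space_time_scan(points, radius, t_len):
    """Brute-force space-time scan over cylinders anchored at each event.
    points: (x, y, t) triples on the unit square, t in [0, 1].
    Returns the best log-likelihood ratio and its anchoring cylinder."""
    total = len(points)
    best_llr, best_cyl = 0.0, None
    for (cx, cy, ct) in points:
        # Count events inside the cylinder: circle of given radius,
        # time interval [ct, ct + t_len]
        c = sum(1 for (x, y, t) in points
                if (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2
                and ct <= t <= ct + t_len)
        # Expected count under complete spatio-temporal randomness
        expected = total * min(math.pi * radius ** 2, 1.0) * t_len
        llr = poisson_llr(c, expected, total)
        if llr > best_llr:
            best_llr, best_cyl = llr, (cx, cy, ct)
    return best_llr, best_cyl
```

In a full implementation the statistic is maximized over many radii and interval lengths, and the p-value of the most likely cluster is obtained by re-running the scan on data simulated under the null.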
Abstract:
For the first time in Finland, the chemical profiling of cocaine specimens was performed at the National Bureau of Investigation (NBI). The main goals were to determine the chemical composition of cocaine specimens sold on the Finnish market and to study the distribution networks of cocaine in order to provide intelligence related to its trafficking. An analytical methodology was therefore implemented and validated that enables, through a single GC-MS injection, determination of the added cutting agents (adulterants and diluents), the cocaine purity, and the chemical profile (based on the major and minor alkaloids) for each specimen. The methodology was found to be effective at discriminating between specimens from the same source and specimens from different sources. The results highlighted the practical utility of chemical profiling, especially for supporting investigations through operational intelligence and for improving knowledge of cocaine trafficking through strategic intelligence.
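Discrimination between specimens in such profiling work typically rests on a similarity measure computed over normalized alkaloid peak areas. A minimal sketch, assuming hypothetical profile vectors and using cosine similarity, one of several measures used in the profiling literature (the abstract does not state which measure the NBI methodology used):

```python
import math

def normalize(profile):
    """Scale a vector of peak areas to unit length, removing the
    effect of overall specimen amount or dilution."""
    norm = math.sqrt(sum(p * p for p in profile))
    return [p / norm for p in profile]

def profile_similarity(a, b):
    """Cosine similarity between two alkaloid profiles:
    1.0 means identical relative composition."""
    ua, ub = normalize(a), normalize(b)
    return sum(x * y for x, y in zip(ua, ub))
```

Specimen pairs whose similarity exceeds a threshold calibrated on known same-batch pairs can be flagged as potentially linked; the cutting agents and purity provide complementary, non-profile indicators.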
Abstract:
Diagrams and tools help to support task modelling in engineering and process management. Unfortunately, they are ill-suited to a business context at a strategic level, because of the flexibility needed for creative thinking and user-friendly interaction. We propose a tool that bridges the gap between freedom of action, encouraging creativity, and constraints, allowing validation and advanced features.
Abstract:
Probabilistic inversion methods based on Markov chain Monte Carlo (MCMC) simulation are well suited to quantify parameter and model uncertainty of nonlinear inverse problems. Yet, application of such methods to CPU-intensive forward models can be a daunting task, particularly if the parameter space is high dimensional. Here, we present a 2-D pixel-based MCMC inversion of plane-wave electromagnetic (EM) data. Using synthetic data, we investigate how model parameter uncertainty depends on model structure constraints using different norms of the likelihood function and the model constraints, and study the added benefits of joint inversion of EM and electrical resistivity tomography (ERT) data. Our results demonstrate that model structure constraints are necessary to stabilize the MCMC inversion results of a highly discretized model. These constraints decrease model parameter uncertainty and facilitate model interpretation. A drawback is that these constraints may lead to posterior distributions that do not fully include the true underlying model, because some of its features exhibit a low sensitivity to the EM data, and hence are difficult to resolve. This problem can be partly mitigated if the plane-wave EM data is augmented with ERT observations. The hierarchical Bayesian inverse formulation introduced and used herein is able to successfully recover the probabilistic properties of the measurement data errors and a model regularization weight. Application of the proposed inversion methodology to field data from an aquifer demonstrates that the posterior mean model realization is very similar to that derived from a deterministic inversion with similar model constraints.
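The role of a model structure constraint in stabilizing a pixel-based MCMC inversion can be sketched with a toy example: a 1-D pixel model, an identity "forward model" standing in for the CPU-intensive plane-wave EM solver, a Gaussian (L2) likelihood, and a smoothness prior penalizing differences between neighboring pixels. All names and settings here are illustrative assumptions, not the study's implementation:

```python
import math
import random

def log_likelihood(model, data, forward, sigma):
    """Gaussian (L2) data-misfit term with noise level sigma."""
    pred = forward(model)
    return -0.5 * sum((d - p) ** 2 for d, p in zip(data, pred)) / sigma ** 2

def log_smoothness_prior(model, weight):
    """Model structure constraint: penalize jumps between neighboring pixels."""
    return -weight * sum((a - b) ** 2 for a, b in zip(model, model[1:]))

def metropolis(data, forward, n_pix, sigma, weight,
               n_iter=4000, step=0.1, seed=1):
    """Single-pixel random-walk Metropolis sampler for the posterior."""
    rng = random.Random(seed)
    model = [0.0] * n_pix
    logp = (log_likelihood(model, data, forward, sigma)
            + log_smoothness_prior(model, weight))
    samples = []
    for _ in range(n_iter):
        # Perturb one randomly chosen pixel
        prop = model[:]
        prop[rng.randrange(n_pix)] += rng.gauss(0.0, step)
        logp_prop = (log_likelihood(prop, data, forward, sigma)
                     + log_smoothness_prior(prop, weight))
        # Metropolis accept/reject
        if math.log(rng.random()) < logp_prop - logp:
            model, logp = prop, logp_prop
        samples.append(model[:])
    return samples
```

Raising `weight` narrows the posterior (lower parameter uncertainty) but can bias it away from true models whose rough features the data cannot resolve, which is the trade-off the abstract describes; treating `weight` itself as an unknown, as in the study's hierarchical formulation, lets the data inform the regularization.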
Abstract:
This study presents an innovative methodology for forensic science image analysis for event reconstruction. The methodology is based on experiences from real cases. It provides real added value to technical guidelines such as standard operating procedures (SOPs) and enriches the communities of practice in this field. This bottom-up solution outlines the many facets of analysis and the complexity of the decision-making process. Additionally, the methodology provides a backbone for articulating more detailed and technical procedures and SOPs. It emerged from a grounded theory approach; data from individual and collective interviews with eight Swiss and nine European forensic image analysis experts were collected and interpreted in a continuous, circular and reflexive manner. Throughout the process of conducting interviews and panel discussions, similarities and discrepancies were discussed in detail to provide a comprehensive picture of practices and points of view and to ultimately formalise shared know-how. Our contribution sheds light on the complexity of the choices, actions and interactions along the path of data collection and analysis, enhancing both the researchers' and participants' reflexivity.
Abstract:
The GH-2000 and GH-2004 projects have developed a method for detecting GH misuse based on measuring insulin-like growth factor-I (IGF-I) and the amino-terminal pro-peptide of type III collagen (P-III-NP). The objectives were to analyze more samples from elite athletes to improve the reliability of the decision limit estimates, to evaluate whether the existing decision limits needed revision, and to validate further non-radioisotopic assays for these markers. The study included 998 male and 931 female elite athletes. Blood samples were collected according to World Anti-Doping Agency (WADA) guidelines at various sporting events including the 2011 International Association of Athletics Federations (IAAF) World Athletics Championships in Daegu, South Korea. IGF-I was measured by the Immunotech A15729 IGF-I IRMA, the Immunodiagnostic Systems iSYS IGF-I assay and a recently developed mass spectrometry (LC-MS/MS) method. P-III-NP was measured by the Cisbio RIA-gnost P-III-P, Orion UniQ PIIINP RIA and Siemens ADVIA Centaur P-III-NP assays. The GH-2000 score decision limits were developed using existing statistical techniques. Decision limits were determined using a specificity of 99.99% and an allowance for uncertainty because of the finite sample size. The revised Immunotech IGF-I - Orion P-III-NP assay combination decision limit did not change significantly following the addition of the new samples. The new decision limits apply to currently available non-radioisotopic assays for measuring IGF-I and P-III-NP in elite athletes, which should allow wider flexibility to implement the GH-2000 marker test for GH misuse while providing some resilience against manufacturer withdrawal or change of assays. Copyright © 2015 John Wiley & Sons, Ltd.
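A decision limit of the kind described sets a threshold that a clean athlete's marker score exceeds only with probability 1 - specificity, plus an allowance for finite-sample uncertainty in the reference distribution. A simplified normal-theory sketch; the actual GH-2000 procedure is more elaborate, and the allowance formula here is a textbook delta-method approximation for the standard error of an estimated normal quantile:

```python
import math
from statistics import NormalDist, fmean, stdev

def decision_limit(scores, specificity=0.9999):
    """Threshold exceeded by a fraction (1 - specificity) of the reference
    population, plus an allowance for finite reference-sample size."""
    n = len(scores)
    m = fmean(scores)
    s = stdev(scores)
    z = NormalDist().inv_cdf(specificity)
    # Approximate standard error of the estimated quantile m + z*s
    se_quantile = s * math.sqrt(1.0 / n + z ** 2 / (2.0 * (n - 1)))
    return m + z * s + 1.96 * se_quantile
```

At a specificity of 99.99%, z is about 3.72, so the limit sits nearly four reference standard deviations above the mean; the allowance term shrinks as more athlete samples are added, which is why enlarging the reference population tightens the decision limits.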
Abstract:
BACKGROUND: Developing and updating high-quality guidelines requires substantial time and resources. To reduce duplication of effort and enhance efficiency, we developed a process for guideline adaptation and assessed initial perceptions of its feasibility and usefulness. METHODS: Based on preliminary developments and empirical studies, a series of meetings with guideline experts were organised to define a process for guideline adaptation (ADAPTE) and to develop a manual and a toolkit made available on a website (http://www.adapte.org). Potential users (guideline developers and implementers) were invited to register and to complete a questionnaire evaluating their perceptions of the proposed process.
Abstract:
Ever since the inception of economics over two hundred years ago, the tools at the discipline's disposal have grown more and more sophisticated. This book provides a historical introduction to the methodology of economics through the eyes of economists. The story begins with John Stuart Mill's seminal essay from 1836 on the definition and method of political economy, which is then followed by an examination of how the actual practices of economists changed over time to such an extent that they not only altered their methods of enquiry, but also their self-perception as economists. Beginning as intellectuals and journalists operating to a large extent in the public sphere, they then transformed into experts who developed their tools of research increasingly behind the scenes. No longer did they try to influence policy agendas through public discourse; rather they targeted policymakers directly, with instruments that showed them as independent and objective policy advisors, the tools of the trade changing all the while. In order to shed light on this evolution of economic methodology, this book takes carefully selected snapshots from the discipline's history. It tracks the process of development through the nineteenth and twentieth centuries, analysing the growth of empirical and mathematical modelling. It also looks at the emergence of the experiment in economics, in addition to the similarities and differences between modelling and experimentation. This book will be relevant reading for students and academics in the fields of economic methodology, history of economics, and history and philosophy of the social sciences.
Abstract:
Hypothesis: The quality of care for chronic patients depends on the collaborative skills of the healthcare providers.1,2 The literature lacks reports of the use of simulation to teach collaborative skills in non-acute care settings. We posit that simulation offers benefits for supporting the development of collaborative practice in non-acute settings. We explored the benefits and challenges of using an Interprofessional Team Objective Structured Clinical Examination (IT-OSCE) as a formative assessment tool. An IT-OSCE is an intervention in which an interprofessional team of trainees interacts with a simulated patient (SP), enabling them to practice collaborative skills in non-acute care settings.5 Simulated patients are people trained to portray patients in a simulated scenario for educational purposes.6,7 Since interprofessional education (IPE) ultimately aims to provide collaborative patient-centered care,8,9 we sought to promote patient-centeredness in the learning process. Methods: The IT-OSCE was conducted with four trios of students from different professions. The debriefing was co-facilitated by the SP with a faculty member. The participants were final-year students in nursing, physiotherapy and medicine. Our research question focused on the introduction of co-facilitated (SP and faculty) debriefing after an IT-OSCE: what are the benefits and challenges of involving the SP during the debriefing? To evaluate the IT-OSCE, an exploratory case study was used to provide fine-grained data.10,11 Four focus groups were conducted: two with students (n=6; n=5), one with SPs (n=3) and one with faculty (n=4). Audiotapes were transcribed for thematic analysis, performed by three researchers who reached consensus on the final set of themes. Results: The thematic analysis showed little differentiation between SP, student and faculty perspectives.
The analysis of transcripts revealed, in particular, that the SP's co-facilitation during the debriefing of an IT-OSCE proved to be feasible. It was appreciated by all the participants and appeared to value and promote patient-centeredness in the learning process. The main challenge concerned the SPs' feedback, in particular how they could report accurate observations to a group of students rather than to individual students. Conclusion: In conclusion, SP methodology using an IT-OSCE seems to be a useful and promising way to train collaborative skills, aligning IPE, simulation-based team training in a non-acute care setting, and patient-centeredness. We acknowledge the limitations of the study, especially the small sample, and consider the exploration of SP-based IPE in non-acute care settings a strength. Future studies could consider the preparation of SPs and faculty as co-facilitators. References: 1. Borrill CS, Carletta J, Carter AJ, et al. The effectiveness of health care teams in the National Health Service. Aston Centre for Health Service Organisational Research. 2001. 2. Reeves S, Lewin S, Espin S, Zwarenstein M. Interprofessional teamwork for health and social care. Oxford: Wiley-Blackwell; 2010. 3. Issenberg S, McGaghie WC, Petrusa ER, Gordon DL, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning - a BEME systematic review. Medical Teacher. 2005;27(1):10-28. 4. McGaghie W, Petrusa ER, Gordon DL, Scalese RJ. A critical review of simulation-based medical education research: 2003-2009. Medical Education. 2010;44(1):50-63. 5. Simmons B, Egan-Lee E, Wagner SJ, Esdaile M, Baker L, Reeves S. Assessment of interprofessional learning: the design of an interprofessional objective structured clinical examination (iOSCE) approach. Journal of Interprofessional Care. 2011;25(1):73-74. 6. Nestel D, Layat Burn C, Pritchard SA, Glastonbury R, Tabak D.
The use of simulated patients in medical education: Guide Supplement 42.1 - Viewpoint. Medical Teacher. 2011;33(12):1027-1029. Disclosures: None. © 2014 by Lippincott Williams & Wilkins, Inc.