Abstract:
Posterior chest wall defects are frequently encountered after excision of tumors, as a result of trauma, or in the setting of wound dehiscence after spine surgery. Various pedicled fasciocutaneous and musculocutaneous flaps have been described for the coverage of these wounds. The advent of perforator flaps has allowed the preservation of muscle function, but their bulk is limited, and musculocutaneous flaps remain widely employed. The trapezius and latissimus dorsi (LD) flaps have been used extensively for upper and middle posterior chest wounds, respectively. Their bulk allows for obliteration of the dead space in deep wounds. The average width of the LD skin paddle is limited to 10-12 cm if closure of the donor site is expected without skin grafting. In 2001, a modification of the skin paddle design was introduced to allow large flaps to be raised without requiring grafts or flaps for donor-site closure. This V-Y pattern allows coverage of large anterior chest defects after mastectomy. We have modified this flap to allow its use for posterior chest wall defects. We describe the flap design, its indications, and its limitations through three clinical cases. Level of Evidence V: This journal requires that authors assign a level of evidence to each article. For a full description of these Evidence-Based Medicine ratings, please refer to the Table of Contents or the online Instructions to Authors at www.springer.com/00266.
Abstract:
PURPOSE: To evaluate the effects of recent advances in magnetic resonance imaging (MRI) radiofrequency (RF) coil and parallel imaging technology on the consistency of brain volume measurements. MATERIALS AND METHODS: In all, 103 whole-brain MRI volumes were acquired on a clinical 3T MRI scanner equipped with a 12- and a 32-channel head coil, using the T1-weighted protocol employed in the Alzheimer's Disease Neuroimaging Initiative study, with parallel imaging accelerations ranging from 1 to 5. An experienced reader performed qualitative ratings of the images. For quantitative analysis, differences in composite width (CW, a measure of image similarity) and boundary shift integral (BSI, a measure of whole-brain atrophy) were calculated. RESULTS: Intra- and intersession comparisons of CW and BSI measures from scans with equal acceleration demonstrated excellent scan-rescan accuracy, even at the highest acceleration applied. Pairs of scans acquired with different accelerations exhibited poor scan-rescan consistency only when differences in the acceleration factor were maximized. A change in the coil hardware between compared scans was found to bias the BSI measure. CONCLUSION: The most important findings are that accelerated acquisitions appear to be compatible with the assessment of high-quality quantitative information and that, for the highest scan-rescan accuracy in serial scans, the acquisition protocol should be kept as consistent as possible over time. J. Magn. Reson. Imaging 2012;36:1234-1240. ©2012 Wiley Periodicals, Inc.
Abstract:
For several millennia, the Syrian dry areas have been a place of interaction between human populations and the environment. While environmental constraints and heterogeneity condition human occupation and the exploitation of resources, socio-political, economic, and historical factors also play a fundamental role. Since the late 1980s, the Syrian dry areas have been viewed as suffering a serious water crisis due to groundwater overdraft. The Syrian administration and international development agencies believe that groundwater overexploitation is also leading to a decline in agricultural activities and to increased poverty, so action is required to address these problems. However, the overexploitation diagnosis needs to be reviewed. The overexploitation discourse appeared in the context of Syria's opening to international organizations and to the market economy, and it echoes the international discourse of a "global water crisis". The diagnosis is based on national indicators recycling old Soviet data that has not been updated. In the post-Soviet era, Syrian national water policy appears to have abandoned large surface-water irrigation projects in favor of a strategy of water-use rationalization and groundwater conservation in crisis regions, especially the district of Salamieh. This groundwater conservation policy has a number of inconsistencies. In the eyes of the administration, and probably of international donors as well, it is justified because it responds to an indisputable environmental emergency. However, actual efforts to conserve water are anecdotal or even counterproductive. The water conservation policy appears, a posteriori, as an extension of the national policy of food self-sufficiency.
The dominant interpretation of overexploitation, and more generally of the water crisis, forecloses any critical discussion of the status of the resources and of the agricultural system in general, and thus stifles any attempt to discuss alternatives with respect to groundwater management, allocation, and their inclusion in development programs. A revised diagnosis of the situation needs to take into account the spatial and temporal dimensions of groundwater exploitation and to analyze the co-evolution of hydrogeological and agricultural systems. It should highlight the adjustments adopted to cope with environmental and economic variability, changes in water availability, and the enforcement of regulatory measures. These elements play an important role in water availability and in the spatial, temporal, and sectoral allocation of the water resource. Groundwater exploitation over the last century has obviously had an impact on the environment, but the changes are not necessarily catastrophic. Current groundwater use in central Syria increases uncertainty by reducing the ability of aquifers to buffer climatic changes. However, the climatic factor is not the only source of uncertainty. The prices of commodities, fuel, land, and water are highly volatile, depending on the market but also on the will (and capacity) of the Syrian state to preserve social peace, and this volatility is another strong source of uncertainty. Research should consider the whole range of possibilities and propose alternatives that take into consideration the risks they imply for water users, the political will to support (or not) local access to water, which involves a redefinition of economic and social objectives, and, finally, the ability of international organizations to reconsider pre-established diagnoses.
Abstract:
RATIONALE AND OBJECTIVES: Dose reduction may compromise patient care because of decreased image quality. Therefore, the dose savings achievable with new dose-reduction techniques need to be thoroughly assessed. To avoid repeated studies in one patient, chest computed tomography (CT) scans at different dose levels were performed in corpses, comparing model-based iterative reconstruction (MBIR) as a tool to enhance image quality with current standard full-dose imaging. MATERIALS AND METHODS: Twenty-five human cadavers were scanned (CT HD750) after contrast medium injection at different, decreasing dose levels D0-D5, and the data were reconstructed with MBIR. The data at the full-dose level, D0, were additionally reconstructed with standard adaptive statistical iterative reconstruction (ASIR), which served as the full-dose baseline reference (FDBR). Two radiologists independently compared image quality (IQ) in 3-mm multiplanar reformations for soft-tissue evaluation of D0-D5 against FDBR (-2, diagnostically inferior; -1, inferior; 0, equal; +1, superior; +2, diagnostically superior). For statistical analysis, the intraclass correlation coefficient (ICC) and the Wilcoxon test were used. RESULTS: Mean CT dose index values (mGy) were as follows: D0/FDBR = 10.1 ± 1.7, D1 = 6.2 ± 2.8, D2 = 5.7 ± 2.7, D3 = 3.5 ± 1.9, D4 = 1.8 ± 1.0, and D5 = 0.9 ± 0.5. Mean IQ ratings were as follows: D0 = +1.8 ± 0.2, D1 = +1.5 ± 0.3, D2 = +1.1 ± 0.3, D3 = +0.7 ± 0.5, D4 = +0.1 ± 0.5, and D5 = -1.2 ± 0.5. All values demonstrated a significant difference from baseline (P < .05), except the mean IQ for D4 (P = .61). The ICC was 0.91. CONCLUSIONS: Compared to ASIR, MBIR allowed for a significant dose reduction of 82% without impairment of IQ. This resulted in a calculated mean effective dose below 1 mSv.
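As a quick sanity check on the arithmetic behind the conclusion above, the drop from the full-dose baseline (D0, mean 10.1 mGy) to the lowest dose level still rated equal in image quality (D4, mean 1.8 mGy) does indeed correspond to the stated 82% reduction:

```python
# Mean CTDIvol values (mGy) taken from the abstract. D4 was the lowest
# dose level whose image quality did not differ significantly from D0.
d0, d4 = 10.1, 1.8
reduction = (d0 - d4) / d0
print(f"dose reduction D0 -> D4: {reduction:.0%}")
```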
Abstract:
The increasing number of bomb attacks involving improvised explosive devices, as well as the nature of the explosives used, gives rise to concern among safety and law enforcement agencies. The substances used in explosive charges are often everyday products diverted from their primary licit applications, so reducing or limiting their accessibility for prevention purposes is difficult. Ammonium nitrate, employed in agriculture as a fertiliser, is used worldwide in small and large homemade bombs. Black powder, dedicated to hunting and shooting sports, is used illegally as a filling in pipe bombs, causing extensive damage. While developments in instrumental techniques for explosive analysis have constantly pushed the limits of detection, their actual contribution to the investigation of explosives in terms of source discrimination remains limited. Forensic science has seen the emergence of a new technology, isotope ratio mass spectrometry (IRMS), that shows promising results. Its very first application in forensic science dates back to 1979, when Liu et al. analysed cannabis plants coming from different countries [Liu et al. 1979]. This preliminary study highlighted the technique's potential to discriminate specimens coming from different sources. Thirty years later, the keen interest in this technology has given rise to a flourishing number of publications in forensic science. The countless applications of IRMS to a wide range of materials and substances attest to its success and suggest that the technique is ready to be used in forensic science. However, many studies are characterised by a lack of methodology and fundamental data. They have been undertaken in a top-down approach, applying the technique in an exploratory manner on a restricted sampling. This way of proceeding often does not allow the researcher to answer a number of questions, such as: do the specimens come from the same source? What do we mean by source? What is the inherent variability of a substance?
The production of positive results has prevailed at the expense of forensic fundamentals. This research focused on evaluating the contribution of the information provided by isotopic analysis to the investigation of explosives. More specifically, the evaluation was based on a sampling of black powders and ammonium nitrate fertilisers coming from known sources. The methodology developed in this work enabled us not only to highlight crucial elements inherent to the analytical methods themselves, but also to evaluate both the longitudinal and transversal variability of the information. First, the variability of the profile over time was studied. Second, the variability of black powders and ammonium nitrate fertilisers within the same source and between different sources was evaluated. The contribution of this information to the investigation of explosives was then evaluated and discussed.
Abstract:
European regulatory networks (ERNs) constitute the main governance instrument for the informal co-ordination of public regulation at the European Union (EU) level. They are in charge of co-ordinating national regulators and ensuring the implementation of harmonized regulatory policies across the EU, while also offering sector-specific expertise to the Commission. To this end, ERNs develop 'best practices' and benchmarking procedures in the form of standards, norms and guidelines to be adopted in member states. In this paper, we focus on the Committee of European Securities Regulators and examine the consequences of the policy-making structure of ERNs for the domestic adoption of standards. We find that the regulators of countries with larger financial industries tend to occupy more central positions in the network, especially among newer member states. In turn, network centrality is associated with more prompt domestic adoption of standards.
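The abstract does not say which centrality measure the study used, so purely as an illustration of the concept, here is a minimal degree-centrality computation on an invented toy network of national regulators (the country codes and ties below are hypothetical, not the study's data):

```python
# Toy network: each tie is an undirected link between two regulators.
ties = {("UK", "FR"), ("UK", "DE"), ("UK", "PL"), ("FR", "DE"), ("DE", "PL")}
regulators = sorted({r for pair in ties for r in pair})
n = len(regulators)

# Normalized degree centrality: the number of ties a regulator appears
# in, divided by the maximum possible number of ties (n - 1).
centrality = {r: sum(r in pair for pair in ties) / (n - 1) for r in regulators}
print(centrality)
```

In this toy network, the regulators with ties to every other node score 1.0 and would count as the most central.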
Abstract:
Epoetin-delta (Dynepo, Shire Pharmaceuticals, Basingstoke, UK) is a synthetic form of erythropoietin (EPO) whose resemblance to endogenous EPO makes it hard to identify using the classical identification criteria. Urine samples collected from six healthy volunteers treated with epoetin-delta injections and from a control population were immuno-purified and analyzed with the usual IEF method. On the basis of the integration of the EPO profiles, a linear multivariate model was computed for discriminant analysis. For each sample, a pattern classification algorithm returned a band distribution and intensity score (band intensity score) indicating how representative the sample is of each of the two classes, positive or negative. Effort profiles were also integrated into the model. The method yielded a good sensitivity/specificity relation and was used to determine the detection window of the molecule following multiple injections. The band intensity score, which can be generalized to epoetin-alpha and epoetin-beta, is proposed as an alternative criterion and supplementary evidence for the identification of EPO abuse.
Abstract:
BACKGROUND: The AO comprehensive pediatric long-bone fracture classification system describes the localization and morphology of fractures, and grades severity in 3 categories: (1) simple, (2) wedge, and (3) complex. We evaluated the reliability and accuracy of surgeons in using this rating system. MATERIAL AND METHODS: In a first validation phase, 5 experienced pediatric (orthopedic) surgeons reviewed radiographs of 267 prospectively collected pediatric fractures (agreement study A). In a second study (B), 70 surgeons of various levels of experience in 15 clinics classified 275 fractures via the internet. Simple fractures comprised about 90%, 99%, and 100% of diaphyseal (D), metaphyseal (M), and epiphyseal (E) fractures, respectively. RESULTS: Kappa coefficients for severity coding in D fractures were 0.82 and 0.51 in studies A and B, respectively. The median accuracy of surgeons in classifying simple fractures was above 97% in both studies, but was lower, 85% (range 46-100), for wedge or complex D fractures. INTERPRETATION: While reliability and accuracy estimates were satisfactory as a whole, the ratings of some individual surgeons were inadequate. Our findings suggest that the classification of fracture severity in children should use only two categories, distinguishing simple from wedge/complex fractures.
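As background on the kappa coefficients reported above, here is a minimal sketch of Cohen's kappa for two raters, which measures agreement corrected for chance; the ratings below are invented for illustration, not the study's data:

```python
def cohen_kappa(a, b):
    """Cohen's kappa for two raters over the same items."""
    assert len(a) == len(b)
    n = len(a)
    labels = set(a) | set(b)
    # Observed agreement: fraction of items the raters label identically.
    p_obs = sum(x == y for x, y in zip(a, b)) / n
    # Expected agreement by chance, from each rater's label frequencies.
    p_exp = sum((a.count(l) / n) * (b.count(l) / n) for l in labels)
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical severity codings by two raters (invented example).
rater1 = ["simple", "simple", "wedge", "complex", "simple", "wedge"]
rater2 = ["simple", "simple", "wedge", "wedge", "simple", "wedge"]
print(round(cohen_kappa(rater1, rater2), 2))
```

A kappa of 1 means perfect agreement and 0 means chance-level agreement; the study's values of 0.82 and 0.51 would conventionally be read as near-perfect and moderate agreement, respectively.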
Abstract:
In this thesis, I examine the diffusion process for a complex medical technology, the PET scanner, in two different health care systems, one of which is more market-oriented (Switzerland) and the other more centrally managed by a public agency (Quebec). The research draws on institutional and socio-political theories of the diffusion of innovations to examine how institutional contexts affect processes of diffusion. I find that diffusion proceeds more rapidly in Switzerland than in Quebec, but that processes in both jurisdictions are characterized by intense struggles among providers and between providers and public agencies. I show that the institutional environment influences these processes by determining the patterns of material resources and authority available to actors in their struggles to strategically control the technology, and by constituting the discursive resources or institutional logics on which actors may legitimately draw in their struggles to give meaning to the technology in line with their interests and values. This thesis illustrates how institutional structures and meanings manifest themselves in the context of specific decisions within an organizational field, and reveals the ways in which governance structures may be contested and realigned when they conflict with interests that are legitimized by dominant institutional logics. It is argued that this form of contestation and readjustment at the margins constitutes one mechanism by which institutional frameworks are tested, stretched and reproduced or redefined.
Abstract:
NanoImpactNet (NIN) is a multidisciplinary, European Commission-funded network on the environmental, health and safety (EHS) impact of nanomaterials. The 24 founding scientific institutes are leading European research groups active in the fields of nanosafety, nanorisk assessment and nanotoxicology. This 4-year project is the new focal point for information exchange within the research community. Contact with other stakeholders is vital and their needs are being surveyed. NIN is communicating with hundreds of stakeholders: businesses; internet platforms; industry associations; regulators; policy makers; national ministries; international agencies; standard-setting bodies; and NGOs concerned with labour rights, EHS or animal welfare. To improve this communication, internet research, a questionnaire distributed via partners, and targeted phone calls were used to identify stakeholders' interests and needs. The knowledge gaps and needs for further data mentioned by representatives of all stakeholder groups in the targeted phone calls concerned:
• the potential toxic and safety hazards of nanomaterials throughout their lifecycles;
• the fate and persistence of nanoparticles in humans, animals and the environment;
• the risks associated with nanoparticle exposure;
• greater participation in the preparation of nomenclature, standards, methodologies, protocols and benchmarks;
• the development of best practice guidelines;
• voluntary schemes on responsibility;
• databases of materials, research topics and themes, but also of expertise.
These findings suggested that stakeholders and NIN researchers share very similar knowledge needs, and that open communication and free movement of knowledge will benefit both researchers and industry. Subsequently, NIN organised a workshop focused on building a sustainable multi-stakeholder dialogue. Specific questions were put to different stakeholder groups to encourage discussion and open communication.
1. What information do stakeholders need from researchers, and why? The discussion of this question confirmed the needs identified in the targeted phone calls.
2. How should information be communicated? While it was agreed that reporting should be enhanced, commercial confidentiality and economic competition were identified as major obstacles. It was recognised that expertise in commercial law and economics would be needed for a well-informed treatment of this communication issue.
3. Can engineered nanomaterials be used safely? The idea that nanomaterials are probably safe because some of them have been produced 'for a long time' was questioned, since many materials in common use have proved to be unsafe. The question of safety is also one of public confidence, and new legislation such as REACH could help with this issue. Hazards do not materialise if exposure can be avoided or at least significantly reduced; thus, there is a need for information on what can be regarded as acceptable levels of exposure. Finally, it was noted that there is no such thing as a perfectly safe material, only boundaries, and at this moment we do not know where these boundaries lie. The labelling of products containing nanomaterials was also raised, as in the public mind safety and labelling are connected. This may need to be addressed, since nanomaterials in food, drink and food packaging may be the first safety issue to attract public and media attention, with a possible impact on nanotechnology as a whole.
4. Do we need more or different regulation? Any decision-making process should accommodate the changing level of uncertainty. To address the uncertainties, adaptations of frameworks such as REACH may be indicated for nanomaterials. Regulation is often needed even where voluntary measures are welcome, because it mitigates the effects of competition between industries; data cannot be collected on a voluntary basis, for example.
NIN will continue an active stakeholder dialogue to further build interdisciplinary relationships towards a healthy future with nanotechnology.
Abstract:
Background: Ulcerative colitis (UC) is a chronic disease with a wide variety of treatment options, many of which are not evidence based. Supplementing available guidelines, which are often broadly defined, consensus-based, and generally not tailored to the individual patient situation, we developed explicit appropriateness criteria to assist and improve treatment decisions. Methods: We used the RAND appropriateness method, which does not force consensus. An extensive literature review was compiled, based on and supplementing, where necessary, the ECCO UC 2011 guidelines. The panel (EPATUC, endorsed by ECCO) was formed by 8 gastroenterologists, 2 surgeons and 2 general practitioners from throughout Europe. Clinical scenarios reflecting practice were rated on a 9-point scale from 1 (extremely inappropriate) to 9 (extremely appropriate), based on the experts' experience and the available literature. After extensive discussion, all scenarios were re-rated at a two-day panel meeting. The median rating and the level of disagreement were used to assign ratings to 3 categories: appropriate, uncertain and inappropriate. Results: 718 clinical scenarios were rated, structured in 13 main clinical presentations: non-refractory (n=64) or refractory (n=33) proctitis, mild to moderate left-sided (n=72) or extensive (n=48) colitis, severe colitis (n=36), steroid-dependent colitis (n=36), steroid-refractory colitis (n=55), acute pouchitis (n=96), maintenance of remission (n=248), colorectal cancer prevention (n=9) and fulminant colitis (n=9). Overall, 100 indications were judged appropriate (14%), 129 uncertain (18%) and 489 inappropriate (68%). Disagreement between experts was very low (6%). Conclusion: For the first time, explicit appropriateness criteria for the therapy of UC were developed that allow both specific and rapid therapeutic decision making and prospective assessment of treatment appropriateness.
Comparison of these detailed scenarios with patient profiles encountered in the Swiss IBD cohort study indicates good concordance. EPATUC criteria will be freely accessible on the internet (epatuc.ch).
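The abstract states that median ratings and disagreement were used to place each scenario in one of three categories, but does not spell out the exact rule. As an assumption for illustration only, the sketch below uses the common RAND/UCLA convention (median 1-3 inappropriate, 4-6 uncertain, 7-9 appropriate, with any panel disagreement forcing "uncertain"):

```python
def categorize(ratings, disagreement=False):
    """Map panel ratings on the 1-9 scale to an appropriateness category
    (assumed RAND/UCLA-style rule, not necessarily the panel's exact one)."""
    s = sorted(ratings)
    n = len(s)
    median = s[n // 2] if n % 2 else (s[n // 2 - 1] + s[n // 2]) / 2
    if disagreement or 3.5 < median < 6.5:
        return "uncertain"
    return "appropriate" if median >= 6.5 else "inappropriate"

# Invented example panels of ratings for three hypothetical scenarios.
print(categorize([7, 8, 9, 8, 7, 9, 8, 8, 9]))   # high median
print(categorize([5, 4, 6, 5, 5, 4, 6, 5, 5]))   # mid-range median
print(categorize([8, 9, 8], disagreement=True))  # disagreement flagged
```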
Abstract:
This article seeks to explain the pattern of delegation to independent regulatory agencies in Western Europe. Two types of arguments are advanced to explain variations in the formal independence of regulators. Firstly, the need for governments to increase their credible commitment capacity may lead them to delegate regulation to an agency that is partly beyond their direct control. Secondly, delegation may be a response to the political uncertainty problem, which arises when governments are afraid of being replaced by another coalition with different preferences, which could decide to change existing policy choices. In addition, veto players may constitute a functional equivalent of delegation, since they influence policy stability and therefore tend to mitigate both the credibility and the political uncertainty problems. These arguments are consistent with the results of the empirical analysis of the formal independence of regulators in seventeen countries and seven sectors.