146 results for error rates


Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: To ensure vaccine safety, given the weaknesses of the national pharmacovigilance system in Cameroon, there is a need to identify effective interventions that can contribute to improving the reporting of adverse events following immunization (AEFI). OBJECTIVE: To assess the effect of (i) sending a weekly SMS or (ii) conducting weekly supervisory visits on the AEFI reporting rate during a meningitis immunization campaign conducted in Cameroon in 2012 using the meningitis A conjugate vaccine (MenAfriVac). METHODS: Health facilities that met the inclusion criteria were randomly assigned to receive (i) a weekly standardized SMS, (ii) a weekly standardized supervisory visit, or (iii) no intervention. The primary outcome was the reported AEFI incidence rate from weeks 5 to 8 after the immunization campaign. A Poisson regression model was used to estimate the effect of the interventions after adjusting for health region, type of health facility, type and position of health workers, and the cumulative number of AEFI reported during weeks 1 to 4. RESULTS: A total of 348 (77.2%) of 451 health facilities were included, with 116 assigned to each of the three groups. The incidence rate of reported AEFI per 100 health facilities per week was 20.0 (15.9-24.1) in the SMS group, 40.2 (34.4-46.0) in the supervision group and 13.6 (10.1-16.9) in the control group. Supervision led to a significant increase in the AEFI reporting rate compared with the SMS [adjusted RR=2.1 (1.6-2.7); p<0.001] and control [RR=2.8 (2.1-3.7); p<0.001] groups. SMS led to some increase in the AEFI reporting rate compared with the control group, but the difference was not statistically significant [RR=1.4 (0.8-1.6); p=0.07]. CONCLUSION: Supervision was more effective than SMS or routine surveillance in improving the AEFI reporting rate and should be part of any AEFI surveillance system. SMS could be useful in improving AEFI reporting rates, but strategies are needed to improve its effectiveness and thus maximize its benefits.
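As an illustration of the adjustment described in METHODS, the sketch below fits a Poisson model with an exposure offset to simulated facility-level counts and reads adjusted rate ratios off the exponentiated coefficients. It is not the study's code; all data and variable names (group, region, facility_type, baseline_aefi) are hypothetical.

```python
# Illustrative sketch (not the authors' code): adjusted rate-ratio estimation
# with a Poisson model, as described in the abstract. Data are simulated.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 348
df = pd.DataFrame({
    "group": rng.choice(["control", "sms", "supervision"], size=n),
    "region": rng.choice(["north", "south", "centre"], size=n),
    "facility_type": rng.choice(["public", "private"], size=n),
    "baseline_aefi": rng.poisson(1.0, size=n),   # AEFI reported in weeks 1-4
    "weeks": 4,                                  # follow-up weeks 5-8
})
# Simulate counts with a higher reporting rate in the supervision arm
rate = 0.15 * np.where(df.group == "supervision", 2.8,
                       np.where(df.group == "sms", 1.4, 1.0))
df["aefi"] = rng.poisson(rate * df.weeks)

# Poisson regression with an exposure offset; exp(coef) gives adjusted RRs
model = smf.glm(
    "aefi ~ C(group, Treatment('control')) + region + facility_type + baseline_aefi",
    data=df, family=sm.families.Poisson(), offset=np.log(df.weeks),
).fit()
print(np.exp(model.params))  # adjusted rate ratios vs. the control group
```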

Relevance:

20.00%

Publisher:

Abstract:

The natural dissipation rates of sidestream smoke (SS) particles dispersed in a chamber were studied under static atmospheric conditions and expressed as half-lives of residence in the air. The half-lives for particles smaller than 0.3 micron, 0.3-0.5 micron and 0.5-1 micron were found to be 25.5, 12.8 and 4.9 h, respectively. Total particulate matter (TPM) decreased by half after 6.2 h. Other data on diluted SS in indoor air are also reported.
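The half-lives quoted above imply first-order (exponential) decay of the airborne particle concentrations. A minimal sketch, assuming that model, converts a half-life into a decay constant and the fraction of particles still airborne after a given time:

```python
# Minimal sketch (assumes first-order decay, which the reported half-lives imply):
# convert a half-life into a decay constant and predict the remaining fraction.
import math

def decay_constant(half_life_h: float) -> float:
    """First-order decay constant (1/h) from a half-life in hours."""
    return math.log(2) / half_life_h

def remaining_fraction(half_life_h: float, t_h: float) -> float:
    """Fraction of particles still airborne after t_h hours."""
    return math.exp(-decay_constant(half_life_h) * t_h)

# Half-lives reported in the abstract (hours)
for label, t_half in [("<0.3 um", 25.5), ("0.3-0.5 um", 12.8),
                      ("0.5-1 um", 4.9), ("TPM", 6.2)]:
    print(f"{label}: {remaining_fraction(t_half, 24.0):.1%} remaining after 24 h")
```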

Relevance:

20.00%

Publisher:

Abstract:

Potocki-Lupski syndrome (PTLS) is associated with a microduplication of 17p11.2. Clinical features include multiple congenital and neurobehavioral abnormalities and autistic features. We have generated a PTLS mouse model, Dp(11)17/+, that recapitulates some of the physical and neurobehavioral phenotypes present in patients. Here, we investigated the social behavior and gene expression pattern of this mouse model in a pure C57BL/6-Tyr(c-Brd) genetic background. Dp(11)17/+ male mice displayed normal home-cage behavior but increased anxiety and increased dominant behavior in specific tests. A subtle impairment in the preference for a social target versus an inanimate target and an abnormal preference for social novelty (the preference to explore an unfamiliar mouse versus a familiar one) were also observed. Our results indicate that these animals could provide a valuable model to identify the specific gene(s) that confer abnormal social behaviors and that map within this delimited genomic interval. In a first attempt to identify candidate genes and to elucidate the mechanisms regulating these important phenotypes, we directly assessed the relative transcription of genes within and around this genomic interval. In this mouse model, we found that candidate genes include not only most of the duplicated genes but also normal-copy genes that flank the engineered interval; both categories of genes showed altered expression levels in the hippocampus of Dp(11)17/+ mice.

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: Regional rates of hospitalization for ambulatory care sensitive conditions (ACSC) are used to compare the availability and quality of ambulatory care, but the risk adjustment for population health status is often minimal. The objectives of the study were to examine the impact of more extensive risk adjustment on regional comparisons and to investigate the relationship between various area-level factors and the properly adjusted rates. METHODS: This is an observational study based on routine data from 2 million anonymously insured persons in the 26 Swiss cantons, followed over one or two years. A negative binomial regression model was fitted with increasingly detailed information on health status (age and gender only, inpatient diagnoses, outpatient conditions inferred from dispensed drugs, and frequency of physician visits). Hospitalizations for ACSC were identified from principal diagnoses covering 19 conditions, using an updated list of ICD-10 diagnostic codes. Comorbidities and surgical procedures were used as exclusion criteria to improve the specificity of the detection of potentially avoidable hospitalizations. The impact of the adjustment approaches was measured by changes in the standardized ratios calculated with and without other data besides age and gender. RESULTS: 25% of cases identified by inpatient main diagnoses were removed by applying the exclusion criteria. Cantonal ACSC hospitalization rates varied from 1.4 to 8.9 per 1,000 insured per year. Morbidity inferred from diagnoses and drugs dramatically increased the predictive performance, with the greatest effect found for conditions linked to an ACSC. More physician visits were associated with fewer potentially avoidable hospitalizations (PAH), although very high users were at greater risk and subjects who had not consulted were at negligible risk. With maximal health-status adjustment, the adjusted ratios of two thirds of the cantons changed by more than 10 percent. Cantonal variations remained substantial but were not explained by supply or demand factors. CONCLUSION: Additional adjustment for health status is required when using ACSC to monitor ambulatory care. Drug-inferred morbidities are a promising approach.
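A hedged sketch of the modelling strategy described in METHODS is given below: a negative binomial model of ACSC counts with an exposure offset, fitted with and without a morbidity covariate, followed by indirectly standardized cantonal ratios (observed/expected). The data and column names are entirely hypothetical, and the dispersion parameter is fixed for simplicity.

```python
# Illustrative sketch only (not the study's code). Column names
# (canton, age_group, sex, morbidity_score, n_insured, acsc) are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 5000
df = pd.DataFrame({
    "canton": rng.choice([f"C{i}" for i in range(26)], size=n),
    "age_group": rng.choice(["18-49", "50-69", "70+"], size=n),
    "sex": rng.choice(["f", "m"], size=n),
    "morbidity_score": rng.poisson(2, size=n),   # stand-in for drug/diagnosis-inferred morbidity
    "n_insured": rng.integers(50, 500, size=n),  # exposure (person-years)
})
df["acsc"] = rng.poisson(0.003 * df.n_insured * (1 + 0.3 * df.morbidity_score))

# Risk adjustment: age/sex only vs. age/sex plus morbidity
# (NegativeBinomial family with fixed dispersion, for simplicity of the sketch)
m_basic = smf.glm("acsc ~ age_group + sex", data=df,
                  family=sm.families.NegativeBinomial(),
                  offset=np.log(df.n_insured)).fit()
m_full = smf.glm("acsc ~ age_group + sex + morbidity_score", data=df,
                 family=sm.families.NegativeBinomial(),
                 offset=np.log(df.n_insured)).fit()

# Indirectly standardized ratio per canton = observed / expected
df["expected_basic"] = m_basic.predict(df, offset=np.log(df.n_insured))
df["expected_full"] = m_full.predict(df, offset=np.log(df.n_insured))
by_canton = df.groupby("canton")[["acsc", "expected_basic", "expected_full"]].sum()
by_canton["ratio_basic"] = by_canton.acsc / by_canton.expected_basic
by_canton["ratio_full"] = by_canton.acsc / by_canton.expected_full
print(by_canton.head())
```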

Relevance:

20.00%

Publisher:

Abstract:

An efficient high-resolution three-dimensional (3-D) seismic reflection system for small-scale targets in lacustrine settings was developed. In Lake Geneva, near the city of Lausanne, Switzerland, earlier high-resolution two-dimensional (2-D) investigations revealed a complex fault zone (the Paudèze thrust zone), which was subsequently chosen for testing our system. Observed structures include a thin (<40 m) layer of subhorizontal Quaternary sediments that unconformably overlie southeast-dipping Tertiary Molasse beds, and the Paudèze thrust zone, which separates Plateau Molasse and Subalpine Molasse units. Two complete 3-D surveys have been conducted over this test site, covering an area of about 1 km².
In 1999, a pilot survey (Survey I), comprising 80 profiles, was carried out in 8 days with a single-streamer configuration. In 2001, a second survey (Survey II) used a newly developed three-streamer system with optimized design parameters, which provided an exceptionally high-quality data set of 180 common midpoint (CMP) lines in 9 days. The main improvements include a navigation and shot-triggering system with in-house navigation software that automatically fires the gun, combined with real-time quality control of navigation using differential GPS (dGPS) onboard and a reference base station near the lake shore. Shots were triggered at 5-m intervals with a maximum non-cumulative error of 25 cm. Whereas the single 48-channel streamer system of Survey I required extrapolation of receiver positions from the boat position, for Survey II they could be accurately calculated (error <20 cm) with the aid of three additional dGPS antennas mounted on rafts attached to the end of each of the 24-channel streamers. Towed at a distance of 75 m behind the vessel, these allow the determination of streamer feathering due to cross-line currents or small course variations. Furthermore, two retractable booms hold the three streamers at a distance of 7.5 m from each other, which is the same distance as the sail line interval of Survey I. With a receiver spacing of 2.5 m, the bin dimension of the 3-D data of Survey II is 1.25 m in the in-line direction and 3.75 m in the cross-line direction. The greater cross-line versus in-line spacing is justified by the known structural trend of the fault zone perpendicular to the in-line direction. The data from Survey I showed some reflection discontinuity as a result of insufficiently accurate navigation and positioning and the resulting binning errors. Aliasing observed in the 3-D migration was due to insufficient lateral sampling combined with the relatively high-frequency (<2000 Hz) content of the water gun source (operated at 140 bars and 0.3 m depth). These results motivated the use of a double-chamber, bubble-canceling air gun for Survey II. A 15/15 Mini G.I. air gun, operated at 80 bars and 1 m depth, proved to be better adapted for imaging the complexly faulted target area, which has reflectors dipping up to 30°. Although its frequencies do not exceed 650 Hz, this air gun combines penetration of non-aliased signal to depths of 300 m below the water bottom (versus 145 m for the water gun) with a maximum vertical resolution of 1.1 m. While Survey I was shot in patches of alternating directions, the optimized surveying time of the new three-streamer system allowed acquisition in a parallel geometry, which is preferable when using an asymmetric configuration (single source and receiver array); otherwise, the resulting stacks differ for opposite shooting directions. However, the shorter streamer configuration of Survey II reduced the nominal fold from 12 to 6. A conventional 3-D processing flow was adapted to the high sampling rates and complemented by two computer programs that convert the unconventional navigation data to industry-standard formats. Processing included trace editing, geometry assignment, bin harmonization (to compensate for uneven fold due to boat and streamer drift), spherical divergence correction, bandpass filtering, velocity analysis, 3-D DMO correction, stacking and 3-D time migration. A detailed semblance velocity analysis was performed on the 12-fold data set for every second in-line and every 50th CMP, i.e., on a total of 600 spectra.
According to this velocity analysis, interval velocities range from 1450-1650 m/s for the unconsolidated sediments and from 1650-3000 m/s for the consolidated sediments. The delineation of several horizons and fault surfaces reveals the potential for small-scale geologic and tectonic interpretation in three dimensions. Five major seismic facies and their detailed 3-D geometries can be distinguished in vertical and horizontal sections: lacustrine sediments (Holocene), glaciolacustrine sediments (Pleistocene), Plateau Molasse, Subalpine Molasse and its thrust fault zone. Dips of beds within the Plateau and Subalpine Molasse are ~8° and ~20°, respectively. Within the fault zone, many highly deformed structures with dips of around 30° are visible. Preliminary tests with 3-D preserved-amplitude prestack depth migration demonstrate that the excellent data quality of Survey II allows the application of such sophisticated techniques even to high-resolution seismic surveys. In general, the adaptation of the 3-D marine seismic reflection method, which to date has been used almost exclusively by the oil exploration industry, to a smaller geographical as well as financial scale has helped pave the way for applying this technique to environmental and engineering purposes.

Seismic reflection is a subsurface investigation method with very high resolving power. It consists of sending vibrations into the ground and recording the waves that are reflected at geological discontinuities at different depths and then travel back to the surface, where they are recorded. The signals collected in this way not only give information on the nature and geometry of the layers present, but also allow a geological interpretation of the subsurface. In the case of sedimentary rocks, for example, seismic reflection profiles make it possible to determine their mode of deposition, any deformation or faulting, and hence their tectonic history. Seismic reflection is the principal method of petroleum exploration. For a long time, seismic reflection data were acquired along individual profiles that provide a two-dimensional image of the subsurface. The images obtained in this way are only partially accurate, since they do not take the three-dimensional nature of geological structures into account. Over the last few decades, three-dimensional (3-D) seismics has given new impetus to the study of the subsurface. Although it is now fully mastered for imaging large geological structures both onshore and offshore, its adaptation to the lacustrine or fluvial scale has so far been the subject of only a few studies. This thesis consisted of developing a seismic acquisition system similar to that used for offshore petroleum prospecting, but adapted to lakes. It is therefore smaller, easier to deploy and, above all, yields final images of much higher resolution. Whereas the petroleum industry is often limited to a resolution on the order of ten metres, the instrument developed in this work resolves details on the order of one metre. The new system relies on the ability to record seismic reflections simultaneously on three seismic cables (streamers) of 24 channels each. To obtain 3-D data, it is essential to position the instruments on the water (the source and the receivers of the seismic waves) with great precision. Software was specially developed to control the navigation and to trigger the shots of the seismic source, using differential GPS (dGPS) receivers on the boat and at the end of each streamer. This makes it possible to position the instruments with a precision of about 20 cm. To test our system, we chose an area of Lake Geneva, near the city of Lausanne, crossed by the "La Paudèze" fault zone, which separates the Plateau Molasse and Subalpine Molasse units. Two 3-D seismic surveys were carried out there over an area of about 1 km². The seismic recordings were then processed to turn them into interpretable images. We applied a 3-D processing sequence specially adapted to our data, in particular with regard to positioning. After processing, the data reveal several main seismic facies corresponding notably to the lacustrine sediments (Holocene), the glacio-lacustrine sediments (Pleistocene), the Plateau Molasse, the Subalpine Molasse of the fault zone and the Subalpine Molasse south of this zone. The detailed 3-D geometry of the faults is visible on vertical and horizontal seismic sections. The excellent quality of the data and the interpretation of several horizons and fault surfaces demonstrate the potential of this technique for small-scale three-dimensional investigations, which opens the way to its application in the fields of the environment and civil engineering.
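Two of the figures quoted above can be checked with simple arithmetic: the CMP bin dimensions follow from the receiver spacing and the streamer separation, and the ~1.1 m vertical resolution is consistent with a quarter-wavelength criterion if one assumes a dominant frequency of roughly 330 Hz (an assumption; the abstract only gives the 650 Hz upper limit). A small sketch:

```python
# Back-of-the-envelope checks of the survey geometry quoted above (not the
# authors' processing code). The quarter-wavelength rule and the ~330 Hz
# dominant frequency are assumptions used only to show how a ~1 m vertical
# resolution arises; the abstract itself only states the 650 Hz upper limit.
receiver_spacing = 2.5        # m, along each 24-channel streamer
streamer_separation = 7.5     # m, held by the retractable booms

inline_bin = receiver_spacing / 2        # CMP bin size along the sail line
crossline_bin = streamer_separation / 2  # CMP bin size across sail lines
print(inline_bin, crossline_bin)         # -> 1.25 m, 3.75 m (as quoted)

v_sediment = 1450.0           # m/s, slowest interval velocity reported
f_dominant = 330.0            # Hz, assumed (~half of the 650 Hz maximum)
vertical_resolution = v_sediment / (4 * f_dominant)
print(round(vertical_resolution, 2))     # -> ~1.1 m, matching the quoted value
```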

Relevance:

20.00%

Publisher:

Abstract:

Output, inflation and interest rates are key macroeconomic variables, in particular for monetary policy. In modern macroeconomic models they are driven by random shocks that feed through the economy in various ways. Models differ in the nature of these shocks and in their transmission mechanisms. This is the common theme underlying the three essays of this thesis, each of which takes a different perspective on the subject. First, the thesis shows empirically how different shocks lead to different behavior of interest rates over the business cycle. For commonly analyzed shocks (technology and monetary policy errors), the patterns square with standard models; the big unknown remains the sources of inflation persistence. The thesis then presents a theory of monetary policy for the case in which the central bank can observe structural shocks better than the public. The public will then seek to infer the bank's extra knowledge from its policy actions, and expectation management becomes a key factor of optimal policy. In a simple New Keynesian model, monetary policy becomes more concerned with inflation persistence than it otherwise would be. Finally, the thesis points to the large uncertainties involved in estimating the responses to structural shocks with permanent effects.

Relevance:

20.00%

Publisher:

Abstract:

Approximate models (proxies) can be employed to reduce the computational cost of estimating uncertainty. The price to pay is that the approximations introduced by the proxy model can lead to biased estimates. To avoid this problem and ensure reliable uncertainty quantification, we propose to combine functional data analysis and machine learning to build error models that allow an accurate prediction of the exact response without solving the exact model for all realizations. We build the relationship between the proxy and the exact model on a learning set of geostatistical realizations for which both the exact and the approximate solvers are run. Functional principal component analysis (FPCA) is used to investigate the variability in the two sets of curves and to reduce the dimensionality of the problem while maximizing the retained information. Once obtained, the error model can be used to predict the exact response of any realization on the basis of the proxy response alone. This methodology is purpose-oriented, as the error model is constructed directly for the quantity of interest rather than for the state of the system. In addition, the dimensionality reduction performed by FPCA provides a diagnostic of the quality of the error model, assessing the informativeness of the learning set and the fidelity of the proxy to the exact model. The possibility of predicting the exact response for any newly generated realization suggests that the methodology can be used effectively beyond the context of uncertainty quantification, in particular for Bayesian inference and optimization.
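A minimal sketch of the error-model idea is given below. It uses ordinary PCA on discretized curves as a stand-in for FPCA and a simple ridge regression to map proxy-curve scores to exact-curve scores; the solvers, data and names are synthetic placeholders, not the authors' implementation.

```python
# Minimal sketch of a proxy-to-exact error model (illustrative only).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200)                   # time axis of the response curves
n_learn, n_new = 60, 500

def exact_model(theta):   # placeholder for the expensive "exact" solver
    return np.exp(-theta[:, None] * t) + 0.02 * rng.standard_normal((len(theta), t.size))

def proxy_model(theta):   # placeholder for the cheap, biased proxy solver
    return np.exp(-1.15 * theta[:, None] * t)

theta_learn = rng.uniform(1, 5, n_learn)     # learning set: both solvers are run
theta_new = rng.uniform(1, 5, n_new)         # new realizations: proxy only

Y_exact, Y_proxy = exact_model(theta_learn), proxy_model(theta_learn)

# Dimensionality reduction of both sets of curves (PCA as a stand-in for FPCA)
pca_exact = PCA(n_components=3).fit(Y_exact)
pca_proxy = PCA(n_components=3).fit(Y_proxy)

# Error model: predict exact-curve scores from proxy-curve scores
reg = Ridge().fit(pca_proxy.transform(Y_proxy), pca_exact.transform(Y_exact))

# Predict the exact response of new realizations from the proxy response alone
scores_pred = reg.predict(pca_proxy.transform(proxy_model(theta_new)))
Y_exact_pred = pca_exact.inverse_transform(scores_pred)
print(Y_exact_pred.shape)                    # (500, 200) corrected curves
```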

Relevance:

20.00%

Publisher:

Abstract:

Main concepts: The Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach defines quality of evidence as confidence in effect estimates; this conceptualization can readily be applied to bodies of evidence estimating the risk of future events (that is, prognosis) in broadly defined populations. In the field of prognosis, a body of observational evidence (including single arms of randomized controlled trials) begins as high-quality evidence. The five domains GRADE considers in rating down confidence in estimates of treatment effect (risk of bias, imprecision, inconsistency, indirectness, and publication bias), as well as the GRADE criteria for rating up quality, also apply to estimates of the risk of future events from a body of prognostic studies. Applying these concepts to systematic reviews of prognostic studies provides a useful approach to determine confidence in estimates of overall prognosis in broad populations.

Relevance:

20.00%

Publisher:

Abstract:

The avidity of the T-cell receptor (TCR) for antigenic peptides presented as peptide-MHC complexes (pMHC) on cells is a key parameter for cell-mediated immunity. Yet a fundamental feature of most tumor antigen-specific CD8(+) T cells is that this avidity is low. In this study, we addressed the need to identify and select tumor-specific CD8(+) T cells of highest avidity, which are of the greatest interest for adoptive cell therapy in patients with cancer. To identify these rare cells, we developed a peptide-MHC multimer technology that uses reversible Ni(2+)-nitrilotriacetic acid histidine tags (NTAmers). NTAmers are highly stable, but upon imidazole addition they decay rapidly into pMHC monomers, allowing flow cytometry-based measurements of monomeric TCR-pMHC dissociation rates of living CD8(+) T cells across a wide avidity spectrum. We documented strong correlations between NTAmer kinetic results and those obtained by surface plasmon resonance. Using NTAmers that were deficient for CD8 binding to pMHC, we found that CD8 itself stabilized the TCR-pMHC complex, prolonging the dissociation half-life severalfold. Notably, our NTAmer technology accurately predicted the function of large panels of tumor-specific T cells that were isolated prospectively from patients with cancer. Overall, our results demonstrate that NTAmers are effective tools for isolating rare high-avidity cytotoxic T cells from patients for use in adoptive therapies for cancer treatment.
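The dissociation rates mentioned above are the kind of quantity obtained by fitting a monoexponential decay to the multimer signal after imidazole addition. A sketch with synthetic data (no real assay values) illustrates the fit and the resulting off-rate and half-life:

```python
# Illustrative sketch only: fit a monoexponential decay to a dissociation time
# course to recover an off-rate and half-life. Data are synthetic stand-ins.
import numpy as np
from scipy.optimize import curve_fit

def monoexp(t, amplitude, k_off, background):
    return amplitude * np.exp(-k_off * t) + background

rng = np.random.default_rng(0)
t = np.linspace(0, 300, 60)                  # seconds after imidazole addition
true_k_off = 0.02                            # 1/s, arbitrary example value
signal = monoexp(t, 1.0, true_k_off, 0.05) + 0.01 * rng.standard_normal(t.size)

params, _ = curve_fit(monoexp, t, signal, p0=(1.0, 0.01, 0.0))
amplitude, k_off, background = params
print(f"k_off ~ {k_off:.3f} 1/s, half-life ~ {np.log(2) / k_off:.1f} s")
```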

Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVE: To investigate the prevalence of discontinuation and nonpublication of surgical versus medical randomized controlled trials (RCTs) and to explore risk factors for discontinuation and nonpublication of surgical RCTs. BACKGROUND: Trial discontinuation has significant scientific, ethical, and economic implications. To date, the prevalence of discontinuation of surgical RCTs is unknown. METHODS: All RCT protocols approved between 2000 and 2003 by 6 ethics committees in Canada, Germany, and Switzerland were screened. Baseline characteristics were collected and, if published, full reports were retrieved. Risk factors for early discontinuation due to slow recruitment and for nonpublication were explored using multivariable logistic regression analyses. RESULTS: In total, 863 RCT protocols involving adult patients were identified, 127 in surgery (15%) and 736 in medicine (85%). Surgical trials were discontinued for any reason more often than medical trials [43% vs 27%, risk difference 16% (95% confidence interval [CI]: 5%-26%); P = 0.001] and were more often discontinued for slow recruitment [18% vs 11%, risk difference 8% (95% CI: 0.1%-16%); P = 0.020]. The percentage of trials not published as a full journal article was similar in surgical and medical trials [44% vs 40%, risk difference 4% (95% CI: -5% to 14%); P = 0.373]. Discontinuation of surgical trials was a strong risk factor for nonpublication (odds ratio = 4.18, 95% CI: 1.45-12.06; P = 0.008). CONCLUSIONS: Discontinuation and nonpublication rates were substantial in surgical RCTs, and trial discontinuation was strongly associated with nonpublication. These findings need to be taken into account when interpreting the surgical literature. Surgical trialists should consider feasibility studies before embarking on full-scale trials.
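As an illustration of the multivariable logistic regression mentioned in METHODS, the sketch below models early discontinuation for slow recruitment on simulated protocol-level data; the covariates (surgical, industry_sponsor, planned_size) are hypothetical and not those used in the study.

```python
# Sketch only (not the study's analysis code): a multivariable logistic model of
# early discontinuation for slow recruitment, with odds ratios from exp(coef).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 863
df = pd.DataFrame({
    "surgical": rng.integers(0, 2, size=n),          # surgical (1) vs medical (0) trial
    "industry_sponsor": rng.integers(0, 2, size=n),
    "planned_size": rng.integers(50, 1000, size=n),
})
logit_p = -1.5 + 0.6 * df.surgical - 0.001 * df.planned_size
df["discontinued_slow_recruitment"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

model = smf.logit(
    "discontinued_slow_recruitment ~ surgical + industry_sponsor + np.log(planned_size)",
    data=df,
).fit(disp=0)
print(np.exp(model.params))   # odds ratios, e.g. surgical vs medical trials
```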

Relevance:

20.00%

Publisher:

Abstract:

Free induction decay (FID) navigators have been found to qualitatively detect rigid-body head movements, yet it is unknown to what extent they can provide quantitative motion estimates. Here, we acquired FID navigators at different sampling rates and simultaneously measured head movements using a highly accurate optical motion tracking system. This strategy allowed us to estimate the accuracy and precision of FID navigators for the quantification of rigid-body head movements. Five subjects were scanned with a 32-channel head coil array on a clinical 3T MR scanner during several resting and guided head movement periods. For each subject we trained a linear regression model based on FID navigator and optical motion tracking signals. FID-based motion model accuracy and precision were evaluated using cross-validation. FID-based prediction of rigid-body head motion achieved mean translational and rotational errors of 0.14±0.21 mm and 0.08±0.13°, respectively. Robust model training with sub-millimeter and sub-degree accuracy could be achieved using 100 data points with motion magnitudes of ±2 mm and ±1° for translation and rotation. The obtained linear models appeared to be subject-specific, as inter-subject application of a "universal" FID-based motion model resulted in poor prediction accuracy. The results show that substantial rigid-body motion information is encoded in FID navigator signal time courses. Although the applied method currently requires the simultaneous acquisition of FID signals and optical tracking data, the findings suggest that multi-channel FID navigators have the potential to complement existing tracking technologies for accurate rigid-body motion detection and correction in MRI.
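A minimal sketch of the subject-specific calibration described above: a linear model mapping multi-channel FID navigator signals to six rigid-body motion parameters, evaluated with cross-validation. All signals here are synthetic stand-ins; the real study used measured FID and optical-tracking data.

```python
# Minimal sketch (not the authors' code): linear regression from multi-channel
# FID signals to six rigid-body motion parameters, with cross-validated errors.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold, cross_val_predict

rng = np.random.default_rng(0)
n_timepoints, n_channels = 400, 32               # FID samples from a 32-channel coil

# Synthetic "ground truth" motion (tx, ty, tz, rx, ry, rz) as a slow random walk
motion = np.cumsum(0.05 * rng.standard_normal((n_timepoints, 6)), axis=0)
mixing = rng.standard_normal((6, n_channels))    # unknown coil sensitivity to motion
fid = motion @ mixing + 0.1 * rng.standard_normal((n_timepoints, n_channels))

# Train and evaluate the per-subject linear model with 5-fold cross-validation
model = LinearRegression()
predicted = cross_val_predict(model, fid, motion, cv=KFold(n_splits=5))
errors = np.abs(predicted - motion)
print("mean abs. error per parameter:", errors.mean(axis=0).round(3))
```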