135 results for Quantification methods
Abstract:
Recently, the spin-echo full-intensity acquired localized (SPECIAL) spectroscopy technique was proposed to unite the advantages of short TEs on the order of milliseconds (ms) with full sensitivity, and was applied to rat brain in vivo. In the present study, SPECIAL was adapted and optimized for use on a clinical platform at 3T and 7T by combining interleaved water suppression (WS) and outer volume saturation (OVS), optimized sequence timing, and improved shimming using FASTMAP. High-quality single-voxel spectra of human brain were acquired at TEs of 6 ms or less on clinical 3T and 7T systems for six volunteers. Narrow linewidths (6.6 +/- 0.6 Hz at 3T and 12.1 +/- 1.0 Hz at 7T for water) and the high signal-to-noise ratio (SNR) of the artifact-free spectra enabled the quantification of a neurochemical profile consisting of 18 metabolites with Cramér-Rao lower bounds (CRLBs) below 20% at both field strengths. The enhanced sensitivity and increased spectral resolution at 7T compared to 3T allowed a two-fold reduction in scan time, increased precision of quantification for 12 metabolites, and the additional quantification of lactate with a CRLB below 20%. Improved sensitivity at 7T was also demonstrated by a 1.7-fold increase in average SNR (peak height/root-mean-square [RMS] noise) per unit time.
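As a rough illustration of the SNR definition used above (peak height divided by the RMS of a signal-free noise region) and of a per-unit-time comparison, the following Python sketch uses hypothetical SNR and scan-time values rather than the study's data; the square-root-of-time normalization is the usual assumption for averaged spectra.

```python
import numpy as np

def spectral_snr(spectrum, peak_idx, noise_slice):
    """SNR as defined above: peak height / RMS of a signal-free noise region."""
    noise = spectrum[noise_slice]
    rms_noise = np.sqrt(np.mean((noise - noise.mean()) ** 2))
    return spectrum[peak_idx] / rms_noise

def snr_per_unit_time(snr, scan_time_s):
    """Normalize by sqrt(scan time), since SNR grows with the square root of averaging time."""
    return snr / np.sqrt(scan_time_s)

# Hypothetical numbers for illustration only (not taken from the study):
snr_3t, time_3t = 40.0, 320.0   # SNR and scan time [s] at 3T
snr_7t, time_7t = 48.0, 160.0   # SNR and scan time [s] at 7T (half the scan time)
gain = snr_per_unit_time(snr_7t, time_7t) / snr_per_unit_time(snr_3t, time_3t)
print(f"SNR-per-unit-time gain at 7T: {gain:.2f}x")
```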
Abstract:
A high-resolution three-dimensional (3-D) seismic reflection survey was conducted in Lake Geneva, near the city of Lausanne, Switzerland, as part of a project aimed at developing such high-resolution seismic techniques. Using a single 48-channel streamer, the 3-D site with an area of 1200 m x 600 m was surveyed in 10 days. A variety of complex geologic structures (e.g. thrusts, folds, channel-fill) up to ~150 m below the water bottom were imaged with a 15 in³ water gun. The 3-D data allowed the construction of an accurate velocity model and the distinction of five major seismic facies within the Lower Freshwater Molasse (Aquitanian) and the Quaternary sedimentary units. Additionally, the Plateau Molasse (PM) and Subalpine Molasse (SM) erosional surface, the "La Paudeze" thrust fault (PM-SM boundary) and the thickness of Quaternary sediments were accurately delineated in 3-D.
Abstract:
Malaria is generally diagnosed by microscopy and rapid antigen testing, but molecular methods are becoming more widely used. In the present study, the contribution of a quantitative multiplex malaria PCR was investigated. We assessed: (i) the agreement between PCR-based identification and microscopy and (ii) the correlation between the parasite load as determined by quantitative PCR and by microscopy. For 83 patients positive by microscopy for Plasmodium spp., the first EDTA-blood sample was tested by multiplex PCR to confirm smear-based species identification. Parasite load was assessed daily using both microscopy and PCR. Among the 83 patients tested, one was positive by microscopy only and 82 were positive by both microscopy and PCR. Agreement between microscopy and PCR for identification at the species level was 89% (73/82). Six of the nine discordant results corresponded to co-infections by two or three species and were attributed to inaccurate morphological identification of mixed infections. The parasite load generally decreased rapidly after treatment had been started, with similar decay curves being obtained by microscopy and by PCR. Our PCR proved especially useful for identifying mixed infections. The quantification obtained by PCR correlated closely with microscopy-based quantification and could be useful for monitoring treatment efficacy, at least in clinical trials.
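To make the comparison between the two quantification methods concrete, a minimal sketch of how paired parasite loads might be correlated is given below; the load values are hypothetical, and only the 73/82 agreement figure comes from the abstract.

```python
import numpy as np
from scipy import stats

# Hypothetical paired parasite loads (parasites/µL), e.g. one value per follow-up day:
microscopy = np.array([12000.0, 4500.0, 800.0, 150.0, 40.0, 10.0])
qpcr = np.array([15000.0, 5200.0, 950.0, 130.0, 55.0, 12.0])

# Loads span several orders of magnitude, so compare them on a log scale.
rho, p = stats.spearmanr(np.log10(microscopy), np.log10(qpcr))
print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")

# Species-level agreement reported above: 73 concordant identifications out of 82 PCR-positive patients.
print(f"Agreement: {73 / 82:.0%}")
```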
Abstract:
Background: Microbiological diagnostic procedures have changed significantly over the last decade. Initially, the implementation of the polymerase chain reaction (PCR) resulted in improved detection tests for microbes that were difficult or even impossible to detect by conventional methods such as culture and serology, especially in community-acquired respiratory tract infections (CA-RTI). A further improvement was the development of real-time PCR, which allows end-point detection and quantification, and many diagnostic laboratories have now implemented this powerful method. Objective: At present, new high-performance and convenient molecular tests have emerged that target in parallel many viruses and bacteria responsible for lower and/or upper respiratory tract infections. The range of test formats and microbial agents detected is evolving very quickly, and the added value of these new tests needs to be studied in terms of better use of antibiotics, better patient management, duration of hospitalization and overall costs. Conclusions: Molecular tools for better microbial documentation of CA-RTI are now available. Controlled studies are required to address the clinical relevance of these new methods, for example the role of some newly detected respiratory viruses or of the microbial DNA load in a particular patient at a particular time. The future challenge for molecular diagnosis will be to become easy to handle, highly efficient and cost-effective, delivering rapid results with a direct impact on clinical management.
Abstract:
Summary: Global warming has led to an average earth surface temperature increase of about 0.7 °C in the 20th century, according to the 2007 IPCC report. In Switzerland, the temperature increase over the same period was even higher: 1.3 °C in the Northern Alps and 1.7 °C in the Southern Alps. The impacts of this warming on ecosystems, especially on climatically sensitive systems like the treeline ecotone, are already visible today. Alpine treeline species show increased growth rates, more establishment of young trees in forest gaps is observed in many locations, and treelines are migrating upwards. With the forecasted warming, this globally visible phenomenon is expected to continue. This PhD thesis aimed to develop a set of methods and models to investigate current and future climatic treeline positions and treeline shifts in the Swiss Alps in a spatial context. The focus was therefore on: 1) the quantification of current treeline dynamics and their potential causes, 2) the evaluation and improvement of temperature-based treeline indicators and 3) the spatial analysis and projection of past, current and future climatic treeline positions and their respective elevational shifts. The methods used involved a combination of field temperature measurements, statistical modeling and spatial modeling in a geographical information system. To determine treeline shifts and assign the respective drivers, neighborhood relationships between forest patches were analyzed using moving-window algorithms. Time-series regression modeling was used to develop an air-to-soil temperature transfer model for calculating thermal treeline indicators. The indicators were then applied spatially to delineate the climatic treeline, based on interpolated temperature data. Observation of recent forest dynamics in the Swiss treeline ecotone showed that changes were mainly due to forest in-growth, but also partly to upward altitudinal shifts. The recent reduction in agricultural land use was found to be the dominant driver of these changes. Climate-driven changes were identified only at the uppermost limits of the treeline ecotone. Seasonal mean temperature indicators were found to be the best for predicting climatic treelines. Applying dynamic seasonal delimitations and the air-to-soil temperature transfer model improved the indicators' applicability for spatial modeling. Reproducing the climatic treelines of the past 45 years revealed regionally different altitudinal shifts, the largest being located near the highest mountain massif. Modeling climatic treelines based on two IPCC climate warming scenarios predicted major shifts in treeline altitude. However, the currently observed treeline is not expected to reach this limit easily, due to lagged reaction, possible climate feedback effects and other limiting factors.
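The air-to-soil temperature transfer model mentioned above is described only at a high level; the sketch below shows one plausible form, an ordinary least-squares regression of soil temperature on current and lagged air temperature. The lag structure, daily resolution, and synthetic data are assumptions for illustration, not the thesis's actual model.

```python
import numpy as np

def fit_air_to_soil(air_temp, soil_temp, max_lag=3):
    """Regress soil temperature on current and lagged air temperature (OLS).

    A simplified stand-in for a time-series transfer model; the number of lags and
    the purely linear form are illustrative assumptions.
    """
    n = len(air_temp)
    X = np.column_stack(
        [np.ones(n - max_lag)]
        + [air_temp[max_lag - k : n - k] for k in range(max_lag + 1)]
    )
    coeffs, *_ = np.linalg.lstsq(X, soil_temp[max_lag:], rcond=None)
    return coeffs

def predict_soil(air_temp, coeffs, max_lag=3):
    n = len(air_temp)
    X = np.column_stack(
        [np.ones(n - max_lag)]
        + [air_temp[max_lag - k : n - k] for k in range(max_lag + 1)]
    )
    return X @ coeffs

# Synthetic daily series: soil temperature as a damped, lagged version of air temperature.
rng = np.random.default_rng(0)
air = 10 + 8 * np.sin(np.linspace(0, 4 * np.pi, 200)) + rng.normal(0, 1, 200)
soil = 3 + 0.6 * np.roll(air, 2) + rng.normal(0, 0.3, 200)
coeffs = fit_air_to_soil(air, soil)
soil_hat = predict_soil(air, coeffs)
```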
Abstract:
In the International Olympic Committee (IOC) accredited laboratories, specific methods have been developed to detect anabolic steroids in athletes' urine. The technique of choice for this purpose is gas chromatography coupled with mass spectrometry (GC-MS). In order to improve the efficiency of anti-doping programmes, the laboratories have defined new analytical strategies. The final sensitivity of the analytical procedure can be improved by adopting new detection technologies, such as tandem mass spectrometry (MS-MS) or high-resolution mass spectrometry (HRMS). Better sample preparation using immuno-affinity chromatography (IAC) is also a good tool for improving sensitivity. These techniques are suitable for the detection of synthetic anabolic steroids whose structure is not found naturally in the human body. The increasingly evident large-scale use of substances chemically similar to endogenous steroids obliges both the laboratories and the sports authorities to compare the athlete's steroid profile with reference ranges from a population or with intraindividual reference values.
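As a toy illustration of the two comparison strategies mentioned in the last sentence (population reference ranges versus intraindividual reference values), the sketch below flags a hypothetical steroid-profile marker such as a T/E ratio; the threshold and the longitudinal data are invented and do not reflect actual anti-doping criteria.

```python
import numpy as np

def exceeds_population_limit(value, upper_limit):
    """Population criterion: flag if the marker exceeds a reference-range upper limit (hypothetical)."""
    return value > upper_limit

def deviates_from_individual_baseline(value, previous_values, z_limit=3.0):
    """Intraindividual criterion: flag if the marker departs strongly from the athlete's own history."""
    baseline = np.asarray(previous_values, dtype=float)
    z = (value - baseline.mean()) / baseline.std(ddof=1)
    return abs(z) > z_limit

# Hypothetical T/E ratios from the athlete's previous tests, and a new suspicious value:
history = [1.1, 1.3, 1.0, 1.2, 1.1]
new_value = 4.2
print(exceeds_population_limit(new_value, upper_limit=4.0))
print(deviates_from_individual_baseline(new_value, history))
```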
Abstract:
Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending corresponding approaches beyond the local scale still represents a major challenge, yet is critically important for the development of reliable groundwater flow and contaminant transport models. To address this issue, I have developed a hydrogeophysical data integration technique based on a two-step Bayesian sequential simulation procedure that is specifically targeted towards larger-scale problems. The objective is to simulate the distribution of a target hydraulic parameter based on spatially exhaustive, but poorly resolved, measurements of a pertinent geophysical parameter and locally highly resolved, but spatially sparse, measurements of the considered geophysical and hydraulic parameters. To this end, my algorithm links the low- and high-resolution geophysical data via a downscaling procedure before relating the downscaled regional-scale geophysical data to the high-resolution hydraulic parameter field. I first illustrate the application of this novel data integration approach to a realistic synthetic database consisting of collocated high-resolution borehole measurements of the hydraulic and electrical conductivities and spatially exhaustive, low-resolution electrical conductivity estimates obtained from electrical resistivity tomography (ERT). The overall viability of this method is tested and verified by performing and comparing flow and transport simulations through the original and simulated hydraulic conductivity fields. The corresponding results indicate that the proposed data integration procedure does indeed allow for obtaining faithful estimates of the larger-scale hydraulic conductivity structure and reliable predictions of the transport characteristics over medium- to regional-scale distances. The approach is then applied to a corresponding field scenario consisting of collocated high-resolution measurements of the electrical conductivity, as measured using a cone penetrometer testing (CPT) system, and the hydraulic conductivity, as estimated from electromagnetic flowmeter and slug test measurements, in combination with spatially exhaustive low-resolution electrical conductivity estimates obtained from surface-based electrical resistivity tomography (ERT). The corresponding results indicate that the newly developed data integration approach is indeed capable of adequately capturing both the small-scale heterogeneity as well as the larger-scale trend of the prevailing hydraulic conductivity field.
The results also indicate that this novel data integration approach is remarkably flexible and robust and hence can be expected to be applicable to a wide range of geophysical and hydrological data at all scale ranges. In the second part of my thesis, I evaluate in detail the viability of sequential geostatistical resampling as a proposal mechanism for Markov Chain Monte Carlo (MCMC) methods applied to high-dimensional geophysical and hydrological inverse problems in order to allow for a more accurate and realistic quantification of the uncertainty associated with the thus inferred models. Focusing on a series of pertinent crosshole georadar tomographic examples, I investigated two classes of geostatistical resampling strategies with regard to their ability to efficiently and accurately generate independent realizations from the Bayesian posterior distribution. The corresponding results indicate that, despite its popularity, sequential resampling is rather inefficient at drawing independent posterior samples for realistic synthetic case studies, notably for the practically common and important scenario of pronounced spatial correlation between model parameters. To address this issue, I have developed a new gradual-deformation-based perturbation approach, which is flexible with regard to the number of model parameters as well as the perturbation strength. Compared to sequential resampling, this newly proposed approach was proven to be highly effective in decreasing the number of iterations required for drawing independent samples from the Bayesian posterior distribution.
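The gradual-deformation-based perturbation referred to above is not spelled out in the abstract; a common formulation combines the current realization with an independent one of the same Gaussian model using sine/cosine weights, so that the prior statistics are preserved while an angle controls the perturbation strength. The sketch below shows this standard form under the simplifying assumption of a standardized (white) field; it is illustrative, not the thesis's exact algorithm.

```python
import numpy as np

def gradual_deformation_proposal(current, rng, strength=0.1):
    """Propose a new field by mixing the current realization with an independent one.

    With u an independent realization of the same (standardized) Gaussian model,
        m' = cos(theta) * current + sin(theta) * u
    is again a realization of that model, so the geostatistical prior is preserved.
    `strength` (theta as a fraction of pi/2) sets how far the proposal moves. Here u is
    white noise for simplicity; a correlated field would require an unconditional
    realization drawn from the same variogram model.
    """
    theta = strength * np.pi / 2.0
    u = rng.standard_normal(current.shape)
    return np.cos(theta) * current + np.sin(theta) * u

# Usage inside a Metropolis-Hastings loop (sketch):
rng = np.random.default_rng(42)
m = rng.standard_normal((50, 50))                      # current model realization
m_proposed = gradual_deformation_proposal(m, rng, 0.1) # small-strength proposal
```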
Abstract:
Evidence has emerged that the initiation and growth of gliomas is sustained by a subpopulation of cancer-initiating cells (CICs). Because of the difficulty of using markers to tag CICs in gliomas, we have previously exploited more robust phenotypic characteristics, including a specific morphology and intrinsic autofluorescence, to identify and isolate a subpopulation of glioma CICs, called FL1(+). The objective of this study was to further validate our method in a large cohort of human gliomas and a mouse model of glioma. Seventy-four human gliomas of all grades and the GFAP-V(12)HA-ras B8 mouse model were analyzed for in vitro self-renewal capacity and their content of FL1(+) cells. Nonneoplastic brain tissue and embryonic mouse brain were used as controls. Genetic traceability along passages was assessed with microsatellite analysis. We found that FL1(+) cells from low-grade gliomas and from control nonneoplastic brain tissue show a lower level of autofluorescence and undergo a restricted number of cell divisions before dying in culture. In contrast, we found that FL1(+) cells derived from many but not all high-grade gliomas acquire high levels of autofluorescence and can be propagated in long-term cultures. Moreover, FL1(+) cells show a remarkable traceability over time in vitro and in vivo. Our results show that FL1(+) cells can be found in all specimens of a large cohort of human gliomas of different grades and in a model of genetically induced mouse glioma, as well as in nonneoplastic brain. However, their self-renewal capacity is variable and seems to be dependent on tumor grade.
Abstract:
Whole-body (WB) planar imaging has long been one of the staple methods of dosimetry, and its quantification has been formalized by the MIRD Committee in Pamphlet No. 16. One of the issues not specifically addressed in the formalism occurs when the count rates reaching the detector are sufficiently high to result in camera count saturation. Camera dead-time effects have been extensively studied, but all of the developed correction methods assume static acquisitions. However, during WB planar (sweep) imaging, a variable amount of imaged activity exists in the detector's field of view as a function of time, and therefore camera saturation is time dependent. A new time-dependent algorithm was developed to correct for dead-time effects during WB planar acquisitions that accounts for relative motion between the detector heads and the imaged object. Static camera dead-time parameters were acquired by imaging decaying activity in a phantom and obtaining a saturation curve. Using these parameters, an iterative algorithm akin to Newton's method was developed that takes into account the variable count rate seen by the detector as a function of time. The algorithm was tested on simulated data as well as on a whole-body scan of high-activity Samarium-153 in an ellipsoid phantom. A complete set of parameters from unsaturated phantom data necessary for count-rate-to-activity conversion was also obtained, including build-up and attenuation coefficients, in order to convert corrected count-rate values to activity. The algorithm proved successful in accounting for motion- and time-dependent saturation effects in both the simulated and measured data and converged to any desired degree of precision. The clearance half-life calculated from the ellipsoid phantom data was 45.1 h after dead-time correction and 51.4 h with no correction; the physical decay half-life of Samarium-153 is 46.3 h. Accurate WB planar dosimetry of high activities relies on successfully compensating for camera saturation in a way that takes into account the variable activity in the field of view, i.e. time-dependent dead-time effects. The algorithm presented here accomplishes this task.
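For readers unfamiliar with dead-time correction, the sketch below inverts the paralyzable dead-time model m = n·exp(-n·τ) for the true count rate n with Newton's method; it is only in the spirit of the algorithm described above, which additionally handles the time-varying count rate and detector motion during the whole-body sweep. The dead-time constant and observed rate are hypothetical.

```python
import numpy as np

def true_rate_paralyzable(observed_rate, tau, tol=1e-9, max_iter=50):
    """Solve m = n * exp(-n * tau) for the true rate n, given the observed rate m.

    Newton iteration starting from the observed rate; valid on the low-rate branch
    (n * tau < 1). The time-dependent, motion-aware correction described above is
    not reproduced here.
    """
    m = float(observed_rate)
    n = m
    for _ in range(max_iter):
        f = n * np.exp(-n * tau) - m
        fprime = np.exp(-n * tau) * (1.0 - n * tau)
        step = f / fprime
        n -= step
        if abs(step) < tol:
            break
    return n

# Hypothetical example: 120 kcps observed with a 2 microsecond paralyzable dead time.
corrected = true_rate_paralyzable(observed_rate=1.2e5, tau=2e-6)
print(f"corrected rate: {corrected:.0f} cps")
```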
Abstract:
RATIONALE AND OBJECTIVES: To determine the optimum spatial resolution when imaging peripheral arteries with magnetic resonance angiography (MRA). MATERIALS AND METHODS: Eight vessel diameters ranging from 1.0 to 8.0 mm were simulated in a vascular phantom. A total of 40 three-dimensional FLASH MRA sequences were acquired with incremental variations of field of view, matrix size, and slice thickness. The accurately known eight diameters were combined pairwise to generate 22 "exact" degrees of stenosis ranging from 42% to 87%. The diameters were then measured in the MRA images by three independent observers and with quantitative angiography (QA) software and used to compute the degrees of stenosis corresponding to the 22 "exact" ones. The accuracy and reproducibility of vessel diameter measurements and stenosis calculations were assessed for vessel sizes ranging from 6 to 8 mm (iliac artery), 4 to 5 mm (femoro-popliteal arteries), and 1 to 3 mm (infrapopliteal arteries). The maximum pixel dimension and slice thickness needed to obtain a mean error in stenosis evaluation of less than 10% were determined by linear regression analysis. RESULTS: Mean errors on stenosis quantification were 8.8% +/- 6.3% for 6- to 8-mm vessels, 15.5% +/- 8.2% for 4- to 5-mm vessels, and 18.9% +/- 7.5% for 1- to 3-mm vessels. Mean errors on stenosis calculation were 12.3% +/- 8.2% for observers and 11.4% +/- 15.1% for QA software (P = .0342). To evaluate stenosis with a mean error of less than 10%, the maximum pixel surface, pixel size in the phase direction, and slice thickness should be less than 1.56 mm², 1.34 mm, and 1.70 mm, respectively (voxel size 2.65 mm³) for 6- to 8-mm vessels; 1.31 mm², 1.10 mm, and 1.34 mm (voxel size 1.76 mm³) for 4- to 5-mm vessels; and 1.17 mm², 0.90 mm, and 0.90 mm (voxel size 1.05 mm³) for 1- to 3-mm vessels. CONCLUSION: Higher spatial resolution than currently used should be selected for imaging peripheral vessels.
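A short sketch of how a degree of diameter stenosis is formed from a diameter pair, and how a diameter measurement error propagates into the stenosis estimate; the measured values below are hypothetical, not data from the study.

```python
def stenosis_percent(d_stenotic, d_reference):
    """Diameter stenosis in percent: (1 - d_stenotic / d_reference) * 100."""
    return (1.0 - d_stenotic / d_reference) * 100.0

# "Exact" stenosis from a known phantom diameter pair: 2.0 mm residual lumen in a 5.0 mm vessel.
exact = stenosis_percent(2.0, 5.0)      # 60.0%

# The same pair as measured on the MRA images (hypothetical measured diameters).
measured = stenosis_percent(2.3, 5.1)   # about 54.9%

print(f"exact {exact:.1f}%, measured {measured:.1f}%, absolute error {abs(measured - exact):.1f}%")
```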
Abstract:
This paper characterizes and evaluates the potential of three commercial CT iterative reconstruction methods (ASIR, VEO and iDose4) for dose reduction and image quality improvement. We measured CT number accuracy, standard deviation (SD), noise power spectrum (NPS) and modulation transfer function (MTF) metrics on Catphan phantom images, while five human observers performed four-alternative forced-choice (4AFC) experiments to assess the detectability of low- and high-contrast objects embedded in two pediatric phantoms. Results show that 40% and 100% ASIR, as well as iDose4 levels 3 and 6, do not affect CT number and strongly decrease image noise, with relative SD remaining constant over a large dose range. However, while ASIR produces a shift of the NPS curve apex, less change is observed with iDose4 with respect to FBP methods. With the second-generation iterative reconstruction VEO, physical metrics are improved even further: SD decreased to 70.4% at 0.5 mGy and spatial resolution improved to 37% (MTF50%). The 4AFC experiments show that few improvements in detection task performance are obtained with ASIR and iDose4, whereas VEO makes excellent detection possible even at an ultra-low dose (0.3 mGy), leading to a potential dose reduction by a factor of 3 to 7 (67%-86%). In spite of its longer reconstruction time and the fact that clinical studies are still required to complete these results, VEO clearly confirms the tremendous potential of iterative reconstruction for dose reduction in CT and appears to be an important tool for patient follow-up, especially for pediatric patients, where the cumulative lifetime dose remains high.
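For the NPS metric used above, a minimal estimator is sketched below: detrended noise ROIs from a uniform phantom are Fourier transformed and their squared magnitudes averaged. Normalization conventions vary between papers; the one here, and the synthetic ROIs, are assumptions for illustration.

```python
import numpy as np

def nps_2d(noise_rois, pixel_size_mm):
    """2-D noise power spectrum from a stack of noise ROIs with shape (n_roi, ny, nx).

    One common estimator: NPS(fx, fy) = (dx * dy / (nx * ny)) * <|DFT(ROI - mean)|^2>.
    """
    n_roi, ny, nx = noise_rois.shape
    dx = dy = pixel_size_mm
    detrended = noise_rois - noise_rois.mean(axis=(1, 2), keepdims=True)
    dft = np.fft.fftshift(np.fft.fft2(detrended), axes=(1, 2))
    return (dx * dy / (nx * ny)) * np.mean(np.abs(dft) ** 2, axis=0)

# Hypothetical use: 64x64-pixel ROIs extracted from repeated scans of a uniform phantom region.
rois = np.random.default_rng(1).normal(0.0, 10.0, size=(20, 64, 64))
nps = nps_2d(rois, pixel_size_mm=0.5)
```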
Abstract:
Despite the central role of quantitative PCR (qPCR) in the quantification of mRNA transcripts, most analyses of qPCR data are still delegated to the software that comes with the qPCR apparatus. This is especially true for the handling of the fluorescence baseline. This article shows that baseline estimation errors are directly reflected in the observed PCR efficiency values and are thus propagated exponentially in the estimated starting concentrations as well as 'fold-difference' results. Because of the unknown origin and kinetics of the baseline fluorescence, the fluorescence values monitored in the initial cycles of the PCR reaction cannot be used to estimate a useful baseline value. An algorithm that estimates the baseline by reconstructing the log-linear phase downward from the early plateau phase of the PCR reaction was developed and shown to lead to very reproducible PCR efficiency values. PCR efficiency values were determined per sample by fitting a regression line to a subset of data points in the log-linear phase. The variability, as well as the bias, in qPCR results was significantly reduced when the mean of these PCR efficiencies per amplicon was used in the calculation of an estimate of the starting concentration per sample.
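To make the efficiency and starting-concentration calculation concrete, the sketch below fits a line to log fluorescence over a user-chosen log-linear window, takes the efficiency as 10^slope, and back-projects the quantification threshold to cycle zero. It assumes baseline-corrected data, i.e. the baseline-reconstruction step described above is not reproduced, and the amplification curve is synthetic.

```python
import numpy as np

def pcr_efficiency(cycles, fluorescence, window):
    """Per-sample PCR efficiency from a linear fit of log10(fluorescence) in the log-linear phase.

    `window` is a slice selecting cycles judged to lie in the log-linear phase of
    baseline-corrected data. Efficiency E = 10**slope; E = 2 means perfect doubling.
    """
    x = np.asarray(cycles, dtype=float)[window]
    y = np.log10(np.asarray(fluorescence, dtype=float)[window])
    slope, _ = np.polyfit(x, y, 1)
    return 10.0 ** slope

def starting_amount(threshold_fluorescence, cq, efficiency):
    """Back-projected target amount (in fluorescence units): N0 = Fq / E**Cq."""
    return threshold_fluorescence / efficiency ** cq

# Synthetic baseline-corrected amplification curve with a true 1.9-fold gain per cycle:
cycles = np.arange(1, 41)
fluor = 1e-6 * 1.9 ** cycles
e_hat = pcr_efficiency(cycles, fluor, window=slice(14, 20))
print(f"estimated efficiency: {e_hat:.2f}")
print(f"estimated starting amount: {starting_amount(0.1, cq=18.0, efficiency=e_hat):.2e}")
```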
Abstract:
PURPOSE OF REVIEW: Invasive candidiasis is a severe infectious complication occurring mostly in onco-hematologic and surgical patients. Its conventional diagnosis is insensitive and often late, leading to delayed treatment and high mortality. The purpose of this article is to review recent contributions to the nonconventional diagnostic approaches of invasive candidiasis, both for the detection of the episode and the characterization of the etiologic agent. RECENT FINDINGS: Antigen-based tests to detect invasive candidiasis comprise a specific test, mannan, as well as a nonspecific test, beta-D-glucan. Both have a moderate sensitivity and a high specificity, and cannot be recommended alone as a negative screening tool or a positive, syndrome-driven diagnostic tool. Molecular tests have still not reached the stage of rapid, easy-to-use, standardized assays that would ideally complement blood culture at the time of blood sampling. New tests (fluorescence in-situ hybridization or mass spectrometry) significantly reduce the time to identification of Candida at the species level in positive blood cultures, and should have a positive impact on earlier appropriate antifungal therapy and possibly on outcome. SUMMARY: Both antigen-based and molecular tests appear to be promising new tools to complement and accelerate the conventional diagnosis of invasive candidiasis, with an expected significant impact on earlier and more focused treatment and on prognosis.
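As a worked illustration of why a test with moderate sensitivity is a weak stand-alone screening tool even when its negative predictive value looks high, the sketch below computes predictive values from sensitivity, specificity and prevalence; the numbers are hypothetical, not figures from the review.

```python
def predictive_values(sensitivity, specificity, prevalence):
    """Positive and negative predictive values from test characteristics and disease prevalence."""
    tp = sensitivity * prevalence
    fn = (1.0 - sensitivity) * prevalence
    fp = (1.0 - specificity) * (1.0 - prevalence)
    tn = specificity * (1.0 - prevalence)
    return tp / (tp + fp), tn / (tn + fn)

# Hypothetical figures: moderate sensitivity, high specificity, 5% prevalence in an at-risk cohort.
ppv, npv = predictive_values(sensitivity=0.60, specificity=0.95, prevalence=0.05)
print(f"PPV = {ppv:.2f}, NPV = {npv:.2f}")
# NPV is high mostly because prevalence is low; 40% of true infections still test negative,
# which is why such a test cannot serve as the sole negative screening tool.
```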
Abstract:
Fraud is as old as mankind. There is an enormous number of historical documents showing the interplay between truth and untruth; it is therefore not really surprising that the prevalence of publication discrepancies is increasing. More surprising is that new cases, especially in the medical field, generate such huge astonishment. In financial mathematics, a statistical tool for fraud detection is known that builds on the observations of Newcomb and Benford on the distribution of naturally occurring numbers: the distribution of leading digits is not uniform, and lower digits occur more frequently than higher ones. In this investigation, all numbers contained in the blinded abstracts of the 2009 annual meeting of the Swiss Society of Anesthesia and Resuscitation (SGAR) were recorded and analyzed with regard to this distribution. A manipulated abstract was also included in the investigation. The χ²-test was used to determine statistical differences between expected and observed counts of numbers, with p<0.05 considered significant. The distribution of the 1,800 numbers in the 77 submitted abstracts followed Benford's law. The manipulated abstract was detected by statistical means (expected versus observed, p<0.05). Statistics cannot prove whether the content is true or not, but it can give serious hints to look into the details of such conspicuous material. These are the first results of a test for the distribution of numbers presented in medical research.
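The first-digit analysis above can be reproduced with a few lines of code: compare the observed leading-digit counts with the counts expected under Benford's law using a chi-squared test. The numbers below are an invented toy sample, far smaller than the 1,800 numbers analyzed in the study, and serve only to show the mechanics.

```python
import numpy as np
from scipy import stats

def first_digits(numbers):
    """Leading digit of each strictly positive number (scientific notation puts it first)."""
    return np.array([int(f"{abs(float(x)):.10e}"[0]) for x in numbers])

def benford_chi2(numbers):
    """Chi-squared test of observed first-digit counts against Benford's law P(d) = log10(1 + 1/d)."""
    d = first_digits(numbers)
    observed = np.array([(d == k).sum() for k in range(1, 10)])
    expected = observed.sum() * np.log10(1.0 + 1.0 / np.arange(1, 10))
    return stats.chisquare(observed, f_exp=expected)

# Toy sample of numbers harvested from an abstract (hypothetical values):
values = [12, 0.05, 134, 23, 78, 1.9, 46, 250, 3, 11, 17, 9, 102, 5.4]
chi2, p = benford_chi2(values)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")
```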