919 results for Multi-method evaluation
Abstract:
Effective treatment of patients with Ewing tumors depends on accurate determination of the primary tumor extent and of the presence of metastatic disease. Currently, no universally accepted staging system is available for Ewing tumors. The present study discusses the use of PET/CT as a tool for staging, restaging, and assessment of therapeutic response in patients with Ewing tumors. Despite some limitations of PET/CT as compared with anatomical imaging methods, its relevance in the assessment of these patients lies in its capacity to provide additional physiological information, which often has important clinical implications. Currently, the assessment of patients with Ewing tumors should comprise PET/CT combined with other anatomical imaging modalities, such as radiography, computed tomography, and magnetic resonance imaging.
Abstract:
Objective: To evaluate, by magnetic resonance imaging, changes in the bone marrow of patients undergoing treatment for type I Gaucher's disease. Materials and Methods: Descriptive, cross-sectional study of Gaucher's disease patients submitted to 3 T magnetic resonance imaging of the femurs and lumbar spine. The images were blindly reviewed and the findings were classified according to the semiquantitative bone marrow burden (BMB) scoring system. Results: All seven evaluated patients (three men and four women) presented signs of bone marrow infiltration. Osteonecrosis of the femoral head was found in three patients and Erlenmeyer flask deformity in five; no patient had vertebral body collapse. The mean BMB score was 11, ranging from 9 to 14. Conclusion: Magnetic resonance imaging is currently the method of choice for assessing bone involvement in Gaucher's disease in adults because of its high sensitivity in detecting both focal and diffuse bone marrow changes. The BMB score is a simplified method for semiquantitative analysis that does not depend on advanced sequences or sophisticated hardware, allowing classification of disease extent and assisting in treatment monitoring.
Abstract:
Objective: To establish reference values and study some sonographic characteristics of the thyroid gland in a group of euthyroid children aged up to 5 years, as compared with age-matched children with congenital hypothyroidism. Materials and Methods: Thirty-six children (17 female and 19 male) aged between 2 months and 5 years were divided into two groups – 23 euthyroid children and 13 children with congenital hypothyroidism – and underwent ultrasonography. Results: In the group of euthyroid children (n = 23), the mean total volume of the thyroid gland was 1.12 mL (minimum, 0.39 mL; maximum, 2.72 mL); 17 children (73.91%) had a homogeneous gland and 6 (26.08%) had a heterogeneous gland. In the group of children with congenital hypothyroidism (n = 13), the mean total volume of the thyroid gland was 2.73 mL (minimum, 0.20 mL; maximum, 11.00 mL). As regards thyroid location, 3 patients (23.07%) had an ectopic thyroid and 10 (69.23%) had a normally located (orthotopic) thyroid; of the latter, 5 had a homogeneous gland (50%) and 5 a heterogeneous gland (50%). In the group with congenital hypothyroidism, 6 children (46.15%) had an etiological diagnosis of dyshormonogenesis, 3 (23.07%) of ectopic thyroid, and 4 (30.76%) of thyroid hypoplasia. Conclusion: Thyroid ultrasonography is a noninvasive, widely available, and easily performed imaging method; for these reasons it can, and should, be performed at any time, including at birth, with no preparation or treatment discontinuation, to aid in the early etiological definition of congenital hypothyroidism.
Abstract:
This study aimed at comparing the efficiency of various sampling materials for the collection and subsequent analysis of organic gunshot residues (OGSR). To the best of our knowledge, this is the first time that sampling devices have been investigated in detail for subsequent quantitation of OGSR by LC-MS. Seven sampling materials, namely two "swab"-type and five "stub"-type collection materials, were tested. The investigation started with the development of a simple and robust LC-MS method able to separate and quantify molecules typically found in gunpowders, such as diphenylamine or ethylcentralite. The sampling materials were then systematically evaluated by first analysing blank extracts of the materials to check for potential interferences and by determining matrix effects. Based on these results, the best four materials, namely cotton buds, polyester swabs, a tape from 3M, and PTFE, were compared in terms of collection efficiency during shooting experiments using a set of 9 mm Luger ammunition. It was found that the tape recovered the highest amounts of OGSR. As tape-lifting is the technique currently used routinely for inorganic GSR (IGSR), OGSR analysis might be implemented without modifying the IGSR sampling and analysis procedures.
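The abstract mentions determining matrix effects from blank extracts of the sampling materials; it does not state the exact calculation used. A minimal sketch, assuming the common post-extraction spike approach, where the analyte response in a spiked blank extract is compared with a neat standard at the same concentration (peak areas below are hypothetical):

```python
# Minimal sketch (not from the paper): post-extraction spike estimate of matrix
# effects in LC-MS. Values near 0% indicate no matrix effect; negative values
# indicate ion suppression, positive values ion enhancement.

def matrix_effect_percent(area_spiked_extract: float, area_neat_standard: float) -> float:
    """Matrix effect (%) = (A_matrix / A_solvent - 1) * 100."""
    return (area_spiked_extract / area_neat_standard - 1.0) * 100.0

# Hypothetical peak areas for diphenylamine on two candidate sampling materials.
for material, (a_matrix, a_solvent) in {
    "cotton bud": (8.2e5, 9.0e5),
    "3M tape": (8.8e5, 9.0e5),
}.items():
    print(f"{material}: {matrix_effect_percent(a_matrix, a_solvent):+.1f}% matrix effect")
```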
Abstract:
When gasoline is used to start and/or propagate a fire, inferring the source of the gasoline can help establish a link between the fire and a potential source. This source inference is an interesting alternative for providing evidence in this type of event, where physical traces left by the perpetrator are rare. The main purpose of this research was to develop a GC-IRMS method for the analysis of gasoline samples, a non-routine method that has been little investigated in forensic science, and to evaluate its potential to infer the source of gasoline traces compared with the performance of GC-MS. An instrument allowing samples to be analyzed simultaneously by MS and IRMS was used in this research.
An analytical method was developed, optimized, and validated for this instrument. Thereafter, gasoline samples from a large sampling campaign representative of the Lausanne area market were analyzed. Finally, the data obtained were processed and interpreted using chemometric methods. The analyses showed that the methodology, both for the MS and the IRMS components, differentiates unweathered gasoline samples from different service stations. It was also demonstrated that each new filling of a station's tanks produces an almost unique gasoline composition. GC-MS achieves better differentiation of samples collected from different stations, while GC-IRMS is more efficient at distinguishing samples collected after each filling of a tank. These results therefore indicate that the two components of the method can be complementary for the analysis of unweathered gasoline samples. The results also showed that evaporation of gasoline samples does not compromise the ability to group samples from the same source by GC-MS, although it is necessary to select variables so as to eliminate those influenced by evaporation. In contrast, the analyses showed that evaporation has such a strong influence on the isotopic composition of the samples that it is not possible, even after variable selection, to correctly group evaporated samples by GC-IRMS. Consequently, only the MS component of the methodology allows the source of evaporated gasoline samples to be inferred.
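The abstract describes selecting variables unaffected by evaporation before chemometric grouping, but does not detail the chemometric pipeline. A minimal sketch, not the authors' actual workflow, in which variables that drift strongly between neat and evaporated replicates are discarded and the remaining data are grouped by PCA followed by hierarchical clustering (data shapes and thresholds are illustrative only):

```python
# Minimal sketch: variable selection against evaporation, then PCA + Ward clustering.
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
X_neat = rng.normal(size=(20, 50))        # 20 samples x 50 GC-MS variables (hypothetical)
X_evap = X_neat + rng.normal(scale=[0.05] * 40 + [1.5] * 10, size=(20, 50))  # last 10 vars drift

# Keep only variables weakly affected by evaporation.
drift = np.abs(X_evap - X_neat).mean(axis=0)
stable = drift < 0.5
X_sel = X_evap[:, stable]

# Chemometric grouping: PCA for dimensionality reduction, then hierarchical clustering.
scores = PCA(n_components=3).fit_transform(X_sel)
labels = fcluster(linkage(scores, method="ward"), t=4, criterion="maxclust")
print("retained variables:", stable.sum(), "cluster labels:", labels)
```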
Abstract:
The most suitable method for the estimation of size diversity is investigated. Size diversity is computed on the basis of the Shannon diversity expression adapted for continuous variables such as size. It takes the form of an integral involving the probability density function (pdf) of the size of the individuals. Different approaches for the estimation of the pdf are compared: parametric methods, which assume that the data come from a particular family of pdfs, and nonparametric methods, where the pdf is estimated using some kind of local evaluation. Exponential, generalized Pareto, normal, and log-normal distributions were used to generate simulated samples using parameters estimated from real samples. Nonparametric methods include discrete computation of data histograms based on size intervals and continuous kernel estimation of the pdf. The kernel approach gives an accurate estimation of size diversity, whereas parametric methods are only useful when the reference distribution has a shape similar to the real one. Special attention is given to data standardization. Division of the data by the sample geometric mean is proposed as the most suitable standardization method, which shows additional advantages: the same size diversity value is obtained when using original or log-transformed data, and size measurements with different dimensionality (lengths, areas, volumes, or biomasses) may be immediately compared with the simple addition of ln k, where k is the dimensionality (1, 2, or 3, respectively). Thus, kernel estimation after standardization by division by the sample geometric mean arises as the most reliable and generalizable method of size diversity evaluation.
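A minimal sketch, assuming the size-diversity index is the differential Shannon entropy H = -∫ p(x) ln p(x) dx of the individual-size distribution, estimated with a Gaussian kernel density estimate after dividing the sizes by the sample geometric mean, as the abstract recommends (the bandwidth rule and integration grid below are assumptions):

```python
# Kernel estimate of continuous Shannon size diversity after geometric-mean standardization.
import numpy as np
from scipy.stats import gaussian_kde
from scipy.integrate import trapezoid

def size_diversity(sizes: np.ndarray, n_grid: int = 2048) -> float:
    sizes = np.asarray(sizes, dtype=float)
    standardized = sizes / np.exp(np.mean(np.log(sizes)))   # divide by geometric mean
    kde = gaussian_kde(standardized)                          # default bandwidth (assumption)
    grid = np.linspace(standardized.min() / 2, standardized.max() * 2, n_grid)
    p = np.clip(kde(grid), 1e-300, None)                      # avoid log(0)
    return float(-trapezoid(p * np.log(p), grid))             # -∫ p ln p dx

sizes = np.random.default_rng(1).lognormal(mean=0.0, sigma=0.8, size=500)
print("size diversity:", round(size_diversity(sizes), 3))
```

Per the abstract, the same diversity value (up to estimation error) is obtained from log-transformed data, which is one of the stated advantages of geometric-mean standardization.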
Abstract:
In order to develop applications for visual interpretation of medical images, the early detection and evaluation of microcalcifications in digital mammograms is very important, since their presence is often associated with a high incidence of breast cancer. Accurate classification into benign and malignant groups would help improve diagnostic sensitivity as well as reduce the number of unnecessary biopsies. The challenge here is the selection of useful features to distinguish benign from malignant microcalcifications. Our purpose in this work is to analyse a microcalcification evaluation method based on a set of shape-based features extracted from the digitised mammogram. The segmentation of the microcalcifications is performed using a fixed-tolerance region growing method to extract the boundaries of calcifications from manually selected seed pixels. Taking into account that the shapes and sizes of clustered microcalcifications have been associated with a high risk of carcinoma based on different subjective measures, such as whether or not the calcifications are irregular, linear, vermiform, branched, rounded, or ring-like, our efforts were addressed to obtaining a feature set related to shape. The identification of the parameters concerning the malignant character of the microcalcifications was performed on a set of 146 mammograms whose real diagnoses were known in advance from biopsies. This allowed the following shape-based parameters to be identified as the relevant ones: number of clusters, number of holes, area, Feret elongation, roughness, and elongation. Further experiments on a set of 70 new mammograms showed that the performance of the classification scheme is close to the mean performance of three expert radiologists, which allows the proposed method to be considered for assisting diagnosis and encourages continuing the investigation by adding new features not only related to shape.
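A minimal sketch of fixed-tolerance region growing from a manually selected seed pixel; it is not the authors' implementation, and the growing rule (intensity within a fixed tolerance of the seed value, 4-connectivity) is an assumption for illustration:

```python
# Fixed-tolerance region growing: a neighbor joins the region if its intensity
# differs from the seed intensity by no more than `tolerance`.
import numpy as np
from collections import deque

def region_grow(image: np.ndarray, seed: tuple, tolerance: float) -> np.ndarray:
    h, w = image.shape
    seed_value = float(image[seed])
    mask = np.zeros((h, w), dtype=bool)
    mask[seed] = True
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):  # 4-connectivity
            if 0 <= nr < h and 0 <= nc < w and not mask[nr, nc]:
                if abs(float(image[nr, nc]) - seed_value) <= tolerance:
                    mask[nr, nc] = True
                    queue.append((nr, nc))
    return mask  # boolean mask of the segmented calcification

# Toy example: a bright 3x3 "calcification" on a dark background.
img = np.zeros((10, 10)); img[4:7, 4:7] = 200.0
print(region_grow(img, seed=(5, 5), tolerance=30.0).sum(), "pixels segmented")
```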
Abstract:
Simultaneous localization and mapping (SLAM) is a very important problem in mobile robotics. Many solutions have been proposed over the last two decades; nevertheless, few studies have considered the use of multiple sensors simultaneously. The solution presented here combines several data sources with the aid of an Extended Kalman Filter (EKF). Two approaches are proposed. The first is to run the ordinary EKF SLAM algorithm for each data source separately, in parallel, and then, at the end of each step, fuse the results into one solution. The second is to use multiple data sources simultaneously in a single filter. A comparison of the computational complexity of the two methods is also presented: the first method is almost four times faster than the second.
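A minimal sketch (illustrative, not the paper's algorithm) of the single-filter approach: one state estimate is sequentially corrected with measurements from several sensors. For clarity the measurement models are linear, so the EKF update reduces to the standard Kalman update; the state, noise levels, and sensor models are assumptions:

```python
# Sequential fusion of measurements from multiple sensors in a single Kalman/EKF-style filter.
import numpy as np

def kalman_update(x, P, z, H, R):
    """One measurement update: x, P are the state mean/covariance; z is a measurement."""
    S = H @ P @ H.T + R                      # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    x = x + K @ (z - H @ x)                  # corrected state
    P = (np.eye(len(x)) - K @ H) @ P         # corrected covariance
    return x, P

x = np.array([0.0, 0.0])                     # robot position estimate (x, y)
P = np.eye(2) * 1.0
sensors = [                                   # two sensors observing position (hypothetical)
    (np.array([1.05, -0.02]), np.eye(2), np.eye(2) * 0.10),   # e.g. laser, low noise
    (np.array([0.90,  0.08]), np.eye(2), np.eye(2) * 0.50),   # e.g. sonar, higher noise
]
for z, H, R in sensors:                       # fuse all sources within one filter step
    x, P = kalman_update(x, P, z, H, R)
print("fused estimate:", x.round(3))
```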
Abstract:
Different compounds have been reported as biomarkers of smoking, but, to date, there is no appropriate biomarker for tobacco-related exposure, because the proposed chemicals are either nonspecific or only appropriate for short-term exposure. Moreover, conventional sampling methodologies are invasive because blood or urine samples are required. The use of a microtrap system coupled to gas chromatography–mass spectrometry has been found to be very effective for the noninvasive analysis of volatile organic compounds in breath samples. The levels of benzene, 2,5-dimethylfuran, toluene, o-xylene, and m-/p-xylene were analyzed in breath samples obtained from 204 volunteers (100 smokers, 104 nonsmokers; 147 females, 57 males; ages 16 to 53 years). 2,5-Dimethylfuran was always below the limit of detection (0.005 ppbv) in the nonsmoker population and was always detected in smokers, independently of their smoking habits. Benzene was only an effective biomarker for medium and heavy smokers, and its level was affected by smoking habits. The levels of xylenes and toluene differed only in heavy smokers and after short-term exposure. The results suggest that 2,5-dimethylfuran is a specific breath biomarker of smoking status independently of smoking habits (e.g., short- and long-term exposure, light and heavy consumption), and so this compound might be useful as a biomarker of smoking exposure.
Abstract:
Needle trap devices (NTDs) are a relatively new and promising tool for headspace (HS) analysis. In this study, a dynamic HS sampling procedure is evaluated for the determination of volatile organic compounds (VOCs) in whole blood samples. A full factorial design was used to evaluate the influence of the number of cycles and the incubation time, and it is demonstrated that the controlling factor in the process is the number of cycles. A mathematical model can be used to determine the most appropriate number of cycles required to adsorb a prefixed amount of the VOCs present in the HS phase, provided that quantitative adsorption is reached in each cycle. Matrix effects are of great importance when complex biological samples, such as blood, are analyzed. Evaluation of the salting-out effect showed a significant improvement in the volatilization of VOCs into the HS in this type of matrix. Moreover, a 1:4 (blood:water) dilution is required to obtain quantitative recoveries of the target analytes when external calibration is used. The method developed gives detection limits in the 0.020–0.080 μg L−1 range (0.1–0.4 μg L−1 for undiluted blood samples) with appropriate repeatability values (RSD < 15% at the high level and < 23% at the LOQ level). The figures of merit of the method can be improved by using a smaller phase ratio (i.e., increasing the blood volume and decreasing the HS volume), which leads to lower detection limits, better repeatability, and greater sensitivity. Twenty-eight blood samples were evaluated with the proposed method and the results agree with those reported in other studies. Benzene was the only target compound that showed significant differences between the blood levels detected in non-smoking and smoking volunteers.
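The abstract refers to a mathematical model for choosing the number of sampling cycles but does not spell it out. A minimal sketch under an assumed model: if each cycle removes a constant fraction beta of the analyte still available in the headspace, the cumulative recovery after n cycles is 1 - (1 - beta)^n, and the required number of cycles follows by solving for n (the fraction per cycle and target recovery below are hypothetical):

```python
# Assumed exponential-depletion model for cyclic dynamic headspace sampling.
import math

def cycles_needed(beta: float, target_recovery: float) -> int:
    """Smallest n with 1 - (1 - beta)^n >= target_recovery (0 < beta < 1)."""
    return math.ceil(math.log(1.0 - target_recovery) / math.log(1.0 - beta))

# Hypothetical example: each cycle extracts 35% of the remaining headspace analyte.
print(cycles_needed(beta=0.35, target_recovery=0.95))  # -> 7 cycles
```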
Abstract:
This paper describes the basis of citation auctions as a new approach to selecting scientific papers for publication. Our main idea is to use an auction to select papers for publication through bids that, differently from the state of the art, consist of the number of citations that a scientist expects to receive if the paper is published. Hence, the citation auction is the selection process itself, and no reviewers are involved. The benefits of the proposed approach are two-fold. First, the cost of refereeing will be either totally eliminated or significantly reduced, because the citation auction process does not require prior understanding of the paper's content to judge the quality of its contribution. Additionally, the method does not prejudge the content of the paper, so it will increase the openness of publications to new ideas. Second, scientists will be much more committed to the quality of their papers, paying close attention to distributing and explaining their papers in detail to maximize the number of citations received. Sample analyses of the number of citations collected by papers published in the years 1999-2004 for one journal, and in the years 2003-2005 for a series of conferences (in a totally different discipline), via Google Scholar, are provided. Finally, a simple simulation of an auction is given to outline the behaviour of the citation auction approach.
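A toy sketch only; the abstract mentions a simple auction simulation but gives no rules, so the mechanism below is an assumption: each author bids the number of citations they expect, the k highest bids are accepted, and the outcome is later compared with the citations actually obtained:

```python
# Hypothetical citation-auction round: accept the k highest citation bids.
import random

random.seed(0)
k = 3  # hypothetical number of publication slots
papers = [{"id": i, "bid": random.randint(0, 50)} for i in range(10)]
accepted = sorted(papers, key=lambda p: p["bid"], reverse=True)[:k]

for p in accepted:
    # Actual citations are drawn around the bid purely to illustrate over/under-bidding.
    actual = max(0, int(random.gauss(p["bid"], 5)))
    print(f"paper {p['id']}: bid {p['bid']} citations, obtained {actual}")
```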
Abstract:
Objectives: The objectives of this study were to review the Institute of Medicine (IOM) set of criteria for priority-setting in research, adding new criteria if necessary, and to develop and evaluate the reliability and validity of the final priority score. Methods: Based on the evaluation of 199 research topics, forty-five experts identified additional criteria for priority-setting, rated their relevance, and ranked and weighted them in a three-round modified Delphi technique. A final priority score was developed and evaluated. Internal consistency, test–retest and inter-rater reliability were assessed. Correlation with the experts' overall qualitative topic ratings was assessed as an approximation to validity. Results: All seven original IOM criteria were considered relevant and two new criteria were added ("potential for translation into practice" and "need for knowledge"). Final ranks and relative weights differed from those of the original IOM criteria: "research impact on health outcomes" was considered the most important criterion (4.23), as opposed to "burden of disease" (3.92). Cronbach's alpha (0.75) and test–retest stability (intraclass correlation coefficient = 0.66) for the final set of criteria were acceptable. The area under the receiver operating characteristic curve for the overall assessment of priority was 0.66. Conclusions: A reliable instrument for prioritizing topics in clinical and health services research has been developed. Further evaluation of its validity and impact on selecting research topics is required.
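A minimal sketch, assuming the final priority score is a weighted sum of criterion ratings (the exact formula is not given in the abstract) and using the standard Cronbach's alpha expression as the internal-consistency check; the weights and ratings below are hypothetical:

```python
# Weighted priority score per topic plus Cronbach's alpha for internal consistency.
import numpy as np

weights = np.array([4.23, 3.92, 3.5, 3.4, 3.2, 3.0, 2.8, 2.7, 2.5])   # 9 criteria (hypothetical)
ratings = np.random.default_rng(2).integers(1, 6, size=(199, 9))       # topics x criteria, 1-5 scale

priority_scores = ratings @ weights          # weighted priority score per topic

def cronbach_alpha(items: np.ndarray) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

print("top-priority topic index:", int(priority_scores.argmax()))
print("Cronbach's alpha (random toy data):", round(cronbach_alpha(ratings), 2))
```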
Abstract:
Validation and verification operations encounter various challenges in the product development process. Requirements for an increasing development cycle pace place new demands on the component development process. Verification and validation usually represent the largest activities, consuming up to 40-50% of the R&D resources utilized. This research studies validation and verification as part of the case company's component development process. The target is to define a framework that can be used to improve the evaluation and development of validation and verification capability in display module development projects. The definition and background of validation and verification are studied in this research, along with theories such as project management, systems, organisational learning, and causality. The framework and key findings of this research are presented, and a feedback system based on the framework is defined and implemented at the case company. The research is divided into a theory part and an empirical part: the theory part is conducted as a literature review and the empirical part as a case study, using the constructive method and the design research method. A framework for capability evaluation and development was defined and developed as the result of this research. A key finding of this study was that a double-loop learning approach together with the validation and verification V+ model enables the definition of a feedback reporting solution. In addition, some minor changes to the validation and verification process were proposed. A few concerns are expressed regarding the validity and reliability of this study, the most important one being the selected research method and the selected model itself: the final state can be normative, since the researcher may set the study results before the actual study and, in the initial state, may describe expectations for the study. Finally, the reliability and validity of this work are discussed.
Abstract:
Robotic platforms have advanced greatly in terms of their remote sensing capabilities, including obtaining optical information using cameras. Alongside these advances, visual mapping has become a very active research area, facilitating the mapping of areas inaccessible to humans. This requires efficient processing of the data to increase the final mosaic quality and the computational efficiency. In this paper, we propose an efficient image mosaicing algorithm for large-area visual mapping in underwater environments using multiple underwater robots. Our method identifies overlapping image pairs across the trajectories carried out by the different robots during the topology estimation process, which is a cornerstone for efficiently mapping large areas of the seafloor. We present comparative results based on challenging real underwater datasets that simulate multi-robot mapping.
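A minimal sketch, not the paper's method: a common way to test whether two images from different robot trajectories overlap is to match local features and count consistent correspondences. ORB features with a Lowe ratio test are used here for illustration; the match threshold and file names are assumptions:

```python
# Overlap test between two candidate images via ORB feature matching (OpenCV).
import cv2

def images_overlap(img_a, img_b, min_matches: int = 30) -> bool:
    orb = cv2.ORB_create(nfeatures=2000)
    kp_a, des_a = orb.detectAndCompute(img_a, None)
    kp_b, des_b = orb.detectAndCompute(img_b, None)
    if des_a is None or des_b is None:
        return False
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    matches = matcher.knnMatch(des_a, des_b, k=2)
    good = [pair[0] for pair in matches
            if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance]  # Lowe ratio test
    return len(good) >= min_matches

# Usage (hypothetical file names):
# a = cv2.imread("robot1_frame_0042.png", cv2.IMREAD_GRAYSCALE)
# b = cv2.imread("robot2_frame_0107.png", cv2.IMREAD_GRAYSCALE)
# print(images_overlap(a, b))
```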
Abstract:
Notwithstanding the functional role that the aggregates of some amyloidogenic proteins can play in different organisms, protein aggregation plays a pivotal role in the pathogenesis of a large number of human diseases. One such disease is Alzheimer's disease (AD), in which the overproduction and aggregation of the β-amyloid peptide (Aβ) are regarded as early critical factors. Another protein that seems to occupy a prominent position within the complex pathological network of AD is the enzyme acetylcholinesterase (AChE), with classical and non-classical activities involved in the late (cholinergic deficit) and early (Aβ aggregation) phases of the disease. Dual inhibitors of Aβ aggregation and AChE are thus emerging as promising multi-target agents with the potential to efficiently modify the natural course of AD. In the initial phases of the drug discovery process for such compounds, in vitro evaluation of the inhibition of Aβ aggregation is rather troublesome, as it is very sensitive to experimental assay conditions and requires expensive synthetic Aβ peptides, which makes the screening of large compound libraries cost-prohibitive. Herein, we review recently developed multi-target anti-Alzheimer compounds that exhibit both Aβ aggregation and AChE inhibitory activities and, in some cases, additional valuable activities such as BACE-1 inhibition or antioxidant properties. We also discuss the development of simplified in vivo methods for the rapid, simple, reliable, inexpensive, and high-throughput-amenable screening of Aβ aggregation inhibitors that rely on the overexpression of Aβ42, alone or fused with reporter proteins, in Escherichia coli.