992 results for Quantitative oogram technique


Relevance:

30.00%

Publisher:

Abstract:

There are many techniques for the treatment of hip dysplasia, and novel research is currently being undertaken in the hope of obtaining more efficient and less traumatic techniques. Denervation of the hip joint capsule is a simple and effective technique that allows recovery of the functional activity of the affected limbs in significantly less time than other techniques. The surgical procedure consists of removing the acetabular periosteum, thus eliminating the nerve fibres with consequent analgesia. The aim of this investigation was to quantify the number of nerve fibres present in different regions of the acetabular periosteum, since knowledge of regional differences is potentially valuable for refining the denervation technique of the hip joint capsule. Thirty canine acetabular fragments were used to compare the nerve fibre density of the periosteum. The results showed a significant difference between the mean density of nerve fibres at the cranial and dorsal-lateral portion (approximately 75 fibres/mm²) and the caudal-lateral portion (approximately 60 fibres/mm²) of the acetabulum. The fibres at the periosteum are mostly positioned in a sagittal plane, pointing towards the joint capsule, suggesting a similar density in the latter region. These results indicate a new approach to the articular denervation technique, thus obtaining even better results for the treatment of hip dysplasia in dogs.

Relevance:

30.00%

Publisher:

Abstract:

Inflammatory papillary hyperplasia of the palate (IPHP) is a reactive tissue overgrowth characterized by hyperemic mucosa with a nodular or papillary appearance in the palate. Its exact pathogenesis is still unclear. In this study, the presence of Candida albicans in the epithelial lining was evaluated using the indirect immunofluorescence staining technique. Strongly stained C. albicans was observed only in the lesions of the IPHP group. The detection of C. albicans in almost all samples from IPHP tissue therefore suggests a microbial etiology for the disease, since the use of dental prostheses was reported. Int J Prosthodont 2011;24:235-237

Relevance:

30.00%

Publisher:

Abstract:

Doctoral thesis in Biology presented to the Faculdade de Ciências da Universidade do Porto, 2015.

Relevance:

30.00%

Publisher:

Abstract:

Dissertation presented for the degree of Doctor in Informatics at the Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia.

Relevance:

30.00%

Publisher:

Abstract:

Mice infected with about 90 cercariae of Schistosoma mansoni (LE strain) were treated for five consecutive days with dexamethasone (50 mg/kg, subcutaneously), starting on the 42nd day of infection. Groups of five mice were then sacrificed daily, from the first day after the onset of treatment until the first day after treatment ended. Perfusion of the portal system was performed and a piece of the intestine was processed for qualitative and quantitative oograms. The treatment led to larger numbers of eggs in the tissues of treated mice compared with untreated groups. No changes were observed in the kinetics of oviposition, as all stages of viable eggs were observed in the tissues of both treated and control mice. These data reinforce the hypothesis of a partial blockade of egg excretion in immunosuppressed mice.

Relevance:

30.00%

Publisher:

Abstract:

We detected Toxoplasma gondii oocysts in the feces of experimentally infected cats using a Kato-Katz approach with subsequent Kinyoun staining. Animals serologically negative for T. gondii were infected orally with 5x10² mouse brain cysts of the ME49 strain. Feces were collected daily from the 3rd to the 30th day after challenge. Oocysts were detected by qualitative sugar flotation and by the quantitative modified Kato-Katz technique stained with Kinyoun (KKK). In the experimentally infected cats, oocysts were detected from the 7th to the 15th day by the sugar flotation technique, but from the 6th to the 16th day by KKK, which was thus sensitive over a longer period and provides permanent documentation. The peak of oocyst excretion occurred between the 8th and 11th days after challenge, before any positive serological result. KKK could be used for the screening and quantification of oocyst excretion in the feces of suspected animals, with reduced handling of infective material, decreasing the possibility of environmental and operator contamination.

Relevance:

30.00%

Publisher:

Abstract:

Myocardial exudate CD4+ and CD8+ lymphocytes were counted in transmural left ventricular free wall frozen sections taken from 10 necropsied chronic cardiac chagasic patients. The cells were labeled with monoclonal antibodies using a streptavidin-biotin technique. We counted: 1) lymphocytes in the total exudate (LTE) and, separately, 2) lymphocytes touching or very near to myocells (LTVNM). Lymphocytes were considered very near whenever their shortest nuclear diameter was larger than their distance from myocells. CD8+ lymphocytes were more numerous than CD4+ lymphocytes, especially among the LTVNM. The LTE CD4/CD8 ratio was 0.37 ± 0.20, but the LTVNM CD4/CD8 ratio was smaller (0.23 ± 0.11). Among the LTE, 34 ± 11% of CD8+ cells (against 24 ± 12% of CD4+ cells) were LTVNM. All these differences were statistically significant. Both subtypes of T-lymphocytes were found to have an intimate relationship with both ruptured and unruptured myocells, and parasites were not seen. These findings are in accordance with the idea that the myocardial cell lesions in the cardiac form of human Chagas' disease are mediated mainly by cytotoxic T-lymphocytes.

Relevance:

30.00%

Publisher:

Abstract:

Ion mobility spectrometry coupled with multi-capillary columns (MCC-IMS) is a fast analytical technique working at atmospheric pressure with high sensitivity and selectivity, making it suitable for the analysis of complex biological matrices. MCC-IMS analysis generates its information through a 3D spectrum with peaks corresponding to each of the detected substances, providing quantitative and qualitative information. Sometimes peaks of different substances overlap, making the quantification of substances present in biological matrices difficult. In the present work we use the peaks of isoprene and acetone as a model for this problem: these two volatile organic compounds (VOCs), when detected by MCC-IMS, produce two overlapping peaks. We propose an algorithm to identify and quantify these two peaks. The algorithm uses image processing techniques to treat the spectra and detect the positions of the peaks, and then fits the data to a custom model in order to separate them. Once the peaks are separated, it calculates the contribution of each peak to the data.
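The model-fitting step of such a peak-separation algorithm can be sketched as follows. The abstract does not specify the peak model, so this minimal example assumes Gaussian peak shapes along one drift-time axis; all data and parameter values are synthetic placeholders, not the actual MCC-IMS model from the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def two_gaussians(x, a1, mu1, s1, a2, mu2, s2):
    """Sum of two Gaussian peaks: an assumed model for two overlapping substances."""
    g1 = a1 * np.exp(-0.5 * ((x - mu1) / s1) ** 2)
    g2 = a2 * np.exp(-0.5 * ((x - mu2) / s2) ** 2)
    return g1 + g2

# Synthetic drift-time axis with two overlapping peaks plus noise
x = np.linspace(0.0, 10.0, 500)
rng = np.random.default_rng(0)
y = two_gaussians(x, 1.0, 4.0, 0.6, 0.7, 5.2, 0.5) + rng.normal(0, 0.01, x.size)

# Initial guesses would come from the peak-detection step (hard-coded here)
p0 = [0.9, 3.8, 0.5, 0.6, 5.5, 0.5]
popt, _ = curve_fit(two_gaussians, x, y, p0=p0)
a1, mu1, s1, a2, mu2, s2 = popt

# Contribution of each substance = area under its fitted Gaussian
area1 = a1 * s1 * np.sqrt(2 * np.pi)
area2 = a2 * s2 * np.sqrt(2 * np.pi)
```

Once the two components are fitted, each substance's share of the overlapping signal is simply the ratio of its area to the total.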

Relevance:

30.00%

Publisher:

Abstract:

Molecular monitoring of BCR/ABL transcripts by real-time quantitative reverse transcription PCR (qRT-PCR) is an essential technique for the clinical management of patients with BCR/ABL-positive CML and ALL. Though quantitative BCR/ABL assays are performed in hundreds of laboratories worldwide, results among these laboratories cannot be reliably compared due to heterogeneity in test methods, data analysis, reporting, and the lack of quantitative standards. Recent efforts towards standardization have been limited in scope. Aliquots of RNA were sent to clinical test centers worldwide in order to evaluate methods and reporting for e1a2, b2a2, and b3a2 transcript levels using their own qRT-PCR assays. Total RNA was isolated from tissue culture cells that expressed each of the different BCR/ABL transcripts. Serial log dilutions were prepared, ranging from 10^0 to 10^-5, in RNA isolated from HL60 cells. Laboratories performed 5 independent qRT-PCR reactions for each sample type at each dilution. In addition, 15 qRT-PCR reactions of the 10^-3 b3a2 RNA dilution were run to assess reproducibility within and between laboratories. Participants were asked to run the samples following their standard protocols and to report cycle threshold (Ct), quantitative values for BCR/ABL and housekeeping genes, and ratios of BCR/ABL to housekeeping genes for each sample RNA. Thirty-seven (n=37) participants submitted qRT-PCR results for analysis (36, 37, and 34 labs generated data for b2a2, b3a2, and e1a2, respectively). The limit of detection for this study was defined as the lowest dilution at which a Ct value could be detected for all 5 replicates. For b2a2, 15, 16, 4, and 1 lab(s) showed a limit of detection at the 10^-5, 10^-4, 10^-3, and 10^-2 dilutions, respectively. For b3a2, 20, 13, and 4 labs showed a limit of detection at the 10^-5, 10^-4, and 10^-3 dilutions, respectively. For e1a2, 10, 21, 2, and 1 lab(s) showed a limit of detection at the 10^-5, 10^-4, 10^-3, and 10^-2 dilutions, respectively.
Log %BCR/ABL ratio values provided a method for comparing results between the different laboratories for each BCR/ABL dilution series. Linear regression analysis revealed concordance among the majority of participant data over the 10^-1 to 10^-4 dilutions. The overall slope values showed comparable results among the majority of b2a2 (mean=0.939; median=0.9627; range 0.399-1.1872), b3a2 (mean=0.925; median=0.922; range 0.625-1.140), and e1a2 (mean=0.897; median=0.909; range 0.5174-1.138) laboratory results (Figs. 1-3). Thirty-four (n=34) of the 37 laboratories reported Ct values for all 15 replicates, and only those with a complete data set were included in the inter-lab calculations. Eleven laboratories either did not report their copy number data or used other reporting units such as nanograms or cell numbers; therefore, only 26 laboratories were included in the overall analysis of copy numbers. The median copy number was 348.4, with a range from 15.6 to 547,000 copies (approximately a 4.5 log difference); the median intra-lab %CV was 19.2%, with a range from 4.2% to 82.6%. While our international performance evaluation using serially diluted RNA samples has reinforced the fact that heterogeneity exists among clinical laboratories, it has also demonstrated that performance within a laboratory is overall very consistent. Accordingly, the availability of defined BCR/ABL RNAs may facilitate the validation of all phases of quantitative BCR/ABL analysis and may be extremely useful as a tool for monitoring assay performance. Ongoing analyses of these materials, along with the development of additional control materials, may solidify consensus around their application in routine laboratory testing and possible integration in worldwide efforts to standardize quantitative BCR/ABL testing.
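The two per-lab summary statistics used above, the regression slope of the log %BCR/ABL ratio over the dilution series (ideally ~1.0 for a linear assay) and the intra-lab %CV of replicate measurements, can be computed as in the following sketch. The numbers are invented illustrative values, not data from the study.

```python
import numpy as np

# Hypothetical per-lab data: measured %BCR/ABL ratio at each serial dilution.
dilutions = np.array([1e-1, 1e-2, 1e-3, 1e-4])
measured_ratio = np.array([9.5, 1.1, 0.095, 0.012])  # % of housekeeping gene

# Slope of log10(ratio) vs log10(dilution); perfect linearity gives slope = 1.0
slope, intercept = np.polyfit(np.log10(dilutions), np.log10(measured_ratio), 1)

# Intra-lab reproducibility from replicate runs of one dilution (e.g. 15 runs)
replicates = np.array([310, 355, 342, 298, 371, 333, 348, 362, 327, 344,
                       301, 358, 339, 350, 322], dtype=float)
cv_percent = 100 * replicates.std(ddof=1) / replicates.mean()
```

A slope close to 1.0 over the 10^-1 to 10^-4 range indicates that the assay reports proportional ratios across four logs of target abundance, which is how concordance was judged between laboratories.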

Relevance:

30.00%

Publisher:

Abstract:

Validation is arguably the bottleneck in the diffusion magnetic resonance imaging (MRI) community. This paper evaluates and compares 20 algorithms for recovering the local intra-voxel fiber structure from diffusion MRI data, based on the results of the "HARDI reconstruction challenge" organized in the context of the ISBI 2012 conference. The evaluated methods encompass a mixture of classical techniques well known in the literature, such as diffusion tensor, Q-ball and diffusion spectrum imaging, algorithms inspired by the recent theory of compressed sensing, and brand new approaches proposed for the first time at this contest. To quantitatively compare the methods under controlled conditions, two datasets with known ground truth were synthetically generated, and two main criteria were used to evaluate the quality of the reconstructions in every voxel: correct assessment of the number of fiber populations and angular accuracy in their orientation. This comparative study investigates the behavior of every algorithm under varying experimental conditions and highlights the strengths and weaknesses of each approach. This information can be useful not only for enhancing current algorithms and developing the next generation of reconstruction methods, but also for assisting physicians in choosing the most adequate technique for their studies.
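The angular-accuracy criterion mentioned above is commonly computed as the angle between an estimated and a ground-truth fiber direction, treating v and -v as the same orientation since fiber directions are antipodally symmetric. The exact scoring used by the challenge is not given here; this is a minimal sketch of that standard metric.

```python
import numpy as np

def angular_error_deg(est, true):
    """Angle (degrees) between two fiber orientations, folding v and -v
    together because a fiber direction has no sign."""
    est = np.asarray(est, float)
    true = np.asarray(true, float)
    est = est / np.linalg.norm(est)
    true = true / np.linalg.norm(true)
    cos = abs(np.dot(est, true))   # abs() implements antipodal symmetry
    cos = min(cos, 1.0)            # guard against rounding slightly above 1
    return float(np.degrees(np.arccos(cos)))

# Opposite vectors describe the same fiber: error is 0 degrees
print(angular_error_deg([0, 0, 1], [0, 0, -1]))  # 0.0
```

Per-voxel scores are then aggregated over the synthetic dataset, together with whether the estimated number of fiber populations matches the ground truth.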

Relevance:

30.00%

Publisher:

Abstract:

This article describes the composition of fingermark residue as being a complex system with numerous compounds coming from different sources and evolving over time from the initial composition (corresponding to the composition right after deposition) to the aged composition (corresponding to the evolution of the initial composition over time). This complex system will additionally vary due to effects of numerous influence factors grouped in five different classes: the donor characteristics, the deposition conditions, the substrate nature, the environmental conditions and the applied enhancement techniques. The initial and aged compositions as well as the influence factors are thus considered in this article to provide a qualitative and quantitative review of all compounds identified in fingermark residue up to now. The analytical techniques used to obtain these data are also enumerated. This review highlights the fact that despite the numerous analytical processes that have already been proposed and tested to elucidate fingermark composition, advanced knowledge is still missing. Thus, there is a real need to conduct future research on the composition of fingermark residue, focusing particularly on quantitative measurements, aging kinetics and effects of influence factors. The results of future research are particularly important for advances in fingermark enhancement and dating technique developments.

Relevance:

30.00%

Publisher:

Abstract:

Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending corresponding approaches beyond the local scale still represents a major challenge, yet is critically important for the development of reliable groundwater flow and contaminant transport models. To address this issue, I have developed a hydrogeophysical data integration technique based on a two-step Bayesian sequential simulation procedure that is specifically targeted towards larger-scale problems. The objective is to simulate the distribution of a target hydraulic parameter based on spatially exhaustive, but poorly resolved, measurements of a pertinent geophysical parameter and locally highly resolved, but spatially sparse, measurements of the considered geophysical and hydraulic parameters. To this end, my algorithm links the low- and high-resolution geophysical data via a downscaling procedure before relating the downscaled regional-scale geophysical data to the high-resolution hydraulic parameter field. I first illustrate the application of this novel data integration approach to a realistic synthetic database consisting of collocated high-resolution borehole measurements of the hydraulic and electrical conductivities and spatially exhaustive, low-resolution electrical conductivity estimates obtained from electrical resistivity tomography (ERT). The overall viability of this method is tested and verified by performing and comparing flow and transport simulations through the original and simulated hydraulic conductivity fields.
The corresponding results indicate that the proposed data integration procedure does indeed allow for obtaining faithful estimates of the larger-scale hydraulic conductivity structure and reliable predictions of the transport characteristics over medium- to regional-scale distances. The approach is then applied to a corresponding field scenario consisting of collocated high-resolution measurements of the electrical conductivity, as measured using a cone penetrometer testing (CPT) system, and the hydraulic conductivity, as estimated from electromagnetic flowmeter and slug test measurements, in combination with spatially exhaustive low-resolution electrical conductivity estimates obtained from surface-based electrical resistivity tomography (ERT). The corresponding results indicate that the newly developed data integration approach is indeed capable of adequately capturing both the small-scale heterogeneity as well as the larger-scale trend of the prevailing hydraulic conductivity field. The results also indicate that this novel data integration approach is remarkably flexible and robust and hence can be expected to be applicable to a wide range of geophysical and hydrological data at all scale ranges. In the second part of my thesis, I evaluate in detail the viability of sequential geostatistical resampling as a proposal mechanism for Markov Chain Monte Carlo (MCMC) methods applied to high-dimensional geophysical and hydrological inverse problems in order to allow for a more accurate and realistic quantification of the uncertainty associated with the thus inferred models. Focusing on a series of pertinent crosshole georadar tomographic examples, I investigated two classes of geostatistical resampling strategies with regard to their ability to efficiently and accurately generate independent realizations from the Bayesian posterior distribution.
The corresponding results indicate that, despite its popularity, sequential resampling is rather inefficient at drawing independent posterior samples for realistic synthetic case studies, notably for the practically common and important scenario of pronounced spatial correlation between model parameters. To address this issue, I have developed a new gradual-deformation-based perturbation approach, which is flexible with regard to the number of model parameters as well as the perturbation strength. Compared to sequential resampling, this newly proposed approach was proven to be highly effective in decreasing the number of iterations required for drawing independent samples from the Bayesian posterior distribution.
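The core of a gradual-deformation-style proposal is to blend the current model realization with an independent draw from the prior using cos/sin weights, which preserves a Gaussian prior exactly; the perturbation strength is tuned through a single angle. The following is a minimal sketch under a standard-normal prior; the variable names and the simplified setting are illustrative, not the thesis implementation.

```python
import numpy as np

def gradual_deformation_proposal(m_current, theta, rng):
    """Blend the current Gaussian realization with an independent one.
    cos/sin weights keep the N(0, 1) prior invariant; small theta
    gives a small perturbation, theta = pi/2 an independent draw."""
    z = rng.standard_normal(m_current.shape)
    return m_current * np.cos(theta) + z * np.sin(theta)

rng = np.random.default_rng(42)
m = rng.standard_normal(1000)          # current model realization
m_new = gradual_deformation_proposal(m, theta=0.1, rng=rng)
```

Inside an MCMC loop, the proposed `m_new` would be accepted or rejected against the data likelihood; because the proposal preserves the prior, only the likelihood ratio enters the acceptance test.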

Relevance:

30.00%

Publisher:

Abstract:

Three techniques to extract parasite remains from archaeological sediments were tested. The aim was to improve the sensitivity of the paleoparasitological techniques recommended for archaeological remains. Sediment collected from the pelvic girdle of a human body found in Cabo Vírgenes, Santa Cruz, Argentina, associated with a Spanish settlement founded in 1584 and known as Nombre de Jesús, was used to search for parasites. Sediment close to the skull was used as a control. The techniques recommended by Jones, Reinhard, and Dittmar and Teejen were used and compared with the modified technique presented here, developed to improve the sensitivity of parasite detection. Positive results were obtained only with the modified technique, resulting in the finding of Trichuris trichiura eggs in the sediment.

Relevance:

30.00%

Publisher:

Abstract:

Osteoporosis is well recognized as a public health problem in industrialized countries. Because new treatments efficiently decrease fracture risk, it is of major interest to detect the patients who would benefit from such treatments. A diagnosis of osteoporosis is necessary before starting a specific treatment. This diagnosis is based on measurement of the skeleton (hip and spine) with dual X-ray absorptiometry, using diagnostic criteria established by the World Health Organisation (WHO). In Switzerland, indications for bone densitometry are limited to precise situations, and the technique cannot be applied for screening. For that purpose, peripheral measurements, and particularly quantitative ultrasound of bone, seem promising: several prospective studies have clearly shown their predictive power for hip fracture risk in women aged more than 65 years. To facilitate the clinical use of bone ultrasound, thresholds for the risk of fracture and osteoporosis of the hip will shortly be published. This will integrate bone ultrasound into a global concept including bone densitometry and its indications, but also other risk factors for osteoporosis recognized by the Swiss Association against Osteoporosis (ASCO).
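The WHO diagnostic criteria mentioned above rest on the T-score: the patient's bone mineral density (BMD) expressed in standard deviations from the young-adult reference mean, with T ≤ -2.5 defining osteoporosis and -2.5 < T < -1 defining osteopenia. A minimal sketch, with hypothetical reference values for illustration only:

```python
def t_score(bmd, young_adult_mean, young_adult_sd):
    """T-score: patient BMD in SDs relative to the young-adult reference mean."""
    return (bmd - young_adult_mean) / young_adult_sd

def who_classification(t):
    """WHO densitometric categories based on the T-score."""
    if t <= -2.5:
        return "osteoporosis"
    elif t < -1.0:
        return "osteopenia"
    return "normal"

# Hypothetical reference values (g/cm^2), not an actual DXA reference database
t = t_score(bmd=0.70, young_adult_mean=1.00, young_adult_sd=0.11)
print(round(t, 2), who_classification(t))  # -2.73 osteoporosis
```

The same T-score logic underlies the hip ultrasound thresholds discussed in the abstract, which aim to map peripheral measurements onto the densitometric risk categories.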