816 results for Gold standard.


Relevance: 60.00%

Abstract:

INTRODUCTION: Objective assessment of motor skills has become an important challenge in minimally invasive surgery (MIS) training. Currently, there is no gold standard for defining and determining residents' surgical competence. To aid in the decision process, we analyze the validity of a supervised classifier to determine the degree of MIS competence based on the assessment of psychomotor skills. METHODOLOGY: An adaptive neuro-fuzzy inference system (ANFIS) is trained to classify performance in a box trainer peg transfer task performed by two groups (expert/non-expert). There were 42 participants included in the study: the non-expert group consisted of 16 medical students and 8 residents (< 10 MIS procedures performed), whereas the expert group consisted of 14 residents (> 10 MIS procedures performed) and 4 experienced surgeons. Instrument movements were captured by means of the Endoscopic Video Analysis (EVA) tracking system. Nine motion analysis parameters (MAPs) were analyzed, including time, path length, depth, average speed, average acceleration, economy of area, economy of volume, idle time and motion smoothness. Data reduction was performed by means of principal component analysis, and the result was then used to train the ANFIS net. Performance was measured by leave-one-out cross-validation. RESULTS: The ANFIS presented an accuracy of 80.95%, with 13 experts and 21 non-experts correctly classified. Total root mean square error was 0.88, while the area under the classifier's ROC curve (AUC) was measured at 0.81. DISCUSSION: We have shown the usefulness of ANFIS for classification of MIS competence in a simple box trainer exercise. The main advantage of using ANFIS resides in its continuous output, which allows fine discrimination of surgical competence. There are, however, challenges that must be taken into account when considering the use of ANFIS (e.g. training time, architecture modeling). Despite this, we have shown the discriminative power of ANFIS for a low-difficulty box trainer task, regardless of the individual significance of each MAP. Future studies are required to confirm these findings and to include new tasks, conditions and sample populations.
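A minimal sketch of the evaluation scheme described above (PCA for data reduction, leave-one-out cross-validation), assuming the nine MAPs per participant are already arranged as a feature matrix; scikit-learn has no ANFIS implementation, so a generic classifier stands in for it here:

```python
# Sketch of the PCA + leave-one-out evaluation scheme described above.
# ANFIS is not available in scikit-learn, so a generic classifier
# (here an SVM) stands in for it; X (42 x 9 MAPs) and y are random
# placeholders for the real feature matrix and expert labels.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(42, 9))          # placeholder for the 9 MAPs
y = np.array([1] * 18 + [0] * 24)     # 18 experts, 24 non-experts

pipeline = make_pipeline(
    StandardScaler(),                 # MAPs have heterogeneous units
    PCA(n_components=0.95),           # data reduction, as in the study
    SVC(),                            # stand-in for the ANFIS classifier
)
scores = cross_val_score(pipeline, X, y, cv=LeaveOneOut())
print(f"LOO accuracy: {scores.mean():.2%}")
```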

Relevance: 60.00%

Abstract:

Introduction: Diffusion-weighted imaging (DWI) techniques are able to measure, in vivo and non-invasively, the diffusivity of water molecules inside the human brain. DWI has been applied to cerebral ischemia, brain maturation, epilepsy, multiple sclerosis, etc. [1], and these images are now widely available. DWI allows the identification of brain tissues, so accurate segmentation is a common initial step in the applications referred to above. Materials and Methods: We present a validation study on automated segmentation of DWI based on Gaussian mixture and hidden Markov random field models. This model is commonly solved with the iterated conditional modes algorithm, but some studies suggest [2] that graph-cut (GC) algorithms improve the results when the initialization is not close to the final solution. We implemented a segmentation tool integrating ITK with a GC algorithm [3], and validation software using fuzzy overlap measures [4]. Results: The segmentation accuracy of each tool is tested against a gold-standard segmentation obtained from a T1 MPRAGE magnetic resonance image of the same subject, registered to the DWI space. The proposed software shows meaningful improvements from the GC energy minimization approach on DTI and DSI (Diffusion Spectrum Imaging) data. Conclusions: Segmentation of brain tissues on DWI is a fundamental step in many applications. The proposed software achieves improvements in accuracy and robustness, with a high impact on the final result of these applications.
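A minimal sketch of the Gaussian-mixture stage of such a tissue segmentation, assuming a brain-masked baseline (b=0) volume loaded with nibabel; the hidden Markov random field prior and the graph-cut energy minimization evaluated in the study are omitted here, and the file names are hypothetical:

```python
# Gaussian-mixture tissue labeling of a diffusion baseline (b=0) volume.
# This sketch covers only the intensity-model stage; the MRF/graph-cut
# spatial regularization described above is omitted. File names are
# hypothetical.
import nibabel as nib
import numpy as np
from sklearn.mixture import GaussianMixture

img = nib.load("b0_brain_masked.nii.gz")   # hypothetical input volume
data = img.get_fdata()
voxels = data[data > 0].reshape(-1, 1)     # foreground intensities only

# Three tissue classes: CSF, gray matter, white matter.
gmm = GaussianMixture(n_components=3, random_state=0).fit(voxels)
labels = np.zeros(data.shape, dtype=np.int16)
labels[data > 0] = gmm.predict(voxels) + 1  # 0 stays background

nib.save(nib.Nifti1Image(labels, img.affine), "tissue_labels.nii.gz")
```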

Relevance: 60.00%

Abstract:

Background: Objective assessment of psychomotor skills has become an important challenge in the training of minimally invasive surgical (MIS) techniques. Currently, no gold standard defining surgical competence exists for classifying residents according to their surgical skills. Supervised classification has been proposed as a means for objectively establishing competence thresholds in psychomotor skills evaluation. This report presents a study comparing three classification methods to establish their validity in a set of tasks for basic skills assessment. Methods: Linear discriminant analysis (LDA), support vector machines (SVM), and adaptive neuro-fuzzy inference systems (ANFIS) were used. A total of 42 participants, divided into an experienced group (4 expert surgeons and 14 residents with >10 laparoscopic surgeries performed) and a nonexperienced group (16 students and 8 residents with <10 laparoscopic surgeries performed), performed three box trainer tasks validated for assessment of MIS psychomotor skills. Instrument movements were captured using the TrEndo tracking system, and nine motion analysis parameters (MAPs) were analyzed. The performance of the classifiers was measured by leave-one-out cross-validation using the scores obtained by the participants. Results: The mean accuracies of the classifiers were 71% (LDA), 78.2% (SVM), and 71.7% (ANFIS). No statistically significant differences in performance were identified between the classifiers. Conclusions: The three proposed classifiers showed good performance in the discrimination of skills, especially when information from all MAPs and tasks combined was considered. A correlation between the surgeons' previous experience and their execution of the tasks could be ascertained from the results. However, misclassifications across all the classifiers could imply the existence of other factors influencing psychomotor competence.
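A sketch of how two of the three classifiers named above could be compared under the same leave-one-out protocol (ANFIS has no scikit-learn implementation and is left out); the data arrays are random placeholders for the TrEndo MAP scores and the experienced/non-experienced labels:

```python
# Comparing LDA and SVM under leave-one-out cross-validation, as in
# the study above; ANFIS is omitted for lack of a scikit-learn
# implementation. X and y are random placeholders.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.normal(size=(42, 9))          # 42 participants x 9 MAPs
y = np.array([1] * 18 + [0] * 24)     # experienced / nonexperienced

for name, clf in [("LDA", LinearDiscriminantAnalysis()), ("SVM", SVC())]:
    acc = cross_val_score(clf, X, y, cv=LeaveOneOut()).mean()
    print(f"{name}: {acc:.1%} LOO accuracy")
```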

Relevance: 60.00%

Abstract:

The aim of automatic pathological voice detection systems is to serve as tools for medical specialists, enabling a more objective, less invasive and improved diagnosis of disease. In this respect, the gold standard for such systems includes an optimized representation of the spectral envelope for characterization, based on cepstral coefficients derived either from the mel-scaled Fourier spectral envelope (Mel-Frequency Cepstral Coefficients) or from an all-pole estimation (Linear Predictive Coding Cepstral Coefficients), together with Gaussian Mixture Models (GMMs) for the subsequent classification. However, the study of recently proposed GMM-based classifiers, as well as nuisance mitigation techniques such as those employed in speaker recognition, has not been widely considered in pathology detection tasks. The present work tests whether the use of such speaker recognition tools can improve the performance of pathology detection systems, specifically in the automatic detection of Obstructive Sleep Apnea. The testing procedure employs an Obstructive Sleep Apnea database in conjunction with GMM-based classifiers. The results show that improved performance can be obtained with this approach.
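A minimal sketch of the MFCC-plus-GMM baseline the abstract refers to, with one GMM per class and classification by average log-likelihood; the audio file names are hypothetical, and librosa is assumed for feature extraction:

```python
# MFCC + GMM baseline: one GMM per class, decision by comparing the
# average per-frame log-likelihoods. File names are hypothetical.
import librosa
import numpy as np
from sklearn.mixture import GaussianMixture

def mfcc_frames(path):
    y, sr = librosa.load(path, sr=16000)
    # Rows = frames, columns = 13 cepstral coefficients.
    return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13).T

def train_gmm(paths, n_components=16):
    frames = np.vstack([mfcc_frames(p) for p in paths])
    return GaussianMixture(n_components=n_components,
                           covariance_type="diag").fit(frames)

gmm_path = train_gmm(["apnea_01.wav", "apnea_02.wav"])      # hypothetical
gmm_ctrl = train_gmm(["control_01.wav", "control_02.wav"])  # hypothetical

def classify(path):
    x = mfcc_frames(path)
    # score() returns the mean log-likelihood over frames.
    llr = gmm_path.score(x) - gmm_ctrl.score(x)
    return "pathological" if llr > 0 else "control"
```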

Relevance: 60.00%

Abstract:

Nanotechnology is a recently developed research area that deals with the manipulation and control of matter at dimensions ranging from 1 to 100 nanometers. At the nanoscale, materials exhibit singular physical, chemical and biological phenomena, very different from those manifested at the conventional scale. In medicine, nanosized compounds and nanostructured materials offer improved drug targeting and efficacy with respect to traditional formulations, and reveal novel diagnostic and therapeutic properties. Nevertheless, the complexity of information at the nano level is much higher than at the conventional biological levels (from populations down to the cell), so any nanomedical research workflow inherently demands advanced information management. Unfortunately, Biomedical Informatics (BMI) has not yet provided the framework needed to deal with these information challenges, nor adapted its methods and tools to the new research field.
In this context, the novel area of nanoinformatics aims to build new bridges between medicine, nanotechnology and informatics, allowing the application of computational methods to solve informational issues at the wide intersection between biomedicine and nanotechnology. These observations determine the context of this doctoral dissertation, which focuses on analyzing the nanomedical domain in depth and on developing nanoinformatics strategies and tools to map across disciplines, data sources, computational resources, and information extraction and text mining techniques, with the final goal of leveraging available nanomedical data. The author analyzes, through real-life case studies, research tasks in nanomedicine that require or could benefit from nanoinformatics methods and tools, illustrating the current drawbacks and limitations of BMI approaches when dealing with data belonging to the nanomedical domain. Three scenarios, comparing the biomedical and nanomedical contexts, are discussed as examples of activities researchers perform while conducting their research: i) searching the Web for data sources and computational resources supporting their research; ii) searching the literature for experimental results and publications related to their research; and iii) searching clinical trial registries for clinical results related to their research. These activities depend on informatics tools and services such as web browsers, citation and abstract databases indexing the biomedical literature, and web-based clinical trial registries, respectively.
For each scenario, this document provides a detailed analysis of the information barriers that may hamper the different research tasks in both fields (biomedicine and nanomedicine), emphasizing the challenges in nanomedical research, where the major barriers have been found. The author illustrates how the application of BMI methodologies to these scenarios succeeds in the biomedical domain, whereas the same methodologies show severe limitations when applied to the nanomedical context. To address these limitations, the author proposes an original nanoinformatics approach specifically designed to deal with the special characteristics of information at the nano level. The approach consists of an in-depth analysis of the scientific literature and of available clinical trial registries to extract relevant information about experiments and results in nanomedicine (textual patterns, common vocabulary, experiment descriptors, characterization parameters, etc.), followed by the development of mechanisms to structure and analyze this information automatically. This analysis concluded with the generation of a gold standard (a manually annotated training and test set), which was applied to the automatic classification of clinical trial summaries, distinguishing studies focused on nanodrugs and nanodevices from those aimed at testing traditional pharmaceuticals. The present work aims to provide the methods needed to organize, curate, filter and validate existing nanomedical data on a scale suitable for decision-making. Similar analyses for other nanomedical research tasks would help to detect which nanoinformatics resources are required to meet current goals in the field, and to generate densely populated, machine-interpretable reference datasets from the literature and other unstructured sources, on which novel algorithms can be tested and new valuable information for nanomedicine inferred.
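A minimal illustration of the final classification step described above, assuming trial summaries have already been collected and manually labeled; the TF-IDF features and logistic regression model are illustrative stand-ins, not the dissertation's actual pipeline:

```python
# Illustrative classifier separating nano-focused clinical trial
# summaries from traditional-pharmaceutical ones, trained on a small
# manually annotated gold standard. All texts and labels are
# placeholders; TF-IDF + logistic regression are stand-in choices.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

summaries = [
    "Liposomal doxorubicin nanoparticle delivery in solid tumors",
    "Phase III trial of a conventional statin in hyperlipidemia",
    "Safety of a dendrimer-based nanodevice contrast agent",
    "Randomized trial of standard antihypertensive therapy",
]
labels = [1, 0, 1, 0]            # 1 = nano-focused, 0 = traditional

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                      LogisticRegression())
model.fit(summaries, labels)

print(model.predict(["Trial of a gold nanoparticle imaging agent"]))
```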

Relevance: 60.00%

Abstract:

The Web provides an immense collection of services (WS-*, RESTful, OGC WFS), normally exposed through standards that specify how to locate and invoke them. These services are usually described using mostly textual information, without proper formal descriptions; that is, existing service descriptions mostly stay at a syntactic level. To make such services easier to understand and use, they need to be annotated formally, by means of descriptive metadata. The objective of this thesis is to propose an approach for the semantic annotation of Web services in the geospatial domain. The approach automates some stages of the annotation process through the combined use of ontological resources and third-party services, and has been successfully evaluated with a set of geospatial services. The main contribution of this work is the partial automation of the semantic annotation process for RESTful and OGC WFS services, which improves the state of the art in this area. The detailed contributions are:
• A model for representing Web services from both a syntactic and a semantic point of view, taking into account the schema and its instances.
• A method for annotating Web services using ontologies and external resources.
• A system that implements the proposed annotation process.
• A gold standard for the semantic annotation of RESTful and OGC WFS services, and algorithms for evaluating the annotations.
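One way to implement the evaluation of annotations against the gold standard mentioned in the last contribution is set-based precision and recall per service; a small self-contained sketch, with hypothetical ontology concept identifiers:

```python
# Sketch of one way to score automatic service annotations against a
# gold standard: set-based precision/recall/F1 per service. The
# annotation sets below are hypothetical ontology concept identifiers.
def score(predicted: set, gold: set) -> tuple:
    tp = len(predicted & gold)
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

gold = {"geo:Feature", "geo:Point", "dc:title"}          # hypothetical
predicted = {"geo:Feature", "dc:title", "foaf:name"}     # hypothetical
print(score(predicted, gold))   # precision, recall, F1 all 2/3 here
```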

Relevance: 60.00%

Abstract:

Nanotechnology is a research area, often framed as a technological goal, that deals with the precise manipulation and control of matter at dimensions between 1 and 100 nanometers. The prefix nano comes from the Greek νᾶνος, meaning dwarf, and denotes a factor of 10^-9, which applied to units of length corresponds to one billionth of a meter. This science makes it possible to work with molecular structures and their atoms, obtaining materials that exhibit physical, chemical and biological phenomena very different from those shown by the same materials at larger scales. In medicine, for example, nanometric compounds and nanostructured materials often offer greater efficacy than traditional chemical formulations: combining old compounds with these new ones has produced new therapies, and in some cases replaced the old ones, revealing new diagnostic and therapeutic properties. At the same time, the complexity of information at the nano level is much higher than at conventional biological levels, so any workflow in nanomedicine inherently requires advanced information management strategies. Many nanotechnology researchers are looking for ways to obtain information about these nanometric materials in order to improve their studies, which often leads to testing these methods or creating new compounds to help modern medicine fight powerful diseases such as cancer. Yet it is currently very difficult to find a tool that provides the specific information they seek among the thousands of clinical trials uploaded to the web every day. Biomedical informatics is trying to provide the framework to deal with these information challenges at the nano level; in this context, the new area of nanoinformatics aims to detect and establish the links between medicine, nanotechnology and informatics, promoting the application of computational methods to solve the questions and problems that arise from information in the wide intersection between biomedicine and nanotechnology. Another present-day case is that many biomedical researchers want to compare the information contained in nanotechnology-related clinical trials registered on different websites around the world, separating trials registered in North America from those registered in Europe, to learn whether this field is really being exploited on both continents. The problem is that no tool exists to estimate what share of the trials registered on these sites such studies represent. In this master's thesis, the author applies improved text pre-processing and an algorithm identified as the best-performing text processor in a previous doctoral thesis, which tested many such methods, to obtain a close estimate that helps differentiate when a clinical trial contains information about nanotechnology and when it does not.
In other words, the work applies an analysis of the scientific literature and of the clinical trial registries available on the two continents to extract relevant information about experiments and results in nanomedicine (textual patterns, common vocabulary, experiment descriptors, characterization parameters, etc.), followed by a processing mechanism to structure and analyze that information automatically. The analysis concludes with the estimate mentioned above, needed to compare the number of nanotechnology studies on the two continents. A gold standard (a manually annotated training set) is used, while the test set is the entire database of these clinical trial registries, allowing studies focused on nanodrugs, nanodevices and nanomethods to be automatically distinguished from those aimed at testing traditional pharmaceutical products.
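A toy sketch of the estimation step described above: classify every registered trial and report the share flagged as nanotechnology-related per registry. A trivial keyword matcher stands in for the trained classifier, and the registry contents are hypothetical placeholders:

```python
# Prevalence estimation per registry: classify each trial summary and
# report the share flagged as nano-related. A keyword matcher stands
# in for the trained classifier; all data are placeholders.
NANO_TERMS = ("nanoparticle", "nanodevice", "liposomal", "nanomaterial")

def is_nano(summary: str) -> bool:
    text = summary.lower()
    return any(term in text for term in NANO_TERMS)

registries = {
    "ClinicalTrials.gov (North America)": [
        "Liposomal doxorubicin in breast cancer ...",
        "Conventional statin therapy follow-up ...",
    ],
    "EU Clinical Trials Register (Europe)": [
        "Gold nanoparticle contrast agent safety study ...",
    ],
}

for name, summaries in registries.items():
    flagged = sum(is_nano(s) for s in summaries)
    print(f"{name}: {flagged}/{len(summaries)} trials flagged as nano")
```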

Relevance: 60.00%

Abstract:

The structural connectivity of the brain is considered to encode species-wise and subject-wise patterns that will unlock large areas of understanding of the human brain. Currently, diffusion MRI of the living brain enables mapping the microstructure of tissue, allowing the pathways of fiber bundles connecting cortical regions to be tracked across the brain. These bundles are summarized in a network representation called the connectome, which is analyzed using graph theory. The extraction of the connectome from diffusion MRI requires a large processing flow including image enhancement, reconstruction, segmentation, registration, diffusion tracking, etc. Although a concerted effort has been devoted to the definition of standard pipelines for connectome extraction, it is still crucial to define quality assessment protocols for these workflows. The definition of quality control protocols is hindered by the complexity of the pipelines under test and the absolute lack of gold standards for diffusion MRI data. Here we characterize the impact on structural connectivity workflows of the geometrical deformation typically shown by diffusion MRI data due to the inhomogeneity of magnetic susceptibility across the imaged object. We propose an evaluation framework, including whole-brain realistic phantoms, to compare the existing methodologies for correcting these artifacts. Additionally, we design and implement an image segmentation and registration method that avoids the correction task altogether and enables processing in the native space of the diffusion data. We release PySDCev, an evaluation framework for the quality control of connectivity pipelines, specialized in the study of susceptibility-derived distortions. In this context, we propose Diffantom, a whole-brain phantom that provides a solution to the lack of gold-standard data. The three correction methodologies under comparison performed reasonably well, and it is difficult to determine which method is more advisable. We demonstrate that susceptibility-derived correction is necessary to increase the sensitivity of connectivity pipelines, at the cost of specificity. Finally, with the registration and segmentation tool called regseg, we demonstrate how the problem of susceptibility-derived distortion can be overcome, allowing data to be used in their original coordinates. This is crucial to increase the sensitivity of the whole pipeline without any loss in specificity.
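As a small illustration of the graph-theoretical analysis mentioned above, the sketch below builds a weighted graph from a randomly generated, hence purely illustrative, connectivity matrix and computes a few common network metrics with networkx:

```python
# Minimal sketch of graph-theoretical connectome analysis: build a
# weighted, undirected graph from a (random, illustrative) cortical
# connectivity matrix and compute summary network metrics.
import networkx as nx
import numpy as np

rng = np.random.default_rng(0)
n_regions = 8                            # placeholder cortical parcels
weights = rng.random((n_regions, n_regions))
matrix = np.triu(weights, k=1)           # keep one triangle,
matrix += matrix.T                       # then symmetrize

G = nx.from_numpy_array(matrix)          # edge weights from the matrix
print("density:", nx.density(G))
print("global efficiency:", nx.global_efficiency(G))
print("mean clustering:", nx.average_clustering(G, weight="weight"))
```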

Relevance: 60.00%

Abstract:

Toxoplasma gondii is a coccidian parasite with a global distribution. The definitive host is the cat (and other felids), while all warm-blooded animals, including humans, can act as intermediate hosts. Sexual reproduction (gametogony) takes place in the final host, and oocysts are released into the environment, where they sporulate to become infective. In intermediate hosts the cycle is extra-intestinal and results in the formation of tachyzoites and bradyzoites. The tachyzoite is the invasive and proliferative stage; on entering a cell, it multiplies asexually by endodyogeny. Bradyzoites within tissue cysts are the latent form. T. gondii is a food-borne parasite causing toxoplasmosis, which can occur in both animals and humans. Infection in humans is asymptomatic in more than 80% of cases in Europe and North America. In the remaining cases, patients present with fever, cervical lymphadenopathy and other non-specific clinical signs. Nevertheless, toxoplasmosis is life-threatening in immunocompromised subjects. The main organs involved are the brain (toxoplasmic encephalitis), heart (myocarditis), lungs (pulmonary toxoplasmosis), eyes and pancreas, and the parasite can be isolated from these tissues. Another aspect is congenital toxoplasmosis, which may occur in pregnant women; the severity of the consequences depends on the stage of pregnancy at which maternal infection occurs. Acute toxoplasmosis in the developing foetus may result in blindness, deformation, mental retardation or even death. The European Food Safety Authority (EFSA), in recent reports on zoonoses, highlighted that increasing numbers of animals in the EU are infected with T. gondii (as reported by the European Member States for pigs, sheep, goats, hunted wild boar and hunted deer in 2011 and 2012). In addition, high prevalence values have been detected in cats, cattle and dogs, as well as in several other animal species, indicating the wide distribution of the parasite among different domestic and wildlife species. The main route of transmission is the consumption of food and water contaminated with sporulated oocysts, but infection through the ingestion of meat contaminated with tissue cysts is also frequent. Finally, although less frequently, other food products contaminated with tachyzoites, such as milk, may also pose a risk. The importance of this parasite as a risk for human health was recently highlighted by EFSA's opinion on the modernization of meat inspection, where Toxoplasma gondii was identified as a relevant hazard to be addressed in revised meat inspection systems for pigs, sheep, goats, farmed wild boar and farmed deer (Call for proposals GP/EFSA/BIOHAZ/2013/01). The risk of infection is more highly associated with animals reared outdoors, including on free-range or organic farms, where biohazard measures are less strict than on large-scale industrial farms, in which animals are kept under strict biosecurity measures, including barriers that prevent access by cats, making soil contamination by oocysts nearly impossible. Growing consumer demand for organic products from free-range livestock, out of concern for animal welfare and for the quality of derived products, has led to an increase in free-range farming.
The risk of Toxoplasma gondii infection increases when animals have access to the environment; this, together with the absence of data in Italy and the need for an in-depth study of both the prevalence and the genotypes of Toxoplasma gondii present in our country, was the main motivation for this thesis project. A total of 152 animals were analyzed, including 21 free-range pigs (Suino Nero breed), 24 transhumant Cornigliese sheep, 77 free-range chickens and 21 wild animals. Serology (on meat juice) and identification of T. gondii DNA by PCR were performed on all samples, except for wild animals (no serology). An in vitro test was also applied with the aim of finding a valid alternative to the bioassay, currently the gold standard. Meat samples were digested and seeded onto Vero cells, checked every day, and an RT-PCR protocol was used to detect any increase in the amount of DNA, demonstrating the viability of the parasite. Several samples were also genetically characterized using a PCR-RFLP protocol to define the major genotypes circulating in the geographical area studied. Within the context of a project promoted by the Istituto Zooprofilattico of Pavia and Brescia (Italy), experimentally infected pigs were also analyzed. One of the aims was to verify whether the production process of cured "Prosciutto di Parma" is able to kill the parasite. Our contribution included the digestion and seeding of homogenates on Vero cells and the application of an ELISA test on meat juice. This thesis project has highlighted the widespread presence of T. gondii in the geographical area considered. Pigs, sheep and chickens showed high seroprevalences of 95.2%, 70.8% and 36.4%, respectively, indicating the spread of the parasite among numerous animal species. For wild animals, the average rate of parasite infection determined by PCR was 44.8%. Meat juice serology appears to be a very useful, rapid and sensitive method for screening carcasses at the slaughterhouse and for marketing "Toxo-free" meat. The results obtained on fresh pork meat (derived from experimentally infected pigs) before slaughter (on serum) and after slaughter (on meat juice) showed good concordance. Free-range farming was shown to pose a marked risk for meat-producing animals and, as a consequence, for the consumer. Genotyping revealed the circulation of Type II strains and, in a lower percentage, Type III. The Type II profile predominates in pigs, while Type III and mixed profiles (mainly Type II/III) are more common in wildlife. The mixed genotypes (Type II/III) could be explained by the presence of mixed infections. Free-range farming and contact with wildlife could facilitate the spread of the parasite and the generation of new and atypical strains, with unknown consequences for human health. The curing process employed in this study appears to produce hams that do not pose a serious concern to human health and that could therefore be marketed and consumed without significant health risk. Little is known about the diffusion and genotypes of T. gondii in wild animals; further studies on the way in which new and mixed genotypes may be introduced into the domestic cycle would be very interesting, including with NGS techniques, which are more rapid and sensitive than PCR-RFLP. Furthermore, wildlife can become a valuable indicator of environmental contamination with T. gondii oocysts.
Other future perspectives include, for pigs, expanding the number of free-range animals and farms studied and, for Cornigliese sheep, evaluating other food products such as raw milk and cheeses. It would be interesting to proceed with the validation of an ELISA test for infection in chickens, using both serum and meat juice on a larger number of animals; the same should be done for wildlife (at the moment no ELISA tests are available, and MAT is the reference method for these species). The results related to Parma ham do not suggest a concerning risk for consumers. However, further studies are needed to complete the risk assessment and to analyze other products cured using technological processes other than those investigated in the present study. For example, it could be interesting to analyze products such as salami, which is produced with pork all over Italy using very different recipes, including in domestic and rural contexts, and is characterized by a very short curing period (1 to 6 months). Toxoplasma gondii is one of the most widespread food-borne parasites globally. Public health safety, improved animal production and the protection of endangered livestock species are all important goals of research into reliable diagnostic tools for this infection. Future studies on the epidemiology, survival and genotypes of T. gondii in meat-producing animals should continue to be a research priority.

Relevance: 60.00%

Abstract:

INTRODUCTION: Ostium secundum atrial septal defect is a congenital heart defect characterized by partial or total deficiency of the flap of the fossa ovalis, also called the septum primum. It accounts for 10 to 12% of all congenital heart diseases and is the most frequent one in adulthood. Percutaneous occlusion is currently the treatment of choice in most large centers worldwide for defects with anatomical characteristics favorable to device implantation. Two-dimensional transesophageal echocardiography with color flow mapping is considered the gold-standard tool for anatomical evaluation and for monitoring during the procedure, and is crucial for optimal device selection. To this end, a sizing balloon is introduced and inflated across the defect so as to occlude it temporarily. The waist visualized on the balloon (the stretched diameter) is used as the reference for choosing the device size. Recently, real-time three-dimensional transesophageal echocardiography has been used in this type of percutaneous intervention. In this study we evaluated its role in optimal device selection, taking into account the dimensions and geometry of the defect and the thickness of the rims of the interatrial septum. METHOD: Observational, prospective, non-randomized, single-arm study of a cohort of 33 adult patients with atrial septal defects undergoing percutaneous closure with a self-centering nitinol device (Cera®, Lifetech Scientific, Shenzhen, China). The largest and smallest diameters of the defect, its area, and the stretched diameter measured with the sizing balloon were analyzed with both echocardiographic modalities. Defects were classified as elliptical or circular according to their geometry, and the rims around the defect were classified as thick (>2 mm) or thin. The selected device was equal to, or up to 2 mm larger than, the stretched diameter on two-dimensional transesophageal echocardiography (the gold standard). In an attempt to identify a variable that could replace the balloon stretched diameter for optimal device selection, a series of linear correlations was performed. RESULTS: Mean age and weight were 42.1 ± 14.9 years and 66.0 ± 9.4 kg, respectively; 22 patients were female. There were no statistical differences between the largest and smallest diameters, or in the stretched diameter of the defects, as determined by the two echocardiographic modalities. The correlation between the measurements obtained with the two methods was excellent (r > 0.90). The largest defect diameter obtained on three-dimensional transesophageal echocardiography was the variable best correlated with the size of the selected device in the group as a whole (r = 0.89) and, especially, in the subgroups with elliptical geometry (r = 0.96) and with thick rims around the defect (r = 0.96). CONCLUSION: In this study of adults with ostium secundum atrial septal defects undergoing percutaneous occlusion with the Cera® device, optimal device selection could be accomplished using only the largest defect measurement obtained on real-time three-dimensional transesophageal echocardiography, especially in patients with elliptical defects and thick rims.
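A minimal sketch of the kind of linear correlation reported above (largest 3D TEE defect diameter versus selected device size), with hypothetical measurement arrays:

```python
# Pearson correlation between the largest defect diameter on 3D TEE
# and the implanted device size. The arrays are hypothetical
# placeholders, not study data.
import numpy as np
from scipy.stats import pearsonr

largest_diameter_3d = np.array([18.0, 22.5, 15.0, 27.0, 20.0])  # mm
device_size = np.array([20, 24, 16, 28, 22])                    # mm

r, p_value = pearsonr(largest_diameter_3d, device_size)
print(f"r = {r:.2f}, p = {p_value:.4f}")
```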

Relevance: 60.00%

Abstract:

The gingival biotype, defined as the thickness of the gingiva in the buccolingual direction, plays an important role in the homeostasis of periodontal tissues and can be considered a predictor of the long-term success of periodontal and peri-implant therapies. It is therefore essential to know the dimensions of the gingival tissue and the different ways of qualifying and, above all, quantifying it. Although numerous methods for this purpose have been described in the literature, few studies compare the effectiveness of one method against another. This study therefore evaluated whether clinical and tomographic evaluations agree in classifying the gingival biotype and whether the gingival biotype correlates with the thickness of the underlying bone, and it describes a new tomographic acquisition method that allows quantitative analysis of the gingival biotype. Twelve individuals who underwent cone-beam computed tomography for diagnostic imaging or pre-surgical planning were evaluated. In each patient, four different qualitative methods of assessing gingival thickness (probe transparency, transgingival, visual assessment from photographs, and tomographic), two quantitative methods (transgingival and tomographic), and an evaluation of bone thickness by cone-beam computed tomography were performed. The results were evaluated statistically using the Kappa test, the paired t-test and Pearson's correlation coefficient (p ≤ 0.05). The new tomographic acquisition method described in this study is effective for evaluating the gingival biotype, with high agreement (86.1%, Kappa 0.51) and a strong correlation (r = 0.824) with the transgingival method (the gold standard). The correlation between bone thickness and gingival thickness was moderate with both the transgingival and the tomographic methods (r = 0.567 and r = 0.653, respectively).
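A small sketch of the agreement statistic used above: Cohen's kappa between two methods classifying the biotype as thick or thin, with hypothetical classification vectors:

```python
# Cohen's kappa: chance-corrected agreement between two methods
# classifying the gingival biotype. The vectors are hypothetical.
from sklearn.metrics import cohen_kappa_score

transgingival = ["thick", "thin", "thick", "thick", "thin", "thin"]
tomographic   = ["thick", "thin", "thick", "thin",  "thin", "thin"]

kappa = cohen_kappa_score(transgingival, tomographic)
print(f"Cohen's kappa: {kappa:.2f}")
```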

Relevance: 60.00%

Abstract:

The purpose of this paper is to review studies on cystoid macular edema published in the last seven years. Cystoid macular edema is a major cause of loss of visual acuity. It is the final common pathway of many diseases and can be caused by numerous processes, including inflammation, vascular disease, adverse drug reactions, retinal dystrophies and intraocular tumors. These processes disrupt the blood-retinal barrier, with fluid extravasation into the macular parenchyma. Imaging tests are essential for both the detection and the monitoring of this pathology. Fluorescein angiography and autofluorescence show the leakage of fluid from the perifoveal vessels into the tissue, where it forms cystic spaces. Optical coherence tomography is currently the gold-standard technique for diagnosis and monitoring: it allows objective measurement of retinal thickness, which correlates with visual acuity, and provides more complete morphological information. Depending on the underlying etiology, the therapeutic approach can be either surgical or medical, with anti-inflammatory drugs. We found that disruption of the blood-retinal barrier, whatever its cause, is the key point in the pathogenesis of cystoid macular edema; we therefore believe that studies on its treatment should proceed along this path.

Relevance: 60.00%

Abstract:

Introduction: The histological diagnosis of biliary strictures is fundamental in defining the therapy to be employed. Given the heterogeneity of the results of studies comparing brush cytology and transpapillary biopsy during endoscopic retrograde cholangiopancreatography (ERCP) with endoscopic ultrasound-guided fine-needle aspiration (EUS-FNA) for the histological diagnosis of malignant biliary stricture, and the absence of systematic reviews and meta-analyses comparing these methods, this study set out to compare the two methods through a systematic review and meta-analysis of the literature. Methods: The Medline, Embase, Cochrane, LILACS, CINAHL and Scopus electronic databases were searched for studies dated before November 2014. From a total of 1009 published studies, three prospective studies comparing EUS-FNA and ERCP for the histological diagnosis of malignant biliary stricture and five cross-sectional studies comparing EUS-FNA against the same gold standard as the three comparative studies were selected. All patients were submitted to the same gold standard. The study variables (prevalence, sensitivity, specificity, positive and negative predictive values, and accuracy) were calculated, and the meta-analysis was performed using the Rev Man 5 and Meta-DiSc 1.4 software packages. Results: A total of 294 patients were included in the analysis. The pre-test probability of malignant biliary stricture was 76.66%. The mean sensitivities of ERCP and EUS-FNA for the histological diagnosis of malignant biliary stricture were 49% and 76.5%, respectively, and the specificities were 96.33% and 100%. The post-test probabilities were also determined: positive predictive values of 98.33% and 100%, and negative predictive values of 34% and 58.87%, respectively. The accuracies were 60.66% and 82.25%. Conclusion: EUS-FNA is superior to ERCP with brush cytology and/or transpapillary biopsy for the histological diagnosis of malignant biliary stricture. However, a negative histological sample from either EUS-FNA or ERCP cannot exclude malignant biliary stricture, since both tests have low negative predictive values.
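A worked sketch showing how the post-test probabilities above follow from prevalence, sensitivity and specificity; the inputs are the pooled EUS-FNA figures quoted in the abstract, and small differences from the reported predictive values are expected because the meta-analysis pooled per-study estimates rather than applying these formulas once:

```python
# Predictive values from prevalence, sensitivity and specificity,
# using the pooled EUS-FNA figures quoted above. Minor differences
# from the reported values are expected, since the meta-analysis
# pooled per-study estimates rather than applying these formulas once.
prevalence = 0.7666
sensitivity = 0.765     # EUS-FNA
specificity = 1.00

tp = sensitivity * prevalence
fn = (1 - sensitivity) * prevalence
tn = specificity * (1 - prevalence)
fp = (1 - specificity) * (1 - prevalence)

ppv = tp / (tp + fp)            # 1.00 when specificity is 100%
npv = tn / (tn + fn)            # ~0.56 with these pooled numbers
accuracy = tp + tn              # ~0.82, matching the reported 82.25%
print(f"PPV={ppv:.2%}  NPV={npv:.2%}  accuracy={accuracy:.2%}")
```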

Relevance: 60.00%

Abstract:

Primary Sjögren's syndrome (pSS) is a chronic systemic autoimmune disease that can lead to hyposalivation and negatively affect the oral environment. The aims of this study were to detect the influence of pSS on the levels of inflammatory biomarkers in saliva and gingival crevicular fluid samples from patients with chronic periodontitis, and to evaluate the effect of non-surgical periodontal treatment on the clinical index of systemic disease activity and on the patient-reported index in pSS patients. Gingival crevicular fluid and saliva samples, together with the periodontal clinical parameters probing depth (PD), clinical attachment level (CAL), bleeding on probing (BOP) and plaque index (PI), were collected at baseline and 45 days after non-surgical periodontal therapy from systemically healthy patients with chronic periodontitis (CP, n = 7) and from pSS patients with chronic periodontitis (SP, n = 7). Periodontally healthy pSS patients (SC, n = 7) and periodontally and systemically healthy controls (C, n = 7) were also evaluated at baseline. The C, CP and SC groups were matched to the SP group for gender, age and socioeconomic status. The levels of interleukin-8 (IL-8), IL-10 and IL-1ß were assessed by multiplex assay. Disease activity was measured using the gold standard in the literature, the EULAR Sjögren's Syndrome Disease Activity Index (ESSDAI), while patient-reported symptoms were assessed with the EULAR Sjögren's Syndrome Patient Reported Index (ESSPRI). The clinical parameters improved after periodontal therapy (p < 0.05); however, CAL in pSS patients did not improve significantly (p > 0.05). There was an increase in IL-1ß and IL-8 levels and a decrease in IL-10 levels in the saliva samples of the SC group compared with the C group (p < 0.05). In gingival crevicular fluid, the SC group had higher IL-1ß levels than the C group (p < 0.05). In addition, non-surgical periodontal treatment resulted in an increase in IL-10 levels in the gingival crevicular fluid of the SP and CP groups relative to baseline (p < 0.05). Salivary flow increased significantly after periodontal treatment only in the SP group (p = 0.039). Furthermore, periodontal treatment did not influence the ESSDAI score (p = 0.35) and led to a significant decrease in the ESSPRI score (p = 0.03). These data demonstrate that pSS shifts the salivary and gingival crevicular fluid levels of inflammatory biomarkers toward a pro-inflammatory profile; however, this profile does not seem to increase the susceptibility of pSS individuals to periodontal destruction. The data also demonstrate that non-surgical periodontal treatment has a positive impact on salivary flow and on the ESSPRI score of pSS patients, suggesting that periodontal treatment can improve the quality of life of individuals with pSS.

Relevance: 60.00%

Abstract:

Brachial plexus injury is considered the most severe nerve injury of the extremities. Its main cause is high-energy trauma, especially accidents involving motor vehicles, which is why traumatic brachial plexus injuries are increasingly frequent. The present study evaluated the accuracy of magnetic resonance imaging (MRI) in the diagnosis of traumatic brachial plexus injuries in adults, using the intraoperative findings as the gold standard. The accuracy of diffusion-weighted neurography (DW neurography) relative to conventional MRI was also evaluated, as was the ability to differentiate the three types of injury: avulsion, rupture and lesion in continuity. Thirty-three patients with a history and clinical diagnosis of traumatic brachial plexus injury were prospectively studied by MRI. The findings obtained by MRI with and without DW neurography, together with the clinical examination findings, were compared with the intraoperative findings. Statistical analysis was performed at the 5% significance level. There was a high correlation between MRI with DW neurography and surgery (rs = 0.79) and a low correlation between conventional MRI and surgery (rs = 0.41). Interobserver correlation was higher for MRI with DW neurography (rs = 0.94) than for MRI without DW neurography (rs = 0.75). Sensitivity, accuracy and positive predictive value were above 95% for MRI both with and without DW neurography in the study of the whole plexus. Specificities were, in general, higher for DW neurography (p < 0.05). Regarding the differentiation of injury types, MRI with DW neurography showed high accuracy and sensitivity in the diagnosis of avulsion/rupture and high specificity in the diagnosis of lesion in continuity. The accuracy of MRI (93.9%) was significantly higher than that of the clinical examination (76.5%) in the diagnosis of injuries of the whole brachial plexus (p < 0.05).