Abstract:
Detecting lame cows is important for improving animal welfare. Automated tools are potentially useful for identifying and monitoring lame cows. The goals of this study were to evaluate the suitability of various physiological and behavioral parameters for automatically detecting lameness in dairy cows housed in a cubicle barn. Lame cows suffering from a claw horn lesion (sole ulcer or white line disease) of one claw of the same hind limb (n=32; group L) and 10 nonlame healthy cows (group C) were included in this study. Lying and standing behavior at night (tridimensional accelerometers), weight distribution between hind limbs (4-scale weighing platform), feeding behavior at night (nose-band sensor), and heart activity (Polar device; Polar Electro Oy, Kempele, Finland) were assessed. Either the entire data set or parts of the data collected over a 48-h period were used for statistical analysis, depending upon the parameter in question. The standing time at night over 12 h and the limb weight ratio (LWR) were significantly higher in group C than in group L, whereas the lying time at night over 12 h, the mean limb difference (Δweight), and the standard deviation (SD) of the weight applied on the limb taking less weight were significantly lower in group C than in group L. No significant difference was noted between the groups for the parameters of heart activity and feeding behavior at night. The locomotion score of cows in group L was positively correlated with the lying time and Δweight, whereas it was negatively correlated with LWR and SD. The highest sensitivity (0.97) for lameness detection was found for the parameter SD [specificity of 0.80 and an area under the curve (AUC) of 0.84]. The highest specificity (0.90) for lameness detection was present for Δweight (sensitivity=0.78; AUC=0.88) and LWR (sensitivity=0.81; AUC=0.87).
The model considering the data of SD together with lying time at night was the best predictor of cows being lame, accounting for 40% of the variation in the likelihood of a cow being lame (sensitivity=0.94; specificity=0.80; AUC=0.86). In conclusion, the data derived from the 4-scale weighing platform, either alone or combined with the lying time at night over 12 h, represent the most valuable parameters for automated identification of lame cows suffering from a claw horn lesion of one individual hind limb.
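To illustrate how detection metrics like those above are derived, the sketch below computes sensitivity and specificity at a cutoff, and AUC via the Mann-Whitney interpretation (the probability that a randomly chosen lame cow scores higher than a healthy one). All values and the cutoff are hypothetical, not the study's data.

```python
# Hypothetical SD-of-weight readings (kg) for lame and healthy cows;
# illustrative only, not the study's measurements.
lame = [12.4, 9.8, 15.1, 11.0, 13.7, 7.5]
nonlame = [4.2, 6.1, 3.8, 10.0, 5.5]

cutoff = 8.0  # hypothetical decision threshold

tp = sum(x > cutoff for x in lame)       # lame cows correctly flagged
fn = len(lame) - tp
tn = sum(x <= cutoff for x in nonlame)   # healthy cows correctly passed
fp = len(nonlame) - tn

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)

# AUC = P(score_lame > score_healthy); ties count as 0.5
pairs = [(l, h) for l in lame for h in nonlame]
auc = sum(1.0 if l > h else 0.5 if l == h else 0.0 for l, h in pairs) / len(pairs)

print(sensitivity, specificity, auc)
```

Sweeping the cutoff and recomputing sensitivity/specificity at each value traces out the ROC curve whose area the pairwise formula computes directly.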
Abstract:
Infection with certain types of HPV is a necessary event in the development of cervical carcinoma; however, not all women who become infected will progress. While much is known about the molecular influence of the HPV E6 and E7 proteins on malignant transformation, little is known about the additional factors needed to drive the process. Currently, conventional cervical screening is insufficient at identifying women who are likely to progress from premalignant lesions to carcinoma. Aneuploidy and chromatin texture from image cytometry have been suggested as quantitative measures of nuclear damage in premalignant lesions and cancer, and traditional epidemiologic studies have identified potential factors to aid in the discrimination of those lesions likely to progress. In the current study, real-time PCR was used to quantitate mRNA expression of the E7 gene in women exhibiting normal epithelium, LSIL, and HSIL. Quantitative cytometry was used to gather information about the DNA index and chromatin features of cells from the same women. Logistic regression modeling was used to establish predictor variables for histologic grade based on the traditional epidemiologic risk factors and molecular markers. Prevalence of mRNA transcripts was lower among women with normal histology (27%) than for women with LSIL (40%) and HSIL (37%), with mean levels ranging from 2.0 to 4.2. The transcriptional activity of HPV 18 was higher than that of HPV 16 and increased with increasing level of dysplasia, reinforcing the more aggressive nature of HPV 18. DNA index and mRNA level increased with increasing histological grade. Chromatin score was not correlated with histology but was higher for HPV 18 samples and those with both HPV 18 and HPV 16. However, chromatin score and DNA index were not correlated with mRNA levels.
The most predictive variables in the regression modeling were mRNA level, DNA index, parity, and age, and the ROC curves for LSIL and HSIL indicated excellent discrimination. Real-time PCR of viral transcripts could provide a more efficient method to analyze the oncogenic potential within cells from cervical swabs. Epidemiological modeling of malignant progression in the cervix should include molecular markers, as well as the traditional epidemiological risk factors.
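The abstract does not state how Ct values were converted into mRNA levels; a common approach for real-time PCR quantitation is the 2^-ΔΔCt method of Livak & Schmittgen, sketched below. All Ct values and the reference gene (GAPDH) are hypothetical.

```python
# Sketch of the 2^-DDCt relative-quantitation method (Livak & Schmittgen),
# a common way to derive mRNA expression levels from real-time PCR Ct values.
# All Ct values below are hypothetical; the study's own method may differ.

def relative_expression(ct_target, ct_ref, ct_target_cal, ct_ref_cal):
    """Fold change of target mRNA versus a calibrator sample,
    normalized to a reference (housekeeping) gene."""
    d_ct_sample = ct_target - ct_ref       # normalize the sample
    d_ct_cal = ct_target_cal - ct_ref_cal  # normalize the calibrator
    dd_ct = d_ct_sample - d_ct_cal
    return 2 ** (-dd_ct)

# Hypothetical: E7 Ct 24.0 vs GAPDH Ct 18.0 in an HSIL sample;
# E7 Ct 28.0 vs GAPDH Ct 18.5 in a normal-epithelium calibrator.
fold = relative_expression(24.0, 18.0, 28.0, 18.5)
print(round(fold, 2))
```

The fold change is relative: it expresses how much more E7 transcript the sample carries than the calibrator, after correcting both for input amount via the housekeeping gene.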
Abstract:
Background. Large field studies of travelers' diarrhea (TD) in multiple destinations are limited by the need to perform stool cultures on site in a timely manner. A method for the collection, transport, and storage of fecal specimens that does not require immediate processing or refrigeration, and that is stable for months, would be advantageous. Objectives. Determine whether enteric pathogen bacterial DNA can be identified in cards routinely used for evaluation of fecal occult blood. Methods. U.S. students traveling to Mexico in 2005-07 were followed for occurrence of diarrheal illness. When ill, students provided a stool specimen for culture and occult blood testing by the standard method. Cards were then stored at room temperature prior to DNA extraction. A multiplex fecal PCR was performed to identify enterotoxigenic Escherichia coli (ETEC) and enteroaggregative E. coli (EAEC) in DNA extracted from stools and occult blood cards. Results. Significantly more EAEC cases were identified by PCR of DNA extracted from cards (49%) or from frozen feces (40%) than by culture followed by HEp-2 adherence assays (13%). Similarly, more ETEC cases were detected in card DNA (38%) than in fecal DNA (30%) or by culture followed by hybridization (10%). Sensitivity and specificity of the card test were 75% and 62% compared with EAEC culture, 50% and 63% compared with ETEC culture, 53% and 51% compared with EAEC multiplex fecal PCR, and 56% and 70% compared with ETEC multiplex fecal PCR. Conclusions. DNA extracted from fecal cards used for detection of occult blood is useful for detecting enteric pathogens.
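Head-to-head figures like "75% sensitivity, 62% specificity versus EAEC culture" come from a 2x2 table crossing the card test against the reference method. The counts below are hypothetical, chosen only to reproduce those two percentages and show the arithmetic.

```python
# 2x2 table of card-PCR result vs reference culture result.
# Counts are hypothetical, not the study's data.
#                    culture +   culture -
card_positive = {"ref_pos": 9, "ref_neg": 19}
card_negative = {"ref_pos": 3, "ref_neg": 31}

tp = card_positive["ref_pos"]  # card+ and culture+
fp = card_positive["ref_neg"]  # card+ but culture-
fn = card_negative["ref_pos"]  # card- but culture+
tn = card_negative["ref_neg"]  # card- and culture-

sensitivity = tp / (tp + fn)   # fraction of culture-positives the card finds
specificity = tn / (tn + fp)   # fraction of culture-negatives the card clears

print(sensitivity, specificity)
```

Note that when the new test detects more true cases than the imperfect reference (as PCR did versus culture here), "false positives" against that reference depress the apparent specificity even if the extra detections are real.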
Abstract:
The hypothesis tested was that rapid rejection of Trichinella spiralis infective larvae from immunized rats following a challenge infection is associated with a local anaphylactic reaction, and that this response should be reflected in altered small intestinal motility. The objective was to determine whether altered gut smooth muscle function accompanies worm rejection, based on the assumption that anaphylaxis in vivo could be detected by changes in intestinal smooth muscle contractile activity (i.e., an equivalent of the Schultz-Dale reaction, or in vitro anaphylaxis). The aims were to (1) characterize motility changes by monitoring intestinal myoelectric activity in conscious rats during the enteric phase of T. spiralis infection in immunized hosts, (2) detect the onset and magnitude of myoelectric changes caused by challenge infection in immunized rats, (3) determine the parasite stimulus causing the changes, and (4) determine the specificity of the host response to stimulation. Electrical slow wave frequency, spiking activity, normal interdigestive migrating myoelectric complexes, and abnormal migrating action potential complexes were measured. Changes in myoelectric parameters induced by larvae inoculated into the duodenum of immune hosts differed from those associated with primary infection with respect to time of onset, magnitude, and duration. Myoelectric changes elicited by live larvae could not be reproduced by inoculation of hosts with dead larvae, larval excretory-secretory products, or by challenge with a heterologous parasite, Eimeria nieschulzi. These results indicate that (1) local anaphylaxis is a component of the initial response to T. spiralis in immune hosts, since the rapid onset of altered smooth muscle function parallels in time the expression of rapid rejection of infective larvae, and (2) an active mucosal penetration attempt by the worm is necessary to elicit this host response.
These findings provide evidence that worm rejection is a consequence of, or sequel to, an immediate hypersensitivity reaction elicited when parasites attempt to invade the gut mucosa of immunized hosts.
Abstract:
Bacterial pathogens such as enterotoxigenic Escherichia coli, Salmonella, and Campylobacter spp. are associated with up to 80% of diarrheal illness in travelers from developed countries to developing countries. In order to study acute gastrointestinal diseases, researchers from developed countries such as the United States rely on transporting clinical specimens from the developing countries to laboratories in the U.S. in transport media systems. There are few commercially available transport media systems cited in the literature or designated by transport system manufacturers for the transport of enteric bacteria. Therefore, a laboratory-based study was conducted to assess three commercially available transport media systems, two gel swabs and one liquid vial, to determine the most appropriate for the maintenance and recovery of common enteric bacterial pathogens. A total of 13 bacterial enteropathogens were recovered from 25°C and 4°C storage temperatures at time points up to 21 days. The results demonstrated that the gel swab and liquid vial transport systems performed similarly for all isolates at both temperatures. All three transport media systems struggled to maintain the isolates at recoverable concentrations when stored at 4°C, and it is recommended that isolates be stored at 25°C in transport media systems. Lastly, swab transport systems are recommended for transport since they are small and easy to pack, resist leakage, and are less expensive than similarly performing liquid vial transport media systems.
Abstract:
Oral lesions, which may be bacterial, fungal, or viral in nature, may be characteristic of HIV/AIDS and have been observed on the oral mucosa as early signs of underlying disease. Some studies have suggested that there may be a correlation between poor oral hygiene and the oral lesions seen among people living with HIV/AIDS. The objective of this study was to assess the nature of the relationship between oral health care practices and the occurrence of oral lesions commonly seen in association with HIV/AIDS. A systematic review of the literature was conducted; the databases searched were Medline and PubMed. The concepts that made up the search were oral hygiene promotion, HIV/AIDS, and oral health care. Of the 410 items identified through the search, only 11 met the inclusion criteria. The use of 0.12%-2% chlorhexidine gluconate was found to be effective in reducing oral Candida counts in some studies, while other studies did not find such an association. However, 0.12%-2% chlorhexidine gluconate was consistently found to be effective in the management of periodontal lesions in people infected with HIV/AIDS. Dental procedures such as treatment and filling of dental cavities, scaling and polishing, and use of fluoridated toothpaste were also found to be effective in the management of oral lesions seen in association with HIV/AIDS. The overall findings from the studies reviewed suggest that effective oral health care may be necessary to reduce the morbidity and mortality associated with the oral lesions seen among people living with HIV/AIDS. However, better designed studies with larger sample sizes are needed in order to ascertain the effectiveness of routine oral hygiene and health care practices among people living with HIV/AIDS.
Abstract:
An investigation was undertaken to evaluate the role of fomites in the transmission of diarrhea in day-care centers (DCC) and to elucidate the paths by which enteric organisms spread within this setting. During a nine-month period (December 1980-August 1981), extensive culturing of inanimate objects, as well as children and staff, was done routinely each month and again repeated during diarrhea outbreaks. Air was sampled from the classrooms and toilets using a Single-Stage Sieve Sampler (Ross Industries, Midland, VA). Stool samples were collected from both ill and well children and staff in the affected rooms only during outbreaks. Environmental samples were processed for Shigella, Salmonella, and fecal coliforms, while stools were screened for miscellaneous enteropathogens. A total of 11 outbreaks occurred in the 5 DCC during the study period. Enteric pathogens were recovered in 7 (64%) of the outbreaks. Multiple pathogens were identified in 3 outbreaks. The most frequently identified pathogen in stools was Giardia lamblia, which was recovered in 5 (45%) of the outbreaks. Ten of the 11 (91%) outbreaks occurred in children less than 12 months of age. Environmental microbiology studies together with epidemiologic information revealed that enteric organisms were transmitted from person to person. On routine sampling, fecal coliforms were most frequently isolated from tap handles and diaper change areas. Contamination with fecal coliforms was widespread during diarrhea outbreaks. Fecal coliforms were recovered with significantly greater frequency from hands, toys, and other classroom objects during outbreaks than during non-outbreak periods. Salmonella typhimurium was recovered from a table top during an outbreak of salmonellosis. There was no association between the level of enteric microbial contamination in the toilet areas and the occurrence of outbreaks.
No evidence was found to indicate that enteric organisms were spread by the airborne route via aerosols. Toys, other classroom objects, and contaminated hands probably play a major role in the transmission of enteropathogens during day-care center outbreaks. The presence of many enteric agents in the environment undoubtedly explains the polymicrobial etiology of day-care center-associated diarrhea outbreaks.
Abstract:
Groundwater constitutes approximately 30% of freshwater globally and serves as a source of drinking water in many regions. Groundwater sources are subject to contamination with human pathogens (viruses, bacteria, and protozoa) from a variety of sources that can cause diarrhea and contribute to the devastating global burden of this disease. To describe the extent of this public health concern in developing countries, a systematic review was conducted of the evidence for groundwater microbially contaminated at its source as a risk factor for enteric illness under endemic (non-outbreak) conditions in these countries. Epidemiologic studies published in English-language journals between January 2000 and January 2011, and meeting certain other criteria, were selected, resulting in eleven studies reviewed. Data were extracted on microbes detected (and their concentrations if reported) and on associations measured between microbial quality of, or consumption of, groundwater and enteric illness; other relevant findings are also reported. In groundwater samples, several studies found bacterial indicators of fecal contamination (total coliforms, fecal coliforms, fecal streptococci, enterococci, and E. coli), all in a wide range of concentrations. Rotavirus and a number of enteropathogenic bacteria and parasites were found in stool samples from study subjects who had consumed groundwater, but no concentrations were reported. Consumption of groundwater was associated with increased risk of diarrhea, with odds ratios ranging from 1.9 to 6.1. However, limitations of the selected studies, especially potential confounding factors, limited the conclusions that could be drawn from them. These results support the contention that microbial contamination of groundwater reservoirs—including with human enteropathogens and from a variety of sources—is a reality in developing countries.
While microbially-contaminated groundwaters pose risk for diarrhea, other factors are also important, including water treatment, water storage practices, consumption of other water sources, water quantity and access to it, sanitation and hygiene, housing conditions, and socio-economic status. Further understanding of the interrelationships between, and the relative contributions to disease risk of, the various sources of microbial contamination of groundwater can guide the allocation of resources to interventions with the greatest public health benefit. Several recommendations for future research, and for practitioners and policymakers, are presented.
Abstract:
The purpose of this study was to assess whether C. difficile infection (CDI) increases the risk of bacteremia or E. coli infection. The first specific aim of this study was to study the incidence of post-CDI bacteremia in CDI patients stratified by disease severity vs. controls. The second specific aim was to study the incidence of post-CDI E. coli infection from normally sterile sites stratified by disease severity vs. controls. This was a retrospective case-case-control study. The cases came from an ongoing prospective cohort study of CDI. Case group 1 were patients with mild to moderate CDI. Case group 2 were patients who had severe CDI. Controls were hospitalized patients given broad-spectrum antibiotics who did not develop CDI. Controls were matched by age (±10 years) and duration of hospital visit (±1 week). 191 cases were selected from the cohort study and 191 controls were matched to the cases. Patients were followed up to 60 days after the initial diagnosis of CDI and assessed for bacteremia and E. coli infections. The Zar score was used to determine the severity of the CDI. Stata 11 was used to run all analyses. The risk of non-staphylococcal bacteremia after diagnosis of CDI was higher compared to controls (14% and 7%, respectively; OR: 2.27; 95% CI: 1.07-5.01; p=0.028). The risk of getting an E. coli infection was higher in cases than in controls (13% and 9%, respectively), although the results were not statistically significant (OR: 1.4; 95% CI: 0.38-5.59; p=0.32). Rates of non-staphylococcal bacteremia and E. coli infection did not differ based on CDI severity. This study showed that the risk of developing non-staphylococcal bacteremia was higher in patients with CDI compared to matched controls. The findings supported the hypothesis that CDI increases the risk of bacterial translocation, specifically leading to the development of bacteremia.
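Figures like "OR: 2.27; 95% CI: 1.07-5.01" are typically obtained from a 2x2 table using the cross-product odds ratio and a Woolf (log-scale) confidence interval. The sketch below shows that calculation with hypothetical counts; the study's actual cell counts are not given in the abstract.

```python
import math

# Odds ratio with a Woolf (log) 95% confidence interval from a 2x2 table.
# Counts below are hypothetical, not the study's data.

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a = cases with outcome, b = cases without,
    c = controls with outcome, d = controls without."""
    or_ = (a * d) / (b * c)                   # cross-product odds ratio
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)     # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical: 27 of 191 CDI cases vs 13 of 191 controls with bacteremia
or_, lo, hi = odds_ratio_ci(27, 164, 13, 178)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

A confidence interval whose lower bound exceeds 1 (as in the bacteremia result) indicates a statistically significant positive association; the E. coli interval spanning 1 is why that result was not significant.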
Abstract:
PURPOSE To report acute/subacute vision loss and paracentral scotomata in patients with idiopathic multifocal choroiditis/punctate inner choroidopathy due to large zones of acute photoreceptor attenuation surrounding the chorioretinal lesions. METHODS Multimodal imaging case series. RESULTS Six women and 2 men were included (mean age, 31.5 ± 5.8 years). Vision ranged from 20/20-1 to hand motion (mean, 20/364). Spectral domain optical coherence tomography demonstrated extensive attenuation of the external limiting membrane, ellipsoid and interdigitation zones, adjacent to the visible multifocal choroiditis/punctate inner choroidopathy lesions. The corresponding areas were hyperautofluorescent on fundus autofluorescence and were associated with corresponding visual field defects. Full-field electroretinogram (available in three cases) showed markedly decreased cone/rod response, and multifocal electroretinogram revealed reduced amplitudes and increased implicit times in two cases. Three patients received no treatment, the remaining were treated with oral corticosteroids (n = 4), oral acyclovir/valacyclovir (n = 2), intravitreal/posterior subtenon triamcinolone acetate (n = 3), and anti-vascular endothelial growth factor (n = 2). Visual recovery occurred in only three cases of whom two were treated. Varying morphological recovery was found in six cases, associated with decrease in hyperautofluorescence on fundus autofluorescence. CONCLUSION Multifocal choroiditis/punctate inner choroidopathy can present with transient or permanent central photoreceptor attenuation/loss. This presentation is likely a variant of multifocal choroiditis/punctate inner choroidopathy with chorioretinal atrophy. Associated changes are best evaluated using multimodal imaging.
Abstract:
Studies of patients with temporal lobe epilepsy provide few descriptions of seizures that arise in the temporopolar and the anterior temporobasal brain region. Based on connectivity, it might be assumed that the semiology of these seizures is similar to that of medial temporal lobe epilepsy. However, accumulating evidence suggests that the anterior temporobasal cortex may play an important role in the language system, which could account for particular features of seizures arising here. We studied the electroclinical features of seizures in patients with circumscribed temporopolar and temporobasal lesions in order to identify specific features that might differentiate them from seizures that originate in other temporal areas. Among 172 patients with temporal lobe seizures registered in our epilepsy unit in the last 15 years, 15 (8.7%) patients had seizures caused by temporopolar or anterior temporobasal lesions (11 left-sided lesions). The main finding in our study is that patients with left-sided lesions had aphasia during their seizures as the most prominent feature. In addition, while all patients showed normal to high intellectual functioning in standard neuropsychological testing, semantic impairment was found in a subset of 9 patients with left-sided lesions. This case series demonstrates that aphasic seizures without impairment of consciousness can result from small, circumscribed left anterior temporobasal and temporopolar lesions. Thus, the presence of speech manifestation during seizures should prompt detailed assessment of the structural integrity of the basal surface of the temporal lobe in addition to the evaluation of primary language areas.
Abstract:
Acquired brain injury (ABI) is a serious social and health problem of growing magnitude and great diagnostic and therapeutic complexity. Its high incidence, together with patients' increased survival once the acute phase has passed, also makes it a problem of high prevalence. Specifically, according to the World Health Organization (WHO), ABI will be among the 10 most common causes of disability in the year 2020. Neurorehabilitation makes it possible to improve both cognitive and functional deficits and to increase the autonomy of people with ABI. The incorporation of new technological solutions into the neurorehabilitation process aims to reach a new paradigm in which treatments can be designed that are intensive, personalized, monitored, and evidence-based, since these are the four characteristics that ensure treatments are effective. Unlike most medical disciplines, there are no associations of symptoms and signs of cognitive impairment that can guide therapy. Currently, neurorehabilitation treatments are designed on the basis of the results obtained in a neuropsychological assessment battery that evaluates the degree of impairment of each cognitive function (memory, attention, executive functions, etc.). The research line within which this work falls aims to design and develop a cognitive profile based not only on the results obtained in that test battery, but also on theoretical information covering both anatomical structures and functional relationships, and on anatomical information obtained from imaging studies. In this way, the cognitive profile used to design treatments integrates personalized, evidence-based information.
Neuroimaging techniques are a fundamental tool for identifying lesions when generating these cognitive profiles. The classical approach to lesion identification consists of manually delineating brain anatomical regions. This approach presents several problems related to criterion inconsistencies between clinicians, reproducibility, and time. Automating this procedure is therefore essential to ensure objective information extraction. Automatic delineation of anatomical regions is performed by registration, either against atlases or against imaging studies of other subjects. However, the pathological changes associated with ABI always involve intensity abnormalities and/or changes in the location of structures. As a result, traditional intensity-based registration algorithms do not work correctly and require the clinician to intervene and select certain points (which in this thesis we call singular points). Moreover, these algorithms do not allow large, delocalized deformations, which can also occur in the presence of lesions caused by a stroke or a traumatic brain injury (TBI). This thesis focuses on the design, development, and implementation of a methodology for the automatic detection of lesioned structures, integrating algorithms whose main objective is to generate reproducible and objective results. The methodology is divided into four stages: pre-processing, singular point identification, registration, and lesion detection. The work and results achieved in this thesis are as follows. Pre-processing: in this first stage, the objective is to homogenize all input data so that valid conclusions can be drawn from the results.
This stage therefore has a large impact on the final results. It consists of three operations: skull stripping, intensity normalization, and spatial normalization. Singular point identification: the objective of this stage is to automate the identification of anatomical points (singular points). It is equivalent to the clinician's manual marking of anatomical points, and it makes it possible to identify a larger number of points, which translates into more information; to eliminate the factor associated with inter-subject variability, so that the results are reproducible and objective; and to eliminate the time spent on manual point marking. This research proposes a singular-point identification algorithm (descriptor) based on a multi-detector solution that contains multi-parametric information: spatial and intensity-related. This algorithm has been compared against similar algorithms found in the state of the art. Registration: this stage aims to bring two imaging studies of different subjects/patients into spatial correspondence. The algorithm proposed in this work is descriptor-based, and its main objective is to compute a vector field that can introduce delocalized deformations in the image (in different image regions), as large as the associated deformation vector indicates. The proposed algorithm has been compared with other registration algorithms used in neuroimaging applications with control-subject studies. The results obtained are promising and represent a new context for the automatic identification of structures. Lesion identification: in this final stage, those structures whose spatial location and area or volume characteristics have been modified with respect to a normal state are identified.
To do so, a statistical study of the atlas to be used is carried out, and the statistical normality parameters associated with location and area are established. Depending on the structures delineated in the atlas, more or fewer anatomical structures can be identified; our methodology is independent of the selected atlas. Overall, this doctoral thesis corroborates the postulated research hypotheses on the automatic identification of lesions using structural medical imaging studies, specifically magnetic resonance studies. On these foundations, new research fields can be opened that contribute to improving lesion detection. ABSTRACT Brain injury constitutes a serious social and health problem of increasing magnitude and of great diagnostic and therapeutic complexity. Its high incidence and survival rate, after the initial critical phases, make it a prevalent problem that needs to be addressed. In particular, according to the World Health Organization (WHO), brain injury will be among the 10 most common causes of disability by 2020. Neurorehabilitation improves both cognitive and functional deficits and increases the autonomy of brain injury patients. The incorporation of new technologies into neurorehabilitation aims to reach a new paradigm focused on designing intensive, personalized, monitored, and evidence-based treatments, since these four characteristics ensure the effectiveness of treatments. Contrary to most medical disciplines, it is not possible to link symptoms to cognitive disorder syndromes in a way that assists the therapist. Currently, neurorehabilitation treatments are planned considering the results obtained from a neuropsychological assessment battery, which evaluates the functional impairment of each cognitive function (memory, attention, executive functions, etc.).
The research line within which this PhD falls aims to design and develop a cognitive profile based not only on the results obtained in the assessment battery, but also on theoretical information that includes both anatomical structures and functional relationships, and on anatomical information obtained from medical imaging studies such as magnetic resonance. The cognitive profile used to design these treatments therefore integrates personalized, evidence-based information. Neuroimaging techniques represent an essential tool to identify lesions and generate this type of cognitive dysfunction profile. Manual delineation is the classical approach to identifying brain anatomical regions, but manual approaches present several problems related to inconsistencies across clinicians, time, and repeatability. Automated delineation is done by registering brains to one another or to a template. However, when imaging studies contain lesions, intensity abnormalities and location alterations reduce the performance of most registration algorithms based on intensity parameters. Thus, specialists may have to interact manually with imaging studies to select landmarks (called singular points in this PhD) or identify regions of interest; these solutions share the drawbacks of the manual approaches mentioned before. Moreover, these registration algorithms do not allow large, distributed deformations, which may also appear when a stroke or a traumatic brain injury (TBI) occurs. This PhD is focused on the design, development, and implementation of a new methodology to automatically identify lesions in anatomical structures. This methodology integrates algorithms whose main objective is to generate objective and reproducible results. It is divided into four stages: pre-processing, singular point identification, registration, and lesion detection. Pre-processing stage.
In this first stage, the aim is to standardize all input data in order to draw valid conclusions from the results; this stage therefore has a direct impact on the final results. It consists of three steps: skull stripping, intensity normalization, and spatial normalization. Singular point identification: this stage aims to automate the identification of anatomical points (singular points), replacing their manual identification by the clinician. Automatic identification makes it possible to identify a greater number of points, which yields more information; to remove the factor associated with inter-subject variability, so that the results are reproducible and objective; and to eliminate the time spent on manual marking. This PhD proposes an algorithm to automatically identify singular points (a descriptor) based on a multi-detector approach that contains multi-parametric (spatial and intensity) information. This algorithm has been compared with other similar algorithms found in the state of the art. Registration: the goal of this stage is to put two imaging studies of different subjects/patients into spatial correspondence. The algorithm proposed in this PhD is based on descriptors; its main objective is to compute a vector field that introduces distributed deformations (changes in different image regions), as large as the deformation vector indicates. The proposed algorithm has been compared with other registration algorithms used in neuroimaging applications with control subjects. The obtained results are promising and represent a new context for the automatic identification of anatomical structures. Lesion identification: this final stage aims to identify those anatomical structures whose spatial location and area or volume characteristics have been modified with respect to a normal state.
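The core idea of descriptor-based registration can be sketched very compactly: singular points from the two studies are matched by descriptor similarity, and each match contributes one displacement vector to a sparse deformation field. The points and descriptor values below are toy data; the thesis's actual multi-detector, multi-parametric descriptor is far richer.

```python
# Minimal sketch of descriptor-based matching for registration.
# Each singular point carries a position and a descriptor vector;
# matches are found by nearest neighbour in descriptor space.
# All values are hypothetical toy data.

def dist(u, v):
    """Euclidean distance between two descriptor vectors."""
    return sum((a - b) ** 2 for a, b in zip(u, v)) ** 0.5

# (position, descriptor) pairs for the fixed and moving studies
fixed = [((10, 20), [0.9, 0.1, 0.3]), ((40, 45), [0.2, 0.8, 0.5])]
moving = [((12, 19), [0.88, 0.12, 0.31]), ((43, 47), [0.21, 0.79, 0.52])]

field = []  # sparse deformation field: one displacement per matched point
for pos_f, desc_f in fixed:
    # best match = moving point with the most similar descriptor
    pos_m, _ = min(moving, key=lambda pm: dist(desc_f, pm[1]))
    field.append((pos_f, (pos_m[0] - pos_f[0], pos_m[1] - pos_f[1])))

print(field)
```

Because each displacement is derived independently from a local match, the resulting field can encode large, spatially distributed deformations of the kind intensity-based registration struggles with.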
A statistical study of the atlas to be used is performed to establish the statistical parameters associated with the normal state. Which anatomical structures can be identified depends on the structures delineated in the selected atlas; the proposed methodology is independent of that atlas. Overall, this PhD corroborates the postulated research hypotheses regarding the automatic identification of lesions based on structural medical imaging studies (magnetic resonance studies). Based on these foundations, new research fields can be opened to improve the automatic identification of lesions in brain injury.
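The lesion-detection stage described above can be illustrated as a simple outlier test: a structure is flagged when a measured characteristic deviates from the atlas's normality statistics by more than a z-score threshold. The structure names, atlas means/SDs, measurements, and threshold below are all hypothetical.

```python
# Sketch of atlas-based lesion flagging: a structure is flagged when
# its measured characteristic (here, volume) deviates beyond a z-score
# threshold from the atlas's normality statistics.
# All statistics and measurements are hypothetical.

atlas_stats = {
    # structure: (mean_volume_ml, sd_volume_ml) from the atlas study
    "left_hippocampus": (3.2, 0.3),
    "right_thalamus": (5.8, 0.5),
}

measured = {"left_hippocampus": 2.1, "right_thalamus": 5.9}  # hypothetical patient

def flag_lesions(measured, atlas_stats, z_thresh=2.5):
    flagged = []
    for name, vol in measured.items():
        mean, sd = atlas_stats[name]
        z = (vol - mean) / sd          # standardized deviation from normality
        if abs(z) > z_thresh:
            flagged.append((name, round(z, 2)))
    return flagged

print(flag_lesions(measured, atlas_stats))
```

The same scheme extends to location: replace volume with a centroid coordinate and its atlas mean/SD, and the identical z-score test flags displaced structures.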