942 results for high-level synthesis


Relevance:

80.00%

Publisher:

Abstract:

Socioeconomic factors have long been incorporated into environmental research to examine the effects of human dimensions on coastal natural resources. Boyce (1994) proposed that inequality is a cause of environmental degradation, and the Environmental Kuznets Curve is a proposed relationship in which rising income or GDP per capita is associated with initial increases in pollution followed by subsequent decreases (Torras and Boyce, 1998). To examine this relationship within the CAMA counties, the emissions of sulfur dioxide and nitrogen oxides (as measured by the EPA in tons emitted), the Gini coefficient, and income per capita were examined for 1999. A quadratic regression was used, and the results did not indicate that inequality, as measured by the Gini coefficient, was significantly related to the level of criteria air pollutants within each county. Nor did the results indicate the existence of the Environmental Kuznets Curve. Further analysis of spatial autocorrelation using ArcMap 9.2 found a high level of spatial autocorrelation among pollution emissions, indicating that relation to other counties may matter more to the level of sulfur dioxide and nitrogen oxide emissions than income per capita and inequality. Lastly, the paper recommends that further Environmental Kuznets Curve and income-inequality analyses of air pollutant levels incorporate spatial patterns as well as other explanatory variables. (PDF contains 4 pages)
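The quadratic specification described above can be sketched as follows. This is an illustrative reconstruction on synthetic county data (the CAMA emissions, Gini, and income figures are not reproduced here), not the paper's actual analysis; an inverted-U (Kuznets-type) fit would show a positive linear and a negative quadratic coefficient.

```python
import numpy as np

# Hypothetical illustration of the quadratic (Environmental Kuznets Curve)
# specification: emissions regressed on income per capita and its square.
# The county data here are synthetic, not the CAMA data from the abstract.
rng = np.random.default_rng(0)
income = rng.uniform(10, 60, size=200)          # income per capita, $1000s
# Generate an inverted-U relationship plus noise: peak near income = 33.
emissions = 5 + 2.0 * income - 0.03 * income**2 + rng.normal(0, 1, 200)

# Fit emissions = b0 + b1*income + b2*income^2 by ordinary least squares.
X = np.column_stack([np.ones_like(income), income, income**2])
beta, *_ = np.linalg.lstsq(X, emissions, rcond=None)
b0, b1, b2 = beta

# An EKC-shaped (inverted-U) fit has b1 > 0 and b2 < 0; the turning
# point of the curve is at income = -b1 / (2*b2).
turning_point = -b1 / (2 * b2)
print(b1 > 0, b2 < 0, round(turning_point, 1))
```

With real data one would also test the significance of the quadratic coefficient and, given the spatial autocorrelation the abstract reports, consider a spatial lag or spatial error model rather than plain least squares.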

Relevance:

80.00%

Publisher:

Abstract:

Artisanal fishing societies constitute some of the poorest societies in the developing world. Attempts to harness the potential of these societies have often failed due to the enormity of the problem of poverty. This study was conducted in four major fishing villages, namely Abule Titun, Apojola, Imala Odo and Ibaro, to investigate the occupational practices and problems of rural artisanal fisherfolk in the Oyan Dam area of Ogun State. Eighty respondents were randomly selected from among the artisanal fisherfolk and interviewed using an interview guide. The findings revealed that 43.8% of the fisherfolk are within the active age range of 31-40 years, while 30% are within the 21-30 year range. Also, 31% had no formal education, indicating a relatively high level of illiteracy among the fisherfolk, while the majority of respondents carry out their fishing activities using paddle and canoe. The study similarly found that the most pressing problems of the fisherfolk are the lack of basic social amenities such as electricity, potable water, access roads, hospitals and markets. It is therefore recommended that basic social infrastructure be provided for the artisanal fishing communities in order to improve their social welfare, standard of living and capacity to sustain their fishing occupation, in the interest of food security and poverty alleviation.

Relevance:

80.00%

Publisher:

Abstract:

Reading is a complex process that people must engage in throughout their lives. Since first contact with reading begins in early childhood, in the family environment, this work aims to highlight the essential role the family plays in promoting written language. Accordingly, to examine the effort that the families of 3-5-year-old children in Lekeitio make to foster a love of reading at home, we carried out a questionnaire-based study. The study yielded interesting conclusions: among others, that although parents report a habit of reading with their children at home, they make little effort to visit the library or attend reading events; and that families do not place books at the top of the gift ranking.

Relevance:

80.00%

Publisher:

Abstract:

An analytical fluid model for JxB heating during normal incidence of a short, ultraintense, linearly polarized laser on a solid-density plasma is proposed. The steepening of an initially smooth electron density profile as the electrons are pushed inward by the laser is included self-consistently. It is shown that JxB heating comprises two distinct coupling processes depending on the initial laser and plasma conditions: at moderate intensity (a <= 1), the ponderomotive force of the laser light can resonantly drive a large plasma wave at the point n(e) = 4 gamma(0) n(c). When this plasma wave is damped, its energy is transferred to the plasma. At higher intensity, the electron density is steepened by the time-independent ponderomotive force to a high level, n(e) > 4 gamma(0) n(c), so that no 2-omega resonance occurs, but the longitudinal component of the oscillating ponderomotive field can lead to an absorption mechanism similar to "vacuum heating." (c) 2006 American Institute of Physics.

Relevance:

80.00%

Publisher:

Abstract:

The epidemic of HIV/AIDS in the United States is constantly changing and evolving, from patient zero to an estimated 650,000 to 900,000 Americans infected today. The nature and course of HIV changed dramatically with the introduction of antiretrovirals. This discourse examines many different facets of HIV, from the beginning, when there was no treatment, to the present era of highly active antiretroviral therapy (HAART). Using statistical analysis of clinical data, this paper examines where we were, where we are, and projections as to where treatment of HIV/AIDS is headed.

Chapter Two describes the datasets used for the analyses. The primary database was collected by the author from an outpatient HIV clinic and includes data from 1984 to the present. The second database is the public dataset of the Multicenter AIDS Cohort Study (MACS), covering 1984 through October 1992. Comparisons are made between the two datasets.

Chapter Three discusses where we were. Before the first anti-HIV drugs (called antiretrovirals) were approved, there was no treatment to slow the progression of HIV. The first generation of antiretrovirals, reverse transcriptase inhibitors such as AZT (zidovudine), DDI (didanosine), DDC (zalcitabine), and D4T (stavudine), provided the first treatment for HIV. The first clinical trials showed that these antiretrovirals had a significant impact on increasing patient survival. The trials also showed that patients on these drugs had increased CD4+ T cell counts. Chapter Three examines the distributions of CD4 T cell counts. The results show that the estimated distributions of CD4 T cell counts are distinctly non-Gaussian. Thus distributional assumptions regarding CD4 T cell counts must be taken into account when performing analyses with this marker. The results also show that the estimated CD4 T cell distributions for each disease stage (asymptomatic, symptomatic, and AIDS) are non-Gaussian. Interestingly, the distribution of CD4 T cell counts for the asymptomatic period is significantly below the CD4 T cell distribution for the uninfected population, suggesting that even in patients with no outward symptoms of HIV infection, there exist high levels of immunosuppression.
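The non-Gaussian character of CD4 counts described above can be illustrated with a simple moment check on synthetic data (the clinic and MACS datasets are not reproduced here): right-skewed, zero-bounded counts give a clearly nonzero sample skewness, unlike a Gaussian.

```python
import numpy as np

# Synthetic illustration, not the clinic or MACS data: CD4 counts are
# bounded below by zero and right-skewed, so Gaussian assumptions fit
# poorly. The sample skewness of Gaussian data is ~0, while skewed
# count-like data give a clearly positive value.
rng = np.random.default_rng(1)

# Right-skewed "CD4-like" counts (lognormal) vs. a Gaussian reference.
cd4_like = rng.lognormal(mean=6.0, sigma=0.5, size=5000)   # ~400 cells/uL median
gaussian = rng.normal(loc=400, scale=120, size=5000)

def skewness(x):
    """Third standardized sample moment."""
    x = np.asarray(x, dtype=float)
    return np.mean((x - x.mean())**3) / x.std()**3

print(round(skewness(cd4_like), 2), round(skewness(gaussian), 2))
```

In practice this motivates either transforming the marker (e.g., log or square-root CD4) or using distribution-free methods, as the abstract's warning about distributional assumptions implies.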

Chapter Four discusses where we are at present. HIV quickly grew resistant to reverse transcriptase inhibitors, which were given sequentially as mono or dual therapy. As resistance grew, the positive effects of the reverse transcriptase inhibitors on CD4 T cell counts and survival dissipated. As the old era faded, a new era, characterized by a new class of drugs and new technology, changed the way we treat HIV-infected patients. Viral load assays were able to quantify the levels of HIV RNA in the blood. By quantifying the viral load, one now had a faster, more direct way to test antiretroviral regimen efficacy. Protease inhibitors, which attack a different region of HIV than reverse transcriptase inhibitors, were found, when used in combination with other antiretroviral agents, to dramatically and significantly reduce HIV RNA levels in the blood. Patients also experienced significant increases in CD4 T cell counts. For the first time in the epidemic, there was hope. It was hypothesized that with HAART, viral levels could be kept so low that the immune system, as measured by CD4 T cell counts, would be able to recover. If viral levels could be kept low enough, it would be possible for the immune system to eradicate the virus. The hypothesis of immune reconstitution, that is, bringing CD4 T cell counts up to levels seen in uninfected patients, is tested in Chapter Four. It was found that for these patients there was not enough of a CD4 T cell increase to be consistent with the hypothesis of immune reconstitution.

In Chapter Five, the effectiveness of long-term HAART is analyzed. Survival analysis was conducted on 213 patients on long-term HAART. The primary endpoint was presence of an AIDS defining illness. A high level of clinical failure, or progression to an endpoint, was found.

Chapter Six yields insights into where we are going. New technology such as viral genotypic testing, which looks at the genetic structure of HIV and determines where mutations have occurred, has shown that HIV is capable of producing resistance mutations that confer multiple drug resistance. This section looks at resistance issues and speculates, ceteris paribus, where the state of HIV is going. The section first addresses viral genotype and the correlates of viral load and disease progression. A second analysis looks at patients who have failed their primary attempts at HAART and subsequent salvage therapy. It was found that salvage regimens, efforts to control viral replication through the administration of different combinations of antiretrovirals, were not effective in controlling viral replication in 90 percent of the population. Thus, primary attempts at therapy offer the best chance of viral suppression and delay of disease progression. Documentation of transmission of drug-resistant virus suggests that the public health crisis of HIV is far from over. Drug-resistant HIV can sustain the epidemic and hamper our efforts to treat HIV infection. The data presented suggest that the decrease in morbidity and mortality due to HIV/AIDS is transient. Deaths due to HIV will increase, and public health officials must prepare for this eventuality unless new treatments become available. These results also underscore the importance of the vaccine effort.

The final chapter looks at the economic issues related to HIV. The direct and indirect costs of treating HIV/AIDS are very high. For the first time in the epidemic, there exists treatment that can actually slow disease progression. The direct costs of HAART are estimated: the direct lifetime cost of treating each HIV-infected patient with HAART is between $353,000 and $598,000, depending on how long HAART prolongs life. The incremental cost per year of life saved is only $101,000, which is comparable with the incremental cost per year of life saved by coronary artery bypass surgery.

Policy makers need to be aware that although HAART can delay disease progression, it is not a cure and HIV is not over. The results presented here suggest that the decreases in the morbidity and mortality due to HIV are transient. Policymakers need to be prepared for the eventual increase in AIDS incidence and mortality. Costs associated with HIV/AIDS are also projected to increase. The cost savings seen recently have been from the dramatic decreases in the incidence of AIDS defining opportunistic infections. As patients who have been on HAART the longest start to progress to AIDS, policymakers and insurance companies will find that the cost of treating HIV/AIDS will increase.

Relevance:

80.00%

Publisher:

Abstract:

The thermal characterization of a green façade is a difficult task that requires a level of certainty and realistic predictive models under dynamic outdoor conditions. The theoretical study of complex construction elements does not reflect reality, so obtaining a correct characterization requires testing those elements and analysing the resulting data. For this purpose, PASLINK test cells and the LORD software environment are used. Through them, the dynamic thermal transmittance of the green façade tested under real outdoor conditions is obtained.

Relevance:

80.00%

Publisher:

Abstract:

I. Foehn winds of southern California.
An investigation of the hot, dry and dust-laden winds occurring in the late fall and early winter in the Los Angeles Basin, attributed in the past to the influence of the desert regions to the north, revealed that these currents are of a foehn nature. Their properties were found to be entirely due to dynamical heating produced in the descent from the high-level areas of the interior to the lower Los Angeles Basin. Any dust associated with the phenomenon was found to be acquired from the Los Angeles area rather than transported from the desert. The frequency of occurrence of a mild type of foehn of this nature during this season was found to be sufficient to warrant its classification as a winter monsoon. This results from the topography of the Los Angeles region, which allows an easy entrance to air from the interior by virtue of the low-level mountain passes north of the area. This monsoon provides the mild winter climate of southern California, since temperatures associated with the foehn currents are far higher than those experienced when maritime air from the adjacent Pacific Ocean occupies the region.

II. Foehn wind cyclogenesis.
Intense anticyclones frequently build up over the high-level regions of the Great Basin and Columbia Plateau, which lie between the Sierra Nevada and Cascade Mountains to the west and the Rocky Mountains to the east. The outflow from these anticyclones produces extensive foehns east of the Rockies in the comparatively low-level areas of the middle west and the Canadian provinces of Alberta and Saskatchewan. Normally at this season of the year very cold polar continental (Pc) air masses are present over this territory, and with the occurrence of these foehns marked discontinuity surfaces arise between the warm foehn current, which is obliged to slide over the colder mass, and the Pc air to the east. Cyclones are easily produced from this phenomenon and take the form of unstable waves which propagate along the discontinuity surface between the two dissimilar masses. A continual series of such cyclones was found to occur as long as the Great Basin anticyclone is maintained with undiminished intensity.

III. Weather conditions associated with the Akron disaster.
This situation illustrates the speedy development and propagation of young disturbances in the eastern United States during the spring of the year under the influence of the conditionally unstable tropical maritime air masses which characterise the region. It also furnishes an excellent example of the superiority of air mass and frontal methods of weather prediction for aircraft operation over the older methods based upon pressure distribution.

IV. The Los Angeles storm of December 30, 1933 to January 1, 1934.
This discussion points out some of the fundamental interactions occurring between air masses of the North Pacific Ocean in connection with Pacific Coast storms and the value of topographic and aerological considerations in predicting them. Estimates of rainfall intensity and duration from analyses of this type may be made and would prove very valuable in the Los Angeles area in connection with flood control problems.

Relevance:

80.00%

Publisher:

Abstract:

This study aims to analyse the performance of different ITS applications over the NeMHIP network mobility protocol, which guarantees a high level of security and quality of service. First, the applications are selected. Next, the most significant parameters for measuring performance are identified, and a test plan and scenario are defined. The measurements are then taken with the previously selected applications, and finally the results are analysed to determine the efficiency of each application over the NeMHIP protocol.

Relevance:

80.00%

Publisher:

Abstract:

Interleukin 2 (IL2) is the primary growth hormone used by mature T cells, and this lymphokine plays an important role in the magnification of cell-mediated immune responses. Under normal circumstances its expression is limited to antigen-activated type 1 helper T cells (TH1), and the ability to transcribe this gene is often regarded as evidence of commitment to this developmental lineage. There is, however, abundant evidence that many non-TH1 T cells, under appropriate conditions, possess the ability to express this gene. Of paramount interest in the study of T-cell development are the mechanisms by which differentiating thymocytes are endowed with particular combinations of cell surface proteins and response repertoires. For example, why do most helper T cells express the CD4 differentiation antigen?

As a first step in understanding these developmental processes, the gene encoding IL2 was isolated from a mouse genomic library by probing with a conspecific IL2 cDNA. The sequence of the 5' flanking region from +1 to -2800 was determined and compared to the previously reported human sequence. Extensive identity exists between +1 and -580 (86%), and sites previously shown to be crucial for the proper expression of the human gene are well conserved, in both sequence and location, in the mouse counterpart.

Transient expression assays were used to evaluate the contribution of various genomic sequences to high-level gene expression mediated by a cloned IL2 promoter fragment. Differing lengths of 5' flanking DNA, all terminating in the 5' untranslated region, were linked to a reporter gene, bacterial chloramphenicol acetyltransferase (CAT), and enzyme activity was measured after introduction into IL2-producing cell lines. No CAT was ever detected without stimulation of the recipient cells. A cloned promoter fragment containing only 321 bp of upstream DNA was expressed well in both Jurkat and EL4.El cells. Addition of intragenic or downstream DNA to these 5' IL2-CAT constructs showed that no obvious regulatory regions resided there. However, increasing the extent of 5' DNA from -321 to -2800 revealed several positive and negative regulatory elements. One well-characterized negative region resided between -750 and -1000 and consisted almost exclusively of alternating purines and pyrimidines. There is no sequence resembling this in the human gene now, but there is evidence that one may once have been present.

No region, when deleted, could relax either the stringent induction-dependence or the cell-type specificity displayed by this promoter. Reagents that modulate endogenous IL2 expression, such as cAMP, cyclosporin A, and IL1, affected expression of the 5' IL2-CAT constructs as well. For a given reagent, expression from all expressible constructs was suppressed or enhanced to the same extent. This suggests that these modulators affect IL2 expression through perturbation of a central inductive signal rather than by summation of the effects of discrete, independently regulated, negative and positive transcription factors.

Relevance:

80.00%

Publisher:

Abstract:

The insula is a mammalian cortical structure that has been implicated in a wide range of low- and high-level functions governing one's sensory, emotional, and cognitive experiences. One particular role of this region is considered to be the processing of olfactory stimuli. The ability to detect and evaluate odors has significant effects on an organism's eating behavior and survival and, in the case of humans, on complex decision making. Despite the importance of this function, the mechanism by which olfactory information is processed in the insula has not been thoroughly studied. Moreover, due to the structure's close spatial relationship with the neighboring claustrum, it is not entirely clear whether the connectivity and olfactory functions attributed to the insula are truly those of the insula rather than of the claustrum. My graduate work, consisting of two studies, seeks to help fill these gaps. In the first, the structural connectivity patterns of the insula and the claustrum in a non-human primate brain are assayed using an ultra-high-quality diffusion magnetic resonance image, and the results suggest a dissociation of connectivity, and hence function, between the two structures. In the second study, a functional neuroimaging experiment investigates insular activity during odor evaluation tasks in humans and uncovers a potential spatial organization within the anterior portion of the insula for processing different aspects of odor characteristics.

Relevance:

80.00%

Publisher:

Abstract:

Optical Coherence Tomography (OCT) is a popular, rapidly growing imaging technique with an increasing number of biomedical applications due to its noninvasive nature. However, there are three major challenges in understanding and improving an OCT system. (1) Obtaining an OCT image is not easy: it either takes a real medical experiment or requires days of computer simulation. Without much data, it is difficult to study the physical processes underlying OCT imaging of different objects, simply because there are not many imaged objects. (2) Interpretation of an OCT image is also hard. This challenge is more profound than it appears. For instance, it would require a trained expert to tell from an OCT image of human skin whether there is a lesion or not. This is expensive in its own right, but even the expert cannot be sure about the exact size of the lesion or the width of the various skin layers. The take-away message is that analyzing an OCT image even at a high level usually requires a trained expert, and pixel-level interpretation is simply unrealistic. The reason is simple: we have OCT images but not their underlying ground-truth structure, so there is nothing to learn from. (3) The imaging depth of OCT is very limited (millimeter or sub-millimeter on human tissues). While OCT utilizes infrared light for illumination to stay noninvasive, the downside is that photons at such long wavelengths can only penetrate a limited depth into the tissue before getting back-scattered. To image a particular region of a tissue, photons first need to reach that region. As a result, OCT signals from deeper regions of the tissue are both weak (since few photons reach there) and distorted (due to multiple scatterings of the contributing photons). This fact alone makes OCT images very hard to interpret.

This thesis addresses the above challenges by developing an advanced Monte Carlo simulation platform which is 10,000 times faster than the state-of-the-art simulator in the literature, bringing the simulation time down from 360 hours to a single minute. This powerful simulation tool not only enables us to efficiently generate as many OCT images of objects with arbitrary structure and shape as we want on a common desktop computer, but it also provides the underlying ground-truth of the simulated images, because we dictate it at the beginning of the simulation. This is one of the key contributions of this thesis. What allows us to build such a powerful simulation tool includes a thorough understanding of the signal formation process, clever implementation of the importance sampling/photon splitting procedure, efficient use of a voxel-based mesh system in determining photon-mesh interception, and parallel computation of the different A-scans that constitute a full OCT image, among other programming and mathematical tricks, which will be explained in detail later in the thesis.
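A minimal sketch of the kind of photon random walk such a simulator is built on, assuming a toy 1-D geometry with exponentially distributed free paths; the actual platform's importance sampling, photon splitting, and voxel mesh are omitted. It also illustrates why deep signal is weak: few photons both reach a given depth and return to the detector.

```python
import numpy as np

# Toy 1-D Monte Carlo: photons take exponentially distributed free paths,
# may reverse direction at each scattering event, and only those that reach
# a target depth AND return to the surface (z = 0) contribute to the signal.
# All parameters are illustrative, not taken from the thesis simulator.
rng = np.random.default_rng(2)

def backscatter_fraction(target_depth, n_photons=4000,
                         mean_free_path=0.1, max_steps=200):
    """Fraction of photons that reach `target_depth` and return to z = 0."""
    hits = 0
    for _ in range(n_photons):
        z, direction, reached = 0.0, +1, False
        for _ in range(max_steps):
            z += direction * rng.exponential(mean_free_path)
            if z >= target_depth:
                reached = True
            if z <= 0.0:          # photon exits back through the surface
                break
            if rng.random() < 0.5:  # crude scattering: flip direction
                direction = -direction
        if reached and z <= 0.0:
            hits += 1
    return hits / n_photons

shallow = backscatter_fraction(0.2)   # 2 mean free paths deep
deep = backscatter_fraction(1.0)      # 10 mean free paths deep
print(shallow, deep)
```

A serious simulator keeps far more state (3-D direction, anisotropic phase function, absorption, coherence gating) and uses variance-reduction tricks like the photon splitting mentioned above precisely because the deep-return events counted here are so rare.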

Next we aim at the inverse problem: given an OCT image, predict/reconstruct its ground-truth structure on a pixel level. By solving this problem we would be able to interpret an OCT image completely and precisely without the help of a trained expert. It turns out that we can do this rather well: for simple structures we are able to reconstruct the ground-truth of an OCT image more than 98% correctly, and for more complicated structures (e.g., a multi-layered brain structure) we achieve 93%. This was accomplished through extensive use of machine learning. The success of the Monte Carlo simulation already puts us in a great position by providing a great deal of data (effectively unlimited) in the form of (image, truth) pairs. Through a transformation of the high-dimensional response variable, we convert the learning task into a multi-output multi-class classification problem and a multi-output regression problem. We then build a hierarchical architecture of machine learning models (a committee of experts) and train different parts of the architecture with specifically designed data sets. In prediction, an unseen OCT image first goes through a classification model to determine its structure (e.g., the number and types of layers present in the image); the image is then handed to a regression model, trained specifically for that particular structure, which predicts the lengths of the different layers and thereby reconstructs the ground-truth of the image. We also demonstrate that ideas from Deep Learning can be useful for further improving the performance.
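The classify-then-regress architecture can be sketched on toy 1-D "scans"; everything here (the features, the two structure classes, the nearest-centroid classifier, and the per-class linear regressors) is an invented stand-in for the thesis's actual models.

```python
import numpy as np

# Two-stage "committee of experts" sketch: a classifier first picks the
# structure type of a (synthetic, 1-D) scan, then a regressor trained only
# on that type predicts the layer thickness. All data are invented.
rng = np.random.default_rng(3)

def make_scan(n_layers, thickness):
    """Toy 1-D 'A-scan': n_layers bands of width `thickness` plus noise."""
    x = np.zeros(64)
    for k in range(n_layers):
        start = int(k * thickness)
        x[start:start + int(thickness)] += (k + 1)
    return x + rng.normal(0, 0.05, 64)

# Training data: two structure classes (1-layer and 2-layer objects).
X, y_class, y_thick = [], [], []
for n_layers in (1, 2):
    for t in rng.uniform(5, 20, 200):
        X.append(make_scan(n_layers, t))
        y_class.append(n_layers)
        y_thick.append(t)
X, y_class, y_thick = np.array(X), np.array(y_class), np.array(y_thick)

# Stage 1: nearest-centroid classifier over structure types.
centroids = {c: X[y_class == c].mean(axis=0) for c in (1, 2)}
def classify(x):
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))

# Stage 2: one least-squares linear regressor ("expert") per class.
experts = {}
for c in (1, 2):
    mask = y_class == c
    A = np.column_stack([X[mask], np.ones(mask.sum())])
    experts[c], *_ = np.linalg.lstsq(A, y_thick[mask], rcond=None)

def predict_thickness(x):
    c = classify(x)                      # route to the matching expert
    return c, np.append(x, 1.0) @ experts[c]

c, t = predict_thickness(make_scan(2, 12.0))
print(c, round(t, 1))
```

The routing step is the essential idea: each regressor only ever sees scans of one structure type, so its task stays simple, mirroring the per-structure regression models described above.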

It is worth pointing out that solving the inverse problem automatically improves the imaging depth: the lower half of an OCT image (i.e., greater depth) could previously hardly be seen but now becomes fully resolved. Interestingly, although the OCT signals constituting the lower half of the image are weak, messy, and uninterpretable to human eyes, they still carry enough information that a well-trained machine learning model can recover precisely the true structure of the object being imaged. This is just another case where Artificial Intelligence (AI) outperforms humans. To the best of the author's knowledge, this thesis is not only a success but also the first attempt to reconstruct an OCT image at a pixel level. Even to attempt such a task requires fully annotated OCT images, and a lot of them (hundreds or even thousands). This is clearly impossible without a powerful simulation tool like the one developed in this thesis.

Relevance:

80.00%

Publisher:

Abstract:

Chap. 1. The interrelation between the tourism and heritage systems: beyond apologetic discourses and reductionist practices. Iñaki Arrieta Urtizberea.
Chap. 2. Tourists and museums: apocalyptic and integrated. José Antonio Donaire.
Chap. 3. Cultural tourism: fictions about realities, realities about inventions. Agustín Santana Talavera, Pablo Díaz Rodríguez and Alberto Jonay Rodríguez Darias.
Chap. 4. Museums adrift or continents adrift? Consequences of the financial crisis for the museums of North America. Yves Bergeron.
Chap. 5. Historical heritage, tourism, economy: a challenge or an alliance? The case of Populonia (Tuscany, Italy). Daniele Manacorda.
Chap. 6. A post-revolutionary diagnosis in Tunisia: tourism delirium, museum fever and jasmine madness. Habib Saidi.
Chap. 7. Ethnological heritage: socioeconomic resource or sociopolitical instrument? The case of the Astilleros Nereo shipyards of Málaga. Esther Fernández de Paz.
Chap. 8. Of ramps and walkways: the Guggenheim museums as generic artistic spaces. Sophia Carmen Vackimes.
Chap. 9. Heritage as a source of sustainable development in the inland northern regions of Portugal: the case of the municipality of Vieira do Minho. Eduardo Jorge Duque.
Chap. 10. Museums, tourism and local development: the case of Belmonte, Portugal. Luís Silva.
Chap. 11. Are there economic-efficiency reasons behind decisions to partially close some local museums? Analysis of the case of the Darder Museum (Banyoles) in the context of the museums of Catalonia. Gabriel Alcalde, Josep Burch, Modest Fluvià, Ricard Rigall i Torrent and Albert Saló.

Relevance:

80.00%

Publisher:

Abstract:

314 p.

Relevance:

80.00%

Publisher:

Abstract:

To analyse strains of Salmonella ser. Typhimurium isolated from human enteric and extraintestinal infections occurring between 1970 and 2008 in different regions of the country, samples of the prevalent phage type 193 were selected from the records of the database of the Enterobacteria Laboratory of IOC/FIOCRUZ, RJ, with the primary aim of recognizing epidemic clones. A total of 553 strains of Salmonella ser. Typhimurium phage type 193 were selected: 91, 65, 70 and 327 samples from the 1970s, 1980s, 1990s and the 2000-2008 period, respectively. In the overall susceptibility analysis, 52% of the strains carried one or more of the antibiotic-resistance markers included in the ACSSuT profile. The complete resistance profile was found in 20.9% of the isolates, while the remaining 21.9% were sensitive to all drugs tested, especially in the 2000-2008 period, represented by 121 samples (37.0%) of the 327 cultures from that period. The highest percentage of resistance was observed in the samples from the 1970s (99%), with the ACSSuT profile detected in 35.2% of the isolates; notably, all of these samples were isolated from gastroenteric infections in the city of São Paulo. Across the four decades studied, a break point between the prevalence of resistance and of susceptibility is described at the transition between the 1980s and 1990s. Although the number of Salmonella ser. Typhimurium phage type 193 isolates increased in the last period considered, the percentage of mono- and multidrug resistance remained high (63.0%). Analysis of the polymorphism obtained after macrorestriction with the XbaI enzyme revealed that strains isolated in the 1990s showed a high percentage of similarity (≥85%) with recently isolated strains (2000-2008 period), being grouped in the same subclusters.
The strains from the 1970s, on the other hand, fall into independent subclusters, although the percentage of similarity between these subclusters and the others is ≥70%; the same was observed for the strains isolated during the 1980s. In conclusion, this study showed that the prevalence of human isolates of Salmonella ser. Typhimurium phage type 193 in Brazil has been increasing since the 1990s, while detection of the R (ACSSuT) pattern is decreasing; PFGE evaluation indicated a multiplicity of macrorestriction profiles within phage type 193, yet with high percentages of similarity to one another, suggesting some degree of clonality, considering the period between isolation and analysis.
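The grouping of profiles at a similarity threshold, as in the ≥85% subclusters above, can be sketched as follows; the band patterns are hypothetical, and a Dice coefficient with single-linkage joining stands in for the dendrogram analysis actually performed.

```python
import numpy as np

# Illustrative sketch with synthetic band patterns (not the IOC/FIOCRUZ
# data): PFGE macrorestriction profiles are compared as binary band-
# presence vectors, and profiles at >= 85% similarity are grouped together.
def dice(a, b):
    """Dice similarity between two binary band-presence vectors."""
    shared = np.sum(a & b)
    return 2.0 * shared / (np.sum(a) + np.sum(b))

# Four hypothetical XbaI profiles over 12 possible band positions.
p1 = np.array([1, 1, 0, 1, 1, 0, 1, 0, 1, 1, 0, 1])
p2 = np.array([1, 1, 0, 1, 1, 0, 1, 0, 1, 1, 0, 0])  # one band fewer than p1
p3 = np.array([1, 1, 0, 1, 1, 0, 1, 0, 1, 1, 0, 1])  # identical to p1
p4 = np.array([0, 1, 1, 0, 0, 1, 0, 1, 0, 0, 1, 0])  # distinct profile

profiles = [p1, p2, p3, p4]
# Single-linkage grouping: join a profile to a cluster if it is >= 85%
# similar to any member (like cutting a dendrogram at the 85% level).
clusters = []
for p in profiles:
    for cl in clusters:
        if any(dice(p, q) >= 0.85 for q in cl):
            cl.append(p)
            break
    else:
        clusters.append([p])

print(len(clusters))
```

Here p1, p2 and p3 collapse into one subcluster while p4 stays separate, mirroring how highly similar profiles from different decades can share a subcluster while divergent ones do not.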