859 results for Multi-Agent Control


Relevance:

30.00%

Publisher:

Abstract:

The present review deals with the stages of synthesis and processing of asparagine-linked oligosaccharides occurring in the lumen of the endoplasmic reticulum and their relationship to the acquisition by glycoproteins of their proper tertiary structures. Special emphasis is placed on reactions taking place in trypanosomatid protozoa, since their study has allowed the detection of the transient glucosylation of glycoproteins catalyzed by UDP-Glc:glycoprotein glucosyltransferase and glucosidase II. The former enzyme has the unique property of covalently tagging improperly folded conformations by catalyzing the formation of protein-linked Glc1Man7GlcNAc2, Glc1Man8GlcNAc2 and Glc1Man9GlcNAc2 from the unglucosylated proteins. Glucosyltransferase is a soluble protein of the endoplasmic reticulum that recognizes protein domains exposed in denatured but not in native conformations (probably hydrophobic amino acids) and the innermost N-acetylglucosamine unit, which is hidden from macromolecular probes in most native glycoproteins. In vivo, the glucose units are removed by glucosidase II. The influence of oligosaccharides on glycoprotein folding is reviewed, as well as the participation in the same process of the endoplasmic reticulum chaperones (calnexin and calreticulin) that recognize monoglucosylated species. A model for the quality control of glycoprotein folding in the endoplasmic reticulum, i.e., the mechanism by which cells recognize the tertiary structure of glycoproteins and only allow transit to the Golgi apparatus of properly folded species, is discussed. The main elements of this control are calnexin and calreticulin as retaining components, the UDP-Glc:glycoprotein glucosyltransferase as a sensor of tertiary structures and glucosidase II as the releasing agent.

Relevance:

30.00%

Publisher:

Abstract:

Due to various advantages such as flexibility, scalability and updatability, software-intensive systems are increasingly embedded in everyday life. The constantly growing number of functions executed by these systems requires a high level of performance from the underlying platform. The main approach to increasing performance has been to raise the operating frequency of the chip. However, this has led to the problem of power dissipation, which has shifted the focus of research to parallel and distributed computing. Parallel many-core platforms can provide the required level of computational power along with low power consumption. On the one hand, this enables parallel execution of highly intensive applications, and such platforms are likely to be used in various application domains: from home electronics (e.g., video processing) to complex critical control systems. On the other hand, the resources have to be utilized efficiently in terms of performance and power consumption. However, the high level of on-chip integration increases the probability of various faults and the creation of hotspots leading to thermal problems. Additionally, radiation, which is frequent in space but becomes an issue also at ground level, can cause transient faults. This can eventually induce faulty execution of applications. Therefore, it is crucial to develop methods that enable efficient as well as resilient execution of applications. The main objective of the thesis is to propose an approach to designing agent-based systems for many-core platforms in a rigorous manner. When designing such a system, we explore and integrate various dynamic reconfiguration mechanisms into the agents' functionality. The use of these mechanisms enhances the resilience of the underlying platform whilst maintaining performance at an acceptable level. The design of the system proceeds according to a formal refinement approach which allows us to ensure correct behaviour of the system with respect to postulated properties. To enable analysis of the proposed system in terms of area overhead as well as performance, we explore an approach in which the developed rigorous models are transformed into a high-level implementation language. Specifically, we investigate methods for deriving fault-free implementations from these models into a hardware description language, namely VHDL.

Relevance:

30.00%

Publisher:

Abstract:

Genes encoding the lipoproteins LipL32 and LipL41 and the outer-membrane protein OmpL1 of Leptospira were recombined and cloned into a pVAX1 plasmid. BALB/c mice were immunized with LipL32 and recombinant LipL32-41-OmpL1 using DNA-DNA, DNA-protein and protein-protein strategies, respectively. Prime immunization was on day 1; boost immunizations were on days 11 and 21. Sera were collected from each mouse on day 35 for antibody and cytokine detection and the microscopic agglutination test, while spleen cells were collected for a splenocyte proliferation assay. All experimental groups (N = 10 mice per group) showed statistically significant increases in antigen-specific antibodies, in the cytokines IL-4 and IL-10, as well as in the microscopic agglutination test and splenocyte proliferation compared with the pVAX1 control group. The groups receiving the recombinant LipL32-41-OmpL1 vaccine developed anti-LipL41 and anti-OmpL1 antibodies and yielded better splenocyte proliferation values than the groups receiving LipL32. DNA-prime/protein-boost strategies stimulated more antibodies than a DNA-DNA strategy and yielded greater cytokine and splenocyte proliferation responses than a protein-protein strategy. It is clear from these results that recombination of the protective antigen genes lipL32, lipL41 and ompL1, together with a DNA-protein immune strategy, resulted in better immune responses against Leptospira than single-component (LipL32), single-DNA or single-protein immunization.

Relevance:

30.00%

Publisher:

Abstract:

Pancreatic adenocarcinoma is important in oncology because of its high mortality rate. Many deaths could be avoided if an early diagnosis were achieved. Several types of tumors overexpress gastrin-releasing peptide receptors (GRPr), including pancreatic cancer cells. Thus, a radiolabeled peptide derivative of gastrin-releasing peptide (GRP) may be useful as a specific imaging probe. The purpose of the present study was to evaluate the feasibility of using 99mTc-HYNIC-βAla-Bombesin(7-14) as an imaging probe for Capan-1 pancreatic adenocarcinoma. A xenograft pancreatic tumor was developed in nude mice and characterized by histopathological analysis. Biodistribution studies and scintigraphic imaging were carried out in tumor-bearing nude mice. The two methods showed higher uptake by the pancreatic tumor than by muscle (used as control), and the tumor-to-muscle ratio indicated that 99mTc-HYNIC-βAla-Bombesin(7-14) uptake was four-fold higher in tumor cells than in other tissues. Scintigraphic images also showed a clear signal at the tumor site. The present data indicate that 99mTc-HYNIC-βAla-Bombesin(7-14) may be useful for the detection of pancreatic adenocarcinoma.

Relevance:

30.00%

Publisher:

Abstract:

The aim of this thesis is to propose a novel control method for teleoperated electrohydraulic servo systems that implements a reliable haptic sense between the human and the manipulator and ideal position control between the manipulator and the task environment. The proposed method is a universal technique independent of the actual control algorithm and can be applied together with other suitable control methods as a real-time control strategy. The motivation for developing this control method is the need for a reliable real-time controller for teleoperated electrohydraulic servo systems that provides highly accurate position control based on joystick inputs with haptic capabilities. The contribution of the research is that the proposed control method combines a directed random search method with a real-time simulation to develop an intelligent controller in which each generation of parameters is tested on-line by the real-time simulator before being applied to the real process. The controller was evaluated on a hydraulic position servo system. The simulator of the hydraulic system was built using the Markov chain Monte Carlo (MCMC) method. A Particle Swarm Optimization algorithm combined with the foraging behavior of E. coli bacteria was utilized as the directed random search engine. The control strategy allows the operator to be plugged into the work environment dynamically and kinetically. This helps to ensure that the system has a haptic sense with high stability, without abstracting away the dynamics of the hydraulic system. The new control algorithm provides asymptotically exact tracking of both the position and the contact force. In addition, this research proposes a novel method for re-calibration of multi-axis force/torque sensors. The method makes several improvements over traditional methods: it can be used without dismantling the sensor from its application, it requires a smaller number of standard loads for calibration, and it is more cost-efficient and faster than traditional calibration methods. The proposed method was developed in response to re-calibration issues with the force sensors utilized in teleoperated systems, the aim being to avoid dismantling the sensors from their applications for calibration. A major complication with many manipulators is the difficulty of accessing them when they operate inside an inaccessible environment, especially if that environment is harsh, such as a radioactive area. The proposed technique is based on design-of-experiments methodology. It has been successfully applied to different force/torque sensors, and this research presents experimental validation of the calibration method with one of the force sensors to which it has been applied.
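
The directed random search described above pairs Particle Swarm Optimization with an E. coli-style chemotaxis step, with every candidate parameter set evaluated before being accepted. The sketch below illustrates that combination on a stand-in quadratic cost; the gains, step sizes and the sphere fitness are illustrative assumptions, not the thesis' actual simulator or tuning.

```python
import random

def sphere(params):
    # Stand-in fitness: in the thesis each candidate is scored by a
    # real-time simulator of the hydraulic servo; here we minimize a
    # simple quadratic cost so the sketch is self-contained.
    return sum(p * p for p in params)

def pso_with_chemotaxis(fitness, dim=3, swarm=20, iters=60, seed=1):
    """Minimal PSO whose particles also take small E. coli-style tumble
    steps (chemotaxis) before the velocity update. Coefficients are
    illustrative assumptions, not the thesis' tuned values."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5.0, 5.0) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    pbest_f = [fitness(p) for p in pos]
    g = min(range(swarm), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(swarm):
            # Chemotaxis: a random tumble, kept only if it improves --
            # every candidate is "tested" before being applied, mirroring
            # the simulator-in-the-loop idea.
            trial = [x + rng.uniform(-0.1, 0.1) for x in pos[i]]
            if fitness(trial) < fitness(pos[i]):
                pos[i] = trial
            # Standard PSO velocity/position update.
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            f = fitness(pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

best, cost = pso_with_chemotaxis(sphere)
```

In the thesis' setting, `sphere` would be replaced by the MCMC-built simulator's cost, so that only parameter generations that behave well in simulation reach the real hydraulic process.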

Relevance:

30.00%

Publisher:

Abstract:

Crystal properties, product quality and particle size are determined by the operating conditions of the crystallization process. Thus, in order to obtain the desired end-products, the crystallization process should be effectively controlled based on reliable kinetic information, which can be provided by powerful analytical tools such as Raman spectrometry and thermal analysis. The present research work studied various crystallization processes: reactive crystallization, anti-solvent precipitation and evaporation crystallization. The goal of the work was to understand more comprehensively the fundamentals, phenomena and applications of crystallization, and to establish proper methods to control particle size distribution, especially for three-phase gas-liquid-solid crystallization systems. As part of the solid-liquid equilibrium studies in this work, prediction of KCl solubility in the MgCl2-KCl-H2O system was studied theoretically. Additionally, a solubility prediction model based on the Pitzer thermodynamic model was investigated using solubility measurements of potassium dihydrogen phosphate in aqueous solutions in the presence of non-electrolyte organic substances. The prediction model helps to extend literature data and offers an easy and economical way to choose a solvent for anti-solvent precipitation. Using experimental and modern analytical methods, precipitation kinetics and mass transfer in the reactive crystallization of magnesium carbonate hydrates from magnesium hydroxide slurry and CO2 gas were systematically investigated. The obtained results gave deeper insight into gas-liquid-solid interactions and the mechanisms of this heterogeneous crystallization process. The research approach developed can provide theoretical guidance and act as a useful reference to promote the development of gas-liquid reactive crystallization. Gas-liquid mass transfer during absorption in the presence of solid particles in a stirred tank was investigated in order to understand how particles of different sizes interact with gas bubbles. Based on the obtained volumetric mass transfer coefficient values, it was found that the influence of small particles on gas-liquid mass transfer cannot be ignored, since there are interactions between bubbles and particles. Raman spectrometry was successfully applied to liquid and solids analysis in semi-batch anti-solvent precipitation and evaporation crystallization. Real-time information such as supersaturation, formation of precipitates and identification of crystal polymorphs could be obtained by Raman spectrometry. The solubility prediction models, monitoring methods for precipitation and empirical model for absorption developed in this study, together with the methodologies used, give valuable information for industrial crystallization. Furthermore, Raman analysis was seen to be a potential control method for various crystallization processes.
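
As a rough illustration of how a volumetric mass transfer coefficient can be extracted from dynamic absorption data, the sketch below fits the classic gassing-in model C(t) = C*(1 − exp(−kLa·t)). This is a textbook linearization, not the specific procedure used in the study, and the numbers are synthetic.

```python
import math

def kla_from_dynamic_absorption(times, conc, c_star):
    """Estimate the volumetric mass transfer coefficient kLa (1/s) from
    dynamic gassing-in data by linearizing C(t) = C*(1 - exp(-kLa*t)):
    ln(1 - C/C*) = -kLa * t, then a least-squares slope through the
    origin. A textbook method, not the study's actual procedure."""
    xs, ys = [], []
    for t, c in zip(times, conc):
        if 0.0 < c < c_star:          # keep only physically valid points
            xs.append(t)
            ys.append(math.log(1.0 - c / c_star))
    slope = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)
    return -slope

# Synthetic check: data generated with kLa = 0.05 1/s and C* = 8 mg/L.
times = [10, 20, 40, 60, 90, 120]
conc = [8.0 * (1 - math.exp(-0.05 * t)) for t in times]
print(round(kla_from_dynamic_absorption(times, conc, 8.0), 3))  # → 0.05
```

With real stirred-tank data the residuals of this fit also reveal whether the presence of particles distorts the simple first-order absorption picture.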

Relevance:

30.00%

Publisher:

Abstract:

In this work, bromelain was recovered from ground pineapple stem and rind by means of precipitation with alcohol at low temperature. Bromelain is the name of a group of powerful protein-digesting (proteolytic) enzymes that are particularly useful for reducing muscle and tissue inflammation and as a digestive aid. Temperature control is crucial to avoid irreversible protein denaturation and, consequently, to improve the quality of the recovered enzyme. The process was carried out alternately in two fed-batch pilot tanks: a glass tank and a stainless steel tank. Aliquots containing 100 mL of pineapple aqueous extract were fed into the tank. Inside the jacketed tank, the protein was exposed to unsteady operating conditions during the addition of the precipitating agent (ethanol 99.5%), because the aqueous-extract-to-ethanol dilution ratio and the heat transfer area changed. The coolant flow rate was manipulated through a variable-speed pump. Fine-tuned conventional and adaptive PID controllers were implemented on-line using a fieldbus digital control system. The processing efficiency was enhanced, and so was the quality (enzyme activity) of the product.
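
A discrete PID update of the kind used to manipulate the coolant flow can be sketched as follows. The gains, time step and the toy first-order tank model are illustrative assumptions, not the tuned controllers or the process model of the study.

```python
def pid_step(setpoint, measured, state, kp=2.0, ki=0.1, kd=0.5, dt=1.0):
    """One discrete PID update: returns the control output (e.g. coolant
    pump speed) and the updated controller state. Gains and time step are
    illustrative assumptions, not the study's tuned values."""
    error = setpoint - measured
    integral = state["integral"] + error * dt
    derivative = (error - state["prev_error"]) / dt
    output = kp * error + ki * integral + kd * derivative
    return output, {"integral": integral, "prev_error": error}

# Toy closed loop: a first-order "tank" cooled toward a 5 degC setpoint
# against a 25 degC ambient; the plant model is invented for illustration.
temp = 25.0
state = {"integral": 0.0, "prev_error": 0.0}
for _ in range(50):
    u, state = pid_step(5.0, temp, state)
    temp += 0.05 * u - 0.02 * (temp - 25.0)  # actuator effect vs. ambient drift
```

An adaptive variant, as in the study, would additionally re-estimate the gains on-line as the dilution ratio and heat transfer area change during ethanol addition.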

Relevance:

30.00%

Publisher:

Abstract:

In this work an agent-based model (ABM) was proposed using the main idea from the Jabłonska-Capasso-Morale (JCM) model and the concept of maximized greediness. Using a multi-agent simulator, the power of the ABM was assessed against the historical prices of silver metal from 01.03.2000 to 01.03.2013. The model results, analysed in two situations, with and without maximized greediness, proved that the ABM is capable of explaining the silver price dynamics even in extreme events. The ABM without maximized greediness explained the prices with more irrationality, whereas the ABM with maximized greediness tracked the price movements with more rational decisions. In the comparison test, the model without maximized greediness best captured the silver market dynamics. Therefore, the proposed ABM supports the suggested explanations for financial crises and market failures: it reveals that an economic or financial collapse may be stimulated by both irrational and rational decisions, yet irrationality may dominate the market.
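
A minimal toy version of such a price ABM can be sketched as follows. It is not the actual JCM formulation: here each agent carries a bounded sentiment, a "greed" parameter pulls sentiment toward the recent trend (the maximized-greediness ingredient), Gaussian noise stands in for irrational decisions, and the average sentiment moves the log-price. All parameters are invented for illustration.

```python
import math
import random

def simulate_abm_price(steps=250, n_agents=100, greed=0.0, p0=30.0, seed=7):
    """Toy agent-based price model in the spirit of the chapter (not the
    actual JCM model). Each agent holds a sentiment in [-1, 1]; 'greed'
    pulls sentiment toward the recent trend, noise stands in for
    irrational decisions, and aggregate sentiment moves the log-price."""
    rng = random.Random(seed)
    sentiment = [rng.uniform(-1.0, 1.0) for _ in range(n_agents)]
    prices = [p0]
    trend = 0.0
    for _ in range(steps):
        for i in range(n_agents):
            s = (1.0 - greed) * sentiment[i] + greed * trend + rng.gauss(0.0, 0.3)
            sentiment[i] = max(-1.0, min(1.0, s))  # keep sentiment bounded
        demand = sum(sentiment) / n_agents         # aggregate excess demand
        trend = demand
        prices.append(prices[-1] * math.exp(0.01 * demand))
    return prices

irrational = simulate_abm_price(greed=0.0)  # "without maximized greediness"
greedy = simulate_abm_price(greed=0.8)      # "with maximized greediness"
```

Comparing the two runs against an observed series (as the chapter does with silver prices) then shows which behavioural assumption tracks the market better.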

Relevance:

30.00%

Publisher:

Abstract:

The aim of this Master's thesis is to find out how internal control should be structured in a Finnish retail company in order to fulfil the requirements set out in the Finnish Corporate Governance Code and to add value for the company, as well as to analyse the added value that structured and centrally led internal control can provide for the case company. The underlying theoretical framework of the study stems from the theory of the firm; the principal-agent problem is the primary motivator for internal control. Regulatory requirements determine the thresholds that the internal control of a company must reach. The research was carried out as a case study; methodologically, the study is qualitative, and the empirical data gathering was conducted through interviews and participant observation. The data gathered (processes, controls, etc.) is used to understand the control environment of the company and to assess the current state of internal control. Deficiencies and other points of development identified are then discussed.

Relevance:

30.00%

Publisher:

Abstract:

Forty-four bacteriophage isolates of Erwinia amylovora, the causal agent of fire blight, were collected from sites in and around the Niagara Region of Southern Ontario in the summer of 1998. Phages were isolated only from sites where fire blight was present. Thirty-seven of these phages were isolated from the soil surrounding infected trees, with the remainder isolated from aerial plant tissue samples. A mixture of six E. amylovora bacterial host strains was used to enrich field samples in order to avoid the selection bias of a single-host system. Molecular characterization of the phages with a combination of PCR and restriction endonuclease digestions showed that six distinct phage types were isolated. Ten phage isolates related to the previously characterized E. amylovora phage PEa1 were isolated, with some divergence of molecular markers between phages isolated from different sites. The host ranges of the phages revealed that certain types were unable to efficiently lyse some E. amylovora strains, and that some types were able to lyse the epiphytic bacterium Pantoea agglomerans. Biological control of E. amylovora by the bacteriophages was assessed in a bioassay using discs of immature pear fruit. Twenty-three phage isolates were able to significantly suppress the incidence of bacterial exudate on the pear disc surface. Quantification of the bacterial population remaining on the disc surface indicated that population reductions of up to 97% were obtainable by phage treatment, but that elimination of bacteria from the surface was not possible with this model system.

Relevance:

30.00%

Publisher:

Abstract:

Understanding the machinery of gene regulation in order to control gene expression has been one of the main focuses of bioinformaticians for years. We use a multi-objective genetic algorithm to evolve a specialized version of side effect machines for degenerate motif discovery. We compare several proposed objectives for the motifs they find, test different multi-objective scoring schemes and probabilistic models for the background sequences, and report our results on a synthetic dataset and several biological benchmarking suites. We conclude with a comparison of our algorithm with some widely used motif discovery algorithms in the literature and suggest future directions for research in this area.
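
A basic building block of any degenerate motif search is matching motifs written in the IUPAC degenerate-nucleotide alphabet against sequences. The sketch below shows only that matching step; it is not the side-effect-machine algorithm of the paper, and the example motif and sequence are made up.

```python
# IUPAC degenerate-nucleotide codes: each symbol stands for a set of bases.
IUPAC = {"A": "A", "C": "C", "G": "G", "T": "T", "R": "AG", "Y": "CT",
         "S": "CG", "W": "AT", "K": "GT", "M": "AC", "B": "CGT",
         "D": "AGT", "H": "ACT", "V": "ACG", "N": "ACGT"}

def motif_matches(motif, sequence):
    """Return the start positions where the degenerate motif matches."""
    hits = []
    m = len(motif)
    for i in range(len(sequence) - m + 1):
        window = sequence[i:i + m]
        if all(base in IUPAC[sym] for sym, base in zip(motif, window)):
            hits.append(i)
    return hits

# Invented example: a TATA-like degenerate motif against a short sequence.
print(motif_matches("TATAWN", "GGTATAAACCTATATT"))  # → [2, 10]
```

A motif-discovery GA would score candidate motifs by counting such matches in target versus background sequences, which is where the multi-objective trade-offs discussed above arise.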

Relevance:

30.00%

Publisher:

Abstract:

The National Seamen's Association was a labour recruiter hiding behind a union-like name. It was run by H.N. McMaster, who collected fees from companies and dues from workers. With McMaster in charge, shipping interests could claim that their seamen had a union, but ship-owners were free to push their vessels and their workers to the breaking point. In 1935, the members on the Great Lakes decided to strike. One year later, they created their own union and amalgamated with a Montreal-based independent body to create the Canadian Seamen's Union, headed by a ship's cook who became a union leader, John Allan Patrick "Pat" Sullivan. By the late 1940s, almost all sailors on Canadian ships were CSU members. Right from its inception in 1936, Communists were prominent among the leaders of the union. Sullivan had been recruited to the Communist party that year, and the union had a close rapport with the party. On June 8, 1940, Pat Sullivan was arrested because of his affiliation with the Communist party. He was incarcerated until March 20, 1942; no charges were laid, no bail was set and there was no trial. After his release, Sullivan was elected second vice-president of the Trades and Labour Congress of Canada. In 1943, Percy Bengough was elected president and Sullivan secretary-treasurer of the TLC, while Sullivan maintained his role as president of the CSU. On March 14, 1947, Sullivan made the shocking announcement that he was resigning from the CSU and the Labor-Progressive Party, claiming that the CSU was under the full control of the Communists. Within a month of this announcement, he emerged as the president of the Canadian Lake Seamen's Union. Ship-owners never really reconciled themselves to having their industry unionized; in 1946 there had been a seamen's strike in which the union won the eight-hour day, and in 1949 the shipping companies had a plan to get rid of the union and were negotiating behind its back with the Seafarers International Union (SIU). In a brutal confrontation led by Hal Banks, an American ex-convict, the SIU was able to oust the CSU and take over the bargaining rights of Canadian seamen. On July 15, 1948, Robert Lindsay, Sullivan's Welland business agent, said that to the best of his knowledge Sullivan's outfit, the CLSU, was under the control of some of the steamship companies. Lindsay had heard that there was a movement to get rid of Bengough of the Trades and Labour Congress, as well as elements of the CSU. He had also heard that the CLSU wanted to affiliate with the American Federation of Labor. Lindsay's allegations raised several questions: Were the ship-owners powerful enough to oust Percy Bengough because he supported the seamen? Could the CLSU obtain an affiliation with the American Federation of Labor? And would the American Federation of Labor actually affiliate with a union that was siding with employers against a locked-out union?

Relevance:

30.00%

Publisher:

Abstract:

Pérez-Castrillo and Wettstein (2002) propose a multi-bidding mechanism to determine a winner from a set of possible projects. The winning project is implemented and its surplus is shared among the agents. In the multi-bidding mechanism each agent announces a vector of bids, one for each possible project, constrained to sum to zero. In addition, each agent chooses a favorite project, which is used as a tie-breaker if several projects receive the same highest aggregate bid. Since more desirable projects receive larger bids, it is natural to consider the multi-bidding mechanism without the announcement of favorite projects. We show that the merits of the multi-bidding mechanism are not robust to this natural simplification. Specifically, a Nash equilibrium exists if and only if there are at least two individually optimal projects and all individually optimal projects are efficient.
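
The winner determination of the mechanism can be sketched as follows. The zero-sum bid constraint and the highest-aggregate-bid rule follow the description above; the tie-break shown (a plurality vote over announced favorites) is a simplification of the paper's tie-breaking.

```python
def multi_bidding_winner(bids, favorites):
    """Winner determination for the multi-bidding mechanism: each agent's
    bid vector must sum to zero, and the project with the highest
    aggregate bid wins. The tie-break used here (plurality over announced
    favorites) is a simplification, not the paper's exact rule."""
    n_projects = len(bids[0])
    for b in bids:
        assert abs(sum(b)) < 1e-9, "each bid vector must sum to zero"
    aggregate = [sum(b[j] for b in bids) for j in range(n_projects)]
    top = max(aggregate)
    tied = [j for j in range(n_projects) if abs(aggregate[j] - top) < 1e-9]
    if len(tied) == 1:
        return tied[0]
    votes = {j: sum(1 for f in favorites if f == j) for j in tied}
    return max(tied, key=lambda j: votes[j])

# Two agents, two projects: aggregate bids are [1.5, -1.5], so project 0 wins.
print(multi_bidding_winner([[1.0, -1.0], [0.5, -0.5]], favorites=[0, 1]))  # → 0
```

Dropping the `favorites` argument is exactly the simplification the paper analyses: without it, ties among top-bid projects have no announced information left to break them.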

Relevance:

30.00%

Publisher:

Abstract:

Lung cancer associated with occupational exposure to nickel, chromium VI and cadmium: two population-based case-control studies in Montreal. In the early 1990s, nickel, chromium VI and cadmium were classified as Group 1 carcinogens by IARC (the International Agency for Research on Cancer). However, the results of the studies underlying the classification of these metals have not always been reproduced, and important questions remain about their effects at low exposure levels. More empirical research is therefore needed to reaffirm the carcinogenicity of these agents and to identify the circumstances in which they can be harmful. The objective of this study was to explore the relationship between exposure to one of these metals (nickel, chromium VI or cadmium) and the subsequent risk of developing lung cancer among workers from different workplaces exposed to these metals to varying degrees. Two population-based case-control studies conducted in Montreal provided the data needed to examine the carcinogenicity of these metals. The first study was conducted between 1979 and 1986 among men aged 35 to 70 with cancer at one of 19 selected anatomical sites. The second study was conducted between 1996 and 2001 among men and women aged 35 to 75 with a diagnosis of malignant lung tumour. In both studies, cases were identified in all hospitals on the island of Montreal, while population controls, matched by age and stratified by sex, were selected from electoral lists. An interview with each subject provided a detailed job history as well as precise information on socio-economic and personal risk factors. The job descriptions were evaluated by a team of expert chemists and hygienists to determine whether the subject had been exposed to each agent, and to measure both the concentration and the duration of each exposure, as well as each participant's cumulative lifetime exposure. To determine whether exposure to one of the three metals was associated with an increased incidence of lung cancer, the data were analysed by logistic regression, with adjustment for relevant confounders including a detailed smoking history. Categorical measures of cumulative exposure were also analysed, as was effect modification by smoking. The two studies were analysed separately and then combined to increase statistical power. The exposure levels measured in this population did not appear to confer an excess risk of lung cancer on workers exposed to chromium VI. However, those exposed to nickel experienced a significant increase in risk, regardless of their exposure level. The risk of developing lung cancer following cadmium exposure was elevated, but not significantly. For each of the three metals, the risk of lung cancer was markedly elevated among non-smokers but not among smokers. The combined effect of smoking and metal exposure was compatible with an additive excess risk. However, the confidence intervals in this study tended to be wide, and limited statistical power may constrain the interpretation of some results. This study is unique in that it provides empirical evidence on the risk of developing lung cancer associated with low levels of exposure to nickel, chromium VI or cadmium across a variety of work settings. In most other studies, the majority of relevant co-exposures were not well controlled. By contrast, this study benefited from the collection and availability of detailed information on smoking and other risk factors. The results have important public health implications, both for determining the risks to workers currently exposed to these metals and for assessing the risks to the general population, which is itself exposed to these metals through pollution and cigarette smoke. This analysis will most likely contribute to a re-evaluation by IARC of the carcinogenicity of these metals. Exploring the relationship between lung cancer risk and exposure to nickel, chromium VI and cadmium is therefore timely and relevant.
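
The basic quantity behind the logistic-regression estimates described above is the odds ratio. As a hedged illustration (with invented counts, not the study's data), a crude odds ratio and a Woolf 95% confidence interval can be computed from a case-control 2x2 table as follows:

```python
import math

def odds_ratio(exposed_cases, exposed_controls, unexposed_cases, unexposed_controls):
    """Crude odds ratio with a Woolf 95% confidence interval from a
    case-control 2x2 table. The counts used below are invented for
    illustration; they are not data from the study."""
    a, b = exposed_cases, exposed_controls
    c, d = unexposed_cases, unexposed_controls
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of the log odds ratio
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)

estimate, (low, high) = odds_ratio(40, 20, 60, 80)
```

The study's actual estimates go further, using logistic regression to adjust the crude ratio for smoking history and other confounders.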

Relevance:

30.00%

Publisher:

Abstract:

Introduction: Global efforts to control tuberculosis are currently constrained by the growing prevalence of HIV/AIDS. Although outbreaks of multidrug-resistant tuberculosis (MDR-TB) are frequently reported among populations with AIDS, the link between HIV/AIDS and the development of drug resistance is not clear. Objectives: This research aimed to: (1) develop a knowledge base concerning the factors associated with MDR-TB outbreaks among patients with HIV/AIDS; (2) use this knowledge to strengthen preliminary measures for better control of pulmonary tuberculosis in patients with HIV/AIDS; and (3) to improve the application of these measures, refine existing bacteriological techniques for Mycobacterium tuberculosis. Methods: Four studies were carried out: (1) a longitudinal study to identify factors associated with an MDR-TB outbreak among AIDS patients who received directly observed treatment, short-course (DOTS) for pulmonary tuberculosis in Lima, Peru, between 1999 and 2005; (2) a cross-sectional study to describe different stages of the natural history of tuberculosis and the prevalence of, and factors associated with, mycobacteria found in the stools of AIDS patients; (3) a pilot project to develop screening strategies for pulmonary tuberculosis among hospitalized AIDS patients using the Microscopic Observation Drug Susceptibility (MODS) assay; and (4) a laboratory study to identify the best critical concentrations for detecting MDR strains of M. tuberculosis using the MODS assay. Results: Study 1 shows that an MDR-TB epidemic among AIDS patients who received DOTS for pulmonary tuberculosis was caused by superinfection with a clone of M. tuberculosis rather than by the development of secondary resistance. Although this clone was more common in the AIDS cohort, there was no difference in the risk of superinfection between patients with and without AIDS. These results suggest that another factor, possibly associated with diarrhea, may contribute to the high prevalence of this clone in AIDS patients. Study 2 suggests that mycobacteria were found in the stools of most AIDS patients with advanced pulmonary tuberculosis, whereas AIDS patients who had been hospitalized in the previous two years for another medical condition were less likely to have mycobacteria in their stools. Study 3 confirms that pulmonary tuberculosis was common among hospitalized AIDS patients but was incorrectly diagnosed using the currently recommended clinical criteria for tuberculosis; the MODS assay, however, detected most of these cases. Moreover, MODS was equally effective when targeted at patients suspected of having tuberculosis on the basis of their symptoms. Study 4 demonstrates the difficulty of detecting M. tuberculosis strains with low-level resistance to ethambutol and streptomycin using the MODS assay with the drug concentrations currently recommended for culture media. However, the diagnostic utility of MODS can be improved by modifying the critical concentrations and using two plates rather than one for routine testing. Conclusion: Our studies highlight the need to improve the diagnosis and treatment of tuberculosis among AIDS patients, particularly those living in resource-limited settings. Moreover, our results underline the indirect effects that health care has on HIV-infected patients and may have on the development of tuberculosis.