898 results for Artificial immune systems
Abstract:
The neurokinin-1 receptor (NK1R) is involved in the regulation of innate and adaptive immune responses. However, the mechanisms by which NK1R modulates these responses are not known. In T cells, the calcineurin and mTOR pathways are the targets of immunosuppressants such as cyclosporine A (CsA), tacrolimus and rapamycin. We therefore sought to determine whether NK1R could act on these pathways and whether pharmacological blockade of NK1R with selective antagonists could enhance the action of these immunosuppressants on T-cell activation. First, our results showed that Jurkat cells (human T cells) express both the NK1R gene and that of its ligands (the endokinins). This suggests the existence of an autocrine tachykinergic regulation of T-cell function. This hypothesis is supported by our data, in which we observed that blocking NK1R with specific antagonists (L-733,060 and L-703,606) in Jurkat cells inhibits IL-2 production and decreases the activation of NFAT (a calcineurin substrate). Interestingly, we showed a combined effect of the NK1R antagonists and the calcineurin inhibitors (CsA and tacrolimus) on IL-2 production and NFAT activation. In contrast, NK1R blockade had no inhibitory effect on the activation of mTOR and p70S6K, but reduced the phosphorylation of S6R (Ser235/236) and Akt (Ser473). Finally, we observed no combined effect of rapamycin and the NK1R antagonist on the activation of mTOR and its signaling pathway. Taken together, our results demonstrate a new mechanism of NFAT regulation involving the NK1R/endokinin tachykinergic system in T cells. We therefore suggest that combining NK1R antagonists with calcineurin inhibitors could be an attractive therapeutic alternative for reducing the doses of CsA and FK506 in graft rejection prevention protocols.
Abstract:
Background: The gut and immune system form a complex integrated structure that has evolved to provide effective digestion and defence against ingested toxins and pathogenic bacteria. However, great variation exists in what is considered normal healthy gut and immune function. Thus, whilst it is possible to measure many aspects of digestion and immunity, it is more difficult to interpret the benefits to individuals of variation within what is considered to be a normal range. Nevertheless, it is important to set standards for optimal function for use by consumers, industry and those concerned with public health. The digestive tract is most frequently the object of functional and health claims, and a large market already exists for gut-functional foods worldwide. Aim: To define normal function of the gut and immune system and describe available methods of measuring it. Results: We have defined normal bowel habit and transit time, identified their role as risk factors for disease and described how they may be measured. Similarly, we have tried to define what constitutes a healthy gut flora in terms of the dominant genera and their metabolism, and listed the many, varied and novel methods for determining these parameters. It has proved less easy to provide boundaries for what constitutes optimal or improved gastric emptying, gut motility, nutrient and water absorption and the function of organs such as the liver, gallbladder and pancreas. The many tests of these functions are described. We have discussed gastrointestinal well-being. Sensations arising from the gut can be both pleasant and unpleasant. However, the characteristics of well-being are ill defined and merge imperceptibly from acceptable to unacceptable, a state that is subjective. Nevertheless, we feel this is an important area for future work and method development. The immune system is even more difficult to make quantitative judgements about. When it is defective, clinical problems ensue, but this is an uncommon state. The innate and adaptive immune systems work synergistically and comprise many cellular and humoral factors. The adaptive system is extremely sophisticated, and between the two arms of immunity there is great redundancy, which provides robust defences. New aspects of immune function are discovered regularly. It is not clear whether immune function can be "improved". Measuring aspects of immune function is possible, but there is no one test that will define either the status or functional capacity of the immune system. Human studies are often limited by the ability to sample only blood or secretions such as saliva, and it should be remembered that only 2% of lymphocytes circulate at any given time, which limits interpretation of data. We recommend assessing the functional capacity of the immune system by: measuring specific cell functions ex vivo; measuring in vivo responses to challenge, e.g. change in antibody in blood or response to antigens; and determining the incidence and severity of infection in target populations during naturally occurring episodes or in response to attenuated pathogens.
Abstract:
This paper represents the first step in ongoing work on designing an unsupervised method based on a genetic algorithm for intrusion detection. Its main role in a broader system is to notify of unusual traffic and in that way provide the possibility of detecting unknown attacks. Most of the machine-learning techniques deployed for intrusion detection are supervised, as these techniques are generally more accurate, but this implies the need to label the data for training and testing, which is time-consuming and error-prone. Hence, our goal is to devise an anomaly detector which is unsupervised but at the same time robust and accurate. Genetic algorithms are robust and able to avoid getting stuck in local optima, unlike many other clustering techniques. The model is verified on the KDD99 benchmark dataset, generating a solution competitive with state-of-the-art solutions, which demonstrates the high potential of the proposed method.
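To make the kind of approach this abstract describes concrete, the following is a minimal sketch (not the authors' implementation) of an unsupervised genetic algorithm that evolves a set of cluster centroids over unlabeled traffic features and flags records far from every centroid as anomalous; the encoding, fitness function, parameters and threshold are illustrative assumptions rather than details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(centroids, X):
    """Negative mean distance of each point to its nearest centroid
    (higher is better); a stand-in for the paper's unspecified fitness."""
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return -d.min(axis=1).mean()

def evolve_centroids(X, k=5, pop_size=30, generations=50, sigma=0.1):
    """Evolve a population of candidate centroid sets with tournament
    selection and Gaussian mutation; no labels are used (unsupervised)."""
    dim = X.shape[1]
    pop = [X[rng.choice(len(X), k, replace=False)] for _ in range(pop_size)]
    for _ in range(generations):
        scores = np.array([fitness(c, X) for c in pop])
        new_pop = []
        for _ in range(pop_size):
            i, j = rng.choice(pop_size, 2, replace=False)      # tournament of 2
            parent = pop[i] if scores[i] > scores[j] else pop[j]
            child = parent + rng.normal(0, sigma, size=(k, dim))  # mutation
            new_pop.append(child)
        pop = new_pop
    scores = np.array([fitness(c, X) for c in pop])
    return pop[int(scores.argmax())]

def anomaly_scores(X, centroids):
    """Distance to the nearest evolved centroid; large values suggest
    traffic unlike anything in the (unlabeled) training data."""
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return d.min(axis=1)

# Usage on synthetic features standing in for KDD99-style records.
X = rng.normal(size=(500, 8))
centroids = evolve_centroids(X)
scores = anomaly_scores(X, centroids)
flagged = scores > np.percentile(scores, 99)   # illustrative threshold
```

On a real KDD99-style dataset, the categorical fields would first need numeric encoding and the features would need scaling before a distance-based fitness of this kind is meaningful.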
Abstract:
Purpose - To present an account of cognition integrating second-order cybernetics (SOC) with enactive perception and dynamic systems theory. Design/methodology/approach - The paper presents a brief critique of classical models of cognition, then outlines how an integration of SOC, enactive perception and dynamic systems theory can overcome some weaknesses of the classical paradigm. Findings - Presents a critique of evolutionary robotics, showing how the issues of teleology and autonomy are left unresolved by this paradigm, although their solution fits within the proposed framework. Research limitations/implications - The paper highlights the importance of genuine autonomy in the development of artificial cognitive systems. It sets out a framework within which the robotic research of cognitive systems could succeed. Practical implications - There are no immediate practical implications, but see research implications. Originality/value - It joins the discussion on the fundamental nature of cognitive systems and emphasises the importance of autonomy and embodiment.
Abstract:
Costs of resistance are widely assumed to be important in the evolution of parasite and pathogen defence in animals, but they have been demonstrated experimentally on very few occasions. Endoparasitoids are insects whose larvae develop inside the bodies of other insects where they defend themselves from attack by their hosts' immune systems (especially cellular encapsulation). Working with Drosophila melanogaster and its endoparasitoid Leptopilina boulardi, we selected for increased resistance in four replicate populations of flies. The percentage of flies surviving attack increased from about 0.5% to between 40% and 50% in five generations, revealing substantial additive genetic variation in resistance in the field population from which our culture was established. In comparison with four control lines, flies from selected lines suffered from lower larval survival under conditions of moderate to severe intraspecific competition.
Abstract:
Chatterbox Challenge is an annual web-based contest for artificial conversational entities (ACE). The 2010 instantiation was the tenth consecutive contest, held between March and June in the 60th year following the publication of Alan Turing's influential disquisition 'Computing Machinery and Intelligence'. Loosely based on Turing's viva voce, interrogator-hidden witness imitation game, a thought experiment to ascertain a machine's capacity to respond satisfactorily to unrestricted questions, the contest provides a platform for technology comparison and evaluation. This paper provides an insight into the emotion content of the entries since the 2005 Chatterbox Challenge. The authors find that synthetic textual systems, none of which are backed by academic or industry funding, are, on the whole and more than half a century since Weizenbaum's natural language understanding experiment, little further than Eliza in terms of expressing emotion in dialogue. This may be a failure on the part of the academic AI community in ignoring the Turing test as an engineering challenge.
Abstract:
Bacterial pathogens exhibit significant variation in their genomic content of virulence factors. This reflects the abundance of strategies pathogens evolved to infect host organisms by suppressing host immunity. Molecular arms-races have been a strong driving force for the evolution of pathogenicity, with pathogens often encoding overlapping or redundant functions, such as type III protein secretion effectors, and hosts encoding ever more sophisticated immune systems. The pathogens' frequent exposure to other microbes, either in their host or in the environment, provides opportunities for the acquisition or interchange of mobile genetic elements. These DNA elements accessorise the core genome and can play major roles in shaping genome structure and altering the complement of virulence factors. Here, we review the different mobile genetic elements, focusing on the more recent discoveries and highlighting their role in shaping bacterial pathogen evolution.
Abstract:
This paper analyzes the changes that ways of organizing memory have undergone since ancient times, turning them into the current artificial memory systems. It aims to draw a parallel between the art of memory (which associates images with specific texts) and hypertext (which also uses associations, but in a non-linear way). Our methodology consisted of a qualitative approach involving the collection of texts about the art of memory and hypertext; this enables us to recover the historical-cultural changes which have modified the form and use of the art of memory and allowed the creation of hypertext. It also analyzes the similarities among artificial memory systems created by different cultures to prevent the loss of knowledge produced by society.
Abstract:
Mycoplasma genitalium (Mg) is a mollicute that causes a range of human urogenital infections. A hallmark of these bacteria is their ability to establish chronic infections that can persist despite completion of appropriate antibiotic therapies and intact and functional immune systems. Intimate adherence and surface colonization of mycoplasmas to host cells are important pathogenic features. However, their facultative intracellular nature is poorly understood, partly due to difficulties in developing and standardizing cellular interaction model systems. Here, we characterize growth and invasion properties of two Mg strains (G37 and 1019V). Mg G37 is a high-passage laboratory strain, while Mg 1019V is a low-passage isolate recovered from the cervix. The two strains diverge partially in gene sequences for adherence-related proteins and exhibit subtle variations in their axenic growth. However, with both strains and consistent with our previous studies, a subset of adherent Mg organisms invade host cells and exhibit perinuclear targeting. Remarkably, intranuclear localization of Mg proteins is observed, which occurred as early as 30 min after infection. Mg strains deficient in adherence were markedly reduced in their ability to invade and associate with perinuclear and nuclear sites.
Abstract:
Object selection refers to the mechanism of extracting objects of interest while ignoring other objects and the background in a given visual scene. It is a fundamental issue for many computer vision and image analysis techniques, and it is still a challenging task for artificial visual systems. Chaotic phase synchronization takes place between almost identical dynamical systems and means that the phase difference between the systems remains bounded over time, while their amplitudes remain chaotic and may be uncorrelated. Rather than complete synchronization, phase synchronization is believed to be a mechanism for neural integration in the brain. In this paper, an object selection model is proposed. Oscillators in the network representing the salient object in a given scene are phase synchronized, while no phase synchronization occurs for background objects. In this way, the salient object can be extracted. In this model, a shift mechanism is also introduced to change attention from one object to another. Computer simulations show that the model produces results similar to those observed in natural vision systems.
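The phase-synchronization behaviour the abstract relies on can be illustrated numerically with two diffusively coupled, nearly identical Rössler oscillators: with sufficient coupling their phase difference stays bounded while their amplitudes remain chaotic. The sketch below shows that effect only; it is not the paper's oscillator network, and the frequency mismatch, coupling strength and integration settings are illustrative assumptions.

```python
import numpy as np
from scipy.integrate import solve_ivp

def coupled_rossler(t, s, w1=0.99, w2=1.01, eps=0.05):
    """Two nearly identical Rossler oscillators, diffusively coupled in x."""
    x1, y1, z1, x2, y2, z2 = s
    dx1 = -w1 * y1 - z1 + eps * (x2 - x1)
    dy1 = w1 * x1 + 0.15 * y1
    dz1 = 0.2 + z1 * (x1 - 10.0)
    dx2 = -w2 * y2 - z2 + eps * (x1 - x2)
    dy2 = w2 * x2 + 0.15 * y2
    dz2 = 0.2 + z2 * (x2 - 10.0)
    return [dx1, dy1, dz1, dx2, dy2, dz2]

t = np.linspace(0, 500, 20000)
sol = solve_ivp(coupled_rossler, (0, 500), [1, 0, 0, 0.9, 0.1, 0], t_eval=t)
x1, y1, _, x2, y2, _ = sol.y

# Geometric phase of each oscillator in its (x, y) plane.
phi1 = np.unwrap(np.arctan2(y1, x1))
phi2 = np.unwrap(np.arctan2(y2, x2))

# Under phase synchronization, the phase difference stays bounded even
# though the amplitudes (e.g. sqrt(x^2 + y^2)) remain chaotic.
print("max |phase difference|:", np.abs(phi1 - phi2).max())
```

In the object-selection setting, oscillators belonging to the salient object would be coupled strongly enough to phase-lock in this sense, while background oscillators would not.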
Abstract:
Biological systems have the facility to capture the salient object(s) in a given scene, but this is still a difficult task for artificial vision systems. In this paper a visual selection mechanism based on an integrate-and-fire neural network is proposed. The model not only can discriminate objects in a given visual scene, but can also deliver the focus of attention to the salient object. Moreover, it processes a combination of relevant features of an input scene, such as intensity, color, orientation, and their contrast. In comparison to other visual selection approaches, this model presents several interesting features. It is able to capture attention on objects of complex form, including those that are not linearly separable. Moreover, computer simulations show that the model produces results similar to those observed in natural vision systems.
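As a minimal illustration of the integrate-and-fire mechanism the abstract refers to (not the paper's full selection network), the sketch below drives one leaky integrate-and-fire neuron per object with a saliency-like scalar input standing in for the combined features mentioned above; stronger inputs reach threshold sooner and spike more often, which is the basis for selecting the salient object. All constants and input values are illustrative assumptions.

```python
import numpy as np

def lif_spike_counts(inputs, steps=1000, dt=1e-3, tau=0.02,
                     threshold=1.0, reset=0.0):
    """Simulate leaky integrate-and-fire neurons, one per input value.
    Stronger (more 'salient') inputs cross threshold sooner and spike more."""
    v = np.zeros_like(inputs, dtype=float)
    spikes = np.zeros_like(inputs, dtype=int)
    for _ in range(steps):
        # Leaky integration: dv/dt = (-v + input) / tau
        v += dt * (-v + inputs) / tau
        fired = v >= threshold
        spikes[fired] += 1
        v[fired] = reset
    return spikes

# Saliency-like drive for four "objects", each scalar standing in for a
# combination of intensity, colour and orientation contrast (made-up values).
drive = np.array([0.8, 1.4, 2.5, 1.1])
print(lif_spike_counts(drive))   # the most salient object spikes most often
```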
Abstract:
Many contaminants are currently unregulated by the government and do not have a set limit, known as the Maximum Contaminant Level, which is dictated by cost and the best available treatment technology. The Maximum Contaminant Level Goal, on the other hand, is based solely upon health considerations and is non-enforceable. In addition to being naturally occurring, contaminants may enter drinking water supplies through industrial sources, agricultural practices, urban pollution, sprawl, and water treatment byproducts. Exposure to these contaminants is not limited to ingestion and can also occur through dermal absorption and inhalation in the shower. Health risks for the general public include skin damage, increased risk of cancer, circulatory problems, and multiple toxicities. At low levels, these contaminants are generally not harmful in our drinking water. However, children, pregnant women, and people with compromised immune systems are more vulnerable to the health risks associated with these contaminants. Vulnerable people should take additional precautions with their drinking water. This research project was conducted in order to learn more about our local drinking water and to characterize our exposure to contaminants. We hope to increase public awareness of water quality issues by educating local residents about their drinking water in order to promote public health and minimize exposure to some of the contaminants contained within public water supplies.
Abstract:
Natural ventilation is an efficient bioclimatic strategy, one that provides thermal comfort, healthiness and cooling to the building. However, the disregard for environmental quality, the uncertainties involved in the phenomenon and the popularization of artificial climate-control systems serve as an excuse for those who neglect the benefits of passive cooling. Unfamiliarity with the concept may be lessened if ventilation is considered at every step of the design, especially in the initial phase, in which decisions have the greatest impact on the construction process. The tools available to quantify the impact of design decisions consist basically of air change rate calculations or computational fluid dynamics (CFD) simulations, both of which are somewhat removed from design practice and difficult to adapt for use in parametric studies. Thus, we chose to verify, through computer simulation, the representativeness of the results of a simplified air change rate calculation method, as well as to make it more compatible with the questions relevant to the first phases of the design process. The case object consists of a model resulting from the recommendations of the Código de Obras de Natal/RN, customized according to NBR 15220. The study has shown the complexity of incorporating a CFD tool into the process and the need for a method capable of generating data at a rate compatible with the flow of ideas generated and discarded during the design's development. At the end of our study, we discuss the concessions necessary for carrying out the simulations, the applicability and the limitations of both the tools used and the method adopted, as well as the representativeness of the results obtained.
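For context on the simplified air change rate calculation mentioned above, one common textbook-style estimate of wind-driven natural ventilation computes the airflow through an opening as Q = Cv * A * v and converts it to air changes per hour. The sketch below is generic and uses assumed values; it is not the method developed in the study.

```python
# Simplified wind-driven ventilation estimate (illustrative values only).
def air_changes_per_hour(opening_area_m2, wind_speed_ms, room_volume_m3,
                         effectiveness=0.5):
    """Q = Cv * A * v  (m^3/s), converted to air changes per hour (ACH).
    Cv is the opening effectiveness, roughly 0.5-0.6 for wind
    approximately perpendicular to the opening."""
    q = effectiveness * opening_area_m2 * wind_speed_ms   # m^3/s
    return q * 3600.0 / room_volume_m3

# Example: a 1.5 m^2 window, 2 m/s wind, 45 m^3 room.
print(air_changes_per_hour(1.5, 2.0, 45.0))   # ~120 ACH (ideal, unobstructed)
```

Estimates of this kind are fast enough to keep pace with early design iterations, which is precisely the gap between simplified calculations and full CFD that the study discusses.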