9 results for high-level features

in AMS Tesi di Dottorato - Alm@DL - Università di Bologna


Relevance: 100.00%

Abstract:

Agent Communication Languages (ACLs) have been developed to provide a way for agents to communicate with each other, supporting cooperation in Multi-Agent Systems. In the past few years many ACLs have been proposed for Multi-Agent Systems, such as KQML and FIPA-ACL. The goal of these languages is to support high-level, human-like communication among agents, exploiting Knowledge Level features rather than symbol-level ones. Adopting these ACLs, and mainly the FIPA-ACL specifications, many agent platforms and prototypes have been developed. Despite these efforts, an important issue in the research on ACLs is still open and concerns how these languages should deal (at the Knowledge Level) with possible failures of agents. Indeed, the notion of Knowledge Level cannot be straightforwardly extended to a distributed framework such as MASs, because problems concerning communication and concurrency may arise when several Knowledge Level agents interact (for example, deadlock or starvation). The main contribution of this Thesis is the design and implementation of NOWHERE, a platform to support Knowledge Level Agents on the Web. NOWHERE exploits an advanced Agent Communication Language, FT-ACL, which provides high-level fault-tolerant communication primitives and satisfies a set of well-defined Knowledge Level programming requirements. NOWHERE is well integrated with current technologies, for example providing full integration with Web services. Since it supports different middleware for message transport, it can be adapted to various scenarios. In this Thesis we present the design and implementation of the architecture, together with a discussion of the most interesting details and a comparison with other emerging agent platforms. We also present several case studies in which we discuss the benefits of programming agents using the NOWHERE architecture, comparing the results with other solutions. Finally, the complete source code of the basic examples can be found in the appendix.
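FT-ACL's fault-tolerant primitives are only described abstractly above. As a purely illustrative sketch (not the actual FT-ACL API: the primitive name, the agent interface and the success/failure continuations below are hypothetical), a Knowledge Level "ask everybody" could tolerate crashed agents roughly as follows.

```python
# Hypothetical sketch of a fault-tolerant "ask" primitive in the spirit of FT-ACL:
# query all registered agents, skip the ones that crash or fail to answer in time,
# and invoke a failure continuation only if nobody answered at all.
import concurrent.futures

def ask_everybody(agents, query, on_success, on_failure, timeout=2.0):
    """Ask `query` to every agent; crashed or slow agents are simply skipped."""
    answers = []
    with concurrent.futures.ThreadPoolExecutor() as pool:
        futures = [pool.submit(agent.ask, query) for agent in agents]
        try:
            for fut in concurrent.futures.as_completed(futures, timeout=timeout):
                try:
                    answers.append(fut.result())
                except Exception:
                    pass  # a crashed agent must not block the interaction
        except concurrent.futures.TimeoutError:
            pass  # remaining agents did not answer in time; proceed with what we have
    # Knowledge Level failure handling: succeed with partial answers, or fail explicitly.
    return on_success(answers) if answers else on_failure(query)
```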

Relevance: 90.00%

Abstract:

The first part of the research project of the Co-Advisorship Ph.D. Thesis aimed to select the best Bifidobacterium longum strains suitable to set the basis of our study. We were looking for strains with the ability to colonize the intestinal mucosa and with good adhesion capacities, so that we could test these strains to investigate their ability to induce apoptosis in “damaged” intestinal cells. Adhesion and apoptosis are the two processes that we wanted to study to better understand the role of an adhesion protein that we had previously identified and that shows high-scoring homology with the serpin-encoding gene recently identified in B. longum by Nestlé researchers. Bifidobacterium longum is a probiotic, known for its beneficial effects on the human gut and also for its immunomodulatory and antitumor activities. Recently, many studies have stressed the intimate relationship between probiotic bacteria and the GIT mucosa and their influence on human cellular homeostasis. We focused on the apoptotic deletion of cancer cells induced by B. longum. This was evaluated in vitro by incubating three B. longum strains with enterocyte-like Caco-2 cells, to detect DNA fragmentation, a hallmark of apoptosis. The three strains tested were characterized for their adhesion properties using adhesion and autoaggregation assays. These features are considered necessary to select a probiotic strain. The three strains, named B12, B18 and B2990, were classified as “strongly adherent”, “adherent” and “non-adherent”, respectively. Then, the bacteria were incubated with Caco-2 cells to investigate apoptotic deletion. Co-cultures of Caco-2 cells with B. longum were positive in the DNA fragmentation test only when adherent strains were used (B12 and B18). These results indicate that the interaction with adherent B. longum can induce apoptotic deletion of Caco-2 cells, suggesting a role in the cellular homeostasis of the gastrointestinal tract and in restoring the ecology of damaged colon tissues. These results were used to continue the research, and the strains tested were used as recipients of recombinant techniques aimed at generating new B. longum strains with an enhanced capacity to induce apoptosis in “damaged” intestinal cells. To achieve this new goal it was decided to clone the serpin-encoding gene of B. longum, so as to understand its role in adhesion and in the induction of apoptosis. Bifidobacterium longum has an immunostimulant activity that in vitro can lead to an apoptotic response of the Caco-2 cell line. It secretes a hypothetical eukaryotic-type serpin protein, which could be involved in this kind of deletion of damaged cells. We had previously characterised a protein that has homologies with the hypothetical serpin of B. longum (DD087853). In order to create Bifidobacterium serpin transformants, a B. longum cosmid library was screened with a PCR protocol using specific primers for the serpin gene. After fragment extraction, the insert named S1 was sub-cloned into pRM2, an Escherichia coli - Bifidobacterium shuttle vector, to construct pRM3. Several protocols for B. longum transformation were tested and the best efficiency was obtained using MRS medium and raffinose. Finally, bacterial cell supernatants were tested in a dot-blot assay to detect the presence of antigens recognized by an anti-antitrypsin polyclonal antibody. The best signal was produced by one strain, which was renamed B. longum BLKS 7.
Our research study aimed to generate transformants able to overexpress the serpin-encoding gene, so as to provide the tools for a further study on bacterial apoptotic induction in the Caco-2 cell line. Having generated new transformants, the next step was to test their abilities when exposed to an intestinal cell model. This part of the project was carried out in the Department of Biochemistry of the Medical Faculty of the University of Maribor, hosted by the foreign supervisor of the Co-Advisorship Doctoral Thesis, Prof. Avrelija Cencic. In this study we examined the probiotic ability of some bacterial strains using intestinal cells from a 6-year-old pig. The use of intestinal mammalian cells is essential to study this symbiosis, and a functional cell model mimics a polarised epithelium in which enterocytes are separated by tight junctions. This list of strains included the Bifidobacterium longum BKS7 transformant strain that we had previously generated, in order to compare its abilities. B. longum B12 wild type, the B. longum BKS7 transformant and eight Lactobacillus strains of different sources were co-cultured with porcine small intestine epithelial cells (PSI C1) and porcine blood monocytes (PoM2) in Transwell filter inserts. The strains, including Lb. gasseri, Lb. fermentum, Lb. reuteri, Lb. plantarum and unidentified Lactobacillus isolates from Kenyan Maasai milk and Tanzanian coffee, were assayed for activation of the cell lines, measuring nitric oxide by the Griess reaction, H2O2 by the tetramethylbenzidine reaction and O2− by cytochrome C reduction. Cytotoxic effects (crystal violet staining) and the induction of metabolic activity (MTT cell proliferation assay) were also tested. The transepithelial electrical resistance (TER) of polarised PSI C1 was measured during 48 hours of co-culture. TER, used to monitor epithelium permeability, decreases during pathogenesis as the tissue becomes permeable to passive ion flow, lowering the epithelial barrier function. Probiotics can prevent or restore increased permeability. Lastly, a dot-blot against Interleukin-6 was performed on the supernatants of treated cells. The metabolic activity of PoM2 and PSI C1 increased slightly after co-culture, without affecting mitochondrial functions. No strain was cytotoxic to PSI C1 and PoM2 and no cell activation was observed, as measured by the release of NO2−, H2O2 and O2− by PoM2 and PSI C1. During co-culture the TER of polarised PSI C1 was two-fold higher compared with the constant TER (~3000 Ω) of untreated cells. The TER increase generated by the bacteria maintains a low permeability of the epithelium. During treatment Interleukin-6 was detected in cell supernatants at several time points, confirming the immunostimulant activity. All results were obtained using Lactobacillus paracasei Shirota and Carnobacterium divergens as controls. In conclusion, we can state that both the list of putative probiotic bacteria and our new transformant strain of B. longum are not harmful when exposed to intestinal cells and could be selected as probiotics, because they can strengthen the epithelial barrier function and stimulate nonspecific immunity of intestinal cells in a pig cell model. Indeed, we found that none of the tested strains with good adhesion abilities is cytotoxic to the intestinal cells and that none of the tested strains induces the cell lines to produce high levels of ROS or NO2−. Moreover, we also assayed the capacity to produce certain cytokines correlated with the immune response.
Interleukin-6 was detected in all our samples, including the B. longum transformant BKS 7 strain; this result indicates that these bacteria can induce a nonspecific immune response in the intestinal cells. In fact, when we assayed the presence of Interferon-gamma in the cell supernatants after bacterial exposure, we found no positive signals, which means that no specific immune response is activated, thus confirming that these bacteria are not recognized as pathogens by the intestinal cells and are not harmful to them. The most important result is the measurement of Trans-Epithelial Electrical Resistance, which showed how the intestinal barrier function is strengthened when cells are exposed to the bacteria, owing to a reduction of the epithelium permeability. We now have a new strain of B. longum that will be used for further studies on the mechanism of apoptotic induction in “damaged” cells and on the process of “restoring ecology”. This strain will be the basis for generating new transformant strains for the serpin-encoding gene, with better performance, which might one day be used in clinical settings, for example in “gene therapy” for cancer treatment and prevention.

Relevance: 90.00%

Abstract:

Hepatitis E is an infectious viral disease with clinical and morphological features of acute hepatitis. The aetiological agent is the Hepatitis E virus (HEV). The disease represents an important Public Health problem in developing countries, where it is frequently epidemic and primarily transmitted by the fecal-oral route. In the last few years, a certain number of sporadic cases have also been described in industrialized countries, Italy included. A swine HEV was first identified in 1997 and is now considered a ubiquitous virus. Human and swine strains from the same geographical region have been shown to have a high level of nucleotide homology, and in experimental infections the possibility of interspecific transmission of swine strains to humans and of human strains to non-human primates has been demonstrated. Furthermore, some seroepidemiological studies have demonstrated that people working in contact with swine have a higher risk of infection than normal blood donors. Recently, cases of HEV hepatitis have been directly associated with the ingestion of uncooked tissues from pigs, wild boar or deer, and today the disease is considered an emerging zoonosis. The aims of this thesis were: to evaluate HEV prevalence in Italian swine herds (both in fattening and in breeding animals); to investigate the possibility of finding HEV in livers used for human consumption; to investigate whether there is any correlation between HEV infection and the presence of macroscopic lesions; to investigate HEV prevalence in a demographically managed wild boar population; and to phylogenetically analyse the viral strains identified. Furthermore, during an internship period at the Veterinary Laboratories Agency (Weybridge, UK), swine samples at different stages of production and slurry lagoons were analysed. Six swine herds located in Northern Italy were sampled at different stages of production. The overall prevalence was 42%, and both breeding and fattening animals were positive for HEV infection. A longitudinal study was conducted in one herd across all stages of production up to slaughtering age. Livers were collected from the animals at the abattoir and 11.8% of them were positive for HEV infection. No correlation was identified between HEV infection and macroscopic lesions in pigs affected by different pathological conditions. Of 86 wild boars tested, 22 (25%) were positive for HEV. Of the swine tested in the UK, 21.5% were positive for HEV infection, as were 2 of the 9 slurry lagoons (22.2%). All the strains identified belonged to genotype 3 and showed high percentages of nucleotide identity with human and swine strains identified in Europe. The high prevalence detected in these studies confirms the widespread diffusion of HEV in swine populations in Italy and in the UK. The phylogenetic analysis of the identified strains, similar to those identified in autochthonous human hepatitis E cases of the same geographical area, confirms the hypothesis that pigs can be a source of zoonotic infection. The finding that a fraction of the livers entering the food chain is positive for HEV is of some concern for Public Health.
The finding of a high HEV prevalence in all examined farms, together with the observation that infection may be sub-clinical and affect animals at slaughtering age, raises concern because of the possible risk of transmission of HEV to humans by either direct contact with infected pigs, indirect contact with the environment and working instruments contaminated with pig feces, or ingestion of contaminated undercooked meat.
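As a purely illustrative aside, the prevalence figures reported above can be equipped with approximate confidence intervals; the sketch below computes a Wilson 95% interval from the wild boar counts (22 positives out of 86), which are the only inputs taken from the abstract.

```python
# Illustrative only: Wilson 95% confidence interval for a prevalence estimate,
# applied to the wild boar counts quoted above (22 positives out of 86 tested).
import math

def wilson_ci(positives, n, z=1.96):
    p = positives / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

p_hat = 22 / 86                     # point estimate of prevalence
low, high = wilson_ci(22, 86)       # roughly 0.18 to 0.36
print(f"prevalence {p_hat:.1%}, 95% CI {low:.1%}-{high:.1%}")
```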

Relevance: 90.00%

Abstract:

Interactive theorem provers (ITPs for short) are tools whose final aim is to certify proofs written by human beings. To reach that objective they have to fill the gap between the high-level language used by humans for communicating and reasoning about mathematics and the lower-level language that a machine is able to “understand” and process. The user perceives this gap in terms of missing features or inefficiencies. The developer tries to accommodate the user's requests without increasing the already high complexity of these applications. We believe that satisfactory solutions can only come from a strong synergy between users and developers. We devoted most of our Ph.D. to designing and developing the Matita interactive theorem prover. The software was born in the Computer Science Department of the University of Bologna as the result of composing together all the technologies developed by the HELM team (to which we belong) for the MoWGLI project. The MoWGLI project aimed at giving accessibility through the web to the libraries of formalised mathematics of various interactive theorem provers, taking Coq as the main test case. The motivations for giving life to a new ITP are: • to study the architecture of these tools, with the aim of understanding the source of their complexity; • to exploit such knowledge to experiment with new solutions that, for backward compatibility reasons, would be hard (if not impossible) to test on a widely used system like Coq. Matita is based on the Curry-Howard isomorphism, adopting the Calculus of Inductive Constructions (CIC) as its logical foundation. Proof objects are thus, to some extent, compatible with the ones produced with the Coq ITP, which is itself able to import and process the ones generated using Matita. Although the systems have a lot in common, they share no code at all, and even most of the algorithmic solutions are different. The thesis is composed of two parts in which we respectively describe our experience as a user and as a developer of interactive provers. In particular, the first part is based on two different formalisation experiences: • our internship in the Mathematical Components team (INRIA), which is formalising the finite group theory required to attack the Feit-Thompson Theorem. To tackle this result, which gives an effective classification of finite groups of odd order, the team adopts the SSReflect Coq extension, developed by Georges Gonthier for the proof of the four colour theorem; • our collaboration with the D.A.M.A. Project, whose goal is the formalisation of abstract measure theory in Matita, leading to a constructive proof of Lebesgue's Dominated Convergence Theorem. The most notable issues we faced, analysed in this part of the thesis, are the following: the difficulties arising when using “black box” automation in large formalisations; the impossibility for a user (especially a newcomer) to master the context of a library of already formalised results; the uncomfortable big-step execution of proof commands historically adopted in ITPs; the difficult encoding of mathematical structures with a notion of inheritance in a type theory without subtyping like CIC.
In the second part of the manuscript many of these issues are analysed through the lens of an ITP developer, describing the solutions we adopted in the implementation of Matita to solve these problems: integrated searching facilities to assist the user in handling large libraries of formalised results; a small-step execution semantics for proof commands; a flexible implementation of coercive subtyping allowing multiple inheritance with shared substructures; and automatic tactics, integrated with the searching facilities, that generate proof commands (and not only proof objects, which are usually kept hidden from the user), one of which is specifically designed to be user driven.
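The last point, encoding mathematical structures with inheritance in a type theory without subtyping, is easier to grasp with a concrete example. The following is a minimal sketch in Lean 4 syntax (not Matita's own language, and not the coercion machinery actually implemented in Matita): the `extends` clause generates a projection that plays the role of a coercion, so statements about the parent structure can be reused on the child.

```lean
-- Minimal Lean 4 sketch (not Matita syntax) of encoding mathematical structures
-- with inheritance: `extends` generates a projection that acts as a coercion,
-- so results stated on the parent structure apply to the child one.
structure SemigroupStr where
  carrier : Type
  mul     : carrier → carrier → carrier
  assoc   : ∀ a b c, mul (mul a b) c = mul a (mul b c)

structure MonoidStr extends SemigroupStr where
  one     : carrier
  one_mul : ∀ a, mul one a = a

-- Any monoid can be used where a semigroup is expected via the generated projection.
example (M : MonoidStr) : SemigroupStr := M.toSemigroupStr
```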

Relevance: 90.00%

Abstract:

The Peer-to-Peer network paradigm is drawing the attention of both end users and researchers for its features. P2P networks shift from the classic client-server approach to a high level of decentralization, where there is no central control and all the nodes should be able not only to request services, but to provide them to other peers as well. While on one hand such a high level of decentralization might lead to interesting properties like scalability and fault tolerance, on the other hand it implies many new problems to deal with. A key feature of many P2P systems is openness, meaning that everybody is potentially able to join a network with no need for subscription or payment systems. The combination of openness and lack of central control makes it feasible for a user to free-ride, that is, to increase one's own benefit by using services without allocating resources to satisfy other peers' requests. One of the main goals when designing a P2P system is therefore to achieve cooperation between users. Given the nature of P2P systems, based on simple local interactions of many peers having partial knowledge of the whole system, an interesting way to achieve desired properties on a system scale might consist in obtaining them as emergent properties of the many interactions occurring at the local node level. Two methods are typically used to face the problem of cooperation in P2P networks: 1) engineering emergent properties when designing the protocol; 2) studying the system as a game and applying Game Theory techniques, especially to find Nash Equilibria in the game and to reach them, making the system stable against possible deviant behaviors. In this work we present an evolutionary framework to enforce cooperative behaviour in P2P networks that is an alternative to both the methods mentioned above. Our approach is based on an evolutionary algorithm inspired by computational sociology and evolutionary game theory, which consists in having each peer periodically try to copy another peer which is performing better. The proposed algorithms, called SLAC and SLACER, draw inspiration from tag systems originating in computational sociology; the main idea behind the algorithms is to have low-performance nodes copy high-performance ones. The algorithm is run locally by every node and leads to an evolution of the network both from the topology and from the nodes' strategy point of view. Initial tests with a simple Prisoner's Dilemma application show how SLAC is able to bring the network to a state of high cooperation independently of the initial network conditions. Interesting results are obtained when studying the effect of cheating nodes on the SLAC algorithm. In fact, in some cases selfish nodes rationally exploiting the system for their own benefit can actually improve system performance from the cooperation formation point of view. The final step is to apply our results to more realistic scenarios. We put our efforts into studying and improving the BitTorrent protocol. BitTorrent was chosen not only for its popularity but also because it has many points in common with the SLAC and SLACER algorithms, ranging from the game theoretical inspiration (tit-for-tat-like mechanism) to the swarm topology.
We found fairness, meant as the ratio between uploaded and downloaded data, to be a weakness of the original BitTorrent protocol, and we drew inspiration from the knowledge of cooperation formation and maintenance mechanisms derived from the development and analysis of SLAC and SLACER to improve fairness and tackle free-riding and cheating in BitTorrent. We produced an extension of BitTorrent called BitFair that has been evaluated through simulation and has shown its ability to enforce fairness and to tackle free-riding and cheating nodes.
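The evolutionary step at the heart of SLAC is only described informally above. The following is a minimal, hypothetical sketch of such a copy-and-rewire step (not the exact published SLAC/SLACER rules; the data layout, mutation rate and strategy set are assumptions): a node compares its utility with a randomly chosen peer and, if it is doing worse, copies that peer's strategy and links, with occasional mutation.

```python
# Minimal sketch of a SLAC-like evolutionary step (illustrative, not the exact
# published algorithm): a low-performing node copies the strategy and links of a
# better-performing node, with small mutation of strategy and neighbourhood.
import random

def slac_step(network, node, mutation_rate=0.01):
    """network: dict node -> {'utility': float, 'strategy': str, 'links': set}"""
    other = random.choice([n for n in network if n != node])
    me, peer = network[node], network[other]
    if peer['utility'] > me['utility']:
        # copy the better peer: adopt its strategy and move into its neighbourhood
        me['strategy'] = peer['strategy']
        me['links'] = set(peer['links']) | {other}
    if random.random() < mutation_rate:
        # mutation: occasionally flip strategy and rewire to a random node
        me['strategy'] = random.choice(['cooperate', 'defect'])
        me['links'] = {random.choice([n for n in network if n != node])}
```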

Relevance: 90.00%

Abstract:

The vast majority of known proteins have not yet been experimentally characterized and little is known about their function. The design and implementation of computational tools can provide insight into the function of proteins based on their sequence, their structure, their evolutionary history and their association with other proteins. Knowledge of the three-dimensional (3D) structure of a protein can lead to a deep understanding of its mode of action and interaction, but currently the structures of <1% of sequences have been experimentally solved. For this reason, it has become urgent to develop new methods that are able to computationally extract relevant information from protein sequence and structure. The starting point of my work has been the study of the properties of contacts between protein residues, since they constrain protein folding and characterize different protein structures. The prediction of residue contacts in proteins is an interesting problem whose solution may be useful in protein fold recognition and de novo design. The prediction of these contacts requires the study of the protein inter-residue distances, related to the specific type of amino acid pair, that are encoded in the so-called contact map. An interesting new way of analyzing those structures emerged when network studies were introduced, with pivotal papers demonstrating that protein contact networks also exhibit small-world behavior. In order to highlight constraints for the prediction of protein contact maps and for applications in the field of protein structure prediction and/or reconstruction from experimentally determined contact maps, I studied to what extent the characteristic path length and clustering coefficient of the protein contact network reveal characteristic features of protein contact maps. Provided that residue contacts are known for a protein sequence, the major features of its 3D structure could be deduced by combining this knowledge with correctly predicted motifs of secondary structure. In the second part of my work I focused on a particular protein structural motif, the coiled-coil, known to mediate a variety of fundamental biological interactions. Coiled-coils are found in a variety of structural forms and in a wide range of proteins including, for example, small units such as leucine zippers that drive the dimerization of many transcription factors, or more complex structures such as the family of viral proteins responsible for virus-host membrane fusion. The coiled-coil structural motif is estimated to account for 5-10% of the protein sequences in the various genomes. Given their biological importance, in my work I introduced a Hidden Markov Model (HMM) that exploits the evolutionary information derived from multiple sequence alignments to predict coiled-coil regions and to discriminate coiled-coil sequences. The results indicate that the new HMM outperforms all the existing programs and can be adopted for coiled-coil prediction and for large-scale genome annotation. Genome annotation is a key issue in modern computational biology, being the starting point towards the understanding of the complex processes involved in biological networks. The rapid growth in the number of protein sequences and structures available poses new fundamental problems that still await an interpretation. Nevertheless, these data are at the basis of the design of new strategies for tackling problems such as the prediction of protein structure and function.
Experimental determination of the functions of all these proteins would be a hugely time-consuming and costly task and, in most instances, has not been carried out. As an example, currently only approximately 20% of the annotated proteins in the Homo sapiens genome have been experimentally characterized. A commonly adopted procedure for annotating protein sequences relies on “inheritance through homology”, based on the notion that similar sequences share similar functions and structures. This procedure consists of assigning sequences to a specific group of functionally related sequences which have been grouped through clustering techniques. The clustering procedure is based on suitable similarity rules, since predicting protein structure and function from sequence largely depends on the value of sequence identity. However, additional levels of complexity are due to multi-domain proteins, to proteins that share common domains but do not necessarily share the same function, and to the finding that different combinations of shared domains can lead to different biological roles. In the last part of this study I developed and validated a system that contributes to sequence annotation by taking advantage of a validated transfer-through-inheritance procedure for molecular functions and structural templates. After a cross-genome comparison with the BLAST program, clusters were built on the basis of two stringent constraints on sequence identity and coverage of the alignment. The adopted measure explicitly addresses the problem of multi-domain protein annotation and allows a fine-grained division of the whole set of proteomes used, which ensures cluster homogeneity in terms of sequence length. A high level of coverage of the structure templates over the length of the protein sequences within clusters ensures that multi-domain proteins, when present, can be templates for sequences of similar length. This annotation procedure includes the possibility of reliably transferring statistically validated functions and structures to sequences, considering the information available in the present databases of molecular functions and structures.
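As an illustrative sketch of the objects discussed in the first part of the abstract (the 8 Å threshold, the sequence-separation filter and the use of networkx are assumptions, not the thesis parameters), a binary contact map can be derived from Cα coordinates and its small-world measures computed as follows.

```python
# Illustrative sketch: build a binary contact map from C-alpha coordinates
# (threshold chosen arbitrarily here) and compute the two small-world measures
# mentioned above: characteristic path length and clustering coefficient.
import numpy as np
import networkx as nx

def contact_map(coords, threshold=8.0, min_separation=3):
    """coords: (N, 3) array of C-alpha positions; returns an N x N boolean map."""
    dists = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    cmap = dists < threshold
    # ignore trivial contacts between residues that are close in the sequence
    for offset in range(min_separation):
        np.fill_diagonal(cmap[offset:, :], False)
        np.fill_diagonal(cmap[:, offset:], False)
    return cmap

def small_world_measures(cmap):
    # assumes the contact graph is connected, as it usually is for folded proteins
    g = nx.from_numpy_array(cmap.astype(int))
    length = nx.average_shortest_path_length(g)   # characteristic path length
    clustering = nx.average_clustering(g)         # clustering coefficient
    return length, clustering
```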

Relevance: 90.00%

Abstract:

The development of the digital electronics market is founded on the continuous reduction of transistor size, to reduce area, power and cost and to increase the computational performance of integrated circuits. This trend, known as technology scaling, is approaching the nanometer scale. The uncertainty of the lithographic process in the manufacturing stage is increasing as transistor size scales down, resulting in larger parameter variation in future technology generations. Furthermore, the exponential relationship between the leakage current and the threshold voltage is limiting threshold and supply voltage scaling, increasing the power density and creating local thermal issues, such as hot spots, thermal runaway and thermal cycles. In addition, the introduction of new materials and the smaller device dimensions are reducing transistor robustness, which, combined with high temperatures and frequent thermal cycles, speeds up wear-out processes. Those effects are no longer addressable only at the process level. Consequently, deep sub-micron devices will require solutions involving several design levels, such as system and logic, and new approaches called Design For Manufacturability (DFM) and Design For Reliability. The purpose of these approaches is to bring awareness of device reliability and manufacturability into the early design stages, in order to introduce logic and system solutions able to cope with yield and reliability loss. The ITRS roadmap suggests the following research steps to integrate design for manufacturability and reliability in the standard CAD automated design flow: i) the implementation of new analysis algorithms able to predict the system thermal behavior and its impact on power and speed performance; ii) high-level wear-out models able to predict the mean time to failure (MTTF) of the system; iii) statistical performance analysis able to predict the impact of process variation, both random and systematic. The new analysis tools have to be developed alongside new logic and system strategies to cope with future challenges, for instance: i) thermal management strategies that increase the reliability and lifetime of the devices by acting on some tunable parameter, such as supply voltage or body bias; ii) error detection logic able to interact with compensation techniques such as Adaptive Supply Voltage (ASV), Adaptive Body Bias (ABB) and error recovery, in order to increase yield and reliability; iii) architectures that are fundamentally resistant to variability, including locally asynchronous designs, redundancy, and error correcting signal encodings (ECC). The literature already features works addressing the prediction of the MTTF, papers focusing on thermal management in general-purpose chips, and publications on statistical performance analysis. In my Ph.D. research activity, I investigated the need for thermal management in future embedded low-power Network On Chip (NoC) devices. I developed a thermal analysis library that has been integrated into a cycle-accurate NoC simulator and into an FPGA-based NoC simulator. The results have shown that an accurate layout distribution can avoid the onset of hot spots in a NoC chip. Furthermore, the application of thermal management can reduce the temperature and the number of thermal cycles, increasing the system reliability. Therefore the thesis advocates the need to integrate thermal analysis into the first design stages of embedded NoC design.
Later on, I focused my research on the development of a statistical process variation analysis tool that is able to address both random and systematic variations. The tool was used to analyze the impact of self-timed asynchronous logic stages in an embedded microprocessor. As a result, we confirmed the capability of self-timed logic to increase manufacturability and reliability. Furthermore, we used the tool to investigate the suitability of low-swing techniques for NoC system communication under process variations. In this case we discovered the superior robustness to systematic process variation of low-swing links, which show a good response to compensation techniques such as ASV and ABB. Hence low-swing is a good alternative to standard CMOS communication for power, speed, reliability and manufacturability. In summary, my work proves the advantage of integrating a statistical process variation analysis tool in the first stages of the design flow.
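As a minimal illustrative sketch of what a statistical process variation analysis involves (the delay model and every numeric value below are arbitrary assumptions, not those of the thesis tool), a Monte Carlo run can separate a systematic, die-level threshold-voltage shift from random, gate-level variation and propagate both through a toy critical-path delay model.

```python
# Illustrative Monte Carlo sketch of statistical process variation analysis:
# a systematic (die-to-die) shift plus a random (within-die) component on Vth,
# propagated through a toy alpha-power-law gate delay model. All numbers are
# arbitrary assumptions used only to show the structure of such an analysis.
import numpy as np

rng = np.random.default_rng(0)

VDD, VTH_NOM, ALPHA = 1.0, 0.3, 1.3            # toy technology parameters
N_DIES, GATES_PER_PATH = 1000, 20

def gate_delay(vth, vdd=VDD, k=1.0):
    return k * vdd / (vdd - vth) ** ALPHA       # alpha-power-law style delay

# systematic variation: one Vth shift per die; random variation: per gate
systematic = rng.normal(0.0, 0.02, size=(N_DIES, 1))
random_var = rng.normal(0.0, 0.01, size=(N_DIES, GATES_PER_PATH))
vth = VTH_NOM + systematic + random_var

path_delay = gate_delay(vth).sum(axis=1)        # critical-path delay per die
print(f"mean delay {path_delay.mean():.2f}, 3-sigma {3 * path_delay.std():.2f}")
```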

Relevance: 90.00%

Abstract:

The research activity carried out during the PhD course was focused on the development of mathematical models of some cognitive processes and their validation by means of data available in the literature, with a double aim: i) to achieve a better interpretation and explanation of the great amount of data obtained on these processes with different methodologies (electrophysiological recordings in animals; neuropsychological, psychophysical and neuroimaging studies in humans); ii) to exploit model predictions and results to guide future research and experiments. In particular, the research activity has been focused on two different projects: 1) the first one concerns the development of networks of neural oscillators, in order to investigate the mechanisms of synchronization of the neural oscillatory activity during cognitive processes, such as object recognition, memory, language, and attention; 2) the second one concerns the mathematical modelling of multisensory integration processes (e.g. visual-acoustic), which occur in several cortical and subcortical regions (in particular in a subcortical structure named the Superior Colliculus (SC)), and which are fundamental for orienting motor and attentive responses to external-world stimuli. This activity has been carried out in collaboration with the Center for Studies and Researches in Cognitive Neuroscience of the University of Bologna (in Cesena) and the Department of Neurobiology and Anatomy of the Wake Forest University School of Medicine (NC, USA). PART 1. The representation of objects in a number of cognitive functions, like perception and recognition, involves distributed processes in different cortical areas. One of the main neurophysiological questions concerns how the correlation between these disparate areas is realized, in order to group together the characteristics of the same object (binding problem) and to keep segregated the properties belonging to different objects simultaneously present (segmentation problem). Different theories have been proposed to address these questions (Barlow, 1972). One of the most influential theories is the so-called “assembly coding”, postulated by Singer (2003), according to which 1) an object is well described by a few fundamental properties, processed in different and distributed cortical areas; 2) the recognition of the object would be realized by means of the simultaneous activation of the cortical areas representing its different features; 3) groups of properties belonging to different objects would be kept separated in the time domain. In Chapter 1.1 and in Chapter 1.2 we present two neural network models for object recognition, based on the “assembly coding” hypothesis. These models are networks of Wilson-Cowan oscillators which exploit: i) two high-level “Gestalt Rules” (the similarity and previous-knowledge rules), to realize the functional link between elements of different cortical areas representing properties of the same object (binding problem); ii) the synchronization of the neural oscillatory activity in the γ-band (30-100 Hz), to segregate in time the representations of different objects simultaneously present (segmentation problem). These models are able to recognize and reconstruct multiple simultaneous external objects, even in difficult cases (some wrong or missing features, shared features, superimposed noise). In Chapter 1.3 the previous models are extended to realize a semantic memory, in which sensory-motor representations of objects are linked with words.
To this aim, the previously developed network, devoted to the representation of objects as a collection of sensory-motor features, is reciprocally linked with a second network devoted to the representation of words (lexical network). Synapses linking the two networks are trained via a time-dependent Hebbian rule, during a training period in which individual objects are presented together with the corresponding words. Simulation results demonstrate that, during the retrieval phase, the network can deal with the simultaneous presence of objects (from sensory-motor inputs) and words (from linguistic inputs), can correctly associate objects with words and can segment objects even in the presence of incomplete information. Moreover, the network can realize some semantic links among words representing objects with some shared features. These results support the idea that semantic memory can be described as an integrated process, whose content is retrieved by the co-activation of different multimodal regions. In perspective, extended versions of this model may be used to test conceptual theories and to provide a quantitative assessment of existing data (for instance concerning patients with neural deficits). PART 2. The ability of the brain to integrate information from different sensory channels is fundamental to the perception of the external world (Stein et al., 1993). It is well documented that a number of extraprimary areas have neurons capable of such a task; one of the best known of these is the superior colliculus (SC). This midbrain structure receives auditory, visual and somatosensory inputs from different subcortical and cortical areas, and is involved in the control of orientation to external events (Wallace et al., 1993). SC neurons respond to each of these sensory inputs separately, but are also capable of integrating them (Stein et al., 1993), so that the response to combined multisensory stimuli is greater than the response to the individual component stimuli (enhancement). This enhancement is proportionately greater if the modality-specific paired stimuli are weaker (the principle of inverse effectiveness). Several studies have shown that the capability of SC neurons to engage in multisensory integration requires inputs from cortex, primarily the anterior ectosylvian sulcus (AES), but also the rostral lateral suprasylvian sulcus (rLS). If these cortical inputs are deactivated, the response of SC neurons to cross-modal stimulation is no different from that evoked by the most effective of the individual component stimuli (Jiang et al., 2001). This phenomenon can be better understood through mathematical models. The use of mathematical models and neural networks can place the mass of data that has been accumulated about this phenomenon and its underlying circuitry into a coherent theoretical structure. In Chapter 2.1 a simple neural network model of this structure is presented; this model is able to reproduce a large number of SC behaviours, like multisensory enhancement, multisensory and unisensory depression, and inverse effectiveness. In Chapter 2.2 this model is improved by incorporating more neurophysiological knowledge about the neural circuitry underlying SC multisensory integration, in order to suggest possible physiological mechanisms through which it is effected. This endeavour was realized in collaboration with Professor B.E. Stein and Doctor B.
Rowland during the six-month period spent at the Department of Neurobiology and Anatomy of the Wake Forest University School of Medicine (NC, USA), within the Marco Polo Project. The model includes four distinct unisensory areas that are devoted to a topological representation of external stimuli. Two of them represent subregions of the AES (i.e., FAES, an auditory area, and AEV, a visual area) and send descending inputs to the ipsilateral SC; the other two represent subcortical areas (one auditory and one visual) projecting ascending inputs to the same SC. Different competitive mechanisms, realized by means of populations of interneurons, are used in the model to reproduce the different behaviour of SC neurons in conditions of cortical activation and deactivation. The model, with a single set of parameters, is able to mimic the behaviour of SC multisensory neurons in response to very different stimulus conditions (multisensory enhancement, inverse effectiveness, within- and cross-modal suppression of spatially disparate stimuli), with the cortex functional and with the cortex deactivated, and with a particular type of membrane receptor (NMDA receptors) active or inhibited. All these results agree with the data reported in Jiang et al. (2001) and in Binns and Salt (1996). The model suggests that non-linearities in neural responses and in synaptic (excitatory and inhibitory) connections can explain the fundamental aspects of multisensory integration, and provides a biologically plausible hypothesis about the underlying circuitry.
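The Wilson-Cowan oscillators mentioned in Part 1 are only named above. As a generic illustrative sketch (the coupling scheme, Gestalt-rule synapses and parameter values of the thesis models are not reproduced here; the numbers below are arbitrary), a single excitatory/inhibitory Wilson-Cowan unit can be integrated with forward Euler as follows.

```python
# Generic Wilson-Cowan excitatory/inhibitory unit (illustrative parameters only;
# the thesis networks couple many such units with Gestalt-rule-based synapses
# and gamma-band synchronization, which are not reproduced here).
import numpy as np

def sigmoid(x, a=1.2, theta=2.8):
    return 1.0 / (1.0 + np.exp(-a * (x - theta)))

def simulate(P=1.5, Q=0.0, c1=16, c2=12, c3=15, c4=3,
             tau_e=10.0, tau_i=10.0, dt=0.1, steps=5000):
    E, I = 0.1, 0.1
    trace = np.empty((steps, 2))
    for t in range(steps):
        dE = (-E + sigmoid(c1 * E - c2 * I + P)) / tau_e
        dI = (-I + sigmoid(c3 * E - c4 * I + Q)) / tau_i
        E, I = E + dt * dE, I + dt * dI          # forward Euler integration
        trace[t] = E, I
    return trace                                  # E/I time course

activity = simulate()
```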

Relevance: 90.00%

Abstract:

This thesis focuses on the ceramic process for the production of optical-grade transparent materials to be used as laser hosts. In order to be transparent, a ceramic material must exhibit a very low concentration of defects. Defects are mainly represented by secondary or grain-boundary phases and by residual pores. Strict control of the stoichiometry is mandatory to avoid the formation of secondary phases, whereas residual porosity needs to be below 150 ppm. In order to fulfill these requirements, specific experimental conditions must be combined. In addition, powders need to be nanometric, or at least sub-micrometric, and extremely pure. On the other hand, nanometric powders aggregate easily, and this leads to poor, inhomogeneous packing during shaping by pressing and to the formation of residual pores during sintering. Very fine powders are also difficult to handle and tend to absorb water on the surface. Finally, the powder manipulation (weighing operations, solvent removal, spray drying, shaping, etc.) easily introduces impurities. All these features must be fully controlled in order to avoid the formation of defects that act as scattering sources, thus decreasing the transparency of the material. The important role played by processing on the transparency of ceramic materials is often underestimated. In the literature a high level of transparency has been reported by many authors, but the experimental process, in particular the powder treatment and shaping, is seldom extensively described, and important information necessary to reproduce the reported results is often missing. The main goal of the present study is therefore to give additional information on how the experimental features affect the microstructural evolution of YAG-based ceramics and thus the final properties, in particular transparency. Commercial powders are used to prepare YAG materials doped with Nd or Yb by reactive sintering under high vacuum. These dopants have been selected as the most appropriate for high-energy and high-peak-power lasers. As far as the powder treatment is concerned, the thesis focuses on the influence of the solvent removal technique (rotavapor versus spray drying of suspensions in ethanol), the ball milling duration and speed, the suspension concentration, the solvent ratio, and the type and amount of dispersant. The influence of the powder type and process on the powder packing, as well as the pressure conditions during shaping by pressing, are also described. Finally, calcination, sintering under high vacuum and in a clean atmosphere, and post-sintering cycles are studied and related to the final microstructure, analyzed by SEM-EDS and HR-TEM, and to the optical and laser properties.