587 results for Assemblies
Abstract:
Full text: The idea of producing proteins from recombinant DNA hatched almost half a century ago. In his PhD thesis, Peter Lobban foresaw the prospect of inserting foreign DNA (from any source, including mammalian cells) into the genome of a λ phage in order to detect and recover protein products from Escherichia coli [1,2]. Only a few years later, in 1977, Herbert Boyer and his colleagues succeeded in the first ever expression of a peptide-coding gene in E. coli: they produced recombinant somatostatin [3], followed shortly after by human insulin. The field has advanced enormously since those early days, and today recombinant proteins have become indispensable in advancing research and development in all fields of the life sciences. Structural biology, in particular, has benefitted tremendously from recombinant protein biotechnology, and an overwhelming proportion of the entries in the Protein Data Bank (PDB) are based on heterologously expressed proteins. Nonetheless, synthesizing, purifying and stabilizing recombinant proteins can still be thoroughly challenging. For example, the soluble proteome is organized in large part into multicomponent complexes (in humans often comprising ten or more subunits), posing critical challenges for recombinant production. A third of all proteins in cells are located in the membrane and pose special challenges that require a more bespoke approach. Recent advances may now mean that even these most recalcitrant of proteins could become tenable structural biology targets on a more routine basis. In this special issue, we examine progress in key areas that suggests this is indeed the case. Our first contribution examines the importance of understanding quality control in the host cell during recombinant protein production, and pays particular attention to the synthesis of recombinant membrane proteins. A major challenge faced by any host cell factory is the balance it must strike between its own requirements for growth and the fact that its cellular machinery has essentially been hijacked by an expression construct. In this context, Bill and von der Haar examine emerging insights into the role of the interdependent pathways of translation and protein folding in defining high-yielding recombinant membrane protein production experiments for the common prokaryotic and eukaryotic expression hosts. Rather than acting as isolated entities, many membrane proteins form complexes to carry out their functions. To understand their biological mechanisms, it is essential to study the molecular structure of the intact membrane protein assemblies. Recombinant production of membrane protein complexes is still a formidable, at times insurmountable, challenge. In these cases, extraction from natural sources is the only option for preparing samples for structural and functional studies. Zorman and co-workers, in our second contribution, provide an overview of recent advances in the production of multi-subunit membrane protein complexes and highlight recent achievements in membrane protein structural research brought about by state-of-the-art near-atomic resolution cryo-electron microscopy techniques. E. coli has been the dominant host cell for recombinant protein production. Nonetheless, eukaryotic expression systems, including yeasts, insect cells and mammalian cells, are increasingly gaining prominence in the field.
The yeast species Pichia pastoris is a well-established recombinant expression system for a number of applications, including the production of a range of different membrane proteins. Byrne reviews high-resolution structures that have been determined using this methylotroph as an expression host. Although it is not yet clear why P. pastoris is suited to producing such a wide range of membrane proteins, its ease of use and the availability of diverse tools that can be readily implemented in standard bioscience laboratories mean that it is likely to become an increasingly popular option in structural biology pipelines. The contribution by Columbus concludes the membrane protein section of this volume. In her overview of post-expression strategies, Columbus surveys the four most common biochemical approaches for the structural investigation of membrane proteins. Limited proteolysis has successfully aided structure determination of membrane proteins in many cases. Deglycosylation of membrane proteins following production and purification has also facilitated membrane protein structure analysis. Moreover, chemical modifications, such as lysine methylation and cysteine alkylation, have proven their worth in facilitating the crystallization of membrane proteins, as well as NMR investigations of membrane protein conformational sampling. Together these approaches have greatly facilitated the structure determination of more than 40 membrane proteins to date. It may be an advantage to produce a target protein in mammalian cells, especially if authentic post-translational modifications such as glycosylation are required for proper activity. Chinese hamster ovary (CHO) cells and human embryonic kidney (HEK) 293 cell lines have emerged as excellent hosts for heterologous production. The generation of stable cell lines is often an aspiration when synthesizing proteins in mammalian cells, in particular if high volumetric yields are to be achieved. In his report, Buessow surveys recent structures of proteins produced using stable mammalian cell lines and summarizes both well-established and novel approaches to facilitate stable cell-line generation for structural biology applications. The ambition of many biologists is to observe a protein's structure in the native environment of the cell itself. Until recently, this seemed to be more of a dream than a reality. Advances in nuclear magnetic resonance (NMR) spectroscopy techniques, however, have now made possible the observation of mechanistic events at the molecular level of protein structure. Smith and colleagues, in an exciting contribution, review emerging ‘in-cell NMR’ techniques that demonstrate the potential to monitor biological activities by NMR in real time in native physiological environments. A current drawback of NMR as a structure determination tool derives from the size limitations of the molecule under investigation, and the structures of large proteins and their complexes are therefore typically intractable by NMR. A solution to this challenge is the use of selective isotope labeling of the target protein, which results in a marked reduction in the complexity of NMR spectra and allows dynamic processes to be investigated even in very large proteins, and even in ribosomes. Kerfah and co-workers introduce methyl-specific isotopic labeling as a molecular tool-box, and review its applications to the solution NMR analysis of large proteins.
Tyagi and Lemke next examine single-molecule FRET and crosslinking following the co-translational incorporation of non-canonical amino acids (ncAAs); the goal here is to move beyond static snapshots of proteins and their complexes and to observe them as dynamic entities. The encoding of ncAAs through codon-suppression technology allows biomolecules to be investigated with diverse structural biology methods. In their article, Tyagi and Lemke discuss these approaches and speculate on the design of improved host organisms for ‘integrative structural biology research’. Our volume concludes with two contributions that resolve particular bottlenecks in the protein structure determination pipeline. The contribution by Crepin and co-workers introduces the concept of polyproteins in contemporary structural biology. Polyproteins are widespread in nature. They represent long polypeptide chains in which individual smaller proteins with different biological functions are covalently linked together. Highly specific proteases then tailor the polyprotein into its constituent proteins. Many viruses use polyproteins as a means of organizing their proteome. The concept of polyproteins has now been exploited successfully to produce hitherto inaccessible recombinant protein complexes. For instance, by means of a self-processing synthetic polyprotein, the influenza polymerase, a high-value drug target that had remained elusive for decades, has been produced, and its high-resolution structure determined. In the contribution by Desmyter and co-workers, a further, often imposing, bottleneck in high-resolution protein structure determination is addressed: the requirement to form stable three-dimensional crystal lattices that diffract incident X-ray radiation to high resolution. Nanobodies have proven to be uniquely useful as crystallization chaperones, coaxing challenging targets into suitable crystal lattices. Desmyter and co-workers review the generation of nanobodies by immunization, and highlight the application of this powerful technology to the crystallography of important protein specimens, including G protein-coupled receptors (GPCRs). Recombinant protein production has come a long way since Peter Lobban's hypothesis in the late 1960s, with recombinant proteins now a dominant force in structural biology. The contributions in this volume showcase an impressive array of inventive approaches that are being developed and implemented, ever increasing the scope of recombinant technology to facilitate the determination of elusive protein structures. Powerful new methods from synthetic biology are further accelerating progress. Structure determination is now reaching into the living cell, with the ultimate goal of observing functional molecular architectures in action in their native physiological environment. We anticipate that even the most challenging protein assemblies will be tackled by recombinant technology in the near future.
Abstract:
Thermal effects in uncontrolled factory environments are often the largest source of uncertainty in large volume dimensional metrology. As the standard temperature for metrology of 20°C cannot be achieved practically or economically in many manufacturing facilities, the characterisation and modelling of temperature offers a solution for improving the uncertainty of dimensional measurement and quantifying thermal variability in large assemblies. Technologies that currently exist for temperature measurement in the range of 0-50°C are presented, alongside a discussion of their usefulness for monitoring temperatures in a manufacturing context. Particular aspects of production where each technology could play a role are highlighted, as well as practical considerations for deployment. Contact sensors such as platinum resistance thermometers come closest to the desired accuracy, calculated to be ∼0.02°C for the most challenging measurement conditions. Non-contact solutions would be most practical in the light-controlled factory (LCF), and semi-invasive solutions appear least useful, but all of the technologies can play some role during the initial development of thermal variability models.
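As a concrete illustration of the contact-sensor route, platinum resistance thermometers are read by converting a resistance measurement to temperature via the Callendar-Van Dusen equation. A minimal sketch, assuming a standard Pt100 element with IEC 60751 coefficients (the abstract does not specify the sensor grade):

```python
import math

# IEC 60751 Callendar-Van Dusen coefficients for a standard Pt100 element
# (an assumption here; the abstract does not name the sensor grade).
R0 = 100.0      # resistance at 0 °C, ohms
A = 3.9083e-3   # 1/°C
B = -5.775e-7   # 1/°C^2

def pt100_temperature(resistance_ohms: float) -> float:
    """Temperature in °C from a Pt100 resistance, valid for T >= 0 °C,
    solving R(T) = R0 * (1 + A*T + B*T^2) for T via the quadratic formula."""
    ratio = resistance_ohms / R0
    return (-A + math.sqrt(A * A - 4 * B * (1 - ratio))) / (2 * B)

print(pt100_temperature(107.79))  # ≈ 20 °C, the standard metrology temperature
```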
Abstract:
Design verification in the digital domain, using model-based principles, is a key research objective to address the industrial requirement for reduced physical testing and prototyping. For complex assemblies, the verification of design and the associated production methods is currently fragmented, prolonged and sub-optimal, as it uses digital and physical verification stages that are deployed in a sequential manner using multiple systems. This paper describes a novel, hybrid design verification methodology that integrates model-based variability analysis with measurement data of assemblies, in order to reduce simulation uncertainty and allow early design verification from the perspective of satisfying key assembly criteria.
Abstract:
Poly(styrene-co-maleic anhydride) (PSMA) based copolymers are known to undergo conformational transitions in response to environmental stimuli. This smart behaviour makes it possible to mimic the behaviour of native apoproteins. The primary aim of this study was to develop a better understanding of the structure-property relationships of the various PSMA-based copolymers investigated. The work undertaken in this thesis has revealed that the responsive behaviour of PSMA-based copolymers can be tailored by varying the molecular weight and the hydrophobic (styrene) and hydrophilic (maleic acid) balance, and more so in the presence of additional hydrophobic mono-partial ester moieties. Novel hydrophilic and hydrophobic synthetic surfactant protein analogues have been successfully prepared. These novel lipid-solubilising agents possess a broad range of estimated HLB (hydrophilic-lipophilic balance) values. NMR spectroscopy was used to confirm the structures of the PSMA-based copolymers and proved useful in furthering understanding of their structure-property relationships. The association of PSMA with the polar phospholipid 1,2-dilauroyl-sn-glycero-3-phosphocholine (DLPC) produces polymer-lipid complexes analogous to the lipoprotein assemblies present in blood plasma. NMR analysis reveals that the PSMA-based copolymers are not perfectly alternating: regio-irregular structures and atactic, random monomer sequence distributions have been identified for all materials studied. Novel lipid-solubilising agents (polyanionic surfactants) have been successfully synthesised from a broad range of PSMA-based copolymers with the desired estimated HLB values; these interact uniquely with polar phospholipids (DLPC/DPPC). Very low static and dynamic surface tensions have been observed via the du Noüy ring method and Langmuir techniques, and these correlate well with the estimated HLB values. Synthetic protein-lipid analogues that mimic the unique surface properties of native biological lubricants have been successfully synthesised without the use of solvents. The novel PSMA-DLPC complexes have been successfully combined with hyaluronan (hyaluronic acid, HA). Today, the use of HA is economically feasible because it is readily available from bacterial fermentation processes in a thermally stable form (HyaCare®). The work undertaken in this thesis highlights the use of HA in biolubrication applications and how this can be optimised, and thus justified, by carefully selecting the biological source, concentration, molecular weight and purity and, most importantly, by combining it with compatible boundary lubricating agents (polar phospholipids). Experimental evidence supports the view that the combined HA and PSMA-DLPC complexes provide a balance of rheological, biotribological and surface properties that is composition dependent, and show competitive advantage as novel synthetic biological lubricants (biosurfactants).
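For reference, HLB values of non-ionic surfactants are commonly estimated with Griffin's formula; whether the thesis used this method or a group-contribution variant is an assumption here:

```latex
% Griffin's estimate (assumed method): M_h is the molar mass of the
% hydrophilic portion, M the total molar mass; values fall on a 0-20 scale.
\mathrm{HLB} = 20 \times \frac{M_h}{M}
```

On this scale, higher values indicate more hydrophilic (water-dispersible) surfactants and lower values more lipophilic ones.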
Abstract:
Discrepancies of materials, tools and factory environments, as well as human intervention, make variation an integral part of the manufacturing process of any component. In particular, the assembly of large volume aerospace parts is an area where significant levels of form and dimensional variation are encountered. When the sources and levels of variation are known, corrective actions can usually be taken to reduce the defects. For the unknown dimensional and form variations, a tolerancing strategy is typically put in place in order to minimize the effects of production inconsistencies related to geometric dimensions. This generates a challenging problem for the automation of the corresponding manufacturing and assembly processes. Metrology is becoming a major contributor to the ability to predict, in real time, the automated assembly problems related to the dimensional variation of parts and assemblies. This is done by continuously measuring dimensions and coordinate points, focusing on the product's key characteristics. In this paper, a number of metrology-focused activities for large-volume aerospace products, including their implementation and application in the automation of manufacturing and assembly processes, are reviewed, using a case study within the assembly of large-volume aircraft wing structures.
Abstract:
One of the most pressing demands on electrophysiology applied to the diagnosis of epilepsy is the non-invasive localization of the neuronal generators responsible for brain electrical and magnetic fields (the so-called inverse problem). These neuronal generators produce primary currents in the brain, which together with passive currents give rise to the EEG signal. Unfortunately, the signal we measure on the scalp surface does not directly indicate the location of the active neuronal assemblies. This reflects the ambiguity of the underlying static electromagnetic inverse problem, partly due to the relatively limited number of independent measurements available: a given electric potential distribution recorded at the scalp can be explained by the activity of an infinite number of different configurations of intracranial sources. In contrast, the forward problem, which consists of computing the potential field at the scalp from known source locations and strengths, given the geometry and conductivity properties of the brain and its layers (CSF/meninges, skin and skull), i.e. the head model, has a unique solution. Head models vary from the computationally simpler spherical models (three or four concentric spheres) to realistic models based on the segmentation of anatomical images obtained using magnetic resonance imaging (MRI). Realistic models, though computationally intensive and difficult to implement, can separate different tissues of the head and account for the convoluted geometry of the brain and the significant inter-individual variability. In real-life applications, if the assumptions about the statistical, anatomical or functional properties of the signal and the volume in which it is generated are meaningful, a true three-dimensional tomographic representation of the sources of brain electrical activity is possible in spite of the ‘ill-posed’ nature of the inverse problem (Michel et al., 2004). The techniques used to achieve this are now referred to as electrical source imaging (ESI) or magnetic source imaging (MSI). The first issue influencing reconstruction accuracy is spatial sampling, i.e. the number of EEG electrodes. It has been shown that this relationship is not linear, reaching a plateau at about 128 electrodes, provided the spatial distribution is uniform. The second factor relates to the different properties of the source localization strategies used with respect to the hypothesized source configuration.
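The contrast between the unique forward solution and the ill-posed inverse problem can be made concrete in a few lines. A minimal sketch with a hypothetical lead-field matrix and a minimum-norm inverse; the dimensions and regularization value are illustrative assumptions, not from the text:

```python
import numpy as np

rng = np.random.default_rng(0)

n_electrodes, n_sources = 128, 5000  # illustrative sizes only
L = rng.standard_normal((n_electrodes, n_sources))  # lead-field (head model)
j_true = np.zeros(n_sources)
j_true[rng.choice(n_sources, 3)] = 1.0              # a few active generators

# Forward problem: unique scalp potentials from known sources.
v = L @ j_true

# Inverse problem: underdetermined (5000 unknowns, 128 measurements), so a
# prior is required; minimum-norm estimation picks the smallest-norm solution.
lam = 1e-2                                          # regularization (assumed)
j_est = L.T @ np.linalg.solve(L @ L.T + lam * np.eye(n_electrodes), v)
```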
Abstract:
In this review, the impacts of climate change on Lepidoptera species and communities are summarized, covering changes already recorded for individual species and assemblies as well as possible future effects. These include changes in abundance, distribution ranges (altitude above sea level, geographical distribution) and phenology (earlier or later flight periods, number of generations per year). The paper also contains a short description of the observed impacts of individual factors and conditions (temperature, atmospheric CO2 concentration, drought, predators and parasitoids, UV-B radiation) affecting the life of moths and butterflies, together with monitoring results recording changes in the Lepidoptera communities of selected study areas. The review closes with some theoretical considerations concerning the characteristics of “winner” species, and the features and conditions needed for a successful invasion and the conquest of new territories.
Abstract:
The present study evaluated one basic institution of Bolivian democracy: its electoral system. The study evaluates the impact of electoral systems on the interaction between presidents and assemblies. It sought to determine whether it is possible to have electoral systems that favor multipartism but can also moderate the likelihood of executive-legislative confrontation by producing the necessary conditions for coalition building. This dissertation employed a case-study methodology. Using the case of Bolivia, the project studied the variations in executive-legislative relations and political outcomes from 1985 to the present through a model of executive-legislative relations that provides a typology of presidents and assemblies based on the strategies available to them to bargain with each other for support. A complementary model that evaluated the state of their inter-institutional interaction was also employed. Results indicated that executive-legislative relations are profoundly influenced by the choice of electoral system. Similarly, the project showed that although the Bolivian mixed system for legislative elections and the executive formula favor multipartism, these electoral systems do not necessarily engender executive-legislative confrontation in Bolivia. This was mainly due to the congressional election of the president and the formulas used to translate the popular vote into legislative seats. However, the study found that the electoral system has also allowed anti-systemic forces to emerge and gain political space both within and outside political institutions. The study found that the government coalitions in Bolivia promoted by the system of congressional election of the president and by the D'Hondt system of allocating legislative seats have helped ameliorate one of the typical problems of presidential systems in Latin America: the presence of a minority government blocked in its capacity to govern. This study was limited to evaluating the impact of the electoral system, as the independent variable, on executive-legislative interaction. The project nevertheless revealed a need for more theoretical and empirical work on executive-legislative bargaining models in order to understand how institutional reforms can affect the incentives of presidents and legislators to form coherent coalitions.
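For readers unfamiliar with it, the D'Hondt method referred to above divides each party's vote total successively by 1, 2, 3, ... and awards seats to the largest quotients. A minimal sketch with hypothetical vote counts (not actual Bolivian results):

```python
from heapq import nlargest

def dhondt(votes: dict[str, int], seats: int) -> dict[str, int]:
    """Allocate seats by the D'Hondt highest-averages method."""
    # Every quotient a party could ever use: votes / 1, votes / 2, ...
    quotients = [(v / d, party) for party, v in votes.items()
                 for d in range(1, seats + 1)]
    allocation = {party: 0 for party in votes}
    for _, party in nlargest(seats, quotients):
        allocation[party] += 1
    return allocation

# Hypothetical example only:
print(dhondt({"A": 340_000, "B": 280_000, "C": 160_000, "D": 60_000}, 7))
# {'A': 3, 'B': 3, 'C': 1, 'D': 0}
```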
Abstract:
Surface freshwater samples from Everglades National Park, Florida, were used to investigate the size distributions of natural dissolved organic matter (DOM) and the associated fluorescence characteristics along the molecular weight continuum. Samples were fractionated using size exclusion chromatography (SEC) and characterized by spectroscopic means, in particular excitation-emission matrix fluorescence modeled with parallel factor analysis (EEM-PARAFAC). Most of the eight components obtained from PARAFAC modeling were broadly distributed across the DOM molecular weight range, and the optical properties of the eight size fractions were quite consistent across all samples studied. Humic-like components presented a similar distribution in all the samples, with enrichment in the middle molecular weight range. Some variability in the relative distribution of the different humic-like components was observed among the different size fractions and among samples. The protein-like fluorescence, although also generally present in all fractions, was more variable but generally enriched in the highest and lowest molecular weight fractions. These observations are in agreement with the hypothesis of a supramolecular structure for DOM, and suggest that DOM fluorescence characteristics may be controlled by molecular assemblies with similar optical properties distributed along the molecular weight continuum. This study highlights the importance of studying the molecular structure of DOM from a molecular size distribution perspective, which may have important implications for understanding the environmental dynamics of such materials.
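The EEM-PARAFAC step decomposes a three-way array (samples × excitation × emission) into trilinear components. A minimal sketch of that decomposition using the tensorly library on synthetic data; this library choice is an assumption for illustration, as DOM studies typically fit PARAFAC with dedicated MATLAB toolboxes:

```python
import numpy as np
import tensorly as tl
from tensorly.decomposition import non_negative_parafac

# Hypothetical stack of excitation-emission matrices (EEMs):
# samples x excitation wavelengths x emission wavelengths. Real EEMs would
# come from the fluorometer after blank subtraction and scatter correction.
eems = np.abs(np.random.default_rng(1).standard_normal((40, 60, 120)))

# Fluorescence PARAFAC is fit with non-negativity constraints, since
# concentrations and spectra cannot be negative; rank 8 mirrors the eight
# components reported in the study.
weights, factors = non_negative_parafac(tl.tensor(eems), rank=8)

scores, excitation, emission = factors  # per-sample loadings and spectra
print(scores.shape, excitation.shape, emission.shape)  # (40, 8) (60, 8) (120, 8)
```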
Abstract:
Hebb proposed that synapses between neurons that fire synchronously are strengthened, forming cell assemblies and phase sequences. The former, on a shorter scale, are ensembles of synchronized cells that function transiently as a closed processing system; the latter, on a larger scale, correspond to the sequential activation of cell assemblies able to represent percepts and behaviors. Nowadays, the recording of large neuronal populations allows for the detection of multiple cell assemblies. Within Hebb’s theory, the next logical step is the analysis of phase sequences. Here we detected phase sequences as consecutive assembly activation patterns, and then analyzed their graph attributes in relation to behavior. We investigated action potentials recorded from the adult rat hippocampus and neocortex before, during and after novel object exploration (experimental periods). Within assembly graphs, each assembly corresponded to a node, and each edge corresponded to the temporal sequence of consecutive node activations. The sum of all assembly activations was proportional to firing rates, but the activity of individual assemblies was not. The assembly repertoire was stable across experimental periods, suggesting that novel experience does not create new assemblies in the adult rat. Assembly graph attributes, on the other hand, varied significantly across behavioral states and experimental periods, and were separable enough to correctly classify experimental periods (Naïve Bayes classifier; maximum AUROCs ranging from 0.55 to 0.99) and behavioral states (waking, slow wave sleep, and rapid eye movement sleep; maximum AUROCs ranging from 0.64 to 0.98). Our findings agree with Hebb’s view that neuronal assemblies correspond to primitive building blocks of representation, nearly unchanged in the adult, while phase sequences are labile across behavioral states and change after novel experience. The results are compatible with a role for phase sequences in behavior and cognition.
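The classification step described above maps onto a standard supervised workflow: assembly-graph attributes as features, experimental period (or behavioral state) as the label, scored by AUROC. A minimal sketch with scikit-learn on synthetic features; the actual graph attributes and recordings are those of the study:

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)

# Hypothetical feature matrix: one row per time window, columns standing in
# for assembly-graph attributes (e.g. node count, mean degree, clustering).
X = rng.standard_normal((200, 5))
y = rng.integers(0, 2, 200)  # e.g. 1 = novel-object exploration period

# Cross-validated Naive Bayes predictions, scored by AUROC as in the study.
probas = cross_val_predict(GaussianNB(), X, y, cv=5, method="predict_proba")
print("AUROC:", roc_auc_score(y, probas[:, 1]))  # ~0.5 on random features
```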
Abstract:
In this work we propose the development of an ultrasonic anemometer using distance sensors. Wind is an important meteorological variable for studying the dynamics of the atmosphere, climate change and agricultural crops, so advances are needed in studies that characterize its behavior ever more fully. Several types of anemometer currently exist for measuring wind speed, among which the ultrasonic anemometer stands out for its measurement accuracy. However, its high cost makes it difficult to use. We therefore sought to lower the cost of the ultrasonic anemometer by developing an apparatus capable of measuring wind velocity using distance sensors. In this type of anemometer, wind speed is derived from the transit time of an ultrasonic pulse, the same technique that distance sensors use to measure distances. Various assemblies were built in search of the best configuration in which the distance sensor could be used to measure wind speed. Bulkhead arrangements and separated transducers are examples of the assemblies studied, which are detailed in Chapter 3. From the measurements collected (with and without wind), histograms showing the distribution of recorded sound-wave transit times were generated for each case. Two of the studied configurations show favorable results regarding the use of the distance sensor to measure wind speed.
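The transit-time principle underlying the device is a short calculation: along a path of length L, a pulse travels downwind at c + v and upwind at c - v, so the wind component follows from the two transit times without knowing the speed of sound c. A minimal sketch with hypothetical numbers:

```python
def transit_time_wind_speed(L: float, t_down: float, t_up: float) -> float:
    """Wind speed along the path from the two transit times:
    t_down = L/(c+v), t_up = L/(c-v)  =>  v = (L/2)(1/t_down - 1/t_up)."""
    return (L / 2.0) * (1.0 / t_down - 1.0 / t_up)

# Hypothetical example: 0.2 m path, speed of sound ~343 m/s, 5 m/s wind.
L = 0.2
t_down = L / (343.0 + 5.0)  # ≈ 574.7 µs
t_up = L / (343.0 - 5.0)    # ≈ 591.7 µs
print(transit_time_wind_speed(L, t_down, t_up))  # ≈ 5.0 m/s
```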
Abstract:
This research addresses and analyzes the paths of television media in the Assembleias de Deus churches in Brazil, treated here in the plural because of the denomination's many branches, in their constructions of symbolic, imaginary, cultural and media representations. The objective is to analyze the symbolic dimension of the prohibition of television use within this religious denomination and the arguments for the veto among the faithful; the discreet acceptance of TV in light of the ADs' evolving theological conceptions over the years; investment in an open broadcast channel (Rede Boas Novas de Televisão); and the proposal of a television program as the denomination's media brand (Programa Movimento Pentecostal). To make this work feasible, the methodology adopted consists of historical bibliographic research, focused on the themes fundamental to the study of the imaginary, identity, culture, nation and television, combined with documentary research targeting the historical records that best elucidate the paths of TV and its connection with the Assembleias de Deus.
Abstract:
This dissertation aims to study the sociocultural transformations in the practices and customs of the Assembleia de Deus Ministério Belém church in the Crispim neighborhood of Itapecerica da Serra. From this perspective, we seek to understand the reasons that still underpin the church's ideological and conservative discourse in the face of growing flexibility and new Assembleias de Deus paradigms regarding practices and customs. The Assembleia de Deus church in Itapecerica da Serra is currently undergoing a process of change and resignification of its practices and customs. We discuss the possibility that such changes stem from contact with the different Pentecostalisms present in Itapecerica da Serra, as well as from the sociocultural development of society itself. This research studies the changes in practices and customs as part of the web of meanings produced by the Assembleia de Deus church. As a methodological procedure, we draw on historical sources and data published in books, articles and the church's own newspapers, which help corroborate the data gathered in interviews with Assembleia de Deus members in Itapecerica da Serra. The results of this research are presented in three chapters whose central axis is the discussion of the sociocultural transformations of practices and customs.
Abstract:
Science fairs in Brazil have recently received strong incentives; examples are the regulations the government has been implementing in education and the funding of public calls for events throughout the national territory. However, even with these incentives, some researchers point out that science fairs and exhibitions are still treated by teachers as extemporaneous work. This research was proposed in order to learn the views of basic education teachers about science fairs. Given this situation, and drawing on Vygotsky's (2001) theory of mediation and sociocultural interaction, Dewey's (2002) instrumentalism and the education-through-research proposal of Galiazzi and Moraes (2002), we sought to understand the importance of the fairs and their benefits, as well as their presence in the respondents' accounts. To analyze the respondents' answers, we used the discourse analysis proposed by Eni Orlandi (2009), in which the teachers' speech is observed and interpreted, considering how they shape their thinking about the research object. In analyzing the results of the survey, it was noted that the teachers interviewed know the importance and objectives of science fairs, but experience difficulties that often prevent these events from being carried out. In seeking to help minimize these difficulties, the need was identified for a product offering guidance on how to develop research projects and organize science fairs, thereby providing an education through research. Thus, as a result of this research, a blog and a booklet with texts, articles and report templates were created.
Abstract:
Background: Connectomics, the mapping of neuronal connections, is a rapidly evolving field of neuroscience that promises major advances in our understanding of brain function. The formation of neuronal circuits in response to environmental stimuli is an emergent property of the brain, yet our knowledge of the precise nature of these networks remains limited. In the visual cortex, the most studied cortical area, how information is transmitted from neuron to neuron is still an unexplored question. This invites us to study the emergence of microcircuits in response to visual stimuli: how is the interaction between a stimulus and a cell assembly established and modulated? Methods: In response to the presentation of drifting sinusoidal gratings, neuronal ensembles were recorded with tungsten multi-electrodes in layer II/III (area 17) of the primary visual cortex of anesthetized cats. Cross-correlations were computed between the activity of each pair of simultaneously recorded neurons to reveal functional links of quasi-synchrony (±5 ms window on corrected cross-correlograms). These functional links indicate putative synaptic connections between neurons. Peri-stimulus time histograms (PSTHs) of the neurons were then compared in order to reveal synergistic temporal collaboration within the disclosed functional networks. Finally, firing-rate-dependent and stimulus-dependent spectrograms were computed to observe gamma oscillations in the emerging microcircuits, and a correlation index (Rsc) was calculated for connected and unconnected neurons. Results: Functionally linked neurons show increased activity during a 50 ms window, unlike functionally unconnected neurons, suggesting that connections between neurons lead to a synergy of their inter-excitability. Moreover, analysis of the firing-rate-dependent spectrograms reveals that connected neurons display stronger gamma activity than unconnected neurons during a 50 ms window of opportunity. Low-frequency gamma activity (20-40 Hz) was associated with regular-spiking (RS) neurons, and high-frequency activity (60-80 Hz) with fast-spiking (FS) neurons. Functionally connected neurons also systematically show a higher Rsc than unconnected neurons. Finally, cross-correlogram analysis reveals that, within a neuronal assembly, the functional network changes with the orientation of the grating. We thus show that the strength of the functional relationships depends on the orientation of the sinusoidal grating. This relationship led us to propose the following hypothesis: beyond the selectivity of neurons to specific stimulus features, there is also a selectivity of the connectome. In short, "signature" functional networks are activated in an assembly that is strictly associated with the presented orientation and, more generally, with the properties of the stimuli. Conclusion: This study underlines that the cell assembly, rather than the single neuron, is the fundamental functional unit of the brain.
This dilutes the importance of the isolated activity of each neuron, that is, the classical firing-rate paradigm traditionally used to study stimulus encoding. The study also advances the debate on gamma oscillations, in that they systematically arise between connected neurons within assemblies, as a consequence of added coherence. Although the recorded assemblies are relatively small, this study nevertheless suggests an intriguing functional specificity between interacting neurons in an assembly responding to visual stimulation. It can be regarded as a premise for large-scale computational modeling of functional connectomes.
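The quasi-synchrony criterion used above (a peak within ±5 ms of zero lag on a corrected cross-correlogram) is straightforward to compute from spike times. A minimal sketch on synthetic spike trains; the shift-predictor correction applied in the study is omitted here:

```python
import numpy as np

def cross_correlogram(spikes_a, spikes_b, window=0.05, bin_size=0.001):
    """Histogram of spike-time differences (b - a) within ±window seconds."""
    diffs = spikes_b[None, :] - spikes_a[:, None]   # all pairwise lags
    diffs = diffs[np.abs(diffs) <= window]
    bins = np.arange(-window, window + bin_size, bin_size)
    counts, edges = np.histogram(diffs, bins=bins)
    return counts, edges

rng = np.random.default_rng(3)
a = np.sort(rng.uniform(0, 10, 500))                       # 500 spikes over 10 s
b = np.sort(a[::2] + 0.003 + rng.normal(0, 0.001, 250))    # b lags a by ~3 ms

counts, edges = cross_correlogram(a, b)
lags = (edges[:-1] + edges[1:]) / 2
print(f"peak lag ≈ {1000 * lags[np.argmax(counts)]:.0f} ms")  # ≈ 3 ms, inside ±5 ms
```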