42 results for Network-on-Chip (NoC)


Relevance: 100.00%

Abstract:

The lanthanide binuclear helicate [Eu(2)(L(C2(CO(2)H)))(3)] is coupled to avidin to yield a luminescent bioconjugate EuB1 (Q = 9.3%, tau((5)D(0)) = 2.17 ms). MALDI-TOF mass spectrometry confirms the covalent binding of the Eu chelate, and UV-visible spectroscopy allows one to determine a luminophore/protein ratio equal to 3.2. Bio-affinity assays involving the recognition of a mucin-like protein expressed on human breast cancer MCF-7 cells by a biotinylated monoclonal antibody 5D10, to which EuB1 is attached via avidin-biotin coupling, demonstrate that (i) avidin activity is little affected by the coupling reaction and (ii) detection limits obtained by time-resolved (TR) luminescence with EuB1 and a commercial Eu-avidin conjugate are one order of magnitude lower than those of an organic conjugate (FITC-streptavidin). In the second part of the paper, conditions for growing MCF-7 cells in 100-200 µm wide microchannels engraved in PDMS are established; we demonstrate that EuB1 can be applied as effectively on this lab-on-a-chip device for the detection of tumour-associated antigens as on MCF-7 cells grown in normal culture vials. To exploit the versatility of the ligand used for self-assembling [Ln(2)(L(C2(CO(2)H)))(3)] helicates, which sensitizes the luminescence of both Eu(III) and Tb(III) ions, a dual on-chip assay is proposed in which estrogen receptors (ERs) and human epidermal growth factor receptors (Her2/neu) can be detected simultaneously on human breast cancer tissue sections. The Ln helicates are coupled to two secondary antibodies: ERs are visualized by red-emitting EuB4 using goat anti-mouse IgG, and Her2/neu receptors by green-emitting TbB5 using goat anti-rabbit IgG. Because the assay is more than six times faster and requires five times fewer reagents than conventional immunohistochemical assays, it offers essential advantages over conventional immunohistochemistry for future clinical biomarker detection.
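The luminophore/protein ratio quoted above is the kind of quantity one can back out from UV-visible absorbance via the Beer-Lambert law. The following is a minimal sketch of that arithmetic only; the absorbance readings and extinction coefficients are illustrative placeholders, not the paper's measured values.

```python
# Hedged sketch: estimating a luminophore/protein labelling ratio from
# UV-visible absorbance. All numeric inputs below are assumed, illustrative
# values, not data from the study.

def labelling_ratio(a_chelate, eps_chelate, a_protein, eps_protein):
    """Beer-Lambert: concentration ~ A / (eps * l); the path length l
    cancels in the ratio when both absorbances are read in the same cuvette."""
    return (a_chelate / eps_chelate) / (a_protein / eps_protein)

# Illustrative values only (assumed):
ratio = labelling_ratio(a_chelate=0.48, eps_chelate=3.0e4,
                        a_protein=0.10, eps_protein=2.0e4)
print(round(ratio, 1))  # 3.2
```

With these placeholder inputs the ratio comes out to 3.2, matching the order of the value reported in the abstract.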

Relevance: 100.00%

Abstract:

NanoImpactNet (NIN) is a multidisciplinary European Commission funded network on the environmental, health and safety (EHS) impact of nanomaterials. The 24 founding scientific institutes are leading European research groups active in the fields of nanosafety, nanorisk assessment and nanotoxicology. This 4-year project is the new focal point for information exchange within the research community. Contact with other stakeholders is vital, and their needs are being surveyed. NIN is communicating with hundreds of stakeholders: businesses; internet platforms; industry associations; regulators; policy makers; national ministries; international agencies; standard-setting bodies; and NGOs concerned with labour rights, EHS or animal welfare. To improve this communication, internet research, a questionnaire distributed via partners and targeted phone calls were used to identify stakeholders' interests and needs. The knowledge gaps and needs for further data mentioned by representatives of all stakeholder groups in the targeted phone calls concerned: the potential toxic and safety hazards of nanomaterials throughout their lifecycles; the fate and persistence of nanoparticles in humans, animals and the environment; the risks associated with nanoparticle exposure; participation in the preparation of nomenclature, standards, methodologies, protocols and benchmarks; the development of best practice guidelines; voluntary schemes on responsibility; and databases of materials, research topics and themes. The findings show that stakeholders and NIN researchers share very similar knowledge needs, and that open communication and free movement of knowledge will benefit both researchers and industry. Consequently, NIN will encourage stakeholders to become active members. These survey findings will be used to improve NIN's communication tools and to further build interdisciplinary relationships towards a healthy future with nanotechnology.

Relevance: 100.00%

Abstract:

The in situ nuclear matrix was obtained from HeLa cells. After permeabilization with nonionic detergent, the resulting structures were incubated for 1 h at 37 °C to determine whether such an incubation results in the redistribution of nuclear polypeptides that resist extraction with buffers of high ionic strength (1.6 M NaCl or 0.25 M (NH4)2SO4) as well as DNase I digestion. Using indirect immunofluorescence and monoclonal antibodies, we show that heating to 37 °C changes the distribution of a 160 kDa protein previously shown to be a component of the inner matrix network. On the other hand, a 125 kDa polypeptide was not affected at all by the incubation. Our results clearly indicate that including a 37 °C incubation (for example during digestion with DNase I) in the protocol used to obtain the in situ nuclear matrix can result in the formation of in vitro artifacts.

Relevance: 100.00%

Abstract:

NanoImpactNet (NIN) is a multidisciplinary European Commission funded network on the environmental, health and safety (EHS) impact of nanomaterials. The 24 founding scientific institutes are leading European research groups active in the fields of nanosafety, nanorisk assessment and nanotoxicology. This 4-year project is the new focal point for information exchange within the research community. Contact with other stakeholders is vital, and their needs are being surveyed. NIN is communicating with hundreds of stakeholders: businesses; internet platforms; industry associations; regulators; policy makers; national ministries; international agencies; standard-setting bodies; and NGOs concerned with labour rights, EHS or animal welfare. To improve this communication, internet research, a questionnaire distributed via partners and targeted phone calls were used to identify stakeholders' interests and needs. The knowledge gaps and needs for further data mentioned by representatives of all stakeholder groups in the targeted phone calls concerned:

• the potential toxic and safety hazards of nanomaterials throughout their lifecycles;
• the fate and persistence of nanoparticles in humans, animals and the environment;
• the risks associated with nanoparticle exposure;
• greater participation in the preparation of nomenclature, standards, methodologies, protocols and benchmarks;
• the development of best practice guidelines;
• voluntary schemes on responsibility;
• databases of materials, research topics and themes, but also of expertise.

These findings suggested that stakeholders and NIN researchers share very similar knowledge needs, and that open communication and free movement of knowledge will benefit both researchers and industry. Subsequently, NIN organised a workshop focused on building a sustainable multi-stakeholder dialogue, at which specific questions were put to the different stakeholder groups to encourage discussion and open communication.

1. What information do stakeholders need from researchers, and why? The discussions around this question confirmed the needs identified in the targeted phone calls.

2. How should information be communicated? While it was agreed that reporting should be enhanced, commercial confidentiality and economic competition were identified as major obstacles, and it was recognised that expertise in commercial law and economics is needed for a well-informed treatment of this communication issue.

3. Can engineered nanomaterials be used safely? The idea that nanomaterials are probably safe because some of them have been produced 'for a long time' was questioned, since many materials in common use have proved to be unsafe. The question of safety is also one of public confidence, and new legislation such as REACH could help here. Hazards do not materialise if exposure can be avoided or at least significantly reduced, so information is needed on what can be regarded as acceptable levels of exposure. It was noted that there is no such thing as a perfectly safe material, only boundaries, and at present we do not know where these boundaries lie. The labelling of products containing nanomaterials was also raised, as in the public mind safety and labelling are connected. This may need to be addressed, since nanomaterials in food, drink and food packaging may be the first safety issue to attract public and media attention, which could have an impact on nanotechnology as a whole.

4. Do we need more or other regulation? Any decision-making process should accommodate the changing level of uncertainty. To address the uncertainties, adaptations of frameworks such as REACH may be indicated for nanomaterials. Regulation is often needed even where voluntary measures are welcome, because it mitigates the effects of competition between industries; data cannot be collected on a voluntary basis, for example.

NIN will continue an active stakeholder dialogue to further build interdisciplinary relationships towards a healthy future with nanotechnology.

Relevance: 100.00%

Abstract:

ABSTRACT: The development of the retina is a very complex process, occurring through the progressive restriction of cell fates, from pluripotent cell populations to complex tissues and organs. In all vertebrate species analyzed so far, retinal differentiation starts with the generation of retinal ganglion cells (RGCs). One of the documented key events in the specification of RGCs is the expression of ATH5, an atonal homolog encoding a bHLH transcription factor. Despite its putative role as a master regulator of RGC differentiation, the mechanism that integrates its functions into a coherent program underlying the production of this subclass of retinal neurons has not yet been elucidated. Using chromatin immunoprecipitation combined with microarrays (ChIP-on-chip), we screened for direct ATH5 targets in the developing chick retina at two consecutive periods, E3.5 (stage HH22) and E6 (stage HH30), covering the stages of progenitor proliferation, neuroepithelium patterning, RGC specification, cell-cycle exit and early neuronal differentiation. In parallel, a complementary analysis with Affymetrix expression microarrays was conducted. We compared RGCs with whole retina to see whether the targets correspond to genes preferentially expressed in RGCs. We also precociously overexpressed ATH5 in the retina of individual embryos, with the contralateral retina used as a control. Our integrated approach allowed us to establish a compendium of ATH5 targets and to position ATH5 within the transcription network underlying neurogenesis in the retina. Malattia Leventinese (ML) is an autosomal dominant retinal dystrophy characterized by extracellular, amorphous deposits known as drusen between the retinal pigment epithelium (RPE) and Bruch's membrane. At the genetic level, it has been associated with a single missense mutation (R345W) in a widely expressed gene of unknown function called EFEMP1.
We determined the expression patterns of the EFEMP1 gene in normal and ML human retinas. Our data show that upregulation of EFEMP1 is not specific to the ML eye, except in the region of the ciliary body. We also analyzed the cellular compartmentalization of different versions of the protein (both wild type and mutant). Our studies indicate that both abnormal expression of the EFEMP1 gene and the mutation and accumulation of EFEMP1 protein (inside or outside the cells) might contribute to ML pathology.

Relevance: 100.00%

Abstract:

The motivation for this research originated in the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and for business applications because they cost significantly less than their predecessors, the mainframes. Industrial automation later developed its own vertically integrated hardware and software to address the application needs of uninterrupted operation, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely industrial automation as used in PLC, DCS, SCADA and robot control systems. This industry employs over 200,000 people today in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into a single information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advancement of Moore's law. These developments created the high-performance, low-energy-consumption system-on-chip (SoC) architecture. Unlike with CISC processors, RISC processor architecture design is an industry separate from RISC chip manufacturing. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface and application market, which gives customers more choice through hardware-independent, real-time-capable software applications.
An architecture disruption emerged, and the smartphone and tablet markets formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other kind of computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An additional underlying element of this transition is the increasing role of open-source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominant closed operating-system platforms, into steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, and the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based or licensed - all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact through the disruptions based on SoC and related software platforms in the ICT industries. Industrial automation incumbents continue to supply vertically integrated systems consisting of proprietary software and proprietary, mainly microprocessor-based, hardware.
They enjoy admirable profitability on a very narrow customer base, thanks to strong technology-enabled customer lock-in and customers' high risk exposure, as their production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation (ICAT) industry. Lately the Internet of Things (IoT) and weightless networks, a new standard leveraging frequency channels earlier occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will build for the industrial automation market to face, in due course, an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation and the competition between incumbents, first through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development, and second through research on process re-engineering in the case of global software support for complex systems. Third, we investigate the views of the industry's actors - customers, incumbents and newcomers - on the future direction of industrial automation, and we conclude with our assessment of the possible routes industrial automation could take, given the looming rise of the Internet of Things (IoT) and weightless networks. Industrial automation is an industry dominated by a handful of global players, each focused on maintaining its own proprietary solutions.
The rise of de facto standards like the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, has created new markets for personal computers, smartphones and tablets, and will eventually also impact industrial automation through game-changing commoditization and the related control-point and business-model changes. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.

Relevance: 100.00%

Abstract:

Our current knowledge of the general factor requirements for transcription by the three mammalian RNA polymerases is based on a small number of model promoters. Here, we present a comprehensive chromatin immunoprecipitation (ChIP)-on-chip analysis of 28 transcription factors on a large set of known and novel TATA-binding protein (TBP)-binding sites identified experimentally via ChIP cloning. A large fraction of the identified TBP-binding sites is located in introns or lacks a gene/mRNA annotation, yet is found to direct transcription. Integrated analysis of the ChIP-on-chip data and functional studies revealed that TAF12, hitherto regarded as RNA polymerase II (RNAP II)-specific, is also involved in RNAP I transcription. Distinct profiles of general transcription factors and TAF-containing complexes were uncovered for RNAP II promoters located in CpG and non-CpG islands, suggesting distinct transcription initiation pathways. Our study broadens the spectrum of general transcription factor function and uncovers a plethora of novel, functional TBP-binding sites in the human genome.

Relevance: 100.00%

Abstract:

Eukaryotic transcription is tightly regulated by transcriptional regulatory elements, even though these elements may be located far away from their target genes. It is now widely recognized that such regulatory elements can be brought into close proximity through the formation of chromatin loops, and that these loops are crucial for the transcriptional regulation of their target genes. The chromosome conformation capture (3C) technique presents a snapshot of long-range interactions: physically interacting elements are fixed with formaldehyde, the DNA is digested, and ligation yields a library of unique ligation products. Recently, several large-scale modifications of the 3C technique have been presented. Here, we describe chromosome conformation capture sequencing (4C-seq), a high-throughput version of the 3C technique that combines the 3C-on-chip (4C) protocol with next-generation Illumina sequencing. The method is presented for use in mammalian cell lines, but can be adapted for use in mammalian tissues and in any other eukaryotic genome.
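Downstream of such a protocol, a routine computational step is to assign each sequenced ligation product to the restriction fragment it maps to and count captures per fragment. The sketch below is a hedged illustration of that bookkeeping only, not the published 4C-seq pipeline; the cut-site coordinates and read positions are invented for the example.

```python
# Illustrative sketch (not the published 4C-seq pipeline): once reads are
# mapped to the genome, count captured ligation partners per restriction
# fragment. Fragments are the intervals between consecutive cut sites of an
# assumed restriction enzyme on one chromosome.
import bisect
from collections import Counter

def count_per_fragment(read_positions, site_positions):
    """Assign each mapped read start to the restriction fragment containing
    it (fragment k lies between site k-1 and site k) and tally the counts."""
    counts = Counter()
    for pos in read_positions:
        frag = bisect.bisect_right(site_positions, pos)  # fragment index
        counts[frag] += 1
    return counts

sites = [1000, 5000, 9000, 20000]               # invented cut-site coordinates
reads = [1200, 1500, 6000, 9500, 9501, 25000]   # invented mapped read starts
print(dict(count_per_fragment(reads, sites)))   # {1: 2, 2: 1, 3: 2, 4: 1}
```

Binary search (`bisect_right`) keeps the lookup O(log n) per read, which matters once site lists cover a whole genome.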

Relevance: 100.00%

Abstract:

Self-categorization theory is a social psychology theory dealing with the relation between the individual and the group. It explains group behaviour through the conception of self and others as members of social categories, and through the attribution of the categories' prototypical characteristics to individuals. Hence, it is a theory of the individual that is intended to explain collective phenomena. Situations involving a large number of non-trivially interacting individuals typically generate complex collective behaviours, which are difficult to anticipate on the basis of individual behaviour. Computer simulation of such systems is a reliable way of systematically exploring the dynamics of collective behaviour as a function of individual specifications. In this thesis, we present a formal model of a part of self-categorization theory called the metacontrast principle. Given the distribution of a set of individuals on one or several comparison dimensions, the model generates categories and their associated prototypes. We show that the model behaves coherently with respect to the theory and is able to replicate experimental data concerning various group phenomena, for example polarization. Moreover, it makes it possible to describe systematically the predictions of the theory from which it is derived, especially in situations not previously examined. At the collective level, several dynamics can be observed, among them convergence towards consensus, towards fragmentation or towards the emergence of extreme attitudes. We also study the effect of the social network on the dynamics and show that, except for the convergence speed, which increases as the mean distances on the network decrease, the observed types of convergence depend little on the chosen network. We further note that individuals located at the border of groups (whether in the social network or spatially) have a decisive influence on the outcome of the dynamics. In addition, the model can be used as an automatic classification algorithm. It identifies prototypes around which groups are built. Prototypes are positioned so as to accentuate the groups' typical characteristics and are not necessarily central. Finally, if we consider the set of pixels of an image as individuals in a three-dimensional colour space, the model provides a filter that attenuates noise, helps in detecting objects and simulates perception biases such as chromatic induction.
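As a toy illustration of the metacontrast idea - a grouping is favoured when differences between categories are large relative to differences within them - the sketch below scores every two-group split of points on a single comparison dimension and keeps the one with the highest inter/intra ratio. It is an assumption-laden simplification for intuition, not a reimplementation of the thesis's model.

```python
# Toy metacontrast sketch (assumed simplification, not the thesis's model):
# on one comparison dimension, prefer the two-group split maximising the
# mean inter-group difference over the mean intra-group difference.
from itertools import combinations

def mean_pairwise(xs):
    """Mean absolute difference over all pairs within one group."""
    pairs = list(combinations(xs, 2))
    return sum(abs(a - b) for a, b in pairs) / len(pairs) if pairs else 0.0

def best_split(positions):
    """Try every boundary between sorted positions; return the split with
    the highest metacontrast-style inter/intra ratio."""
    xs = sorted(positions)
    best, best_ratio = None, -1.0
    for k in range(1, len(xs)):                # boundary after element k-1
        left, right = xs[:k], xs[k:]
        intra = mean_pairwise(left) + mean_pairwise(right)
        inter = sum(abs(a - b) for a in left for b in right) / (len(left) * len(right))
        ratio = inter / intra if intra > 0 else float('inf')
        if ratio > best_ratio:
            best, best_ratio = (left, right), ratio
    return best

print(best_split([0.1, 0.2, 0.3, 0.8, 0.9]))   # ([0.1, 0.2, 0.3], [0.8, 0.9])
```

The split between the two visually obvious clumps wins because it minimises within-group spread while maximising between-group spread, which is the heart of the metacontrast ratio.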

Relevance: 100.00%

Abstract:

A recurring task in the analysis of mass genome annotation data from high-throughput technologies is the identification of peaks or clusters in a noisy signal profile. Examples of such applications are the definition of promoters on the basis of transcription start site profiles, the mapping of transcription factor binding sites based on ChIP-chip data, and the identification of quantitative trait loci (QTL) from whole-genome SNP profiles. The input to such an analysis is a set of genome coordinates associated with counts or intensities; the output consists of a discrete number of peaks with their respective volumes, extensions and center positions. For this purpose we have developed a flexible one-dimensional clustering tool, called MADAP, which we make available as a web server and as a standalone program. A set of parameters enables the user to customize the procedure for a specific problem. The web server, which returns results in textual and graphical form, is useful for small to medium-scale applications, as well as for evaluation and parameter tuning in view of large-scale applications, which require a local installation. The program, written in C++, can be freely downloaded from ftp://ftp.epd.unil.ch/pub/software/unix/madap. The MADAP web server can be accessed at http://www.isrec.isb-sib.ch/madap/.
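To make the input/output contract described above concrete, here is a minimal, assumption-laden sketch of one-dimensional clustering of (coordinate, count) data. It is a simple gap-based toy, not MADAP's actual algorithm, but it reports the three per-peak quantities the abstract names: volume, extension and center.

```python
# Illustrative sketch only - not the MADAP algorithm: cluster (coordinate,
# count) pairs by merging positions closer than `max_gap`, then report each
# peak's volume (total counts), extension (first-last coordinate) and
# count-weighted center position.

def cluster_1d(data, max_gap=50):
    """data: list of (genome_coordinate, count); returns one dict per peak."""
    peaks, current = [], []
    for pos, cnt in sorted(data):
        if current and pos - current[-1][0] > max_gap:
            peaks.append(current)          # gap too wide: close current peak
            current = []
        current.append((pos, cnt))
    if current:
        peaks.append(current)
    out = []
    for p in peaks:
        volume = sum(c for _, c in p)
        center = sum(pos * c for pos, c in p) / volume   # weighted mean
        out.append({"volume": volume,
                    "extension": (p[0][0], p[-1][0]),
                    "center": center})
    return out

# Invented tag counts at genome coordinates:
tags = [(100, 2), (120, 5), (130, 3), (400, 4), (420, 1)]
for peak in cluster_1d(tags):
    print(peak)
```

Running this on the invented data yields two peaks, one spanning 100-130 with volume 10 and one spanning 400-420 with volume 5; MADAP's parameters play the role that `max_gap` plays in this toy.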