904 results for "Guide for ways to support the most vulnerable families in society"


Relevance:

100.00%

Publisher:

Abstract:

As with all forms of transport, the safety of air travel is of paramount importance. With the projected increases in European air traffic over the next decade and beyond, it is clear that the risk of accidents needs to be assessed and carefully monitored on a continuing basis. The present thesis is aimed at the development of a comprehensive collision risk model as a method of assessing the European en-route risk, due to all causes and across all dimensions within the airspace. The major constraint in developing appropriate monitoring methodologies and tools to assess the level of safety in en-route airspaces, where controllers monitor air traffic by means of radar surveillance and provide aircraft with tactical instructions, lies in the estimation of the operational risk. Today, the operational risk estimate normally relies on incident reports provided by the air navigation service providers (ANSPs). This thesis proposes a new and innovative approach to assessing the aircraft safety level based exclusively on the processing and analysis of radar tracks. The proposed methodology has been designed to complement the information collected in the accident and incident databases, thereby providing robust information on air traffic factors and safety metrics inferred from the automatic, in-depth assessment of all proximate events.
The 3-D CRM methodology has been implemented in a MATLAB prototype that automatically analyses recorded aircraft tracks and flight plan data from the Radar Data Processing (RDP) systems, identifying and analysing all proximate events (conflicts, potential conflicts and potential collisions) within a given time span and volume of airspace. Currently, the 3-D CRM prototype is being adapted and integrated into Aena's performance monitoring tool (PERSEO) to complement the information provided by the ATM accident and incident databases and to enhance monitoring and provide evidence of the levels of safety.
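The abstract does not spell out the detection logic, but pairwise screening of radar tracks for proximate events typically rests on a closest-point-of-approach (CPA) test. The following Python sketch is only an illustration of such a test under a constant-velocity assumption, not the thesis' MATLAB implementation; the track values and the 5 NM threshold are assumed for the example.

```python
import numpy as np

def closest_point_of_approach(p1, v1, p2, v2):
    """Time and distance of closest approach for two aircraft modelled as
    constant-velocity tracks (positions in NM, velocities in NM/s)."""
    dp = p1 - p2                      # relative position
    dv = v1 - v2                      # relative velocity
    dv2 = float(np.dot(dv, dv))
    # If the relative velocity is zero, the separation never changes.
    t_cpa = 0.0 if dv2 == 0.0 else max(0.0, -float(np.dot(dp, dv)) / dv2)
    d_cpa = float(np.linalg.norm(dp + t_cpa * dv))
    return t_cpa, d_cpa

# Two nearly head-on tracks 20 NM apart at the same level (example values);
# flag a proximate event if the predicted separation falls below 5 NM.
p1, v1 = np.array([0.0, 0.0, 6.0]), np.array([0.12, 0.0, 0.0])
p2, v2 = np.array([20.0, 1.0, 6.0]), np.array([-0.12, 0.0, 0.0])
t, d = closest_point_of_approach(p1, v1, p2, v2)
if d < 5.0:
    print(f"proximate event: CPA in {t:.0f} s at {d:.2f} NM")
```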

Relevance:

100.00%

Publisher:

Abstract:

The aim of the present research is to characterise the international scene in the field of building refurbishment, by thoroughly reviewing the literature relating to building renovation and systematising the results according to the different aspects considered by the authors. Even though there is some consensus with respect to the criteria for selecting energy efficiency measures, the assessment criteria differ widely. The present work highlights this lack of consensus on assessment criteria and the need for harmonisation. A holistic view is required in order to identify the most sustainable strategies in each particular case, considering social, environmental and economic impacts from a life cycle perspective.

Relevance:

100.00%

Publisher:

Abstract:

The importance of embedded software is growing, as it is required for a large number of systems. Devising cheap, efficient and reliable development processes for embedded systems is thus a notable challenge nowadays. Computer processing power is continuously increasing, and as a result it is currently possible to integrate complex systems in a single processor, which was not feasible a few years ago. Embedded systems may have safety-critical requirements, and their failure may result in loss of life or substantial economic loss. The development of these systems requires stringent development processes that are usually defined by suitable standards; in some cases their certification is also necessary. This scenario fosters the use of mixed-criticality systems, in which applications of different criticality levels must coexist in a single system. In these cases it is usually necessary to certify the whole system, including non-critical applications, which is costly. Virtualization emerges as an enabling technology for dealing with this problem. The system is structured as a set of partitions, or virtual machines, that can be executed with temporal and spatial isolation. In this way, applications can be developed and certified independently.
The development of MCPS (Mixed-Criticality Partitioned Systems) requires additional roles and activities that traditional systems do not. The system integrator has to define the system partitions, and application development has to consider the characteristics of the partition to which each application is allocated. In addition, traditional software process models have to be adapted to this scenario. The V-model is commonly used in embedded systems development; it can be adapted to the development of MCPS by enabling the parallel development of applications or the addition of a new partition to an existing system. The objective of this PhD is to improve the available technology for MCPS development by providing a framework tailored to the development of this type of system and by defining a flexible and efficient algorithm for automatically generating system partitionings. The goal of the framework is to integrate all the activities required for developing MCPS and to support the different roles involved in this process. The framework is based on MDE (Model-Driven Engineering), which emphasizes the use of models in the development process. It provides basic means for modeling the system, generating system partitions, validating the system and generating the final artifacts, and it has been designed to facilitate its extension and the integration of external validation tools. In particular, it can be extended by adding support for additional non-functional requirements and for new final artifacts, such as new programming languages or additional documentation. The framework includes a novel partitioning algorithm. It has been designed to be independent of the types of application requirements and to enable the system integrator to tailor the partitioning to the specific requirements of a system. This independence is achieved by defining partitioning constraints that must be met by the resulting partitioning. These constraints have sufficient expressive capacity to state the most common non-functional requirements, and they can be defined manually by the system integrator or generated automatically from the functional and non-functional requirements of the applications. The partitioning algorithm takes system models and partitioning constraints as its inputs and generates a deployment model composed of a set of partitions; each partition is in turn composed of a set of allocated applications and assigned resources. The partitioning problem, including applications and constraints, is modeled as a colored graph, in which a valid partitioning corresponds to a proper vertex coloring; a specially designed algorithm generates this coloring and can provide alternative partitionings if required. The framework, including the partitioning algorithm, has been successfully used in the development of two industrial use cases: the UPMSat-2 satellite and the control system of a wind-power turbine. The partitioning algorithm has also been validated on a large number of synthetic loads, including complex scenarios with more than 500 applications.
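The abstract states the core idea (applications as vertices, separation constraints as edges, one partition per color) without publishing the algorithm itself. The following Python sketch is a minimal greedy illustration of that idea, using a Welsh-Powell-style ordering and hypothetical application names; the thesis algorithm additionally handles resources, richer constraint types and alternative colorings.

```python
# Greedy proper vertex coloring of a "separation graph": vertices are
# applications, an edge means the two applications must not share a
# partition (e.g. different criticality levels). Each color is a partition.
def partition(apps, must_separate):
    adjacency = {a: set() for a in apps}
    for a, b in must_separate:
        adjacency[a].add(b)
        adjacency[b].add(a)
    color = {}
    # Color highest-degree vertices first (Welsh-Powell heuristic).
    for a in sorted(apps, key=lambda a: -len(adjacency[a])):
        used = {color[n] for n in adjacency[a] if n in color}
        color[a] = next(c for c in range(len(apps)) if c not in used)
    partitions = {}
    for a, c in color.items():
        partitions.setdefault(c, []).append(a)
    return list(partitions.values())

# Hypothetical applications and constraints, for illustration only:
apps = ["flight_ctrl", "telemetry", "payload", "logging"]
constraints = [("flight_ctrl", "telemetry"),
               ("flight_ctrl", "payload"),
               ("flight_ctrl", "logging")]
print(partition(apps, constraints))
# [['flight_ctrl'], ['telemetry', 'payload', 'logging']]
```

A proper coloring guarantees that no two applications joined by a constraint share a partition, which is exactly the correctness property the abstract attributes to the partitioning constraints.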

Relevance:

100.00%

Publisher:

Abstract:

Bismuth ultra-thin films grown on n-GaAs electrodes by electrodeposition are porous because the electrode surface is blocked by hydrogen adsorbed from acidic electrolytes. In this study, we discuss the existence of two sources of hydrogen adsorption and propose different routes to unblock the n-GaAs surface in order to improve the compactness of the Bi films. Firstly, we demonstrate that increasing the electrolyte temperature provides compact yet polycrystalline Bi films. Cyclic voltammetry scans indicate that this low crystal quality might result from the incorporation of Bi hydroxides into the film, a consequence of the temperature increase. Secondly, we have illuminated the semiconductor surface to take advantage of photogenerated holes. These photocarriers oxidize the adsorbed hydrogen and unblock the surface, but they also create pits at the substrate surface that degrade the Bi/GaAs interface and prevent epitaxial growth. Finally, we show that performing a cyclic voltammetry scan before electrodeposition enables the growth of compact, highly crystalline Bi ultra-thin films on semiconductor substrates with a doping level low enough to allow transport measurements.

Relevance:

100.00%

Publisher:

Abstract:

Owing to the proliferation of higher-education programmes in Accounting (Ciências Contábeis), the Conselho Federal de Contabilidade (CFC) created the Sufficiency Examination after finding that teaching quality was falling short, seeking to emphasise ethical questions in the exams. Today, the professional accountant has a practical, mechanical knowledge of bookkeeping far greater than his or her accounting reasoning. Through the study of my own (auto)biographical trajectory from the perspective of professional formation, this work seeks to identify the most decisive elements in the formation of an Accounting teacher, to verify to what extent the practice of research was present in that formation, and to examine whether research, as an educational principle, is reflected in teaching practice. It must be understood that the move from accountant to Accounting teacher happens, in most cases, through invitations extended to professionals who have succeeded in the labour market and who therefore do not always have adequate pedagogical training for teaching. By better understanding the experience of one path from accountant to Accounting teacher, this study also aims to suggest possible routes for the continuing education of teachers in this field. To this end, I took Freire, Demo and Schön as references on education and teacher training; Cunha and Buarque on higher education; and Iudícibus and Marion on Accounting. The methodology used was (auto)biographical research, drawing mainly on Nóvoa and Josso. The results suggest that the tendency to reproduce the "banking" approach to education, absorbed at school from primary level to higher education, can only be reversed insofar as the subject, without abandoning practical teaching work, has the opportunity to study the theoretical foundations of formal educational processes and to bring them into reflection on practice. It is also noticeable how much (auto)biographical inquiry, undertaken with a formative purpose, can help in this reflective process. In conclusion, it is therefore suggested that these strategies be used in the continuing education of Accounting teachers.

Relevance:

100.00%

Publisher:

Abstract:

Considerable evidence exists to support the hypothesis that the hippocampus and related medial temporal lobe structures are crucial for the encoding and storage of information in long-term memory. Few human imaging studies, however, have successfully shown signal intensity changes in these areas during encoding or retrieval. Using functional magnetic resonance imaging (fMRI), we studied normal human subjects while they performed a novel picture encoding task. High-speed echo-planar imaging techniques evaluated fMRI signal changes throughout the brain. During the encoding of novel pictures, statistically significant increases in fMRI signal were observed bilaterally in the posterior hippocampal formation and parahippocampal gyrus and in the lingual and fusiform gyri. To our knowledge, this experiment is the first fMRI study to show robust signal changes in the human hippocampal region. It also provides evidence that the encoding of novel, complex pictures depends upon an interaction between ventral cortical regions, specialized for object vision, and the hippocampal formation and parahippocampal gyrus, specialized for long-term memory.

Relevance:

100.00%

Publisher:

Abstract:

The earliest characterized events during induction of tubulogenesis in renal anlage include the condensation or compaction of metanephrogenic mesenchyme with the concurrent upregulation of WT1, the gene encoding the Wilms tumor transcriptional activator/suppressor. We report that basic fibroblast growth factor (FGF2) can mimic the early effects of an inductor tissue by promoting the condensation of mesenchyme and inhibiting the tissue degeneration associated with the absence of an inductor tissue. By in situ hybridization, FGF2 was also found to mediate the transcriptional activation of WT1 and of the hepatocyte growth factor receptor gene, c-met. Although FGF2 can induce these early events of renal tubulogenesis, it cannot promote the epithelial conversion associated with tubule formation in metanephrogenic mesenchyme. For this, an undefined factor(s) from pituitary extract in combination with FGF2 can cause tubule formation in uninduced mesenchyme. These findings support the concept that induction in kidney is a multiphasic process that is mediated by more than a single comprehensive inductive factor and that soluble molecules can mimic these inductive activities in isolated uninduced metanephrogenic mesenchyme.

Relevance:

100.00%

Publisher:

Abstract:

As part of the Coupled Model Intercomparison Project, integrations with a common design have been undertaken with eleven different climate models to compare the response of the Atlantic thermohaline circulation (THC) to time-dependent climate change caused by increasing atmospheric CO2 concentration. Over 140 years, during which the CO2 concentration quadruples, the circulation strength declines gradually in all models, by between 10 and 50%. No model shows a rapid or complete collapse, despite the fairly rapid increase and high final concentration of CO2. The models having the strongest overturning in the control climate tend to show the largest THC reductions. In all models, the THC weakening is caused more by changes in surface heat flux than by changes in surface water flux. No model shows a cooling anywhere, because the greenhouse warming is dominant.

Relevance:

100.00%

Publisher:

Abstract:

The Surface Renewal Theory (SRT) is one of the lesser-known models for characterizing fluid-fluid and fluid-fluid-solid reactions, which are of considerable industrial and academic importance. In the present work, an approach to solving the SRT model by numerical methods is presented, enabling the visualization of the influence of the different variables that control the overall heterogeneous process. Its use in the classroom allowed the students to reach a good understanding of the process.
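The abstract does not reproduce the numerical scheme, but the classical Danckwerts form of the surface renewal model admits a compact numerical check: the mean absorption flux is the penetration-theory flux integrated over an exponential surface-age distribution, with analytical value C*·sqrt(D·s). The Python sketch below, with assumed parameter values, performs that integration and compares the two results.

```python
import numpy as np
from scipy.integrate import quad

# Illustrative parameter values only (not from the paper):
D = 1.9e-9      # diffusivity, m^2/s
c_star = 34.0   # interfacial concentration, mol/m^3
s = 0.5         # surface renewal rate, 1/s

# Mean absorption flux: integral of the penetration-theory flux
# c* sqrt(D/(pi t)) over Danckwerts' age distribution s exp(-s t).
# Substituting u = sqrt(t) removes the integrable singularity at t = 0.
integrand = lambda u: 2.0 * c_star * np.sqrt(D / np.pi) * s * np.exp(-s * u**2)
mean_flux, _ = quad(integrand, 0.0, np.inf)

print(f"numerical : {mean_flux:.4e} mol/m^2/s")
print(f"analytical: {c_star * np.sqrt(D * s):.4e} mol/m^2/s")  # c* sqrt(D s)
```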

Relevance:

100.00%

Publisher:

Abstract:

Background: Numerous international policy drivers espouse the need to improve healthcare. The application of Improvement Science has the potential to rebalance healthcare and transform it into a more person-centred, quality-improvement-focussed system. However, no accredited Improvement Science education is currently offered routinely to healthcare students, which means that a huge number of healthcare professionals lack the conceptual or experiential skills to apply Improvement Science in everyday practice. Methods: This article describes how seven European Higher Education Institutions (HEIs) worked together to develop four evidence-informed, accredited, inter-professional Improvement Science modules for undergraduate and postgraduate healthcare students. It outlines the way in which a Policy Delphi, a narrative literature review, a review of the competency and capability requirements for healthcare professionals to practise Improvement Science, and a mapping of current Improvement Science education informed the content of the modules. Results: A contemporary consensus definition of Healthcare Improvement Science was developed. The four Improvement Science modules that have been designed are outlined. A framework to evaluate the impact the modules have in practice has been developed and piloted. Conclusion: The authors argue that there is a clear need to advance healthcare Improvement Science education by incorporating evidence-based accredited modules into healthcare professional education. They suggest that if Improvement Science education incorporating work-based learning becomes a staple part of inter-professional curricula, it holds real promise to improve the delivery, quality and design of healthcare.

Relevance:

100.00%

Publisher:

Abstract:

Fuenterrabía (Hondarribia) is a town located on the Franco-Spanish border. Between the 16th and 19th centuries it was considered one of the most outstanding strongholds in the Basque Country owing to its strategic position, and the bastioned system of fortification is particularly well represented there: it was one of the first Spanish towns to adopt the incipient Renaissance designs of the bastion, and military engineers subsequently carried out continuous fortification projects that enabled the structure to withstand the advances being made in artillery and siege tactics. After construction of the citadel of Pamplona had begun in 1571 to the design of the prestigious military engineer Jacobo Palear Fratín, as revised by Viceroy Vespasiano Gonzaga, the same engineer undertook an ambitious project commissioned by Felipe II to modernise the fortifications of Fuenterrabía. Neither the plans nor the report of this project have been conserved, but in 2000 César Fernández Antuña published the report written by Spannocchi on the state of the fortifications of Fuenterrabía on his arrival in the Iberian Peninsula, discovered in the Archivo Histórico Provincial de Zaragoza. This paper conducts an in-depth analysis of Spannocchi's project and its relation to Fratín's earlier one. It concludes that the project ran into the difficulties of updating the new bastions at the end of the 16th century, and identifies the factors that prevented the stronghold from being extended as Pamplona was after Fratín's project.

Relevance:

100.00%

Publisher:

Abstract:

Viruses need to interact with host-cell factors in order to replicate and propagate. A study of the hepatitis C virus (HCV) protein interactome by Germain et al. (2014) elucidated new virus-host interactions and also showed that the majority of the host factors had no effect on viral replication. This work suggests that most of these proteins play roles in other cellular processes, such as the innate antiviral response, and are targeted by the virus as part of immune-evasion mechanisms. To test this hypothesis, 132 virus-host interactors were selected and evaluated by gene silencing in an RNAi screen measuring interferon-beta (IFNB1) production. We observed that knockdown of 53 of these virus-host interactors modulated the innate antiviral response. A gene ontology (GO) term analysis showed an enrichment of these proteins in nucleocytoplasmic transport and the nuclear pore complex. Moreover, the genes associated with these terms (CSE1L, KPNB1, RAN, TNPO1 and XPO1) had been characterized as interactors of the NS3/4A protein by Germain et al. (2014), and here as positive regulators of the innate antiviral response. Since HCV replicates in the cytoplasm, we propose that these interactions with nucleus-associated proteins confer a replicative advantage and benefit the virus by interfering with cellular processes such as the innate response. This innate antiviral response requires the nuclear translocation of the transcription factors IRF3 and NF-κB p65 for the production of type I IFNs. A microscopy assay was therefore developed to evaluate, in an RNAi screen over a time course of viral infection, the effect of silencing 60 genes encoding proteins associated with the nuclear pore complex and nucleocytoplasmic transport on the translocation of IRF3 and NF-κB p65. In conclusion, the study shows that several proteins are involved in the transport of these transcription factors during viral infection and can affect IFNB1 production at different levels of the antiviral immune response. It also suggests that the effect of these transport factors on the innate response may constitute an evasion mechanism for viruses such as HCV.
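Enrichment of GO terms among screen hits, as reported here, is commonly assessed with a hypergeometric test. A minimal Python sketch follows; the population (132 interactors) and hit (53 modulators) sizes come from the abstract, while the annotation counts are hypothetical.

```python
from scipy.stats import hypergeom

# Hypergeometric enrichment test of the kind used in GO-term analysis:
# is an annotation (e.g. "nucleocytoplasmic transport") over-represented
# among hits? Annotation counts below are illustrative, not study data.
M = 132   # screened virus-host interactors (population size)
n = 8     # of these, annotated with the GO term of interest (hypothetical)
N = 53    # hits modulating IFNB1 production (sample size)
k = 6     # hits carrying the annotation (hypothetical)

# P(X >= k) under random sampling without replacement
p_value = hypergeom.sf(k - 1, M, n, N)
print(f"enrichment p-value: {p_value:.3g}")
```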

Relevance:

100.00%

Publisher:

Abstract:

In order to carry out functional studies of the mouse genome, our laboratory generated a library of embryonic stem cell (ESC) clones carrying random, overlapping chromosomal deletions: the DELES library. This library contains deletions covering approximately 25% of the murine genome. In the laboratory, we intend to use this tool to identify new determinants of hematopoietic cell fate. A primary screen using benzidine to detect the presence of hemoglobin in embryoid bodies (EBs) identified several deletion clones with an abnormal hematopoietic phenotype. Since this assay only detects hemoglobin, the goal of my project is to establish an in vitro ESC differentiation assay for measuring the hematopoietic potential of DELES clones. My hypothesis is that the hematopoietic differentiation assay published by Dr. Keller can be imported into our laboratory and used to study the hematopoietic commitment of DELES clones. Using RT-qPCR and FACS assays, I was able to monitor the kinetics of hematopoietic differentiation by following the expression of hematopoietic genes and surface markers such as CD41, c-kit, RUNX1, GATA2, CD45, β-globin 1 and TER-119. This assay will be used to validate the hematopoietic potential of the candidate DELES clones identified in the primary screen. My secondary project aims to use the same Cre-loxP-based retroviral strategy employed to generate the DELES library to build a library of KBM-7 cells carrying overlapping chromosomal deletions. My goal here is to test whether the nearly haploid human leukemic cell line KBM-7 can be exploited using the DELES approach to create this library. The KBM-7 clone library will serve to define the molecular activities of potential anti-leukemic drugs that we have identified in the laboratory on the basis of their ability to inhibit cell growth in several patient-derived acute myeloid leukemia samples. It will also allow me to identify the molecular signaling pathways that, when genetically disrupted, can confer resistance to these drugs.

Relevance:

100.00%

Publisher:

Abstract:

The latest round of climate negotiations, which took place in Warsaw (Conference of Parties, COP19), finally resulted in a decision to agree on a timeframe for the new agreement due at COP21 in Paris in 2015, and on ways to enhance the levels of ambition in pre-2020 mitigation pledges. Specifically, Warsaw produced two milestones: i) Parties were asked to communicate "intended nationally-determined contributions" by March 2015, and ii) the Ad Hoc Working Group on the Durban Platform for Enhanced Action was requested to identify, before COP20 in Lima, the information that Parties will provide when putting forward their contributions. This Commentary by Noriko Fujiwara explores what the Warsaw decision means in practice and offers some preliminary ideas about what is still needed.