861 results for Multi Domain Information Model
Abstract:
Today, in the post-genomic era, cancer clinical trials involve the collaboration of many institutions. Multicentric, retrospective analysis requires advanced methods to guarantee semantic interoperability. In this scenario, the goal of the EURECA and INTEGRATE projects is to provide an infrastructure for sharing knowledge and data from post-genomic cancer clinical trials. Largely because of the complexity of the collaborative processes among the participating institutions, managing such heterogeneous information is a challenge in the medical domain. Semantic technologies and the related research focus on extracting knowledge from the gathered information, allowing greater flexibility and usability of the extracted data. Given the lack of standards adopted by these institutions and the complexity of clinical trial data, a semantic layer is essential to ensure the homogeneous integration of this information; otherwise, end users would need to know the data model and data format of every institution participating in each study. To provide a semantic interoperability layer, the first step is to propose a Common Data Model (CDM) that represents the information to be stored, and a Core Dataset that allows multiple terminologies to be used as a shared vocabulary. Once the Core Dataset and the CDM have been selected, mapping the concepts of a given terminology to the CDM requires a dedicated mechanism. This mechanism must define which concepts from the different vocabularies can be stored in which fields of the data model, so as to create a common representation of the information. This final degree project presents the development of a service that implements such a mechanism to bind elements of the medical terminologies SNOMED CT, LOINC and HGNC to objects of the Health Level 7 Reference Information Model (HL7 RIM). The proposed service, named TermBinding, follows the recommendations of the TermInfo project of the HL7 group, while also addressing important issues that arise when linking the aforementioned terminologies to the chosen data model. In this effort to achieve semantic interoperability in cancer clinical trials, data from heterogeneous sources have to be integrated, and a homogeneous access interface to all this information must be provided. To unify data coming from different applications and databases, it is essential to represent all these data in a canonical, normalized form. Normalizing a given SNOMED CT concept simplifies the application of the HL7 TermInfo recommendations used to store each concept in the data model. Following this approach, semantic interoperability is successfully achieved for SNOMED CT concepts, whether pre- or post-coordinated, as well as for the LOINC and HGNC terminologies. Concepts are standardized into a normal form that can be used to bind the data to the Common Data Model based on the HL7 RIM. Although there are limitations due to the great heterogeneity of the data to be integrated, a first prototype of the proposed service is already being used successfully in the context of the EURECA and INTEGRATE projects.
Improving the semantic interoperability of cancer clinical trial data ultimately aims to improve practice in oncology.
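As a minimal sketch of the terminology-binding idea described above (the class, field and rule names are hypothetical and do not reflect TermBinding's actual API; the code system OIDs are shown indicatively), a dictionary-driven binder might look like this:

```python
# Illustrative sketch only: a toy terminology-binding table in the spirit of the
# TermBinding service described above. Names are hypothetical, not the project's API.

from dataclasses import dataclass

@dataclass
class RimObservation:
    """Minimal stand-in for an HL7 RIM Observation: a coded value."""
    code_system: str
    code: str
    display: str

# Hypothetical binding rules: the source terminology decides which code system
# identifier is recorded in the canonical representation (OIDs shown indicatively).
BINDING_RULES = {
    "SNOMED CT": "2.16.840.1.113883.6.96",
    "LOINC":     "2.16.840.1.113883.6.1",
    "HGNC":      "2.16.840.1.113883.6.281",
}

def bind_concept(terminology: str, code: str, display: str) -> RimObservation:
    """Normalize a concept into a canonical RIM-style representation."""
    try:
        oid = BINDING_RULES[terminology]
    except KeyError:
        raise ValueError(f"No binding rule for terminology {terminology!r}")
    return RimObservation(code_system=oid, code=code, display=display)

if __name__ == "__main__":
    # Example concept (illustrative code/display pair)
    print(bind_concept("SNOMED CT", "254837009", "Malignant neoplasm of breast"))
```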
Abstract:
Fluid flow and fabric compaction during vacuum assisted resin infusion (VARI) of composite materials were simulated using a level set-based approach. Fluid infusion through the fiber preform was modeled using Darcy's equations for flow through a porous medium. The stress partition between the fluid and the fiber bed was included by means of Terzaghi's effective stress theory. Tracking of the fluid front during infusion was introduced by means of the level set method. The resulting partial differential equations for the fluid infusion and the evolution of the flow front were discretized and solved approximately using the finite difference method on a uniform grid discretization of the spatial domain. The model was validated against uniaxial VARI experiments through a [0]8 E-glass plain woven preform, with the physical parameters of the model measured independently. The numerical predictions (in terms of the fabric thickness, pressure and fluid front evolution during filling) were in good agreement with the experimental results, showing the potential of the level set method to simulate resin infusion.
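For reference, the governing relations the abstract refers to can be written in simplified form as follows (notation chosen for this sketch rather than taken from the paper):

```latex
% Simplified governing equations (notation chosen for this sketch)
\begin{align}
  \mathbf{u} &= -\frac{\mathbf{K}}{\mu}\,\nabla p
    && \text{(Darcy flow through the preform)}\\
  \sigma &= \sigma' + p
    && \text{(Terzaghi: total stress = effective stress + fluid pressure)}\\
  \frac{\partial \phi}{\partial t} + \mathbf{u}\cdot\nabla\phi &= 0
    && \text{(level set advection; the flow front is the } \phi = 0 \text{ contour)}
\end{align}
```

Here K is the preform permeability, μ the resin viscosity, p the fluid pressure, σ and σ' the total and effective compaction stresses, and φ the level set function.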
Abstract:
The choice of locations for the siting of industrial activities is a complex problem in which, alongside cost and efficiency criteria, new criteria have been added concerning environmental impact and the company's image as reflected in its Corporate Social Responsibility. Environmental criteria have gained great weight in the final decision and, thanks to the obligation to submit projects to environmental assessment, have become key elements in that decision. It is therefore relatively common for developers to consult the Administration about the feasibility of their projects before starting a lengthy administrative procedure. This paper proposes the use of sustainability indicators and their application, through a multi-criteria decision model, to rank the candidate locations initially considered, so that they become a screening and decision-support tool. To show its usefulness, a support tool based on the PROMETHEE methodology is proposed and applied to the ranking of five alternative sites for a cement plant in the Community of Madrid according to sustainability criteria.
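As a rough illustration of the kind of outranking computation PROMETHEE performs, the sketch below implements a generic PROMETHEE II net-flow ranking; the criteria, weights and site scores are invented for the example and are not the paper's data:

```python
# Generic PROMETHEE II sketch: ranks alternatives by net outranking flow.
# Weights, criteria and scores below are made up for illustration.

def usual_preference(d: float) -> float:
    """'Usual' preference function: strict preference as soon as d > 0."""
    return 1.0 if d > 0 else 0.0

def promethee_ii(scores, weights):
    """scores[i][k]: value of alternative i on criterion k (higher is better)."""
    n = len(scores)
    # Aggregated preference index pi(a, b)
    pi = [[0.0] * n for _ in range(n)]
    for a in range(n):
        for b in range(n):
            if a == b:
                continue
            pi[a][b] = sum(w * usual_preference(scores[a][k] - scores[b][k])
                           for k, w in enumerate(weights))
    # Positive, negative and net flows
    phi_plus = [sum(pi[a]) / (n - 1) for a in range(n)]
    phi_minus = [sum(pi[b][a] for b in range(n)) / (n - 1) for a in range(n)]
    return [p - m for p, m in zip(phi_plus, phi_minus)]

if __name__ == "__main__":
    # Three hypothetical sites scored on three sustainability indicators
    scores = [[0.7, 0.4, 0.9],
              [0.6, 0.8, 0.5],
              [0.3, 0.9, 0.6]]
    weights = [0.5, 0.3, 0.2]  # must sum to 1
    net_flows = promethee_ii(scores, weights)
    ranking = sorted(range(len(scores)), key=lambda i: net_flows[i], reverse=True)
    print("net flows:", net_flows)
    print("ranking (best first):", ranking)
```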
Abstract:
The cloud computing paradigm has gained popularity in both industry and academia. Public cloud infrastructures are enabling new business models and helping to reduce costs. However, the desire to host a company's data and services on premises, and the need to abide by data protection laws, make private cloud infrastructures desirable, whether to complement public offerings or to replace them entirely. Unfortunately, a lack of standardization has prevented private infrastructure management solutions from maturing adequately, and the myriad of available options has induced a fear of technology lock-in among customers. One cause of this problem is the misalignment between academic research and industry offerings: the former focuses on idealized scenarios far removed from real-world situations, while the latter develops solutions without considering how they fit with common standards, or without disseminating their results at all. To address this problem, I propose a modular management system for private cloud infrastructures that focuses on the applications instead of just the hardware resources. This management system follows the autonomic computing paradigm and is designed around a simple information model developed to be compatible with common standards. The model splits the environment into two views that separate the concerns of the stakeholders while still enabling traceability between the physical environment and the virtual machines deployed onto it. In this model, cloud applications are classified into three broad types (Services, Big Data Jobs and Instance Reservations), so that the management system can take advantage of each type's characteristics. The information model is paired with a set of atomic, reversible and independent management actions, which determine the operations that can be performed on the environment and are used to realize the cloud environment's scalability. I also describe a management engine that, starting from the environment's state and using the aforementioned set of actions, is in charge of resource placement. It is divided into two tiers: the Application Managers layer, concerned only with applications; and the Infrastructure Manager layer, responsible for the actual physical resources. This management engine follows a two-phase lifecycle to better model the behavior of a real infrastructure. The placement problem is tackled during one phase (consolidation) with an integer programming solver, and during the other (online) with a custom heuristic. Tests have shown that this combined approach outperforms other strategies. Finally, the management system is coupled with monitoring and actuator architectures: the former collects the necessary information from the environment, while the latter is modular in design and capable of interfacing with several technologies and offering several access interfaces.
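As a rough analogue of the online placement phase mentioned above (the thesis' actual heuristic is not described in this abstract), a first-fit-decreasing placement of the kind typically used for such a phase might look like this; the host capacities and VM demands are invented:

```python
# Toy online-placement heuristic (first-fit decreasing on memory demand).
# Generic sketch of the kind of heuristic an online phase might use; it is not
# the thesis' actual algorithm, and the capacities below are invented.

def place_first_fit_decreasing(vms, hosts):
    """vms: {name: mem_gb}; hosts: {name: free_mem_gb}. Returns {vm: host}."""
    placement = {}
    free = dict(hosts)
    # Place the largest VMs first to reduce fragmentation.
    for vm, demand in sorted(vms.items(), key=lambda kv: kv[1], reverse=True):
        for host, capacity in free.items():
            if capacity >= demand:
                placement[vm] = host
                free[host] = capacity - demand
                break
        else:
            raise RuntimeError(f"No host can accommodate {vm} ({demand} GB)")
    return placement

if __name__ == "__main__":
    vms = {"web-1": 4, "db-1": 16, "batch-1": 8}
    hosts = {"host-a": 16, "host-b": 24}
    print(place_first_fit_decreasing(vms, hosts))
```

The consolidation phase would instead solve the same assignment exactly, e.g. as an integer program, trading solve time for packing quality.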
Abstract:
Several languages have been proposed for describing networks of systems, either to help manage and simulate them or to deploy testbeds for testing purposes. However, none is specifically designed to describe honeynets, covering the specific characteristics, in terms of applications and tools included in the honeypot systems, that make up the honeynet. In this paper, the requirements for honeynet description are studied and a survey of existing description languages is presented, concluding that a CIM (Common Information Model) approach matches the basic requirements. Thus, a CIM-like, technology-independent honeynet description language (TIHDL) is proposed. The language is defined independently of the platform where the honeynet will later be deployed, and it can be translated, either using model-driven techniques or other translation mechanisms, into the description languages of honeynet deployment platforms and tools. This approach gives the flexibility to use a combination of heterogeneous deployment platforms. Besides, a flexible virtual honeynet generation tool (HoneyGen), based on the proposed approach and description language and capable of deploying honeynets over the VNX (Virtual Networks over LinuX) and Honeyd platforms, is presented for validation purposes.
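Purely as an illustration of the "platform-independent description, translated per backend" idea (TIHDL's real syntax and the VNX/Honeyd formats are not reproduced here; every name below is made up), a description held as plain data with a translator stub could be sketched like this:

```python
# Illustrative only: a tiny platform-independent honeynet description held as
# plain data, plus a translator stub for one hypothetical backend format.

honeynet = {
    "name": "demo-honeynet",
    "networks": [{"name": "lan0", "subnet": "10.0.0.0/24"}],
    "honeypots": [
        {"name": "hp-web", "network": "lan0", "services": ["http"], "interaction": "low"},
        {"name": "hp-ssh", "network": "lan0", "services": ["ssh"], "interaction": "high"},
    ],
}

def to_pseudo_config(desc: dict) -> str:
    """Render the honeypot part of the description into a pseudo-config
    (the network section is ignored by this stub)."""
    lines = []
    for hp in desc["honeypots"]:
        lines.append(f"define {hp['name']} interaction={hp['interaction']}")
        for svc in hp["services"]:
            lines.append(f"  # attach a {svc} emulation to {hp['name']} here")
    return "\n".join(lines)

if __name__ == "__main__":
    print(to_pseudo_config(honeynet))
```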
Abstract:
The activity of L-type Ca2+ channels is increased by dihydropyridine (DHP) agonists and inhibited by DHP antagonists, which are widely used in the therapy of cardiovascular disease. These drugs bind to the pore-forming α1 subunits of L-type Ca2+ channels. To define the minimal requirements for DHP binding and action, we constructed a high-affinity DHP receptor site by substituting a total of nine amino acid residues from DHP-sensitive L-type α1 subunits into the S5 and S6 transmembrane segments of domain III and the S6 transmembrane segment of domain IV of the DHP-insensitive P/Q-type α1A subunit. The resulting chimeric α1A/DHPS subunit bound DHP antagonists with high affinity in radioligand binding assays and was inhibited by DHP antagonists with high affinity in voltage clamp experiments. Substitution of these nine amino acid residues yielded 86% of the binding energy of the L-type α1C subunit and 92% of the binding energy of the L-type α1S subunit for the high-affinity DHP antagonist PN200–110. The activity of chimeric Ca2+ channels containing α1A/DHPS was increased 3.5 ± 0.7-fold by the DHP agonist (−)Bay K8644. The effect of this agonist was stereoselective as in L-type Ca2+ channels, since (+)Bay K8644 inhibited the activity of α1A/DHPS. The results show conclusively that DHP agonists and antagonists bind to a single receptor site at which they have opposite effects on Ca2+ channel activity. This site contains essential components from both domains III and IV, consistent with a domain interface model for binding and allosteric modulation of Ca2+ channel activity by DHPs.
Abstract:
DdLim, a multi-domain member of the cysteine-rich family of LIM domain proteins, was isolated from Dictyostelium cells where it localizes in lamellipodia and at sites of membrane ruffling. The transcription and expression of DdLim are developmentally regulated, and the timing of its increased association with the actin cytoskeleton coincides with the acquisition in starved cells of a motile, chemotactic behavior. Vegetative cells that overexpress DdLim contain large lamella and exhibit ruffling at the cortex. The high frequency of large, multinucleated mutant cells found in suspension culture suggests that excess DdLim interferes with cytokinesis. DdLim was also identified as a protein in a Dictyostelium cell lysate that associated indirectly, but in a guanosine triphosphate-dependent manner, with a GST-rac1 fusion protein. The data presented suggest that DdLim acts as an adapter protein at the cytoskeleton-membrane interface where it is involved in a receptor-mediated rac1-signaling pathway that leads to actin polymerization in lamellipodia and ultimately cell motility.
Abstract:
Molecular analysis of complex modular structures, such as promoter regions or multi-domain proteins, often requires the creation of families of experimental DNA constructs having altered composition, order, or spacing of individual modules. Generally, creation of every individual construct of such a family uses a specific combination of restriction sites. However, convenient sites are not always available and the alternatives, such as chemical resynthesis of the experimental constructs or engineering of different restriction sites onto the ends of DNA fragments, are costly and time consuming. A general cloning strategy (nucleic acid ordered assembly with directionality, NOMAD; WWW resource locator http://Lmb1.bios.uic.edu/NOMAD/NOMAD.html) is proposed that overcomes these limitations. Use of NOMAD ensures that the production of experimental constructs is no longer the rate-limiting step in applications that require combinatorial rearrangement of DNA fragments. NOMAD manipulates DNA fragments in the form of "modules" having a standardized cohesive end structure. Specially designed "assembly vectors" allow for sequential and directional insertion of any number of modules in an arbitrary predetermined order, using the ability of type IIS restriction enzymes to cut DNA outside of their recognition sequences. Studies of regulatory regions in DNA, such as promoters, replication origins, and RNA processing signals, construction of chimeric proteins, and creation of new cloning vehicles, are among the applications that will benefit from using NOMAD.
Abstract:
This work presents a multi-objective optimization model applied to the concept design of conventional (i.e., diesel-electric) submarines. A synthesis model is formulated that allows the estimation of weights, volume, speed, electrical load and other characteristics of interest for concept design. The synthesis model is integrated with a multi-objective optimization model based on genetic algorithms (specifically, the NSGA-II algorithm). The multi-objective optimization consists of maximizing the submarine's military effectiveness and minimizing its cost. Military effectiveness is represented by an Overall Measure of Effectiveness (OMOE) established through the Analytic Hierarchy Process (AHP). The submarine's Basic Construction Cost (BCC) is estimated from its weight groups. At the end of the optimization process, a Pareto frontier composed of non-dominated solutions is established. One of these solutions is selected for preliminary refinement and the results are discussed. In addition, this dissertation briefly discusses historical and operational aspects of submarines, as well as their design methodology. Some concepts of naval architecture applied to the design of these vessels are also addressed.
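To make the notion of the Pareto frontier concrete for the two objectives named above (maximize OMOE, minimize cost), the sketch below filters a set of candidate designs down to its non-dominated solutions; the candidate numbers are invented and are not results from the thesis:

```python
# Illustrative sketch: extracting the Pareto-non-dominated set for the two
# objectives named in the abstract (maximize effectiveness, minimize cost).
# The candidate designs below are invented numbers.

def dominates(a, b):
    """a, b = (effectiveness, cost). a dominates b if it is at least as good on
    both objectives and strictly better on at least one."""
    eff_a, cost_a = a
    eff_b, cost_b = b
    return (eff_a >= eff_b and cost_a <= cost_b) and (eff_a > eff_b or cost_a < cost_b)

def pareto_front(designs):
    return [d for d in designs
            if not any(dominates(other, d) for other in designs if other is not d)]

if __name__ == "__main__":
    # (OMOE, basic construction cost) for a handful of hypothetical designs
    designs = [(0.62, 410.0), (0.70, 455.0), (0.58, 430.0), (0.74, 520.0)]
    print(pareto_front(designs))
```

An algorithm such as NSGA-II repeatedly applies this kind of non-domination ranking, together with genetic variation, to evolve the whole frontier rather than a single compromise solution.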
Abstract:
The consumer market has undergone several transformations over time, mainly due to technological evolution. Technological evolution has given consumers the ability to choose among products and brands, and the opportunity to collaborate and influence the opinion of other consumers by sharing experiences, mainly through digital platforms. CRM (customer relationship management) is the approach companies use to get to know the consumer and build a satisfactory relationship between company and consumer. This relationship aims to satisfy and retain consumers, preventing them from abandoning the brand and from negatively influencing other consumers. e-CRM is the electronic management of the customer relationship; it has all the traditional characteristics of CRM, augmented by the digital environment. The digital environment has shortened the distance between people and companies and has become a collaborative, low-cost channel for interacting with consumers. On the other hand, it is a channel in which consumers cease to be passive and become active, able to influence not only a small group of friends but an entire network of consumers. Digital analytics is the measurement, collection, analysis and reporting of digital data for the purposes of understanding and optimizing business performance. The use of digital data supports the development of e-CRM by providing an understanding of consumer behavior in an environment where the consumer is active. The digital environment allows more detailed knowledge of consumers, based not only on purchasing habits but also on interests and interactions. The main objective of this study is to understand how companies apply e-CRM concepts in their business strategies, how digital analytics contributes to the development of e-CRM, and how the critical success factors (human, technological and strategic) affect the implementation and development of e-CRM. Four companies from different segments were studied through case studies. Companies increasingly seek to exploit e-CRM strategies in the digital environment, but limitations were identified in the capture, storage and analysis of multichannel information, especially for digital channels. Other factors, such as top management support and employees' understanding of strategies focused on the individual consumer, were also identified in this study. The study identified the information most relevant to generating electronic customer relationship management strategies, as well as the most relevant aspects of the critical success factors.
Abstract:
This paper describes a CL-SR system that employs two different techniques: the first is based on NLP rules that consist of applying logic forms to topic processing, while the second consists of applying the IR-n statistical search engine to the spoken document collection. Applying logic forms to the topics makes it possible to increase the weight of topic terms according to a set of syntactic rules. The weights of the topic terms are then used by the IR-n system in the information retrieval process.
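The sketch below illustrates the general idea of rule-based topic-term weighting feeding a retrieval score; the rules, weights and example documents are invented and do not reproduce the paper's logic forms or the IR-n scoring formula:

```python
# Illustrative sketch: syntactic/logic-form rules boost the weight of some topic
# terms, and those weights feed a simple weighted term-frequency score.

from collections import Counter

def weight_topic_terms(topic_terms, head_terms):
    """Give a higher weight to terms the (hypothetical) logic-form analysis
    marked as predicate heads, and a base weight to the rest."""
    return {t: (2.0 if t in head_terms else 1.0) for t in topic_terms}

def score(document_tokens, term_weights):
    """Weighted term-frequency score of one spoken document transcript."""
    tf = Counter(document_tokens)
    return sum(w * tf[t] for t, w in term_weights.items())

if __name__ == "__main__":
    topic = ["survivor", "testimony", "warsaw", "ghetto"]
    heads = {"testimony", "ghetto"}  # terms promoted by the (hypothetical) rules
    weights = weight_topic_terms(topic, heads)
    doc = "testimony of a survivor recorded in the warsaw ghetto".split()
    print(score(doc, weights))
```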
Abstract:
This paper analyses the consequences of enhanced biofuel production in regions and countries of the world that have announced plans to implement or expand on biofuel policies. The analysis considers biofuel policies implemented as binding blending targets for transportation fuels. The chosen quantitative modelling approach is two-fold: it combines the analysis of biofuel policies in a multi-sectoral economic model (MAGNET) with systematic variation of the functioning of capital and labour markets. This paper adds to existing research by considering biofuel policies in the EU, the US and various other countries with considerable agricultural production and trade, such as Brazil, India and China. Moreover, applying a multi-sectoral modelling system with different assumptions on the mobility of factor markets allows for the observation of changes in economic indicators under different conditions of how factor markets work. Systematic variation of factor mobility indicates that the 'burden' of global biofuel policies is not equally distributed across different factors within agricultural production. Agricultural land, as the predominant and sector-specific factor, is, regardless of different degrees of inter-sectoral or intra-sectoral factor mobility, the most important factor limiting the expansion of agricultural production. More capital and higher employment in agriculture will ease the pressure on additional land use, but only partly. Expanding agricultural production at the global scale requires both land and mobile factors adapted to increase total factor productivity in agriculture in the most efficient way.
Abstract:
A detailed rock magnetic investigation has been carried out on Deep Sea Drilling Project (DSDP) pelagic sediments from the Central Equatorial Pacific. This comprises hysteresis and thermomagnetic measurements, the Lowrie-Fuller test and, for the first time, ferromagnetic resonance (FMR). Nearly stoichiometric magnetite in two grain size fractions, single domain (SD) and multi domain (MD), has been deduced to be the carrier of magnetic remanence. Comparatively strong paramagnetic contributions are carried by pyrite, which was identified by X-ray analysis. The statistical analysis of paleomagnetic parameters (NRM, MDF, initial susceptibility, Königsberger ratio Q) from a large number (> 1000) of samples, supported by hysteresis measurements, indicates a latitude- and sedimentation-rate-dependent ratio of SD/MD grains. Possible sources for the magnetic constituents are discussed in terms of bacterial, volcanic, meteoritic and authigenic origin.
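For readers unfamiliar with the last of those parameters, the Königsberger ratio is conventionally defined as (textbook form, not quoted from the paper):

```latex
% Standard definition of the Koenigsberger ratio
Q_n = \frac{J_{\mathrm{NRM}}}{\kappa\,H}
```

where J_NRM is the natural remanent magnetization, κ the initial magnetic susceptibility and H the local geomagnetic field, so that Q_n > 1 indicates remanent magnetization dominating over induced magnetization.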
Abstract:
Thesis (Master's)--University of Washington, 2016-06
Abstract:
The Scottish Executive has adopted a policy to combat Scotland's declining population by encouraging inward migration. Using a multi-state population model, this paper presents nine long-term population scenarios for Scotland, combining three net international migration levels with three fertility paths. The results show that inward migration can slow population decline but makes little difference to population ageing. Without a higher fertility rate, Scotland's population will become demographically unsustainable. Our simulations show that a higher fertility rate substantially reduces future ageing.
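The toy projection below illustrates the mechanics behind scenarios of this kind: population evolves under a crude birth rate, a crude death rate and a fixed level of net migration. All rates and the starting population are invented; this is not the paper's multi-state model or its Scottish data:

```python
# Toy single-region projection: invented rates, purely to illustrate how
# fertility and net migration assumptions shape a long-term scenario.

def project(pop0, birth_rate, death_rate, net_migration, years):
    pop = pop0
    trajectory = [pop]
    for _ in range(years):
        births = birth_rate * pop
        deaths = death_rate * pop
        pop = pop + births - deaths + net_migration
        trajectory.append(pop)
    return trajectory

if __name__ == "__main__":
    # Compare a low- and a higher-fertility path under the same migration level
    low = project(pop0=5_000_000, birth_rate=0.009, death_rate=0.011,
                  net_migration=10_000, years=50)
    high = project(pop0=5_000_000, birth_rate=0.012, death_rate=0.011,
                   net_migration=10_000, years=50)
    print(f"after 50 years: low fertility {low[-1]:,.0f}, higher fertility {high[-1]:,.0f}")
```

A multi-state model additionally disaggregates the population by age (and other states), which is what allows ageing, and not just total size, to be compared across scenarios.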