39 results for Network (Re) Organization
at Universidade do Minho
Abstract:
Co-cultures of two or more cell types and biodegradable biomaterials of natural origin have been successfully combined to recreate tissue microenvironments. Segregated co-cultures are preferred over conventional mixed ones in order to better control the degree of homotypic and heterotypic interactions. Hydrogel-based systems in particular have gained much attention for mimicking tissue-specific microenvironments, and they can be microengineered by innovative bottom-up approaches such as microfluidics. In this study, we developed bi-compartmentalized (Janus) hydrogel microcapsules of methacrylated hyaluronic acid (MeHA)/methacrylated chitosan (MeCht) blended with marine-origin collagen by droplet-based microfluidic co-flow. Human adipose stem cells (hASCs) and human microvascular endothelial cells (hMVECs) were co-encapsulated to create study platforms relevant for vascularized bone tissue engineering. A specially designed Janus-droplet generator chip was used to fabricate the microcapsules (<250 μm units), and Janus-gradient co-cultures of hASCs:hMVECs were generated in various ratios (90:10; 75:25; 50:50; 25:75; 10:90) through an automated microfluidic flow controller (Elveflow microfluidics system). Such monodisperse 3D co-culture systems were optimized regarding cell number and culture media specific for the concomitant maintenance of both phenotypes, in order to establish effective cell-cell (homotypic and heterotypic) and cell-material interactions. Cellular parameters such as viability, matrix deposition, mineralization and hMVEC re-organization into tube-like structures were enhanced by blending MeHA/MeCht with marine-origin collagen, and the hASCs:hMVECs co-culture gradient had a significant impact on them. Such Janus hybrid hydrogel microcapsules can be used as a platform to investigate biomaterial interactions with distinct combined cell populations.
Abstract:
The bond behavior between Fiber Reinforced Polymers (FRPs) and masonry substrates has been the subject of many studies in recent years. Recent accelerated aging tests have shown that bond degradation and FRP delamination are likely to occur in FRP-strengthened masonry components under hygrothermal conditions. While an investigation of possible methods to improve the durability of these systems is necessary, the applicability of different bond repair methods should also be studied. This paper aims to investigate the debonding mechanisms after repairing delaminated FRP-strengthened masonry components. FRP-strengthened brick specimens, after being delaminated, are repaired with two different adhesives: a conventional epoxy resin and a highly flexible polymer. The latter is used as an innovative adhesive in structural applications. The bond behavior in the repaired specimens is investigated by performing single-lap shear bond tests. Digital image correlation (DIC) is used for deeper investigation of the surface deformation and strain development. The effectiveness of the repair methods is discussed and compared with that of the strengthened specimens.
Abstract:
A one-step melt-mixing method is proposed to study dispersion and re-agglomeration phenomena of the as-received and functionalized graphite nanoplates in polypropylene melts. Graphite nanoplates were chemically modified via 1,3-dipolar cycloaddition of an azomethine ylide and then grafted with polypropylene-graft-maleic anhydride. The effect of surface functionalization on the dispersion kinetics, nanoparticle re-agglomeration and interface bonding with the polymer is investigated. Nanocomposites with 2 or 10 wt% of as-received and functionalized graphite nanoplates were prepared in a small-scale prototype mixer coupled to a capillary rheometer. Samples were collected along the flow axis and characterized by optical microscopy, scanning electron microscopy and electrical conductivity measurements. The as-received graphite nanoplates tend to re-agglomerate upon stress relaxation of the polymer melt. The covalent attachment of a polymer to the nanoparticle surface enhances the stability of dispersion, delaying the re-agglomeration. Surface modification also improves interfacial interactions and the resulting composites presented improved electrical conductivity.
Abstract:
The kinetics of GnP dispersion in polypropylene melt was studied using a prototype small-scale modular extensional mixer. Its modular nature enabled the sequential application of a mixing step, melt relaxation, and a second mixing step. The latter could reproduce the flow conditions of the first mixing step, or generate milder flow conditions. The effect of these sequences of flow constraints upon GnP dispersion along the mixer length was studied for composites with 2 and 10 wt.% GnP. The samples collected along the first mixing zone showed a gradual decrease in the number and size of GnP agglomerates, at a rate that was independent of the flow conditions imposed on the melt, but dependent on composition. The relaxation zone induced GnP re-agglomeration, and the application of a second mixing step caused variable dispersion results that were largely dependent on the hydrodynamic stresses generated.
Abstract:
Positioning technologies are becoming ubiquitous and are being used more and more frequently for supporting a large variety of applications. For outdoor applications, global navigation satellite systems (GNSSs), such as the global positioning system (GPS), are the most common and popular choice because of their wide coverage. GPS is also augmented with network-based systems that exploit existing wireless and mobile networks for providing positioning functions where GPS is not available or to save energy in battery-powered devices. Indoors, GNSSs are not a viable solution, but many applications require very accurate, fast, and flexible positioning, tracking, and navigation functions. These and other requirements have stimulated research activities, in both industry and academia, where a variety of fundamental principles, techniques, and sensors are being integrated to provide positioning functions to many applications. The large majority of positioning technologies is for indoor environments, and most of the existing commercial products have been developed for use in office buildings, airports, shopping malls, factory plants, and similar spaces. There are, however, other spaces where positioning, tracking, and navigation systems play a central role in safety and in rescue operations, as well as in supporting specific activities or scientific research activities in other fields. Among those spaces are underground tunnels, mines, and even underwater wells and caves. This chapter describes the research efforts over the past few years that have been put into the development of positioning systems for underground tunnels, with particular emphasis on the case of the Large Hadron Collider (LHC) at CERN (the European Organization for Nuclear Research), where localization aims at enabling more automatic and unmanned radiation surveys.
Examples of positioning and localization systems that have been developed in the past few years for underground facilities are presented in the following section, together with a brief characterization of those spaces’ special conditions and the requirements of some of the most common applications. Section 5.2 provides a short overview of some of the most representative research efforts that are currently being carried out by many research teams around the world. In addition, some of the fundamental principles and techniques are identified, such as the use of leaky coaxial cables, as used at the LHC. In Section 5.3, we introduce the specific environment of the LHC and define the positioning requirements for the envisaged application. This is followed by a detailed description of our approach and the results that have been achieved so far. Some last comments and remarks are presented in a final section.
Abstract:
Nowadays, many P2P applications proliferate in the Internet. The attractiveness of many of these systems relies on the collaborative approach used to exchange large resources without the dependence and associated constraints of centralized approaches, where a single server is responsible for handling all the requests from the clients. As a consequence, some P2P systems are also interesting and cost-effective approaches to be adopted by content providers and other Internet players. However, there are several coexistence problems between P2P applications and Internet Service Providers (ISPs) due to the unforeseeable behavior of P2P traffic aggregates in ISP infrastructures. In this context, this work proposes a collaborative P2P/ISP system able to underpin the development of novel Traffic Engineering (TE) mechanisms, contributing to a better coexistence between P2P applications and ISPs. Using the devised system, two TE methods are described that are able to estimate and control the impact of P2P traffic aggregates on the ISP network links. One of the TE methods allows ISP administrators to foresee the expected impact that a given P2P swarm will have on the underlying network infrastructure. The other TE method enables the definition of ISP-friendly P2P topologies, where specific network links are protected from P2P traffic. As a result, the proposed system and associated mechanisms will contribute to improved ISP resource management tasks and foster the deployment of innovative ISP-friendly systems.
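The core idea behind estimating a swarm's impact on ISP links — projecting peer-to-peer flows onto the underlying topology — can be illustrated with a minimal sketch. Everything here is a hypothetical simplification, not the system described in the abstract: the toy topology, the shortest-path routing assumption, and the uniform per-pair flow are all illustrative choices.

```python
from collections import deque
from itertools import combinations

def shortest_path(adj, src, dst):
    """BFS shortest path over an undirected ISP topology (hop count)."""
    prev = {src: None}
    q = deque([src])
    while q:
        u = q.popleft()
        if u == dst:
            break
        for v in adj[u]:
            if v not in prev:
                prev[v] = u
                q.append(v)
    path, node = [], dst
    while node is not None:
        path.append(node)
        node = prev[node]
    return list(reversed(path))

def swarm_link_impact(adj, peer_nodes, flow_per_pair=1.0):
    """Accumulate per-link load, assuming each peer pair exchanges
    flow_per_pair units over its shortest path -- a crude stand-in
    for real swarm traffic patterns."""
    load = {}
    for a, b in combinations(peer_nodes, 2):
        path = shortest_path(adj, a, b)
        for u, v in zip(path, path[1:]):
            link = tuple(sorted((u, v)))
            load[link] = load.get(link, 0.0) + flow_per_pair
    return load

# Toy ISP topology: five routers, swarm peers attached at A, D and E.
adj = {"A": ["B"], "B": ["A", "C"], "C": ["B", "D", "E"],
       "D": ["C"], "E": ["C"]}
print(swarm_link_impact(adj, ["A", "D", "E"]))
```

A real system would of course use the ISP's actual routing state and measured swarm demand, but even this sketch shows how protecting a specific link reduces to steering peer pairings away from paths that traverse it.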
Abstract:
This paper presents an automated optimization framework able to provide network administrators with resilient routing configurations for link-state protocols, such as OSPF or IS-IS. In order to deal with the formulated NP-hard optimization problems, the devised framework is underpinned by computational intelligence optimization engines, such as Multi-objective Evolutionary Algorithms (MOEAs). With the objective of demonstrating the framework's capabilities, two illustrative Traffic Engineering methods are described, allowing routing configurations to be attained that are robust to changes in the traffic demands and that keep the network stable even in the presence of link failure events. The presented illustrative results clearly corroborate the usefulness of the proposed automated framework along with the devised optimization methods.
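The mechanics of the MOEA engines such a framework relies on can be sketched generically. In the sketch below the two objective functions are toy stand-ins (one could think of them as congestion under normal demands versus congestion after a link failure); the population size, mutation scheme and selection are illustrative assumptions, not the paper's implementation.

```python
import random

def dominates(a, b):
    """Pareto dominance for minimization: a is no worse in every
    objective and strictly better in at least one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def evaluate(w):
    # Toy stand-ins for two conflicting routing objectives.
    f1 = sum(x * x for x in w)
    f2 = sum((x - 2.0) ** 2 for x in w)
    return (f1, f2)

def moea(dim=4, pop_size=20, generations=50, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(0.0, 3.0) for _ in range(dim)]
           for _ in range(pop_size)]
    for _ in range(generations):
        # Mutate each individual with a small Gaussian perturbation ...
        children = [[max(0.0, x + rng.gauss(0.0, 0.2)) for x in ind]
                    for ind in pop]
        union = pop + children
        scored = [(evaluate(ind), ind) for ind in union]
        # ... then keep the non-dominated front, topped up at random.
        front = [ind for f, ind in scored
                 if not any(dominates(g, f) for g, _ in scored)]
        while len(front) < pop_size:
            front.append(rng.choice(union))
        pop = front[:pop_size]
    return sorted(evaluate(ind) for ind in pop)

pareto = moea()
# The result approximates a trade-off front: individuals cheap on one
# objective are expensive on the other.
```

In the routing setting, each individual would encode a vector of OSPF/IS-IS link weights and each evaluation would simulate shortest-path routing of the demand matrix — which is where the NP-hardness mentioned in the abstract comes from.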
Abstract:
Master's dissertation in Systems Engineering
Abstract:
PhD Thesis in Bioengineering
Abstract:
The MAP-i Doctoral Programme in Informatics, of the Universities of Minho, Aveiro and Porto
Abstract:
The MAP-i Doctoral Program of the Universities of Minho, Aveiro and Porto
Abstract:
PhD Thesis in Information Systems and Technologies.
Abstract:
Schizophrenia stands for a long-lasting state of mental uncertainty that may bring to an end the relation among behavior, thought, and emotion; that is, it may lead to unreliable perception, unsuitable actions and feelings, and a sense of mental fragmentation. Indeed, its diagnosis is made over a long period of time; continuous signs of the disturbance persist for at least 6 (six) months. Once detected, the psychiatric diagnosis is made through a clinical interview and a series of psychological tests, addressed mainly to rule out other mental states or diseases. Undeniably, the main problem in identifying schizophrenia is the difficulty of distinguishing its symptoms from those associated with other disorders or conditions. Therefore, this work will focus on the development of a diagnostic support system, in terms of its knowledge representation and reasoning procedures, based on a blend of Logic Programming and Artificial Neural Network approaches to computing, taking advantage of a novel approach to knowledge representation and reasoning that aims to solve the problems associated with handling (i.e., representing and reasoning about) defective information.
Abstract:
Thrombotic disorders have severe consequences for patients and for society in general, being one of the main causes of death. These facts reveal that it is extremely important to be preventive, i.e., to be aware of how probable it is to develop that kind of syndrome. Indeed, this work will focus on the development of a decision support system that will cater for an individual risk evaluation with respect to the onset of thrombotic complaints. The Knowledge Representation and Reasoning procedures used will be based on an extension to the Logic Programming language, allowing the handling of incomplete and/or default data. The computational framework in place will be centered on Artificial Neural Networks.
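The combination described above — a numeric risk estimate that still copes with incomplete or default data — can be illustrated with a minimal sketch. The risk factors, weights and default degree of truth below are entirely hypothetical, and a single hand-weighted logistic unit is only a crude stand-in for the trained Artificial Neural Networks the abstract refers to.

```python
import math

def risk_score(factors, weights, bias=-2.0, default=0.5):
    """Logistic risk estimate over binary risk factors; an unknown
    factor (None or absent) falls back to a default degree of truth,
    a crude stand-in for incomplete/default-data handling."""
    z = bias + sum(w * (default if factors.get(name) is None
                        else factors[name])
                   for name, w in weights.items())
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical factors and weights, for illustration only.
weights = {"smoker": 1.2, "immobilized": 1.5, "prior_thrombosis": 2.0}
patient = {"smoker": 1, "immobilized": None, "prior_thrombosis": 0}
print(round(risk_score(patient, weights), 3))
```

The point of the sketch is the fallback for `None` values: an unknown factor contributes its default degree of truth instead of aborting the evaluation, so a risk estimate is always produced even from an incomplete record.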
Abstract:
The concept of quality of life first appeared in 1920, through the English economist Arthur Cecil Pigou, who used the term to describe the impact of government on the lives of the most disadvantaged people. With the onset of the industrial era and the end of the Second World War, society changed paradigm and began a relentless search for ways to improve its quality of life. This concept developed alongside the concepts of education, health, housing, transport, work and leisure, as well as indicators such as increased life expectancy and reduced infant mortality and pollution levels. Technological progress played a fundamental role in the evolution of these concepts, as did Design in the search for solutions for applying those same technologies. In the specific case of the textile industry, the trend is the development of smart textiles, involving electronic engineering in their conceptualization and manufacturing processes. So-called wearable technology opens new horizons for the creation of innovative solutions, opening new market niches with high added value. There are currently several products on the market whose functionality and usefulness have given them an unchanging status over the years, where evolution has not followed the current trend. That is the case of narrow fabrics, whose functionality could acquire new capabilities and be used in different textile components in the most varied areas. These capabilities could be added by incorporating luminous materials (LEDs and L-Wire) into their structures. In this study, the design of products with new functionalities was carried out, adapting the technologies developed so far into new solutions and/or new product recreations.