18 results for Co-Coercive Mapping

at Instituto Politécnico do Porto, Portugal


Relevance: 30.00%

Abstract:

The development of scaffolds that combine the delivery of drugs with the physical support provided by electrospun fibres holds great potential in the field of nerve regeneration. Here, the incorporation of ibuprofen, a well-known non-steroidal anti-inflammatory drug, into electrospun fibres of the statistical copolymer poly(trimethylene carbonate-co-ε-caprolactone) [P(TMC-CL)] is proposed as a drug delivery system to enhance axonal regeneration in the context of a spinal cord lesion by limiting the inflammatory response. P(TMC-CL) fibres were electrospun from mixtures of dichloromethane (DCM) and dimethylformamide (DMF). The solvent mixture influenced fibre morphology as well as mean fibre diameter, which decreased as the DMF content in solution increased. Ibuprofen-loaded fibres were prepared from P(TMC-CL) solutions containing 5% ibuprofen (w/w of polymer). Increasing the drug content to 10% led to jet instability, resulting in a less homogeneous fibrous mesh. Under the optimized conditions, drug-loading efficiency was above 80%. Confocal Raman mapping showed no preferential distribution of ibuprofen in the P(TMC-CL) fibres. Under physiological conditions, ibuprofen was released within 24 h. The release process was diffusion-dependent for fibres prepared from DCM solutions, in contrast to fibres prepared from DCM–DMF mixtures, where burst release occurred. The biological activity of the released drug was demonstrated using human-derived macrophages: the release of prostaglandin E2 into the cell culture medium was reduced when cells were incubated with ibuprofen-loaded P(TMC-CL) fibres, confirming the biological significance of the drug delivery strategy presented. Overall, this study constitutes an important contribution to the design of a P(TMC-CL)-based nerve conduit with anti-inflammatory properties.
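A diffusion-controlled release profile of the kind described above is commonly approximated by the Higuchi model, in which cumulative release scales with the square root of time. The sketch below is purely illustrative — the rate constant and time points are hypothetical, not values reported in the study:

```python
import math

def higuchi_release(t_hours, k=0.2):
    """Cumulative fraction released at time t under the Higuchi
    (diffusion-controlled) model: f(t) = k * sqrt(t), capped at 1."""
    return min(k * math.sqrt(t_hours), 1.0)

# Illustrative release profile over 24 h (hypothetical k = 0.2 h^-0.5)
for t in (1, 4, 9, 16, 24):
    print(f"t = {t:2d} h -> fraction released = {higuchi_release(t):.3f}")
```

Fitting such a model to measured release data (versus observing an initial burst) is one way to distinguish the diffusion-dependent behaviour of the DCM fibres from the burst release of the DCM–DMF fibres.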

Relevance: 20.00%

Abstract:

The main purpose of this study was to examine the applicability of geostatistical modelling to obtain valuable information for assessing the environmental impact of sewage outfall discharges. The data set used was obtained in a monitoring campaign at the S. Jacinto outfall, located off the Portuguese west coast near the Aveiro region, using an AUV. Matheron's classical estimator was used to compute the experimental semivariogram, which was fitted to three theoretical models: spherical, exponential and Gaussian. A cross-validation procedure suggested the best semivariogram model, and ordinary kriging was used to obtain predictions of salinity at unknown locations. The generated map clearly shows the plume dispersion in the studied area, indicating that the effluent does not reach the nearby beaches. Our study suggests that an optimal design of the AUV sampling trajectory, from a geostatistical prediction point of view, can help to compute more precise predictions and hence to quantify dilution more accurately. Moreover, since accurate measurements of plume dilution are rare, such studies might be very helpful in the future for the validation of dispersion models.
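The semivariogram-plus-kriging pipeline described above can be sketched compactly. This is a minimal numpy implementation of ordinary kriging with a spherical semivariogram model; the sample coordinates and salinity values are hypothetical, and the model parameters (nugget, sill, range) are placeholders rather than the fitted values from the campaign:

```python
import numpy as np

def spherical(h, nugget=0.0, sill=1.0, rng=500.0):
    """Spherical semivariogram model gamma(h); gamma(0) = 0."""
    h = np.asarray(h, dtype=float)
    g = np.where(h < rng,
                 nugget + (sill - nugget) * (1.5 * h / rng - 0.5 * (h / rng) ** 3),
                 sill)
    return np.where(h == 0, 0.0, g)

def ordinary_kriging(coords, values, target, **model):
    """Ordinary kriging prediction at a single target location."""
    n = len(values)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))          # kriging system with Lagrange row
    A[:n, :n] = spherical(d, **model)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = spherical(np.linalg.norm(coords - target, axis=1), **model)
    w = np.linalg.solve(A, b)[:n]        # weights (constrained to sum to 1)
    return float(w @ values)

# Hypothetical salinity samples (x, y in metres; salinity in PSU)
coords = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
sal = np.array([35.0, 34.2, 34.8, 33.9])
print(ordinary_kriging(coords, sal, np.array([50.0, 50.0])))
```

Evaluating the prediction over a grid of target locations would yield a salinity map of the kind used in the study to delineate the plume.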

Relevance: 20.00%

Abstract:

Purpose – Economics and business have evolved as sciences in order to accommodate more 'real world' solutions to the problems they approach. In many cases, both business and economics have drawn on other disciplines in order to obtain a more complete framework for the study of complex issues. The aim of this paper is to explore the contribution of three heterodox economics disciplines to the knowledge of business co-operation. Design/methodology/approach – The approach is theoretical, and it shows that many relevant aspects of business co-operation have been proposed by economic geography, institutional economics and economic sociology. Findings – This paper highlights the business mechanisms of co-operation, reflecting on the role of places, institutions and the social context in which businesses operate. Research implications – It contributes a theoretical framework for the explanation of business co-operation and networks that goes beyond traditional economics theories. Originality/value – This paper contributes a framework for the study of business co-operation from both an economics and a management perspective. This framework embodies a number of non-quantitative issues that are critical for understanding the complex networks in which firms operate.

Relevance: 20.00%

Abstract:

The constant and systematic rise in fossil fuel prices and ongoing environmental concerns have driven the search for environmentally sustainable solutions. Biodiesel emerges as an alternative to this problem, as well as a destination for liquid and fatty residues produced by human activity. Biodiesel production has received extensive attention in recent years, as it is a biodegradable, non-polluting fuel. Production by transesterification using short-chain alcohols and chemical catalysts, namely alkaline ones, has been accepted industrially due to its high conversion. Recently, enzymatic transesterification has been gaining supporters; however, the cost of the enzyme remains a barrier to its large-scale application. The present work addresses the production of biodiesel by enzymatic transesterification from residual vegetable oil. The alcohol used was ethanol, replacing the methanol conventionally used in homogeneous catalysis, since the enzyme's activity is inhibited by the latter. The main difficulties of ethanolysis lie in the separation of the phases (glycerol and biodiesel) after the reaction, as well as in the lower reaction rate. To help overcome this disadvantage, the influence of two co-solvents, hexane and hexanol, at a proportion of 20% (v/v), was studied. After selecting the co-solvent giving the best yield (hexane), a factorial design was carried out to study the influence of three variables on biodiesel production by enzymatic catalysis with ethanol and co-solvent: the oil/alcohol molar ratio (1:8, 1:6 and 1:4), the amount of co-solvent added (30, 20 and 10%, v/v) and the reaction time (48, 36 and 24 h).
The process was initially evaluated by reaction yield in order to identify the best conditions, later replaced by quantification of the ester content by gas chromatography. The biodiesel with the highest ester content was produced at an oil:alcohol molar ratio of 1:4, with 5 g of Lipozyme TL IM as catalyst and 10% co-solvent (hexane, v/v), at 35 ºC for 24 h. The yield of the biodiesel produced under these conditions was 73.3%, corresponding to an ethyl ester content of 64.7%. However, the highest yield obtained was 99.7%, for an oil/alcohol ratio of 1:8, 30% co-solvent (hexane, v/v) and a 48 h reaction at 35 ºC, yielding only 46.1% esters. Finally, the quality of the biodiesel was assessed against the specifications of the EN 14214 standard through determinations of density, viscosity, flash point, water content, copper corrosion, acid value, iodine value, sodium (Na+) and potassium (K+) content, CFPP and calorific value. In Europe there is currently no standard regulating the quality classification of ethyl-ester biodiesel; the biodiesel produced was therefore analysed according to the European standard EN 14214, which regulates the quality of methyl esters, and it was concluded that none of the evaluated parameters complies with it.
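A full three-factor, three-level design like the one described above comprises 27 experimental runs, which can be enumerated programmatically. The sketch below simply lists the combinations, taking the factor levels from the abstract:

```python
from itertools import product

# Factor levels as stated in the study description
molar_ratios = ["1:8", "1:6", "1:4"]   # oil/alcohol molar ratio
cosolvent_pct = [30, 20, 10]           # hexane co-solvent, % v/v
time_h = [48, 36, 24]                  # reaction time, hours

# Full 3^3 factorial design: every combination of the three factors
design = list(product(molar_ratios, cosolvent_pct, time_h))
print(len(design))   # 27 runs
```

The best-ester-content condition reported (1:4, 10%, 24 h) and the best-yield condition (1:8, 30%, 48 h) are both corners of this design space.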

Relevance: 20.00%

Abstract:

Concentrations of eleven trace elements (Al, As, Cd, Cr, Co, Hg, Mn, Ni, Pb, Se and Si) were measured in 39 (natural and flavoured) water samples. Determinations were performed by graphite furnace electrothermal atomization for most elements (Al, As, Cd, Cr, Co, Mn, Ni, Pb and Si); hydride generation was used for Se, and cold vapour generation for Hg. These techniques were coupled to atomic absorption spectrophotometry. The trace element content of still and sparkling natural waters varied from brand to brand. Significant differences between natural still and natural sparkling waters (p<0.001) were only apparent for Mn. The Mann–Whitney U-test was used to search for significant differences between flavoured and natural waters. The concentration of each element was compared with the presence of flavours, preservatives, acidifying agents, fruit juice and/or sweeteners, according to the labelled composition. Flavoured waters generally showed an increased trace element content. The addition of preservatives and acidity regulators had a significant influence on Mn, Co, As and Si contents (p<0.05). Fruit juice could also be correlated with increases in Co and As. Sweeteners did not produce any significant difference in Mn, Co, Se and Si content.
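The Mann–Whitney U statistic used above can be computed directly from pairwise comparisons between the two groups. A minimal pure-Python sketch, with hypothetical concentration values standing in for the measured data (in practice one would use tables or a normal approximation — e.g. `scipy.stats.mannwhitneyu` — to obtain the p-value):

```python
def mann_whitney_u(sample_a, sample_b):
    """Mann-Whitney U statistic for sample_a versus sample_b,
    counting pairs where a > b; ties contribute 1/2."""
    u = 0.0
    for a in sample_a:
        for b in sample_b:
            if a > b:
                u += 1.0
            elif a == b:
                u += 0.5
    return u

# Hypothetical Mn concentrations (illustrative only, not study data)
flavoured = [12.1, 15.3, 9.8, 14.0, 11.5]
natural = [2.3, 1.9, 3.1, 2.8, 2.5]
print(mann_whitney_u(flavoured, natural))  # maximal U = 5*5 = 25 here
```

A U at either extreme (0 or n1·n2), as in this toy data, corresponds to complete separation of the two groups.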

Relevance: 20.00%

Abstract:

Introduction – Nowadays, the concept of ontology (an explicit specification of a conceptualization [Gruber, 1993]) is a key concept in knowledge-based systems in general and in the Semantic Web in particular. However, software agents do not always agree on the same conceptualization, which justifies the existence of several ontologies, even when they address the same domain of discourse. To solve/minimize the interoperability problem between such agents, ontology mapping has proved to be a good solution. Ontology mapping is the process in which semantic relations are specified between entities of a source and a target ontology at the conceptual level; these relations can in turn be used to transform instances based on the source ontology into instances based on the target ontology.
Motivation – In a dynamic environment such as the Semantic Web, agents change not only their data but also their structure and semantics (ontologies). This process, called ontology evolution, can be defined as the timely adaptation of an ontology to changes arising in the domain or in the objectives of the ontology itself, together with the consistent management of those changes [Stojanovic, 2004], and it may sometimes leave the mapping document inconsistent. In heterogeneous environments where interoperability between systems depends on the mapping document, the document must reflect the changes made to the ontologies. There are two solutions: (i) generate a new mapping document (a demanding process in terms of time and computational resources) or (ii) adapt the mapping document, correcting invalid semantic relations and creating new relations where necessary (a less demanding process in terms of time and computational resources, but highly dependent on information about the changes made).
The main objective of this work is the analysis, specification and development of the mapping document evolution process, so that it reflects the changes made during the ontology evolution process.
Context – This work was developed in the context of the MAFRA Toolkit. The MAFRA (MApping FRAmework) Toolkit is an application developed at GECAD that allows the declarative specification of semantic relations between entities of a source ontology and a target ontology, using the following main components: Concept Bridge, which represents a semantic relation between a source concept and a target concept; Property Bridge, which represents a semantic relation between one or more source properties and one or more target properties; and Service, which is applied to the Semantic Bridges (Property and Concept Bridges) and defines how source instances are to be transformed into target instances. These concepts are specified in the SBO (Semantic Bridge Ontology) [Silva, 2004]. In the context of this work, a mapping document is an instantiation of the SBO, containing semantic relations between entities of the source and target ontologies.
Mapping evolution process – The mapping evolution process is the process in which the entities of the mapping document are adapted to reflect changes in the mapped ontologies, preserving as far as possible the semantics of the specified semantic relations. If the source and/or target ontologies change, some semantic relations may become invalid, or new relations may become necessary; the process is therefore composed of two sub-processes: (i) correction of semantic relations and (ii) processing of new ontology entities. Processing new ontology entities requires discovering and computing similarities between entities and specifying relations according to the SBO ontology/language. These phases ("similarity measure" and "semantic bridging") are implemented in the MAFRA Toolkit, the (semi-)automatic ontology mapping process being described in [Silva, 2004]. Correcting invalid SBO entities requires good knowledge of the SBO ontology/language, of its entities and relations, and of all its constraints, i.e. of its structure and semantics. This procedure consists of (i) identifying the invalid SBO entities, (ii) determining the cause of their invalidity and (iii) correcting them as well as possible. In this phase, information from the ontology evolution process was used with the aim of improving the quality of the whole process.
Conclusions – Besides the mapping evolution process developed, one of the most important outcomes of this work was the acquisition of deeper knowledge about ontologies, the ontology evolution process, mapping, etc., broadening horizons and raising awareness of the complexity of the problem at hand, which makes it possible to anticipate new challenges for the future.
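The bridge-correction sub-process above — identify invalid entities, then repair or flag them — can be illustrated with a minimal sketch. The data model here (sets of entity names and bridges as pairs) is a hypothetical, drastically simplified stand-in for the SBO, not the actual MAFRA representation:

```python
# Hypothetical, simplified stand-in for an SBO mapping document:
# each bridge relates one source entity to one target entity.
source_entities = {"Person", "Address", "Phone"}
target_entities = {"Individual", "Location"}          # "Contact" was removed
bridges = [
    ("Person", "Individual"),
    ("Address", "Location"),
    ("Phone", "Contact"),   # invalid: its target no longer exists
]

def classify_bridges(bridges, src, tgt):
    """Split bridges into valid and invalid ones after ontology evolution."""
    valid, invalid = [], []
    for s, t in bridges:
        (valid if s in src and t in tgt else invalid).append((s, t))
    return valid, invalid

valid, invalid = classify_bridges(bridges, source_entities, target_entities)
print(invalid)   # bridges needing correction or removal
```

In the real process, each invalid bridge would then be corrected using information from the ontology evolution log, rather than simply listed.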

Relevance: 20.00%

Abstract:

Adhesive bonding has become more efficient in the last few decades due to developments in adhesives, granting higher strength and ductility. On the other hand, natural fibre composites have recently gained interest due to their low cost and density. It is therefore essential to predict the fracture behaviour of joints between these materials to assess the feasibility of joining or repairing with adhesives. In this work, the tensile fracture toughness (Gc^n) of adhesive joints between natural fibre composites is studied, by bonding with a ductile adhesive and by co-curing. Conventional methods to obtain Gc^n are used for the co-cured specimens, while for the adhesive within the bonded joint the J-integral is considered. For the J-integral calculation, an optical measurement method is developed to evaluate the crack tip opening and the adherends' rotation at the crack tip during the test, supported by a Matlab sub-routine for the automated extraction of these quantities. As an outcome of this work, an optical method (based on the J-integral technique) is proposed that allows an easier and quicker extraction of the parameters needed to obtain Gc^n than the available methods, and the tensile fracture behaviour of bonded and co-cured joints in jute-reinforced natural fibre composites is also provided for subsequent strength prediction. Additionally, for the adhesively-bonded joints, the tensile cohesive law of the adhesive is derived by the direct method.
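In the direct method mentioned above, the cohesive law t(δ) follows from differentiating the measured J with respect to the crack-tip opening δ, and the fracture toughness equals the area under the full traction–separation curve. A minimal numerical sketch with a hypothetical triangular (bilinear) cohesive law — the traction and opening values are illustrative, not measured ones:

```python
# Hypothetical triangular traction-separation law:
# peak traction t0 at opening d0, complete failure at opening df.
t0, d0, df = 20.0, 0.01, 0.10   # MPa, mm, mm (illustrative values only)

def traction(d):
    """Traction as a function of crack-tip opening d (mm)."""
    if d <= d0:
        return t0 * d / d0                 # elastic rise
    if d <= df:
        return t0 * (df - d) / (df - d0)   # linear softening
    return 0.0

# Fracture toughness as the area under t(delta), by the trapezoidal rule
n = 10000
h = df / n
Gc = sum(0.5 * (traction(i * h) + traction((i + 1) * h)) * h for i in range(n))
print(Gc)   # analytically 0.5 * t0 * df = 1.0 N/mm for a triangle
```

The same integration run in reverse — numerically differentiating a measured J(δ) record — is how the direct method recovers the adhesive's cohesive law from the optical measurements.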

Relevance: 20.00%

Abstract:

Many-core platforms based on Networks-on-Chip (NoC [Benini and De Micheli 2002]) are an emerging technology in the real-time embedded domain. Although the idea of grouping applications previously executed on separate single-core devices and accommodating them on an individual many-core chip offers various options for power savings and cost reductions, and contributes to overall system flexibility, its implementation is a non-trivial task. In this paper we address the issue of application mapping onto a NoC-based many-core platform, considering the fundamentals and trends of current many-core operating systems; specifically, we elaborate on a limited migrative application model encompassing a message-passing paradigm as a communication primitive. As the main contribution, we formulate the problem of real-time application mapping and propose a three-stage process to solve it efficiently. Analysis assures that the derived solutions guarantee the fulfilment of the posed time constraints on worst-case communication latencies and, at the same time, provide an environment in which to perform load balancing for e.g. thermal, energy, fault tolerance or performance reasons. We also propose several constraints on the topological structure of the application mapping, as well as on the inter- and intra-application communication patterns, which efficiently address the issues of pessimism and/or intractability in the analysis.
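The feasibility check at the heart of such a mapping process — do worst-case communication latencies meet their time constraints? — can be sketched for a 2D-mesh NoC using Manhattan hop counts. All figures below (per-hop latency, deadlines, coordinates) are hypothetical, and a real analysis would also have to account for link contention:

```python
PER_HOP_LATENCY = 3  # cycles per router hop (illustrative)

def hops(a, b):
    """Manhattan distance between two mesh coordinates (x, y)."""
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def mapping_feasible(mapping, messages):
    """mapping: task -> (x, y) core; messages: (src, dst, deadline_cycles).
    True iff every message's worst-case latency meets its deadline."""
    return all(
        hops(mapping[src], mapping[dst]) * PER_HOP_LATENCY <= deadline
        for src, dst, deadline in messages
    )

mapping = {"t0": (0, 0), "t1": (1, 0), "t2": (3, 3)}
messages = [("t0", "t1", 6), ("t0", "t2", 12)]
print(mapping_feasible(mapping, messages))  # t0->t2 needs 18 > 12 cycles
```

A mapping process would iterate over candidate placements, discarding those for which this check fails.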

Relevance: 20.00%

Abstract:

This paper presents the application of multidimensional scaling (MDS) analysis to data emerging from non-invasive lung function tests, namely the input respiratory impedance. The aim is to obtain a geometrical mapping of the diseases in a 3D space representation, allowing the analysis of (dis)similarities between subjects within the same pathology group, as well as between the various groups. The adult patient groups investigated were healthy, diagnosed chronic obstructive pulmonary disease (COPD) and diagnosed kyphoscoliosis. The children's groups were healthy, asthma and cystic fibrosis. The results suggest that MDS can be successfully employed for mapping restrictive (kyphoscoliosis) and obstructive (COPD) pathologies. Hence, MDS tools can be further examined to define clear limits between pools of patients for clinical classification, and used as a training aid in medical traineeships.
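The core of an MDS mapping like the one described is the embedding of subjects into a low-dimensional space from their pairwise dissimilarities. A minimal numpy sketch of classical (Torgerson) MDS, with hypothetical coordinates standing in for impedance-derived dissimilarities:

```python
import numpy as np

def classical_mds(D, k=3):
    """Classical (Torgerson) MDS: embed n points in k dimensions from a
    symmetric n x n matrix of pairwise distances D."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n       # centering matrix
    B = -0.5 * J @ (D ** 2) @ J               # double-centred Gram matrix
    vals, vecs = np.linalg.eigh(B)
    idx = np.argsort(vals)[::-1][:k]          # largest eigenvalues first
    L = np.sqrt(np.clip(vals[idx], 0, None))
    return vecs[:, idx] * L                   # n x k coordinates

# Hypothetical 2D "subjects": two tight groups, mimicking two pathologies
pts = np.array([[0.0, 0.0], [1.0, 0.0], [4.0, 3.0], [5.0, 3.0]])
D = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
X = classical_mds(D, k=2)
# The embedding reproduces the original distances up to rotation/reflection
print(np.round(np.linalg.norm(X[0] - X[1]), 2))  # 1.0
```

With k=3 and impedance-derived dissimilarities, the same procedure yields the 3D disease map discussed in the paper.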

Relevance: 20.00%

Abstract:

Multidisciplinary and multicultural conference.

Relevance: 20.00%

Abstract:

Master's degree in Mechanical Engineering – Industrial Management specialization.

Relevance: 20.00%

Abstract:

Heterogeneous multicore platforms are becoming an interesting alternative for embedded computing systems with a limited power supply, as they can execute specific tasks in an efficient manner. Nonetheless, one of the main challenges of such platforms consists of optimising the energy consumption in the presence of temporal constraints. This paper addresses the problem of task-to-core allocation onto heterogeneous multicore platforms such that the overall energy consumption of the system is minimised. To this end, we propose a two-phase approach that considers both dynamic and leakage energy consumption: (i) the first phase allocates tasks to the cores such that the dynamic energy consumption is reduced; (ii) the second phase refines the allocation performed in the first phase in order to achieve better sleep states, by trading off dynamic energy consumption against the reduction in leakage energy consumption. This hybrid approach considers core frequency set-points, task energy consumption and the sleep states of the cores to reduce the energy consumption of the system. Major value has been placed on a realistic power model, which increases the practical relevance of the proposed approach. Finally, extensive simulations have been carried out to demonstrate the effectiveness of the proposed algorithm. In the best case, energy savings of up to 18% are achieved over the first-fit algorithm, which has been shown in previous works to perform better than other bin-packing heuristics for the target heterogeneous multicore platform.
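The first phase described above — placing each task on a feasible core while favouring low dynamic energy — can be sketched as an energy-ordered first-fit allocation. The core names, utilisation figures and per-core energy costs below are hypothetical, and this sketch omits the second (sleep-state refinement) phase:

```python
# Hypothetical heterogeneous platform: each core has a utilisation capacity.
cores = {"little0": 1.0, "little1": 1.0, "big0": 1.0}
# energy[task][core] = dynamic energy per period (illustrative units)
energy = {
    "t1": {"little0": 2.0, "little1": 2.0, "big0": 5.0},
    "t2": {"little0": 3.0, "little1": 3.0, "big0": 4.0},
    "t3": {"little0": 2.5, "little1": 2.5, "big0": 6.0},
}
util = {"t1": 0.6, "t2": 0.7, "t3": 0.5}   # task utilisations

def allocate(tasks):
    """First-fit over cores ordered by dynamic energy cost per task."""
    load = {c: 0.0 for c in cores}
    placement = {}
    for t in tasks:
        for c in sorted(cores, key=lambda c: energy[t][c]):
            if load[c] + util[t] <= cores[c]:   # schedulability bound
                load[c] += util[t]
                placement[t] = c
                break
        else:
            raise ValueError(f"no feasible core for {t}")
    return placement

print(allocate(["t1", "t2", "t3"]))
```

A second phase would then revisit this placement, consolidating work so that lightly loaded cores can reach deeper sleep states and leakage energy drops.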

Relevance: 20.00%

Abstract:

Many-core platforms are an emerging technology in the real-time embedded domain. These devices offer various options for power savings and cost reductions, and contribute to overall system flexibility; however, issues such as unpredictability, scalability and analysis pessimism are serious challenges to their integration into this area. The focus of this work is on many-core platforms using a limited migrative model (LMM). LMM is an approach based on the fundamental concepts of the multi-kernel paradigm, which is a promising step towards scalable and predictable many-cores. In this work, we formulate the problem of real-time application mapping on a many-core platform using LMM, and propose a three-stage method to solve it. An extended version of the existing analysis is used to assure that derived mappings (i) guarantee the fulfilment of timing constraints posed on the worst-case communication delays of individual applications, and (ii) provide an environment in which to perform load balancing for e.g. energy/thermal management, fault tolerance and/or performance reasons.

Relevance: 20.00%

Abstract:

Underground scenarios are among the most challenging environments for accurate and precise 3D mapping, where hostile conditions such as the absence of Global Positioning Systems, extreme lighting variations and geometrically smooth surfaces may be expected. So far, state-of-the-art methods in underground modelling remain restricted to environments in which pronounced geometric features are abundant, a limitation that is a consequence of the scan-matching algorithms used to solve the localization and registration problems. This paper contributes to the expansion of modelling capabilities to structures characterized by uniform geometry and smooth surfaces, as is the case of road and train tunnels. To achieve that, we combine state-of-the-art techniques from mobile robotics and propose a method for 6-DOF platform positioning in such scenarios, which is later used for environment modelling. A visual monocular Simultaneous Localization and Mapping (MonoSLAM) approach based on the Extended Kalman Filter (EKF), complemented by the introduction of inertial measurements in the prediction step, allows our system to localize itself over long distances using exclusively sensors carried on board a mobile platform. By feeding the EKF with inertial data, we were able to overcome the major problem of MonoSLAM implementations, known as scale-factor ambiguity. Despite extreme lighting variations, reliable visual features were extracted with the SIFT algorithm and inserted directly into the EKF mechanism according to the Inverse Depth Parametrization. Wrong frame-to-frame feature matches were rejected through 1-Point RANSAC (Random Sample Consensus). The developed method was tested on a dataset acquired inside a road tunnel, and the navigation results were compared with a ground truth obtained by post-processing measurements from a high-grade Inertial Navigation System and L1/L2 RTK-GPS acquired outside the tunnel. Results from the localization strategy are presented and analyzed.
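The key idea of feeding inertial data into the EKF prediction step can be illustrated on a toy 1D position/velocity state driven by an accelerometer reading. All matrices, the time step and the noise values are hypothetical placeholders, far simpler than the full MonoSLAM state with inverse-depth features:

```python
import numpy as np

dt = 0.1
F = np.array([[1.0, dt],
              [0.0, 1.0]])               # state transition for (p, v)
B = np.array([0.5 * dt ** 2, dt])        # control matrix for acceleration
Q = np.diag([1e-4, 1e-3])                # process noise covariance (assumed)

def ekf_predict(x, P, accel):
    """Propagate state mean x and covariance P using an IMU acceleration.
    This is the prediction step only; the visual (SIFT) measurements
    would enter in a subsequent update step."""
    x_new = F @ x + B * accel
    P_new = F @ P @ F.T + Q
    return x_new, P_new

x = np.array([0.0, 1.0])                 # start at p = 0, moving at 1 m/s
P = np.eye(2) * 0.01
x, P = ekf_predict(x, P, accel=2.0)
print(x)   # position and velocity after one 0.1 s inertial step
```

Because the accelerometer provides metric information, predictions of this form pin down the absolute scale that a purely monocular EKF-SLAM cannot observe.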

Relevance: 20.00%

Abstract:

To this day, the most efficient Cu(In,Ga)Se2 thin-film solar cells have been prepared using a rather complex growth process often referred to as three-stage or multi-stage. This family of processes is mainly characterized by a first stage deposited with only In, Ga and Se flux to form a first layer. Cu is added in a second stage until the film becomes slightly Cu-rich, after which the film is converted to its final Cu-poor composition by a third stage, again with no or very little addition of Cu. In this paper, a comparison is made between solar cells prepared with the three-stage process and with a one-stage/in-line process with the same composition, thickness and solar cell stack. The one-stage process is easier to use at an industrial scale and does not involve Cu-rich transitions. The samples were analyzed using glow discharge optical emission spectroscopy, scanning electron microscopy, X-ray diffraction, current–voltage–temperature, capacitance–voltage, external quantum efficiency, transmission/reflection and photoluminescence measurements. It was concluded that, in spite of differences in texturing, morphology and Ga gradient, the electrical performance of the two types of samples is quite similar, as demonstrated by the similar J–V behaviour, quantum spectral response and estimated recombination losses.