Abstract:
The development of scaffolds that combine the delivery of drugs with the physical support provided by electrospun fibres holds great potential in the field of nerve regeneration. Here, the incorporation of ibuprofen, a well-known non-steroidal anti-inflammatory drug, into electrospun fibres of the statistical copolymer poly(trimethylene carbonate-co-ε-caprolactone) [P(TMC-CL)] is proposed as a drug delivery system to enhance axonal regeneration in the context of a spinal cord lesion by limiting the inflammatory response. P(TMC-CL) fibres were electrospun from mixtures of dichloromethane (DCM) and dimethylformamide (DMF). The solvent mixture influenced fibre morphology as well as mean fibre diameter, which decreased as the DMF content in solution increased. Ibuprofen-loaded fibres were prepared from P(TMC-CL) solutions containing 5% ibuprofen (w/w of polymer). Increasing the drug content to 10% led to jet instability, resulting in a less homogeneous fibrous mesh. Under the optimized conditions, drug-loading efficiency was above 80%. Confocal Raman mapping showed no preferential distribution of ibuprofen in the P(TMC-CL) fibres. Under physiological conditions, ibuprofen was released within 24 h; the release was diffusion-dependent for fibres prepared from DCM solutions, whereas burst release occurred for fibres prepared from DCM-DMF mixtures. The biological activity of the released drug was demonstrated using human-derived macrophages: the release of prostaglandin E2 into the cell culture medium was reduced when cells were incubated with ibuprofen-loaded P(TMC-CL) fibres, confirming the biological significance of the proposed drug delivery strategy. Overall, this study constitutes an important contribution to the design of a P(TMC-CL)-based nerve conduit with anti-inflammatory properties.
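A minimal sketch, assuming hypothetical release data, of how diffusion-controlled release can be distinguished from burst release by fitting the classical Higuchi square-root-of-time model; this is an illustration of the concept, not the authors' analysis:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical cumulative ibuprofen release data: time (h) vs fraction released.
t = np.array([0.5, 1, 2, 4, 8, 12, 24])
q = np.array([0.18, 0.27, 0.38, 0.52, 0.71, 0.85, 0.98])

def higuchi(t, k):
    """Higuchi model: diffusion-controlled release scales with sqrt(time)."""
    return k * np.sqrt(t)

k_fit, _ = curve_fit(higuchi, t, q)
q_pred = higuchi(t, k_fit[0])
r2 = 1 - np.sum((q - q_pred) ** 2) / np.sum((q - q.mean()) ** 2)
print(f"Higuchi constant k = {k_fit[0]:.3f} h^-0.5, R^2 = {r2:.3f}")
# A good fit suggests diffusion-dependent release (as for the DCM fibres);
# a poor fit with most drug released at the first time point suggests burst release.
```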
Abstract:
Project work for obtaining the Master's degree in Accounting and Finance.
Abstract:
Medium voltage (MV) load diagrams were defined based on a knowledge discovery in databases (KDD) process. Clustering techniques were used to help agents in electric power retail markets obtain specific knowledge of their customers' consumption habits. Each customer class resulting from the clustering operation is represented by its load diagram. The Two-Step clustering algorithm and the WEACS approach, based on evidence accumulation clustering (EAC), were applied to electricity consumption data from a utility's client database in order to form customer classes and to find a set of representative consumption patterns. The WEACS approach is a clustering ensemble combination approach that uses subsampling and weights the partitions differently in the co-association matrix. As a complementary step to the WEACS approach, all the final data partitions produced by the different variations of the method are combined and the Ward Link algorithm is used to obtain the final data partition. Experimental results showed that the WEACS approach achieved better accuracy than many other clustering approaches; in this paper, the WEACS approach separates the customer population better than the Two-Step clustering algorithm.
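A minimal sketch of the evidence accumulation idea underlying WEACS, assuming synthetic load data: subsampled k-means partitions are accumulated into a co-association matrix, which is then combined with Ward linkage. The WEACS-specific partition weighting is omitted, and all sizes and parameters are hypothetical:

```python
import numpy as np
from sklearn.cluster import KMeans
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 24))  # e.g. 200 customers x 24 hourly load values

n, n_partitions = len(X), 30
coassoc = np.zeros((n, n))
for _ in range(n_partitions):
    # Subsample the data, cluster it, and accumulate co-occurrence evidence.
    idx = rng.choice(n, size=int(0.8 * n), replace=False)
    labels = KMeans(n_clusters=int(rng.integers(3, 8)), n_init=5).fit_predict(X[idx])
    for c in np.unique(labels):
        members = idx[labels == c]
        coassoc[np.ix_(members, members)] += 1
coassoc /= n_partitions

# Combine the accumulated evidence with Ward linkage to get the final partition.
condensed_dist = 1 - coassoc[np.triu_indices(n, k=1)]
Z = linkage(condensed_dist, method="ward")
final_labels = fcluster(Z, t=5, criterion="maxclust")
```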
Abstract:
Long-term contractual decisions are the basis of efficient risk management, but such decisions have to be supported by a robust price forecast methodology. This paper presents a different approach to long-term price forecasting that addresses this need. Making use of regression models, the proposed methodology has as its main objective to find the maximum and minimum Market Clearing Price (MCP) for a specific programming period, with a desired confidence level α. Due to the problem's complexity, the Particle Swarm Optimization (PSO) meta-heuristic was used to find the best regression parameters, and the results were compared with those obtained using a Genetic Algorithm (GA). To validate these models, results from realistic data are presented and discussed in detail.
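A minimal sketch of the core technique, assuming a simple linear price model with synthetic data: a standard global-best PSO searching for the regression parameters that minimize the squared error. The paper's confidence-level formulation for the MCP bounds is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))                 # hypothetical explanatory variables
y = X @ np.array([2.0, -1.0, 0.5]) + 4 + rng.normal(scale=0.1, size=100)

def cost(w):
    """Squared error of a linear price model: y ~ X @ w[:3] + w[3]."""
    return np.sum((X @ w[:3] + w[3] - y) ** 2)

# Standard global-best PSO over the 4 regression parameters.
n_particles, dim, inertia, c1, c2 = 30, 4, 0.7, 1.5, 1.5
pos = rng.uniform(-5, 5, (n_particles, dim))
vel = np.zeros_like(pos)
pbest, pbest_cost = pos.copy(), np.array([cost(p) for p in pos])
gbest = pbest[pbest_cost.argmin()].copy()

for _ in range(200):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = inertia * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos += vel
    costs = np.array([cost(p) for p in pos])
    improved = costs < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
    gbest = pbest[pbest_cost.argmin()].copy()

print("PSO-fitted regression parameters:", np.round(gbest, 2))
```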
Abstract:
Short-term risk management is highly dependent on previously established long-term contractual decisions, on the agent's risk aversion factor, and on short-term price forecast accuracy. Addressing this problem, this paper provides a different approach to short-term risk management in electricity markets. Based on long-term contractual decisions and making use of a price range forecast method developed by the authors, the short-term risk management tool presented here aims to find the optimal spot market strategies that a producer should adopt for a specific day as a function of the producer's risk aversion factor, with the objective of maximizing profits while hedging against market price volatility. Due to the complexity of the optimization problem, the authors use Particle Swarm Optimization (PSO) to find the optimal solution. Results from realistic data, namely from the OMEL electricity market, are presented and discussed in detail.
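A minimal sketch of the profit-versus-risk trade-off such a tool optimizes, assuming a mean-variance objective and hypothetical price scenarios; the actual PSO search and the OMEL data are not reproduced:

```python
import numpy as np

# Hypothetical hourly spot price scenarios (e.g. from a price range forecast).
rng = np.random.default_rng(2)
price_scenarios = rng.uniform(30, 70, size=(500, 24))   # EUR/MWh
contract_price, capacity = 45.0, 10.0                   # EUR/MWh, MW

def objective(spot_share, risk_aversion):
    """Expected profit penalized by its variance: a higher risk aversion
    shifts energy from the volatile spot market to the fixed-price contract."""
    spot_mw = spot_share * capacity
    contract_mw = capacity - spot_mw
    profit = (price_scenarios * spot_mw + contract_price * contract_mw).sum(axis=1)
    return profit.mean() - risk_aversion * profit.var()

# A near risk-neutral producer sells more on the spot market.
shares = np.linspace(0, 1, 101)
for ra in (0.0, 1e-4, 1e-3):
    best = shares[np.argmax([objective(s, ra) for s in shares])]
    print(f"risk aversion {ra}: optimal spot share = {best:.2f}")
```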
Abstract:
In recent years, the use of several new resources in power systems, such as distributed generation, demand response and, more recently, electric vehicles, has increased significantly. Power systems aim at lowering operational costs, which requires adequate energy resource management. In this context, load consumption management plays an important role, making it necessary to use optimization strategies to adjust consumption to the supply profile. These optimization strategies can be integrated into demand response programs. Controlling the energy consumption of an intelligent house aims to optimize load consumption. This paper presents a genetic algorithm approach to manage the consumption of a residential house, making use of a SCADA system developed by the authors. Consumption management is done by reducing or curtailing loads to keep the power consumption at, or below, a specified energy consumption limit. This limit is determined according to the consumer's strategy, taking into account renewables-based microgeneration, energy price, supplier solicitations, and the consumer's preferences. The proposed approach is compared with a mixed-integer non-linear approach.
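A minimal sketch of a genetic algorithm for this kind of load curtailment problem, assuming hypothetical loads, priorities, and a 5 kW limit; the authors' SCADA integration and full limit determination are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(3)
load_kw = np.array([2.0, 1.5, 0.8, 3.0, 1.2, 0.5])   # hypothetical household loads
priority = np.array([5, 2, 1, 4, 3, 1])              # higher = avoid curtailing
limit_kw = 5.0                                       # consumption limit

def fitness(chrom):
    """Penalize exceeding the limit; otherwise minimize curtailed priority."""
    power = np.dot(chrom, load_kw)
    penalty = 1000 * max(0.0, power - limit_kw)
    return -(priority[chrom == 0].sum() + penalty)

pop = rng.integers(0, 2, size=(40, len(load_kw)))    # 1 = load stays on
for _ in range(100):
    fit = np.array([fitness(c) for c in pop])
    parents = pop[np.argsort(fit)[-20:]]             # truncation selection
    cut = rng.integers(1, len(load_kw), size=20)
    children = np.array([np.concatenate((parents[i][:c], parents[(i + 1) % 20][c:]))
                         for i, c in enumerate(cut)])  # one-point crossover
    mutate = rng.random(children.shape) < 0.05
    children[mutate] ^= 1                            # bit-flip mutation
    pop = np.vstack((parents, children))

best = pop[np.argmax([fitness(c) for c in pop])]
print("loads kept on:", best, "total kW:", np.dot(best, load_kw))
```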
Abstract:
The concept of demand response has growing importance in the context of future power systems. Demand response can be seen as a resource like distributed generation, storage, electric vehicles, etc. All these resources require an infrastructure that gives players the means to operate and use them efficiently. This infrastructure implements the smart grid concept in practice, and should accommodate a large number of diverse types of players in the context of a competitive business environment. In this paper, demand response is optimally scheduled jointly with other resources, such as distributed generation units and the energy provided by the electricity market, minimizing the operation costs from the point of view of a virtual power player who manages these resources and supplies the aggregated consumers. The optimal schedule is obtained using two approaches based on particle swarm optimization (with and without mutation), which are compared with a deterministic approach used as a reference methodology. A case study with two scenarios, implemented in DemSi, a demand response simulator developed by the authors, demonstrates the advantages of the proposed particle swarm approaches.
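A minimal sketch of what the mutation variant can look like, assuming a Gaussian perturbation applied to a random fraction of particles each iteration; the exact mutation operator used by the authors may differ:

```python
import numpy as np

rng = np.random.default_rng(4)

def mutate(positions, rate=0.1, scale=0.5):
    """Gaussian mutation: perturb a random fraction of particle positions,
    helping the swarm escape local optima in the scheduling landscape."""
    mask = rng.random(positions.shape[0]) < rate
    positions[mask] += rng.normal(scale=scale, size=(mask.sum(), positions.shape[1]))
    return positions

# Inside the PSO loop, after the velocity/position update:
#   pos = mutate(pos)
# With rate=0 the algorithm reduces to plain PSO, giving the two compared variants.
```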
Abstract:
This paper proposes a swarm intelligence long-term hedging tool to support electricity producers in competitive electricity markets. The tool investigates the long-term hedging opportunities available to electric power producers through contracts with physical (spot and forward) and financial (options) settlement. To find the optimal portfolio, the producer's risk preference is stated by a utility function (U) expressing the trade-off between the expectation and the variance of the return. Variance estimation and the expected return are based on a forecasted scenario interval determined by a long-term price range forecast model developed by the authors, whose explanation is outside the scope of this paper. The proposed tool makes use of Particle Swarm Optimization (PSO), and its performance has been evaluated by comparison with a Genetic Algorithm (GA) based approach. To validate the risk management tool, a case study using real historical price data from the Spanish mainland market is presented to demonstrate the effectiveness of the proposed methodology.
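A minimal sketch of the portfolio selection step, assuming hypothetical scenario returns for the three instrument classes and a mean-variance utility U = E[R] - A*Var[R]; a generic solver stands in for the PSO, and the authors' forecast model is out of scope here, as in the paper:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(5)
# Hypothetical return scenarios per instrument: spot, forward, option (rows = scenarios).
returns = rng.normal(loc=[0.08, 0.05, 0.06], scale=[0.20, 0.05, 0.12], size=(1000, 3))
risk_aversion = 2.0

def neg_utility(w):
    """Negative of U(w) = E[portfolio return] - A * Var[portfolio return]."""
    port = returns @ w
    return -(port.mean() - risk_aversion * port.var())

constraints = ({"type": "eq", "fun": lambda w: w.sum() - 1},)   # fully allocated
bounds = [(0, 1)] * 3                                           # no short positions
res = minimize(neg_utility, x0=np.array([1 / 3, 1 / 3, 1 / 3]),
               bounds=bounds, constraints=constraints)
print("spot/forward/option weights:", np.round(res.x, 2))
```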
Abstract:
This paper studies the relationships between the chromosomal DNA sequences of twenty species. We propose a methodology combining DNA-based word frequency histograms, correlation methods, and a multidimensional scaling (MDS) technique to visualize structural information underlying chromosomes (CRs) and species. Four statistical measures are tested (Minkowski, Cosine, Pearson product-moment, and Kendall τ rank correlations) to analyze the information content of 421 nuclear CRs from twenty species. The proposed methodology is built on mathematical tools and allows the analysis and visualization of very large amounts of stream data, like DNA sequences, with almost no assumptions other than the predefined DNA “word length.” The methodology produces comprehensible three-dimensional visualizations of CR clustering and related spatial and structural patterns. The results of the four test correlation scenarios show that the high-level information clusterings produced by the MDS tool are qualitatively similar, with small variations due to the characteristics of each correlation method, and that the clusterings are a consequence of the input data and not artifacts of the methods.
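A minimal sketch of the pipeline for one of the four measures (Pearson), assuming random stand-in sequences instead of the 421 nuclear CRs: fixed-length word histograms, a correlation-based distance matrix, and a 3-D MDS embedding:

```python
import numpy as np
from itertools import product
from sklearn.manifold import MDS

def word_histogram(seq, k=3):
    """Frequency histogram of all overlapping DNA words of length k."""
    words = ["".join(p) for p in product("ACGT", repeat=k)]
    counts = {w: 0 for w in words}
    for i in range(len(seq) - k + 1):
        counts[seq[i:i + k]] += 1
    hist = np.array([counts[w] for w in words], dtype=float)
    return hist / hist.sum()

rng = np.random.default_rng(6)
seqs = ["".join(rng.choice(list("ACGT"), size=5000)) for _ in range(10)]  # CR stand-ins
H = np.array([word_histogram(s) for s in seqs])

# Pearson correlation converted to a distance, then embedded in 3-D with MDS.
D = 1 - np.corrcoef(H)
coords = MDS(n_components=3, dissimilarity="precomputed").fit_transform(D)
print(coords.shape)  # (10, 3): one point per chromosome for visual clustering
```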
Abstract:
This paper proposes two meta-heuristics (Genetic Algorithm and Evolutionary Particle Swarm Optimization) for solving a 15-bid case of Ancillary Services Dispatch in an Electricity Market. A Linear Programming approach is also included for comparison purposes. A test case based on the dispatch of Regulation Down, Regulation Up, Spinning Reserve and Non-Spinning Reserve services is used to demonstrate that meta-heuristics are suitable for solving this kind of optimization problem. Faster execution times and lower computational resource requirements are the most relevant advantages of the meta-heuristics used when compared with the Linear Programming approach.
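A minimal sketch of the Linear Programming baseline for a single service, assuming five hypothetical bids; the meta-heuristics and the four-service test case are not reproduced:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical bids for one service (e.g. Spinning Reserve):
price = np.array([12.0, 15.0, 9.0, 20.0, 11.0])   # $/MW per bid
cap = np.array([50.0, 30.0, 40.0, 60.0, 20.0])    # offered MW per bid
requirement = 100.0                               # MW to be procured

# Minimize total cost subject to the dispatched amounts meeting the requirement.
res = linprog(c=price,
              A_eq=[np.ones(5)], b_eq=[requirement],
              bounds=list(zip(np.zeros(5), cap)))
print("dispatched MW per bid:", np.round(res.x, 1), "cost:", res.fun)
```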
Abstract:
This paper presents a multi-agent simulation model whose goal is to help one specific participant in a multi-criteria group decision-making process. The model has five main types of intervenients: the human participant, who uses the simulation and argumentation support system; the participant agents, one associated with the human participant and the others simulating the other human members of the decision meeting group; the directory agent; the proposal agents, representing the different alternatives for a decision (the alternatives are evaluated based on criteria); and the voting agent, responsible for all voting mechanisms. At this stage a two-phase algorithm is proposed. In the first phase, each participant agent makes its own evaluation of the proposals under discussion, and the voting agent runs a simulation of a voting process. In the second phase, after the dissemination of the voting results, each of the participant agents argues to convince the others to choose one of the possible alternatives. The arguments used to convince a specific participant depend on the agent's knowledge about that participant. This two-phase algorithm is applied iteratively.
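A minimal sketch of the first phase, assuming hypothetical criteria weights and proposal scores: each participant agent evaluates the proposals against its own weighted criteria, and the voting agent tallies a simulated vote:

```python
import numpy as np

rng = np.random.default_rng(7)
n_agents, n_proposals, n_criteria = 5, 4, 3

# Each agent weights the criteria differently; each proposal scores per criterion.
weights = rng.dirichlet(np.ones(n_criteria), size=n_agents)   # (agents, criteria)
scores = rng.uniform(0, 10, size=(n_proposals, n_criteria))   # (proposals, criteria)

# Phase 1: every participant agent evaluates the proposals and votes for its best one.
evaluations = weights @ scores.T                              # (agents, proposals)
votes = evaluations.argmax(axis=1)
tally = np.bincount(votes, minlength=n_proposals)
print("votes per proposal:", tally)
# Phase 2 (argumentation between agents to shift votes) would follow iteratively.
```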
Abstract:
Purpose - Economics and business have evolved as sciences in order to accommodate more 'real world' solutions for the problems they approach. In many cases, both business and economics have drawn on other disciplines in order to obtain a more complete framework for the study of complex issues. The aim of this paper is to explore the contribution of three heterodox economics disciplines to the knowledge of business co-operation. Design/methodology/approach - The approach is theoretical, and it shows that many relevant aspects of business co-operation have been proposed by economic geography, institutional economics, and economic sociology. Findings - This paper highlights the business mechanisms of co-operation, reflecting on the role of places, institutions and the social context in which businesses operate. Research implications - It contributes a theoretical framework for the explanation of business co-operation and networks that goes beyond traditional economics theories. Originality/value - This paper contributes a framework for the study of business co-operation from both an economics and a management perspective. This framework embodies a number of non-quantitative issues that are critical for understanding the complex networks in which firms operate.
Abstract:
Introduction: Image resizing is a standard feature incorporated into Nuclear Medicine digital imaging. Upsampling is done by manufacturers to better fit the acquired images to the display screen, and it is applied when there is a need to increase - or decrease - the total number of pixels. This paper aims to compare the "hqnx" and "nxSaI" magnification algorithms with two interpolation algorithms - "nearest neighbor" and "bicubic interpolation" - in image upsampling operations. Material and Methods: Three distinct Nuclear Medicine images were enlarged 2x and 4x with the different digital image resizing algorithms (nearest neighbor, bicubic interpolation, nxSaI and hqnx). To evaluate the pixel changes between the different output images, 3D whole-image plot profiles and surface plots were used in addition to visual inspection of the 4x upsampled images. Results: In the 2x enlarged images the visual differences were not noteworthy, although bicubic interpolation clearly presented the best results. In the 4x enlarged images the differences were significant, with the bicubic interpolated images again presenting the best results. Hqnx-resized images presented better quality than 4xSaI and nearest-neighbor interpolated images; however, their intense "halo effect" greatly affects the definition and boundaries of the image contents. Conclusion: The hqnx and nxSaI algorithms were designed for images with clear edges, so their use on Nuclear Medicine images is clearly inadequate. Of the algorithms studied, bicubic interpolation seems the most suitable, and its increasingly wide application appears to confirm this, establishing it as an efficient algorithm for multiple image types.
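A minimal sketch of the interpolation comparison, assuming a random stand-in for a Nuclear Medicine image and using Pillow's built-in resampling filters (the hqnx and nxSaI implementations are not standard library code and are omitted):

```python
import numpy as np
from PIL import Image

# Hypothetical low-resolution Nuclear Medicine image (64x64, 8-bit grayscale).
rng = np.random.default_rng(8)
img = Image.fromarray(rng.integers(0, 256, size=(64, 64), dtype=np.uint8))

# 4x upsampling with the two interpolation algorithms compared in the paper.
nearest = img.resize((256, 256), resample=Image.NEAREST)   # blocky edges
bicubic = img.resize((256, 256), resample=Image.BICUBIC)   # smoother gradients

# Pixel-level differences between the two outputs, analogous to the plot profiles used.
diff = np.asarray(bicubic, dtype=float) - np.asarray(nearest, dtype=float)
print("mean absolute pixel difference:", np.abs(diff).mean())
```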
Abstract:
The constant and systematic rise in fossil fuel prices and ongoing environmental concerns have driven the search for environmentally sustainable solutions. Biodiesel thus emerges as an alternative to this problem, as well as a solution for the liquid and fatty waste produced by humans. Biodiesel production has received extensive attention in recent years, as it is a biodegradable, non-polluting fuel. Biodiesel production by transesterification using short-chain alcohols and chemical catalysts, namely alkaline ones, has been accepted industrially due to its high conversion. Recently, enzymatic transesterification has been gaining supporters; however, the cost of the enzyme remains a barrier to its large-scale application. The present work aims at the production of biodiesel by enzymatic transesterification from waste vegetable oil. The alcohol used was ethanol, replacing the methanol conventionally used in homogeneous catalysis, since the enzyme's activity is inhibited by the presence of the latter. The main difficulties in ethanolysis lie in the separation of the phases (glycerol and biodiesel) after the reaction, as well as in the lower reaction rate. To help overcome this disadvantage, the influence of two co-solvents, hexane and hexanol, at a proportion of 20% (v/v), was studied. After selecting the co-solvent yielding the best conversion (hexane), a factorial design was carried out to study the influence of three variables on biodiesel production by enzymatic catalysis with ethanol and co-solvents: the oil/alcohol molar ratio (1:8, 1:6 and 1:4), the amount of co-solvent added (30, 20 and 10%, v/v), and the reaction time (48, 36 and 24 h). The process was initially evaluated through the reaction yield, in order to identify the best conditions, and later through quantification of the ester content by gas chromatography. The biodiesel with the highest ester content was produced at an oil:alcohol molar ratio of 1:4, with 5 g of Lipozyme TL IM as catalyst and 10% co-solvent (hexane, v/v), at 35 ºC for 24 h. The yield of the biodiesel produced under these conditions was 73.3%, corresponding to an ethyl ester content of 64.7%. The highest yield obtained, however, was 99.7%, for an oil/alcohol ratio of 1:8 with 30% co-solvent (hexane, v/v) and a 48 h reaction at 35 ºC, which gave only 46.1% esters. Finally, the quality of the biodiesel was evaluated against the specifications of the EN 14214 standard, through determinations of density, viscosity, flash point, water content, copper corrosion, acid value, iodine value, sodium (Na+) and potassium (K+) content, CFPP, and calorific value. In Europe, there is currently no standard regulating the quality classification of ethyl esters as biodiesel. The biodiesel produced was nevertheless analysed according to the European standard EN 14214, which regulates the quality of methyl esters, leading to the conclusion that none of the evaluated parameters complies with it.
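A minimal sketch of the experimental design, assuming a full 3^3 factorial over the three stated variables (the abstract does not specify whether the design was full or fractional):

```python
from itertools import product

# The three factors and levels studied in the enzymatic transesterification runs.
molar_ratio = ["1:8", "1:6", "1:4"]        # oil:alcohol molar ratio
cosolvent_pct = [30, 20, 10]               # % hexane (v/v)
reaction_h = [48, 36, 24]                  # reaction time (h)

runs = list(product(molar_ratio, cosolvent_pct, reaction_h))
print(len(runs), "runs in a full 3^3 factorial design")
for ratio, pct, hours in runs[:3]:         # first few run conditions
    print(f"oil:alcohol {ratio}, {pct}% hexane, {hours} h")
```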
Abstract:
This dissertation describes the development and characterization of optical sensors based on poly(vinyl chloride) (PVC) membranes for the determination of norfloxacin in samples from the aquaculture sector. These sensors were based on the colorimetric reaction between a metal immobilized in PVC and norfloxacin. The metal was chosen based on preliminary colorimetric reaction assays between norfloxacin and several metal species, namely Fe(III), Al(III), Pb(II), aluminon, Mo(II), Mn(II), Ni(II), Cu(II), Co(II), Sn(II) and V(V). The most intense reaction was obtained with Fe(III). Accordingly, Fe(III)-based sensors were developed in a first stage. The effect of several experimental parameters on the response of these sensors was evaluated in a univariate manner, including the effect of pH, evaluated between 2.00 and 6.00, and of the Fe(III) concentration, varied between about 1.00x10-5 M and 2.00x10-4 M. The best values were obtained at pH 3, at which linear behaviour was observed between about 1.00x10-5 M and 1.70x10-4 M Fe(III). Using the conditions selected above, the complex was characterized from a chemical point of view. The values obtained indicated the need for at least a 10-fold excess of Fe(III) to ensure the maximum extent of complexation. Under these conditions, the complex showed linear behaviour over the concentration range of about 7.00x10-5 M to 7.00x10-4 M norfloxacin (NOR), and was stable for 90 minutes. The optimal conditions for analysing this complex on a solid surface were obtained after evaluating the effect of the amount of Fe(III) and of the type and amount of plasticizing solvent (o-nitrophenyl octyl ether, di-n-octyl phthalate, dibutyl phthalate, bis(ethylhexyl) sebacate, bis(ethylhexyl) phthalate). Bis(ethylhexyl) sebacate was the plasticizer chosen, and the PVC-to-plasticizer ratio was 1:2. The solid sensor preparation procedure and subsequent optimization were applied to other metal species besides Fe(III), such as Cu(II), Mn(II) and aluminon. The combination of all these metals allowed the development of a sensor array for screening norfloxacin in aquaculture waters. Some sensing membranes were successfully applied to the control of norfloxacin in spiked environmental water samples. The results obtained with Fe(III) and Cu(II) membranes were accurate, with measured concentrations close to the real values. The proposed method therefore allowed the fast and effective screening of an antibiotic in environmental waters, as well as its quantification at low cost. For routine screening of this antibiotic, this method proved faster and cheaper than the other methods described in the literature for this purpose.