27 results for co-operating target
at Instituto Politécnico do Porto, Portugal
Abstract:
Our day-to-day life is dependent on several embedded devices, and in the near future, many more objects will have computation and communication capabilities, enabling an Internet of Things. Correspondingly, as the interaction with these devices around us increases, developing novel applications is set to become challenging with current software infrastructures. In this paper, we argue that a new paradigm for operating systems needs to be conceptualized to provide a conducive base for application development on cyber-physical systems. We demonstrate its need and importance using a few use-case scenarios, and provide the design principles behind, and an architecture of, a co-operating system, or CoS, that can serve as an example of this new paradigm.
Abstract:
Power system organization has gone through huge changes in recent years. A significant increase in distributed generation (DG) and operation in the scope of liberalized markets are two relevant driving forces for these changes. More recently, the smart grid (SG) concept has gained increased importance, and is being seen as a paradigm able to support power system requirements for the future. This paper proposes a computational architecture to support day-ahead Virtual Power Player (VPP) bid formation in the smart grid context. This architecture includes a forecasting module, a resource optimization and Locational Marginal Price (LMP) computation module, and a bid formation module. Due to the characteristics of the problems involved, the implementation of this architecture requires the use of Artificial Intelligence (AI) techniques. Artificial Neural Networks (ANN) are used for resource and load forecasting, and Evolutionary Particle Swarm Optimization (EPSO) is used for energy resource scheduling. The paper presents a case study that considers a 33-bus distribution network with 67 distributed generators, 32 loads and 9 storage units.
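The EPSO step mentioned above can be pictured with a short sketch. The following Python snippet is a minimal, hypothetical illustration of an EPSO-style update (mutated strategic weights plus a perturbed global best guiding each particle) applied to a toy dispatch-cost objective; the `dispatch_cost` function, the bounds and every numeric value are assumptions for illustration only and do not reproduce the paper's VPP model or its 33-bus case study.

```python
import numpy as np

rng = np.random.default_rng(42)

def dispatch_cost(x, demand=10.0):
    """Hypothetical objective: quadratic generation cost plus a penalty for missing demand."""
    cost_coeff = np.linspace(0.5, 1.5, x.size)
    return float(np.sum(cost_coeff * x**2) + 100.0 * (np.sum(x) - demand) ** 2)

def epso(n_particles=20, n_resources=5, iters=200, tau=0.2):
    x = rng.uniform(0.0, 5.0, (n_particles, n_resources))   # candidate schedules
    v = np.zeros_like(x)                                     # velocities
    w = rng.uniform(0.0, 1.0, (n_particles, 3))              # strategic weights (inertia, memory, cooperation)
    best_x = x.copy()
    best_f = np.array([dispatch_cost(p) for p in x])
    g_best = best_x[best_f.argmin()].copy()

    for _ in range(iters):
        # EPSO step: mutate strategic weights, perturb the global best, move, then select.
        w_mut = np.clip(w + tau * rng.standard_normal(w.shape), 0.0, 1.5)
        g_star = g_best + tau * rng.standard_normal(n_resources)   # perturbed cooperation target
        v = (w_mut[:, [0]] * v
             + w_mut[:, [1]] * (best_x - x)
             + w_mut[:, [2]] * (g_star - x))
        x_new = np.clip(x + v, 0.0, 5.0)
        f_new = np.array([dispatch_cost(p) for p in x_new])
        improved = f_new < best_f
        best_x[improved], best_f[improved] = x_new[improved], f_new[improved]
        x, w = x_new, w_mut
        g_best = best_x[best_f.argmin()].copy()
    return g_best, best_f.min()

schedule, cost = epso()
print("best dispatch:", np.round(schedule, 2), "cost:", round(cost, 2))
```

The distinguishing feature with respect to classical PSO is that the weights themselves are mutated and the global best is perturbed before being used as the cooperation target, which is what gives EPSO its self-adaptive character.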
Abstract:
Purpose: Economics and business have evolved as sciences in order to accommodate more 'real-world' solutions to the problems they approach. In many cases, both business and economics have been supported by other disciplines in order to obtain a more complete framework for the study of complex issues. The aim of this paper is to explore the contribution of three heterodox economics disciplines to the knowledge of business co-operation. Design/methodology/approach: The approach is theoretical, and it shows that many relevant aspects of business co-operation have been proposed by economic geography, institutional economics, and economic sociology. Findings: This paper highlights the business mechanisms of co-operation, reflecting on the role of places, institutions and the social context in which businesses operate. Research implications: It contributes a theoretical framework for the explanation of business co-operation and networks that goes beyond traditional economics theories. Originality/value: This paper contributes a framework for the study of business co-operation from both an economics and a management perspective. This framework embodies a number of non-quantitative issues that are critical for understanding the complex networks in which firms operate.
Abstract:
Master's degree in Electrical and Computer Engineering
Abstract:
The constant and systematic rise in fossil fuel prices and the ongoing concerns about the environment have driven the search for environmentally sustainable solutions. Biodiesel thus emerges as an alternative to this problem, as well as a solution for liquid and fatty residues produced by human activity. Biodiesel production has received extensive attention in recent years, as it is a biodegradable and non-polluting fuel. Biodiesel production by transesterification using short-chain alcohols and chemical catalysts, namely alkaline ones, has been accepted industrially due to its high conversion. Recently, enzymatic transesterification has been gaining supporters; however, the cost of the enzyme remains a barrier to its large-scale application. The present work targets biodiesel production by enzymatic transesterification from waste vegetable oil. The alcohol used was ethanol, replacing the methanol conventionally used in homogeneous catalysis, since enzyme activity is inhibited by the latter. The main difficulties of ethanolysis lie in the phase separation (glycerol and biodiesel) after the reaction, as well as in the lower reaction rate. To help overcome this disadvantage, the influence of two co-solvents, hexane and hexanol, at a proportion of 20% (v/v), was studied. After selecting the co-solvent giving the best yield (hexane), a factorial design was carried out to study the influence of three variables on biodiesel production by enzymatic catalysis with ethanol and co-solvent: the oil/alcohol molar ratio (1:8, 1:6 and 1:4), the amount of co-solvent added (30, 20 and 10%, v/v) and the reaction time (48, 36 and 24 h). The process was initially evaluated through the reaction yield, in order to identify the best conditions, and later through quantification of the ester content by gas chromatography. The biodiesel with the highest ester content was produced under the conditions corresponding to an oil:alcohol molar ratio of 1:4, with 5 g of Lipozyme TL IM as catalyst and 10% co-solvent (hexane, v/v), at 35 ºC for 24 h. The yield of the biodiesel produced under these conditions was 73.3%, corresponding to an ethyl ester content of 64.7%. However, the highest yield obtained was 99.7%, for an oil/alcohol ratio of 1:8, 30% co-solvent (hexane, v/v) and a 48 h reaction at 35 ºC, yielding only 46.1% esters. Finally, the quality of the biodiesel was also assessed according to the specifications of the EN 14214 standard, through determinations of density, viscosity, flash point, water content, copper corrosion, acid value, iodine value, sodium (Na+) and potassium (K+) content, CFPP and calorific value. In Europe, there is currently no standard regulating the quality classification of ethyl ester biodiesel. Nevertheless, the biodiesel produced was analysed according to the European standard EN 14214, which regulates the quality of methyl esters, and it can be concluded that none of the evaluated parameters complies with it.
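As a rough illustration of the factorial plan described above, the Python sketch below enumerates a full 3×3×3 run matrix built from the factor levels stated in the abstract (oil:ethanol molar ratio, co-solvent fraction and reaction time); the run labels and ordering are assumptions, and the actual experimental plan may have included replicates or a different run order.

```python
from itertools import product

# Factor levels taken from the abstract; everything else is illustrative.
molar_ratio = ["1:8", "1:6", "1:4"]        # oil:ethanol molar ratio
cosolvent_pct = [30, 20, 10]               # hexane co-solvent, % (v/v)
reaction_time_h = [48, 36, 24]             # reaction time, h

runs = [
    {"run": i + 1, "oil:ethanol": r, "hexane_%v/v": c, "time_h": t}
    for i, (r, c, t) in enumerate(product(molar_ratio, cosolvent_pct, reaction_time_h))
]

print(f"{len(runs)} experimental runs")    # 3 x 3 x 3 = 27 runs in a full factorial
for run in runs[:3]:
    print(run)
```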
Abstract:
Concentrations of eleven trace elements (Al, As, Cd, Cr, Co, Hg, Mn, Ni, Pb, Se, and Si) were measured in 39 (natural and flavoured) water samples. Determinations were performed using graphite furnace electrothermal atomization for almost all elements (Al, As, Cd, Cr, Co, Mn, Ni, Pb, and Si); hydride generation was used for Se, and cold vapour generation for Hg. These techniques were coupled to atomic absorption spectrophotometry. The trace element content of still or sparkling natural waters changed from brand to brand. Significant differences between natural still and natural sparkling waters (p<0.001) were only apparent for Mn. The Mann–Whitney U-test was used to search for significant differences between flavoured and natural waters. The concentration of each element was compared with the presence of flavours, preservatives, acidifying agents, fruit juice and/or sweeteners, according to the labelled composition. It was shown that flavoured waters generally have a higher trace element content. The addition of preservatives and acidifying agents had a significant influence on Mn, Co, As and Si contents (p<0.05). Fruit juice could also be correlated with an increase in Co and As. Sweeteners did not produce any significant difference in Mn, Co, Se and Si content.
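The flavoured-versus-natural comparison relies on the Mann–Whitney U-test, which can be reproduced in outline with the short Python sketch below; the concentration values are synthetic placeholders, and only the choice of test and the 0.05 significance level come from the abstract.

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Hypothetical Mn concentrations (ug/L) for two groups of water samples.
rng = np.random.default_rng(0)
mn_natural = rng.lognormal(mean=0.0, sigma=0.4, size=20)
mn_flavoured = rng.lognormal(mean=0.5, sigma=0.4, size=19)

# Non-parametric comparison of the two independent groups.
stat, p_value = mannwhitneyu(mn_flavoured, mn_natural, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Significant difference between flavoured and natural waters (p < 0.05)")
```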
Abstract:
The detection and tracking of people has a wide variety of applications in computer vision. Although it has been the subject of years of research, it remains an open topic, and obtaining an approach that combines flexibility and accuracy is still a major challenge. The work presented in this dissertation develops a case study on the automatic detection and tracking of human faces in a meeting-room environment, implemented as a flexible, low-cost system. The proposed system is based on the GNU's Not Unix (GNU)/Linux operating system and is divided into four stages: video acquisition, face detection, tracking, and re-orientation of the camera position. Acquisition consists of capturing video frames from the three Internet Protocol (IP) Sony SNC-RZ25P cameras installed in the room, over an already existing Local Area Network (LAN). This stage supplies the video frames to be processed by the detection and tracking stages. Detection uses the algorithm proposed by Viola and Jones for identifying objects based on their main features, which allows the detection of any type of object (in this case, human faces) in a generic way and in real time. When a face is successfully identified, the output of the detection stage is the coordinates of the face position in the video frame. The coordinates of the detected face are used by the tracking algorithm to follow the face through the subsequent video frames. The tracking stage implements the Continuously Adaptive Mean-SHIFT (Camshift) algorithm, which searches a probability density map for its maximum value through successive iterations. The algorithm returns the coordinates of the face position and orientation. These coordinates are used to re-orient the camera so that the face stays as close as possible to the centre of the camera's field of view. The results obtained showed that the proposed tracking system is able to recognize and follow moving faces in sequences of video frames, demonstrating its suitability for real-time monitoring applications.
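A minimal sketch of the detect-then-track pipeline described above is given below in Python with OpenCV: a Viola-Jones Haar cascade locates a face, and CamShift then follows it on the hue back-projection of each frame. The stock `haarcascade_frontalface_default.xml` model and the local webcam source are stand-ins; the acquisition from the Sony SNC-RZ25P IP cameras over the LAN and the pan-tilt re-orientation commands are not shown.

```python
import cv2

face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)                       # stand-in for the IP camera stream
term_crit = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)
track_window, roi_hist = None, None

while True:
    ok, frame = cap.read()
    if not ok:
        break
    if track_window is None:
        # Detection stage: Viola-Jones cascade on the grayscale frame.
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
        if len(faces) > 0:
            x, y, w, h = faces[0]
            track_window = (x, y, w, h)
            hsv_roi = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
            roi_hist = cv2.calcHist([hsv_roi], [0], None, [180], [0, 180])
            cv2.normalize(roi_hist, roi_hist, 0, 255, cv2.NORM_MINMAX)
    else:
        # Tracking stage: CamShift on the hue back-projection of the current frame.
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        back_proj = cv2.calcBackProject([hsv], [0], roi_hist, [0, 180], 1)
        rot_rect, track_window = cv2.CamShift(back_proj, track_window, term_crit)
        (cx, cy), _, _ = rot_rect
        # The face centre (cx, cy) would drive the camera re-orientation step.
        cv2.ellipse(frame, rot_rect, (0, 255, 0), 2)
    cv2.imshow("tracking", frame)
    if cv2.waitKey(30) & 0xFF == 27:            # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```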
Abstract:
Thiodicarb, a carbamate pesticide widely used on crops, may pose several environmental and health concerns. This study aimed to explore its toxicological profile in male rats using hematological, biochemical, histopathological, and flow cytometry markers. Exposed animals were dosed daily at 10, 20, or 40 mg/kg body weight (groups A, B, and C, respectively) for 30 d. No significant changes were observed in hematological parameters among the groups. After 10 d, a decrease in total cholesterol levels was noted in rats exposed to 40 mg/kg. Aspartate aminotransferase (AST) activity increased (group A at 20 d; groups A and B at 30 d) and alkaline phosphatase (ALP) activity significantly decreased (group B at 30 d). At 30 d, a decrease was also observed in some of the other evaluated parameters: total cholesterol and urea levels in group A, as well as total protein and creatinine levels in groups A and B. Histological results demonstrated multi-organ, dose-related damage in thiodicarb-exposed animals, evidenced as hemorrhage and diffuse vacuolation in hepatic tissue; renal histology showed disorganized glomeruli and tubular cell degeneration; the spleen showed disrupted white pulp and clusters of iron deposits within the red pulp; significant cellular loss was noted in the cortex of the thymus; and degenerative changes were observed within the testis. The histopathologic alterations were most prominent in the high-dose group. Concerning the flow cytometry studies, an increase in lymphocyte numbers, especially T lymphocytes, was seen in blood samples from animals exposed to the highest dose. Taken together, these results indicate marked systemic organ toxicity in rats after subacute exposure to thiodicarb.
Abstract:
This study aimed to characterize air pollution and the associated carcinogenic risks of polycyclic aromatic hydrocarbons (PAHs) at an urban site, to identify possible emission sources of PAHs using several statistical methodologies, and to analyze the influence of other air pollutants and meteorological variables on PAH concentrations. The air quality and meteorological data were collected in Oporto, the second largest city of Portugal. Eighteen PAHs (the 16 PAHs considered by the United States Environmental Protection Agency (USEPA) as priority pollutants, dibenzo[a,l]pyrene, and benzo[j]fluoranthene) were collected daily for 24 h in air (gas phase and particles) during 40 consecutive days in November and December 2008 by constant low-flow samplers, using polytetrafluoroethylene (PTFE) membrane filters for particulate (PM10- and PM2.5-bound) PAHs and pre-cleaned polyurethane foam plugs for gaseous compounds. The other monitored air pollutants were SO2, PM10, NO2, CO, and O3; the meteorological variables were temperature, relative humidity, wind speed, total precipitation, and solar radiation. Benzo[a]pyrene reached a mean concentration of 2.02 ng m−3, surpassing the EU annual limit value. The target carcinogenic risks were close to the health-based guideline level set by USEPA (10−6) at the studied site, with the cancer risks of eight PAHs reaching 9.98×10−7 in PM10 and 1.06×10−6 in air. The applied statistical methods (correlation matrix, cluster analysis, and principal component analysis) were in agreement in the grouping of the PAHs. The groups were formed according to chemical structure (number of rings), phase distribution, and emission sources. PAH diagnostic ratios were also calculated to evaluate the main emission sources. Diesel vehicular emissions were the major source of PAHs at the studied site. Besides that source, emissions from residential heating and an oil refinery were also identified as contributors to PAH levels in the area. Additionally, principal component regression indicated that SO2, NO2, PM10, CO, and solar radiation were positively correlated with PAH concentrations, while O3, temperature, relative humidity, and wind speed were negatively correlated.
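As an outline of the statistical grouping step, the Python sketch below runs a PCA on a synthetic matrix of daily PAH concentrations and computes one commonly used diagnostic ratio; the compound list, the data and the ratio interpretation are illustrative assumptions, not the study's data set or its exact ratio selection.

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Synthetic daily concentrations for a few PAHs over 40 sampling days (abbreviated names).
rng = np.random.default_rng(1)
pahs = ["Phe", "Ant", "Flt", "Pyr", "BaA", "Chr", "BbF", "BaP"]
data = pd.DataFrame(np.abs(rng.normal(1.0, 0.5, (40, len(pahs)))), columns=pahs)

# PCA on standardized concentrations; loadings group PAHs that co-vary, suggesting common sources.
pca = PCA(n_components=2).fit(StandardScaler().fit_transform(data))
print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 2))

# One widely used diagnostic ratio; values near or above ~0.5 are often read as
# combustion-dominated, but interpretation thresholds differ between studies.
flt_pyr = data["Flt"] / (data["Flt"] + data["Pyr"])
print("mean Flt/(Flt+Pyr):", round(flt_pyr.mean(), 2))
```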
Abstract:
Embedded systems are increasingly complex and dynamic, imposing progressively higher development time and costs. Tuning a particular system for deployment is thus becoming more demanding, and even more so when considering systems which have to adapt themselves to evolving requirements and changing service requests. In this perspective, run-time monitoring of the system behaviour becomes an important requirement, allowing the actual scheduling progress and resource utilization to be captured dynamically. For this to succeed, operating systems need to expose their internal behaviour and state, making it available to external applications, and a run-time monitoring mechanism must be available. However, such a mechanism can impose a burden on the system itself if not used wisely. In this paper we explore this problem and propose a framework intended to provide this run-time mechanism whilst achieving code separation, run-time efficiency and flexibility for the end developer.
Abstract:
Adhesive bonding has become more efficient in the last few decades due to developments in adhesives, granting higher strength and ductility. On the other hand, natural fibre composites have recently gained interest due to their low cost and density. It is therefore essential to predict the fracture behaviour of joints between these materials, to assess the feasibility of joining or repairing with adhesives. In this work, the tensile fracture toughness (Gc^n) of adhesive joints between natural fibre composites is studied, considering bonding with a ductile adhesive and co-curing. Conventional methods to obtain Gc^n are used for the co-cured specimens, while for the adhesive within the bonded joint, the J-integral is considered. For the J-integral calculation, an optical measurement method is developed for the evaluation of the crack tip opening and the adherends' rotation at the crack tip during the test, supported by a Matlab sub-routine for the automated extraction of these quantities. As an output of this work, an optical method (based on the J-integral technique) that allows easier and quicker extraction of the parameters to obtain Gc^n than the available methods is proposed, and the fracture behaviour in tension of bonded and co-cured joints in jute-reinforced natural fibre composites is also provided for subsequent strength prediction. Additionally, for the adhesively-bonded joints, the tensile cohesive law of the adhesive is derived by the direct method.
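The direct-method step can be pictured as follows: once J is known as a function of the crack tip opening, the tensile cohesive law is obtained by differentiation. The Python snippet below (a stand-in for the Matlab sub-routine mentioned above) illustrates this on a synthetic J(δn) record; the curve shape, the polynomial smoothing order and all numbers are assumptions.

```python
import numpy as np

# Synthetic J-integral record versus crack tip opening dn; in the study this would come
# from the optical measurement of crack tip opening and adherend rotation during the test.
dn = np.linspace(0.0, 0.15, 60)                 # crack tip opening, mm (synthetic)
J = 1.2 * (1.0 - np.exp(-30.0 * dn))            # J-integral, N/mm (synthetic, saturating)

# Smooth J(dn) with a low-order polynomial fit before differentiating, since numerical
# differentiation of raw experimental data amplifies measurement noise.
coeffs = np.polyfit(dn, J, deg=6)
t_n = np.polyval(np.polyder(coeffs), dn)        # cohesive traction t_n = dJ/d(dn), in MPa

Gc_n = J[-1]                                    # toughness taken as the steady-state J value
print(f"Gc^n ~ {Gc_n:.2f} N/mm, peak traction ~ {t_n.max():.1f} MPa")
```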
Abstract:
In this paper, we analyse the ability of the Profibus fieldbus to cope with the real-time requirements of a Distributed Computer Control System (DCCS), where messages associated with discrete events must be made available within a bounded time. Our methodology is based on knowledge of the real-time traffic characteristics, setting the network parameters in order to cope with the timing requirements. Since non-real-time traffic characteristics are usually unknown at the design stage, we consider an operational profile in which, by constraining non-real-time traffic at the application level, we ensure that the real-time requirements are met.
Abstract:
The development of scaffolds that combine the delivery of drugs with the physical support provided by electrospun fibres holds great potential in the field of nerve regeneration. Here, the incorporation of ibuprofen, a well-known non-steroidal anti-inflammatory drug, into electrospun fibres of the statistical copolymer poly(trimethylene carbonate-co-ε-caprolactone) [P(TMC-CL)] is proposed, to serve as a drug delivery system that enhances axonal regeneration in the context of a spinal cord lesion by limiting the inflammatory response. P(TMC-CL) fibres were electrospun from mixtures of dichloromethane (DCM) and dimethylformamide (DMF). The solvent mixture applied influenced fibre morphology, as well as mean fibre diameter, which decreased as the DMF content in solution increased. Ibuprofen-loaded fibres were prepared from P(TMC-CL) solutions containing 5% ibuprofen (w/w of polymer). Increasing the drug content to 10% led to jet instability, resulting in the formation of a less homogeneous fibrous mesh. Under the optimized conditions, drug-loading efficiency was above 80%. Confocal Raman mapping showed no preferential distribution of ibuprofen in the P(TMC-CL) fibres. Under physiological conditions ibuprofen was released within 24 h. The release process was diffusion-dependent for fibres prepared from DCM solutions, in contrast to fibres prepared from DCM-DMF mixtures, where burst release occurred. The biological activity of the released drug was demonstrated using human-derived macrophages. The release of prostaglandin E2 into the cell culture medium was reduced when cells were incubated with ibuprofen-loaded P(TMC-CL) fibres, confirming the biological significance of the drug delivery strategy presented. Overall, this study constitutes an important contribution to the design of a P(TMC-CL)-based nerve conduit with anti-inflammatory properties.
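Purely as an illustration of how diffusion-dependence of a release profile might be assessed, the sketch below fits a Korsmeyer-Peppas power law to synthetic release data; this model and the data points are not taken from the paper, and release exponents are only indicative of the transport mechanism.

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic cumulative release data (fraction released versus time); NOT from the paper.
t_h = np.array([0.5, 1, 2, 4, 8, 12])                       # time, h
released = np.array([0.18, 0.26, 0.37, 0.52, 0.71, 0.85])   # cumulative fraction released

# Korsmeyer-Peppas power law M_t/M_inf = k * t**n fitted to the release record.
power_law = lambda t, k, n: k * t**n
(k, n), _ = curve_fit(power_law, t_h, released, p0=(0.2, 0.5))
print(f"k = {k:.2f}, n = {n:.2f}  (n close to 0.5 is commonly read as diffusion-dominated,"
      " but the interpretation depends on the fibre geometry)")
```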
Abstract:
IEEE 802.15.4 is the most widely used protocol for Wireless Sensor Networks (WSNs) and serves as a baseline for several higher-layer protocols such as ZigBee, 6LoWPAN or WirelessHART. When operating in beacon-enabled mode, its MAC (Medium Access Control) supports both contention-free access (CFP, based on the reservation of guaranteed time slots, GTS) and contention-based access (CAP, ruled by CSMA/CA), thus enabling the differentiation between real-time and best-effort traffic. However, some WSN applications and higher-layer protocols may strongly benefit from the possibility of supporting more traffic classes. This happens, for instance, in dense WSNs used in time-sensitive industrial applications. In this context, we propose to differentiate traffic classes within the CAP, enabling lower transmission delays and higher success probability for time-critical messages, such as those for event detection, GTS reservation and network management. Building upon a previously proposed methodology (TRADIF), in this paper we outline its implementation and experimental validation over a real-time operating system. Importantly, TRADIF is fully backward compatible with the IEEE 802.15.4 standard, enabling the creation of different traffic classes just by tuning some MAC parameters.
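The tuning-based differentiation can be sketched as per-class sets of slotted CSMA/CA parameters, as in the Python snippet below; the parameter names are standard IEEE 802.15.4 MAC attributes, but the concrete values and the class mapping are assumptions for illustration, not the TRADIF settings.

```python
# Per-class slotted CSMA/CA parameter sets: lower backoff exponents, fewer backoffs and a
# smaller contention window give earlier access attempts to the time-critical class.
TRAFFIC_CLASSES = {
    "time_critical": {"macMinBE": 1, "macMaxBE": 3, "macMaxCSMABackoffs": 2, "CW": 1},
    "best_effort":   {"macMinBE": 3, "macMaxBE": 5, "macMaxCSMABackoffs": 4, "CW": 2},
}

def csma_parameters(message_kind: str) -> dict:
    """Map a message kind (e.g. event detection, GTS request, management) to a class."""
    critical = {"event_detection", "gts_request", "network_management"}
    cls = "time_critical" if message_kind in critical else "best_effort"
    return TRAFFIC_CLASSES[cls]

print(csma_parameters("gts_request"))
```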
Abstract:
Traditional Real-Time Operating Systems (RTOS) are not designed to accommodate application-specific requirements. They address a general case, and the application must co-exist with any limitations imposed by such a design. For modern real-time applications this limits the quality of the services offered to the end-user. Research in this field has shown that it is possible to develop dynamic systems where adaptation is the key to success. However, adaptation requires full knowledge of the system state. To overcome this, we propose a framework to gather data and interact with the operating system, extending the traditional POSIX trace model with a partial reflective model. Such a combination still preserves the trace mechanism semantics while creating a powerful platform for developing new dynamic systems, with little impact on the system and avoiding complex changes in the kernel source code.
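As a purely conceptual analogy of the two ingredients combined above (a trace stream of scheduling events plus a reflective view of system state), the Python sketch below shows a bounded event buffer with a state-query interface; it is not the POSIX trace API nor the proposed framework, which operate at the operating-system level.

```python
from collections import deque
from dataclasses import dataclass, field
import time

@dataclass
class TraceStream:
    capacity: int = 256
    events: deque = field(default_factory=deque)

    def emit(self, event_type: str, **data):
        # Bounded buffer keeps the monitoring overhead on the traced system predictable.
        if len(self.events) == self.capacity:
            self.events.popleft()
        self.events.append({"t": time.monotonic(), "type": event_type, **data})

    def snapshot(self, task_id=None):
        # Reflective view: observers query recorded state without touching internals.
        return [e for e in self.events if task_id is None or e.get("task") == task_id]

trace = TraceStream()
trace.emit("task_switch", task="sensor_poll", deadline_ms=10)
trace.emit("deadline_met", task="sensor_poll", slack_ms=2)
print(trace.snapshot(task_id="sensor_poll"))
```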