37 results for Comparative mapping
at Instituto Politécnico do Porto, Portugal
Abstract:
The main purpose of this study was to examine the applicability of geostatistical modeling to obtain valuable information for assessing the environmental impact of sewage outfall discharges. The data set used was obtained in a monitoring campaign at the S. Jacinto outfall, located off the Portuguese west coast near the Aveiro region, using an AUV. Matheron's classical estimator was used to compute the experimental semivariogram, which was fitted to three theoretical models: spherical, exponential and Gaussian. The cross-validation procedure suggested the best semivariogram model, and ordinary kriging was used to obtain predictions of salinity at unknown locations. The generated map clearly shows the plume dispersion in the studied area, indicating that the effluent does not reach the nearby beaches. Our study suggests that a design of the AUV sampling trajectory that is optimal from a geostatistical prediction point of view can help to compute more precise predictions and hence to quantify dilution more accurately. Moreover, since accurate measurements of plume dilution are rare, such studies might be very helpful in the future for the validation of dispersion models.
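As an illustration of the estimator named above, the sketch below computes an experimental semivariogram with Matheron's classical estimator, where gamma(h) is half the mean squared difference of values over all point pairs whose separation falls in the lag bin around h. The binning, synthetic salinity values and sample layout are illustrative assumptions, not the campaign data.

```python
import numpy as np

def experimental_semivariogram(coords, values, n_bins=15, max_lag=None):
    """Matheron's classical estimator: for each lag bin, gamma(h) is half the
    mean squared difference of values over all point pairs in that bin."""
    coords = np.asarray(coords, dtype=float)
    values = np.asarray(values, dtype=float)
    diff = coords[:, None, :] - coords[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    sqdiff = (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)          # unique pairs only
    dist, sqdiff = dist[iu], sqdiff[iu]
    if max_lag is None:
        max_lag = dist.max() / 2.0
    edges = np.linspace(0.0, max_lag, n_bins + 1)
    lags, gammas = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (dist >= lo) & (dist < hi)
        if in_bin.any():
            lags.append(dist[in_bin].mean())
            gammas.append(0.5 * sqdiff[in_bin].mean())
    return np.array(lags), np.array(gammas)

# synthetic example: 200 salinity samples along a hypothetical AUV trajectory
rng = np.random.default_rng(0)
xy = rng.uniform(0.0, 1000.0, size=(200, 2))        # positions (m)
salinity = 35.0 + 0.002 * xy[:, 0] + rng.normal(0.0, 0.05, 200)
lags, gammas = experimental_semivariogram(xy, salinity)
print(np.round(lags[:3], 1), np.round(gammas[:3], 4))
```

A spherical, exponential or Gaussian model would then be fitted to these (lag, gamma) pairs, and the cross-validated winner used for ordinary kriging, as described in the abstract.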
Abstract:
PURPOSE: To analyze and compare the ground reaction forces (GRF) during the stance phase of walking in pregnant women in the 3rd trimester of pregnancy and in non-pregnant women. METHODS: 20 women, 10 pregnant and 10 non-pregnant, voluntarily took part in this study. GRF were measured (1000 Hz) using a force platform (BERTEC 4060-15), an amplifier (BERTEC AM 6300) and a 16-bit analog-to-digital converter (Biopac). RESULTS: The study showed significant differences between the two groups in the absolute duration of the stance phase. Regarding the normalized values, the most significant differences were found in the maximum values of the vertical force (Fz3, Fz1), in the impulse of the antero-posterior force (Fy2), in the growth rates of the vertical force, and in the time at which the antero-posterior force (Fy) becomes null. CONCLUSIONS: It is easier for pregnant women to continue the forward movement (push-off phase). The smaller growth rates of the maximum vertical force (Fz1) in pregnant women can be associated with a slower gait speed, as an adaptation strategy to maintain balance and to compensate for the alterations in the position of the center of gravity due to the load increase. The data related to the antero-posterior component of the force (Fy) show a significant difference between the pregnant woman's left and right foot, indicating a different functional behavior of each foot during the propulsion phase (TS).
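As a hedged illustration of how such stance-phase metrics can be extracted from force-plate data sampled at 1000 Hz, the sketch below computes the peak vertical force normalised to body weight, the braking and propulsive impulses of Fy, and the instant at which Fy becomes null. The sign conventions and synthetic data are assumptions, not the study's processing pipeline.

```python
import numpy as np

def _trapz(y, t):
    """Trapezoidal integration, written out to avoid NumPy API differences."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(t)))

def stance_metrics(fz, fy, body_weight_n, fs=1000.0):
    """Peak vertical force (in body weights), stance time, Fy impulses and the
    instant at which the antero-posterior force Fy crosses zero."""
    t = np.arange(len(fz)) / fs
    crossing = np.where(np.diff(np.sign(fy)) > 0)[0]   # braking -> propulsion
    return {
        "Fz_peak_bw": float(fz.max()) / body_weight_n,
        "stance_time_s": float(t[-1]),
        "braking_impulse_Ns": _trapz(np.clip(fy, None, 0.0), t),
        "propulsive_impulse_Ns": _trapz(np.clip(fy, 0.0, None), t),
        "t_Fy_zero_s": float(t[crossing[0]]) if crossing.size else float("nan"),
    }

# purely synthetic stance phase (~0.6 s), for illustration only
t = np.linspace(0.0, 0.6, 600)
fz = 800.0 * np.sin(np.pi * t / 0.6)        # vertical force (N)
fy = -150.0 * np.sin(2 * np.pi * t / 0.6)   # braking then propulsion (N)
print(stance_metrics(fz, fy, body_weight_n=650.0))
```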
Abstract:
This paper presents a Game Theory-based methodology to allocate transmission costs, considering cooperation and competition between producers. As an original contribution, it determines the degree of participation in the additional costs according to demand behavior. A comparative study was carried out between the results obtained using the Nucleolus and the Shapley Value and those of other techniques, such as the Averages Allocation method and the Generalized Generation Distribution Factors (GGDF) method. As an example, a six-node network was used for the simulations. The results demonstrate the ability of these methods to find adequate solutions in an open-access network environment.
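For reference, the Shapley Value mentioned above averages each player's marginal cost contribution over all orderings of the players; the sketch below implements that definition for a small, entirely illustrative three-producer cost game (the characteristic-function values are made up and are not the six-node case study).

```python
from itertools import permutations
from math import factorial

def shapley_value(players, cost):
    """Shapley value of a cooperative cost game: each player's average
    marginal cost contribution over all orderings of the players."""
    phi = {p: 0.0 for p in players}
    for order in permutations(players):
        coalition = frozenset()
        for p in order:
            phi[p] += cost(coalition | {p}) - cost(coalition)
            coalition = coalition | {p}
    n_orders = factorial(len(players))
    return {p: round(v / n_orders, 2) for p, v in phi.items()}

# entirely illustrative three-producer transmission-cost game
costs = {
    frozenset(): 0.0,
    frozenset({"G1"}): 30.0, frozenset({"G2"}): 40.0, frozenset({"G3"}): 50.0,
    frozenset({"G1", "G2"}): 60.0, frozenset({"G1", "G3"}): 70.0,
    frozenset({"G2", "G3"}): 80.0, frozenset({"G1", "G2", "G3"}): 100.0,
}
print(shapley_value(["G1", "G2", "G3"], lambda s: costs[s]))
```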
Abstract:
Considerable research effort has been devoted to the topic of optimal planning of distribution systems. The non-linear nature of the system, the need to consider a large number of scenarios and the increasing necessity to deal with uncertainties make optimal planning of distribution systems a difficult task. Heuristic approaches have been proposed to deal with these issues, overcoming some of the inherent difficulties of classic methodologies. This paper considers several methodologies used to address planning problems of electrical power distribution networks, namely mixed-integer linear programming (MILP), ant colony algorithms (AC), genetic algorithms (GA), tabu search (TS), branch exchange (BE), simulated annealing (SA) and the Benders decomposition deterministic non-linear optimization technique (BD). The adequacy of these techniques to deal with uncertainties is discussed. The behaviour of each optimization technique is compared in terms of the obtained solution and of the performance of the methodology. The paper presents results of the application of these optimization techniques to a real case of a 10-kV electrical distribution system with 201 nodes that feeds an urban area.
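Of the metaheuristics surveyed, simulated annealing has the most compact core loop; the sketch below is a generic, illustrative skeleton of it. The cost function, neighbourhood move and cooling schedule are placeholder assumptions, not the distribution-planning model used in the paper.

```python
import math
import random

def simulated_annealing(initial, cost, neighbour, t0=10.0, alpha=0.999, iters=2000):
    """Generic simulated annealing: accept a worse neighbour with probability
    exp(-delta / T) and cool the temperature geometrically each iteration."""
    current, c_cur = initial, cost(initial)
    best, c_best = current, c_cur
    temp = t0
    for _ in range(iters):
        candidate = neighbour(current)
        c_cand = cost(candidate)
        delta = c_cand - c_cur
        if delta < 0 or random.random() < math.exp(-delta / temp):
            current, c_cur = candidate, c_cand
            if c_cur < c_best:
                best, c_best = current, c_cur
        temp *= alpha
    return best, c_best

# toy usage: minimise a one-dimensional integer cost with +/-1 moves
print(simulated_annealing(initial=20,
                          cost=lambda x: (x - 3) ** 2,
                          neighbour=lambda x: x + random.choice([-1, 1])))
```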
Abstract:
Purpose – The aim of this article is to present some results from research undertaken into the information behaviour of European Documentation Centre (EDC) users. It reflects on the practices of a group of 234 users of 55 EDCs, covering 21 Member States of the European Union (EU), who use the centres to access European information. Design/methodology/approach – In order to collect the data presented here, five questionnaires were sent to users in all the EDCs in Finland, Ireland, Hungary and Portugal. In the remaining EU countries, five questionnaires were sent to two EDCs chosen at random. The questionnaires were sent by post, following telephone contact with the EDC managers. Findings – Factors determining access to information on the European Union and the frequency of this access are identified. The information providers most commonly used to access European information and the information sources considered the most reliable by respondents are also analysed. Another area of analysis concerns the factors cited by respondents as facilitating access to information on Europe or, conversely, making it more difficult to access. In parallel, the aspects of accessing information on the EU that are valued most by users are also assessed. Research limitations/implications – Questionnaires had to be used, as the intention was to cover a very extensive geographical area. However, in opting for closed questions, it is acknowledged that standard responses have been obtained with no scope for capturing the individual circumstances of each respondent, thus making a qualitative approach difficult. Practical implications – The results provide an overall picture of certain aspects of the information behaviour of EDC users. They may serve as a starting point for planning training sessions designed to develop the skills required to search for, access, evaluate and apply European information within an academic context. From a broader perspective, they also constitute factors which the European Commission should take into consideration when formulating its information and communication policy. Originality/value – This is the first piece of academic research into the EDCs and their users that aims to cover all Member States of the EU.
Abstract:
Nowadays, new products claiming antioxidant properties are becoming more frequent. However, information about this topic on their labels is usually scarce. In this paper, we analyzed the total phenolics, total flavonoids and ascorbic acid contents, as well as the DPPH scavenging activity, of several commercial samples available in the Portuguese market, namely green tea and other herbal infusions, dietary supplements, and fruit juices. In general, beverages containing green tea and hibiscus showed higher phenolics contents (including flavonoids) and antioxidant activity than those without these ingredients. A borututu infusion presented the lowest concentrations of bioactive compounds and the lowest scavenging activity, due to the low recommended amount of plant material for preparing the beverage. Some juices without antioxidant claims on the label presented values similar to those with such claims.
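The DPPH scavenging activity reported above is conventionally expressed as the percentage decrease in absorbance of the DPPH solution relative to a control; a minimal sketch follows, with illustrative absorbance readings (the exact protocol used in the paper is not restated here).

```python
def dpph_scavenging(abs_control, abs_sample):
    """DPPH radical-scavenging activity (%) as the relative drop in absorbance
    of the DPPH solution after reaction with the sample."""
    return 100.0 * (abs_control - abs_sample) / abs_control

# illustrative absorbance readings (control vs. green tea infusion)
print(round(dpph_scavenging(abs_control=0.820, abs_sample=0.245), 1))
```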
Abstract:
Introduction: Although relative uptake values are not the most important objective of a 99mTc-DMSA scan, they provide important quantitative information. In most dynamic renal scintigraphies, attenuation correction is essential to obtain a reliable result from the quantification process. In DMSA scans, however, the absence of significant background and the lower attenuation in pediatric patients mean that attenuation correction techniques are usually not applied. The geometric mean is the most common method, but it requires the acquisition of an additional anterior projection, which is not acquired by a large number of NM departments. This method and the attenuation factors proposed by Tonnesen were correlated with the absence of attenuation correction procedures. Material and Methods: Images from 20 individuals (aged 3 +/- 2 years) were used and the two attenuation correction methods were applied. The mean acquisition time (time post DMSA administration) was 3.5 +/- 0.8 h. Results: The absence of attenuation correction showed a good correlation with both attenuation correction methods (r=0.73 +/- 0.11), and the mean difference in the uptake values between the methods was 4 +/- 3. The correlation was higher for younger patients. The two attenuation correction methods correlated more strongly with each other than with the "no attenuation correction" method (r=0.82 +/- 0.8), and the mean difference in the uptake values was 2 +/- 2. Conclusion: The decision not to apply any attenuation correction method can be justified by the minor differences found in the relative kidney uptake values. Nevertheless, if an accurate value of the relative kidney uptake is required, an attenuation correction method should be used. The attenuation correction factors proposed by Tonnesen are easy to apply and thus become a practical alternative, namely when the anterior projection, needed for the geometric mean methodology, is not acquired.
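A minimal sketch of the two relative-uptake calculations being compared (geometric mean versus posterior-only, i.e. no attenuation correction) is given below; background subtraction is omitted, in line with the abstract's remark that background is not significant in DMSA, and the ROI counts are illustrative.

```python
import math

def relative_uptake_geometric_mean(post_left, post_right, ant_left, ant_right):
    """Relative kidney uptake (%) using the geometric mean of posterior and
    anterior ROI counts for each kidney: GM = sqrt(posterior * anterior)."""
    gm_left = math.sqrt(post_left * ant_left)
    gm_right = math.sqrt(post_right * ant_right)
    total = gm_left + gm_right
    return 100.0 * gm_left / total, 100.0 * gm_right / total

def relative_uptake_posterior_only(post_left, post_right):
    """Relative kidney uptake (%) with no attenuation correction."""
    total = post_left + post_right
    return 100.0 * post_left / total, 100.0 * post_right / total

# illustrative ROI counts (left/right kidney, posterior and anterior projections)
print(relative_uptake_geometric_mean(15200, 13800, 14100, 12600))
print(relative_uptake_posterior_only(15200, 13800))
```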
Abstract:
The purpose of this work was to assess the acute toxicity in male mice of a chromated copper arsenate (CCA) solution, a widespread wood preservative used in the building industry until 2002. Animals were subcutaneously injected with CCA (7.2 mg/kg body weight of arsenic and 10.2 mg/kg of chromium), CrO3 (10.2 mg/kg), As2O5 (7.2 mg/kg) and NaCl (0.9%) per se, for 48 h and 96 h, for histopathology, histochemistry, and chromium and arsenic analysis. The results showed some histopathological changes within the lumen of the renal tubules of CCA-exposed animals (at 48 h and 96 h) and of CrO3-exposed animals (at 96 h). Furthermore, the renal levels of arsenic and chromium in treated animals were significantly higher than in controls. Although the same amounts of pentavalent arsenic and hexavalent chromium were injected into the animals treated with CCA and with the prepared As2O5 and CrO3 solutions, a different distribution pattern of these compounds was observed in the kidneys.
Abstract:
Introduction: Nowadays, the concept of ontology (an explicit specification of a conceptualization [Gruber, 1993]) is a key concept in knowledge-based systems in general and in the Semantic Web in particular. However, software agents do not always agree on the same conceptualization, which justifies the existence of several ontologies, even when they address the same domain of discourse. To solve or minimize the interoperability problem between these agents, ontology mapping has proved to be a good solution. Ontology mapping is the process in which semantic relations are specified between entities of the source and target ontologies at the conceptual level; these relations can in turn be used to transform instances based on the source ontology into instances based on the target ontology.

Motivation: In a dynamic environment such as the Semantic Web, agents change not only their data but also their structure and semantics (ontologies). This process, called ontology evolution, can be defined as the timely adaptation of an ontology to changes arising in the domain or in the objectives of the ontology itself, together with the consistent management of those changes [Stojanovic, 2004], and it may sometimes leave the mapping document inconsistent. In heterogeneous environments where interoperability between systems depends on the mapping document, the document must reflect the changes made to the ontologies. In this case there are two solutions: (i) generate a new mapping document (a demanding process in terms of time and computational resources) or (ii) adapt the mapping document, correcting invalid semantic relations and creating new relations where necessary (a process less demanding in terms of time and computational resources, but highly dependent on information about the changes made). The main objective of this work is the analysis, specification and development of the mapping-document evolution process, so that it reflects the changes made during the ontology evolution process.

Context: This work was developed in the context of the MAFRA Toolkit. The MAFRA (MApping FRAmework) Toolkit is an application developed at GECAD that allows the declarative specification of semantic relations between entities of a source ontology and a target ontology, using the following main components: Concept Bridge, which represents a semantic relation between a source concept and a target concept; Property Bridge, which represents a semantic relation between one or more source properties and one or more target properties; and Service, which is applied to the Semantic Bridges (Property and Concept Bridges) and defines how source instances should be transformed into target instances. These concepts are specified in the SBO (Semantic Bridge Ontology) [Silva, 2004]. In the context of this work, a mapping document is an instantiation of the SBO, containing semantic relations between entities of the source and target ontologies.

Mapping evolution process: The mapping evolution process is the process in which the entities of the mapping document are adapted to reflect any changes in the mapped ontologies, preserving as far as possible the semantics of the specified semantic relations. If the source and/or target ontologies change, some semantic relations may become invalid, or new relations may be needed; this process therefore comprises two sub-processes: (i) correction of semantic relations and (ii) processing of new ontology entities. Processing new ontology entities requires the discovery and computation of similarities between entities and the specification of relations according to the SBO ontology/language. These phases ("similarity measure" and "semantic bridging") are implemented in the MAFRA Toolkit; the (semi-)automatic ontology mapping process is described in [Silva, 2004]. The correction of invalid SBO entities requires a good knowledge of the SBO ontology/language, its entities and relations, and all of its constraints, i.e. its structure and semantics. This procedure consists of (i) identifying the invalid SBO entities, (ii) determining the cause of their invalidity and (iii) correcting them in the best possible way. In this phase, information from the ontology evolution process was used with the aim of improving the quality of the whole process.

Conclusions: Beyond the mapping evolution process that was developed, one of the most important outcomes of this work was the acquisition of deeper knowledge about ontologies, ontology evolution, mapping, etc., broadening horizons and raising awareness of the complexity of the problem at hand, which allows new challenges to be anticipated for the future.
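The "similarity measure" phase mentioned above can be illustrated with a deliberately simple lexical measure. The sketch below is not MAFRA's actual similarity computation; the entity names and threshold are illustrative assumptions.

```python
from difflib import SequenceMatcher

def name_similarity(a, b):
    """Very simple lexical similarity between two entity names (0..1)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def propose_bridges(new_source_entities, target_entities, threshold=0.6):
    """Pair each entity added to the source ontology with its most similar
    target entity, as a candidate Concept Bridge for the mapping document."""
    proposals = []
    for src in new_source_entities:
        best = max(target_entities, key=lambda tgt: name_similarity(src, tgt))
        score = name_similarity(src, best)
        if score >= threshold:
            proposals.append((src, best, round(score, 2)))
    return proposals

# hypothetical entities added after an ontology evolution step
print(propose_bridges(["PostalAddress", "PhoneNumber"],
                      ["Address", "TelephoneNumber", "Person"]))
```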
Abstract:
This study modeled the impact on freshwater ecosystems of pharmaceuticals detected in biosolids following application to agricultural soils. The detected sulfonamides and hydrochlorothiazide displayed comparatively moderate retention in solid matrices and, therefore, higher transfer fractions from biosolids to the freshwater compartment. However, the residence times of these pharmaceuticals in freshwater were estimated to be short due to abiotic degradation processes. The non-steroidal anti-inflammatory drug mefenamic acid had the highest environmental impact on aquatic ecosystems and warrants further investigation. The estimate of the solid-water partitioning coefficient was generally the most influential parameter of the probabilistic comparative impact assessment. These results, and the modeling approach used in this study, serve to prioritize pharmaceuticals in the research effort to assess the risks and environmental impacts of these emerging pollutants on aquatic biota.
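As a rough illustration of why the solid-water partitioning coefficient is so influential, equilibrium partitioning gives the dissolved fraction of a compound directly from Kd and the suspended-solids concentration; the sketch below uses illustrative parameter values, not the study's data or its full multimedia model.

```python
def dissolved_fraction(kd_l_per_kg, suspended_solids_kg_per_l):
    """Equilibrium fraction of a compound in the dissolved (water) phase for a
    solid-water partitioning coefficient Kd (L/kg) and a suspended-solids
    concentration (kg/L): f_water = 1 / (1 + Kd * SS)."""
    return 1.0 / (1.0 + kd_l_per_kg * suspended_solids_kg_per_l)

# illustrative values: weakly sorbing sulfonamide vs. a strongly sorbing compound
print(dissolved_fraction(kd_l_per_kg=2.0, suspended_solids_kg_per_l=30e-6))
print(dissolved_fraction(kd_l_per_kg=500.0, suspended_solids_kg_per_l=30e-6))
```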
Abstract:
This paper presents some results of a survey on access to European information among a group of 234 users of 55 European Documentation Centres (EDCs) from 21 European Union (EU) Member States. The findings of the questionnaire sent to 88 EDC managers from 26 EU Member States are also analysed. Two different points of view are compared regarding the reasons for accessing European information, the aspects valued during that access and the use of European databases.
Abstract:
Many-core platforms based on Networks-on-Chip (NoC [Benini and De Micheli 2002]) represent an emerging technology in the real-time embedded domain. Although the idea of grouping applications previously executed on separate single-core devices and accommodating them on a single many-core chip offers various options for power savings and cost reductions, and contributes to overall system flexibility, its implementation is a non-trivial task. In this paper we address the issue of application mapping onto a NoC-based many-core platform, considering the fundamentals and trends of current many-core operating systems; specifically, we elaborate on a limited migrative application model encompassing a message-passing paradigm as a communication primitive. As the main contribution, we formulate the problem of real-time application mapping and propose a three-stage process to solve it efficiently. Through analysis, it is assured that the derived solutions guarantee the fulfilment of the posed time constraints regarding worst-case communication latencies, and at the same time provide an environment in which to perform load balancing, e.g. for thermal, energy, fault-tolerance or performance reasons. We also propose several constraints regarding the topological structure of the application mapping, as well as the inter- and intra-application communication patterns, which efficiently resolve the issues of pessimism and/or intractability that arise when performing the analysis.
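The mapping problem itself can be illustrated with a much simpler heuristic than the three-stage process proposed in the paper. The sketch below is a hypothetical greedy placement on a 2D-mesh NoC that minimises traffic-weighted hop distance; the task names, traffic volumes and XY-routing hop metric are illustrative assumptions.

```python
def manhattan_hops(a, b):
    """Hop distance between two tiles of a 2D-mesh NoC under XY routing."""
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def greedy_map(tasks, traffic, mesh_w, mesh_h):
    """Place each task on the free tile that minimises traffic-weighted hop
    distance to the tasks already placed (tasks should be ordered by traffic)."""
    free = [(x, y) for x in range(mesh_w) for y in range(mesh_h)]
    placement = {}
    for task in tasks:
        def cost(tile):
            total = 0
            for (src, dst), volume in traffic.items():
                if src == task and dst in placement:
                    total += volume * manhattan_hops(tile, placement[dst])
                elif dst == task and src in placement:
                    total += volume * manhattan_hops(tile, placement[src])
            return total
        best = min(free, key=cost)
        placement[task] = best
        free.remove(best)
    return placement

# hypothetical task graph: (source, destination) -> message volume
traffic = {("t0", "t1"): 10, ("t1", "t2"): 4, ("t0", "t3"): 1}
print(greedy_map(["t0", "t1", "t2", "t3"], traffic, mesh_w=2, mesh_h=2))
```

A real-time variant would additionally reject any placement whose worst-case communication latency bound exceeds the posed time constraints, which is where the analysis described in the abstract comes in.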
Abstract:
In recent years, several solutions have been proposed to extend PROFIBUS in order to support wired and wireless stations in the same network. In this paper we compare two of those solutions: one in which the interconnection between wired and wireless stations is made by repeaters, and another in which it is made by bridges. The comparison is both qualitative and numerical, based on simulation models of both architectures.
Abstract:
The characteristics of carbon fiber-reinforced plastics allow a very broad range of uses. Drilling is often necessary to assemble different components, but it can lead to various forms of damage, of which delamination is the most severe. A reduced thrust force, however, can decrease the risk of delamination. In this work, two variables of the drilling process were compared, tool material and tool geometry, together with the effect of feed rate and cutting speed. The parameters analyzed include thrust force, delamination extension and mechanical strength, the latter evaluated through open-hole tensile, bearing and flexural tests on drilled plates. The present work shows that a proper combination of all the factors involved in drilling operations, such as tool material, tool geometry and cutting parameters like feed rate and cutting speed, can reduce delamination damage and, consequently, enhance the mechanical properties of laminated parts in complex structures, as evaluated by open-hole, bearing and flexural tests.
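Delamination extension is commonly quantified with the conventional delamination factor Fd = Dmax / D0; the one-line sketch below uses this standard metric with illustrative diameters and is not necessarily the exact measure adopted in the paper.

```python
def delamination_factor(d_max_mm, d_nominal_mm):
    """Conventional delamination factor Fd = Dmax / D0, where Dmax is the
    maximum diameter of the delaminated zone and D0 the nominal hole diameter."""
    return d_max_mm / d_nominal_mm

print(round(delamination_factor(d_max_mm=6.4, d_nominal_mm=6.0), 3))  # illustrative
```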
Abstract:
Different heating systems have been used in pultrusion, where the most widely used heaters are planar resistances. The primary objective of this study was to develop an improved heating system and compare its performance with that of a system with planar resistances. In this study, thermography was used to better understand the temperature profile along the die. Finite element analysis was performed to determine the amount of energy consumed by the heating systems. Improvements were made to the die to test the new heating system, and it was found that the new system reduced the setup time and energy consumption by approximately 57%.