64 results for Complex Effective Porosity
Abstract:
Animal locomotion is a complex process involving the central pattern generators (CPGs: neural networks, located in the spinal cord, that produce rhythmic patterns), the brainstem command systems, the steering and posture control systems, and the top-layer structures that decide which motor primitive is activated at a given time. Pinto and Golubitsky studied an integer-order CPG model for leg rhythms in bipeds, a network of four coupled identical oscillators with dihedral symmetry. This paper considers a new complex-order CPG model for locomotion in bipeds. The complex-order derivative D^(α±jβ), with α, β ∈ ℝ⁺ and j = √−1, generalizes the concept of the integer-order derivative, which is recovered for α = 1, β = 0. Parameter regions where periodic solutions, identified with leg rhythms in bipeds, occur are analyzed. The variation of the amplitude and period of the periodic solutions with the complex order of the derivative is also examined.
Abstract:
A procedure for coupling mesoscale and CFD codes is presented, enabling the inclusion of realistic stratification flow regimes and boundary conditions in CFD simulations relevant to site and resource assessment studies in complex terrain. Two distinct techniques are derived: (i) in the first, boundary conditions are extracted from mesoscale results to produce time-varying CFD solutions; (ii) in the second, a statistical treatment of mesoscale data leads to steady-state flow boundary conditions believed to be more representative than the idealised profiles that are current industry practice. Results are compared with measured data and with traditional CFD approaches.
Abstract:
Redundant manipulators allow trajectory optimization, obstacle avoidance, and the resolution of singularities. For this type of manipulator, kinematic control algorithms adopt generalized inverse matrices, which may lead to unpredictable responses. Motivated by these problems, this paper studies the complexity revealed by the trajectory planning scheme when controlling redundant manipulators. The results reveal fundamental properties of the chaotic phenomena and give deeper insight towards the development of superior trajectory control algorithms.
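As an illustration of the kind of generalized-inverse kinematic control referred to above, the following sketch resolves joint velocities for a hypothetical planar three-link redundant arm with the Moore-Penrose pseudoinverse of the Jacobian; the link lengths, task velocity and integration step are illustrative assumptions, not the paper's setup.

```python
import numpy as np

# Hypothetical planar 3R arm (redundant for a 2-D positioning task):
# link lengths are illustrative assumptions, not taken from the paper.
L = np.array([0.4, 0.3, 0.2])

def jacobian(q):
    """2x3 Jacobian of the end-effector position w.r.t. the joint angles."""
    c = np.cumsum(q)                               # absolute link angles
    Jx = -np.array([L[i:] @ np.sin(c[i:]) for i in range(3)])
    Jy =  np.array([L[i:] @ np.cos(c[i:]) for i in range(3)])
    return np.vstack([Jx, Jy])

def resolved_rate_step(q, xdot, dt=0.01):
    """One step of pseudoinverse (generalized-inverse) velocity control."""
    qdot = np.linalg.pinv(jacobian(q)) @ xdot      # minimum-norm joint rates
    return q + dt * qdot

q = np.array([0.3, 0.4, 0.5])
for _ in range(100):                               # track a constant Cartesian velocity
    q = resolved_rate_step(q, np.array([0.05, 0.0]))
print(np.round(q, 3))
```

The repeated closed-loop application of such minimum-norm solutions is the setting in which the paper analyses the resulting joint-space complexity.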
Abstract:
In this article we describe several methods for the discretization of the differintegral operator s^α, where α = u + jv is a complex value. The concept of the conjugated-order differintegral is also introduced, which enables the use of complex-order differintegrals while still producing real-valued time responses and transfer functions. The performance of the resulting approximations is analysed in both the time and frequency domains. Several results are presented that demonstrate the utility of these approximations in control system design.
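To make the conjugated-order idea concrete, the sketch below uses a Grünwald-Letnikov-style truncated series to approximate D^(u+jv) and D^(u-jv) of a real signal and averages the two; the combined response is real-valued, as the abstract states. This is only a numerical illustration under assumed signal and order values, not one of the specific discretization schemes analysed in the article.

```python
import numpy as np

def gl_weights(alpha, n):
    """Grünwald-Letnikov weights w_k = (-1)^k C(alpha, k); alpha may be complex."""
    w = np.empty(n, dtype=complex)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (1.0 - (alpha + 1.0) / k)
    return w

def gl_derivative(x, alpha, h):
    """Approximate D^alpha x on a uniform grid with step h (truncated GL sum)."""
    w = gl_weights(alpha, len(x))
    y = np.array([w[:k + 1] @ x[k::-1] for k in range(len(x))])
    return y * h ** (-alpha)

h = 0.01
t = np.arange(0, 2, h)
x = np.sin(2 * np.pi * t)                  # real test signal (assumed for illustration)
u, v = 0.8, 0.4                            # complex order u + jv (assumed values)

d_plus = gl_derivative(x, u + 1j * v, h)   # D^{u+jv} x
d_minus = gl_derivative(x, u - 1j * v, h)  # D^{u-jv} x
conjugated = 0.5 * (d_plus + d_minus)      # conjugated-order combination

print(np.max(np.abs(conjugated.imag)))     # ~0: the combined response is real-valued
```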
Abstract:
This paper studies the dynamics of foot–ground interaction in hexapod locomotion systems. To that end, the robot motion is characterized in terms of several locomotion variables and the ground is modelled through a non-linear spring-dashpot system with parameters based on studies of soil mechanics. Moreover, an algorithm with foot-force feedback is adopted to control the robot locomotion. A set of model-based experiments reveals the influence of the locomotion velocity on the foot–ground transfer function, which presents complex-order dynamics.
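A common way to realize a non-linear spring-dashpot ground model of this kind is a Hunt-Crossley-type normal force in which both the elastic and the damping terms depend on the foot penetration depth; the sketch below shows that general form with placeholder parameters, not the soil-mechanics values adopted in the paper.

```python
def ground_reaction_force(penetration, penetration_rate,
                          k=2.0e5, b=1.0e4, n=1.5):
    """Non-linear spring-dashpot (Hunt-Crossley style) normal ground force.

    penetration      : foot penetration depth into the soil [m] (>= 0)
    penetration_rate : rate of change of the penetration [m/s]
    k, b, n          : illustrative stiffness, damping and exponent values
    """
    if penetration <= 0.0:
        return 0.0                                      # foot not in contact
    spring = k * penetration ** n                       # non-linear elastic term
    damper = b * penetration ** n * penetration_rate    # penetration-dependent damping
    return spring + damper

# Example: 5 mm penetration, foot still moving downwards at 0.1 m/s
print(ground_reaction_force(0.005, 0.1))
```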
Abstract:
The main purpose of this work was the development of procedures for the simulation of atmospheric flows over complex terrain using OpenFOAM. To this end, tools and procedures external to this code were developed for pre-processing and data extraction, and were thereafter applied in the simulation of a real case. For the generation of the computational domain, a systematic method able to translate the terrain elevation model into a native OpenFOAM format (blockMeshDict) was developed. The outcome was a structured mesh in which the user can define the number of control volumes and their dimensions. With this procedure, the difficulties of case set-up and the high computational effort reported in the literature for snappyHexMesh, the OpenFOAM utility explored until then for this task, were considered to be overcome. The procedures developed for the generation of boundary conditions allowed the automatic creation of idealized inlet vertical profiles, the definition of wall-function boundary conditions and the calculation of internal-field first guesses for the iterative solution process, taking as input experimental data supplied by the user. The applicability of the generated boundary conditions was limited to the simulation of turbulent, steady-state, incompressible and neutrally stratified atmospheric flows, always resorting to RANS (Reynolds-averaged Navier-Stokes) models. For the modelling of terrain roughness, the developed procedure allowed the user to define idealized conditions, such as a uniform aerodynamic roughness length or a value varying as a function of characteristic topographic parameters, or to use real site data, and it was complemented by techniques for the visual inspection of the generated roughness maps. The absence of a forest canopy model limited the applicability of this procedure to low aerodynamic roughness lengths. The developed tools and procedures were then applied in the simulation of a neutrally stratified atmospheric flow over the Askervein hill. In the simulations performed, the sensitivity of the solution to different convection schemes, mesh dimensions, ground roughness and formulations of the k-ε and k-ω models was evaluated. When compared with experimental data, the calculated values showed good agreement of the speed-up at the hill top and on the lee side, with a relative error of less than 10% at a height of 10 m above ground level. Turbulent kinetic energy was considered to be well simulated on the windward side and at the hill top, and poorly predicted on the lee side, where a zone of flow separation was also identified. Although more work is needed to evaluate the importance of the downstream recirculation zone for the quality of the results, the agreement between the calculated and experimental values and the sensitivity of OpenFOAM to the tested parameters were considered to be generally in line with the simulations presented in the reviewed bibliographic sources.
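For a neutrally stratified RANS setup of this kind, the idealized inlet vertical profiles are commonly the log-law velocity profile with consistent k and ε distributions (the Richards-Hoxey relations). The sketch below generates such profiles from a reference wind speed and an aerodynamic roughness length; it illustrates that standard formulation and is not necessarily the exact implementation developed in this work.

```python
import numpy as np

KAPPA = 0.41      # von Karman constant
C_MU = 0.09       # standard k-epsilon model constant

def neutral_abl_inlet(z, u_ref, z_ref, z0):
    """Log-law inlet profiles for a neutral atmospheric boundary layer.

    z     : heights above ground [m]
    u_ref : wind speed at reference height z_ref [m/s]
    z0    : aerodynamic roughness length [m]
    Returns (U, k, epsilon) following the usual Richards-Hoxey relations.
    """
    u_star = KAPPA * u_ref / np.log((z_ref + z0) / z0)   # friction velocity
    U = u_star / KAPPA * np.log((z + z0) / z0)
    k = np.full_like(z, u_star**2 / np.sqrt(C_MU))
    eps = u_star**3 / (KAPPA * (z + z0))
    return U, k, eps

# Illustrative values: 10 m/s at 10 m height over grassland-like roughness
z = np.linspace(0.0, 200.0, 11)
U, k, eps = neutral_abl_inlet(z, u_ref=10.0, z_ref=10.0, z0=0.03)
print(np.round(U, 2))
```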
Abstract:
Consolidation consists in scheduling multiple virtual machines onto fewer servers in order to improve resource utilization and to reduce operational costs due to power consumption. However, virtualization technologies do not offer performance isolation, causing applications to slow down. In this work, we propose a performance-enforcing mechanism composed of a slowdown estimator and an interference- and power-aware scheduling algorithm. The slowdown estimator determines, based on noisy slowdown data samples obtained from state-of-the-art slowdown meters, whether tasks will complete within their deadlines, invoking the scheduling algorithm if needed. When invoked, the scheduling algorithm builds performance- and power-aware virtual clusters to successfully execute the tasks. We conduct simulations injecting synthetic jobs whose characteristics follow the latest version of the Google Cloud tracelogs. The results indicate that our strategy can be efficiently integrated with state-of-the-art slowdown meters to fulfil contracted SLAs in real-world environments, while reducing operational costs by about 12%.
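A minimal sketch of the kind of check a slowdown estimator can perform: filter noisy slowdown samples, predict the task completion time, and flag the task (so the scheduler is invoked) when the deadline would be missed. The simple mean filter and safety margin below are illustrative assumptions, not the estimator proposed in this work.

```python
from statistics import mean

def deadline_at_risk(slowdown_samples, remaining_isolated_time,
                     elapsed_time, deadline, margin=1.1):
    """Return True if the task is predicted to miss its deadline.

    slowdown_samples        : noisy slowdown measurements (1.0 = no interference)
    remaining_isolated_time : remaining execution time if the task ran alone [s]
    elapsed_time            : time already spent [s]
    deadline                : total allowed time [s]
    margin                  : illustrative safety factor on the prediction
    """
    estimated_slowdown = mean(slowdown_samples)            # simple noise filtering
    predicted_finish = elapsed_time + estimated_slowdown * remaining_isolated_time
    return predicted_finish * margin > deadline

# Example: samples suggest ~1.4x slowdown; 300 s of isolated work left, 500 s budget
print(deadline_at_risk([1.3, 1.5, 1.4], 300.0, 120.0, 500.0))
```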
Abstract:
Among the most important measures to prevent wildfires is the use of prescribed and controlled burning, which reduces the available fuel mass. However, the impact of these activities on soil physical and chemical properties varies with the type of soil and vegetation and is not fully understood. Therefore, soil monitoring campaigns are often used to measure these impacts. In this paper we have successfully used three statistical data treatments, the Kolmogorov-Smirnov test followed by the ANOVA and Kruskal-Wallis tests, to investigate the variability of the soil pH, soil moisture, soil organic matter and soil iron variables for different monitoring times and sampling procedures.
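The three statistical treatments map directly onto standard library routines: a Kolmogorov-Smirnov check of each group's distribution, followed by a one-way ANOVA (parametric) and a Kruskal-Wallis test (non-parametric) across monitoring times. The sketch below runs them on synthetic soil-pH groups purely for illustration; the data and group structure are assumptions, not the campaign measurements.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic soil-pH samples for three monitoring times (illustrative only)
groups = [rng.normal(loc=m, scale=0.3, size=20) for m in (5.6, 5.4, 5.9)]

# Kolmogorov-Smirnov test of each group against a fitted normal distribution
for i, g in enumerate(groups):
    stat, p = stats.kstest(g, "norm", args=(g.mean(), g.std(ddof=1)))
    print(f"group {i}: KS p-value = {p:.3f}")

# Parametric comparison of the group means
f_stat, p_anova = stats.f_oneway(*groups)
# Non-parametric alternative (no normality assumption)
h_stat, p_kw = stats.kruskal(*groups)

print(f"ANOVA p = {p_anova:.4f}, Kruskal-Wallis p = {p_kw:.4f}")
```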
Abstract:
The World Business Council for Sustainable Development (WBCSD) defines Eco-Efficiency as follows: 'Eco-Efficiency is achieved by the delivery of competitively priced goods and services that satisfy human needs and bring quality of life, while progressively reducing ecological impacts and resource intensity throughout the life-cycle to a level at least in line with the earth's estimated carrying capacity'. From this point of view, Eco-Efficiency is a key concept for sustainable development, bringing together economic and ecological progress. Measuring the Eco-Efficiency of a company, factory or business is a complex process that involves the measurement and control of several relevant parameters or indicators, either applied globally to all companies in general or specific to the nature and particularities of the business itself. In this study, an attempt was made to measure and evaluate the eco-efficiency of a pultruded composite processing company. For this purpose the recommendations of the WBCSD [1] and the directives of the ISO 14031 standard [2] were followed and applied. The analysis was restricted to the main business branch of the company: the production and sale of standard GFRP pultrusion profiles. The main general indicators of eco-efficiency, as well as the specific indicators, were defined and determined according to ISO 14031 recommendations. Based on the indicators' figures, the value profile, the environmental profile, and the pertinent eco-efficiency ratios were established and analyzed. In order to evaluate potential improvements in the company's eco-performance, new indicator values and eco-efficiency ratios were estimated taking into account the implementation of new procedures, both upstream and downstream of the production process, namely: a) adoption of a new heating system for the pultrusion die in the manufacturing process, more effective and with lower heat losses; b) implementation of new software for stock management (raw materials and final products) that minimizes production failures and delivery delays to the final consumer; c) a recycling approach, with partial reuse of the scrap material derived from the manufacturing, cutting and assembly processes of GFRP profiles. In particular, the last approach seems to significantly improve the eco-efficiency performance of the company. Currently, by-products and wastes generated in the manufacturing process of GFRP profiles are landfilled, with supplementary costs to the company from the transport of scrap, landfill taxes and the required analyses of waste materials. However, mechanical recycling of GFRP waste materials, with reduction to powdered and fibrous particulates, constitutes a recycling process that can be easily achieved with heavy-duty cutting mills. The subsequent reuse of the obtained recyclates, either in a closed-loop process, as a filler replacement in the resin matrix of GFRP profiles, or as reinforcement of other composite materials produced by the company, will lead both to cost reductions in raw materials and landfilling, and to the minimization of landfilled waste. These features lead to significant improvements in the subsequently assessed eco-efficiency ratios of the present case study, yielding a more sustainable product and manufacturing process for pultruded GFRP profiles.
Abstract:
The aim of this work was to assess the influence of meteorological conditions on the dispersion of particulate matter from an industrial zone into urban and suburban areas. The particulate matter concentration was related to the most important meteorological variables, such as wind direction, velocity and frequency. A coal-fired power plant, with two stacks 225 m high, was considered to be the main emission source. A middle point between the two stacks was taken as the centre of two concentric circles with 6 and 20 km radii delimiting the sampling area. About 40 sampling collectors were placed within this area. Meteorological data were obtained from a portable meteorological station placed approximately 1.7 km SE of the stacks. Additional data were obtained from the electrical company that runs the coal power plant; these data cover the years from 2006 to the present. A detailed statistical analysis was performed to identify the most frequent meteorological conditions, mainly concerning wind speed and direction. This analysis revealed that the most frequent winds blow from the northwest and north, and the strongest winds blow from the northwest. Particulate matter deposition was measured in two sampling campaigns, carried out in summer and in spring. For the first campaign the monthly average deposition flux was 1.90 g/m²; for the second campaign it was 0.79 g/m². Wind dispersion occurred predominantly from north to south, away from the nearest residential area, located about 6 km northwest of the stacks. Nevertheless, the highest deposition fluxes occurred in the NW/N and NE/E quadrants. This study considered only the contribution of particulate matter from coal combustion; however, other sources, such as road traffic, may be present as well. Additional chemical analyses and microanalysis are needed to link the sources to the deposition flux levels.
Abstract:
To meet the increasing demands of complex inter-organizational processes and the demand for continuous innovation and internationalization, it is evident that new forms of organisation are being adopted, fostering more intensive collaboration processes and sharing of resources, in what can be called collaborative networks (Camarinha-Matos, 2006:03). Information and knowledge are crucial resources in collaborative networks, and their management is a fundamental process to optimize. Knowledge organisation and collaboration systems are thus important instruments for the success of collaborative networks of organisations, and have been researched over the last decade in the areas of computer science, information science, management sciences, terminology and linguistics. Nevertheless, research in this area has not given much attention to multilingual contexts of collaboration, which pose specific and challenging problems. It is then clear that access to and representation of knowledge will happen more and more in multilingual settings, which implies overcoming the difficulties inherent to the presence of multiple languages, through the use of processes such as the localization of ontologies. Although localization, like other processes that involve multilingualism, is a rather well-developed practice and its methodologies and tools are fruitfully employed by the language industry in the development and adaptation of multilingual content, it has not yet been sufficiently explored as an element of support to the development of knowledge representations - in particular ontologies - expressed in more than one language. Multilingual knowledge representation is then an open research area calling for cross-contributions from knowledge engineering, terminology, ontology engineering, cognitive sciences, computational linguistics, natural language processing, and management sciences. This workshop joined researchers interested in multilingual knowledge representation, in a multidisciplinary environment, to debate the possibilities of cross-fertilization between knowledge engineering, terminology, ontology engineering, cognitive sciences, computational linguistics, natural language processing, and management sciences applied to contexts where multilingualism continuously creates new and demanding challenges to current knowledge representation methods and techniques. In this workshop six papers dealing with different approaches to multilingual knowledge representation are presented, most of them describing tools, approaches and results obtained in the development of ongoing projects. In the first paper, Andrés Domínguez Burgos, Koen Kerremans and Rita Temmerman present a software module that is part of a workbench for terminological and ontological mining, Termontospider, a wiki crawler that aims to optimally traverse Wikipedia in search of domain-specific texts for extracting terminological and ontological information. The crawler is part of a tool suite for automatically developing multilingual termontological databases, i.e. ontologically underpinned multilingual terminological databases. In this paper the authors describe the basic principles behind the crawler and summarize the research setting in which the tool is currently being tested. In the second paper, Fumiko Kano presents work comparing four feature-based similarity measures derived from the cognitive sciences.
The purpose of the comparative analysis presented by the author is to identify the potentially most effective model for mapping independent ontologies in a culturally influenced domain. For that, datasets based on standardized, pre-defined feature dimensions and values, obtainable from the UNESCO Institute for Statistics (UIS), have been used for the comparative analysis of the similarity measures. The purpose of the comparison is to verify the similarity measures on objectively developed datasets. According to the author, the results demonstrate that the Bayesian Model of Generalization provides the most effective cognitive model for identifying the most similar corresponding concepts for a targeted socio-cultural community. In another presentation, Thierry Declerck, Hans-Ulrich Krieger and Dagmar Gromann present ongoing work and propose an approach to the automatic extraction of information from multilingual financial Web resources, to provide candidate terms for building ontology elements or instances of ontology concepts. The authors present a complementary approach to the direct localization/translation of ontology labels: acquiring terminologies through the access and harvesting of multilingual Web presences of structured-information providers in the field of finance, leading to the detection of candidate terms in various multilingual sources in the financial domain that can be used not only as labels of ontology classes and properties but also for the possible generation of (multilingual) domain ontologies themselves. In the next paper, Manuel Silva, António Lucas Soares and Rute Costa claim that, despite the availability of tools, resources and techniques aimed at the construction of ontological artifacts, developing a shared conceptualization of a given reality still raises questions about the principles and methods that support the initial phases of conceptualization. These questions become, according to the authors, more complex when the conceptualization occurs in a multilingual setting. To tackle these issues the authors present a collaborative platform - conceptME - where terminological and knowledge representation processes support domain experts throughout a conceptualization framework, allowing the inclusion of multilingual data as a way to promote knowledge sharing, enhance conceptualization and support a multilingual ontology specification. In another presentation, Frieda Steurs and Hendrik J. Kockaert present TermWise, a large project dealing with legal terminology and phraseology for the Belgian public services, i.e. the translation office of the ministry of justice. The project aims to develop an advanced tool that includes expert knowledge in the algorithms that extract specialized language from textual data (legal documents), and its outcome is a knowledge database including Dutch/French equivalents for legal concepts, enriched with the phraseology related to the terms under discussion. Finally, Deborah Grbac, Luca Losito, Andrea Sada and Paolo Sirito report on the preliminary results of a pilot project currently ongoing at the UCSC Central Library, where they propose to adapt, for subject librarians employed in large and multilingual academic institutions, the model used by translators working within European Union institutions.
The authors are using User Experience (UX) analysis to provide subject librarians with visual support, by means of "ontology tables" depicting conceptual links and connections of words with concepts, presented according to their semantic and linguistic meaning. The organizers hope that the selection of papers presented here will be of interest to a broad audience and will be a starting point for further discussion and cooperation.
Abstract:
Electricity markets are complex environments, involving a large number of different entities that act in a dynamic setting to obtain the best possible advantages and profits. MASCEM (Multi-Agent System for Competitive Electricity Markets) is a multi-agent electricity market simulator that models market players and simulates their operation in the market. Market players are entities with specific characteristics and objectives, making their own decisions and interacting with other players. This paper presents a methodology to provide decision support to electricity market negotiating players. This model allows integrating different strategic approaches for electricity market negotiations and choosing the most appropriate one at each time, for each different negotiation context. This methodology is integrated in ALBidS (Adaptive Learning strategic Bidding System) – a multi-agent system that provides decision support to MASCEM's negotiating agents so that they can properly achieve their goals. ALBidS uses artificial intelligence methodologies and data analysis algorithms to provide effective adaptive learning capabilities to such negotiating entities. The main contribution is a methodology that combines several distinct strategies to build action proposals, so that the best one can be chosen at each time, depending on the context and simulation circumstances. The choosing process includes reinforcement learning algorithms, a mechanism for the analysis of negotiation contexts, a mechanism for managing the efficiency/effectiveness balance of the system, and a mechanism for defining competitor players' profiles.
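To give a concrete flavour of a reinforcement-learning choosing process over several bidding strategies, the sketch below keeps a per-context value estimate for each strategy, picks one with an epsilon-greedy rule and updates the estimate with the observed reward. This is a generic illustration with made-up strategy names, not the actual algorithms implemented in ALBidS.

```python
import random
from collections import defaultdict

class StrategySelector:
    """Epsilon-greedy selection among bidding strategies, per negotiation context."""

    def __init__(self, strategies, epsilon=0.1, learning_rate=0.2):
        self.strategies = strategies
        self.epsilon = epsilon
        self.lr = learning_rate
        # value[context][strategy] = running estimate of that strategy's payoff
        self.value = defaultdict(lambda: {s: 0.0 for s in strategies})

    def choose(self, context):
        if random.random() < self.epsilon:          # explore a random strategy
            return random.choice(self.strategies)
        q = self.value[context]                     # exploit the best known strategy
        return max(q, key=q.get)

    def update(self, context, strategy, reward):
        q = self.value[context]
        q[strategy] += self.lr * (reward - q[strategy])

# Hypothetical usage with illustrative strategy names and context label
selector = StrategySelector(["naive", "regression", "game_theoretic"])
chosen = selector.choose(context="peak_hours")
selector.update("peak_hours", chosen, reward=12.5)  # e.g. profit of the accepted bid
```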
Abstract:
The development of robust multilingual resources to meet the growing demands placed by increasingly complex intra- and inter-organizational processes is itself a complex process, requiring higher-quality modes of interaction and of sharing organizational resources, for example through a greater involvement of the different stakeholders in effective and innovative forms of collaboration. It is a process in which several problems and difficulties can be identified, such as, in the case of building multilingual lexical databases, the development of an architecture able to address a wide range of linguistic issues, such as polysemy, lexical patterns or translation equivalents. These issues arise in the construction both of terminological resources and of multilingual ontologies. In the case of building an ontology in different languages, the process on which we focus our attention, the issues and the complexity increase, given the type and purposes of the semantic artefact, the elements to be localized (concepts and conceptual relations) and the context in which the localization process takes place. With this article we thus intend to analyse the concept and the process of localization in the context of ontology-based knowledge management systems, paying attention to the central role of terminology in the localization process, the different approaches and models that have been proposed, and the language-based tools that support the implementation of the process. Finally, we seek to establish some parallels between the traditional localization process and the localization of ontologies, in order to better situate and define the latter.
Abstract:
When a pesticide is released into the environment, most of it is lost before it reaches its target. An effective way to reduce environmental losses of pesticides is to use controlled-release technology. Microencapsulation has become a promising technique for the production of controlled-release agricultural formulations. In this work, the microencapsulation of the chlorophenoxy herbicide MCPA with native β-cyclodextrin and its methyl and hydroxypropyl derivatives was investigated. The phase-solubility study showed that both the native and the derivatized β-CDs increased the water solubility of the herbicide and that inclusion complexes are formed in a stoichiometric ratio of 1:1. The stability constants describing the extent of formation of the complexes were determined from the phase-solubility studies. ¹H NMR experiments were also carried out on the prepared solid systems, and the data gathered confirm the formation of the inclusion complexes. The ¹H NMR data obtained for the MCPA/CD complexes disclosed noticeable proton shift displacements for the OCH₂ group and the H6 aromatic proton of MCPA, providing clear evidence of the inclusion complexation process and suggesting that the phenyl moiety of the herbicide is included in the hydrophobic cavity of the CDs. Free-energy molecular mechanics calculations confirm all these findings. The results gathered can be regarded as an essential step towards the development of controlled-release agricultural formulations containing the herbicide MCPA.
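For a 1:1 inclusion complex, the stability constant is commonly obtained from an A_L-type phase-solubility diagram through the Higuchi-Connors relation K(1:1) = slope / (S0 (1 - slope)), where S0 is the intrinsic solubility of the guest and the slope comes from a linear fit of guest solubility against cyclodextrin concentration. The sketch below applies that standard treatment to made-up data points, not to the measurements reported in this work.

```python
import numpy as np

# Illustrative phase-solubility data (not the paper's measurements):
# cyclodextrin concentration [mM] vs. total dissolved MCPA [mM]
cd_conc = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])
mcpa_solubility = np.array([1.10, 1.55, 2.02, 2.48, 2.93, 3.41])

slope, intercept = np.polyfit(cd_conc, mcpa_solubility, 1)
s0 = intercept                          # intrinsic solubility of the guest [mM]

# Higuchi-Connors stability constant for a 1:1 complex [1/mM]
k_11 = slope / (s0 * (1.0 - slope))
print(f"slope = {slope:.3f}, S0 = {s0:.2f} mM, K(1:1) = {k_11 * 1000:.0f} M^-1")
```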
Abstract:
Master's dissertation presented to the Instituto de Contabilidade e Administração do Porto for the degree of Master in Auditing, supervised by Mestre Adalmiro Álvaro Malheiro de Castro Andrade Pereira.