51 results for Standard models
at Instituto Politécnico do Porto, Portugal
Abstract:
This paper analyzes the dynamical properties of systems with backlash and impact phenomena based on the describing function method. It is shown that this type of nonlinearity can be analyzed from the perspective of fractional calculus theory. The fractional dynamics is compared with that of standard models.
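For context, the sinusoidal-input describing function used in this line of analysis has the standard textbook form below; this is a general definition, not a formula reproduced from the paper. For a nonlinearity driven by x(t) = A sin(ωt) with periodic output y(θ), the describing function is the complex gain of the first harmonic:

```latex
% Sinusoidal-input describing function (standard definition)
N(A) = \frac{b_1 + j\,a_1}{A}, \qquad
b_1 = \frac{1}{\pi}\int_{0}^{2\pi} y(\theta)\,\sin\theta\,\mathrm{d}\theta, \qquad
a_1 = \frac{1}{\pi}\int_{0}^{2\pi} y(\theta)\,\cos\theta\,\mathrm{d}\theta .
```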
Abstract:
This paper analyses the dynamical properties of systems with backlash and impact phenomena based on the describing function method. The dynamics is illustrated using the Nyquist and Bode plots and the results are compared with those of standard models.
Abstract:
Dynamic parallel scheduling using work-stealing has gained popularity in academia and industry for its good performance, ease of implementation and theoretical bounds on space and time. Cores treat their own double-ended queues (deques) as a stack, pushing and popping threads at the bottom, but treat the deque of another randomly selected busy core as a queue, stealing threads only from the top whenever they are idle. However, this standard approach cannot be directly applied to real-time systems, where the importance of parallelising tasks is increasing due to the limitations of multiprocessor scheduling theory regarding parallelism. Using one deque per core is an obvious source of priority inversion, since high-priority tasks may end up enqueued after lower-priority tasks, and those lower-priority tasks become the candidates when a stealing operation occurs, possibly leading to deadline misses. Our proposal is to replace the single non-priority deque of work-stealing with ordered per-processor priority deques of ready threads. The scheduling algorithm starts with a single deque per core, but unlike traditional work-stealing, the total number of deques in the system may now exceed the number of processors. Instead of stealing randomly, cores steal from the highest-priority deque.
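The abstract describes the core data structure and the stealing rule; the minimal Python sketch below illustrates the idea under stated assumptions (hypothetical names, a heap standing in for each ordered priority deque, and lower numbers meaning higher priority). It is not the authors' implementation.

```python
# Illustrative sketch only (not the authors' implementation): per-core priority
# deques, approximated here by heaps; idle cores steal from the core whose top
# element currently has the highest priority instead of from a random victim.
import heapq

class Core:
    def __init__(self, core_id):
        self.core_id = core_id
        self.deque = []  # min-heap of (priority, thread); lower value = higher priority

    def push(self, priority, thread):
        heapq.heappush(self.deque, (priority, thread))

    def pop_local(self):
        # The owning core takes its own highest-priority ready thread.
        return heapq.heappop(self.deque)[1] if self.deque else None

    def top_priority(self):
        return self.deque[0][0] if self.deque else None

def steal(cores, thief):
    # Steal from the deque whose top element has the highest priority,
    # rather than from a randomly selected busy core.
    victims = [c for c in cores if c is not thief and c.deque]
    if not victims:
        return None
    victim = min(victims, key=lambda c: c.top_priority())
    return heapq.heappop(victim.deque)[1]
```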
Abstract:
The International Electrotechnical Commission (IEC) 61499 architecture provides several function blocks with which distributed control applications may be developed, and specifies how these are interpreted and executed. However, due to the distributed nature of the control applications, many issues also need to be taken into account. Most of these arise from the new error model and failure modes of the distributed hardware on which the distributed application is executed, and also from the incomplete definition of the execution models in the standard. IEC 61499 frameworks do not clarify how to handle replication of software and hardware components. In this paper we propose a replication model for IEC 61499 applications and identify the mechanisms and protocols that may be used to support it.
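The abstract does not detail the proposed mechanisms; purely as a hypothetical illustration of one building block such a replication model could rely on, the sketch below shows majority voting over the outputs of replicated function block instances. The names and structure are assumptions, not taken from the paper.

```python
# Hypothetical sketch: majority voting over the outputs of replicated
# function block instances, one possible replication mechanism.
from collections import Counter

def vote(replica_outputs):
    """Return the majority value among replica outputs, or None if no majority exists."""
    if not replica_outputs:
        return None
    value, count = Counter(replica_outputs).most_common(1)[0]
    return value if count > len(replica_outputs) // 2 else None

# Example: three replicas of a function block, one of which produced a wrong value.
print(vote([42, 42, 17]))  # -> 42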
Abstract:
The structural integrity of multi-component structures is usually determined by the strength and durability of their unions. Adhesive bonding is often chosen over welding, riveting and bolting due to the reduction of stress concentrations, reduced weight penalty and easy manufacturing, among other advantages. In the past decades, the Finite Element Method (FEM) has been used for the simulation and strength prediction of bonded structures, by strength-of-materials or fracture-mechanics-based criteria. Cohesive-zone models (CZMs) have already proved to be an effective tool in modelling damage growth, surpassing a few limitations of the aforementioned techniques. Despite this, they still suffer from the restriction that damage can only grow along predefined paths. The eXtended Finite Element Method (XFEM) is a recent improvement of the FEM, developed to allow the growth of discontinuities within bulk solids along an arbitrary path by enriching degrees of freedom with special displacement functions, thus overcoming the main restriction of CZMs. These two techniques were tested to simulate adhesively bonded single- and double-lap joints. The comparative evaluation of the two methods showed their capabilities and limitations for this specific purpose.
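For reference, a commonly used cohesive law in CZM analyses of bonded joints is the bilinear traction-separation law shown below; this is a generic textbook form and not necessarily the shape adopted in the paper.

```latex
% Generic bilinear cohesive law (illustrative; the paper may use a different shape)
t(\delta) =
\begin{cases}
K\,\delta, & 0 \le \delta \le \delta_0,\\[2pt]
t_0\,\dfrac{\delta_f - \delta}{\delta_f - \delta_0}, & \delta_0 < \delta \le \delta_f,\\[2pt]
0, & \delta > \delta_f,
\end{cases}
\qquad
G_c = \tfrac{1}{2}\,t_0\,\delta_f .
```

Here K is the initial stiffness, t0 the cohesive strength, δ0 = t0/K the damage-onset separation, δf the failure separation and Gc the fracture toughness (the area under the traction-separation curve).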
Abstract:
The idiomatic expression “In Rome, be a Roman” can be applied to leadership training and development as well. Leaders who can act as role models inspire other future leaders in their behaviour, attitudes and ways of thinking. Based on two examples of current leaders in the fields of Politics and Public Administration, I support the idea that exposure to role models during their training was decisive for their career paths and current activities as prominent figures in their profession. Issues such as how students should be prepared for community or national leadership, as well as cross-cultural engagement, are raised here. The hypothesis of transculturalism and cross-cultural commitment as a factor of leadership is presented. Based on current literature on Leadership as well as the presented case studies, I expect to raise a debate focusing on strategies for improving leaders’ training in cross-cultural awareness.
Abstract:
Supervisor: Mestre Carlos Pedro
Abstract:
Long-term contractual decisions are the basis of efficient risk management. However, those types of decisions have to be supported by a robust price forecasting methodology. This paper reports a different approach to long-term price forecasting that tries to answer that need. Making use of regression models, the proposed methodology has as its main objective to find the maximum and minimum Market Clearing Price (MCP) for a specific programming period, with a desired confidence level α. Due to the problem complexity, the meta-heuristic Particle Swarm Optimization (PSO) was used to find the best regression parameters, and the results are compared with those obtained using a Genetic Algorithm (GA). To validate these models, results from realistic data are presented and discussed in detail.
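As a rough illustration of the approach, and not the authors' formulation (the regression form, parameter bounds and objective below are hypothetical), a standard PSO can be used to search for the regression parameters that minimize forecast error:

```python
# Illustrative sketch: plain PSO fitting regression parameters by minimizing
# squared error. The actual regression model and objective used in the paper
# are not reproduced here.
import random

def pso(objective, dim, n_particles=30, iters=200, lo=-10.0, hi=10.0,
        w=0.7, c1=1.5, c2=1.5):
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest

# Hypothetical example: fit a linear MCP model price = a + b * demand.
data = [(1.0, 30.0), (2.0, 52.0), (3.0, 69.0)]          # (demand, observed MCP)
sse = lambda p: sum((p[0] + p[1] * x - y) ** 2 for x, y in data)
print(pso(sse, dim=2))
```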
Abstract:
Currently, power systems (PS) already accommodate a substantial penetration of distributed generation (DG) and operate in competitive environments. In the future, as a result of liberalisation and political regulations, PS will have to deal with large-scale integration of DG and other distributed energy resources (DER), such as storage, and provide market agents with the means to ensure a flexible and secure operation. This cannot be done with the traditional PS operational tools used today, such as the rather restricted Supervisory Control and Data Acquisition (SCADA) information systems [1]. The trend towards using local generation in the active operation of the power system requires new solutions for data management systems. The relevant standards have been developed separately in the last few years, so there is a need to unify them in order to obtain a common and interoperable solution. For distribution operation, the CIM models described in IEC 61968/70 are especially relevant. In Europe, dispersed and renewable energy resources (D&RER) are mostly operated without remote control mechanisms and feed the maximum amount of available power into the grid. To improve network operation performance, the idea of virtual power plants (VPP) will become a reality. In the future, the power generation of D&RER will be scheduled with high accuracy. In order to realize VPP decentralized energy management, communication facilities are needed that have standardized interfaces and protocols. IEC 61850 is suitable to serve as a general standard for all communication tasks in power systems [2]. The paper deals with international activities and experiences in the implementation of a new data management and communication concept in the distribution system. The difficulties in coordinating the communication and data management standards, which were developed in parallel and inconsistently, are addressed first. The upcoming unification work, taking into account the growing role of D&RER in the PS, is then presented. It is possible to overcome the lag in current practical experience using new tools for creating and maintaining CIM data and for simulating the IEC 61850 protocol; a prototype of these tools is presented in the paper. The origin and accuracy of the data requirements depend on the data use (e.g. operation or planning), so some remarks concerning the definition of the digital interface incorporated in the merging unit idea, from the power utility point of view, are also presented in the paper. To summarize, some required future work has been identified.
Abstract:
Most triple-stores are open source and developed in Java, providing both standard and proprietary access interfaces. The large majority of these systems lack native access control mechanisms, which hinders or prevents their adoption in environments where the security of the stored facts is important (e.g. corporate environments). In addition, the access control model for triples, and in particular for triples described by ontologies, is neither standardized nor even stabilized, and there are several description models and algorithms for evaluating access permissions. The work developed in this thesis/dissertation proposes an access control model and interface that enables and facilitates its adoption by different existing triple-stores, as well as the integration of the triple-stores with other systems already existing in the organization. Furthermore, the access control platform does not impose any particular permission evaluation model or algorithm; on the contrary, it allows distinct models and algorithms to be adopted according to the needs or preferences. Finally, the applicability and validity of the proposed model and interface are demonstrated through their implementation and adoption in the existing SwiftOWLIM triple-store, which lacks a native access control mechanism.
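To make the idea of a pluggable access-control layer concrete, here is a minimal Python sketch under stated assumptions (all names are hypothetical and the store is a plain list); it is an illustration of the general pattern, not the dissertation's actual interface.

```python
# Illustrative sketch (hypothetical names): a thin access-control layer that a
# triple-store adapter consults before answering a query, while the actual
# permission-evaluation policy remains pluggable.
from abc import ABC, abstractmethod

Triple = tuple  # (subject, predicate, object)

class PermissionEvaluator(ABC):
    """Pluggable policy: any model/algorithm can be used to decide access."""
    @abstractmethod
    def can_read(self, user: str, triple: Triple) -> bool: ...

class AllowListEvaluator(PermissionEvaluator):
    def __init__(self, allowed_subjects_by_user):
        self.allowed = allowed_subjects_by_user
    def can_read(self, user, triple):
        return triple[0] in self.allowed.get(user, set())

class AccessControlledStore:
    """Wraps an existing triple-store (here just a list) with access control."""
    def __init__(self, triples, evaluator: PermissionEvaluator):
        self.triples, self.evaluator = triples, evaluator
    def query(self, user):
        return [t for t in self.triples if self.evaluator.can_read(user, t)]

store = AccessControlledStore(
    [(":alice", ":worksAt", ":acme"), (":salary42", ":amount", "42000")],
    AllowListEvaluator({"hr": {":alice", ":salary42"}, "guest": {":alice"}}),
)
print(store.query("guest"))  # guest only sees the fact about :alice
```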
Abstract:
Healthcare in Portugal is currently undergoing significant changes. The creation of corporate management models in public institutions aims at better quality at lower cost. The acquisition of increasingly sophisticated medical equipment demands redoubled efforts from institutions. The need to reduce costs, coupled with the need to acquire increasingly advanced technologies, requires institutions to take more rigorous measures to improve the acquisition process. It is important to establish, from the beginning of an acquisition process, the exact needs of the institution, with a well-detailed set of specifications for the product to be acquired, as well as a set of requirements to be placed on suppliers that safeguard the institution. Knowledge of the equipment to be acquired facilitates the entire process. It is therefore extremely important to guarantee a sufficiently broad study of the equipment, allowing the institution to evaluate it better at the time of selection. Guaranteeing metrological reliability is another very important point to take into account in the process, since the success of healthcare depends on the trust and safety conveyed to its users. The objective of this work is the study of pulmonary ventilators (VP), focusing essentially on the selection and acquisition of this equipment. This work also studies the procedures for evaluating the metrological reliability of VP, with a view to defining the verification tests to be carried out throughout the acquisition process. The tender specifications (Caderno de Encargos, CE) and the respective technical specifications/requirements are standardized, seeking to purchase according to the real needs of the institution, minimizing waste and guaranteeing the best quality.
Abstract:
Semantic Web technologies such as RDF, OWL and SPARQL have experienced strong growth and acceptance in recent years. Projects such as DBPedia and Open Street Map are beginning to show the true potential of Linked Open Data. However, semantic search engines still lag behind this crescendo of semantic technologies. The available solutions rely mostly on natural language processing resources. Powerful Semantic Web tools such as ontologies, inference engines and semantic query languages are not yet common. In addition, there are certain difficulties in implementing a semantic search engine. As demonstrated in this dissertation, a federated architecture is necessary in order to exploit the full potential of Linked Open Data. However, a federated system in that environment presents performance problems that must be solved through cooperation between data sources. The current standard query language for the Semantic Web, SPARQL, does not offer a mechanism for cooperation between data sources. This dissertation proposes a federated architecture that includes mechanisms allowing cooperation between data sources. It addresses the performance problem by proposing a centrally managed index as well as mappings between the data models of each data source. The proposed architecture is modular, allowing a simple and decentralized growth of repositories and functionalities, in the spirit of Linked Open Data and of the World Wide Web itself. This architecture works with natural language term searches as well as with formal SPARQL queries. However, the repositories considered contain only data in RDF format. This dissertation relies on multiple shared and interlinked ontologies.
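To make the role of the centrally managed index concrete, here is a minimal Python sketch under stated assumptions (hypothetical endpoints and a stubbed query function standing in for real SPARQL calls); it only illustrates dispatching a federated query to the sources the index marks as relevant, not the dissertation's architecture.

```python
# Illustrative sketch (hypothetical structure): a centrally managed index maps
# search terms to the endpoints that hold relevant data, so a federated query
# is only dispatched to the data sources that can contribute results.
class CentralIndex:
    def __init__(self):
        self.term_to_endpoints = {}

    def register(self, term, endpoint):
        self.term_to_endpoints.setdefault(term.lower(), set()).add(endpoint)

    def endpoints_for(self, terms):
        hits = set()
        for t in terms:
            hits |= self.term_to_endpoints.get(t.lower(), set())
        return hits

def federated_search(index, query_fn, terms):
    """Send the query only to endpoints the index marks as relevant and merge results."""
    results = []
    for endpoint in index.endpoints_for(terms):
        results.extend(query_fn(endpoint, terms))  # query_fn would wrap a SPARQL call
    return results

index = CentralIndex()
index.register("porto", "http://example.org/sparql/geo")
index.register("porto", "http://example.org/sparql/dbpedia-mirror")
fake_query = lambda ep, terms: [(ep, t) for t in terms]  # stub instead of SPARQL
print(federated_search(index, fake_query, ["Porto"]))
```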
Abstract:
The first and second authors would like to thank the support of the PhD grants with references SFRH/BD/28817/2006 and SFRH/PROTEC/49517/2009, respectively, from Fundação para a Ciência e Tecnologia (FCT). This work was partially done in the scope of the project “Methodologies to Analyze Organs from Complex Medical Images – Applications to Female Pelvic Cavity”, with reference PTDC/EEA-CRO/103320/2008, financially supported by FCT.
Abstract:
Studies were undertaken to determine the adsorption behavior of α-cypermethrin [(R)-α-cyano-3-phenoxybenzyl (1S)-cis-3-(2,2-dichlorovinyl)-2,2-dimethylcyclopropanecarboxylate, and (S)-α-cyano-3-phenoxybenzyl (1R)-cis-3-(2,2-dichlorovinyl)-2,2-dimethylcyclopropanecarboxylate] in solution on granules of cork and activated carbon (GAC). The adsorption studies were carried out using a batch equilibrium technique. A gas chromatograph with an electron capture detector (GC-ECD) was used to analyze α-cypermethrin after solid-phase extraction with C18 disks. Physical properties of cork, including real density, pore volume, surface area and pore diameter, were evaluated by mercury porosimetry. Characterization of the cork particles showed variations, indicating the highly heterogeneous structure of the material. The average surface area of the cork particles was lower than that of GAC. Kinetic adsorption studies allowed the determination of the equilibrium time: 24 hours for both cork (1–2 mm and 3–4 mm) and GAC. For the studied α-cypermethrin concentration range, GAC proved to be the better sorbent. However, the adsorption parameters for equilibrium concentrations, obtained through the Langmuir and Freundlich models, showed that granulated cork of 1–2 mm has the maximum amount of adsorbed α-cypermethrin (qm) (303 μg/g), followed by GAC (186 μg/g) and cork of 3–4 mm (136 μg/g). The standard deviation (SD) values demonstrate that the Freundlich model better describes the α-cypermethrin adsorption phenomena on GAC, while α-cypermethrin adsorption on cork (1–2 mm and 3–4 mm) is better described by the Langmuir model. In view of the adsorption results obtained in this study, granulated cork appears to be a better and cheaper alternative to GAC for removing α-cypermethrin from water.
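For reference, the two isotherm models named above have the standard forms below (general equations; only the symbols, not fitted values, are shown):

```latex
% Langmuir and Freundlich isotherms (standard forms)
q_e = \frac{q_m K_L C_e}{1 + K_L C_e} \quad\text{(Langmuir)},
\qquad
q_e = K_F\, C_e^{1/n} \quad\text{(Freundlich)},
```

where qe is the amount adsorbed at equilibrium, Ce the equilibrium concentration, qm the maximum adsorption capacity, KL the Langmuir constant, and KF and n the Freundlich constants.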