14 results for Generation from examples
at Instituto Politécnico do Porto, Portugal
Abstract:
Vishnu is a tool for XSLT visual programming in Eclipse, a popular and extensible integrated development environment. Rather than writing the XSLT transformation, the programmer loads or edits two document instances, a source document and its corresponding target document, and pairs texts between them by drawing lines over the documents. This form of XSLT programming is intended for simple transformations between related document types, such as HTML formatting or conversion among similar formats. Complex XSLT programs involving, for instance, recursive templates or second-order transformations are outside the scope of Vishnu. We present the architecture of Vishnu, composed of a graphical editor and a programming engine. The editor is an Eclipse plug-in where the programmer loads and edits document examples and pairs their content using graphical primitives. The programming engine receives the data collected by the editor and produces an XSLT program. The design of the engine and the process of creating an XSLT program from examples are also detailed. The process starts with the generation of an initial transformation that maps the source document to the target document. This transformation is fed to a rewrite process in which each step produces a refined version of the transformation. Finally, the transformation is simplified before being presented to the programmer for further editing.
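The generate-then-simplify pipeline described above can be illustrated with a deliberately small sketch. This is a hypothetical illustration of the idea, not the actual Vishnu engine: the pairing format (source XPath, target element name) and the dedupe-as-simplification step are assumptions for the example.

```python
# Hypothetical sketch of example-driven XSLT generation (NOT the Vishnu
# engine): from text pairings drawn by the programmer over the two example
# documents, emit a naive stylesheet, then simplify it by removing
# duplicate templates -- a stand-in for the rewrite/simplification steps.

def naive_stylesheet(pairs):
    """pairs: list of (source_xpath, target_element) correspondences
    (assumed format) drawn between the source and target examples."""
    templates = []
    for source_xpath, target_element in pairs:
        templates.append(
            '  <xsl:template match="%s">\n'
            '    <%s><xsl:value-of select="."/></%s>\n'
            '  </xsl:template>'
            % (source_xpath, target_element, target_element)
        )
    # Simplification step: drop duplicate templates, keeping first occurrence.
    seen, unique = set(), []
    for t in templates:
        if t not in seen:
            seen.add(t)
            unique.append(t)
    return (
        '<xsl:stylesheet version="1.0"\n'
        '    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">\n'
        + "\n".join(unique)
        + '\n</xsl:stylesheet>'
    )

# Two pairings on "para" collapse into a single template.
print(naive_stylesheet([("title", "h1"), ("para", "p"), ("para", "p")]))
```

A real engine must of course infer context-dependent match patterns and handle structure, not just leaf text; the sketch only shows the overall shape of generating a transformation from paired examples.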
Abstract:
XSLT is a powerful and widely used language for transforming XML documents. However, its power and complexity can be overwhelming for novice or infrequent users, many of whom simply give up on using this language. On the other hand, many XSLT programs of practical use are simple enough to be automatically inferred from examples of source and target documents. An inferred XSLT program is seldom adequate for production use but can serve as a skeleton of the final program, or at least as scaffolding in the process of coding it. It should be noted that the authors do not claim that XSLT programs, in general, can be inferred from examples. The aim of Vishnu, the XSLT generator engine described in this paper, is to produce XSLT programs for processing documents similar to the given examples, with enough readability to be easily understood by a programmer not familiar with the language. The architecture of Vishnu is composed of a graphical editor and a programming engine. In this paper we focus on the editor, a GWT web application where the programmer loads and edits document examples and pairs their content using graphical primitives. The programming engine receives the data collected by the editor and produces an XSLT program.
Abstract:
XSLT is a powerful and widely used language for transforming XML documents. However, its power and complexity can be overwhelming for novice or infrequent users, many of whom simply give up on using this language. On the other hand, many XSLT programs of practical use are simple enough to be automatically inferred from examples of source and target documents. An inferred XSLT program is seldom adequate for production use but can serve as a skeleton of the final program, or at least as scaffolding in the process of coding it. It should be noted that the authors do not claim that XSLT programs, in general, can be inferred from examples. The aim of Vishnu, the XSLT generator engine described in this chapter, is to produce XSLT programs for processing documents similar to the given examples, with enough readability to be easily understood by a programmer not familiar with the language. The architecture of Vishnu is composed of a graphical editor and a programming engine. In this chapter, the authors focus on the editor, a GWT Web application where the programmer loads and edits document examples and pairs their content using graphical primitives. The programming engine receives the data collected by the editor and produces an XSLT program.
Abstract:
These are the proceedings of the eighth national conference on XML, its Associated Technologies and its Applications (XATA'2010). The paper selection resulted in 33% of papers accepted as full papers and 33% accepted as short papers. While these two types of papers were distinguished during the conference, with different talk durations, they all had the same limit of 12 pages. We are happy that the selected papers cover both aspects of the conference: XML technologies and XML applications. In the first group we can include the articles on parsing and transformation technologies, such as “Processing XML: a rewriting system approach", “Visual Programming of XSLT from examples", “A Refactoring Model for XML Documents", “A Performance based Approach for Processing Large XML Files in Multicore Machines", “XML to paper publishing with manual intervention" and “Parsing XML Documents in Java using Annotations". XML-core papers are also present, focusing on XML tools testing: “Test::XML::Generator: Generating XML for Unit Testing" and “XML Archive for Testing: a benchmark for GuessXQ". XML as the basis for application development is also present, discussed in different areas: “Web Service for Interactive Products and Orders Configuration", “XML Description for Automata Manipulations", “Integration of repositories in Moodle", “XML, Annotations and Database: a Comparative Study of Metadata Definition Strategies for Frameworks", “CardioML: Integrating Personal Cardiac Information for Ubiquous Diagnosis and Analysis", “A Semantic Representation of Users Emotions when Watching Videos" and “Integrating SVG and SMIL in DAISY DTB production to enhance the contents accessibility in the Open Library for Higher Education". The wide spread of subjects leads us to believe that, for the time being, XML is here to stay, which enhances the importance of gathering this community to discuss the related science and technology.
Small conferences are going through a difficult period. Authors look for impact and numbers, and only submit their work to big conferences sponsored by the right institutions. However, the group of people behind this conference still believes that spaces like this should be preserved and maintained. This 8th gathering marks the beginning of a new cycle. We know who we are and what our identity is, and we will keep working to preserve it. We hope that the publication containing this year's works will attract the same attention and interest as the previous editions and, above all, that it helps in others' work. Finally, we would like to thank all the authors for their work and interest in the conference, and the scientific committee members for their reviewing work.
Abstract:
During the past 15 years, the emergence and dissemination of third-generation cephalosporin resistance in nosocomial Enterobacteriaceae has become a serious problem worldwide, due to the production of extended-spectrum β-lactamases (ESBLs). The aim of this study was to investigate the presence of ESBL-producing enterobacteria among Portuguese clinical isolates from near the Spanish border, to characterize their antimicrobial susceptibility patterns, and to compare the two countries. The β-lactamase genes blaTEM, blaSHV and blaCTX-M were detected by molecular methods. Among the ESBL-producing isolates, extraordinarily high levels (98.9%) of resistance to the fourth-generation cephalosporin cefepime were found. These findings point to the need to reevaluate the definition of ESBL.
Abstract:
This paper presents ELECON - Electricity Consumption Analysis to Promote Energy Efficiency Considering Demand Response and Non-technical Losses, an international research project involving European and Brazilian partners. ELECON focuses on increasing energy efficiency through consumers' active participation, which is a key area for cooperation between Europe and Brazil. The project aims to contribute significantly towards the successful implementation of smart grids, focusing on new methods that allow the efficient use of distributed energy resources, namely distributed generation, storage, and demand response. ELECON brings together researchers from seven European and Brazilian partners with consolidated research backgrounds and complementary competences, involving institutions from three European countries (Portugal, Germany, and France) and four Brazilian institutions. The complementary background and experience of the European and Brazilian partners is of major relevance to ensure the capacities required to achieve the proposed goals. In fact, the European Union (EU) and Brazil have very different resources and approaches in this area. Having huge hydro and fossil resources, Brazil has not put emphasis on distributed renewable-based electricity generation. On the contrary, the EU has made huge investments in this area, taking into account environmental concerns as well as the economic external dependence dictated by its huge imports of energy-related products. Sharing these different backgrounds allows the project team to propose new methodologies able to efficiently address the new challenges of smart grids.
Abstract:
Nowadays, there is a growing environmental concern about where the energy we use comes from, bringing attention to renewable energies. However, the use and trade of renewable energies in the market seem to be complicated because of the lack of generation guarantees, mainly in wind farms. This lack of guarantees is usually addressed by using reserve generation. The aggregation of DG plants gives rise to a new concept: the Virtual Power Producer (VPP). VPPs can reinforce the importance of wind generation technologies, making them valuable in electricity markets. This paper presents some results obtained with a simulation tool (ViProd) developed to support VPPs in the analysis of their operation and management methods and of the effects of their strategies.
Abstract:
In recent years there has been a considerable increase in the number of people in need of intensive care, especially among the elderly, a phenomenon related to population ageing (Brown 2003). However, this is not exclusive to the elderly, as diseases such as obesity, diabetes, and high blood pressure have been increasing among young adults (Ford and Capewell 2007). As a new fact, it has to be dealt with by the healthcare sector, and particularly by the public one. Thus, finding new and cost-effective ways of delivering healthcare is of particular importance, especially when patients are not to be detached from their environments (WHO 2004). Following this line of thinking, the VirtualECare multi-agent system is presented in section 2, with our efforts centered on its Group Decision modules (Costa, Neves et al. 2007) (Camarinha-Matos and Afsarmanesh 2001). On the other hand, there has been a growing interest in combining the technological advances in the information society - computing, telecommunications and knowledge - to create new methodologies for problem solving, namely those that converge on Group Decision Support Systems (GDSS) based on agent perception. Indeed, the new economy, along with increased competition in today's complex business environments, leads companies to seek complementarities in order to increase competitiveness and reduce risks. Under these scenarios, planning takes a major role in a company's life cycle. However, effective planning depends on the generation and analysis of ideas (innovative or not) and, as a result, the idea generation and management processes are crucial. Our objective is to apply the GDSS referred to above to a new area. We believe that the use of GDSS in the healthcare arena will allow professionals to achieve better results in the analysis of a patient's Electronic Clinical Profile (ECP).
This attainment is vital, given the arrival on the market of new drugs and medical practices that compete for the use of limited resources.
Abstract:
23rd SPACE AGM and Conference, 9 to 12 May 2012. Conference theme: The Role of Professional Higher Education: Responsibility and Reflection. Venue: Mikkeli University of Applied Sciences, Mikkeli, Finland.
Abstract:
Demand response is assumed to be an essential resource to fully achieve the operating benefits of smart grids, namely in the context of competitive markets and of the increasing use of renewable-based energy sources. Some advantages of Demand Response (DR) programs and of smart grids can only be achieved through the implementation of Real Time Pricing (RTP). The integration of the expected increasing amounts of distributed energy resources, as well as new players, requires new approaches to the changing operation of power systems. The methodology proposed in this paper aims at minimizing the operation costs of a distribution network operated by a virtual power player that manages the available energy resources, focusing on hour-ahead re-scheduling. When facing lower wind power generation than expected from the day-ahead forecast, demand response is used to minimize the impact of the change in wind availability. In this way, consumers actively participate in regulation up and spinning reserve ancillary services through demand response programs. Real-time pricing is also applied. The proposed model is especially useful when the actual and day-ahead wind forecasts differ significantly. Its application is illustrated by implementing the characteristics of a real resource conditions scenario in a 33-bus distribution network with 32 consumers and 66 distributed generators.
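The intuition behind re-scheduling at minimum cost when wind output falls short can be sketched with a toy merit-order dispatch. This is an assumed illustration only, not the paper's optimization model: the offer names, capacities, and prices are hypothetical, and a real virtual power player would solve a full network-constrained problem rather than a greedy one.

```python
# Toy sketch (assumed, not the paper's model): cover an hour-ahead wind
# shortfall with the cheapest mix of demand-response offers and reserve
# generation, i.e. dispatch resources in merit order of price.

def cover_shortfall(shortfall_mw, offers):
    """offers: list of (name, capacity_mw, price_per_mwh) tuples
    (hypothetical data). Returns (dispatch, total_cost)."""
    dispatch, cost, remaining = [], 0.0, shortfall_mw
    for name, cap, price in sorted(offers, key=lambda o: o[2]):
        if remaining <= 0:
            break
        used = min(cap, remaining)   # take as much as this offer allows
        dispatch.append((name, used))
        cost += used * price
        remaining -= used
    return dispatch, cost

# Hypothetical resources: cheap DR blocks are used before expensive reserve.
offers = [("dr_block_A", 5.0, 30.0),    # demand response, cheapest
          ("reserve_gen", 20.0, 80.0),  # spinning reserve, most expensive
          ("dr_block_B", 3.0, 45.0)]
plan, cost = cover_shortfall(10.0, offers)
```

Here a 10 MW shortfall exhausts both DR blocks (8 MW) before touching the reserve generator, which mirrors the abstract's point that consumer participation can displace costlier reserve.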
Abstract:
This document presents a tool able to automatically gather data from real energy markets, generate scenarios, and capture and improve market players' profiles and strategies using knowledge discovery in databases supported by artificial intelligence techniques, data mining algorithms, and machine learning methods. It provides the means to generate scenarios of different dimensions and characteristics, ensuring the representation of real, adapted markets and their participating entities. The scenario generator module enhances the MASCEM (Multi-Agent Simulator of Competitive Electricity Markets) simulator, providing a more effective tool for decision support. The implementation of the proposed module enables researchers and electricity market participants to analyze data, create realistic scenarios, and experiment with them. On the other hand, applying knowledge discovery techniques to real data also allows the improvement of MASCEM agents' profiles and strategies, resulting in a better representation of real market players' behavior. This work aims to improve the comprehension of electricity markets and the interactions among the involved entities through adequate multi-agent simulation.
Abstract:
The electricity market restructuring, along with the increasing need for an adequate integration of renewable energy sources, is resulting in rising complexity in power systems operation. Various power system simulators have been introduced in recent years with the purpose of helping operators, regulators, and involved players understand and deal with this complex environment. This paper focuses on the development of an upper ontology that integrates the essential concepts needed to interpret all the available information. The restructuring of MASCEM (Multi-Agent System for Competitive Electricity Markets) and this system's integration with MASGriP (Multi-Agent Smart Grid Platform) and ALBidS (Adaptive Learning Strategic Bidding System) provide the means for exemplifying the usefulness of this ontology. A practical example is presented, showing how common simulation scenarios for different simulators, directed at very distinct environments, can be created starting from the proposed ontology.
Abstract:
This paper presents the Realistic Scenarios Generator (RealScen), a tool that processes data from real electricity markets to generate realistic scenarios that enable the modeling of electricity market players’ characteristics and strategic behavior. The proposed tool provides significant advantages to the decision making process in an electricity market environment, especially when coupled with a multi-agent electricity markets simulator. The generation of realistic scenarios is performed using mechanisms for intelligent data analysis, which are based on artificial intelligence and data mining algorithms. These techniques allow the study of realistic scenarios, adapted to the existing markets, and improve the representation of market entities as software agents, enabling a detailed modeling of their profiles and strategies. This work contributes significantly to the understanding of the interactions between the entities acting in electricity markets by increasing the capability and realism of market simulations.
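The idea of deriving player profiles from real market data can be sketched with a tiny one-dimensional clustering of mean bid prices. This is an assumed stand-in for the data mining techniques the abstract mentions, not RealScen's actual implementation; the player names and prices are hypothetical.

```python
# Minimal stand-in (assumed, NOT RealScen's implementation) for profiling
# market players from historical data: split players into "low" and "high"
# bidder profiles with a tiny two-center 1-D k-means on mean bid prices.

def two_profiles(mean_prices):
    """mean_prices: dict of player -> mean bid price (hypothetical data)."""
    values = sorted(mean_prices.values())
    lo, hi = values[0], values[-1]              # initial cluster centers
    for _ in range(20):                          # fixed-iteration k-means
        low = [v for v in values if abs(v - lo) <= abs(v - hi)]
        high = [v for v in values if abs(v - lo) > abs(v - hi)]
        if not low or not high:                  # degenerate split: stop
            break
        lo, hi = sum(low) / len(low), sum(high) / len(high)
    return {
        "low_bidders": [p for p, v in mean_prices.items()
                        if abs(v - lo) <= abs(v - hi)],
        "high_bidders": [p for p, v in mean_prices.items()
                         if abs(v - lo) > abs(v - hi)],
    }

prices = {"player1": 32.0, "player2": 35.0, "player3": 70.0, "player4": 75.0}
profiles = two_profiles(prices)
```

A scenario generator could then assign each simulated agent a bidding strategy drawn from its cluster, which is the sense in which clustering "improves the representation of market entities as software agents".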
Abstract:
The decline of petroleum reserves and the environmental consequences of using fossil fuels in diesel engines have led to the search for alternative fuels. This research, grounded in renewable energy sources, has become essential in the face of growing energy demand and the limited supply of fossil fuels. Waste cooking oil, animal fat, and other residues of biological origin, such as spent coffee grounds, are examples of raw materials for biodiesel production. Their valorization is of interest from both an environmental and an economic perspective, since it not only increases the flexibility and diversification of raw materials but also contributes to cost stability and to changes in agricultural and land-use policies. It is in this context that biodiesel and spent coffee grounds come together: the aim here is to study, at laboratory scale, the production of biodiesel from spent coffee grounds by enzymatic transesterification, seeking the best reaction conditions. Starting with the characterization of the spent coffee grounds, several parameters were evaluated before and after oil extraction, notably: moisture content (16.97% and 6.79%), ash content (1.91% and 1.57%), nitrogen content (1.71% and 2.30%), protein content (10.7% and 14.4%), carbon content (70.2% and 71.7%), crude cellulose content (14.77% and 18.48%), lignin content (31.03% and 30.97%), and higher heating value (19.5 MJ/kg and 19.9 MJ/kg). In summary, most of these values do not differ substantially from those reported in the literature, highlighting the potential of this biomass as a heat source for combustion and energy generation. Since the characterization of the oil extracted from the spent coffee grounds was one of the objectives preceding biodiesel production, the most significant parameters were evaluated.
Regarding the characterization of the extracted oil, of note are its kinematic viscosity (38.04 mm2/s), density (0.9032 g/cm3), heating value (37.9 kcal/kg), iodine value (63.0 g I2/100 g oil), water content (0.15%), acid value (44.8 mg KOH/g oil), flash point above 120 ºC, and fatty acid content (82.8%). Preliminary tests were first carried out to select the lipase (Lipase RMIM, TL 100L, or CALB L) and alcohol (pure methanol or ethanol) best suited to biodiesel production; a yield of 83.5% was obtained by transesterification mediated by the lipase RMIM, with ethanol as the alcohol. Another objective being the optimization of the enzymatic transesterification process, a central composite design with three variables (ethanol:oil molar ratio, enzyme concentration, and temperature) was carried out using the JMP 8.0 software. The best conditions were found to be an ethanol:oil molar ratio of 5:1, addition of 4.5% (w/w) of enzyme, and a temperature of 45 ºC, which led to an experimental yield of 96.7% and an ester content of 87.6%. Under these conditions, the theoretical yield was 99.98%. The effect of adding water to the ethanol was also studied, i.e. the effect of varying the ethanol concentration by adding water, for ethanol contents of 92%, 85%, and 75%. Down to 92% ethanol, transesterification increased (97.2%) for an ester content of 92.2%, whereas for higher amounts of added water (75% and 85% ethanol) both the final ester content (77.2% and 89.9%) and the reaction yield (84.3% and 91.9%) decreased. This indicates that the hydrolysis reaction occurs to a greater extent, shifting the equilibrium away from the formation of the products, i.e. the esters.
Finally, regarding the costs associated with the biodiesel production process, these were estimated for the set of 27 experiments carried out in this work, corresponding to 767.4 g of biodiesel produced; the cost of the reagents exceeded the energy cost, at €156.16 and €126.02, respectively. Naturally, we do not expect costs of this order of magnitude at industrial scale, all the more so given economies of scale and the fact that the enzymes used in the process would be reused several times.