37 results for Concurrent object-oriented Petri nets (CO-OPN)
at Instituto Politécnico do Porto, Portugal
Abstract:
Applications are subject to a continuous evolution process with a profound impact on their underlying data model, hence requiring frequent updates to the applications' class structure and to the database structure as well. This twofold problem, schema evolution and instance adaptation, usually known as database evolution, is addressed in this thesis. Additionally, we address concurrency and error recovery problems with a novel meta-model and its aspect-oriented implementation. Modern object-oriented databases provide features that help programmers deal with object persistence, as well as with related problems such as database evolution, concurrency and error handling. In most systems there are transparent mechanisms to address these problems; nonetheless, the database evolution problem still requires some human intervention, which consumes much of the programmers' and database administrators' work effort. Earlier research has demonstrated that aspect-oriented programming (AOP) techniques enable the development of flexible and pluggable systems. In these earlier works, the schema evolution and instance adaptation problems were addressed as database management concerns. However, none of this research focused on orthogonal persistent systems. We argue that AOP techniques are well suited to address these problems in orthogonal persistent systems. Regarding concurrency and error recovery, earlier research showed that only syntactic obliviousness between the base program and aspects is possible. Our meta-model and framework follow an aspect-oriented approach focused on the object-oriented orthogonal persistent context. The proposed meta-model is characterized by its simplicity, in order to achieve efficient and transparent database evolution mechanisms. Our meta-model supports multiple versions of a class structure by applying a class versioning strategy, thus enabling bidirectional application compatibility among versions of each class structure. That is to say, the database structure can be updated while earlier applications continue to work, as do later applications that only know the updated class structure. The specific characteristics of orthogonal persistent systems, as well as a metadata enrichment strategy within the application's source code, complete the inception of the meta-model and motivated our research work. To test the feasibility of the approach, a prototype was developed. Our prototype is a framework that mediates the interaction between applications and the database, providing them with orthogonal persistence mechanisms. These mechanisms are introduced into applications as an aspect, in the aspect-oriented sense. Objects are not required to extend any superclass, implement any interface, or carry a particular annotation. Classes with parametric types are also handled correctly by our framework. However, classes that belong to the programming environment cannot be handled as versionable, due to restrictions imposed by the Java Virtual Machine. Regarding concurrency support, the framework provides applications with a multithreaded environment that supports database transactions and error recovery. The framework keeps applications oblivious to the database evolution problem, as well as to persistence. Programmers can update the applications' class structure, since the framework produces a new version of it at the database metadata layer.
Using our XML-based pointcut/advice constructs, the framework's instance adaptation mechanism can be extended, thus keeping the framework oblivious to this problem as well. The potential development gains provided by the prototype were benchmarked. In our case study, the results confirm that the mechanisms' transparency has positive repercussions on programmer productivity, simplifying the entire evolution process at the application and database levels. The meta-model itself was also benchmarked in terms of complexity and agility; compared with other meta-models, it requires fewer meta-object modifications in each schema evolution step. Other types of tests were carried out to validate the robustness of the prototype and the meta-model. For these tests we used a small-sized OO7 database, chosen for the complexity of its data model. Since the developed prototype offers features that were not observed in other known systems, performance benchmarks were not possible; however, the developed benchmark is now available for future performance comparisons with equivalent systems. To test our approach in a real-world scenario, we developed a proof-of-concept application. This application was developed without any persistence mechanisms; using our framework and minor changes to the application's source code, we added these mechanisms. Furthermore, we tested the application in a schema evolution scenario. This real-world experience with our framework showed that applications remain oblivious to persistence and database evolution. In this case study, our framework proved to be a useful tool for programmers and database administrators. Performance issues and the single Java Virtual Machine concurrency model are the major limitations found in the framework.
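To make the class-versioning strategy described above more concrete, here is a minimal, hypothetical Java sketch (not the thesis' actual framework): a metadata registry that keeps one descriptor per version of a class structure, so that older and newer applications can each resolve the version of the class they know. All names are illustrative.

```java
import java.util.*;

// Hypothetical sketch of the class-versioning idea described in the abstract:
// the metadata layer keeps one descriptor per version of a class structure,
// so both older and newer applications can resolve the fields they expect.
public class ClassVersionRegistry {

    /** Immutable description of one version of a class structure. */
    record ClassVersion(String className, int version, Map<String, String> fieldsToTypes) {}

    private final Map<String, List<ClassVersion>> versionsByClass = new HashMap<>();

    /** Registers a new version; called when the framework detects an updated class structure. */
    public ClassVersion registerNewVersion(String className, Map<String, String> fieldsToTypes) {
        List<ClassVersion> versions = versionsByClass.computeIfAbsent(className, k -> new ArrayList<>());
        ClassVersion v = new ClassVersion(className, versions.size() + 1, Map.copyOf(fieldsToTypes));
        versions.add(v);
        return v;
    }

    /** Returns the version an application was built against, or the latest if unknown. */
    public ClassVersion resolve(String className, int requestedVersion) {
        List<ClassVersion> versions = versionsByClass.getOrDefault(className, List.of());
        if (versions.isEmpty()) throw new IllegalArgumentException("Unknown class: " + className);
        if (requestedVersion >= 1 && requestedVersion <= versions.size()) {
            return versions.get(requestedVersion - 1);
        }
        return versions.get(versions.size() - 1); // fall back to the latest version
    }

    public static void main(String[] args) {
        ClassVersionRegistry registry = new ClassVersionRegistry();
        registry.registerNewVersion("Customer", Map.of("name", "String"));
        registry.registerNewVersion("Customer", Map.of("firstName", "String", "lastName", "String"));
        // An older application still resolves version 1; a newer one resolves version 2.
        System.out.println(registry.resolve("Customer", 1));
        System.out.println(registry.resolve("Customer", 2));
    }
}
```

In the real framework this metadata would live at the database metadata layer and be maintained transparently by the persistence aspect; a simple in-memory map stands in for it here.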
Abstract:
The life cycle of software applications is generally very short, and their requirements are extremely volatile. Under these conditions, programmers need development tools and techniques that offer a very high level of productivity. We consider code reuse the most prominent approach to solving that problem. Our proposal uses the advantages provided by Aspect-Oriented Programming to build a reusable framework capable of making both the programmer and the application oblivious to data persistence, thus avoiding the need to write a single line of code for that concern. Besides the productivity benefits, software quality also increases. This paper describes the current state of the art, identifying the main challenges in building a complete and reusable framework for Orthogonal Persistence in concurrent environments with support for transactions. The present work also includes a successfully developed prototype of that framework, capable of freeing the programmer from implementing any data read or write operations. This prototype is supported by an object-oriented database and, in the future, will also use a relational database and support transactions.
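As a rough analogy for the obliviousness the paper aims at, the sketch below intercepts data-access calls with a plain JDK dynamic proxy, so the application code contains no explicit read or write operations. The actual prototype relies on aspect-oriented weaving and an object-oriented database, neither of which appears here; the AccountStore interface and the in-memory map are invented for illustration.

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Method;
import java.lang.reflect.Proxy;
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch (not the paper's framework): interception of data access so the
// application never writes explicit read/write code. Aspect weaving is replaced here
// by a JDK dynamic proxy to keep the example self-contained.
public class ObliviousPersistenceDemo {

    /** The application only sees this plain interface; no persistence code anywhere. */
    interface AccountStore {
        void put(String id, double balance);
        Double get(String id);
    }

    /** Stand-in "database": an in-memory map plus logging of the intercepted operations. */
    static class PersistenceHandler implements InvocationHandler {
        private final Map<String, Double> backingStore = new HashMap<>();

        @Override
        public Object invoke(Object proxy, Method method, Object[] args) {
            switch (method.getName()) {
                case "put" -> {
                    System.out.println("[persistence] storing " + args[0]);
                    backingStore.put((String) args[0], (Double) args[1]);
                    return null;
                }
                case "get" -> {
                    System.out.println("[persistence] loading " + args[0]);
                    return backingStore.get((String) args[0]);
                }
                default -> throw new UnsupportedOperationException(method.getName());
            }
        }
    }

    public static void main(String[] args) {
        AccountStore store = (AccountStore) Proxy.newProxyInstance(
                AccountStore.class.getClassLoader(),
                new Class<?>[] {AccountStore.class},
                new PersistenceHandler());
        store.put("acc-1", 100.0);            // application code: no persistence logic
        System.out.println(store.get("acc-1"));
    }
}
```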
Abstract:
Multiple-choice items are used in many different kinds of tests across several areas of knowledge. They can be considered an interesting tool for self-assessment or an alternative or complementary instrument to traditional methods of assessing knowledge. The objectivity and accuracy of multiple-choice tests are important reasons to consider them, and they are especially useful when the number of students to evaluate is very large. Moodle (Modular Object-Oriented Dynamic Learning Environment) is an Open Source course management system centred on learners' needs and designed to support collaborative approaches to teaching and learning. Moodle offers users a rich interface, context-specific help buttons, and a wide variety of tools, such as discussion forums, wikis, chat, surveys, quizzes, glossaries, journals, grade books and more, that allow them to learn and collaborate in a truly interactive space. Bringing together the interactivity of the Moodle platform and the objectivity of this kind of test, one can easily build manifold random tests. The purpose of this paper is to relate our journey in the construction of these tests and to share our experience in using the Moodle platform to create, exploit and improve multiple-choice tests in the area of Mathematics.
Abstract:
The field of computer simulation has grown rapidly since its inception and is currently one of the most widely used management-science and operational-research disciplines. Its principle is based on replicating the operation of processes or systems over periods of time, thus becoming an indispensable methodology for solving a wide range of real-world problems, regardless of their complexity. Among its countless application areas, in the most diverse fields, the one that stands out most is its use in production systems, where the range of available applications is very broad. It has been applied to solve problems in production systems because it allows companies to adjust and plan their operations and systems in a fast, effective and considered way, thus enabling them to adapt quickly to the constantly changing needs of the global economy. Simulation applications and packages have followed technological trends, and the use of object-oriented technologies in their development is evident. This study was based, in a first phase, on gathering information to support modelling and simulation concepts, as well as their application to real-time production systems. It then focused on the development of a prototype of a simulation application for real-time manufacturing environments. This tool was developed with possible pedagogical purposes and academic use in mind, being capable of simulating a model of a production system and also featuring animation. Without ruling out the possibility of integrating other modules, or even other platforms, particular care was taken to ensure that its implementation relied on object-oriented development methodologies.
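The dissertation's prototype is not shown here; as a generic illustration of the discrete-event principle it builds on (replicating the operation of a system over time), the following minimal Java sketch processes events from a time-ordered agenda and advances a simulation clock. The entities and times are purely illustrative.

```java
import java.util.PriorityQueue;

// Minimal sketch of a discrete-event simulation core, in the spirit of the production-system
// simulator described above (this is not the dissertation's prototype, just an illustration):
// events are kept in a time-ordered queue and processed one at a time, advancing the clock.
public class MiniSimulation {

    record Event(double time, String description) {}

    private final PriorityQueue<Event> agenda =
            new PriorityQueue<>((a, b) -> Double.compare(a.time(), b.time()));
    private double clock = 0.0;

    void schedule(double delay, String description) {
        agenda.add(new Event(clock + delay, description));
    }

    void run() {
        while (!agenda.isEmpty()) {
            Event e = agenda.poll();
            clock = e.time();                      // advance the simulation clock
            System.out.printf("t=%.1f  %s%n", clock, e.description());
            if (e.description().startsWith("part arrives")) {
                schedule(2.5, "machine finishes " + e.description());
            }
        }
    }

    public static void main(String[] args) {
        MiniSimulation sim = new MiniSimulation();
        sim.schedule(0.0, "part arrives #1");
        sim.schedule(1.0, "part arrives #2");
        sim.run();
    }
}
```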
Abstract:
Application refactorings that imply schema evolution are common activities in programming practice. Although modern object-oriented databases provide transparent schema evolution mechanisms, those refactorings continue to be time-consuming tasks for programmers. In this paper we address this problem with a novel approach based on the aspect-oriented programming and orthogonal persistence paradigms, as well as on our meta-model. An overview of our framework, a prototype based on that approach, is presented. It provides applications with persistence and database evolution aspects. It also provides a new pointcut/advice language that enables the modularization of the instance adaptation crosscutting concern of classes that were subject to schema evolution. We also present an application that relies on our framework. This application was developed without any concern for persistence and database evolution; nevertheless, its data is recovered in each execution, and objects in previous schema versions remain transparently available by means of our framework.
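The framework's new pointcut/advice language is not specified in this abstract, so the sketch below uses an invented XML format purely to illustrate the underlying idea: declaratively describing an instance adaptation (here, a field rename between schema versions) and applying it when an object stored under an older schema version is loaded.

```java
import java.io.StringReader;
import java.util.HashMap;
import java.util.Map;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;
import org.xml.sax.InputSource;

// Hypothetical illustration of instance adaptation driven by an XML description.
// The element names below ("adapt", "rename") are invented for this sketch and are
// not the framework's actual pointcut/advice language.
public class InstanceAdaptationDemo {

    static final String RULES = """
            <adapt class="Customer" fromVersion="1" toVersion="2">
              <rename from="name" to="fullName"/>
            </adapt>
            """;

    public static void main(String[] args) throws Exception {
        // Stored object in schema version 1, represented here as a field map.
        Map<String, Object> oldInstance = new HashMap<>(Map.of("name", "Ada Lovelace"));

        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new InputSource(new StringReader(RULES)));
        NodeList renames = doc.getElementsByTagName("rename");

        Map<String, Object> adapted = new HashMap<>(oldInstance);
        for (int i = 0; i < renames.getLength(); i++) {
            Element rule = (Element) renames.item(i);
            Object value = adapted.remove(rule.getAttribute("from"));
            adapted.put(rule.getAttribute("to"), value);   // apply the field rename
        }
        System.out.println("v1 instance: " + oldInstance);
        System.out.println("v2 instance: " + adapted);
    }
}
```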
Abstract:
Application development is a rapidly expanding area in the information technology market and, as such, it evolves quickly. The drivers of this trend are communications and computing equipment, which have become more robust and increasingly faster. The role of applications is to keep up with this evolution, with more complex/complete architectures designed to handle all client requests by producing responses within acceptable times. This dissertation addresses several application architectures that can be implemented depending on the context, for example, a scenario with few or many clients, little or much capital to invest in servers, and so on. A grounding in the concepts underlying application development is provided. The state of the art of web and object-oriented programming languages, databases, JavaScript frameworks, application architectures and, finally, approaches to defining measurable goals in application development is then analysed. Two prototypes were implemented: one with a multi-tier architecture using several programming languages and technologies, and the other with a single (monolithic) tier and a single programming language. The two prototypes were tested and compared in order to choose one of the architectures for a given usage scenario.
Abstract:
In this paper, the problem of the evolution of an object-oriented database in the context of orthogonal persistent programming systems is addressed. We have observed two characteristics in this type of system that offer particular conditions for implementing evolution in a semi-transparent fashion. That transparency can be further enhanced with the obliviousness provided by Aspect-Oriented Programming techniques. A meta-model was conceived and a prototype developed to test the feasibility of our approach. The system allows programs written against one schema to access, semi-transparently, data in other versions of the schema.
Abstract:
Purpose - Economics and business have evolved as sciences in order to accommodate more 'real world' solutions to the problems they approach. In many cases, both business and economics have been supported by other disciplines in order to obtain a more complete framework for the study of complex issues. The aim of this paper is to explore the contribution of three heterodox economics disciplines to the knowledge of business co-operation. Design/methodology/approach - The approach is theoretical, showing that many relevant aspects of business co-operation have been proposed by economic geography, institutional economics, and economic sociology. Findings - This paper highlights the business mechanisms of co-operation, reflecting on the role of places, institutions and the social context in which businesses operate. Research implications - It contributes a theoretical framework for the explanation of business co-operation and networks that goes beyond traditional economics theories. Originality/value - This paper contributes a framework for the study of business co-operation from both an economics and a management perspective. This framework embodies a number of non-quantitative issues that are critical for understanding the complex networks in which firms operate.
Abstract:
The constant and systematic rise in fossil fuel prices and ongoing environmental concerns have driven the search for environmentally sustainable solutions. Biodiesel thus emerges as an alternative to this problem, as well as a solution for liquid and fatty residues produced by humans. Biodiesel production has received extensive attention in recent years, as it is a biodegradable and non-polluting fuel. Biodiesel production by transesterification using short-chain alcohols and chemical catalysts, namely alkaline ones, has been industrially accepted due to its high conversion. Recently, enzymatic transesterification has been gaining supporters; however, the cost of the enzyme remains a barrier to its large-scale application. The present work aims at producing biodiesel by enzymatic transesterification from residual vegetable oil. The alcohol used was ethanol, replacing the methanol conventionally used in homogeneous catalysis, since the enzyme's activity is inhibited by the latter. The main difficulties of ethanolysis lie in the phase separation (glycerol and biodiesel) after the reaction and in the lower reaction rate. To help overcome this disadvantage, the influence of two co-solvents, hexane and hexanol, at a proportion of 20% (v/v), was studied. After selecting the co-solvent that gives the best yield (hexane), a factorial design was drawn up in which the influence of three variables on biodiesel production by enzymatic catalysis with ethanol and co-solvents was studied: the oil/alcohol molar ratio (1:8, 1:6 and 1:4), the amount of added co-solvent (30, 20 and 10%, v/v) and the reaction time (48, 36 and 24 h). The process was initially evaluated through the reaction yield, in order to identify the best conditions, and later through the quantification of the ester content by gas chromatography. The biodiesel with the highest ester content was produced under conditions corresponding to an oil:alcohol molar ratio of 1:4, with 5 g of Lipozyme TL IM as catalyst, 10% co-solvent (hexane, v/v), at 35 ºC for 24 h. The yield of the biodiesel produced under these conditions was 73.3%, corresponding to an ethyl ester content of 64.7%. However, the highest yield obtained was 99.7%, for an oil/alcohol ratio of 1:8, 30% co-solvent (hexane, v/v) and a 48 h reaction at 35 ºC, which yielded only 46.1% esters. Finally, the quality of the biodiesel was also evaluated according to the specifications of the EN 14214 standard, through determinations of density, viscosity, flash point, water content, copper corrosion, acid value, iodine value, sodium (Na+) and potassium (K+) content, CFPP and calorific value. In Europe, there is currently no standard regulating the quality classification of ethyl esters as biodiesel. Nevertheless, the biodiesel produced was analysed according to the European standard EN 14214, which regulates the quality of methyl esters, and it was possible to conclude that none of the evaluated parameters complies with it.
Abstract:
Model-driven software development advocates the use of models as artefacts that actively participate in the development process. The model occupies a position at the same level as the code. This is an important approach that has been receiving increasing attention in recent times. The Object Management Group (OMG) is responsible for one of the main specifications used to define the architecture of systems whose development is model-driven: the Model Driven Architecture (MDA). The projects that have emerged around modelling and domain-specific languages for the Eclipse platform are a good example of the attention given to these areas. They are projects fully open to the community, which seek to respect the standards and which constitute an excellent opportunity to test and put into practice new ideas and approaches. In this dissertation, tools created within the Amalgamation Project, developed for the Eclipse platform, were used. Exploiting UML and using the QVT language, an automatic process was developed to extract elements of the system architecture from the requirements definition. Requirements are represented by UML models, which are transformed in order to obtain elements for an initial approximation to the system architecture. In the end, a UML model is obtained that aggregates the components, interfaces and data types extracted from the requirements models. It is a model-driven approach that has shown itself to be feasible, capable of offering practical results and promising with regard to future work.
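The actual mapping is expressed in QVT over UML models; as a rough, plain-Java analogue of that idea (not the dissertation's transformation rules), the sketch below turns each requirement into a candidate component with a provided interface, mimicking the extraction of an initial architecture approximation. All names are invented.

```java
import java.util.List;

// Plain-Java analogue of a requirements-to-architecture mapping (not QVT, not EMF):
// each requirement yields a candidate component plus a provided interface, as a
// first approximation to the system architecture.
public class RequirementsToArchitectureSketch {

    record Requirement(String name) {}
    record Component(String name, String providedInterface) {}

    static Component toComponent(Requirement r) {
        String base = r.name().replace(" ", "");
        return new Component(base + "Component", "I" + base);
    }

    public static void main(String[] args) {
        List<Requirement> requirements =
                List.of(new Requirement("Manage Orders"), new Requirement("Issue Invoice"));
        requirements.stream()
                .map(RequirementsToArchitectureSketch::toComponent)
                .forEach(System.out::println);
    }
}
```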
Abstract:
Concentrations of eleven trace elements (Al, As, Cd, Cr, Co, Hg, Mn, Ni, Pb, Se, and Si) were measured in 39 (natural and flavoured) water samples. Determinations were performed using graphite furnace electrothermal atomization for most elements (Al, As, Cd, Cr, Co, Mn, Ni, Pb, and Si); hydride generation was used for Se determination, and cold vapour generation for Hg. These techniques were coupled to atomic absorption spectrophotometry. The trace element content of still or sparkling natural waters varied from brand to brand. Significant differences between natural still and natural sparkling waters (p<0.001) were only apparent for Mn. The Mann–Whitney U-test was used to search for significant differences between flavoured and natural waters. The concentration of each element was compared with the presence of flavours, preservatives, acidifying agents, fruit juice and/or sweeteners, according to the labelled composition. It was shown that flavoured waters generally have a higher trace element content. The addition of preservatives and acidifying agents had a significant influence on Mn, Co, As and Si contents (p<0.05). Fruit juice was also correlated with increases in Co and As. Sweeteners did not produce any significant difference in Mn, Co, Se and Si content.
Abstract:
Adhesive bonding has become more efficient in the last few decades due to developments in adhesives, which grant higher strength and ductility. On the other hand, natural fibre composites have recently gained interest due to their low cost and density. It is therefore essential to predict the fracture behaviour of joints between these materials, to assess the feasibility of joining or repairing with adhesives. In this work, the tensile fracture toughness (Gn^c) of adhesive joints between natural fibre composites is studied, by bonding with a ductile adhesive and by co-curing. Conventional methods to obtain Gn^c are used for the co-cured specimens, while for the adhesive within the bonded joint the J-integral is considered. For the J-integral calculation, an optical measurement method is developed for the evaluation of the crack tip opening and the adherends' rotation at the crack tip during the test, supported by a Matlab sub-routine for the automated extraction of these quantities. As an output of this work, an optical method that allows an easier and quicker extraction of the parameters required to obtain Gn^c than the available methods is proposed (by the J-integral technique), and the fracture behaviour in tension of bonded and co-cured joints in jute-reinforced natural fibre composites is also provided for subsequent strength prediction. Additionally, for the adhesively-bonded joints, the tensile cohesive law of the adhesive is derived by the direct method.
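For context, the direct method mentioned above is commonly based on the following relation between the J-integral and the cohesive law; the notation used here (t_n for the tensile traction, δ_n for the crack tip opening) is generic and may differ from the paper's, and Gn^c is typically taken as the steady-state value of J.

```latex
% Direct method (generic form): J equals the work of the cohesive tractions over
% the crack tip opening, so the tensile cohesive law follows by differentiating
% the measured J with respect to the measured opening.
J = \int_{0}^{\delta_n} t_n(\delta)\,\mathrm{d}\delta
\qquad\Longrightarrow\qquad
t_n(\delta_n) = \frac{\partial J}{\partial \delta_n}
```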
Abstract:
In this study, efforts were made to put forward an integrated recycling approach for the thermoset-based glass fibre reinforced polymer (GFRP) rejects derived from the pultrusion manufacturing industry. Both the recycling process and the development of a new cost-effective end-use application for the recyclates were considered. For this purpose, i) among the several available recycling techniques for thermoset-based composite materials, the most suitable one for the envisaged application was selected (mechanical recycling); and ii) experimental work was carried out to assess the added value of the obtained recyclates as aggregate and reinforcement replacements in concrete-polymer composite materials. The potential recycling solution was assessed through the mechanical behaviour of the resultant GFRP-waste-modified concrete-polymer composites with regard to unmodified materials. In the mix-design process of the new GFRP-waste-based composite material, the recyclate content and size grade, and the effect of the incorporation of an adhesion promoter, were considered as material factors and systematically tested within reasonable ranges. The optimization process of the modified formulations was supported by the Fuzzy Boolean Nets methodology, which allowed finding the best balance between material parameters that maximizes both the flexural and compressive strengths of the final composite. Compared with related end-use applications of GFRP wastes in cementitious concrete materials, the proposed solution overcomes some of the problems found, namely the possible incompatibilities arising from the alkali-silica reaction and the decrease in mechanical properties due to the high water-cement ratio required to achieve the desired workability. The obtained results are very promising towards a global cost-effective waste management solution for GFRP industrial wastes and end-of-life products that will lead to a more sustainable composite materials industry.
Abstract:
Virtual Reality (VR) has grown to become state-of-the-art technology in many business- and consumer-oriented E-Commerce applications. One of the major design challenges of VR environments is the placement of the rendering process. The rendering process converts the abstract description of a scene, as contained in an object database, into an image. This process is usually done at the client side, as in VRML [1], a technology that requires the client's computational power for smooth rendering. The vision of VR is also strongly connected to the issue of Quality of Service (QoS), as the perceived realism is subject to an interactive frame rate ranging from 10 to 30 frames per second (fps), real-time feedback mechanisms and realistic image quality. These requirements push traditional home computers, or even highly sophisticated graphical workstations, beyond their limits. Our work therefore introduces an approach for a distributed rendering architecture that gracefully balances the workload between the client and a cluster-based server. We believe that a distributed rendering approach as described in this paper has three major benefits: it reduces the client's workload, it decreases the network traffic, and it allows already rendered scenes to be re-used.
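The paper's balancing algorithm is not described in this abstract; as a hypothetical illustration of a graceful client/server workload split, the Java sketch below scales the share of the scene rendered locally by how far the client's measured frame rate falls from the interactive target (10 to 30 fps), offloading the remainder to the cluster-based server. The heuristic and all numbers are invented.

```java
// Hypothetical load-balancing heuristic (not the paper's algorithm): the share of the scene
// rendered locally is scaled by how far the client's measured frame rate is from the target,
// so a slow client offloads more rendering work to the cluster-based server.
public class RenderLoadBalancer {

    /** Returns the fraction of scene objects the client should render itself, in [0, 1]. */
    static double clientShare(double measuredFps, double targetFps) {
        if (targetFps <= 0) throw new IllegalArgumentException("targetFps must be positive");
        double share = measuredFps / targetFps;          // 1.0 when the client keeps up
        return Math.max(0.0, Math.min(1.0, share));      // clamp to a valid fraction
    }

    public static void main(String[] args) {
        int sceneObjects = 10_000;
        double share = clientShare(12.0, 30.0);          // client manages 12 fps, target is 30 fps
        int renderedLocally = (int) Math.round(sceneObjects * share);
        System.out.printf("Client renders %d objects, server renders %d%n",
                renderedLocally, sceneObjects - renderedLocally);
    }
}
```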
Abstract:
Classical lock-based concurrency control does not scale with current and foreseen multi-core architectures, opening space for alternative concurrency control mechanisms. The concept of transactions executing concurrently in isolation, with an underlying mechanism maintaining a consistent system state, has already been explored in fault-tolerant and distributed systems, and is currently being explored by transactional memory, this time to manage concurrent memory access. In this paper we discuss the use of Software Transactional Memory (STM), and how Ada can provide support for it. Furthermore, we draft a general programming interface to transactional memory, supporting future implementations of STM oriented to real-time systems.
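Java, rather than Ada, is used for the sketch below; it is only a single-variable stand-in for the optimistic, retry-based idea behind transactional memory (read a snapshot, compute a new state, commit only if no other thread committed in between), not a full STM and not the interface drafted in the paper.

```java
import java.util.concurrent.atomic.AtomicReference;

// Single-variable stand-in for the optimistic, retry-based idea behind transactional
// memory: compute a new state from a snapshot and publish it only if no other thread
// committed in the meantime; otherwise retry (no locks are held).
public class TinyTransactionDemo {

    record Account(long balance) {}

    private static final AtomicReference<Account> ACCOUNT =
            new AtomicReference<>(new Account(0));

    static void deposit(long amount) {
        while (true) {                                    // retry loop: the "transaction"
            Account snapshot = ACCOUNT.get();             // read a consistent snapshot
            Account updated = new Account(snapshot.balance() + amount);
            if (ACCOUNT.compareAndSet(snapshot, updated)) {
                return;                                   // commit succeeded
            }                                             // else: conflict detected, retry
        }
    }

    public static void main(String[] args) throws InterruptedException {
        Thread t1 = new Thread(() -> { for (int i = 0; i < 1_000; i++) deposit(1); });
        Thread t2 = new Thread(() -> { for (int i = 0; i < 1_000; i++) deposit(1); });
        t1.start(); t2.start();
        t1.join(); t2.join();
        System.out.println("Final balance: " + ACCOUNT.get().balance());  // always 2000
    }
}
```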