46 results for ebXML (Electronic Business using eXtensible Markup Language)
at Instituto Politécnico do Porto, Portugal
Abstract:
Over time, the XML markup language has acquired considerable importance in application development, standards definition and the representation of large volumes of data, such as databases. Today, processing XML documents in a short period of time is a critical activity in a wide range of applications, which makes it essential to choose the most appropriate mechanism to parse XML documents quickly and efficiently. When using a programming language for XML processing, such as Java, it becomes necessary to use effective mechanisms, e.g. APIs, which allow large documents to be read and processed in an appropriate manner. This paper presents a performance study of the main existing Java APIs that deal with XML documents, in order to identify the most suitable one for processing large XML files.
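The trade-off the abstract alludes to can be seen in miniature below. This sketch is not from the paper: it contrasts the two classic Java XML APIs on a tiny document. SAX streams parse events in constant memory, while DOM materialises the whole tree before any processing; for large files that difference dominates performance. Class and variable names are illustrative.

```java
import java.io.ByteArrayInputStream;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;

public class XmlApiSketch {
    public static void main(String[] args) throws Exception {
        String xml = "<catalog><item>a</item><item>b</item><item>c</item></catalog>";

        // SAX: event-driven, constant memory; elements are counted as they stream past.
        final int[] saxCount = {0};
        SAXParserFactory.newInstance().newSAXParser().parse(
            new ByteArrayInputStream(xml.getBytes("UTF-8")),
            new DefaultHandler() {
                @Override
                public void startElement(String uri, String local, String qName, Attributes a) {
                    saxCount[0]++;
                }
            });

        // DOM: loads the entire tree into memory before it can be queried.
        int domCount = DocumentBuilderFactory.newInstance().newDocumentBuilder()
            .parse(new ByteArrayInputStream(xml.getBytes("UTF-8")))
            .getElementsByTagName("*").getLength();

        System.out.println("SAX elements: " + saxCount[0]);
        System.out.println("DOM elements: " + domCount);
    }
}
```

Both parsers see the same four elements; the measurable difference the paper studies only appears once documents grow beyond what fits comfortably in memory.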
Abstract:
This dissertation falls within the scope of Information Systems, specifically the development of Web applications, such as a website. With the large-scale use of technological means, their exponential growth has been observed, reflected in the ease with which various types of computing platforms can be found on the Internet. Moreover, nowadays a large share of organizations have their own website, where they publicize their services and/or products. The aim of this dissertation is to explore these new technologies, namely UML (Unified Modeling Language) diagrams and database design, and subsequently to develop a website. The development of this website does not propose the creation of a new technology, but rather the combined use of several technologies supported by UML tools. The work is organized in three main phases: requirements analysis, implementation and interface design. In the requirements analysis, the objectives proposed for the system and the needs/requirements for its implementation were surveyed, supported essentially by the system's Use Case Diagram. In the implementation phase, the files and directories that form the logical architecture were built according to the models described in the Class Diagram and the Entity-Relationship Diagram. The identified requirements were analyzed and used in the composition of the interfaces and the navigation system. Finally, in the interface design phase, the developed interfaces were refined based on the author's artistic and creative concept. This refinement reflects personal taste and aims to produce an interface that can also please as many users as possible.
This can be observed in the way the links between pages are laid out, in the titles, headers, colors and animations, and in the overall design. Different programming languages were used to develop the website, namely HyperText Markup Language (HTML), PHP (PHP: Hypertext Preprocessor) and JavaScript. HTML was used to lay out all the visible content of the pages and to define their layout, and PHP to run small scripts that interact with the different features of the site. JavaScript was used to define the design of the pages and to add some visual effects. The pages that make up the website were built with the Macromedia Dreamweaver software, which simplified their implementation given the ease with which pages can be constructed in it. To interact with the database management system, MySQL, the phpMyAdmin application was used, which simplifies access to the database, allowing its data to be defined, manipulated and queried.
Abstract:
In this paper we describe a low cost distributed system intended to increase the positioning accuracy of outdoor navigation systems based on the Global Positioning System (GPS). Since the accuracy of absolute GPS positioning is insufficient for many outdoor navigation tasks, another GPS based methodology – the Differential GPS (DGPS) – was developed in the nineties. The differential or relative positioning approach is based on the calculation and dissemination of the range errors of the received GPS satellites. GPS/DGPS receivers correlate the broadcasted GPS data with the DGPS corrections, granting users increased accuracy. DGPS data can be disseminated using terrestrial radio beacons, satellites and, more recently, the Internet. Our goal is to provide mobile platforms within our campus with DGPS data for precise outdoor navigation. To achieve this objective, we designed and implemented a three-tier client/server distributed system that, first, establishes Internet links with remote DGPS sources and, then, performs campus-wide dissemination of the obtained data. The Internet links are established between data servers connected to remote DGPS sources and the client, which is the data input module of the campus-wide DGPS data provider. The campus DGPS data provider allows the establishment of both Intranet and wireless links within the campus. This distributed system is expected to provide adequate support for accurate outdoor navigation tasks.
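The core of the differential technique described above is simple to state: a reference station at a known location computes, per satellite, the error in the measured pseudorange and broadcasts it, and receivers add the correction before solving for position. The sketch below is illustrative only (names and values are not from the paper); it assumes corrections are metres to be added to the measured range, whereas real DGPS messages also carry rate terms and correction ages.

```java
import java.util.HashMap;
import java.util.Map;

public class DgpsSketch {
    // Apply a per-satellite pseudorange correction (PRC), if one was broadcast.
    static double correct(double measuredRangeM, int prn, Map<Integer, Double> prc) {
        Double c = prc.get(prn);
        return (c == null) ? measuredRangeM : measuredRangeM + c;
    }

    public static void main(String[] args) {
        // Hypothetical corrections, keyed by satellite PRN, as a campus
        // provider might disseminate them to mobile platforms.
        Map<Integer, Double> prc = new HashMap<>();
        prc.put(12, -3.4);
        prc.put(25, 1.2);

        System.out.printf("PRN12: %.1f%n", correct(20000000.0, 12, prc));
        System.out.printf("PRN30: %.1f%n", correct(21000000.0, 30, prc)); // no correction known
    }
}
```

A satellite with no broadcast correction is left uncorrected, which is why the described system's continuous Internet link to remote DGPS sources matters: stale or missing corrections degrade accuracy back toward absolute GPS levels.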
Abstract:
The International Classification of Functioning, Disability and Health for Children and Youth (ICF-CY), using a common and universal language, enables communication between different professionals and researchers. It rests on a biopsychosocial approach that presupposes an interactive process between participation and the child's life contexts, classifying the child's functioning and defining functioning profiles. The restructuring of the Individual Educational Program (IEP) through Decree-Law 3/2008 incorporates ICF-CY terminology for determining the Functioning Profile and the Intervention Plan. Given its relevance and timeliness, the aim of this study was to analyze the application of the ICF in the Individualized Educational Programs of the children of the Cabedelo nursery and kindergarten, verifying whether the contents of the children's IEPs are linked to the ICF-CY components, whether they are drawn up so as to predominantly cover the constructs related to Activity and Participation and to the Environment according to the biopsychosocial model, and whether the constructs identified in the Functioning Profile appear as targets in the Intervention Plan. We selected a sample of 15 children with Special Educational Needs (SEN) aged between two and six years. The methodology was mixed, starting with a qualitative study followed by a quantitative methodology. A content analysis of the IEPs was thus carried out, using the linking rules to relate them to the ICF-CY, and a frequency analysis was subsequently performed using descriptive statistics. The results indicate that the contents of the IEPs are linked to the ICF-CY components. The Functioning Profile and the Intervention Plan focus on the Activities and Participation component, with Environmental Factors being less cited in both processes.
As for the correspondence between the constructs of the Functioning Profile and those of the Intervention Plan, there is no direct correspondence for most of the codes, although there is correspondence in the same area or in different but related areas.
Abstract:
Master's degree in Informatics Engineering
Abstract:
Model-driven software development advocates the use of models as artifacts that participate actively in the development process. The model occupies a position at the same level as the code. This is an important approach that has received growing attention in recent times. The Object Management Group (OMG) is responsible for one of the main specifications used to define the architecture of systems whose development is model-driven: Model Driven Architecture (MDA). The projects that have emerged around modeling and domain-specific languages for the Eclipse platform are a good example of the attention given to these areas. They are projects fully open to the community, which seek to respect the standards and constitute an excellent opportunity to test and put into practice new ideas and approaches. In this dissertation, tools created within the Amalgamation Project, developed for the Eclipse platform, were used. Exploring UML and using the QVT language, an automatic process was developed to extract elements of the system architecture from the requirements definition. The requirements are represented by UML models, which are transformed in order to obtain elements for an initial approximation of the system architecture. In the end, a UML model is obtained that aggregates the components, interfaces and data types extracted from the requirements models. It is a model-driven approach that has proven feasible, capable of offering practical results, and promising with regard to future work.
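The requirements-to-architecture step can be pictured with a toy transformation. The dissertation performs this with QVT over UML models inside Eclipse; the plain-Java sketch below only illustrates the shape of such a mapping rule (each use case yields a candidate component plus a provided interface), and every name in it is invented for illustration.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class RequirementsToArchitecture {
    // Minimal stand-in for a UML model element: a kind tag and a name.
    static class Element {
        final String kind, name;
        Element(String kind, String name) { this.kind = kind; this.name = name; }
        @Override public String toString() { return kind + ":" + name; }
    }

    // Toy rule: "Register User" -> component RegisterUserManager + interface IRegisterUser.
    static List<Element> transform(List<String> useCases) {
        List<Element> out = new ArrayList<>();
        for (String uc : useCases) {
            String base = uc.replace(" ", "");
            out.add(new Element("component", base + "Manager"));
            out.add(new Element("interface", "I" + base));
        }
        return out;
    }

    public static void main(String[] args) {
        System.out.println(transform(Arrays.asList("Register User", "Issue Report")));
    }
}
```

A real QVT transformation works over typed UML metamodel elements rather than strings, but the principle is the same: mechanical, repeatable rules turning requirement models into a first architectural approximation.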
Abstract:
This paper presents an architecture (Multi-μ) being implemented to study and develop software based fault tolerant mechanisms for Real-Time Systems, using the Ada language (Ada 95) and Commercial Off-The-Shelf (COTS) components. Several issues regarding fault tolerance are presented and mechanisms to achieve fault tolerance by software active replication in Ada 95 are discussed. The Multi-μ architecture, based on a specifically proposed Fault Tolerance Manager (FTManager), is then described. Finally, some considerations are made about the work being done and essential future developments.
Abstract:
Master's degree in Informatics Engineering - Specialization Area in Knowledge and Decision Technologies
Abstract:
This paper presents a Multi-Agent Market simulator designed for developing new agent market strategies based on a complete understanding of buyer and seller behaviors, preference models and pricing algorithms, considering user risk preferences and game theory for scenario analysis. This tool studies negotiations based on different market mechanisms and on time- and behavior-dependent strategies. The results of negotiations between agents are analyzed by data mining algorithms in order to extract rules that give agents feedback to improve their strategies. The system also includes agents that are capable of improving their performance with their own experience, by adapting to market conditions, and of considering other agents' reactions.
Abstract:
Doctoral Thesis in Information Systems and Technologies, Area of Engineering and Management of Information Systems
Abstract:
Currently, due to the widespread use of computers and the Internet, students are trading libraries for the World Wide Web and laboratories for simulation programs. In most courses, simulators are made available to students and can be used to prove theoretical results or to test hardware or a product under development. Although this is an attractive solution (low cost, and an easy and fast way to carry out some coursework), it has major disadvantages. As everything is currently being done with or in a computer, students are losing the "feel" for the real values of the magnitudes involved. For instance, in engineering studies, and mainly in the first years, students need to learn electronics, algorithms, mathematics and physics. All of these areas can use numerical analysis software, simulation software or spreadsheets, and in the majority of cases the data used are either simulated or random numbers, although real data could be used instead. For example, if a course uses numerical analysis software and needs a dataset, students can learn to manipulate arrays with it. Likewise, when using spreadsheets to build graphics, instead of a random table students could use a real dataset based, for instance, on the room temperature and its variation across the day. In this work we present a framework with a simple interface that allows it to be used by different courses in which computers are central to the teaching/learning process, giving students a more realistic feeling by using real data. The framework is based on a set of low-cost sensors for different physical magnitudes, e.g. temperature, light and wind speed, which are either connected to a central server that students access over an Ethernet protocol or connected directly to the student's computer/laptop. These sensors use the available communication ports, such as serial ports, parallel ports, Ethernet or Universal Serial Bus (USB).
Since a central server is used, students are encouraged to use the sensor values in their different courses and consequently in different types of software, such as numerical analysis tools, spreadsheets, or simply inside any programming language when a dataset is needed. To this end, small pieces of hardware were developed, each containing at least one sensor and using a different type of computer communication. As long as the sensors are attached to a server connected to the Internet, these tools can also be shared between different schools. This allows sensors that are not available in a given school to be used by getting the values from other places that share them. Another remark is that students in more advanced years, with (theoretically) more know-how, can use the courses that have some affinity with electronics development to build new sensor modules and expand the framework further. The final solution is very interesting: low cost, simple to develop, and flexible in its use of resources, since the same materials serve several courses, bringing real-world data into the students' computer work.
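Assuming the framework exposes each probe behind a single read operation regardless of transport (serial, USB or Ethernet), a course exercise could collect a real dataset as sketched below. Everything here is hypothetical, since the abstract does not publish an API; a simulated temperature sensor stands in for real hardware.

```java
import java.util.ArrayList;
import java.util.List;

public class SensorFrameworkSketch {
    // Assumed framework interface: every sensor, whatever its transport,
    // exposes one read() method returning the current value.
    interface Sensor { double read(); }

    // Illustrative stand-in for a real probe: 20 degrees plus a daily cycle.
    static class FakeTemperatureSensor implements Sensor {
        private int t = 0;
        public double read() { return 20.0 + Math.sin(t++ / 24.0 * 2 * Math.PI); }
    }

    // A course exercise then collects a real dataset instead of random numbers.
    static List<Double> sample(Sensor s, int n) {
        List<Double> data = new ArrayList<>();
        for (int i = 0; i < n; i++) data.add(s.read());
        return data;
    }

    public static void main(String[] args) {
        List<Double> day = sample(new FakeTemperatureSensor(), 24); // one reading per hour
        double mean = day.stream().mapToDouble(Double::doubleValue).average().orElse(0);
        System.out.printf("samples=%d mean=%.2f%n", day.size(), mean);
    }
}
```

Swapping `FakeTemperatureSensor` for a class that reads the central server over Ethernet would leave the exercise code unchanged, which is the flexibility the framework aims for.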
Abstract:
Globalisation has eliminated frontiers and, in the case of Europe, has helped the crossing of borders and changed forever the concept of social (and, I would also say, individual) geography: the rest of the world is out there; we can pretend not to see it, but we cannot avoid it. Moreover, Europe is undergoing a serious crisis, also economic, and new markets and business partners are welcome. In this context, cultural bonds like a common language can open new investment paths and give rise to success stories. In this paper we intend to present an example of how low linguistic distance can lead to good business, even if a) in the internationalization process of companies, language is often forgotten as a management element and b) consumers of language products (like User Guides) are also not stimulating investment in language by the companies. Through the results of two studies carried out in 2010 and 2011, we show how a pluricentric language like Portuguese is managed in multinational companies (MC) and small and medium enterprises (SMEs). The second study is based on an online survey questioning the effectiveness, efficiency and general quality of User Guides and the reaction of consumers to language. Results show that although it plays a role in the internationalization process of companies in the same linguistic space, language is used opportunistically. On the other hand, Portuguese and Brazilian consumers show a very functional perception of the Portuguese language...
Abstract:
Introduction / Aims: Adopting important decisions is a specific task of the manager. An efficient manager takes these decisions through a systematic process with well-defined elements, each in a precise order. In pharmaceutical practice and business, in the supply process of pharmacies, there are situations in which medicine distributors offer a certain discount but require payment within a shorter period of time. In these cases, the offer can be analyzed with the decision tree method, which permits identifying the decision offering the best possible result in a given situation. The aims of the research were to analyze the product offers of several different suppliers and to establish the most advantageous ways of supplying a pharmacy. Material / Methods: The general product offers of the following medical distributors were studied: A&G Med, Farmanord, Farmexim, Mediplus, Montero and Relad. For medicine offers including a discount, the decision tree method was applied in order to select the most advantageous offers. The decision tree is a management method used for taking the right decisions, generally applied when one needs to evaluate decisions that involve a series of stages. The tree diagram is used to look for the most efficient means of attaining a specific goal. Decision trees are probabilistic methods, useful when adopting decisions that involve risk. Results: The analysis of the tree diagrams indicated that purchasing medicines with a discount (1%, 10%, 15%) and payment in a shorter time interval (120 days) is more profitable than purchasing without a discount and payment in a longer time interval (160 days). Discussion / Conclusion: Depending on the results of the tree diagram analysis, the pharmacies would purchase from the selected suppliers.
The research has shown that the decision tree method is a valuable working instrument for choosing the best ways of supplying pharmacies, and that it is very useful to specialists in the pharmaceutical field and in pharmaceutical management, to medicine suppliers, to pharmacy practitioners in community pharmacies and especially to pharmacy managers and chief pharmacists.
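The economics behind the result can be checked with a small worked example. All figures below are illustrative, not the paper's data: an invoice of 1000, a 10% discount for payment at 120 days versus full price at 160 days, and an assumed 8% annual cost of capital with simple interest. Discounting both options to present value shows the discounted offer winning.

```java
public class DiscountDecision {
    // Present value of paying `amount` after `days`, at annual rate `r`.
    // Simple-interest discounting is an illustrative assumption.
    static double presentValue(double amount, int days, double r) {
        return amount / (1 + r * days / 365.0);
    }

    public static void main(String[] args) {
        double price = 1000.0, rate = 0.08; // hypothetical invoice and cost of capital

        double withDiscount = presentValue(price * (1 - 0.10), 120, rate); // 10% off, pay day 120
        double noDiscount   = presentValue(price, 160, rate);              // full price, pay day 160

        System.out.printf("discount=%.2f full=%.2f better=%s%n",
            withDiscount, noDiscount, withDiscount < noDiscount ? "discount" : "full");
    }
}
```

Each branch of a decision tree would carry such a present-value payoff (weighted by probability where outcomes are uncertain), and the branch with the lowest expected cost is chosen, which is the comparison the paper automates across its six suppliers.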
Abstract:
Electricity market players operating in a liberalized environment require access to an adequate decision support tool, allowing them to consider all business opportunities and take strategic decisions. Ancillary services represent a good negotiation opportunity that must be considered by market players. For this, decision support tools must include ancillary services market simulation. This paper proposes two different methods (Linear Programming and Genetic Algorithm approaches) for ancillary services dispatch. The methodologies are implemented in MASCEM, a multi-agent based electricity market simulator. A test case concerning the dispatch of Regulation Down, Regulation Up, Spinning Reserve and Non-Spinning Reserve services is included in this paper.
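For a single reserve service with no network constraints, the LP solution coincides with a simple merit-order rule: accept the cheapest capacity bids until the requirement is covered. The sketch below implements only that special case (the paper's LP and GA formulations handle the general dispatch across several services); all bid values are invented.

```java
import java.util.Arrays;
import java.util.Comparator;

public class ReserveDispatchSketch {
    // Hypothetical capacity bid for one ancillary service.
    static class Bid {
        final String agent; final double priceEurPerMW; final double capacityMW;
        Bid(String agent, double price, double cap) {
            this.agent = agent; this.priceEurPerMW = price; this.capacityMW = cap;
        }
    }

    // Merit-order dispatch: cheapest bids first until the requirement is met.
    static double dispatchCost(Bid[] bids, double requirementMW) {
        Arrays.sort(bids, Comparator.comparingDouble((Bid b) -> b.priceEurPerMW));
        double cost = 0, remaining = requirementMW;
        for (Bid b : bids) {
            if (remaining <= 0) break;
            double take = Math.min(b.capacityMW, remaining);
            cost += take * b.priceEurPerMW;
            remaining -= take;
        }
        return cost;
    }

    public static void main(String[] args) {
        Bid[] bids = {
            new Bid("A", 12.0, 50), new Bid("B", 8.0, 30), new Bid("C", 10.0, 40)
        };
        // 80 MW required: 30 MW at 8 + 40 MW at 10 + 10 MW at 12 = 760
        System.out.printf("cost=%.1f%n", dispatchCost(bids, 80.0));
    }
}
```

Once bids for several services compete for the same generating capacity, greedy ordering is no longer optimal, which is what motivates the LP and GA formulations the paper actually proposes.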