994 results for Automated process
Abstract:
The development of self-adaptive software (SaS) has specific characteristics compared to traditional software, since it allows changes to be incorporated at runtime. Automated processes have been used as a feasible solution for conducting software adaptation at runtime. In parallel, reference models have been used to aggregate knowledge and architectural artifacts, since they capture the essence of systems in specific domains. However, there is currently no reference model based on reflection for the development of SaS. Thus, the main contribution of this paper is to present a reflection-based reference model for the development of SaS that needs to adapt at runtime. To demonstrate the applicability of this model, a case study was conducted, and good prospects for contributing efficiently to the area of SaS were obtained.
Abstract:
The problem of software (SW) defects is becoming more and more topical because of the increasing amount of SW and its growing complexity. The majority of these defects are found during the testing phase, which consumes about 40-50% of development effort. Test automation allows reducing the cost of this process and increasing testing effectiveness. In the mid-1980s the first tools for automated testing appeared, and the automated process was implemented in different kinds of SW testing. In a short time it became obvious that automated testing can cause many problems, such as increased product cost, decreased reliability, and even project failure. This thesis describes the automated testing process and its concepts, lists the main problems, and gives an algorithm for automated test tool selection. This work also presents an overview of the main automated test tools for embedded systems.
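The thesis's selection algorithm itself is not reproduced in the abstract; purely as an illustration, tool-selection procedures of this kind are often built on weighted scoring. In the sketch below the criteria, weights, and tool ratings are all invented:

```python
# Hypothetical weighted-scoring sketch for automated test tool selection.
# Criteria, weights, and ratings are illustrative, not from the thesis.

def score_tool(ratings, weights):
    """Weighted sum of per-criterion ratings (0-10 scale)."""
    return sum(ratings[c] * w for c, w in weights.items())

weights = {"cost": 0.3, "reliability": 0.4, "embedded_support": 0.3}

tools = {
    "ToolA": {"cost": 7, "reliability": 8, "embedded_support": 9},
    "ToolB": {"cost": 9, "reliability": 5, "embedded_support": 4},
}

# Pick the tool with the highest weighted score.
best = max(tools, key=lambda name: score_tool(tools[name], weights))
```

In practice the criteria list and weights would come from the project's constraints (target platform, budget, required reliability), which is what a selection algorithm formalizes.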
Abstract:
Model-driven software development advocates the use of models as artifacts that participate actively in the development process. The model occupies a position at the same level as the code. This is an important approach that has received growing attention in recent times. The Object Management Group (OMG) is responsible for one of the main specifications used to define the architecture of systems whose development is model-driven: Model Driven Architecture (MDA). The projects that have emerged around modeling and domain-specific languages for the Eclipse platform are a good example of the attention given to these areas. They are projects fully open to the community, which seek to respect standards and constitute an excellent opportunity to test and put into practice new ideas and approaches. In this dissertation, tools created within the Amalgamation Project, developed for the Eclipse platform, were used. Exploring UML and using the QVT language, an automated process was developed to extract elements of the system architecture from the requirements definition. The requirements are represented by UML models that are transformed in order to obtain elements for an initial approximation of the system architecture. In the end, a UML model is obtained that aggregates the components, interfaces, and data types extracted from the requirements models. It is a model-driven approach that proved to be feasible, capable of offering practical results, and promising with regard to future work.
Abstract:
Dissertation submitted for the Master's degree in Informatics Engineering
Abstract:
Pavements require maintenance in order to provide good service levels during their service life. Because of the significant costs of this operation and the importance of proper planning, a pavement evaluation methodology named the Pavement Condition Index (PCI) was created by the U.S. Army Corps of Engineers. This methodology allows for the evaluation of the pavement condition along its life period, generally yearly, with minimum costs; in this way, it is possible to plan the maintenance action and to adopt adequate measures, minimising the rehabilitation costs. The PCI methodology provides an evaluation based on visual inspection, namely on the distresses observed on the pavement. The condition index of the pavement is rated from 0 to 100, where 0 is the worst possible condition and 100 the best. This methodology of pavement assessment represents a significant tool for management methods such as airport pavement management systems (APMS) and life-cycle cost analysis (LCCA). Nevertheless, it has some limitations which can jeopardize the correct evaluation of the pavement behavior. The objective of this dissertation is therefore to help reduce these limitations and make the methodology easier and faster to use. Thus, an automated process of PCI calculation was developed, avoiding the consultation of abaci and, consequently, minimizing human error. To further facilitate the visual inspection, a tablet application was developed to replace the common inspection data sheet, making the survey easier to undertake. Subsequently, an airport pavement condition was studied according to the methodology described in the Standard Test Method for Airport Pavement Condition Index Surveys (D5340, 2011), where its original condition level is compared with the condition level obtained after iterating over possibly erroneous distresses as well as possible rehabilitations.
Afterwards, the obtained results were analyzed and the main conclusions are presented together with some future developments.
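For illustration only, the core of an automated PCI computation can be sketched as below. The real deduct-value curves are tabulated abaci in ASTM D5340, and the full method includes a corrected-deduct-value iteration; the toy `deduct_value` stand-in and the simplified aggregation here are assumptions, not the dissertation's actual tool:

```python
# Simplified sketch of an automated PCI computation (ASTM D5340 style).
# Real deduct values come from the standard's tabulated curves; the linear
# stand-in below is purely illustrative.

def deduct_value(distress_density, severity_factor):
    # Hypothetical stand-in for the tabulated deduct-value curves.
    return min(100.0, distress_density * severity_factor)

def pci(deducts):
    """PCI = 100 minus the total deduct (uncorrected here), clamped to [0, 100]."""
    total = sum(deducts)
    return max(0.0, 100.0 - min(100.0, total))

# Two observed distresses on one pavement section.
deducts = [deduct_value(12.0, 1.5), deduct_value(5.0, 2.0)]  # 18.0 and 10.0
condition_index = pci(deducts)  # 100 - 28 = 72.0
```

Replacing the abaci lookup with a function like `deduct_value` is exactly what removes the manual chart-reading step (and its human error) from the survey workflow.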
Abstract:
Growing competition and ever-globalizing markets have forced companies to seek efficiency even in processes such as financial administration, whose efficiency previously received little attention. The first central objective of this thesis is to increase knowledge of streamlining financial administration by examining it through the three key elements of the thesis's theoretical framework: business process improvement, innovations in electronic financial administration, and cross-border outsourcing of financial administration. The second central objective is to deepen this theoretical knowledge and to analyze how well the methods developed for streamlining financial administration work for the purchase invoice process of a company operating in Finland. The thesis was able to clearly identify factors that have contributed to more efficient financial administration and to demonstrate their streamlining effect on the purchase invoice process. When analyzing the relative effectiveness of the different methods, it can justifiably be assumed, from the perspective of a Finnish company, that cross-border outsourcing is only an intermediate stage on the way to a maximally automated process: once invoices are received as electronic e-invoices and automatically matched to electronic purchase orders, one may well ask what part of the purchase invoice process could be outsourced at all.
Abstract:
It is rare for data's history to include computational processes alone. Even when software generates data, users ultimately decide to execute software procedures, choose their configuration and inputs, reconfigure, halt and restart processes, and so on. Understanding the provenance of data thus involves understanding the reasoning of users behind these decisions, but demanding that users explicitly document decisions could be intrusive if implemented naively, and impractical in some cases. In this paper, therefore, we explore an approach to transparently deriving the provenance of user decisions at query time. The user reasoning is simulated, and if the result of the simulation matches the documented decision, the simulation is taken to approximate the actual reasoning. The plausibility of this approach requires that the simulation mirror human decision-making, so we adopt an automated process explicitly modelled on human psychology. The provenance of the decision is modelled in OPM, allowing it to be queried as part of a larger provenance graph, and an OPM profile is provided to allow consistent querying of provenance across user decisions.
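A minimal sketch of the query-time idea described above, with a hypothetical utility function standing in for the psychologically modelled reasoning (options, utilities, and field names are all invented):

```python
# Sketch: simulate the user's reasoning; if the simulated choice matches the
# documented decision, accept the simulation as an approximate explanation.
# The utility function below is a toy stand-in for a cognitive model.

def simulate_decision(options, utility):
    """Pick the option a rational user would choose under the given utility."""
    return max(options, key=utility)

def derive_provenance(documented_choice, options, utility):
    simulated = simulate_decision(options, utility)
    if simulated == documented_choice:
        # Simulation matched: attach the inferred reasoning to the decision.
        return {"decision": documented_choice, "explanation": "utility-maximising"}
    # Simulation rejected: the decision's reasoning remains unexplained.
    return {"decision": documented_choice, "explanation": None}

options = ["rerun", "halt", "reconfigure"]
utility = {"rerun": 0.2, "halt": 0.1, "reconfigure": 0.9}.get
prov = derive_provenance("reconfigure", options, utility)
```

The key property is that nothing is demanded of the user at decision time; the explanation is derived only when provenance is queried.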
Abstract:
In any welding process, it is of utmost importance that welders and those responsible for quality in the area understand the process and the variables involved in it, in order to achieve maximum welding efficiency both in terms of quality and final cost, never forgetting, of course, the process conditions to which the welder or welding operator will be subjected. Therefore, we sought to understand the variables relevant to the welding process and to develop a Welding Procedure Specification (WPS) according to ASME IX for the flux-cored arc welding process (FCAW, AWS specification) with shielding gas and an automated process, for ASTM A131 base material with 5/16 in. thickness, using a single-pass weld, under conditions with pre- and post-heating, and with destructive testing for verification and analysis of the resulting weld bead.
Abstract:
Automated production systems development involves the integration of technological components that exist on the market, such as programmable logic controllers (PLC), robot manipulators, various sensors and actuators, image processing systems, communication networks, and collaborative supervisory systems, all integrated into a single application. This paper proposes an automated platform for experimentation, implemented through a typical architecture for automated production systems, which integrates the technological components described above in order to allow researchers and students to carry out practical laboratory activities. These activities complement the theoretical knowledge acquired by the students in the classroom, thus improving their training and professional skills. A platform designed using this generic structure allows users to work within an educational environment that reflects most aspects found in industrial automated manufacturing systems, such as technology integration, communication networks, process control, and production management. In addition, this platform offers the possibility of a completely automated control and supervision process via remote connection through the Internet (WebLab), enabling knowledge sharing between different teaching and research groups.
Abstract:
Ontology design and population, core aspects of semantic technologies, have recently become fields of great interest due to the increasing need for domain-specific knowledge bases that can boost the use of the Semantic Web. For building such knowledge resources, the state-of-the-art tools for ontology design require a lot of human work. Producing meaningful schemas and populating them with domain-specific data is in fact a very difficult and time-consuming task, even more so if the task consists in modelling knowledge at a web scale. The primary aim of this work is to investigate a novel and flexible methodology for automatically learning ontologies from textual data, lightening the human workload required for conceptualizing domain-specific knowledge and populating an extracted schema with real data, thereby speeding up the whole ontology production process. Here computational linguistics plays a fundamental role, from automatically identifying facts in natural language and extracting frames of relations among recognized entities, to producing linked data with which to extend existing knowledge bases or create new ones. In the state of the art, automatic ontology learning systems are mainly based on plain pipelined linguistic classifiers performing tasks such as named entity recognition, entity resolution, and taxonomy and relation extraction [11]. These approaches present some weaknesses, especially in capturing the structures through which the meaning of complex concepts is expressed [24]. Humans, in fact, tend to organize knowledge in well-defined patterns, which include participant entities and meaningful relations linking entities with each other. In the literature, these structures have been called Semantic Frames by Fillmore [20] or, more recently, Knowledge Patterns [23]. Some NLP studies have recently shown the possibility of performing more accurate deep parsing with the ability to logically understand the structure of discourse [7].
In this work, some of these technologies have been investigated and employed to produce accurate ontology schemas. The long-term goal is to collect large amounts of semantically structured information from the web of crowds, through an automated process, in order to identify and investigate the cognitive patterns used by humans to organize their knowledge.
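As a toy illustration of such a pipeline (entity recognition followed by relation extraction into triples), the sketch below uses an invented lexicon, pattern, and relation name; real systems use trained classifiers rather than rules like these:

```python
# Toy pipelined ontology-learning sketch: recognise entities in a sentence,
# match a relation pattern, and emit an RDF-like triple. Lexicon and pattern
# are illustrative only.

ENTITY_LEXICON = {"Rome": "City", "Italy": "Country"}

def extract_entities(tokens):
    """Dictionary-based stand-in for a named entity recognizer."""
    return [(t, ENTITY_LEXICON[t]) for t in tokens if t in ENTITY_LEXICON]

def extract_triples(sentence):
    tokens = sentence.replace(".", "").split()
    entities = extract_entities(tokens)
    # Naive pattern: "<City> ... capital ... <Country>" -> capitalOf relation.
    if len(entities) == 2 and "capital" in tokens:
        return [(entities[0][0], "capitalOf", entities[1][0])]
    return []

triples = extract_triples("Rome is the capital of Italy.")
```

The weakness the abstract points out is visible even here: a flat pattern over tokens cannot capture the richer frame structure (participants, roles) that Semantic Frames and Knowledge Patterns model.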
Abstract:
Computer-assisted translation (or computer-aided translation, CAT) is a form of language translation in which a human translator uses computer software in order to facilitate the translation process. Machine translation (MT) is the automated process by which a computerized system produces a translated text or speech from one natural language to another. Both are leading and promising technologies in the translation industry; it therefore seems important that translation students and professional translators become familiar with these relatively new types of technology. When used together, not only might these two different types of systems reduce translation time, but they may also lead to further improvements in the field of translation technologies. The dissertation consists of four chapters. The first surveys the chronological development of MT and CAT tools, the emergence of pre-editing, post-editing and controlled language, and the latest frontiers in this sector. The second provides a general overview of the four main CAT tools that are used nowadays and tested herein. The third chapter is dedicated to the experiments that were conducted in order to analyze and evaluate the performance of the four integrated systems that are the core subject of this dissertation. Finally, the fourth chapter deals with the issue of terminological equivalence in interlinguistic translation. The purpose of this dissertation is not to provide an objective and definitive solution to the complex issues that arise at any time in the field of translation technologies, an aim that is still far from being achieved, but to supply information about the limits and potential of those instruments which are now essential to any professional translator.
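A small illustration of the segment retrieval that CAT tools perform: the sketch below uses Python's `difflib` similarity ratio as a stand-in for a real fuzzy-match metric, and the translation-memory entries are invented:

```python
# Toy translation-memory lookup: return the stored (source, target) pair
# whose source segment is most similar to the new segment.
# difflib's ratio is a stand-in for a production fuzzy-match metric.

from difflib import SequenceMatcher

def best_match(segment, memory):
    """Return the (source, target) pair with the highest similarity score."""
    def score(entry):
        return SequenceMatcher(None, segment, entry[0]).ratio()
    return max(memory, key=score)

memory = [
    ("Press the start button.", "Premere il pulsante di avvio."),
    ("Close the main valve.", "Chiudere la valvola principale."),
]
source, target = best_match("Press the stop button.", memory)
```

The translator then post-edits the retrieved target, which is precisely where CAT workflows and MT post-editing meet.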
Abstract:
Ab initio Hartree-Fock (HF), density functional theory (DFT) and hybrid potentials were employed to compute the optimized lattice parameters and elastic properties of perovskite 3-d transition metal oxides. The optimized lattice parameters and elastic properties are interdependent in these materials. An interaction is observed between the electronic charge, spin and lattice degrees of freedom in 3-d transition metal oxides. The coupling between the electronic charge, spin and lattice structures originates from the localization of d atomic orbitals. The coupling between the electronic charge, spin and crystalline lattice also contributes to the ferroelectric and ferromagnetic properties of perovskites. The cubic and tetragonal crystalline structures of ABO3 perovskite transition metal oxides are studied. The electronic structure and the physics of 3-d perovskite materials are complex and less well understood. Moreover, the novelty of the electronic structure and properties of these perovskite transition metal oxides exceeds the challenge offered by their complex crystalline structures. To achieve the objective of understanding the structure-property relationship of these materials, a first-principles computational method is employed. The CRYSTAL09 code is employed for computing the crystalline structure, elastic, ferromagnetic and other electronic properties. Second-order elastic constants (SOEC) and bulk moduli (B) are computed in an automated process by employing the ELASTCON (elastic constants) and EOS (equation of state) programs in the CRYSTAL09 code. ELASTCON, EOS and other computational algorithms are utilized to determine the elastic properties of tetragonal BaTiO3, rutile TiO2, cubic and tetragonal BaFeO3, and the ferromagnetic properties of 3-d transition metal oxides. Multiple methods are employed to cross-check the consistency of our computational results. These results have motivated us to explore the ferromagnetic properties of 3-d transition metal oxides.
Billyscript and the CRYSTAL09 code are employed to compute the optimized geometry of the cubic and tetragonal crystalline structures of the transition metal oxides from Sc to Cu. The cubic crystalline structure is initially chosen to determine the effect of lattice strains on ferromagnetism due to the spin angular momentum of the electron. The 3-d transition metals and their oxides are challenging, as the basis functions and potentials are not fully developed to address the complex physics of the transition metals. Moreover, perovskite crystalline structures are extremely challenging with respect to the quality of computations, as they require well-established methods. Ferroelectric and ferromagnetic properties of bulk, surfaces and interfaces are explored by employing the CRYSTAL09 code. In our computations on cubic TMOs of Sc-Fe, it is observed that there is a coupling between the crystalline structure and FM/AFM spin polarization. Strained crystalline structures of 3-d transition metal oxides exhibit changes in their electromagnetic and electronic properties. The electronic structure and properties of bulk, composites and surfaces of 3-d transition metal oxides are computed successfully.
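Schematically, ELASTCON-style elastic constants come from the curvature of the total energy with respect to strain. The sketch below replaces the ab-initio energies with a toy quadratic energy surface (the stiffness value 250 and the units are arbitrary, chosen only so the curvature is recoverable):

```python
# Schematic of second-order elastic constant extraction: apply small strains,
# evaluate total energy, and take the curvature at zero strain.
# A quadratic toy E(eps) stands in for the ab-initio energies.

def second_derivative(f, x0=0.0, h=1e-3):
    """Central finite-difference estimate of f''(x0)."""
    return (f(x0 + h) - 2.0 * f(x0) + f(x0 - h)) / (h * h)

# Toy energy surface E(eps) = E0 + 0.5 * C * eps^2 with C = 250.
def energy(eps):
    return -100.0 + 0.5 * 250.0 * eps * eps

elastic_constant = second_derivative(energy)  # curvature, ~250
```

In a real run, `energy` would be a CRYSTAL09 total-energy calculation at each strained geometry, and a polynomial fit over several strain points would replace the three-point stencil.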
Abstract:
Background: Ceramic materials are used in a growing proportion of hip joint prostheses due to their wear resistance and biocompatibility properties. However, ceramics have not been applied successfully in total knee joint endoprostheses to date. One reason for this is that with strict surface quality requirements, there are significant challenges with regard to machining. High-toughness bioceramics can only be machined by grinding and polishing processes. The aim of this study was to develop an automated process chain for the manufacturing of an all-ceramic knee implant. Methods: A five-axis machining process was developed for all-ceramic implant components. These components were used in an investigation of the influence of surface conformity on wear behavior under simplified knee joint motion. Results: The implant components showed considerably reduced wear compared to conventional material combinations. Contact area resulting from a variety of component surface shapes, with a variety of levels of surface conformity, greatly influenced wear rate. Conclusions: It is possible to realize an all-ceramic knee endoprosthesis device, with a precise and affordable manufacturing process. The shape accuracy of the component surfaces, as specified by the design and achieved during the manufacturing process, has a substantial influence on the wear behavior of the prosthesis. This result, if corroborated by results with a greater sample size, is likely to influence the design parameters of such devices.
Abstract:
Background: Statistical analysis of DNA microarray data provides a valuable diagnostic tool for the investigation of genetic components of diseases. To take advantage of the multitude of available data sets and analysis methods, it is desirable to combine both different algorithms and data from different studies. Applying ensemble learning, consensus clustering and cross-study normalization methods for this purpose in an almost fully automated process and linking different analysis modules together under a single interface would simplify many microarray analysis tasks. Results: We present ArrayMining.net, a web-application for microarray analysis that provides easy access to a wide choice of feature selection, clustering, prediction, gene set analysis and cross-study normalization methods. In contrast to other microarray-related web-tools, multiple algorithms and data sets for an analysis task can be combined using ensemble feature selection, ensemble prediction, consensus clustering and cross-platform data integration. By interlinking different analysis tools in a modular fashion, new exploratory routes become available, e.g. ensemble sample classification using features obtained from a gene set analysis and data from multiple studies. The analysis is further simplified by automatic parameter selection mechanisms and linkage to web tools and databases for functional annotation and literature mining. Conclusion: ArrayMining.net is a free web-application for microarray analysis combining a broad choice of algorithms based on ensemble and consensus methods, using automatic parameter selection and integration with annotation databases.
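A minimal sketch of the consensus-clustering idea mentioned above, via a co-association matrix over several base clusterings; the sample labels are invented and this is not ArrayMining.net's implementation:

```python
# Consensus clustering sketch: count, for each pair of samples, the fraction
# of base clusterings in which they share a cluster. High co-association
# pairs form the consensus clusters.

from itertools import combinations

def co_association(clusterings, n):
    """Fraction of clusterings in which each sample pair shares a cluster."""
    counts = {}
    for labels in clusterings:
        for i, j in combinations(range(n), 2):
            if labels[i] == labels[j]:
                counts[(i, j)] = counts.get((i, j), 0) + 1
    return {pair: c / len(clusterings) for pair, c in counts.items()}

# Three base clusterings of four samples (e.g. from different algorithms).
clusterings = [[0, 0, 1, 1], [0, 0, 1, 1], [0, 1, 1, 1]]
consensus = co_association(clusterings, 4)
```

Thresholding the co-association values (e.g. keeping only pairs that cluster together in every run) then yields cluster assignments that are robust to the quirks of any single algorithm, which is the motivation for ensemble methods in microarray analysis.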
Abstract:
The importance of efficient supply chain management has increased due to globalization and the blurring of organizational boundaries. Various supply chain management technologies have been identified as drivers of organizational profitability and financial performance. Organizations have historically concentrated heavily on the flow of goods and services, while less attention has been dedicated to the flow of money. As supply chains become more transparent and automated, new opportunities for financial supply chain management have emerged through information technology solutions and comprehensive financial supply chain management strategies. This research concentrates on the end part of the purchasing process, which is the handling of invoices. Efficient invoice processing can have an impact on organizations' working capital management and thus provide companies with better readiness to face the challenges related to cash management. Leveraging a process mining solution, the aim of this research was to examine the automated invoice handling process of four different organizations. The invoice data was collected from each organization's invoice processing system. The sample included all the invoices the organizations had processed during the year 2012. The main objective was to find out whether e-invoices are faster to process in an automated invoice processing solution than scanned invoices (after entry into the invoice processing solution). Other objectives included looking into the longest lead times between process steps and the impact of manual process steps on cycle time. The processing of invoices from maverick purchases was also examined. Based on the results of the research and previous literature on the subject, suggestions for improving the process were proposed. The results of the research indicate that scanned invoices were processed faster than e-invoices, mostly due to the more complex processing of e-invoices.
It should be noted, however, that the manual tasks related to turning a paper invoice into electronic format through scanning are ignored in this research. The transitions with the longest lead times in the invoice handling process included both pre-automation steps and manual steps performed by humans. When the most common manual steps were examined in more detail, it was clear that these steps had a prolonging impact on the process. Regarding invoices from maverick purchases, the evidence shows that these invoices were slower to process than invoices from purchases conducted through e-procurement systems and from preferred suppliers. Suggestions on how to improve the process included increasing invoice matching, reducing manual steps, and leveraging different value-added services such as an invoice validation service, mobile solutions and supply chain financing services. For companies that have already reaped all the process efficiencies, the next step is to engage in collaborative financial supply chain management strategies that can benefit the whole supply chain.
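The lead-time measurement underlying this kind of process-mining analysis can be sketched as follows; the step names and timestamps below are invented, not data from the study:

```python
# Sketch: derive per-transition lead times (in hours) from one invoice's
# event trace, as a process-mining tool does when comparing e-invoices
# with scanned invoices.

from datetime import datetime

def transition_lead_times(events):
    """Hours between consecutive steps of one invoice's event trace."""
    events = sorted(events, key=lambda e: e["ts"])
    return {
        (a["step"], b["step"]): (b["ts"] - a["ts"]).total_seconds() / 3600.0
        for a, b in zip(events, events[1:])
    }

trace = [
    {"step": "received", "ts": datetime(2012, 3, 1, 9, 0)},
    {"step": "matched",  "ts": datetime(2012, 3, 1, 15, 0)},
    {"step": "approved", "ts": datetime(2012, 3, 2, 9, 0)},
]
lead_times = transition_lead_times(trace)
```

Aggregating such per-transition times over all invoices is what exposes the slowest transitions and the prolonging effect of manual steps reported above.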