904 results for New cutting tool
Abstract:
This paper presents the creation and development of technological schools directly linked to the business community and to public higher education. Establishing themselves as the key interface between the two sectors, they make a significant contribution by providing a greater competitive edge in the face of increasing competition in traditional markets. The development of new business strategies supported by benchmarks of excellence, quality and competitiveness also favours the establishment of partnerships aimed at qualifying intermediate-level education, bridging the technological school and technology-based higher education. We present a case study illustrating the success of Escola Tecnológica de Vale de Cambra.
Abstract:
A new method, based on linear correlation and phase diagrams, was successfully developed for processes such as sedimentation, where the deposition phase can have different durations - represented by repeated values in a series - and where erosion can play an important role, deleting values from the series. The sampling process itself can cause repeated values - a thick stratum sampled twice - or deleted values - a thin stratum falling between two consecutive samples. We developed a mathematical procedure which, based on the evolution of chemical composition with depth, allows the establishment of boundaries as well as the periodicity of different sedimentary environments. The basic tool is no more than a linear correlation analysis, which allows us to detect eventual evolution rules connected with cyclical phenomena within time series (treating depth as a proxy for time), with the final objective of prediction. A very interesting finding was the phenomenon of repeated sliding windows, which represent quasi-cycles of a series of quasi-periods. An accurate forecast can be obtained if we are inside a quasi-cycle (it is possible to predict the remaining elements of the cycle with a probability related to the number of repeated and deleted points). Because this is an innovative methodology, its efficiency is being tested in several case studies, with remarkable results that show its efficacy. Keywords: sedimentary environments, sequence stratigraphy, data analysis, time series, conditional probability.
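The abstract does not spell out the procedure, but the core idea of correlating sliding windows to flag candidate quasi-periods can be sketched as below. This is a minimal sketch only: the window length, correlation threshold and synthetic "depth vs. composition" signal are illustrative assumptions, not the authors' settings or data.

import numpy as np

def candidate_quasi_periods(series, window, min_corr=0.9):
    """Correlate a sliding reference window against later windows of the same
    length and report lags with high linear correlation, interpreted here as
    candidate quasi-periods of the depth series."""
    series = np.asarray(series, dtype=float)
    n = len(series)
    hits = []
    for start in range(0, n - 2 * window):
        ref = series[start:start + window]
        for lag in range(window, n - window - start):
            win = series[start + lag:start + lag + window]
            # Pearson correlation between the reference and the lagged window
            r = np.corrcoef(ref, win)[0, 1]
            if r >= min_corr:
                hits.append((start, lag, r))
    return hits

# Purely synthetic signal with a quasi-period of roughly 50 samples plus noise.
depth_signal = np.sin(np.linspace(0, 12 * np.pi, 300)) + 0.1 * np.random.randn(300)
for start, lag, r in candidate_quasi_periods(depth_signal, window=25, min_corr=0.95)[:5]:
    print(f"window at {start} repeats at lag {lag} (r = {r:.2f})")

In a real series the high-correlation lags would cluster around the quasi-period, and repeated or deleted samples would show up as shifts of that lag, which is where the conditional-probability reasoning of the abstract comes in.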
Abstract:
The objective of this contribution is to extend models of cellular/composite material design to nonlinear material behaviour and to apply them to the design of materials for passive vibration control. As a first step, a computational tool allowing the determination of optimised one-dimensional isolator behaviour was developed. This model can serve as a representation of idealised macroscopic behaviour. Optimal isolator behaviour for a given set of loads is obtained by a generic probabilistic meta-algorithm, simulated annealing. The cost functional involves minimisation of the maximum response amplitude in a set of predefined time intervals and maximisation of the total energy absorbed in the first loop. The dependence of the global optimum on several combinations of the leading parameters of the simulated annealing procedure, such as the neighbourhood definition and the annealing schedule, is also studied and analysed. The obtained results facilitate the design of elastomeric cellular materials with improved behaviour in terms of dynamic stiffness for passive vibration control.
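As a rough illustration of the optimisation layer described above, the following is a minimal simulated-annealing sketch that tunes the stiffness and damping of a one-dimensional isolator to minimise its peak response. The 1-DOF model, pulse load, neighbourhood size and cooling schedule are stand-ins chosen for the sketch, not the paper's cellular-material model or cost functional.

import math
import random

def peak_response(k, c, m=1.0, dt=1e-3, t_end=5.0):
    """Peak displacement of a 1-DOF isolator (m*x'' + c*x' + k*x = f) under a
    short force pulse, integrated with semi-implicit Euler."""
    x, v, peak = 0.0, 0.0, 0.0
    for i in range(int(t_end / dt)):
        f = 1.0 if i * dt < 0.1 else 0.0          # illustrative pulse load
        a = (f - c * v - k * x) / m
        v += a * dt
        x += v * dt
        peak = max(peak, abs(x))
    return peak

def simulated_annealing(cost, x0, bounds, t0=1.0, alpha=0.95, iters=2000):
    """Generic SA loop: Gaussian neighbourhood moves, geometric cooling,
    Metropolis acceptance."""
    x, best = list(x0), list(x0)
    fx = fbest = cost(*x0)
    t = t0
    for _ in range(iters):
        cand = [min(hi, max(lo, xi + random.gauss(0, 0.05 * (hi - lo))))
                for xi, (lo, hi) in zip(x, bounds)]
        fc = cost(*cand)
        if fc < fx or random.random() < math.exp((fx - fc) / t):
            x, fx = cand, fc
            if fc < fbest:
                best, fbest = cand, fc
        t *= alpha
    return best, fbest

params, peak = simulated_annealing(peak_response, [50.0, 1.0],
                                   bounds=[(1.0, 200.0), (0.1, 20.0)])
print("optimised (k, c):", params, "peak displacement:", peak)

The neighbourhood definition (here a bounded Gaussian step) and the annealing schedule (here geometric cooling) are exactly the kinds of parameters whose influence on the global optimum the paper studies.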
Abstract:
Master's in Management and Assessment of Health Technologies (Mestrado em Gestão e Avaliação de Tecnologias em Saúde)
Abstract:
To meet the increasing demands of complex inter-organisational processes and the demand for continuous innovation and internationalisation, it is evident that new forms of organisation are being adopted, fostering more intensive collaboration processes and sharing of resources, in what can be called collaborative networks (Camarinha-Matos, 2006:03). Information and knowledge are crucial resources in collaborative networks, and their management is a fundamental process to optimize. Knowledge organisation and collaboration systems are thus important instruments for the success of collaborative networks of organisations and have been researched in the last decade in the areas of computer science, information science, management sciences, terminology and linguistics. Nevertheless, research in this area has not given much attention to multilingual contexts of collaboration, which pose specific and challenging problems. It is clear that access to and representation of knowledge will happen more and more in a multilingual setting, which implies overcoming the difficulties inherent to the presence of multiple languages, through processes such as the localization of ontologies. Although localization, like other processes that involve multilingualism, is a rather well-developed practice, and its methodologies and tools are fruitfully employed by the language industry in the development and adaptation of multilingual content, it has not yet been sufficiently explored as an element of support to the development of knowledge representations - in particular ontologies - expressed in more than one language. Multilingual knowledge representation is thus an open research area calling for cross-contributions from knowledge engineering, terminology, ontology engineering, cognitive sciences, computational linguistics, natural language processing, and management sciences. This workshop joined researchers interested in multilingual knowledge representation in a multidisciplinary environment to debate the possibilities of cross-fertilization between these fields, applied to contexts where multilingualism continuously creates new and demanding challenges to current knowledge representation methods and techniques. In this workshop six papers dealing with different approaches to multilingual knowledge representation are presented, most of them describing tools, approaches and results obtained in the development of ongoing projects. In the first paper, Andrés Domínguez Burgos, Koen Kerremans and Rita Temmerman present a software module that is part of a workbench for terminological and ontological mining, Termontospider, a wiki crawler that aims at optimally traversing Wikipedia in search of domain-specific texts for extracting terminological and ontological information. The crawler is part of a tool suite for automatically developing multilingual termontological databases, i.e. ontologically-underpinned multilingual terminological databases. In this paper the authors describe the basic principles behind the crawler and summarize the research setting in which the tool is currently being tested. In the second paper, Fumiko Kano presents work comparing four feature-based similarity measures derived from the cognitive sciences.
The purpose of the comparative analysis presented by the author is to identify the potentially most effective model for mapping independent ontologies in a culturally influenced domain. For that, datasets based on standardized pre-defined feature dimensions and values, obtainable from the UNESCO Institute for Statistics (UIS), have been used for the comparative analysis of the similarity measures. The purpose of the comparison is to verify the similarity measures against objectively developed datasets. According to the author, the results demonstrate that the Bayesian Model of Generalization provides the most effective cognitive model for identifying the most similar corresponding concepts existing for a targeted socio-cultural community. In another presentation, Thierry Declerck, Hans-Ulrich Krieger and Dagmar Gromann present ongoing work and propose an approach to the automatic extraction of information from multilingual financial Web resources, to provide candidate terms for building ontology elements or instances of ontology concepts. The authors present a complementary approach to the direct localization/translation of ontology labels, acquiring terminologies through the access and harvesting of multilingual Web presences of structured information providers in the field of finance. This leads to the detection of candidate terms in various multilingual sources in the financial domain that can be used not only as labels of ontology classes and properties but also for the possible generation of (multilingual) domain ontologies themselves. In the next paper, Manuel Silva, António Lucas Soares and Rute Costa claim that, despite the availability of tools, resources and techniques aimed at the construction of ontological artifacts, developing a shared conceptualization of a given reality still raises questions about the principles and methods that support the initial phases of conceptualization. These questions become, according to the authors, more complex when the conceptualization occurs in a multilingual setting. To tackle these issues the authors present a collaborative platform - conceptME - where terminological and knowledge representation processes support domain experts throughout a conceptualization framework, allowing the inclusion of multilingual data as a way to promote knowledge sharing, enhance conceptualization and support a multilingual ontology specification. In another presentation, Frieda Steurs and Hendrik J. Kockaert present TermWise, a large project dealing with legal terminology and phraseology for the Belgian public services, i.e. the translation office of the ministry of justice. The project aims at developing an advanced tool that embeds expert knowledge in the algorithms extracting specialized language from textual data (legal documents), and whose outcome is a knowledge database of Dutch/French equivalents for legal concepts, enriched with the phraseology related to the terms under discussion. Finally, Deborah Grbac, Luca Losito, Andrea Sada and Paolo Sirito report on the preliminary results of a pilot project currently ongoing at UCSC Central Library, where they propose to adapt to subject librarians, employed in large and multilingual academic institutions, the model used by translators working within European Union institutions.
The authors use User Experience (UX) analysis to provide subject librarians with visual support, by means of “ontology tables” depicting conceptual links and the connections of words with concepts, presented according to their semantic and linguistic meaning. The organizers hope that the selection of papers presented here will be of interest to a broad audience and will be a starting point for further discussion and cooperation.
Abstract:
Consider the scheduling of real-time tasks on a multiprocessor where migration is forbidden. Specifically, consider the problem of determining a task-to-processor assignment for a given collection of implicit-deadline sporadic tasks upon a multiprocessor platform with two distinct types of processors. For this problem, we propose a new algorithm, LPC (task assignment based on solving a Linear Program with Cutting planes). The algorithm offers the following guarantee: for a given task set and platform, if there exists a feasible task-to-processor assignment, then LPC succeeds in finding such an assignment as well, but on a platform in which each processor is 1.5× faster and which has three additional processors. For systems with a large number of processors, LPC has a better approximation ratio than state-of-the-art algorithms. To the best of our knowledge, this is the first work that develops a provably good real-time task assignment algorithm using cutting planes.
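The LPC algorithm itself is not reproduced here, but the LP relaxation underlying a two-type task-to-processor assignment can be sketched as follows. The task utilisations and processor counts are made up for the example, and the cutting-plane step that drives the paper's solution towards an integral assignment is only indicated in a comment.

import numpy as np
from scipy.optimize import linprog

# Illustrative data: utilisation of each task on each processor type
# (rows = tasks, columns = the two processor types); values are made up.
util = np.array([[0.6, 0.3],
                 [0.5, 0.7],
                 [0.2, 0.4],
                 [0.8, 0.5]])
n_tasks = util.shape[0]
m = np.array([1, 2])          # processors available of type 1 and type 2

# Variables x[i, t] in [0, 1]: fraction of task i assigned to type t,
# flattened row-major for linprog; minimise total consumed utilisation.
c = util.flatten()

# Capacity constraints: sum_i util[i, t] * x[i, t] <= m[t]
A_ub = np.zeros((2, 2 * n_tasks))
for t in range(2):
    A_ub[t, t::2] = util[:, t]
b_ub = m.astype(float)

# Each task must be fully assigned: x[i, 0] + x[i, 1] = 1
A_eq = np.zeros((n_tasks, 2 * n_tasks))
for i in range(n_tasks):
    A_eq[i, 2 * i:2 * i + 2] = 1.0
b_eq = np.ones(n_tasks)

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, 1)] * (2 * n_tasks), method="highs")
# The relaxation may split a few tasks fractionally across types; LPC's
# cutting planes (not reproduced here) are what remove such fractional
# solutions while preserving the speed/extra-processor guarantee.
print(res.x.reshape(n_tasks, 2))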
Abstract:
Master's in Mechanical Engineering - Mechanical Construction (Mestrado em Engenharia Mecânica - Construções Mecânicas)
Abstract:
This paper presents a decision support methodology to help virtual power players (VPPs) in the smart grid (SG) context solve the day-ahead energy resource scheduling problem, considering the intensive use of Distributed Generation (DG) and Vehicle-to-Grid (V2G). The main focus is the application of a new hybrid method combining a particle swarm approach with a deterministic technique based on mixed-integer linear programming (MILP) to solve the day-ahead scheduling, minimising total operation costs from the aggregator's point of view. A realistic mathematical formulation, considering the electric network constraints and the V2G charging and discharging efficiencies, is presented. A full AC power flow calculation is included in the hybrid method so that the network constraints can be taken into account. A case study with a 33-bus distribution network and 1800 V2G resources is used to illustrate the performance of the proposed method.
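As an illustration of the swarm-based half of such a hybrid method only, the following is a bare-bones particle swarm optimiser applied to a stand-in scheduling cost. The MILP stage, V2G efficiencies and AC power flow of the actual method are not reproduced, and the demand profile and cost function below are fictitious.

import numpy as np

def pso_minimise(cost, dim, bounds, n_particles=30, iters=200,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Bare-bones particle swarm optimiser. In a hybrid method a swarm like
    this would search dispatch decisions while a MILP/power-flow step enforces
    network constraints; here the cost is just a placeholder."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=(n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_cost = x.copy(), np.array([cost(p) for p in x])
    g = pbest[pbest_cost.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        costs = np.array([cost(p) for p in x])
        better = costs < pbest_cost
        pbest[better], pbest_cost[better] = x[better], costs[better]
        g = pbest[pbest_cost.argmin()].copy()
    return g, pbest_cost.min()

# Illustrative stand-in cost: quadratic generation cost for a 24-hour profile
# that should track a made-up demand curve.
demand = 10 + 5 * np.sin(np.linspace(0, 2 * np.pi, 24))
schedule, cost = pso_minimise(lambda p: np.sum((p - demand) ** 2 + 0.1 * p ** 2),
                              dim=24, bounds=(0.0, 20.0))
print("scheduling cost:", round(cost, 2))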
Abstract:
The study of electricity markets operation has been gaining increasing importance in recent years, as a result of the new challenges that the restructuring process produced. Currently, a large amount of information concerning electricity markets is available, as market operators provide, after a period of confidentiality, data regarding market proposals and transactions. These data can be used as a source of knowledge to define realistic scenarios, which are essential for understanding and forecasting electricity market behaviour. The development of tools able to extract, transform, store and dynamically update data is of great importance to go a step further in the comprehension of electricity markets and of the behaviour of the involved entities. In this paper an adaptable tool capable of downloading, parsing and storing data from market operators' websites is presented, ensuring constant updating and reliability of the stored data.
Abstract:
The study of electricity markets operation has been gaining increasing importance in recent years, as a result of the new challenges that the restructuring process produced. Currently, a large amount of information concerning electricity markets is available, as market operators provide, after a period of confidentiality, data regarding market proposals and transactions. These data can be used as a source of knowledge to define realistic scenarios, essential for understanding and forecasting electricity market behaviour. The development of tools able to extract, transform, store and dynamically update data is of great importance to go a step further in the comprehension of electricity markets and of the behaviour of the involved entities. In this paper we present an adaptable tool capable of downloading, parsing and storing data from market operators' websites, ensuring the constant updating and reliability of the stored data.
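A minimal sketch of the download/parse/store pipeline that both abstracts describe might look as follows. The endpoint URL, CSV columns and table schema are illustrative assumptions, not any real market operator's interface; an adaptable tool would swap in operator-specific parsers.

import csv
import io
import sqlite3
import urllib.request

# Hypothetical endpoint and CSV layout, used only to make the sketch runnable.
SOURCE_URL = "https://example.org/market-data/daily.csv"

def download(url):
    """Download the raw text published by the market operator."""
    with urllib.request.urlopen(url, timeout=30) as resp:
        return resp.read().decode("utf-8")

def parse(raw_text):
    """Parse CSV rows of (date, hour, price, energy) into tuples."""
    reader = csv.DictReader(io.StringIO(raw_text))
    for row in reader:
        yield (row["date"], int(row["hour"]),
               float(row["price"]), float(row["energy"]))

def store(rows, db_path="market.db"):
    """Store parsed records, replacing duplicates so re-runs keep data fresh."""
    con = sqlite3.connect(db_path)
    con.execute("""CREATE TABLE IF NOT EXISTS transactions (
                       date TEXT, hour INTEGER, price REAL, energy REAL,
                       PRIMARY KEY (date, hour))""")
    con.executemany("INSERT OR REPLACE INTO transactions VALUES (?, ?, ?, ?)",
                    rows)
    con.commit()
    con.close()

if __name__ == "__main__":
    store(parse(download(SOURCE_URL)))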
Abstract:
An intensive use of dispersed energy resources is expected in future power systems, including distributed generation, especially based on renewable sources, and electric vehicles. System operation methods and tools must be adapted to the increased complexity, especially for the optimal resource scheduling problem. Therefore, the use of metaheuristics is required to obtain good solutions in a reasonable amount of time. This paper proposes two new heuristics, called naive electric vehicle charge and discharge allocation and generation tournament based on cost, developed to obtain an initial solution to be used in the energy resource scheduling methodology based on simulated annealing previously developed by the authors. The case study considers two scenarios with 1000 and 2000 electric vehicles connected to a distribution network. The proposed heuristics are compared with a deterministic approach, presenting a very small error in the objective function and a low execution time for the scenario with 2000 vehicles.
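For illustration only, the following toy version of a "naive charge and discharge allocation" heuristic builds an initial schedule for a single vehicle from a price profile. The prices, efficiency and battery limits are invented, and the authors' actual heuristics operate over thousands of vehicles inside their simulated-annealing framework together with the generation tournament.

import numpy as np

def naive_ev_allocation(prices, required_energy, max_rate, capacity, eff=0.9):
    """Toy initial-solution heuristic: charge in the cheapest hours until the
    trip requirement is met, then discharge any surplus in the most expensive
    hours. Only meant to seed a metaheuristic, not to be optimal."""
    schedule = np.zeros(len(prices))    # >0 charge, <0 discharge (kW)
    soc = 0.0
    for h in np.argsort(prices):        # cheapest hours first
        if soc >= required_energy:
            break
        charge = min(max_rate, (required_energy - soc) / eff, capacity - soc)
        schedule[h] = charge
        soc += charge * eff
    for h in np.argsort(prices)[::-1]:  # most expensive hours first
        surplus = soc - required_energy
        if surplus <= 0:
            break
        discharge = min(max_rate, surplus)
        if schedule[h] == 0:
            schedule[h] = -discharge
            soc -= discharge
    return schedule

prices = np.array([40, 35, 30, 28, 32, 45, 60, 80, 75, 55, 50, 42], dtype=float)
print(naive_ev_allocation(prices, required_energy=20.0, max_rate=7.0, capacity=40.0))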
Abstract:
Guillotine shears are robust machines for straight-line cutting, usually associated with low-cost equipment due to the small number of technological devices they incorporate. However, this situation can be changed through the creativity of the designers of this type of equipment. By analysing some specific operations, it can be observed that certain tools, when added to the equipment, can substantially increase the productivity of the cutting process and the quality of the final product. Regarding the cutting of thin metal sheets, it can be observed that, in the final stage of the cut, the weight of the material being cut is suspended by a small portion of material that has not yet been sheared. This leads to plastic deformation in this last zone, causing quality problems in the final product, which will not be completely flat. This work was developed around this problem, studying the best solution for a new tool capable of avoiding the lack of flatness of the sheet after cutting. A new device was designed that can be easily incorporated into the guillotine, allowing it to follow the inclination of the blade during the cutting operation. The system is fully automated and is operated by a single cutting instruction given by the machine operator. This system allows the manufacturer to increase the added value of each machine, offering customers advanced solutions and thereby contributing to the sustainability of the company's business.
Abstract:
Dissertation presented to obtain the degree of Doctor in Environmental Sciences from the Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia
Abstract:
The aim of this contribution is to extend the techniques of composite material design to non-linear material behaviour and to apply them to the design of new materials for passive vibration control. As a first step, a computational tool allowing the determination of macroscopic optimised one-dimensional isolator behaviour was developed. Voigt, Maxwell, standard and more complex material models can be implemented. The objective function considers minimisation of the initial reaction and/or displacement peak as well as minimisation of the steady-state amplitude of the reaction and/or displacement. The complex stiffness approach is used to formulate the governing equations in an efficient way. Material stiffness parameters are assumed to be non-linear functions of the displacement. The numerical solution is performed in the complex space. The steady-state solution in the complex space is obtained by an iterative process based on the shooting method, which imposes the conditions of periodicity with respect to the known value of the period. The extension of the shooting method to the complex space is presented and verified. The non-linear behaviour of the material parameters is then optimised by a generic probabilistic meta-algorithm, simulated annealing. The dependence of the global optimum on several combinations of the leading parameters of the simulated annealing procedure, such as the neighbourhood definition and the annealing schedule, is also studied and analysed. The procedure is implemented in the MATLAB environment.
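The complex-space formulation is specific to the paper, but the real-valued shooting idea it builds on, imposing periodicity over one known period, can be sketched as follows for an illustrative hardening isolator. The spring law, damping, forcing and all parameter values are assumptions made for the sketch.

import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import fsolve

# Illustrative nonlinear isolator: hardening spring k(x) = k0 + k2*x^2,
# harmonic force F*cos(omega*t). Parameters are made up.
m, c, k0, k2, F, omega = 1.0, 0.2, 4.0, 1.5, 1.0, 1.8
T = 2 * np.pi / omega                      # known period of the steady state

def rhs(t, y):
    x, v = y
    return [v, (F * np.cos(omega * t) - c * v - (k0 + k2 * x**2) * x) / m]

def periodicity_residual(y0):
    """Shooting residual: integrate over one period and require the state to
    return to its initial value, which characterises the periodic solution."""
    sol = solve_ivp(rhs, (0.0, T), y0, rtol=1e-9, atol=1e-9)
    return sol.y[:, -1] - y0

y_star = fsolve(periodicity_residual, x0=[0.0, 0.0])
print("periodic steady-state initial conditions (x0, v0):", y_star)

In the paper this periodicity condition is enforced in the complex space for a complex-stiffness model, and the resulting steady-state response feeds the simulated-annealing optimisation of the material parameters.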
Abstract:
Since the middle of the first decade of this century, several authors have announced the dawn of a new age, following the Information/Knowledge Age (1970-2005?). We are certainly living in a Shift Age (Houle, 2007), but no standard designation has been broadly adopted so far, and others, such as the Conceptual Age (Pink, 2005) or the Social Age (Azua, 2009), are only some of the proposals to name current times. Due to the amount of information available nowadays, meaning making and understanding seem to be common features of this new age of change; change related to (i) how individuals and organizations engage with each other, (ii) the way we deal with technology, and (iii) how we engage and communicate within communities to create meaning, i.e., also social-networking-driven changes. Web 2.0 and social networks have strongly altered the way we learn, live, work and, of course, communicate. Of all the possible dimensions along which we could address this change, we chose to focus on language - a taken-for-granted communication tool, used, translated and recreated in personal and geographical variants by the many users and authors of social networks and other online communities and platforms. In this paper, we discuss how Web 2.0, and specifically social networks, have contributed to changes in the communication process and, in bi- or multilingual environments, to the evolution and freeware use of the so-called “international language”: English. Next, we discuss some of the impacts and challenges of this language diversity for international communication in the shift age of understanding and social networking, focusing on specialized networks. Then we point out some skills and strategies to avoid babelization and to build meaningful and effective content in monolingual or multilingual networks, through the use of common and shared concepts and designations in social network environments. For this purpose, we propose a social and collaborative approach to terminology management, as a shared, strategic and sense-making tool for specialized communication in Web 2.0 environments.