Abstract:
In the last twenty years genetic algorithms (GAs) have been applied in a plethora of fields, such as control, system identification, robotics, planning and scheduling, image processing, and pattern and speech recognition (Bäck et al., 1997). In robotics, the problems of trajectory planning, collision avoidance and manipulator structure design considering a single criterion have been solved using several techniques (Alander, 2003). Most engineering applications, however, require the optimization of several criteria simultaneously. Often the problems are complex, include discrete and continuous variables, and there is no prior knowledge about the search space. These kinds of problems are considerably harder, since they consider multiple design criteria simultaneously within the optimization procedure. This is known as multi-criteria (or multi-objective) optimization, which has been addressed successfully through GAs (Deb, 2001). The overall aim of multi-criteria evolutionary algorithms is to achieve a set of non-dominated optimal solutions known as the Pareto front. At the end of the optimization procedure, instead of a single optimal (or near-optimal) solution, the decision maker can select a solution from the Pareto front. Some of the key issues in multi-criteria GAs are: i) the number of objectives, ii) obtaining a Pareto front as wide as possible and iii) achieving a uniformly spread Pareto front. Indeed, multi-objective techniques using GAs have been growing in relevance as a research area. In 1989, Goldberg suggested the use of a GA to solve multi-objective problems, and since then other researchers have been developing new methods, such as the multi-objective genetic algorithm (MOGA) (Fonseca & Fleming, 1995), the non-dominated sorting genetic algorithm (NSGA) (Deb, 2001), and the niched Pareto genetic algorithm (NPGA) (Horn et al., 1994), among several other variants (Coello, 1998).
In this work the trajectory planning problem considers: i) robots with 2 and 3 degrees of freedom (dof), ii) the inclusion of obstacles in the workspace and iii) up to five criteria used to qualify the evolving trajectory, namely the joint traveling distance, joint velocity, end-effector Cartesian distance, end-effector Cartesian velocity and energy involved. These criteria are used to minimize the joint and end-effector traveled distance, the trajectory ripple and the energy required by the manipulator to reach the destination point. Bearing these ideas in mind, this chapter addresses the planning of robot trajectories, meaning the development of an algorithm to find a continuous motion that takes the manipulator from a given starting configuration to a desired end position without colliding with any obstacle in the workspace. The chapter is organized as follows. Section 2 describes trajectory planning and several approaches proposed in the literature. Section 3 formulates the problem, namely the representation adopted to solve the trajectory planning and the objectives considered in the optimization. Section 4 studies the algorithm convergence. Section 5 studies a 2R manipulator (i.e., a robot with two rotational joints/links) when the trajectory optimization considers two and five objectives. Sections 6 and 7 present the results for the 3R redundant manipulator with five objectives and for other complementary experiments, respectively. Finally, section 8 draws the main conclusions.
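The Pareto-dominance test underlying these multi-objective GAs can be sketched in a few lines. The following Python sketch (illustrative only; the function and variable names are not from the chapter) filters a set of objective vectors down to its non-dominated set, assuming all criteria are to be minimized:

```python
def dominates(a, b):
    """True if solution a Pareto-dominates b (all objectives minimized):
    a is no worse than b in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Return the non-dominated subset of a list of objective vectors."""
    return [s for s in solutions
            if not any(dominates(other, s) for other in solutions if other != s)]

# Hypothetical candidates with two objectives,
# e.g. (joint traveling distance, energy):
candidates = [(3.0, 5.0), (2.0, 6.0), (4.0, 4.0), (2.5, 5.5), (5.0, 5.0)]
front = pareto_front(candidates)  # (5.0, 5.0) is dominated and dropped
```

In a multi-objective GA this filter (or an equivalent non-dominated sorting step) is applied to each generation, and the decision maker picks one trade-off from the final front.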
Abstract:
In practice, robotic manipulators present some degree of unwanted vibration. The advent of lightweight arm manipulators, mainly in the aerospace industry, where weight is an important issue, leads to the problem of intense vibrations. On the other hand, robots interacting with the environment often generate impacts that propagate through the mechanical structure and also produce vibrations. In order to analyze these phenomena, a robot signal acquisition system was developed. The manipulator motion produces vibrations, either from the structural modes or from end-effector impacts. The instrumentation system acquires signals from several sensors that capture the joint positions, mass accelerations, forces and moments, and electrical currents in the motors. Afterwards, an analysis package, running off-line, reads the data recorded by the acquisition system and extracts the signal characteristics. Due to the multiplicity of sensors, the data obtained can be redundant, because the same type of information may be seen by two or more sensors. Given the price of the sensors, this redundancy can be exploited to reduce the cost of the system. On the other hand, the placement of the sensors is an important issue in order to obtain signals that suitably capture the vibration phenomena. Moreover, the study of these issues can help in the design optimization of the acquisition system. In this line of thought, a sensor classification scheme is presented. Several authors have addressed the subject of sensor classification schemes. White (White, 1987) presents a flexible and comprehensive categorizing scheme that is useful for describing and comparing sensors. The author organizes the sensors according to several aspects: measurands, technological aspects, detection means, conversion phenomena, sensor materials and fields of application. Michahelles and Schiele (Michahelles & Schiele, 2003) systematize the use of sensor technology.
They identified several dimensions of sensing that represent the sensing goals for physical interaction. A conceptual framework is introduced that allows categorizing existing sensors and evaluating their utility in various applications. This framework not only guides application designers in choosing meaningful sensor subsets, but can also inspire new systems and lead to the evaluation of existing applications. Today's technology offers a wide variety of sensors. In order to use all the data from this diversity of sensors, a framework of integration is needed. Sensor fusion, fuzzy logic, and neural networks are often mentioned when dealing with the problem of combining information from several sensors to get a more general picture of a given situation. The study of data fusion has been receiving considerable attention (Esteban et al., 2005; Luo & Kay, 1990). A survey of the state of the art in sensor fusion for robotics can be found in (Hackett & Shah, 1990). Henderson and Shilcrat (Henderson & Shilcrat, 1984) introduced the concept of the logical sensor, which defines an abstract specification of the sensors to integrate in a multisensor system. The recent development of micro electro mechanical sensors (MEMS) with wireless communication capabilities enables sensor networks with interesting capabilities. This technology has been applied in several areas (Arampatzis & Manesis, 2005), including robotics. Cheekiralla and Engels (Cheekiralla & Engels, 2005) propose a classification of wireless sensor networks according to their functionalities and properties. This paper presents the development of a sensor classification scheme based on the frequency spectrum of the signals and on statistical metrics. Bearing these ideas in mind, this paper is organized as follows. Section 2 briefly describes the robotic system enhanced with the instrumentation setup. Section 3 presents the experimental results. Finally, section 4 draws the main conclusions and points out future work.
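As a rough illustration of classifying sensor channels by frequency spectrum and statistical metrics, the basis of the scheme described above, the following Python sketch extracts a small feature vector from a sampled signal. It is a sketch under stated assumptions: the feature names are hypothetical, and a naive DFT magnitude scan stands in for whatever spectral analysis the paper actually uses.

```python
import math
import statistics

def dominant_frequency(signal, fs):
    """Dominant frequency (Hz) of a real signal via a naive DFT magnitude scan."""
    n = len(signal)
    best_k, best_mag = 0, 0.0
    for k in range(1, n // 2):  # skip DC, positive frequencies only
        re = sum(x * math.cos(2 * math.pi * k * i / n) for i, x in enumerate(signal))
        im = -sum(x * math.sin(2 * math.pi * k * i / n) for i, x in enumerate(signal))
        mag = math.hypot(re, im)
        if mag > best_mag:
            best_k, best_mag = k, mag
    return best_k * fs / n

def signal_features(signal, fs):
    """Feature vector used to compare sensor channels (illustrative choice)."""
    return {
        "mean": statistics.fmean(signal),
        "stdev": statistics.pstdev(signal),
        "dominant_hz": dominant_frequency(signal, fs),
    }

# Synthetic accelerometer-like channel: a 25 Hz vibration sampled at 200 Hz.
fs = 200
sig = [math.sin(2 * math.pi * 25 * t / fs) for t in range(fs)]
feats = signal_features(sig, fs)  # dominant_hz comes out as 25.0
```

Channels whose feature vectors are close (similar spectra and statistics) are candidates for the redundancy-driven cost reduction mentioned above.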
Abstract:
This text is based on research, still in progress, whose main objective is to identify and understand what the main difficulties of future mathematics teachers of basic education are regarding their content knowledge of geometry, in the context of the curricular unit of Geometry during their undergraduate degree. We chose a qualitative approach in the form of a case study, in which data collection was done through observation, interviews, a diverse set of tasks, a diagnostic test and other documents. This paper focuses on the test given to prospective teachers at the beginning of the course. The preliminary analysis of the data points to a weak performance of preservice teachers in the test items addressing elementary knowledge of Geometry.
Abstract:
Master's dissertation in Informatics Engineering
Abstract:
Friction stir welding (FSW) is now well established as a welding process capable of joining several different types of metallic materials, as it was (1) found to be a reliable and economical way of producing high-quality welds, and (2) considered a "clean" welding process that does not involve fusion of the metal, as is the case with other traditional welding processes. The aim of this study was to determine whether the particles emitted in the nanorange during FSW of the most commonly used aluminum (Al) alloys, AA 5083 and AA 6082, originated from the Al alloy itself, due to friction of the welding tool against the item being welded. Another goal was to measure the alveolar deposited surface area during FSW of these Al alloys. Nanoparticle dimensions were predominantly in the 40- to 70-nm range. This study demonstrated that microparticles were also emitted during FSW, but due to tool wear. However, the biological relevance and toxic manifestations of these microparticles remain to be determined.
Abstract:
OBJECTIVE To analyze whether sociodemographic, occupational, and health-related data are associated with the use of hearing protection devices at work, according to gender. METHODS A cross-sectional study was conducted in 2006, using a random sample of 2,429 workers, aged between 18 and 65 years old, from residential sub-areas in Salvador, BA, Northeastern Brazil. Questionnaires were used to obtain sociodemographic, occupational, and health-related data. Workers who reported that they worked in places where they needed to shout in order to be heard were considered to be exposed to noise. Exposed workers were asked whether they used hearing protection devices, and if so, how frequently. Analyses were conducted according to gender, with estimates of the prevalence of the use of hearing protection devices, prevalence ratios, and their respective 95% confidence intervals. RESULTS Of the study subjects, 12.3% reported that they were exposed to noise while working. Prevalence of the use of hearing protection devices was 59.3% for men and 21.4% for women. Men from higher socioeconomic levels (PR = 1.47; 95%CI 1.14;1.90) and those who had previous audiometric tests (PR = 1.47; 95%CI 1.15;1.88) were more likely to use hearing protection devices. For women, greater perceived safety was associated with the use of protection devices (PR = 2.92; 95%CI 1.34;6.34). This perception was specifically related to the presence of supervisors committed to safety (PR = 2.09; 95%CI 1.04;4.21), the existence of clear rules to prevent workplace injuries (PR = 2.81; 95%CI 1.41;5.59), and whether they were informed about workplace safety (PR = 2.42; 95%CI 1.23;4.76). CONCLUSIONS There is a gender bias regarding the use of hearing protection devices that is less favorable to women. The use of such devices among women is positively influenced by their perception of a safe workplace, suggesting that gender should be considered as a factor in hearing conservation programs.
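The prevalence ratios and 95% confidence intervals reported above can be computed with the standard log-transform (Wald) method. The Python sketch below uses made-up counts purely for illustration; they are not the survey's raw data.

```python
import math

def prevalence_ratio(exposed_cases, exposed_total, unexposed_cases, unexposed_total):
    """Prevalence ratio and 95% CI via the log-transform (Wald) method.

    PR = p1 / p0, with SE(ln PR) = sqrt((1-p1)/cases1 + (1-p0)/cases0).
    """
    p1 = exposed_cases / exposed_total
    p0 = unexposed_cases / unexposed_total
    pr = p1 / p0
    se = math.sqrt((1 - p1) / exposed_cases + (1 - p0) / unexposed_cases)
    lower = pr * math.exp(-1.96 * se)
    upper = pr * math.exp(1.96 * se)
    return pr, lower, upper

# Illustrative counts only: 60/100 device users in one group vs 40/100 in another.
pr, lo, hi = prevalence_ratio(60, 100, 40, 100)  # PR = 1.5
```

A PR of 1.5 whose interval excludes 1 would, as in the results above, indicate a statistically significant association.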
Abstract:
Until recently, the great majority of courses in science and technology areas, where lab work is a fundamental part of the learning process, could not be taught at a distance. This reality is changing with the dissemination of remote laboratories. Supported by resources based on new information and communication technologies, it is now possible to remotely control a wide variety of real laboratories. However, most of them are designed specifically for this purpose, are inflexible, and resemble the real labs only in their functionality. In this paper, an alternative remote lab infrastructure devoted to the study of electronics is presented. Its main characteristics are, from a teacher's perspective, reusability and simplicity of use, and, from a student's point of view, an exact replication of the real lab, enabling students to complement or finish at home the work started in class. The remote laboratory is integrated in the Learning Management System in use at the school and therefore may be combined with other web experiments and e-learning strategies, while safeguarding security access issues.
Abstract:
Recent advances in vacuum sciences and applications are reviewed. Novel optical interferometer cavity devices enable pressure measurements with ppm accuracy. The innovative dynamic vacuum standard allows for pressure measurements with temporal resolution of 2 ms. Vacuum issues in the construction of huge ultra-high vacuum devices worldwide are reviewed. Recent advances in surface science and thin films include new phenomena observed in electron transport near solid surfaces as well as novel results on the properties of carbon nanomaterials. Precise techniques for surface and thin-film characterization have been applied in the conservation technology of cultural heritage objects and recent advances in the characterization of biointerfaces are presented. The combination of various vacuum and atmospheric-pressure techniques enables an insight into the complex phenomena of protein and other biomolecule conformations on solid surfaces. Studying these phenomena at solid-liquid interfaces is regarded as the main issue in the development of alternative techniques for drug delivery, tissue engineering and thus the development of innovative techniques for curing cancer and cardiovascular diseases. A review on recent advances in plasma medicine is presented as well as novel hypotheses on cell apoptosis upon treatment with gaseous plasma. Finally, recent advances in plasma nanoscience are illustrated with several examples and a roadmap for future activities is presented.
Abstract:
Dissertation submitted in partial fulfilment of the requirements for the Degree of Master of Science in Geospatial Technologies
Abstract:
The aim of this study was to contribute to the assessment of exposure levels to ultrafine particles in the urban environment of Lisbon, Portugal, due to automobile traffic, by monitoring the lung-deposited alveolar surface area (resulting from exposure to ultrafine particles) in a major avenue leading to the town center during late spring, as well as in indoor buildings facing it. Data revealed differentiated patterns for weekdays and weekends, consistent with the PM2.5 and PM10 patterns currently monitored by air quality stations in Lisbon. The observed ultrafine particulate levels may be directly correlated with fluxes in automobile traffic. During a typical week, the alveolar deposited surface area of ultrafine particles varied between 35 and 89.2 μm²/cm³, which is comparable with levels reported for other towns in Germany and the United States. The measured values allowed for determination of the number of ultrafine particles per cubic centimeter, which is comparable to levels reported for Madrid and Brisbane. Regarding outdoor/indoor levels, we observed higher levels (32 to 63%) outdoors, which is somewhat lower than levels observed in houses in Ontario.
Abstract:
In recent years, computers equipped with multiple processors and multi-core CPUs have become commonplace. To exploit the new capabilities of this hardware efficiently, tools began to appear that ease the development of parallel software, through languages and frameworks adapted to different languages. With the widespread availability of high-speed networks, such as Gigabit Ethernet and the latest generation of Wi-Fi, the opportunity arises to parallelize processing not only across processors and cores, but simultaneously across different machines. The model that allows processing to be parallelized locally and simultaneously distributed to machines that can also parallelize it is called the "distributed parallel model". This dissertation analyzes techniques and tools used for parallel programming, as well as the existing work in the area of parallel and distributed programming. Taking these two factors into account, a framework is proposed that attempts to apply the simplicity of parallel programming to the distributed parallel concept. The proposal is based on providing a Java framework with a simple, easy-to-learn and readable programming interface that is able to parallelize and distribute processing transparently. Although simple, an effort was made to make it configurable so that it adapts to as many situations as possible. This dissertation especially explores the issues of work execution and distribution, and the way code is automatically sent over the network to other cooperating nodes, thus avoiding the manual installation of the applications on every node of the network.
To confirm the validity of this concept and of the ideas defended in this dissertation, the framework, named DPF4j (Distributed Parallel Framework for JAVA), was implemented, and tests were run and metrics collected to verify the existence of performance gains over existing solutions.
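The core execution model of such a distributed parallel framework, partition the work, fan the chunks out to cooperating workers, and merge the results in order, can be sketched in a few lines. DPF4j itself is a Java framework; the sketch below uses Python threads purely to illustrate the model, and all names are illustrative rather than part of DPF4j's API.

```python
from concurrent.futures import ThreadPoolExecutor

def split(work, n_chunks):
    """Partition a list of work items into roughly equal chunks, one per worker."""
    k, r = divmod(len(work), n_chunks)
    chunks, start = [], 0
    for i in range(n_chunks):
        end = start + k + (1 if i < r else 0)
        chunks.append(work[start:end])
        start = end
    return chunks

def process_chunk(chunk):
    # Stand-in for the user task each cooperating node/core would run.
    return [x * x for x in chunk]

def run_parallel(work, n_workers=4):
    """Fan chunks out to local workers and merge the results in input order."""
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        results = pool.map(process_chunk, split(work, n_workers))
    return [y for part in results for y in part]

out = run_parallel(list(range(10)), n_workers=3)  # squares of 0..9, in order
```

In the distributed parallel model the same fan-out happens twice: across machines first, and then across the cores of each machine, with the framework shipping the task code to remote nodes automatically.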
Abstract:
This work focuses on the management of residues produced in the wastewater drainage and treatment system of the municipality of Vila Nova de Gaia. The host organization is a company responsible for water distribution and for wastewater drainage and treatment. The company has been certified under the NP EN ISO 14001 standard since 2001, so one of the objectives was to frame the management of the residues under study within that standard, following its requirements with a view to full compliance. Another objective was to determine the most adequate treatment option to apply to the residue at its temporary storage site in order to mitigate its environmental impacts. Based on the analytical characterization of the residue and on the applicable legal requirements, the possible and environmentally adequate final destinations for the residue were also analyzed. The measure proposed to minimize impacts at the temporary storage site was stabilization with lime in the drying beds available at a former housing-estate wastewater treatment plant. The lime dosage to apply to the residue is 10 kg of commercial slaked lime (Ca(OH)2) per tonne of fresh residue, with a minimum drying period of 2 months. Another impact-minimization measure selected was the planting of a tree curtain around the facility. As the residue under study is very heterogeneous, consisting mainly of sand, soil and screenings, recovery was considered but no viable alternatives were found. The final destination considered most adequate, taking into account all the characteristics of the residue and eluate, analyzed in accordance with Decree-Law no. 183/2009 of 10 August, was a landfill for non-hazardous waste. The study also covered the identification and analysis of all the environmental aspects related to residue management and the assessment of their significance.
Among the environmental aspects identified as significant, those that currently occur stand out, namely the stored residues (screenings/network cleaning), along with those that may occur in emergency situations, such as oil/fuel leaks or spills and odors. To minimize the identified environmental aspects, and in accordance with the NP EN ISO 14001 standard, actions were proposed in a management program drawn up for this work, which defines objectives, targets and deadlines. The main measures proposed in the management program were: stabilization with lime (initial, reinforced if necessary); improvement of the surrounding area; analysis of occupational health and safety issues; contracting of collection services from a licensed operator; planting of the tree curtain; registration in SIRAPA; and creation of environmental emergency and safety plans.
Abstract:
Electricity markets are nowadays a reality almost everywhere in the world. However, there is no consensus on the regulatory model to adopt, which leads to the use of different models in the various countries that have started the process of liberalization and restructuring of the electricity sector. Since electrical energy cannot be stored, at least in large quantities, these countries face important issues related to the actual management of their power systems. These issues require the adoption of rules, imposed by the regulator, that make it possible to overcome them. This work presents a study of the electricity markets existing around the world that the author considered most important. A study was also made of optimization tools, essentially based on metaheuristics, applied to problems related to market operation and to electric power systems, such as the resolution of the Economic Dispatch problem. An application was developed that simulates the operation of a market under the Symmetric Pool model, in which offers to sell and buy electricity are submitted by producers, on the one hand, and by retailers, eligible consumers or financial intermediaries, on the other, analyzing the technical feasibility of the Provisional Dispatch. The technical feasibility of the Provisional Dispatch is verified using the DC power flow model. If the Provisional Dispatch is infeasible, due to the violation of constraints of the problem, corrective measures to that dispatch are determined, based on the submitted offers and using an Optimal Dispatch. The Optimal Dispatch is determined using the Genetic Algorithms metaheuristic. The application was developed in MATLAB using the Graphical User Interfaces tool.
The test network used was the 14-bus network of the Institute of Electrical and Electronics Engineers (IEEE). The application proves competent in simulating a market operating as a Symmetric Pool, where simple offers are submitted and transactions occur in the day-ahead market; however, it does not reflect the full real-world problem associated with this type of market. It is, therefore, a basic simulator of an electricity market whose operating model is based on the Symmetric Pool type.
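The DC power flow feasibility check at the heart of such a simulator can be illustrated on a toy network. The Python sketch below (a hypothetical 3-bus example, not the IEEE 14-bus network used in the work) solves the linearized equations B'θ = P for the non-slack bus angles and then checks every line flow against a thermal limit:

```python
def dc_power_flow_3bus(p2, p3, x12=0.1, x13=0.1, x23=0.1):
    """DC power flow on an illustrative 3-bus network (bus 1 is the slack).

    Solves the reduced system B' * theta = P for the non-slack angles,
    then computes line flows as (theta_i - theta_j) / x_ij, all in p.u.
    """
    b12, b13, b23 = 1 / x12, 1 / x13, 1 / x23
    # Reduced susceptance matrix B' for buses 2 and 3: [[a, b], [c, d]].
    a, b = b12 + b23, -b23
    c, d = -b23, b13 + b23
    det = a * d - b * c
    th2 = (d * p2 - b * p3) / det   # Cramer's rule on the 2x2 system
    th3 = (a * p3 - c * p2) / det
    th1 = 0.0                       # slack bus angle reference
    return {
        "1-2": (th1 - th2) * b12,
        "1-3": (th1 - th3) * b13,
        "2-3": (th2 - th3) * b23,
    }

def feasible(flows, limit=0.6):
    """Provisional dispatch is feasible if no line exceeds its limit (p.u.)."""
    return all(abs(f) <= limit for f in flows.values())

# Generator at bus 2 injects 0.5 p.u.; load at bus 3 draws 1.0 p.u.;
# the slack bus supplies the remainder.
flows = dc_power_flow_3bus(p2=0.5, p3=-1.0)
ok = feasible(flows)
```

When `feasible` returns false, a simulator like the one described would fall back to the corrective Optimal Dispatch, searching (here, with a genetic algorithm) for a dispatch whose flows respect every line limit.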
Abstract:
The main cause of death and disability in Portugal is stroke (Acidente Vascular Cerebral, AVC). This research work therefore aims to identify and quantify the factors that contribute to the occurrence of a stroke (by type and with sequelae), the length of hospital stay, and potential therapeutic referral solutions for the patient after a stroke. By identifying and quantifying such factors, it is possible to draw a profile of the at-risk patient and act on it through preventive measures, so as to minimize the personal, social and financial impact of this problem. To achieve this objective, a sample of individuals admitted in 2012 to the Stroke Unit of the Centro Hospitalar do Tâmega e Sousa was analyzed. Of the cases analyzed, 87.8% were ischemic strokes (AVCI) and 12.2% hemorrhagic strokes (AVCH). Of all the cases, 58.9% present sequelae. Hypertension, diabetes mellitus and cholesterol appear as clinical antecedents with a high risk factor. Smoking, as well as alcoholism, plays a major role in aggravating the previously analyzed factors. It is concluded that the prevention of stroke and other cardiovascular diseases is important from school age onwards, with special attention to the period before 36 years of age, when a sharp rise in occurrences begins to be seen. Investment in prevention and in the medical surveillance of citizens is a crucial factor in this period and can greatly reduce the associated medium- to long-term costs for all parties involved.