937 results for ISE and ITSE optimization


Relevance:

100.00%

Publisher:

Abstract:

This work was carried out at the company Amorim & Irmãos, SA and had two main objectives. The first focused on the analysis of the cork stopper surface treatment process, seeking an alternative to the product currently used by the company and optimizing it. The second objective was the development of a new method for determining in-bottle absorption that would not require knowledge of the stopper's initial mass. For the first objective, twelve chemical products were studied and compared with the one currently in use, the target being extraction forces between 15 and 20 daN. After the surface treatment was applied with each product, several laboratory tests were performed, namely: extraction forces, tube sealing, in-bottle absorption, capillarity, risk analysis of the quantity of product added, and risk analysis of the treatment distribution time. A global analysis of the results showed that product T13, although its extraction forces sit at the lower limit of the desired range, stabilizes well. Products T5 and T6 are good alternatives to the product currently in use (T8), although some care is needed in their handling. Since product T5 performed poorly in the in-bottle absorption test, it cannot be used for more distant markets (USA, Australia and South Africa) because of the risk of wine migrating through the cork stopper. Since product T6 behaved irregularly in the risk analyses of both the quantity of product added and the product distribution, close attention must be paid to the quantity inserted in the drum as well as to the distribution time. For the second objective, in-bottle absorption was determined by the current method and compared with the new one. Despite a standard deviation of approximately 0.85, the new method of determining in-bottle absorption proved effective and can be approved by the company. It thus became possible to resolve this issue and allow the quality control laboratory to determine the in-bottle absorption of wine bottles received from customers.

Relevance:

100.00%

Publisher:

Abstract:

Master's degree in Accounting and Management of Financial Institutions.

Relevance:

100.00%

Publisher:

Abstract:

We introduce an innovative approach to the simultaneous control of growth mode and magnetotransport properties of manganite thin films, based on easy-to-implement film/substrate interface engineering. The deposition of a manganite seed layer and the optimization of the substrate temperature allow persistent two-dimensional epitaxy and robust ferromagnetic properties at the same time. Structural measurements confirm that in such interface-engineered films the optimal properties are related to improved epitaxy. A new growth scenario is envisaged, compatible with a shift from heteroepitaxy towards pseudo-homoepitaxy. Relevant growth parameters such as formation energy, roughening temperature, strain profile and chemical states are derived.

Relevance:

100.00%

Publisher:

Abstract:

This dissertation is the result of a study carried out between March 2015 and March 2016 on the topic of Energy Efficiency in Buildings, within the scope of the second-year dissertation of the Master's in Electrical Engineering – Power Systems at the Instituto Superior de Engenharia do Porto (ISEP). Buildings currently account for about 40% of energy consumption in most European countries, energy consumed mainly in heating, cooling and powering electrical appliances. Hospitals, as large buildings, are heavy energy consumers and, in most European countries, rank among the least efficient public buildings. They therefore represent a building type with substantial energy-saving potential. The activity carried out in them, together with the specific requirements of the health sector, makes them a very attractive target for energy analysis and optimization. The present work studies the energy-efficiency potential of a hospital located in the Porto area. An assessment of its energy needs was first carried out in order to identify the priority sectors for intervention. The study covers the analysis of consumption data obtained through monitoring, the replacement of the existing lighting with more efficient lighting, the installation of solar panels to reduce the consumption for domestic hot water, the replacement of a diesel boiler with a biomass boiler, and the replacement of a chiller with a more efficient one, among other measures. The consumption recorded at the hospital under study is compared against the national plan Eficiência Energética e Hídrica no Sistema Nacional de Saúde (Energy and Water Efficiency in the National Health System) in order to see how this hospital's consumption compares with that of other hospitals.
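
As a rough, self-contained illustration of the kind of measure the study evaluates, the sketch below estimates the annual saving and simple payback of a lighting retrofit. All figures (powers, tariff, fixture count, retrofit cost) are hypothetical placeholders, not values from the dissertation.

```python
# Back-of-the-envelope payback estimate for a lighting retrofit.
# All numbers are hypothetical placeholders, not values from the dissertation.

def annual_energy_kwh(power_w: float, hours_per_day: float, days: int = 365) -> float:
    """Annual consumption in kWh for a single fixture."""
    return power_w * hours_per_day * days / 1000.0

old = annual_energy_kwh(power_w=58, hours_per_day=24)   # fluorescent tube, always-on ward
new = annual_energy_kwh(power_w=22, hours_per_day=24)   # LED replacement
tariff_eur_per_kwh = 0.15                               # assumed electricity tariff
fixtures = 500                                          # assumed fixture count

annual_saving = (old - new) * tariff_eur_per_kwh * fixtures
retrofit_cost = 40.0 * fixtures                         # assumed cost per fixture
print(f"annual saving: {annual_saving:.0f} EUR, "
      f"simple payback: {retrofit_cost / annual_saving:.1f} years")
```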

Relevance:

100.00%

Publisher:

Abstract:

Faculty of Chemistry: Department of Synthesis and Structure of Organic Compounds

Relevance:

100.00%

Publisher:

Abstract:

Ongoing training of IT staff is one of the most effective strategies for improving the quality, stability and security of networks and their associated services. Along these lines, CEDIA has been running training courses and workshops for its members and, within CSIRT-CEDIA, thought has been given to optimizing the processes involved in deploying the infrastructure needed to provide the participants of these courses with suitably customized material in the field of information security. Virtualization techniques were therefore adopted to make the most of the available resources; but even though this in itself is not a new trend, using a full copy of the virtual disk for each participant is impractical not only in terms of time but also in terms of the storage it consumes. This work is aimed precisely at optimizing the time and resource consumption involved in replicating the same virtual machine and disk for individualized use by several participants.
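
The abstract does not name the virtualization stack, but the optimization it describes, avoiding a full disk copy per participant, is commonly realized with copy-on-write overlays. Below is a minimal sketch assuming a QEMU/KVM host with the qemu-img tool available; the base-image path and participant names are hypothetical.

```python
# Per-participant copy-on-write overlays on a shared base image.
# Assumes a QEMU/KVM stack with qemu-img installed; each overlay stores only
# the blocks that participant changes, so creation is near-instant and cheap.
import subprocess
from pathlib import Path

BASE_IMAGE = Path("/srv/images/training-base.qcow2")   # hypothetical path

def create_overlay(participant: str,
                   out_dir: Path = Path("/srv/images/overlays")) -> Path:
    """Create a thin qcow2 overlay backed by the shared read-only base image."""
    out_dir.mkdir(parents=True, exist_ok=True)
    overlay = out_dir / f"{participant}.qcow2"
    subprocess.run(
        ["qemu-img", "create",
         "-f", "qcow2",              # overlay format
         "-b", str(BASE_IMAGE),      # shared backing file
         "-F", "qcow2",              # backing file format
         str(overlay)],
        check=True,
    )
    return overlay

for name in ["alice", "bob", "carol"]:                  # hypothetical participants
    print("created", create_overlay(name))
```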

Relevance:

100.00%

Publisher:

Abstract:

The performance, energy efficiency and cost improvements due to traditional technology scaling have begun to slow down and present diminishing returns. Underlying reasons for this trend include fundamental physical limits of transistor scaling, the growing significance of quantum effects as transistors shrink, and a growing mismatch between transistors and interconnects regarding size, speed and power. Continued Moore's Law scaling will not come from technology scaling alone; it must involve improvements to design tools and the development of new disruptive technologies such as 3D integration. 3D integration offers potential improvements to interconnect power and delay by translating the routing problem into a third dimension, and it facilitates transistor density scaling independent of the technology node. Furthermore, 3D IC technology opens up a new architectural design space of heterogeneously-integrated high-bandwidth CPUs. Vertical integration promises to provide the CPU architectures of the future by integrating high-performance processors with on-chip high-bandwidth memory systems and highly connected network-on-chip structures. Such techniques can overcome the well-known CPU performance bottlenecks referred to as the memory and communication walls. However, the promising improvements to performance and energy efficiency offered by 3D CPUs do not come without cost, both in the financial investment required to develop the technology and in the increased complexity of design. The two main limitations of 3D IC technology have been heat removal and TSV reliability. Transistor stacking increases power density, current density and thermal resistance in air-cooled packages. Furthermore, the technology introduces vertical through-silicon vias (TSVs) that create new points of failure in the chip and require the development of new BEOL technologies. Although these issues can be controlled to some extent using thermal- and reliability-aware physical and architectural 3D design techniques, high-performance embedded cooling schemes, such as micro-fluidic (MF) cooling, are fundamentally necessary to unlock the true potential of 3D ICs. A new paradigm is being put forth which integrates the computational, electrical, physical, thermal and reliability views of a system. The unification of these diverse aspects of integrated circuits is called Co-Design. Independent design and optimization of each aspect leads to sub-optimal designs due to a lack of understanding of cross-domain interactions and their impact on the feasibility region of the architectural design space. Co-Design enables optimization across layers with a multi-domain view and thus unlocks new high-performance and energy-efficient configurations. Although the co-design paradigm is becoming increasingly necessary in all fields of IC design, it is even more critical in 3D ICs where, as we show, the inter-layer coupling and higher degree of connectivity between components exacerbate the interdependence between architectural parameters, physical design parameters and the multitude of metrics of interest to the designer (i.e. power, performance, temperature and reliability). In this dissertation we present a framework for multi-domain co-simulation and co-optimization of 3D CPU architectures with both air and MF cooling solutions. Finally, we propose an approach for design space exploration and modeling within the new Co-Design paradigm, and we discuss possible avenues for improving this work in the future.
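
To make the co-design argument concrete, here is a toy design-space exploration that enumerates architectural configurations, scores each one with placeholder analytical models for performance, power and temperature, and keeps the Pareto-optimal set. The models below are illustrative stand-ins, not the dissertation's simulators.

```python
# Toy multi-domain design-space exploration with Pareto filtering.
# evaluate() uses made-up analytical models purely for illustration.
from itertools import product

def evaluate(cores, layers, mf_cooling):
    perf = cores * 1.0 * (1.1 if layers > 1 else 1.0)    # stacked memory helps
    power = cores * 2.0 + layers * 0.5
    # Stacking raises thermal resistance; micro-fluidic cooling lowers it.
    r_th = 0.8 * layers * (0.3 if mf_cooling else 1.0)
    temp = 45.0 + power * r_th
    return perf, power, temp

def dominates(a, b):
    """a dominates b if it is no worse in every metric and strictly better in
    one (higher perf, lower power, lower temperature)."""
    ge = a[0] >= b[0] and a[1] <= b[1] and a[2] <= b[2]
    gt = a[0] > b[0] or a[1] < b[1] or a[2] < b[2]
    return ge and gt

configs = list(product([2, 4, 8], [1, 2, 4], [False, True]))
scored = {cfg: evaluate(*cfg) for cfg in configs}
pareto = [cfg for cfg in configs
          if not any(dominates(scored[o], scored[cfg]) for o in configs)]
for cfg in pareto:
    print(cfg, tuple(round(m, 1) for m in scored[cfg]))
```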

Relevance:

100.00%

Publisher:

Abstract:

Metaheuristics are widely used in discrete optimization. They make it possible to obtain a good-quality solution in reasonable time for problems that are large, complex and difficult to solve. Metaheuristics often have many parameters that the user must tune manually for a given problem. The goal of an adaptive metaheuristic is to let the method adjust some of these parameters automatically, based on the instance being solved. By drawing on prior knowledge of the problem and on notions from machine learning and related fields, an adaptive metaheuristic yields a more general and automatic problem-solving method. The global optimization of mining complexes seeks to establish the material movements in the mines and the processing flows so as to maximize the economic value of the system. Because of the large number of integer variables in the model and the presence of complex and non-linear constraints, solving these models with the optimizers available in industry is often prohibitive. Metaheuristics are therefore commonly used for the optimization of mining complexes. This thesis improves a simulated annealing procedure developed by Goodfellow & Dimitrakopoulos (2016) for the stochastic optimization of mining complexes. The method developed by those authors requires many parameters to run; one of them governs how the simulated annealing procedure searches the local neighborhood of solutions. This thesis implements an adaptive neighborhood search method to improve the quality of a solution. The numerical results show an increase of up to 10% in the value of the economic objective function.
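
Below is a minimal sketch of simulated annealing with an adaptive choice among neighborhood moves, where each move's selection weight grows with the improvement it has produced. It illustrates the general idea only, not the exact scheme of Goodfellow & Dimitrakopoulos (2016), and the objective function is a toy stand-in for the economic value of a mining complex.

```python
# Simulated annealing with adaptive neighborhood-move selection (sketch).
# Moves that have historically improved the objective are chosen more often.
import math
import random

def economic_value(x):              # toy stand-in for the mining model
    return -sum((xi - 3.0) ** 2 for xi in x)

def shift(x, i):                    # perturb one component
    x[i] += random.uniform(-1, 1)

def swap(x, i):                     # exchange two components
    j = random.randrange(len(x))
    x[i], x[j] = x[j], x[i]

MOVES = [shift, swap]

x = [random.uniform(0, 6) for _ in range(10)]
weights = [1.0, 1.0]                # adaptive selection weights, one per move
temp = 10.0
for step in range(20000):
    k = random.choices(range(len(MOVES)), weights=weights)[0]
    cand = list(x)
    MOVES[k](cand, random.randrange(len(cand)))
    delta = economic_value(cand) - economic_value(x)
    if delta > 0 or random.random() < math.exp(delta / temp):
        x = cand
        weights[k] += max(delta, 0.0)   # reward moves that improved the value
    temp *= 0.9995                      # geometric cooling schedule
print("best value found:", round(economic_value(x), 3))
```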

Relevance:

100.00%

Publisher:

Abstract:

Mobile sensor networks have unique advantages compared with wireless sensor networks: mobility enables mobile sensors to flexibly reconfigure themselves to meet sensing requirements. In this dissertation, an adaptive sampling method for mobile sensor networks is presented. In view of sensing resource constraints, computing abilities and onboard energy limitations, the adaptive sampling method follows a downsampling scheme, which reduces the total number of measurements and lowers the sampling cost. Compressive sensing is a recently developed downsampling method that uses a small number of randomly distributed measurements for signal reconstruction. However, unlike classical Shannon sampling, the original signal cannot be reconstructed directly from these condensed measurements: the measurements have to be processed in a sparse domain, and convex optimization methods must be applied to reconstruct the original signal. The restricted isometry property guarantees that signals can be recovered with little information loss. While compressive sensing can effectively lower the sampling cost, signal reconstruction remains a great research challenge. Compressive sensing always collects random measurements, whose information content cannot be determined a priori; if each measurement is instead optimized to be the most informative one, reconstruction performance improves considerably. Based on these considerations, this dissertation focuses on an adaptive sampling approach that finds the most informative measurements in unknown environments and reconstructs the original signals. With mobile sensors, measurements are collected sequentially, giving the chance to optimize each of them individually. When a mobile sensor is about to collect a new measurement from the surrounding environment, existing information is shared among the networked sensors so that each sensor has a global view of the entire environment. The shared information is analyzed in the Haar wavelet domain, in which most natural signals appear sparse, to infer a model of the environment. The most informative measurements can then be determined by optimizing the model parameters. As a result, all the measurements collected by the mobile sensor network are the most informative measurements given the existing information, and a perfect reconstruction can be expected. To present the adaptive sampling method, a series of research issues is addressed, including measurement evaluation and collection, mobile network establishment, data fusion, sensor motion and signal reconstruction. A two-dimensional scalar field is reconstructed using the proposed method; both single mobile sensors and mobile sensor networks are deployed in the environment, and the reconstruction performance of the two is compared. In addition, a particular mobile sensor, a quadrotor UAV, is developed so that the adaptive sampling method can be used in three-dimensional scenarios.
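
As a small worked example of the compressive-sensing pipeline the abstract refers to, the sketch below recovers a sparse signal from far fewer random measurements than samples via iterative soft-thresholding (ISTA), one simple convex l1 solver. The toy signal is sparse in the identity basis, whereas the dissertation works in the Haar wavelet domain.

```python
# Compressive-sensing toy: recover a k-sparse length-n signal from m << n
# random measurements by solving the l1-regularized least-squares problem
# with ISTA (gradient step followed by soft-thresholding).
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 256, 64, 5                        # signal length, measurements, sparsity
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(0, 1, k)

A = rng.normal(0, 1 / np.sqrt(m), (m, n))   # random sensing matrix
y = A @ x_true                              # condensed measurements

lam = 0.01                                  # l1 regularization weight
step = 1.0 / np.linalg.norm(A, 2) ** 2      # step size from the spectral norm
x = np.zeros(n)
for _ in range(3000):
    x = x + step * A.T @ (y - A @ x)                        # gradient step
    x = np.sign(x) * np.maximum(np.abs(x) - lam * step, 0)  # soft-threshold

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```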

Relevance:

100.00%

Publisher:

Abstract:

The globally increasing number of disaster victims poses a complex challenge for disaster management authorities. Moreover, to accomplish a successful transition between preparedness and response, it is important to consider the features inherent to each type of disaster. Floods are among the most frequent and harmful disasters, hence the need for a disaster-preparedness tool that supports efficient and effective flood management. The purpose of this article is to introduce a method to simultaneously define the proper location of shelters and distribution centers, along with the allocation of prepositioned goods and the distribution decisions required to serve flood victims. The tool combines a raster geographical information system (GIS) with an optimization model. The GIS determines the flood hazard of city areas in order to assess the flood situation and to discard floodable facilities. The multi-commodity multimodal optimization model is then solved to obtain the Pareto frontier of two criteria: distance and cost. The methodology was applied to a case study of the 2007 flood in Villahermosa, Mexico, and the results were compared to an optimized scenario of the guidelines followed by Mexican authorities, concluding that the developed method improved the value of the performance measures. Furthermore, the results showed that adequate care can be provided to the affected population with fewer facilities than the current approach requires, and demonstrated the advantages of considering more than one distribution center for relief prepositioning.
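
Here is a brute-force sketch of the two-criteria trade-off the article optimizes: floodable candidate facilities are discarded (mirroring the GIS screening), every subset of the remaining candidates is costed, each affected zone is assigned to its nearest open facility, and the Pareto frontier of distance versus cost is kept. Coordinates, costs and flood flags are hypothetical, and the enumeration stands in for the article's multi-commodity multimodal model.

```python
# Toy facility-location trade-off: enumerate safe facility subsets and keep
# the Pareto frontier of (total distance, total opening cost).
from itertools import combinations

zones = {"Z1": (0, 0), "Z2": (4, 1), "Z3": (2, 5)}          # hypothetical demand
candidates = {"F1": ((1, 1), 100, False), "F2": ((4, 4), 80, True),
              "F3": ((3, 0), 120, False), "F4": ((2, 3), 90, False)}
safe = {f: (pos, cost) for f, (pos, cost, flooded) in candidates.items()
        if not flooded}                                      # GIS-style screening

def dist(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

solutions = []
for r in range(1, len(safe) + 1):
    for opened in combinations(safe, r):
        total_cost = sum(safe[f][1] for f in opened)
        total_dist = sum(min(dist(z, safe[f][0]) for f in opened)
                         for z in zones.values())
        solutions.append((opened, total_dist, total_cost))

pareto = [s for s in solutions
          if not any(o[1] <= s[1] and o[2] <= s[2] and (o[1], o[2]) != (s[1], s[2])
                     for o in solutions)]
for opened, d, c in sorted(pareto, key=lambda s: s[2]):
    print(opened, f"distance={d:.2f}", f"cost={c}")
```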

Relevance:

100.00%

Publisher:

Abstract:

The marketing and sales department is fundamental in a company because it is responsible for developing and implementing strategies that satisfy customer needs and requirements. It is here that an increase in the company's sales can best be seen. Customer service, the relationship with the customer and ongoing support are very important topics, both for attracting new customers and for retaining current ones. This work was developed around the problem of the growing loss of customers at the company Leader Ltda., and its objective is to design and create a marketing and sales plan for it. Through a non-experimental, descriptive and interpretive study, it focused on different internal and external analyses of the company in order to develop an action plan that can be implemented there.

Relevance:

100.00%

Publisher:

Abstract:

Stroke is one of the most frequent causes of death, regardless of age or gender. Besides its high mortality figures, the disease also causes long-term disabilities with long recovery times and correspondingly high costs. However, stroke and related diseases may also be prevented by attending to illness evidence. The present work therefore starts with the development of a decision support system to assess stroke risk, centered on a formal framework based on Logic Programming for knowledge representation and reasoning, complemented with a Case Based Reasoning (CBR) approach to computing. In order to make the CBR cycle practical, normalization and optimization phases were introduced, and clustering methods were used, thereby reducing the search space and enhancing case retrieval. On the other hand, aiming at an improvement of the CBR theoretical basis, the predicates' attributes were normalized to the interval [0, 1], and the extensions of the predicates that match the universe of discourse were rewritten, set not only in terms of an evaluation of their Quality-of-Information (QoI) but also in terms of an assessment of a Degree-of-Confidence (DoC), a measure of one's confidence that they fit into a given interval, taking their domains into account. In other words, each predicate attribute is given as a pair (QoI, DoC), a simple and elegant way to represent data or knowledge that is incomplete, self-contradictory, or even unknown.
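
A sketch of the (QoI, DoC) encoding described above follows. It assumes, and this is an assumption to check against the authors' formulation rather than a confirmed detail, that the Degree-of-Confidence of an attribute known to lie in a normalized interval of width dl is sqrt(1 - dl^2), so exact values score 1 and unknown values score 0; the attribute names and domains are hypothetical.

```python
# (QoI, DoC) encoding of interval-valued attributes (sketch under assumptions).
# Assumed rule: DoC = sqrt(1 - width**2) for a normalized interval width,
# QoI = 1 for known values and 0 for unknown ones.
import math

def normalize(value, lo, hi):
    """Map a raw attribute value into [0, 1] over its domain [lo, hi]."""
    return (value - lo) / (hi - lo)

def encode(interval, domain):
    """Return the (QoI, DoC) pair for an interval-valued attribute.
    `interval` is (low, high) in raw units, or None when the value is unknown."""
    if interval is None:                   # unknown: the whole domain is possible
        return 0.0, 0.0                    # no information, no confidence
    a = normalize(interval[0], *domain)
    b = normalize(interval[1], *domain)
    width = b - a
    return 1.0, math.sqrt(max(0.0, 1.0 - width ** 2))

# Blood pressure known exactly, age known only to a decade, BMI unknown.
print(encode((140, 140), (80, 220)))       # -> (1.0, 1.0)
print(encode((60, 70), (0, 120)))          # -> (1.0, ~0.997)
print(encode(None, (10, 60)))              # -> (0.0, 0.0)
```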

Relevance:

100.00%

Publisher:

Abstract:

The spatial distribution of the magnetic field and the coupling between the coils in Wireless Power Transfer (WPT) systems are important aspects to consider in system design and efficiency optimization. The study presented in this paper is based on tests performed on a physical model. The transmitting (primary) equipment is an electrical three-phase system that can be connected in star or delta (both electrically and geometrically). The measured results allow the magnetic field distribution to be described graphically in three dimensions. The analytical formulas help to understand and quantify the physical phenomena, but they cannot be considered a universal approach, and the measurement results give a better understanding of the observed behaviour. In WPT, the key issues that influence efficiency are the alignment of the coils, the spatial orientation of the magnetic field, and the separation and tilt between the windings, all of which change the magnetic coupling between the transmitter and the receiver of energy. This research therefore addresses not only the magnetic field distribution but, ultimately, the optimization of the energy transfer efficiency.
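
Two standard figures of merit make the alignment argument quantitative: the coupling coefficient k = M / sqrt(L1 L2) and the classical maximum link efficiency of a resonant inductive link. The efficiency expression below is the textbook result, not a formula taken from this paper, and the inductance values are hypothetical.

```python
# Standard inductive-link figures of merit, illustrating how misalignment
# (which reduces mutual inductance M) degrades achievable efficiency.
import math

def coupling(M, L1, L2):
    """Coupling coefficient k = M / sqrt(L1 * L2)."""
    return M / math.sqrt(L1 * L2)

def max_efficiency(k, Q1, Q2):
    """Classical peak efficiency of a resonant inductive link:
    eta = (k^2 Q1 Q2) / (1 + sqrt(1 + k^2 Q1 Q2))^2."""
    fom = k ** 2 * Q1 * Q2
    return fom / (1.0 + math.sqrt(1.0 + fom)) ** 2

L1 = L2 = 24e-6                      # hypothetical 24 uH windings
for M in (6e-6, 3e-6, 1e-6):         # mutual inductance drops with misalignment
    k = coupling(M, L1, L2)
    print(f"M={M * 1e6:.0f} uH  k={k:.3f}  "
          f"eta_max={max_efficiency(k, 100, 100):.2%}")
```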

Relevance:

100.00%

Publisher:

Abstract:

Recent technological development has enabled researchers to gather data from different performance scenarios while considering players' positioning and action events within a specific time frame. This technology ranges from global positioning systems to radio-frequency devices and computer vision tracking, to name the most common, and aims to collect players' time-motion data and enable the dynamical analysis of performance. Team sports, and in particular invasion games, present a complex dynamic by nature, based on the interaction between two opposing sides trying to outperform one another. During match and training situations, players' actions are coupled to their performance context at different interaction levels. As expected, the positioning of the ball, teammates and opponents plays an important role in this interaction process, but other factors, such as the final score, the teams' development level and the players' expertise, also seem to affect match dynamics. In this symposium, we will focus on how different constraints affect invasion game dynamics during both match and training situations. This relation will be established while underpinning the importance of these effects for game teaching and performance optimization. Regarding the match, different performance indicators based on spatial-temporal relations between players and teams will be presented to reveal the interaction processes that form the crucial component of game analysis. Regarding training, this symposium will address the relationship of small-sided games to full-sized matches and will present how the players' dynamical interaction affects different performance indicators.