Abstract:
Using fixed-point arithmetic is one of the most common design choices for systems where area, power or throughput are heavily constrained. In order to produce implementations where the cost is minimized without negatively impacting the accuracy of the results, a careful assignment of word-lengths is required. Finding the optimal combination of fixed-point word-lengths for a given system is a combinatorial NP-hard problem to which developers devote between 25 and 50% of the design-cycle time. Reconfigurable hardware platforms such as FPGAs also benefit from the advantages of fixed-point arithmetic, as it compensates for the slower clock frequencies and less efficient area utilization of these platforms with respect to ASICs. As FPGAs become commonly used for scientific computation, designs constantly grow larger and more complex, up to the point where they cannot be handled efficiently by current signal and quantization noise modelling and word-length optimization methodologies. In this Ph.D. Thesis we explore different aspects of the quantization problem and present new methodologies for each of them. Techniques based on interval extensions have made it possible to obtain accurate models of signal and quantization noise propagation in systems with non-linear operations. We take this approach a step further by introducing elements of Multi-Element Generalized Polynomial Chaos (ME-gPC) and combining them with a state-of-the-art Statistical Modified Affine Arithmetic (MAA) based methodology in order to model systems that contain control-flow structures. Our methodology produces the different execution paths automatically, determines the regions of the input domain that will exercise them, and extracts the system statistical moments from the partial results. We use this technique to estimate both the dynamic range and the round-off noise in systems with the aforementioned control-flow structures, and we show the accuracy of our approach, which in some case studies with non-linear operators deviates by only 0.04% with respect to the simulation-based reference values. A known drawback of techniques based on interval extensions is the combinatorial explosion of terms as the size of the targeted system grows, which leads to scalability problems. To address this issue we present a clustered noise injection technique that groups the signals in the system, introduces the noise terms for each group independently and then combines the results at the end. In this way, the number of noise sources in the system at any given time is kept under control and the combinatorial explosion is minimized. We also present a multi-way partitioning algorithm aimed at minimizing the deviation of the results due to the loss of correlation between noise terms, in order to keep the results as accurate as possible. This Ph.D. Thesis also covers the development of methodologies for word-length optimization based on Monte-Carlo simulations that run in reasonable times. We do so by presenting two novel techniques that approach the reduction of execution time from different angles. First, the interpolative method applies a simple but precise interpolator to estimate the sensitivity of each signal, which is later used to guide the optimization effort. Second, the incremental method revolves around the fact that, although we strictly need to guarantee a given confidence level in the simulations for the final results of the optimization process, we can use more relaxed levels, which imply considerably fewer samples, during the initial stages of the search, when we are still far from the optimized solution. Through these two approaches we demonstrate that the execution time of classical greedy search algorithms can be reduced by factors of up to ×240 for small/medium-sized problems. Finally, this book introduces HOPLITE, an automated, flexible and modular quantization framework that includes the implementation of the previous techniques and is publicly available. The aim is to offer developers and researchers a common ground for easily prototyping and verifying new techniques for system modelling and word-length optimization. We describe its workflow, justify the design decisions taken, explain its public API and give a step-by-step demonstration of its execution. We also show, through a simple example, how new extensions should be connected to the existing interfaces in order to expand and improve the capabilities of HOPLITE.
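To illustrate the incremental idea described above (relaxed confidence levels, and therefore fewer Monte-Carlo samples, in the early stages of the search, tightening them as the search converges), the following is a minimal Python sketch built around a toy datapath. The function names, the placeholder error model and the sample-count schedule are illustrative assumptions, not HOPLITE's actual API or the thesis' exact algorithm.

```python
import random

def mc_error(wordlengths, n_samples, rng):
    """Toy Monte-Carlo error estimator: quantize random inputs of a
    placeholder datapath (y = x0*x1 + x2) and average the deviation."""
    def q(x, w):                       # uniform quantizer with w fractional bits
        return round(x * 2 ** w) / 2 ** w
    err = 0.0
    for _ in range(n_samples):
        x = [rng.uniform(-1, 1) for _ in range(3)]
        ref = x[0] * x[1] + x[2]
        fx = q(q(x[0], wordlengths[0]) * q(x[1], wordlengths[1]), wordlengths[2]) \
             + q(x[2], wordlengths[2])
        err += abs(ref - fx)
    return err / n_samples

def greedy_incremental(max_err, wl_init=16, wl_min=4, stages=(200, 1000, 5000)):
    """Greedy word-length reduction where the number of Monte-Carlo samples
    (i.e. the effective confidence level) is relaxed early and tightened later.
    A real flow would re-verify the final assignment at the tightest level."""
    rng = random.Random(0)
    wl = [wl_init] * 3
    for n_samples in stages:                     # coarse -> fine confidence
        improved = True
        while improved:
            improved = False
            for i in range(len(wl)):
                if wl[i] <= wl_min:
                    continue
                trial = wl.copy()
                trial[i] -= 1                    # try shaving one bit
                if mc_error(trial, n_samples, rng) <= max_err:
                    wl = trial
                    improved = True
    return wl

if __name__ == "__main__":
    print(greedy_incremental(max_err=1e-3))
```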
Abstract:
Telecommunications service providers and operators face exponential growth in bandwidth requirements. With the evolution and mass adoption of Internet and Intranet services by public and private organizations, quality can no longer rely on merely adapting the TCP protocol to the link quality; traffic differentiation has become a necessity. Quality-of-service methodologies in the context of Internet service providers are the way to guarantee a level of service appropriate to each type of traffic. These methodologies are supported by the IP MPLS networks of the various telecommunications operators when transporting the services of their business and residential customers: Internet access, public data and voice services, and virtual private networks. Application portals are the direct interface with the customer for defining Service Level Agreements (SLAs) and associating them with Service Level Specifications (SLSs), which are later related to the definition of metrics appropriate to the quality of service agreed with the customer when designing the services of an IP MultiProtocol Label Switching (MPLS) network. The proposal consists of creating a methodology to map customers' service needs onto SLAs and record them in a database, clearly separating the quality of service from the operator's point of view into a transport network architecture, a service architecture and a monitoring architecture. These data are mapped onto parameters and implementation specifications of the services that support the operator's business, with the aim of creating an end-to-end workflow. In parallel, the commercially available services are defined, together with the set of services supported by the IP MPLS network and technology and the Quality of Service Assurance parametrization appropriate to each one; a base transport architecture is created between the various access aggregation devices across the backbone; and a support architecture is defined for each type of service, independent of the transport architecture. In this work, some of the QoS architectures studied for IP MPLS are implemented in simulators made available by the Open Source community, and the advantages and disadvantages of each are analysed. All requirements are properly accounted for, anticipating growth and performance, establishing rules for bandwidth distribution and performance analysis, and creating scalable networks with optimistic growth estimates. The services are designed to adapt to evolving application needs, to growth in the number of users and to the evolution of the service itself.
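The SLA-to-SLS mapping described above can be pictured with a minimal sketch; the class names, DSCP values and mapping rules below are illustrative assumptions, not the thesis' actual parametrization or any operator's real QoS design.

```python
from dataclasses import dataclass

# Illustrative traffic classes with assumed DSCP code points.
TRAFFIC_CLASSES = {
    "voice":       {"dscp": 46, "priority": 1},   # EF
    "business":    {"dscp": 26, "priority": 2},   # AF31
    "best_effort": {"dscp": 0,  "priority": 3},
}

@dataclass
class SLA:
    """Commercial agreement as seen by the customer."""
    customer: str
    service: str            # e.g. "internet", "vpn", "voice"
    bandwidth_mbps: int
    availability_pct: float

@dataclass
class SLS:
    """Technical specification derived from the SLA."""
    traffic_class: str
    dscp: int
    committed_rate_mbps: int
    max_latency_ms: int

def sla_to_sls(sla: SLA) -> SLS:
    """Map a commercial SLA onto a technical SLS (illustrative rules only)."""
    cls = "voice" if sla.service == "voice" else (
        "business" if sla.service == "vpn" else "best_effort")
    latency = {"voice": 20, "business": 50, "best_effort": 200}[cls]
    return SLS(cls, TRAFFIC_CLASSES[cls]["dscp"], sla.bandwidth_mbps, latency)

if __name__ == "__main__":
    sla = SLA("ACME", "vpn", bandwidth_mbps=100, availability_pct=99.9)
    print(sla_to_sls(sla))   # the resulting SLS would be stored in the database
```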
Abstract:
This master's thesis addresses the maintenance of pre-computed structures, which store a frequent or expensive query, for the nested bag data type in the high-level workflow language Pig Latin. The thesis defines a model suitable for accommodating incremental expressions over nested bags in Pig Latin. The partitioned normal form for sets is then extended with further restrictions in order to accommodate the nested bag model, allow the Pig Latin nest and unnest operators to revert each other, and create a suitable environment for incremental computations. Subsequently, the extended operators – extended union and extended difference – are defined for the nested bag data model under the partitioned normal form for bags (PNF Bag) restriction, and their semantics are given. Finally, incremental data propagation expressions are proposed for the nest and unnest operators on the proposed data model with the PNF Bag restriction, and a proof of correctness is given.
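As a rough illustration of the incremental-maintenance idea only (not the thesis' Pig Latin operators or its nested/PNF Bag model), the sketch below maintains a materialized flat-bag view under insertions and deletions by propagating deltas with bag union and bag difference instead of recomputing the view.

```python
from collections import Counter   # Counter used as a bag (multiset)

def bag_union(a: Counter, b: Counter) -> Counter:
    """Additive bag union: multiplicities are summed."""
    out = Counter(a)
    out.update(b)
    return out

def bag_difference(a: Counter, b: Counter) -> Counter:
    """Bag difference: multiplicities are subtracted, floored at zero."""
    out = Counter(a)
    out.subtract(b)
    return Counter({k: v for k, v in out.items() if v > 0})

def maintain_view(view: Counter, inserted: Counter, deleted: Counter) -> Counter:
    """Incrementally propagate a delta instead of recomputing the view."""
    return bag_difference(bag_union(view, inserted), deleted)

if __name__ == "__main__":
    view      = Counter({("alice", 3): 2, ("bob", 5): 1})   # current view
    delta_ins = Counter({("carol", 7): 1})                  # newly inserted tuples
    delta_del = Counter({("alice", 3): 1})                  # deleted tuples
    print(maintain_view(view, delta_ins, delta_del))
```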
Abstract:
The use of wastes and industrial by-products as building materials is an important issue for decreasing waste management costs and the embodied energy of building products. In this study scrap tire rubber was used as an additional aggregate in mortars based on natural hydraulic lime NHL 3.5 and natural sand. Different particle size fractions and proportions of scrap tire rubber were used: a mix obtained directly from industry and separated fine, medium and coarse fractions; 0 %, 18 %, 36 % and 54 % of the weight of binder, corresponding to 2.5 %, 5 % and 7.5 % of the weight of sand. As the specifications for NHL-based mortars became stricter with the current version of EN 459-1:2015, the influence of the rubber additions on the mortars' fresh-state, mechanical and physical performance is presented in this work: flow table consistency, water retention, dynamic elasticity modulus, flexural and compressive strength, open porosity and bulk density, capillary absorption, drying and thermal conductivity are studied. The use of the rubber mix coming from the waste tire industry seems advantageous and may open possibilities for its use as a raw material by the mortar industry.
Abstract:
The objective of this work is to select an open-source content management system and to design the website of one of the UOC's academic departments with the chosen software. To this end, the capabilities of this software for managing the web content are tested: creation of new content, expansion, access permissions, workflow, etc.
Abstract:
HR-394 was a software and database development project. With funding provided by the Iowa Highway Research Board, the Iowa County Engineer's Association Service Bureau oversaw the planning and implementation of an Internet-based application that supports two major local-government transportation project activities: project programming and development tracking. The goals were to reduce errors and inconsistencies, speed up the processes, link people to both project data and each other, and build a framework that could eventually support a 'paperless' workflow. The work started in 1999 and initial development was completed by the fall of 2002. Since going live, several 'piggy-back' applications have been required to make the programming side better fit actual work procedures. This part of the system has proven adequate but will be rewritten in 2004 to make it easier to use. The original development-side module was rejected by the users and so had to be rewritten in 2003. The second version has proven much better, is heavily used, and is interconnected with Iowa DOT project data systems. Now that the system is in operation, it will be maintained and operated by the ICEA Service Bureau as an ongoing service function.
Abstract:
The purpose of this thesis is to develop a new method for developing the coordination structure of business processes, to complement traditional process development methods that focus on the workflow of a process. To lay the groundwork for the method development, the thesis first reviews the process development methods presented in the literature as well as the coordination needs and coordination mechanisms of business processes and organizations. Based on these, the phases and content of the coordination-structure development method were devised in a small group. The applicability of the method was tested on one case process. In practical development work the method largely functioned according to the devised outline, and developing the coordination structure was found to be useful for the operation of the case process. Based on the results obtained from developing the test process, a further development plan was drawn up for the method, containing five potential directions for its further development.
Abstract:
The purpose of this thesis is to identify, through familiarization with the machine line and a survey, the most significant non-technical factors that slow down development and require more attention than at present in order to raise performance and make operations more efficient on the selected machine line. The goal of the thesis is to give a recommendation on measures by which improvements already made, and those made in the future, can be put into practice better than at present. Based on the survey responses, interviews are used to seek deeper answers and proposals for action concerning the areas of the machine line that the survey identified as requiring development. The problem is not the performance indicators, but turning improvements and new ways of working into everyday practice. Nowadays non-financial and non-technical factors play a significant role in improving and maintaining performance. Personnel are a significant resource, and therefore one of the most important tasks of supervisors is to motivate employees and thereby put latent resources to more effective use. More efficient work by the machine line personnel requires close cooperation between supervisors and employees, which is aided, for example, by a well-functioning flow of information. New operating models must be found so that employees' competence can be taken into account better.
Abstract:
Expatriation has become increasingly common due to the expansion of global trade. Many large companies base their production facilities in far-flung countries, where experts are sent from their own countries to launch the operations. Working in a foreign environment demands considerable adaptability from so-called expatriates. This study aimed to investigate whether, following expatriation, mental health difficulties were experienced by the employees themselves or by their family members. Using a questionnaire and interviews, the study investigated how expatriate employees of Finnish companies operating in different regions of Brazil, and their families, adjusted. Employees were required to have been on expatriate assignment for at least 6 months. Data were collected in Brazil during their stay, at least 3 months after arrival. The survey covered 121 expatriate employees working in 17 different companies, of whom 71 employees from 10 different companies responded to the questionnaire. All the employees of the two largest enterprises and their spouses were invited to focus groups; in total 43 persons (22 employees and 21 employees' spouses) participated in group or individual interviews. No significant mental health difficulties were found among the expatriate employees. Only a tenth of the expatriate employees reported strain. The experience of strain symptoms was found to be related to long working days, an intense working rhythm and a lack of friends. Work satisfaction seemed to be an important mediator in the coping process. While abroad, the expatriate employees were highly recognized for their work. Due to the immature organization of work they could often use their creative capacities to improve the workflow. The opportunity to see with their own eyes the effects of their contribution to the development of the enterprise made them feel good. The association between the expatriate employees' adjustment and that of their spouses was evident. The spouses' situation was markedly different from that of the expatriate employees themselves. Expatriation changed the family members' previous division of tasks considerably. The expatriate spouses had to change their roles more than the expatriate employees themselves, since most of them were highly educated women who went through an identity crisis owing to the at least temporary renunciation of their own work and career.
Abstract:
Wind power is the fastest-growing form of energy production in Europe. The wind power industry in Finland is expected to grow considerably in the coming years following the anticipated feed-in tariff decision, which will increase competition in the sector. The objective was to develop the manufacturing of wind turbine towers at Levator Oy by making welding production more efficient and improving the controllability of production. The development work included planning the commissioning of a second welding line and drawing up a set of instructions for supervisors. The purpose of planning the commissioning of the second welding line was to design the changes to current production needed to enable the introduction of the new line. Planning started with the selection of the welding processes, after which the equipment needs were planned on the basis of work-phase analyses. The production layout was changed from the current functional production to a production line consisting of production cells, which considerably improved material flow. Bottleneck-based (theory of constraints) control was chosen as the production control method. The purpose of drawing up the set of instructions was to collect and document all the information needed in production. The instructions include quality control, material flow control and work guidance sections, whose purpose is to make supervision easier. The instructions define uniform production practices, which makes production easier to control. The objectives were met when the changes required for commissioning the second production line were started according to plan in September 2009. The content of the instructions was defined and the pilots of the various sections were completed during December. The controllability of production improved considerably and at the same time productivity improved significantly.
Abstract:
In R&D organizations multiple projects are executed concurrently. Problems arise in managing shared resources, since they are needed by multiple projects simultaneously. The objective of this thesis was to study how project and resource management could be developed in a public-sector R&D organization. The qualitative research was carried out in the Magnetic Measurements section at CERN, which measures magnets for particle accelerators and builds state-of-the-art measurement devices for various needs. These R&D and measurement projects are very time-consuming and very complex. Based on previous research and the requirements of the organization, the best alternative for resource management was to build a project management information system. A centralized database was constructed and, on top of it, an application for interacting with and visualizing the project data. The application allows project data to be handled, which serves as a basis for resource planning before and while the projects are executed. It is one way to standardize the workflow of projects, which strengthens the project process. Additionally, it was noted that the internal customer's database, the measurement system and the new application needed to be integrated. Further integration ensures that the project data is received efficiently from customers and is available not only within the application but also during the concrete work. The research results introduced a new integrated application, which centralizes the project information flow with better visibility.
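A minimal sketch of the kind of centralized project and resource database described above; the tables, fields and query are illustrative assumptions, not the actual schema or API of the application built at CERN.

```python
import sqlite3

# Illustrative schema: projects, shared resources and their weekly allocations.
SCHEMA = """
CREATE TABLE project    (id INTEGER PRIMARY KEY, name TEXT, deadline TEXT);
CREATE TABLE resource   (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE allocation (project_id  INTEGER REFERENCES project(id),
                         resource_id INTEGER REFERENCES resource(id),
                         week TEXT, hours REAL);
"""

def planned_load(conn: sqlite3.Connection, week: str):
    """Total hours booked per shared resource in a given week."""
    return conn.execute(
        """SELECT r.name, SUM(a.hours)
           FROM allocation a JOIN resource r ON r.id = a.resource_id
           WHERE a.week = ? GROUP BY r.name""", (week,)).fetchall()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript(SCHEMA)
    conn.execute("INSERT INTO project VALUES (1, 'Magnet batch A', '2024-06-30')")
    conn.execute("INSERT INTO resource VALUES (1, 'measurement bench')")
    conn.execute("INSERT INTO allocation VALUES (1, 1, '2024-W10', 20.0)")
    print(planned_load(conn, "2024-W10"))   # e.g. [('measurement bench', 20.0)]
```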
Abstract:
This book argues for novel strategies to integrate engineering design procedures and structural analysis data into architectural design. Algorithmic procedures that have recently migrated into architectural practice are utilized to improve the interface between the two disciplines. Architectural design is predominantly conducted as a negotiation process among various factors but often lacks the rigor and data structures to link it to quantitative procedures. Numerical structural design, on the other hand, could act as a role model for handling data and robust optimization, but it often lacks the complexity of architectural design. The goal of this research is to bring together robust methods from structural design and the complex dependency networks of architectural design processes. The book presents three case studies of tools and methods that are developed to exemplify, analyze and evaluate a collaborative workflow.