883 results for applications in logistics


Abstract:

Rachit Agarwal, Rafael V. Martinez-Catala, Sean Harte, Cedric Segard, Brendan O'Flynn, "Modeling Power in Multi-functionality Sensor Network Applications," Proceedings of the Second International Conference on Sensor Technologies and Applications (SENSORCOMM 2008), pp. 507-512, August 25-31, 2008, Cap Esterel, France.

Abstract:

We describe a 42.6 Gbit/s all-optical pattern recognition system which uses semiconductor optical amplifiers (SOAs). A circuit with three SOA-based logic gates is used to identify the presence of specific port numbers in an optical packet header.
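
As a rough software analogue of the header-recognition function that the SOA gate circuit evaluates optically (the header layout, field offset and port values below are invented for illustration, not taken from the paper):

    # Software analogue of header pattern recognition: a packet is flagged
    # when its destination-port field matches one of the target patterns.
    # The 16-bit field offset and the port values are illustrative only.
    TARGET_PORTS = {80, 443}          # hypothetical ports of interest

    def port_matches(header: bytes, offset: int = 2) -> bool:
        """Extract a 16-bit port field and test it against the targets."""
        port = int.from_bytes(header[offset:offset + 2], "big")
        # XOR with a target is zero only on an exact bit-for-bit match,
        # mirroring the gate-level comparison performed optically.
        return any((port ^ target) == 0 for target in TARGET_PORTS)

    print(port_matches(b"\x00\x01\x01\xbb"))  # 0x01bb == 443 -> True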

Abstract:

Computational Fluid Dynamics (CFD) is gradually becoming a powerful and almost essential tool for the design, development and optimization of engineering applications. However, the mathematical modelling of erratic turbulent motion remains the key issue when tackling such flow phenomena. The reliability of CFD analysis depends heavily on the turbulence model employed together with the wall functions implemented. In order to resolve the abrupt changes in turbulent energy and other parameters in near-wall regions, a particularly fine mesh is necessary, which inevitably increases the computer storage and run-time requirements. Turbulence modelling can be considered one of the three key elements in CFD; precise mathematical theories have evolved for the other two, grid generation and algorithm development. The principal objective of turbulence modelling is to provide computational procedures that are efficient and accurate enough to reproduce the main structures of three-dimensional fluid flows. The flow within an electronic system can be characterized as being in a transitional state due to the low velocities and relatively small dimensions encountered. This paper presents simulated CFD results for an investigation into the predictive capability of turbulence models when considering both fluid flow and heat transfer phenomena. A new two-layer hybrid k-ε / k-l turbulence model for electronic application areas is also presented, which has the advantages of requiring only a modest computational mesh and being economical with regard to run-time.
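
The model equations are not reproduced in the abstract; for orientation, the generic relations that a two-layer hybrid k-ε / k-l formulation of this kind typically combines are the standard k-ε eddy-viscosity expression away from the wall and a one-equation k-l prescription in the near-wall layer (textbook forms, given here as an assumption rather than the paper's exact formulation):

    \nu_t = C_\mu \frac{k^2}{\varepsilon} \quad \text{(outer region, two-equation } k\text{-}\varepsilon\text{)}

    \nu_t = C_\mu\,\ell_\mu \sqrt{k}, \qquad \varepsilon = \frac{k^{3/2}}{\ell_\varepsilon} \quad \text{(near-wall region, one-equation } k\text{-}l\text{)}

Here the length scales \ell_\mu and \ell_\varepsilon are prescribed algebraically from the wall distance, so only the k equation is solved in the fine near-wall layer, which is what keeps such a model cheap in mesh and run-time.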

Abstract:

The pseudo-spectral solution method offers a flexible and fast alternative to the more usual finite element/volume/difference methods, particularly when the long-time transient behaviour of a system is of interest. Since the exact solution is obtained at the grid collocation points, superior accuracy can be achieved on a modest grid resolution. Furthermore, the grid can be freely adapted in time and space to particular flow conditions or geometric variations. This is especially advantageous where strongly coupled, time-dependent, multi-physics solutions are investigated. Examples include metallurgical applications involving the interaction of electromagnetic fields and conducting liquids with a free surface. The electromagnetic field then determines the instantaneous liquid volume shape, and the liquid shape in turn affects the electromagnetic field. In AC applications a thin "skin effect" region results on the free surface that dominates grid requirements. Infinitesimally thin boundary cells can be introduced using Chebyshev polynomial expansions without detriment to the numerical accuracy. This paper presents the general methodology of the pseudo-spectral approach and outlines the solution procedures used. Several instructive example applications are given: the aluminium electrolysis MHD problem, induction melting and stirring, and the dynamics of magnetically levitated droplets in AC and DC fields. Comparisons to available analytical solutions and to experimental measurements will be discussed.
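
As a minimal illustration of the Chebyshev collocation machinery mentioned above (a generic sketch following the standard construction, not code from the paper), the Chebyshev differentiation matrix on the Gauss-Lobatto points can be built and applied as follows:

    import numpy as np

    def cheb(N):
        """Chebyshev differentiation matrix and Gauss-Lobatto points on [-1, 1]."""
        if N == 0:
            return np.zeros((1, 1)), np.array([1.0])
        x = np.cos(np.pi * np.arange(N + 1) / N)          # collocation points
        c = np.hstack([2.0, np.ones(N - 1), 2.0]) * (-1.0) ** np.arange(N + 1)
        X = np.tile(x, (N + 1, 1)).T
        dX = X - X.T
        D = np.outer(c, 1.0 / c) / (dX + np.eye(N + 1))   # off-diagonal entries
        D -= np.diag(D.sum(axis=1))                       # diagonal: negative row sums
        return D, x

    D, x = cheb(16)
    u = np.exp(x) * np.sin(5 * x)
    exact = np.exp(x) * (np.sin(5 * x) + 5 * np.cos(5 * x))
    print(np.max(np.abs(D @ u - exact)))   # small error on a modest grid

For smooth data the error of D @ u decays spectrally with N, i.e. faster than any fixed power of the grid size, which is the accuracy advantage on modest grids that the abstract refers to.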

Abstract:

Solder paste is the most widely used bonding material in the assembly of surface mount devices in the electronics industry. It generally has a flocculated structure (showing aggregation of solder particles) and hence is known to exhibit thixotropic behaviour, recognised by the decrease in the apparent viscosity of the paste with time when it is subjected to a constant shear rate. Proper characterisation of this time-dependent rheological behaviour of solder pastes is crucial for establishing the relationships between the pastes' structure and flow behaviour, and for correlating the physical parameters with paste printing performance. In this paper, we present a novel method which has been developed for characterising the time-dependent and non-Newtonian rheological behaviour of solder pastes as a function of shear rate. The objective of the study reported in this paper is to investigate the thixotropic build-up behaviour of solder pastes. The stretched exponential model (SEM) has been used to model the structural changes during the build-up process and to correlate model parameters with the paste printing process.
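
The functional form is not given in the abstract; a common way to write a stretched exponential for thixotropic structure build-up, and to fit it to recovery data, is sketched below (the parameter names and the synthetic data are illustrative assumptions, not the paper's measurements):

    import numpy as np
    from scipy.optimize import curve_fit

    def sem(t, eta_0, eta_inf, tau, beta):
        """Stretched-exponential build-up of apparent viscosity after shearing."""
        return eta_inf - (eta_inf - eta_0) * np.exp(-(t / tau) ** beta)

    # Synthetic recovery data standing in for viscosity measured at rest
    t = np.linspace(1.0, 600.0, 60)                     # s
    true = sem(t, 200.0, 900.0, 120.0, 0.6)             # Pa.s, illustrative values
    data = true * (1 + 0.02 * np.random.default_rng(0).standard_normal(t.size))

    popt, _ = curve_fit(sem, t, data, p0=[150.0, 800.0, 100.0, 0.5])
    print(dict(zip(["eta_0", "eta_inf", "tau", "beta"], popt)))

The characteristic time tau and the stretching exponent beta are the kind of model parameters that can then be correlated with printing performance.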

Abstract:

Thermosetting polymer materials are widely utilised in modern microelectronics packaging technology. These materials are used for a number of functions, such as device bonding, structural support and physical protection of semiconductor dies. Typically, convection heating systems are used to raise the temperature of the materials to expedite the polymerisation process. The convection cure process has a number of drawbacks, including process durations generally in excess of 1 hour and the requirement to heat the entire printed circuit board assembly, inducing thermomechanical stresses which affect device reliability. Microwave energy is able to raise the temperature of materials in a rapid, controlled manner. As the microwave energy penetrates into the polymer materials, the heating can be considered volumetric, i.e. the rate of heating is approximately constant throughout the material. This enables a maximum heating rate far greater than is available with convection oven systems, which only raise the surface temperature of the polymer material and rely on thermal conductivity to transfer heat energy into the bulk. The high heating rate, combined with the ability to vary the operating power of the microwave system, enables extremely rapid cure processes. Microwave curing of a commercially available encapsulation material has been studied experimentally and through the use of numerical modelling techniques. The material assessed is Henkel EO-1080, a single-component thermosetting epoxy. The producer suggests three typical convection oven cure options for EO-1080: 20 min at 150 °C, 90 min at 140 °C or 120 min at 110 °C. Rapid curing of materials of this type using advanced microwave systems, such as the FAMOBS system [1], is of great interest to microelectronics manufacturers as it has the potential to reduce manufacturing costs, increase device reliability and enable new device designs. Experimental analysis has demonstrated that, in a realistic chip-on-board encapsulation scenario, the polymer material can be fully cured in approximately one minute. This corresponds to a reduction in cure time of approximately 95 percent relative to the convection oven process. Numerical assessment of the process [2] also suggests that cure times of approximately 70 seconds are feasible, whilst indicating that the decrease in process duration comes at the expense of variation in the degree of cure within the polymer.
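
The referenced numerical model [2] is not detailed here; as a hedged illustration of why the attainable bulk temperature dominates cure time, a generic nth-order Arrhenius cure-kinetics law can be integrated as below (all kinetic constants are invented for illustration and are not EO-1080 data):

    import numpy as np

    R = 8.314                       # J/(mol K)
    A, Ea, n = 5.0e7, 8.0e4, 1.5    # illustrative kinetic constants, not EO-1080 data

    def time_to_cure(T_celsius, alpha_target=0.95, dt=0.1):
        """Integrate d(alpha)/dt = A*exp(-Ea/RT)*(1-alpha)^n at constant temperature."""
        T = T_celsius + 273.15
        k = A * np.exp(-Ea / (R * T))
        alpha, t = 0.0, 0.0
        while alpha < alpha_target:
            alpha += k * (1.0 - alpha) ** n * dt
            t += dt
        return t

    for T in (110, 140, 150, 180):
        print(f"{T} C -> {time_to_cure(T) / 60:.1f} min to 95% cure")

With these invented constants the time to 95% cure falls from a few hours at 110 °C to a few minutes at 180 °C; the qualitative point is that the higher, volumetric heating rate of the microwave process translates directly into much shorter cure times.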

Abstract:

Rule testing in transport scheduling is a complex and potentially costly business problem. This paper proposes an automated method for the rule-based testing of business rules, using the Extensible Markup Language (XML) for rule representation and transportation. A compiled approach to rule execution is also proposed for performance-critical scheduling systems.
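
The rule schema is not given in the abstract; the fragment below is a hypothetical illustration of representing one scheduling rule in XML and testing a record against it with Python's standard library (element names, fields and the rule itself are invented). It is an interpreted sketch, whereas the paper additionally proposes a compiled approach for performance-critical systems.

    import xml.etree.ElementTree as ET

    # Hypothetical XML representation of one business rule
    RULE_XML = """
    <rule id="max-shift-hours">
      <condition field="shift_hours" operator="le" value="10"/>
      <message>Driver shift must not exceed 10 hours</message>
    </rule>
    """

    OPERATORS = {"le": lambda a, b: a <= b, "ge": lambda a, b: a >= b}

    def test_rule(rule_xml, record):
        """Return (passed, message) for a single rule applied to a schedule record."""
        rule = ET.fromstring(rule_xml)
        cond = rule.find("condition")
        op = OPERATORS[cond.get("operator")]
        passed = op(record[cond.get("field")], float(cond.get("value")))
        return passed, rule.findtext("message")

    print(test_rule(RULE_XML, {"shift_hours": 11}))
    # -> (False, 'Driver shift must not exceed 10 hours')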

Abstract:

Inherently error-resilient applications in areas such as signal processing, machine learning and data analytics provide opportunities for relaxing reliability requirements, and thereby reducing the overhead incurred by conventional error correction schemes. In this paper, we exploit the tolerable imprecision of such applications by designing an energy-efficient fault-mitigation scheme for unreliable data memories to meet a target yield. The proposed approach uses a bit-shuffling mechanism to isolate faults into bit locations with lower significance. This skews the bit-error distribution towards the low-order bits, substantially limiting the output error magnitude. By controlling the granularity of the shuffling, the proposed technique enables trading off quality for power, area and timing overhead. Compared to error-correction codes, this can reduce the overhead by as much as 83% in read power, 77% in read access time, and 89% in area, when applied to various data mining applications in a 28 nm process technology.
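
The hardware mechanism is not detailed in the abstract; the following simplified software sketch illustrates the underlying idea of steering known-faulty cells towards low-significance bit positions, so that a faulty physical bit perturbs only the low-order part of the stored word (word width, fault model and names are illustrative, not the paper's implementation):

    WIDTH = 16

    def shuffle_map(faulty_bits, width=WIDTH):
        """Permutation: logical bit i (0 = LSB) -> physical bit position.
        Healthy cells receive the high-significance logical bits,
        faulty cells receive the low-significance ones."""
        healthy = [b for b in range(width) if b not in faulty_bits]
        return sorted(faulty_bits) + healthy   # low logical bits -> faulty cells

    def store(value, perm):
        """Scatter logical bits into physical positions (write path)."""
        word = 0
        for logical, physical in enumerate(perm):
            word |= ((value >> logical) & 1) << physical
        return word

    def load(word, perm):
        """Gather physical bits back into logical order (read path)."""
        return sum(((word >> physical) & 1) << logical
                   for logical, physical in enumerate(perm))

    perm = shuffle_map({15, 14})            # two faulty high-order cells
    stored = store(0xBEEF, perm)
    corrupted = stored ^ (1 << 15)          # physical bit 15 flips due to a fault
    print(hex(load(corrupted, perm)))       # error confined to a low-order logical bit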

Abstract:

As a leading facility in laser-driven nuclear physics, ELI-NP will develop innovative research in the fields of materials behaviour in extreme environments and radiobiology, with applications in the development of accelerator components, new materials for next-generation fusion and fission reactors, shielding solutions for equipment and human crews in long-term space missions, and new biomedical technologies. The specific properties of the laser-driven radiation produced with two 1 PW lasers, each at a pulse repetition rate of 1 Hz, are an ultra-short time scale, a relatively broadband spectrum and the possibility of providing several types of radiation simultaneously. Complex, cosmic-like radiation will be produced in a ground-based laboratory, allowing comprehensive investigations of its effects on materials and biological systems. The expected maximum energy and intensity of the radiation beams are 19 MeV with 10^9 photons/pulse for photon radiation, 2 GeV with 10^8 electrons/pulse for electron beams, 60 MeV with 10^12 protons/pulse for proton and ion beams, and 60 MeV with 10^7 neutrons/pulse for a neutron source. Research efforts will also be directed towards radioprotection measurements of the prompt and activated dose as a function of laser and target characteristics, and towards the development and testing of various dosimetric methods and equipment.

Abstract:

Value chain collaboration has been a prevailing topic for research, and there is a constantly growing interest in developing collaborative models for improved efficiency in logistics. One area of collaboration is demand information management, which enables improved visibility and a decrease of inventories in the value chain. Outsourcing of non-core competencies has changed the nature of collaboration from an intra-enterprise to a cross-enterprise activity, and this, together with increasing competition in globalizing markets, has created a need for methods and tools for collaborative work. The retailer part of the value chain of consumer packaged goods (CPG) has been studied relatively widely, proven models have been defined, and several best-practice collaboration cases exist. Information and communications technology has developed rapidly, offering efficient solutions and applications for exchanging information between value chain partners. However, the majority of the CPG industry still works with traditional business models and practices. This concerns especially companies operating in the upstream of the CPG value chain. Demand information for consumer packaged goods originates at retailers' counters, based on consumers' buying decisions. As this information does not get transferred along the value chain towards the upstream parties, each player needs to optimize its own part, which leads to safety margins in inventories and speculation in purchasing decisions. The safety margins increase with each player, resulting in a phenomenon known as the bullwhip effect. The further the company is from the original demand information source, the more distorted the information is. This thesis concentrates on the upstream parts of the value chain of consumer packaged goods, and more precisely on the packaging value chain. Packaging is becoming a part of the product, with informative and interactive features, and is therefore not just a cost item needed to protect the product. The upstream part of the CPG value chain is distinctive in that the product changes after each involved party, and therefore the original demand information from the retailers cannot be utilized as such, even if it were transferred seamlessly. The objective of this thesis is to examine the main drivers for collaboration and the barriers causing the moderate adoption level of collaborative models. Another objective is to define a collaborative demand information management model and test it in a pilot business situation in order to see if the barriers can be eliminated. The empirical part of this thesis contains three parts, all related to the research objective but involving different target groups, viewpoints and research approaches. The study shows evidence that the main barriers to collaboration are very similar to the barriers in the lower part of the same value chain: lack of trust, lack of a business case and lack of senior management commitment. Eliminating one of them, the lack of a business case, is not enough to eliminate the two other barriers, as the operational model in this thesis shows. The uncertainty of the future, fear of losing an independent position in purchasing decision making and lack of commitment remain strong enough barriers to prevent the implementation of the proposed collaborative business model.
The study proposes a new way of defining the value chain processes: it divides the contracting and planning process into two processes, one managing the commercial parts and the other managing the quantity- and specification-related issues. This model can reduce the resistance to collaboration, as the commercial part of the contracting process would remain the same as in the traditional model. The quantity- and specification-related issues would be managed by the parties with the best capabilities and resources, as well as access to the original demand information. The parties in between would be involved in the planning process as well, as their impact on the next party upstream is significant. The study also highlights the future challenges for companies operating in the CPG value chain. The markets are becoming global, with toughening competition, and technology development will most likely continue at a speed exceeding the adaptation capabilities of the industry. Value chains are also becoming increasingly dynamic, which means shorter and more agile business relationships, and at the same time consumer demand is becoming more difficult to predict due to shorter product life cycles and trends. These changes will certainly have an effect on companies' operational models, but it is very difficult to estimate when and how the proven methods will gain wide enough adoption to become standards.

Abstract:

This thesis consists mainly of three articles dealing with Markov additive processes, Lévy processes and applications in finance and insurance. The first chapter is an introduction to Markov additive processes (MAPs) and a presentation of the ruin problem and of fundamental notions of financial mathematics. The second chapter is essentially the article "Lévy Systems and the Time Value of Ruin for Markov Additive Processes", written in collaboration with Manuel Morales and published in the European Actuarial Journal. This article studies the ruin problem for a Markov additive risk process. An identification of Lévy systems is obtained and used to give an expression for the expected discounted penalty function when the MAP is a Lévy process with regime switching. This generalises existing results in the literature for Lévy risk processes and for Markov additive risk processes with phase-type jumps. The third chapter contains the article "On a Generalization of the Expected Discounted Penalty Function to Include Deficits at and Beyond Ruin", which has been submitted for publication. This article presents an extension of the expected discounted penalty function for a subordinator risk process perturbed by a Brownian motion. The extension contains a series of expected discounted functions of the successive minima reached through jumps of the risk process after ruin. It has important applications in risk management and is used to determine the expected value of the discounted capital injection. Finally, the fourth chapter contains the article "The Minimal entropy martingale measure (MEMM) for a Markov-modulated exponential Lévy model", written in collaboration with Romuald Hervé Momeya and published in the journal Asia-Pacific Financial Market. This article presents new results related to the problem of incompleteness in a financial market where the price process of the risky asset is described by an exponential Markov additive model. These results characterise the martingale measure satisfying the entropy criterion. This measure is used to compute the price of an option, as well as hedging portfolios, in an exponential Lévy model with regime switching.
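
For orientation, the expected discounted penalty function referred to throughout is, in its standard (non-modulated) form, the Gerber-Shiu functional below; the thesis works with generalisations of it to Markov additive risk processes:

    \phi(u) = \mathbb{E}\!\left[ e^{-\delta\tau}\, w\!\left(U_{\tau-},\, |U_{\tau}|\right) \mathbf{1}_{\{\tau<\infty\}} \,\middle|\, U_0 = u \right], \qquad \tau = \inf\{ t \ge 0 : U_t < 0 \},

where U is the surplus (risk) process, \delta \ge 0 a discount rate, and w a penalty function of the surplus immediately before ruin and the deficit at ruin.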

Abstract:

Optical spectroscopy is a very important measurement technique with high potential for numerous applications in industry and science. Low-cost, miniaturised spectrometers, for example, are needed in particular for modern sensor systems and "smart personal environments", which are used above all in energy technology, metrology, safety and security, IT and medical technology. Among all miniaturised spectrometers, one of the most attractive miniaturisation approaches is the Fabry-Pérot filter. In this approach, the combination of a Fabry-Pérot (FP) filter array and a detector array can function as a microspectrometer. Each detector corresponds to a single filter and detects the very narrow band of wavelengths transmitted by that filter. An array of FP filters is used in which each filter selects a different spectral filter line. The spectral position of each wavelength band is defined by the individual cavity height of the filter. The arrays have been developed with filter sizes limited only by the array dimensions of the individual detectors. However, existing Fabry-Pérot filter microspectrometers require complicated fabrication steps for structuring the 3D filter cavities with different heights, which are not cost-efficient for industrial production. To reduce costs while retaining the outstanding advantages of the FP filter structure, a new method for fabricating the miniaturised FP filters by means of NanoImprint technology has been developed and is presented here. In this case, the multiple cavity fabrication steps are replaced by a single step that exploits the high vertical resolution of 3D NanoImprint technology. Because NanoImprint technology is used, the FP-filter-based miniaturised spectrometer is called a nanospectrometer. A static nanospectrometer consists of a static FP filter array on a detector array (see Fig. 1). Each FP filter in the array consists of a lower distributed Bragg reflector (DBR), a resonance cavity and an upper DBR. The upper and lower DBRs are identical and consist of periodically alternating thin dielectric layers of high- and low-refractive-index materials. The optical thickness of each dielectric thin-film layer contained in the DBR corresponds to a quarter of the design wavelength. Each FP filter is assigned to a defined area of the detector array; this area can contain individual detector elements or groups of them. The lateral geometries of the cavity are therefore built to correspond to the detector, and the lateral and vertical dimensions of the cavity are defined precisely by 3D NanoImprint technology. The cavities differ by only a few nanometres in the vertical direction. The precision of the cavity in the vertical direction is an important factor that influences the accuracy of the spectral position and the transmittance of the filter's transmission line.
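
The design rules are stated only qualitatively above; the standard relations behind them (given here as textbook background rather than taken from the abstract) are the quarter-wave condition for the DBR layers and the Fabry-Pérot resonance condition linking each transmitted wavelength to the cavity height L, written for normal incidence:

    n_i d_i = \frac{\lambda_0}{4} \quad \text{(optical thickness of each DBR layer)}

    2\, n_c L = m\, \lambda_m, \qquad m = 1, 2, \dots \quad \text{(transmission peaks of a cavity with refractive index } n_c\text{)}

A change of only a few nanometres in L therefore shifts the transmitted line by a small, well-defined amount, which is why the vertical precision of the imprinted cavity directly sets the spectral accuracy.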

Abstract:

In the Biodiversity World (BDW) project we have created a flexible and extensible Web Services-based Grid environment for biodiversity researchers to solve problems in biodiversity and analyse biodiversity patterns. In this environment, heterogeneous and globally distributed biodiversity-related resources such as data sets and analytical tools are made available to be accessed and assembled by users into workflows to perform complex scientific experiments. One such experiment is bioclimatic modelling of the geographical distribution of individual species using climate variables, in order to predict past and future climate-related changes in species distribution. Data sources and analytical tools required for such analysis of species distribution are widely dispersed, are available on heterogeneous platforms, present data in different formats and lack interoperability. The BDW system brings all these disparate units together so that the user can combine tools with little thought as to their availability, data formats and interoperability. The current Web Services-based Grid environment enables execution of the BDW workflow tasks on remote nodes, but with a limited scope. The next step in the evolution of the BDW architecture is to enable workflow tasks to utilise computational resources available within and outside the BDW domain. We describe the present BDW architecture and its transition to a new framework which provides a distributed computational environment for mapping and executing workflows, in addition to bringing together heterogeneous resources and analytical tools.
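
As a minimal sketch of the kind of analysis step such a workflow can execute, the fragment below fits a simple BIOCLIM-style climate envelope to species occurrence records and projects it over a climate grid (plain Python standing in for the distributed BDW tools; the variable names and data are invented for illustration):

    import numpy as np

    def fit_envelope(occurrence_climate):
        """Per-variable min/max envelope from climate values at known occurrence sites.
        occurrence_climate: array of shape (n_sites, n_climate_variables)."""
        return occurrence_climate.min(axis=0), occurrence_climate.max(axis=0)

    def predict(envelope, grid_climate):
        """A grid cell is 'suitable' if every climate variable lies inside the envelope."""
        lo, hi = envelope
        return np.all((grid_climate >= lo) & (grid_climate <= hi), axis=-1)

    # Invented example: two variables (annual mean temperature, annual precipitation)
    occurrences = np.array([[14.2, 800.0], [15.1, 950.0], [13.8, 700.0]])
    grid = np.array([[14.5, 820.0], [20.0, 400.0], [13.9, 910.0]])
    print(predict(fit_envelope(occurrences), grid))    # [ True False  True]

In the BDW setting the occurrence records, climate layers and the modelling tool would each be separate, distributed resources; the workflow engine's job is to chain them in exactly this fetch-fit-project order.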