8 results for order-picking system
in AMS Tesi di Laurea - Alm@DL - Università di Bologna
Abstract:
This thesis illustrates the work I carried out at Ocem Airfield Technology, within the components warehouse, aimed at reorganizing the supermarket of the assembly lines with a view to minimizing picking times. The continuous pursuit of flexibility driven by customer requirements has a drastic effect on the warehouse and its management. The warehouse is an extremely costly item in the company balance sheet, accounting for 20% of the company's internal costs. Warehouses play a crucial role in company operations: they are the holding point for products between the point of origin and the point of consumption, they ensure the storage of goods and, finally, they are a fundamental driver for reducing uncertainty. Among the most expensive activities is the piece picking of goods, or order picking, responsible for 55% of logistics operating costs. At Ocem, a project to reduce picking times was implemented, drawing on models from the literature. The project was initially tested on a low-volume pilot line to monitor the potential benefits. Once the actual reduction in picking times was confirmed, the project was replicated, with appropriate modifications, on the remaining assembly lines, achieving significant efficiency gains. The parameter adopted as performance index was the percentage reduction in picking time from the as-is to the to-be state.
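As a quick illustration of the performance index mentioned above, a one-line sketch of the as-is/to-be percentage reduction (variable names and figures are illustrative, not taken from the thesis):

# Percentage reduction of picking time from the as-is to the to-be state
# (illustrative variable names and values, not the thesis' actual data).
def picking_time_reduction(t_as_is: float, t_to_be: float) -> float:
    return 100.0 * (t_as_is - t_to_be) / t_as_is

print(picking_time_reduction(120.0, 84.0))  # -> 30.0 (% reduction)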
Abstract:
In the last decade, the demand for structural health monitoring expertise has increased exponentially in the United States. The aging issues affecting most transportation structures can seriously jeopardize the economic system of a region as well as of a country. At the same time, the monitoring of structures is a central topic of discussion in Europe, where the preservation of historical buildings has been addressed over the last four centuries. More recently, concerns about the security performance of civil structures have grown after tragic events such as 9/11 or the 2011 Japan earthquake: engineers look for designs able to resist exceptional loadings due to earthquakes, hurricanes and terrorist attacks. After events of this kind, assessing the remaining life of the structure is at least as important as the initial performance design. Consequently, it is clear that the introduction of reliable and accessible damage assessment techniques is crucial for localizing problems and for correct and immediate rehabilitation. System Identification is a branch of the more general Control Theory. In Civil Engineering, this field addresses the techniques needed to estimate mechanical characteristics such as stiffness or mass from the signals captured by sensors. The objective of Dynamic Structural Identification (DSI) is to determine, from experimental measurements, the fundamental modal parameters of a generic structure in order to characterize its dynamic behavior through a mathematical model. Knowledge of these parameters is useful in the Model Updating procedure, which defines corrected theoretical models through experimental validation. The main aim of this technique is to minimize the differences between the theoretical model results and in situ measurements of dynamic data. The updated model therefore becomes a very effective control tool for the rehabilitation of structures or for damage assessment. Instrumenting an entire structure is sometimes unfeasible, either because of the high cost involved or because it is not physically possible to reach every point of the structure. Numerous scholars have therefore tried to address this problem, and two main methods are generally used. In the first case, given the limited number of sensors, time histories are gathered only at some locations; the instruments are then moved to other locations and the procedure is repeated. Otherwise, if the number of sensors is sufficient and the structure does not have a complicated geometry, it is usually enough to detect only the first principal modes. These two problems are well presented in the works of Balsamo [1], for the application to a simple system, and Jun [2], for the analysis of a system with a limited number of sensors. Once the system identification has been carried out, the actual system characteristics become accessible. A frequent practice is to create an updated FEM model and assess whether or not the structure fulfills the required functions. The objective of this work is to present a general methodology to analyze large structures using limited instrumentation while obtaining as much information as possible about the identified structure, without resorting to methodologies of difficult interpretation. A general framework for the state-space identification procedure via the OKID/ERA algorithm is developed and implemented in Matlab.
Then, some simple examples are proposed to highlight the principal characteristics and advantages of this methodology. A new algebraic manipulation for a prolific use of substructuring results is developed and implemented.
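To make the OKID/ERA step more concrete, here is a minimal Python sketch of the Eigensystem Realization Algorithm for a single-input, single-output case. The function name, Hankel dimensions and the assumption that the Markov parameters have already been recovered (e.g. by OKID) are illustrative choices, not the thesis' Matlab implementation.

# Minimal ERA sketch: h is the sequence of scalar Markov parameters (h[0] = CB),
# dt the sampling step, n the chosen model order; h must contain at least
# rows + cols samples.
import numpy as np

def era(h, n, dt, rows=20, cols=20):
    # Block-Hankel matrices built from the Markov parameters
    H0 = np.array([[h[i + j]     for j in range(cols)] for i in range(rows)])
    H1 = np.array([[h[i + j + 1] for j in range(cols)] for i in range(rows)])
    # Truncated SVD defines the reduced-order balanced realization
    U, s, Vt = np.linalg.svd(H0)
    U, V = U[:, :n], Vt[:n, :].T
    Sih = np.diag(1.0 / np.sqrt(s[:n]))
    Sh = np.diag(np.sqrt(s[:n]))
    A = Sih @ U.T @ H1 @ V @ Sih          # discrete-time state matrix
    B = (Sh @ V.T)[:, :1]                 # input matrix (first column)
    C = (U @ Sh)[:1, :]                   # output matrix (first row)
    # Modal parameters from the discrete eigenvalues
    lam = np.linalg.eigvals(A).astype(complex)
    s_cont = np.log(lam) / dt
    freqs_hz = np.abs(s_cont) / (2 * np.pi)
    damping = -np.real(s_cont) / np.abs(s_cont)
    return A, B, C, freqs_hz, damping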
Abstract:
The central objective of research in Information Retrieval (IR) is to discover new techniques to retrieve relevant information that satisfies an Information Need. The Information Need is satisfied when relevant information can be provided to the user. In IR, relevance is a fundamental concept that has changed over time, from popular to personal: what was considered relevant before was information for the whole population, whereas what is considered relevant now is specific information for each user. Hence, there is a need to connect the behavior of the system to the condition of a particular person and his social context; from this need an interdisciplinary sector called Human-Centered Computing was born. For the modern search engine, the information extracted for the individual user is crucial. According to Personalized Search (PS), two different techniques are necessary to personalize a search: contextualization (the interconnected conditions that occur in an activity) and individualization (the characteristics that distinguish an individual). This shift of focus to the individual's needs undermines the rigid linearity of the classical model, which has been superseded by the "berry picking" model: search terms change thanks to the informational feedback received from the search activity, introducing the concept of evolution of search terms. The development of Information Foraging theory, which observed the correlations between animal foraging and human information foraging, also contributed to this transformation through attempts to optimize the cost-benefit ratio. This thesis arose from the need to satisfy human individuality when searching for information, and it develops a synergistic collaboration between the frontiers of technological innovation and recent advances in IR. The search method developed exploits what is relevant for the user by radically changing the way in which an Information Need is expressed: it is now expressed through the generation of the query and its own context. The method was in fact conceived to improve the quality of search by rewriting the query based on contexts automatically generated from a local knowledge base. Furthermore, the idea of optimizing any IR system led to developing it as a middleware of interaction between the user and the IR system; the system thus has just two possible actions: rewriting the query and reordering the results. Similar actions are described in the PS literature, which generally exploits information derived from the analysis of user behavior, while the proposed approach exploits knowledge provided by the user. The thesis goes further by proposing a novel assessment procedure, according to the "Cranfield paradigm", to evaluate this type of IR system. The results achieved are interesting considering both the effectiveness obtained and the innovative approach undertaken, together with the several applications inspired by the use of a local knowledge base.
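A minimal sketch of a query-rewriting and result-reordering middleware of the kind described above; all names (KnowledgeBase dictionary, search_backend callable, the scoring rule) are hypothetical illustrations, not the thesis' actual rewriting or ranking logic.

# Middleware between a user and an untouched underlying IR system: expand the
# query with context terms from a local knowledge base, then re-rank results.
from typing import Callable, Dict, List

def rewrite_query(query: str, knowledge_base: Dict[str, List[str]]) -> str:
    """Expand the query with context terms found in a local knowledge base."""
    expansions: List[str] = []
    for term in query.split():
        expansions.extend(knowledge_base.get(term.lower(), []))
    return " ".join([query] + expansions)

def reorder(results: List[dict], context_terms: List[str]) -> List[dict]:
    """Re-rank backend results by overlap with the user-provided context."""
    def score(doc: dict) -> int:
        text = (doc.get("title", "") + " " + doc.get("snippet", "")).lower()
        return sum(term in text for term in context_terms)
    return sorted(results, key=score, reverse=True)

def personalized_search(query: str,
                        knowledge_base: Dict[str, List[str]],
                        search_backend: Callable[[str], List[dict]]) -> List[dict]:
    expanded = rewrite_query(query, knowledge_base)
    results = search_backend(expanded)        # the underlying IR system is untouched
    context = [t for terms in knowledge_base.values() for t in terms]
    return reorder(results, context)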
Abstract:
After an initial theoretical introduction to Business Process Re-engineering (BPR), the thesis considers the options found in the literature regarding three macro-elements: the methodologies, the modelling notations and the tools employed for process mapping. The theoretical section is the basis for analyzing the same elements in the specific case of Rosetti Marino S.p.A., an EPC contractor operating in the Oil&Gas industry. Rosetti Marino implemented a tool developed internally in order to satisfy its needs in the most suitable way possible and built a map of all business processes, navigable on the Company Intranet. Moreover, it adopted a methodology based upon participation, interfunctional communication and sharing. The introduction of GIGA is analysed from a structural, human-resources, political and symbolic point of view.
Abstract:
One of the most serious problems of modern medicine is the growing emergence of antibiotic resistance among pathogenic bacteria. In this circumstance, different and innovative approaches for treating infections caused by multidrug-resistant bacteria are imperatively required. Bacteriophage therapy is one of the fascinating approaches to be taken into account. It consists of the use of bacteriophages, viruses that infect bacteria, in order to defeat specific bacterial pathogens. Phage therapy is not a new idea: it was widely used around the world in the 1930s and 1940s to treat various infectious diseases, and it is still used in Eastern Europe and the former Soviet Union. Nevertheless, Western scientists mostly lost interest in the further use and study of phage therapy and abandoned it after the discovery and spread of antibiotics. The advancement of scientific knowledge in recent years, together with encouraging results from recent animal studies using phages to treat bacterial infections, and above all the urgent need for novel and effective antimicrobials, has prompted additional rigorous research in this field. In particular, in the synthetic biology laboratory of the department of Life Sciences at the University of Warwick, a novel approach was adopted, starting from the original concept of phage therapy, in order to study a concrete alternative to antibiotics. The innovative idea of the project consists in developing experimental methodologies that make it possible to engineer a programmable synthetic phage system using a combination of directed evolution, automation and microfluidics. The main aim is to make "the therapeutics of tomorrow individualized, specific, and self-regulated" (Jaramillo, 2015). In this context, one of the most important key points is bacteriophage quantification. Therefore, in this research work, a mathematical model describing the complex dynamics occurring in biological systems involving the continuous growth of bacteriophages, modulated by the performance of the host organisms, was implemented as algorithms in a working software tool using MATLAB. The developed program is able to predict different unknown concentrations of phages much faster than the classical overnight plaque assay. What is more, it gives meaning and an explanation to the obtained data, making inference about the parameters of the model that are representative of the bacteriophage-host interaction.
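A minimal sketch of the kind of phage-host growth model the abstract refers to, here a classic two-equation ODE system; the parameter values, rates and the exact equations of the thesis' MATLAB model are not reproduced and are purely illustrative.

# Susceptible bacteria B and free phages P: logistic bacterial growth,
# mass-action adsorption, burst release and free-phage decay (illustrative).
from scipy.integrate import solve_ivp

def phage_host(t, y, r=0.7, K=1e9, delta=1e-9, beta=100, m=0.1):
    """r: bacterial growth rate (1/h), K: carrying capacity (CFU/ml),
    delta: adsorption rate (ml/h), beta: burst size, m: phage decay (1/h)."""
    B, P = y
    infection = delta * B * P                 # phages adsorbing onto bacteria
    dB = r * B * (1 - B / K) - infection
    dP = (beta - 1) * infection - m * P       # burst release minus adsorbed phage and decay
    return [dB, dP]

sol = solve_ivp(phage_host, (0.0, 12.0), [1e6, 1e4])   # 12 h from B0=1e6, P0=1e4
print("bacteria, phages after 12 h:", sol.y[:, -1])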
Abstract:
The performance of microchannel heat exchangers is assessed for gas-to-liquid applications on the order of several tens of kWth. The technology is suitable for exhaust heat recovery systems based on the organic Rankine cycle. In order to design a light and compact microchannel heat exchanger, an optimization process is developed. The model employed in the procedure is validated through computational fluid dynamics analysis with commercial software. It is shown that conjugate effects have a significant impact on the heat transfer performance of the device.
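For orientation, a hedged sketch of the kind of reduced-order rating model (counterflow epsilon-NTU) that such an optimization loop might call; the geometry-free formulation, capacity rates and temperatures below are illustrative and are not the validated model of the thesis.

import math

def effectiveness_counterflow(UA, C_hot, C_cold):
    """Counterflow epsilon-NTU relation; UA and capacity rates C in W/K."""
    C_min, C_max = min(C_hot, C_cold), max(C_hot, C_cold)
    Cr = C_min / C_max
    NTU = UA / C_min
    if math.isclose(Cr, 1.0):
        return NTU / (1 + NTU)
    return (1 - math.exp(-NTU * (1 - Cr))) / (1 - Cr * math.exp(-NTU * (1 - Cr)))

# Example: exhaust gas (0.1 kg/s, cp ~ 1100 J/kgK) heating an organic fluid (C ~ 500 W/K)
C_gas, C_liq, UA = 0.1 * 1100, 500.0, 250.0
eps = effectiveness_counterflow(UA, C_gas, C_liq)
Q = eps * min(C_gas, C_liq) * (350.0 - 80.0)   # inlet temperature difference in K
print(f"effectiveness = {eps:.2f}, duty = {Q/1e3:.1f} kW")   # a few tens of kWth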
Night Vision Imaging System (NVIS) certification requirements analysis of an Airbus Helicopters H135
Abstract:
The safe operation of nighttime flight missions can be enhanced by using Night Vision Imaging System (NVIS) equipment. This has been clear to the military since the 1970s and to civil helicopter operators since the 1990s. In recent months, Italian Emergency Medical Service (EMS) operators have also been requesting Night Vision Goggles (NVG), devices that amplify the ambient light. In order to fly with this technology, helicopters have to be NVIS-approved. The author supported a company in quantifying, through a feasibility study, the potential of undertaking the certification activity. First, the NVG devices and their working principles are described; then the specifications governing the processes required to make a helicopter NVIS-approved are analysed. The noteworthy difference between military specifications and civilian ones highlights non-negligible gaps in the latter. The NVIS certification activity could be a good investment, because the following targets have been achieved: reductions in the certification cost, in the operating time and in the number of non-compliances.
Abstract:
In recent years, systems engineering has become one of the major research domains. The complexity of systems has increased constantly, and nowadays Cyber-Physical Systems (CPS) are a category of particular interest: these are systems composed of a cyber part (computer-based algorithms) that monitors and controls physical processes. Their development and simulation are both complex because of the importance of the interaction between the cyber and the physical entities: there are many models, written in different languages, that need to exchange information with each other. Normally an orchestrator is used that takes care of simulating the models and exchanging the information. This orchestrator is developed manually, which is a tedious and long task. Our proposal is to generate the orchestrator automatically through the use of Co-Modeling, i.e. by modeling the coordination. Before achieving this ultimate goal, it is important to understand the mechanisms and de facto standards that could be used in a co-modeling framework. I therefore studied the use of a technology employed for co-simulation in industry: FMI. In order to better understand the FMI standard, I implemented an automatic export, in the FMI format, of the models created in an existing tool for discrete modeling: TimeSquare. I also developed a simple physical model in the existing open-source OpenModelica tool. Later, I started to understand how an orchestrator works by developing a simple one; this will be useful in the future for generating an orchestrator automatically.
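A minimal sketch of a fixed-step co-simulation master (orchestrator) of the kind described above; the Simulator protocol and function names below are hypothetical stand-ins, whereas a real FMI master would call the corresponding FMI functions (set/get variables, doStep) of each FMU instead.

from typing import Dict, List, Protocol, Tuple

class Simulator(Protocol):
    def set_input(self, name: str, value: float) -> None: ...
    def get_output(self, name: str) -> float: ...
    def do_step(self, t: float, dt: float) -> None: ...

def orchestrate(models: Dict[str, Simulator],
                couplings: List[Tuple[str, str, str, str]],   # (src, output, dst, input)
                t_end: float, dt: float) -> None:
    """Fixed-step master algorithm: exchange coupled signals, then step every model."""
    t = 0.0
    while t < t_end:
        # 1) propagate each model's outputs to the inputs of the coupled models
        for src, out, dst, inp in couplings:
            models[dst].set_input(inp, models[src].get_output(out))
        # 2) advance every model by one communication step
        for model in models.values():
            model.do_step(t, dt)
        t += dt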