915 results for Engineering design--Data processing


Relevance: 100.00%

Abstract:

Lean is usually associated with the ‘operations’ of a manufacturing enterprise; however, there is a growing awareness that its principles may be transferred readily to other functions and sectors. The application to knowledge-based activities such as engineering design is of particular relevance to UK plc. Hence, the purpose of this study has been to establish the state of the art in the adoption of Lean in new product development, by carrying out a systematic review of the literature. The authors' findings confirm the view that Lean can be applied beneficially away from the factory; that an understanding and definition of value is key to success; that a set-based (or Toyota methodology) approach to design is favoured, together with the strong leadership of a chief engineer; and that successful implementation requires organization-wide changes to systems, practices, and behaviour. On this basis, this review provides a useful platform for further research on the topic.

Relevance: 100.00%

Abstract:

This chapter discusses the engineering design and performance of various types of biomass transformation reactors. These reactors vary in operating principle depending on the processing capacity and the nature of the desired end product, that is, gas, chemicals, or liquid bio-oil. A mass balance around a thermal conversion reactor is usually carried out to identify the degree of conversion and to obtain the amounts of the various components in the product. The energy balance around the reactor is essential for determining the optimum reactor temperature and the amount of heat required to complete the overall reactions. Experimental and pilot-plant testing is essential for proper reactor design; however, it is common practice to use correlations and validated parameter values to determine realistic reactor dimensions and configurations. Despite recent progress in thermochemical conversion technology, reactor performance and scale-up potential remain subjects of continuing research.
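The mass balance described above can be sketched in a few lines. The feed rate and product-stream masses below are invented for illustration, not data from the chapter:

```python
# Hypothetical mass balance around a biomass pyrolysis reactor.
# All stream values are illustrative numbers, not data from the chapter.

def mass_balance(feed_kg, products_kg):
    """Return (closure, yields) for a steady-state reactor mass balance.

    closure -- fraction of the feed accounted for by the product streams
    yields  -- mass yield of each product stream per kg of feed
    """
    total_out = sum(products_kg.values())
    closure = total_out / feed_kg
    yields = {name: m / feed_kg for name, m in products_kg.items()}
    return closure, yields

# Example: 100 kg/h of dry biomass converted into bio-oil, char and gas.
closure, yields = mass_balance(
    100.0, {"bio-oil": 62.0, "char": 14.0, "gas": 21.0}
)
print(f"closure = {closure:.2f}")  # 3% of the feed is unaccounted losses
print(f"bio-oil yield = {yields['bio-oil']:.2f} kg/kg feed")
```

A closure well below 1.0 signals measurement error or unmeasured streams, which is exactly what the balance is carried out to detect.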

Relevance: 100.00%

Abstract:

The authors analyse some of the research outcomes achieved during the implementation of the EC GUIDE research project “Creating an European Identity Management Architecture for eGovernment”, together with their personal experience. The project goals and achievements are, however, considered in a broader context. The key role of Identity in the Information Society is emphasised, as is the fact that research and development in this field is still in its initial phase. The scope of research related to Identity, including Identity Management and the interoperability of Identity Management Systems, is expected to be extended further. The authors analyse these issues in the context established by the EC European Interoperability Framework (EIF), a reference document on interoperability for the Interoperable Delivery of European eGovernment Services to Public Administrations, Business and Citizens (IDABC) Work Programme. This programme aims to support the pan-European delivery of electronic government services.

Relevance: 100.00%

Abstract:

2000 Mathematics Subject Classification: 62J05, 62J10, 62F35, 62H12, 62P30.

Relevance: 100.00%

Abstract:

Traffic incidents are non-recurring events that can cause a temporary reduction in roadway capacity. They have been recognized as a major contributor to traffic congestion on our nation’s highway systems. To alleviate their impacts on capacity, automatic incident detection (AID) has been applied as an incident management strategy to reduce the total incident duration. AID relies on an algorithm to identify the occurrence of incidents by analyzing real-time traffic data collected from surveillance detectors. Significant research has been performed to develop AID algorithms for incident detection on freeways; however, similar research on major arterial streets remains largely at the initial stage of development and testing. This dissertation research aims to identify design strategies for the deployment of an Artificial Neural Network (ANN) based AID algorithm for major arterial streets. A section of the US-1 corridor in Miami-Dade County, Florida was coded in the CORSIM microscopic simulation model to generate data for both model calibration and validation. To better capture the relationship between the traffic data and the corresponding incident status, Discrete Wavelet Transform (DWT) and data normalization were applied to the simulated data. Multiple ANN models were then developed for different detector configurations, historical data usage, and the selection of traffic flow parameters. To assess the performance of different design alternatives, the model outputs were compared based on both detection rate (DR) and false alarm rate (FAR). The results show that the best models were able to achieve a high DR of between 90% and 95%, a mean time to detect (MTTD) of 55-85 seconds, and a FAR below 4%. The results also show that a detector configuration including only the mid-block and upstream detectors performs almost as well as one that also includes a downstream detector. 
In addition, DWT was found to be able to improve model performance, and the use of historical data from previous time cycles improved the detection rate. Speed was found to have the most significant impact on the detection rate, while volume was found to contribute the least. The results from this research provide useful insights on the design of AID for arterial street applications.
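The three evaluation metrics named above can be made concrete with a small sketch. The incident log, alarm cycles, and 30-second cycle length below are hypothetical, and the FAR definition used (false alarms per incident-free cycle) is one common convention, not necessarily the dissertation's exact formulation:

```python
# Illustrative computation of detection rate (DR), false alarm rate (FAR)
# and mean time to detect (MTTD). All data here is invented.

def evaluate(incidents, alarms, total_cycles, cycle_s=30):
    """incidents: list of (start_cycle, end_cycle); alarms: set of flagged cycles."""
    detected, ttd = 0, []
    incident_cycles = set()
    for start, end in incidents:
        span = range(start, end + 1)
        incident_cycles.update(span)
        hit = next((c for c in span if c in alarms), None)  # first alarm in span
        if hit is not None:
            detected += 1
            ttd.append((hit - start) * cycle_s)
    dr = detected / len(incidents)
    false_alarms = sum(1 for c in alarms if c not in incident_cycles)
    far = false_alarms / (total_cycles - len(incident_cycles))
    mttd = sum(ttd) / len(ttd) if ttd else float("nan")
    return dr, far, mttd

# Two incidents over a 100-cycle test run; alarm at cycle 80 is spurious.
dr, far, mttd = evaluate([(10, 20), (50, 60)], {12, 55, 80}, total_cycles=100)
print(f"DR = {dr:.0%}, FAR = {far:.2%}, MTTD = {mttd:.0f} s")
```

Trading DR against FAR by shifting the ANN's decision threshold is precisely the comparison the model outputs above were subjected to.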

Relevance: 100.00%

Abstract:

This dissertation introduces a novel automated book reader as an assistive technology tool for persons with blindness. The literature shows extensive work in the area of optical character recognition, but the current methodologies available for the automated reading of books or bound volumes remain inadequate and are severely constrained during document scanning or image acquisition. The goal of the book reader design is to automate and simplify the task of reading a book while providing a user-friendly environment with a realistic but affordable system design. This design responds to the main concerns of (a) providing a method of image acquisition that maintains the integrity of the source, (b) overcoming optical character recognition errors created by inherent imaging issues such as curvature effects and barrel distortion, and (c) determining a suitable method for accurate recognition of characters that yields an interface able to read from any open book with a reading accuracy nearing 98%. The initial aim of this research is the development of an assistive technology tool to help persons with blindness in the reading of books and other bound volumes; its secondary and broader aim is to provide a platform for the digitization of bound documentation, in line with the mission of the Open Content Alliance (OCA), a nonprofit alliance aimed at making reading materials available in digital form. The theoretical contribution of this research lies in the mathematical developments made to resolve both the inherent distortions due to the properties of the camera lens and the anticipated distortions caused by the changing page curvature as one leafs through the book. This is evidenced by a significant increase in the character recognition rate and a highly accurate read-out through text-to-speech processing. This reasonably priced interface, with its high performance and its compatibility with any computer or laptop through universal serial bus connectors, greatly extends the prospects for universal accessibility to documentation.
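The barrel-distortion problem mentioned above is commonly handled with a radial lens model. The sketch below is a minimal one-coefficient version of that standard model; the coefficient k1 and the optical centre are invented, not the dissertation's calibrated values:

```python
# A minimal radial lens-distortion correction (one-coefficient model).
# k1 and the image centre (cx, cy) are hypothetical example values.

def undistort(x, y, cx, cy, k1):
    """Map a pixel (x, y) to its radially corrected position.

    Uses the common model  x_u = cx + (x - cx) * (1 + k1 * r^2),
    where r is the distance from the optical centre. With k1 > 0 a point
    is pushed outwards, compensating the inward "barrel" compression.
    """
    dx, dy = x - cx, y - cy
    r2 = dx * dx + dy * dy
    scale = 1 + k1 * r2
    return cx + dx * scale, cy + dy * scale

# A point near the image edge moves noticeably; one at the centre does not.
xu, yu = undistort(600, 400, cx=320, cy=240, k1=1e-7)
print(f"corrected point: ({xu:.3f}, {yu:.3f})")
```

In a real pipeline k1 (and usually higher-order terms) would be estimated by camera calibration, and the correction applied before the OCR stage.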

Relevance: 100.00%

Abstract:

Communication has become an essential function in our civilization. With the increasing demand for communication channels, it is now necessary to find ways to optimize the use of their bandwidth. One way to achieve this is by transforming the information before it is transmitted. This transformation can be performed by several techniques, one of the newest being the use of wavelets. Wavelet transformation refers to the act of breaking down a signal into components called details and trends by using small waveforms that have a zero average in the time domain. After this transformation, the data can be compressed by discarding the details and transmitting only the trends. At the receiving end, the trends are used to reconstruct the image. In this work, the wavelet used for the transformation of an image is selected from a library of available bases. The accuracy of the reconstruction, after the details are discarded, depends on the wavelets chosen from the wavelet basis library. The system developed in this thesis takes a 2-D image and decomposes it using a wavelet bank. A digital signal processor is used to achieve near real-time performance in this transformation task. A contribution of this thesis project is the development of a DSP-based test bed for the future development of new real-time wavelet transformation algorithms.
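The decompose/discard/reconstruct idea described above can be sketched with the simplest wavelet, the Haar basis, where trends are pairwise averages and details are pairwise differences. This is an illustrative toy, not the thesis's DSP implementation; the 1-D signal stands in for a single row of an image:

```python
# Single-level Haar transform: trends = averages, details = differences.
# Discarding the details compresses 2:1; trends alone give a lossy rebuild.

def haar_decompose(signal):
    pairs = list(zip(signal[0::2], signal[1::2]))
    trends = [(a + b) / 2 for a, b in pairs]
    details = [(a - b) / 2 for a, b in pairs]
    return trends, details

def reconstruct_from_trends(trends):
    # With the details discarded, each pair is rebuilt from its average.
    out = []
    for t in trends:
        out += [t, t]
    return out

signal = [10, 12, 8, 6, 5, 7, 9, 9]
trends, details = haar_decompose(signal)
approx = reconstruct_from_trends(signal and trends)  # lossy reconstruction
print(trends, details, approx)
```

Keeping the details as well would allow exact reconstruction (each pair is `trend + detail`, `trend - detail`); the compression in the thesis comes from dropping them.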

Relevance: 100.00%

Abstract:

Methods for accessing data on the Web have been the focus of active research over the past few years. In this thesis we propose a method for representing Web sites as data sources. We designed Data Extractor, a data retrieval solution that allows us to define queries against Web sites and process the resulting data sets. Data Extractor is being integrated into the MSemODB heterogeneous database management system; with its help, database queries can be distributed over both local and Web data sources within the MSemODB framework. Data Extractor treats Web sites as data sources, controlling query execution and data retrieval, and works as an intermediary between the applications and the sites. It utilizes a two-fold "custom wrapper" approach to information retrieval: wrappers for the majority of sites are easily built using a powerful and expressive scripting language, while complex cases are processed using Java-based wrappers that draw on a specially designed library of data retrieval, parsing, and Web access routines. In addition to wrapper development, we thoroughly investigate issues associated with Web site selection, analysis, and processing. Data Extractor is designed to act as a data retrieval server as well as an embedded data retrieval solution. We also use it to create mobile agents that are shipped over the Internet to the client's computer to perform data retrieval on behalf of the user, which allows Data Extractor to distribute and scale well. This study confirms the feasibility of building custom wrappers for Web sites: the approach provides accurate data retrieval together with power and flexibility in handling complex cases.
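A toy version of the "custom wrapper" idea can be illustrated in a few lines: a small pattern-based wrapper that turns one page layout into tabular records. The HTML snippet and field patterns below are invented for the example; the real wrappers in the thesis are written in a dedicated scripting language or in Java:

```python
# A toy "custom wrapper": extract structured records from one site layout.
# The HTML and the row pattern are hypothetical examples.
import re

PRICE_ROW = re.compile(
    r"<tr><td>(?P<title>[^<]+)</td><td>\$(?P<price>[\d.]+)</td></tr>"
)

def wrap(html):
    """Extract (title, price) records from a product-listing page."""
    return [(m["title"], float(m["price"])) for m in PRICE_ROW.finditer(html)]

page = ("<table>"
        "<tr><td>Modem</td><td>$49.99</td></tr>"
        "<tr><td>Router</td><td>$89.00</td></tr>"
        "</table>")
records = wrap(page)
print(records)
```

Because each wrapper encodes one site's layout, a layout change breaks only that wrapper; this is the accuracy-for-maintenance trade-off inherent in the custom-wrapper approach.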

Relevance: 100.00%

Abstract:

Many engineers currently in professional practice will have gained a degree level qualification which involved studying a curriculum heavy with mathematics and engineering science. While this knowledge is vital to the engineering design process so also is manufacturing knowledge, if the resulting designs are to be both technically and commercially viable.
The methodology advanced by the CDIO Initiative aims to improve engineering education by teaching in the context of Conceiving, Designing, Implementing and Operating products, processes or systems. A key element of this approach is the use of Design-Build-Test (DBT) projects as the core of an integrated curriculum. This approach facilitates the development of professional skills as well as the application of technical knowledge and skills developed in other parts of the degree programme. It also changes the role of the lecturer to that of facilitator/coach in an active learning environment in which students gain concrete experiences that support their development.
The case study herein describes Mechanical Engineering undergraduate student involvement in the manufacture and assembly of concept and functional prototypes of a folding bicycle.

Relevance: 100.00%

Abstract:

The advancement of GPS technology has made it possible to use GPS devices not only as orientation and navigation tools but also as tools to track spatiotemporal information. GPS tracking data can be broadly applied in location-based services, such as the spatial distribution of the economy, transportation routing and planning, traffic management and environmental control. Knowledge of how to process the data from a standard GPS device is therefore crucial for further use. Previous studies have considered individual issues of this data processing; this paper, however, aims to outline a general procedure for processing GPS tracking data. The procedure is illustrated step by step using real-world GPS data on car movements in Borlänge, in central Sweden.
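One typical step in such a procedure is turning raw (timestamp, latitude, longitude) fixes into per-segment distances and speeds. The sketch below uses the standard haversine formula; the two fixes are invented coordinates near Borlänge, not data from the paper:

```python
# From raw GPS fixes to segment speeds, using the haversine formula.
# The track below is a hypothetical two-fix example.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    R = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def segment_speeds(track):
    """track: list of (unix_time_s, lat, lon) -> list of speeds in m/s."""
    speeds = []
    for (t1, la1, lo1), (t2, la2, lo2) in zip(track, track[1:]):
        d = haversine_m(la1, lo1, la2, lo2)
        speeds.append(d / (t2 - t1))
    return speeds

track = [(0, 60.4858, 15.4371), (10, 60.4860, 15.4380)]  # ~Borlänge
speeds = segment_speeds(track)
```

Implausible segment speeds computed this way are a common filter for GPS outliers before any further analysis of the track.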

Relevance: 100.00%

Abstract:

Recent advances in the massively parallel computational abilities of graphical processing units (GPUs) have increased their use for general-purpose computation, as companies look to take advantage of big data processing techniques. This has given rise to the potential for malicious software targeting GPUs, which is of interest to forensic investigators examining the operation of software. The ability to carry out reverse engineering of software is of great importance within the security and forensics fields, particularly when investigating malicious software or carrying out forensic analysis following a successful security breach. Due to the complexity of the Nvidia CUDA (Compute Unified Device Architecture) framework, it is not clear how best to approach the reverse engineering of a piece of CUDA software. We carry out a review of the different binary output formats which may be encountered from the CUDA compiler, and their implications for reverse engineering. We then demonstrate the process of carrying out disassembly of an example CUDA application, to establish the various techniques available to forensic investigators carrying out black-box disassembly and reverse engineering of CUDA binaries. We show that the Nvidia compiler, using default settings, leaks useful information. Finally, we demonstrate techniques to better protect intellectual property in CUDA algorithm implementations from reverse engineering.

Relevance: 100.00%

Abstract:

The aim of this novel experimental study is to investigate the behaviour of a 2 m x 2 m model of a masonry groin vault, built by assembling blocks made of a 3D-printed plastic skin filled with mortar. The groin vault was chosen because of the large presence of this vulnerable roofing system in the historical heritage. Shaking-table tests are carried out to explore the vault response for two support boundary conditions, involving four lateral confinement modes. Processing the marker displacement data made it possible to examine the collapse mechanisms of the vault, based on the deformed shapes of the arches. A numerical evaluation then provides the orders of magnitude of the displacements associated with these mechanisms. Given that these displacements are related to the shortening and elongation of the arches, the final objective is the definition of a critical elongation between two diagonal bricks and, consequently, of a diagonal portion. This study continues the previous work and takes another step forward in the research on ground motion effects on masonry structures.
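The critical-elongation check described above can be illustrated with a small sketch: compare the length of the segment between two tracked markers at rest and during shaking, and flag it when the relative elongation exceeds a threshold. The marker coordinates and the threshold value below are invented, not measurements or criteria from the study:

```python
# Relative elongation of the segment between two 3D motion-capture markers.
# Coordinates (metres) and the critical threshold are hypothetical.
import math

def elongation(p_ref, p_now):
    """Relative length change of the segment between two marker points."""
    ref = math.dist(*p_ref)
    now = math.dist(*p_now)
    return (now - ref) / ref

rest   = ((0.00, 0.00, 1.20), (1.40, 1.40, 1.20))  # markers at rest
shaken = ((0.00, 0.00, 1.20), (1.43, 1.42, 1.18))  # markers during the run

eps = elongation(rest, shaken)
critical = 0.015                 # hypothetical critical elongation
mechanism_onset = eps > critical # diagonal flagged as activating a mechanism
```

In the study, a threshold of this kind, calibrated against the observed collapse mechanisms, is what defines the critical elongation of a diagonal portion.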

Relevance: 100.00%

Abstract:

The aim of this work is to present a general overview of the state of the art in design for uncertainty, with a focus on aerospace structures. In particular, simulations of an FCCZ lattice cell and of the profile shape of a nozzle are performed. Optimization under uncertainty is characterized by the need to make decisions without complete knowledge of the problem data. When dealing with a complex problem, non-linearity, or optimization, two main issues arise: uncertainty in the feasibility of the solution and uncertainty in the objective function value. The first part examines Design of Experiments (DOE) methodologies, Uncertainty Quantification (UQ) and uncertainty optimization in depth. The second part shows an application of these theories using commercial software. Nowadays, multi-objective optimization of highly non-linear problems can be a powerful tool for approaching new concept solutions or developing cutting-edge designs. In this thesis an effective improvement has been achieved on a rocket nozzle. Future work could include the introduction of multi-scale modelling, a multiphysics approach, and other strategies for simulating the real operating conditions of the studied design as closely as possible.
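The core of Uncertainty Quantification can be sketched with plain Monte Carlo sampling: propagate an uncertain input through a model and estimate the mean and spread of the response. The toy response function and the input distribution below are invented for illustration, not the thesis's lattice or nozzle models:

```python
# Monte Carlo uncertainty quantification on a toy model.
# The model and the input distribution are hypothetical examples.
import random
import statistics

def model(throat_area):
    """Toy response: output grows non-linearly with the uncertain input."""
    return 1000.0 * throat_area ** 1.5

random.seed(42)                       # reproducible sampling
nominal, sigma = 0.10, 0.005          # uncertain design variable and its spread
samples = [model(random.gauss(nominal, sigma)) for _ in range(10_000)]

mean = statistics.fmean(samples)
std = statistics.pstdev(samples)
print(f"response: {mean:.2f} +/- {std:.2f}")
```

A robust-design formulation would then trade the nominal response against this spread, which is exactly the feasibility-versus-objective tension described above.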

Relevance: 100.00%

Abstract:

The alkali-aggregate reaction (AAR) is a chemical reaction that provokes heterogeneous expansion of concrete and reduces important properties such as Young's modulus, leading to a reduction in the structure's useful life. In this study, a parametric model is employed to determine the spatial distribution of the concrete expansion, combining normalized factors that influence the reaction through an AAR expansion law. Optimization techniques were employed to adjust the numerical results to observations in a real structure. A three-dimensional version of the model has been implemented in a commercial finite element package (ANSYS) and verified in the analysis of an accelerated mortar test. Comparisons were made between two AAR mathematical descriptions of the mechanical phenomenon, using the same methodology, and an expansion curve obtained from experiment. Some parametric studies are also presented. The numerical results compared very well with the experimental data, validating the proposed method.
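The parameter-adjustment step can be illustrated with a simple least-squares fit of an expansion law to measured expansions. The exponential-saturation law and the synthetic "observations" below are a deliberately simplified stand-in for the richer law and real mortar-test data used in the study:

```python
# Fit a toy AAR expansion law eps(t) = eps_inf * (1 - exp(-t / tau))
# to observations by least squares. The observations are synthetic.
import math

def expansion(t, eps_inf, tau):
    return eps_inf * (1 - math.exp(-t / tau))

def fit(times, observed):
    """Coarse grid search minimising the sum of squared residuals."""
    best = None
    for eps_inf in [i / 1000 for i in range(1, 201)]:   # 0.001 .. 0.200 %
        for tau in range(5, 201, 5):                    # characteristic time, days
            sse = sum((expansion(t, eps_inf, tau) - y) ** 2
                      for t, y in zip(times, observed))
            if best is None or sse < best[0]:
                best = (sse, eps_inf, tau)
    return best[1], best[2]

# Synthetic observations generated with eps_inf = 0.12, tau = 60 days.
times = [10, 30, 60, 120, 240]
obs = [expansion(t, 0.12, 60) for t in times]
eps_inf_hat, tau_hat = fit(times, obs)
```

A gradient-based optimizer would replace the grid search in practice, but the structure is the same: a parametric law, observed expansions, and a residual to minimise.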

Relevance: 100.00%

Abstract:

The advantages offered by the LED (Light Emitting Diode) have led to the quick and extensive application of this device as a replacement for incandescent lamps. In this application, however, the relationship between the design variables and the desired effect or result is very complex, which makes it difficult to model using conventional techniques. This paper develops a technique using artificial neural networks that makes it possible to obtain the luminous intensity values of brake lights built from SMD (Surface Mounted Device) LEDs directly from design data. The technique can be utilized to design any automotive device that uses groups of SMD LEDs. Results from industrial applications using SMD LEDs are presented to validate the proposed technique.
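The kind of network described above can be sketched as a one-hidden-layer feed-forward net trained by gradient descent to map two design variables to an intensity-like value. The network size, learning rate, and toy target function are all invented; the real model is trained on SMD LED design data:

```python
# A minimal one-hidden-layer neural network trained by stochastic
# gradient descent. All data and hyperparameters are hypothetical.
import math
import random

random.seed(0)
H = 4                                                   # hidden units
w1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = [0.0]                                              # boxed for mutation

def forward(x):
    h = [math.tanh(ws[0] * x[0] + ws[1] * x[1] + b) for ws, b in zip(w1, b1)]
    return sum(w * hi for w, hi in zip(w2, h)) + b2[0], h

def train_step(x, y, lr=0.05):
    out, h = forward(x)
    err = out - y                                       # gradient of 0.5*err^2
    grad_h = [err * w2[i] * (1 - h[i] ** 2) for i in range(H)]
    for i in range(H):
        w2[i] -= lr * err * h[i]
        b1[i] -= lr * grad_h[i]
        for j in range(2):
            w1[i][j] -= lr * grad_h[i] * x[j]
    b2[0] -= lr * err

# Toy data: "intensity" as a smooth function of two design variables.
data = [((a / 10, b / 10), 0.5 * (a / 10) + 0.3 * (b / 10) ** 2)
        for a in range(10) for b in range(10)]

def sse():
    return sum((forward(x)[0] - y) ** 2 for x, y in data)

sse_before = sse()
for _ in range(200):
    for x, y in data:
        train_step(x, y)
sse_after = sse()                                       # should drop markedly
```

Once trained, the forward pass alone gives the design-to-intensity mapping, which is what makes the approach attractive for exploring design variants without building prototypes.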