4 results for computer-aided engineering tool
in Digital Commons - Michigan Tech
Abstract:
To tackle challenges in circuit-level and system-level VLSI and embedded system design, this dissertation proposes several novel algorithms to explore efficient solutions. At the circuit level, a new reliability-driven minimum-cost Steiner routing and layer assignment scheme is proposed, along with the first transceiver insertion algorithmic framework for optical interconnects. At the system level, a reliability-driven task scheduling scheme for multiprocessor real-time embedded systems is proposed, which optimizes system energy consumption under stochastic fault occurrences. Embedded system design is also widely used in the smart home area for improving health, wellbeing, and quality of life. The proposed scheduling scheme for multiprocessor embedded systems is therefore extended to handle energy consumption scheduling for smart homes. The extended scheme schedules household appliances for operation so as to minimize a customer's monetary expense under a time-varying pricing model.
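The appliance-scheduling idea in the last sentence can be illustrated with a small sketch: for each appliance, choose the contiguous block of hours whose total price is lowest. This is a minimal illustration only, not the dissertation's scheduling scheme; the prices, the appliance list, and the assumption that appliances are scheduled independently (with no peak-power or reliability constraints) are all hypothetical.

```python
# Minimal illustrative sketch (not the dissertation's algorithm): schedule each
# appliance's contiguous run within a day so that its electricity cost is
# minimized under a time-varying hourly price. All data below are hypothetical.

def cheapest_start(prices, duration_h):
    """Return (start_hour, price_sum) of the cheapest contiguous window."""
    best_start, best_cost = None, float("inf")
    for start in range(len(prices) - duration_h + 1):
        cost = sum(prices[start:start + duration_h])
        if cost < best_cost:
            best_start, best_cost = start, cost
    return best_start, best_cost

# Hypothetical hourly prices ($/kWh) over a 24-hour horizon.
prices = [0.08] * 7 + [0.15] * 4 + [0.22] * 6 + [0.15] * 4 + [0.08] * 3
# Hypothetical appliances: (name, power in kW, run length in hours).
appliances = [("dishwasher", 1.2, 2), ("dryer", 3.0, 1), ("EV charger", 7.2, 4)]

total = 0.0
for name, kw, hours in appliances:
    start, unit_cost = cheapest_start(prices, hours)
    cost = kw * unit_cost
    total += cost
    print(f"{name}: start at hour {start}, cost ${cost:.2f}")
print(f"total daily cost ${total:.2f}")
```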
Abstract:
Traditional engineering design methods are based on Simon's (1969) concept of function, and as such collectively suffer from both theoretical and practical shortcomings. Researchers in the field of affordance-based design have borrowed from ecological psychology in an attempt to address the blind spots of function-based design, developing alternative ontologies and design processes. This dissertation presents function and affordance theory as both compatible and complementary. We first present a hybrid approach to design for technology change, followed by a reconciliation and integration of function and affordance ontologies for use in design. We explore the integration of a standard function-based design method with an affordance-based design method, and demonstrate how affordance theory can guide the early application of function-based design. Finally, we discuss the practical and philosophical ramifications of embracing affordance theory's roots in ecology and ecological psychology, and explore the insights and opportunities made possible by an ecological approach to engineering design. The primary contribution of this research is the development of an integrated ontology for describing and designing technological systems using both function- and affordance-based methods.
Abstract:
This dissertation presents an effective quasi one-dimensional (1-D) computational simulation tool and a full two-dimensional (2-D) computational simulation methodology for steady annular/stratified internal condensing flows of pure vapor. These simulation tools are used to investigate internal condensing flows in both gravity-driven and shear-driven environments. Through accurate numerical simulations of the full two-dimensional governing equations, results for laminar/laminar condensing flows inside mm-scale ducts are presented. The methodology has been developed on the MATLAB/COMSOL platform and is currently capable of simulating film-wise condensation for steady (and unsteady) flows. Moreover, a novel 1-D solution technique, capable of simulating condensing flows inside rectangular and circular ducts with different thermal boundary conditions, is also presented. The results obtained from the 2-D scientific tool and the 1-D engineering tool are validated and synthesized with experimental results for gravity-dominated flows inside a vertical tube and an inclined channel, and also for shear/pressure-driven flows inside horizontal channels. Furthermore, these simulation tools are employed to demonstrate key differences in physics between gravity-dominated and shear/pressure-driven flows. A transition map that distinguishes shear-driven, gravity-driven, and “mixed”-driven flow zones within the non-dimensional parameter space governing these duct flows is presented, along with film thickness and heat transfer correlations that are valid in these zones. It has also been shown that internal condensing flows in micrometer-scale ducts are shear driven, even in different gravitational environments. The full 2-D steady computational tool has been employed to investigate the length of annularity. The result for a shear-driven flow in a horizontal channel shows that, in the absence of any noise or pressure fluctuation at the inlet, the onset of non-annularity is partly due to insufficient shear at the liquid-vapor interface. This result is being further corroborated/investigated by R. R. Naik with the help of the unsteady simulation tool. The condensing flow results and the flow physics understanding developed through these simulation tools will be instrumental in the reliable design of modern micro-scale and space-based thermal systems.
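As a rough illustration of how a transition map might separate these regimes, the sketch below classifies a duct flow using the Froude number as an inertia-to-gravity ratio. The choice of discriminating parameter, the threshold values, and the fluid properties are assumptions made here purely for illustration; they are not the dissertation's actual parameter space or correlations.

```python
# Minimal illustrative sketch, not the dissertation's transition map: classify a
# condensing duct flow as gravity dominated, shear/pressure driven, or mixed
# using simple non-dimensional groups. Thresholds and properties are hypothetical.

def classify_flow(rho, mu, velocity, hyd_diameter, g=9.81):
    """Return (Re, Fr, regime) for a vapor flow in a duct of hydraulic diameter D."""
    re = rho * velocity * hyd_diameter / mu   # Reynolds number (inertia/viscous)
    fr = velocity**2 / (g * hyd_diameter)     # Froude number (inertia/gravity)
    if fr > 100.0:        # hypothetical threshold: gravity negligible
        regime = "shear/pressure driven"
    elif fr < 1.0:        # hypothetical threshold: gravity dominant
        regime = "gravity driven"
    else:
        regime = "mixed"
    return re, fr, regime

# Hypothetical conditions: the same vapor velocity in a 5 mm duct and in a
# 100 micrometer duct, showing why micro-scale ducts tend to be shear driven.
for d in (5e-3, 100e-6):
    re, fr, regime = classify_flow(rho=1.2, mu=1.2e-5, velocity=2.0, hyd_diameter=d)
    print(f"D = {d*1e3:.3f} mm: Re = {re:.0f}, Fr = {fr:.1f} -> {regime}")
```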
Abstract:
The main objectives of this thesis are to validate an improved principal components analysis (IPCA) algorithm on images; to design and simulate a digital model for image compression, face recognition, and image detection using a principal components analysis (PCA) algorithm and the IPCA algorithm; to design and simulate an optical model for face recognition and object detection using the joint transform correlator (JTC); to establish detection and recognition thresholds for each model; to compare the performance of the PCA algorithm with that of the IPCA algorithm in compression, recognition, and detection; and to compare the performance of the digital model with that of the optical model in recognition and detection. The MATLAB software was used for simulating the models. PCA is a technique for identifying patterns in data and representing the data so as to highlight similarities and differences. Identifying patterns in high-dimensional data (more than three dimensions) is difficult because graphical representation of the data is impossible; PCA is therefore a powerful method for analyzing such data. IPCA is another statistical tool for identifying patterns in data; it uses information theory to improve on PCA. The joint transform correlator (JTC) is an optical correlator used for synthesizing a frequency-plane filter for coherent optical systems. The IPCA algorithm generally behaves better than the PCA algorithm in most of the applications studied. It is better than the PCA algorithm in image compression because it obtains higher compression, more accurate reconstruction, and faster processing with acceptable errors; in addition, it is better than the PCA algorithm in real-time image detection because it achieves the smallest error rate as well as remarkable speed. On the other hand, the PCA algorithm performs better than the IPCA algorithm in face recognition because it offers an acceptable error rate, easy calculation, and reasonable speed. Finally, in detection and recognition, the digital model performs better than the optical model.
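A minimal sketch of standard PCA-based image compression (not the thesis's IPCA variant, whose details are not given here) may help clarify the compression/reconstruction step being compared: project the image onto its leading principal components and reconstruct from the reduced representation. The synthetic image and the number of retained components below are placeholders.

```python
# Minimal PCA image-compression sketch using plain NumPy. A random array stands
# in for a grayscale image; rows are treated as observations.

import numpy as np

rng = np.random.default_rng(0)
image = rng.random((256, 256))             # stand-in for a grayscale image

# Center the data; principal components are the right singular vectors.
mean = image.mean(axis=0)
centered = image - mean
_, _, vt = np.linalg.svd(centered, full_matrices=False)

k = 32                                     # number of components kept (compression)
components = vt[:k]                        # (k, 256) reduced basis
scores = centered @ components.T           # (256, k) compressed representation

# Reconstruct and measure the error introduced by discarding components.
reconstructed = scores @ components + mean
mse = np.mean((image - reconstructed) ** 2)
ratio = image.size / (scores.size + components.size + mean.size)
print(f"kept {k} components, compression ratio ~{ratio:.1f}x, MSE {mse:.5f}")
```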