963 results for Engineering, Industrial|Engineering, System Science|Operations Research
Abstract:
The Virtual Learning Environment (VLE) is one of the fastest growing areas in educational technology research and development. To achieve learning effectiveness, ideal VLEs should be able to identify learning needs and customize solutions, with or without an instructor to supplement instruction. Such systems are called Personalized VLEs (PVLEs). For PVLEs to succeed, comprehensive conceptual models of them are essential. Such conceptual modeling development is important because it facilitates early detection and correction of system development errors. Therefore, in order to capture PVLE knowledge explicitly, this paper focuses on the development of conceptual models for PVLEs: models of knowledge primitives in terms of learner, curriculum, and situational models; models of VLEs on general pedagogical bases; and, in particular, a definition of the ontology of PVLEs on the constructivist pedagogical principle. Based on these comprehensive conceptual models, a prototype multiagent-based PVLE has been implemented. A field experiment was conducted to investigate learning achievement by comparing personalized and non-personalized systems. The results indicate that the PVLE developed under this comprehensive ontology successfully provides significant learning achievements. These comprehensive models also provide a solid knowledge representation framework for PVLE development practice, guiding the analysis, design, and development of PVLEs. (c) 2005 Elsevier Ltd. All rights reserved.
Abstract:
For repairable items, the manufacturer has the option to either repair or replace a failed item that is returned under warranty. In this paper, we look at a new warranty servicing strategy for items sold with two-dimensional warranty where the failed item is replaced by a new one when it fails for the first time in a specified region of the warranty and all other failures are repaired minimally. The region is characterised by two parameters and we derive the optimal values for these to minimise the total expected warranty servicing cost. We compare the results with other repair-replace strategies reported in the literature. (C) 2003 Elsevier Ltd. All rights reserved.
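The two-parameter replacement region described above can be found numerically once an expected-cost model is available. The sketch below is a hypothetical stand-in, not the paper's actual cost model: the `expected_cost` function, its parameters, and the warranty limits are all illustrative assumptions, and only the grid-search step is shown.

```python
import itertools

def expected_cost(k, tau, c_rep=50.0, c_min=10.0, rate=0.3,
                  w_t=2.0, w_u=40000.0):
    """Hypothetical expected warranty-servicing cost when items failing
    inside the region [0, k] x [0, tau] are replaced by new ones and all
    other failures are minimally repaired.  Illustrative only: a larger
    region raises the replacement component and lowers the repair one."""
    frac = (k / w_t) * (tau / w_u)              # relative size of the region
    replace = c_rep * frac                      # replacement component
    repair = c_min * rate * w_t / (0.2 + frac)  # repair component (made up)
    return replace + repair

def best_region(steps=50):
    """Grid search over candidate (k, tau) pairs for the minimiser
    of the illustrative cost above (warranty limits 2 years / 40000 km)."""
    grid_t = [2.0 * i / steps for i in range(1, steps + 1)]
    grid_u = [40000.0 * i / steps for i in range(1, steps + 1)]
    return min(itertools.product(grid_t, grid_u),
               key=lambda p: expected_cost(p[0], p[1]))
```

A finer grid, or a derivative-based method, would refine the optimum; the paper derives the optimal parameters analytically rather than by search.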
Abstract:
Document classification is a supervised machine learning process in which predefined category labels are assigned to documents based on a hypothesis derived from a training set of labelled documents. Documents cannot be directly interpreted by a computer system unless they have been modelled as a collection of computable features. Rogati and Yang [M. Rogati and Y. Yang, Resource selection for domain-specific cross-lingual IR, in SIGIR 2004: Proceedings of the 27th Annual International Conference on Research and Development in Information Retrieval, ACM Press, Sheffield, United Kingdom, pp. 154-161] pointed out that the effectiveness of a document classification system may vary across domains. This implies that the quality of the document model contributes to the effectiveness of document classification. Conventionally, model evaluation is accomplished by comparing the effectiveness scores of classifiers on candidate models. However, such evaluation methods may encounter either under-fitting or over-fitting problems, because the effectiveness scores are restricted by the learning capacities of the classifiers. We propose a model fitness evaluation method to determine whether a model is sufficient to distinguish positive and negative instances while still competent to provide satisfactory effectiveness with a small feature subset. Our experiments demonstrate how the fitness of models is assessed. The results of our work contribute to research on feature selection, dimensionality reduction, and document classification.
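Feature selection for document models of this kind can be illustrated with a minimal term-scoring sketch. The document-frequency-difference score below is a crude stand-in for criteria such as chi-square or information gain, and the example documents in the usage note are invented; it is not the fitness evaluation method the abstract proposes.

```python
from collections import Counter

def term_scores(pos_docs, neg_docs):
    """Score each term by the absolute difference of its document
    frequency in positive vs. negative training documents -- a crude
    stand-in for feature-selection criteria such as chi-square or
    information gain."""
    def doc_freq(docs):
        c = Counter()
        for d in docs:
            c.update(set(d.lower().split()))  # count each doc once per term
        return c
    p, n = doc_freq(pos_docs), doc_freq(neg_docs)
    vocab = set(p) | set(n)
    return {t: abs(p[t] / max(len(pos_docs), 1) - n[t] / max(len(neg_docs), 1))
            for t in vocab}

def top_features(pos_docs, neg_docs, k=5):
    """Return the k highest-scoring terms: a small candidate feature subset."""
    s = term_scores(pos_docs, neg_docs)
    return sorted(s, key=s.get, reverse=True)[:k]
```

For example, with positive documents about warranty repair and negative ones about ontologies, `top_features` surfaces the class-discriminating terms ("repair", "ontology") ahead of terms appearing on only one side.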
Abstract:
For leased equipment the lessor incurs penalty costs for failures occurring over the lease period and for not rectifying such failures within a specified time limit. Through preventive maintenance actions the penalty costs can be reduced but this is achieved at the expense of increased maintenance costs. The paper looks at a periodic preventive maintenance policy which achieves a tradeoff between the penalty and maintenance costs. (c) 2005 Elsevier Ltd. All rights reserved.
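The penalty/maintenance tradeoff can be illustrated with the standard periodic preventive-maintenance cost-rate model. The Weibull hazard and all cost parameters below are illustrative assumptions, with a per-failure penalty standing in for the lessor's penalty structure; the paper's actual policy and cost model are not reproduced.

```python
def cost_rate(T, c_pm=100.0, c_pen=400.0, beta=2.5, eta=1.0):
    """Long-run cost per unit time when PM is performed every T time
    units and failures between PMs are minimally repaired.  Expected
    failures in [0, T] follow the Weibull cumulative hazard
    H(T) = (T/eta)**beta; each failure incurs a penalty c_pen
    (standing in for the lessor's penalty costs)."""
    return (c_pm + c_pen * (T / eta) ** beta) / T

def optimal_interval(lo=0.05, hi=5.0, steps=2000):
    """Grid search for the PM interval T minimising the cost rate."""
    cand = [lo + (hi - lo) * i / steps for i in range(steps + 1)]
    return min(cand, key=cost_rate)
```

A short interval drives up PM cost per unit time; a long one drives up expected penalties. With these parameters the minimiser sits at an interior point, reflecting the tradeoff the abstract describes.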
Abstract:
There has been considerable debate about the need for more empirical, evidence-based studies of the impact of various interventions and practices in engineering education. A number of resources, including workshops to guide engineering faculty in conducting such studies, have emerged in recent years. This paper presents a critique of the evolution of engineering education research and its underlying assumptions in the context of the systemic reform currently underway in engineering education. This critique leads to an analysis of the ways in which our current understanding of engineering, engineering education, and research in engineering education is shaped by the traditions and cultural characteristics of the profession and grounded, albeit implicitly, in a particular suite of epistemological assumptions. It is argued that the whole enterprise of engineering education needs to be radically reconceptualized. A pluralistic approach to framing scholarship in engineering education is then proposed, based on the principles of demonstrable practicality, critical interdisciplinarity, and holistic reflexivity. This new framework has implications for engaging and developing faculty in the context of new teaching and learning paradigms, for the evaluation of the scholarship of teaching, and for the research-teaching nexus.
Abstract:
Starting with the research question, "How can the Primary School Curriculum be developed so as to spark Children's Engineering Imaginations from an early age?", this paper sets out to critically analyse the issues around embedding Engineering in the primary school curriculum from the age of 5. Findings from an exploratory research project suggest that, in order to promote the concept of Engineering Education to potential university students (and in doing so begin to address issues around recruitment and retention within Engineering), there is a real need to excite and engage children with the subject from a young age. Indeed, it may be argued that within today's digital society, encouraging children to engage with Engineering is vital to the future sustainable development of our society. Whilst UK Government policy documents highlight the value of embedding Engineering into the school curriculum, there is little or no evidence to suggest that Engineering has been successfully embedded at the elementary level. Building on the emergent findings of the first stage of a longitudinal study, this paper concludes by arguing that Engineering could be embedded into the curriculum through innovative pedagogical approaches which contextualise project-based learning experiences within more traditional subjects, including science, history, geography, literacy and numeracy.
Abstract:
The aim of this paper is to explore engineering lecturers' experiences of generic skills assessment within an active learning context in Malaysia. Using a case-study methodology, lecturers' assessment approaches were investigated for three generic skills: verbal communication, problem solving, and teamwork. Because of the importance of the assessment of such skills to learning, it is this assessment that is discussed. The findings show the lecturers' initial feedback to have been generally lacking in substance, since they have limited knowledge and experience of assessing generic skills. Typical barriers identified during the study included: generic skills not being well defined; inadequate alignment across the engineering curricula and teaching approaches; assessment practices that were too flexible, particularly those to do with implementation; and a failure to keep up to date with industrial requirements. The emerging findings of the interviews reinforce the argument that there is clearly much room for improvement in the present state of generic skills assessment.
Abstract:
Drawing from work found in the financial innovation literature, the main objective of this research is to explore the effect of religious orientation towards financial innovation and engineering in Islamic Financial Institutions (IFIs). The research also examines what constitutes this religious orientation and how it is enacted in the innovation process. Religious orientation towards financial innovation is conceptualised and defined, as a system, in this research study. In order to achieve this objective, the study employs multiple theoretical perspectives to develop its theoretical framework. It combines innovation orientation theory with the theory on boundary objects to explore the role of religion in the financial innovation processes in IFIs. Religious orientation
Abstract:
Childhood lead poisoning is a major consequence of contamination for local populations as it relates to environmental health and environmental engineering. Environmental contamination is one of the pressing environmental concerns facing the world today. Current approaches often focus on large, industrial-scale contaminated sites that are designated by regulatory agencies for remediation. Prior to this study, there were no known published studies conducted at the local, smaller scale, such as neighborhoods, where much of the contamination requiring remediation is often present. An environmental health study of local lead-poisoning data showed that Liberty City, Little Haiti, and eastern Little Havana in Miami-Dade County, Florida accounted for a disproportionately high number of the county's reported childhood lead poisoning cases. An engineering system was developed and designed for a comprehensive risk management methodology distinctively applicable to the geographical and environmental conditions of Miami-Dade County, Florida. Furthermore, a scientific approach for interpreting environmental health concerns, involving detailed environmental engineering control measures and methods for site remediation in contained media, was developed for implementation. Test samples were obtained from residents and sites in those specific communities in Miami-Dade County, Florida (Gasana and Chamorro 2002). Currently lead does not have an oral assessment, inhalation assessment, or oral slope factor, the variables required to run a quantitative risk assessment. However, various institutional controls from federal agencies' standards and regulations for lead-contaminated media yield adequate maximum concentration limits (MCLs). For this study an MCL of 0.0015 mg/L was used. A risk management approach to contaminated media involving lead demonstrates that linking environmental health and environmental engineering can yield a feasible solution.
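Screening measured concentrations against the 0.0015 mg/L MCL is the simplest step in such a risk-management workflow. The sketch below uses hypothetical site labels and concentration values; only the limit itself comes from the study.

```python
MCL_LEAD_MG_L = 0.0015  # maximum concentration limit used in the study

def screen_samples(samples):
    """Flag samples whose lead concentration (mg/L) exceeds the MCL.
    `samples` maps a site label to a measured concentration; labels and
    values passed in are hypothetical, for illustration only."""
    return {site: conc for site, conc in samples.items()
            if conc > MCL_LEAD_MG_L}
```

For instance, `screen_samples({"site A": 0.0009, "site B": 0.0031})` keeps only "site B", the sample above the limit.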
Abstract:
Parallel processing is prevalent in many manufacturing and service systems. Many manufactured products are built and assembled from several components fabricated on parallel lines. An example of this manufacturing system configuration is observed at a facility equipped to assemble and test web servers. Characteristics of a typical web server assembly line are: multiple products, job circulation, and parallel processing. The primary objective of this research was to develop analytical approximations to predict performance measures of manufacturing systems with job failures and parallel processing. The analytical formulations extend previous queueing models used in assembly manufacturing systems in that they can handle serial and different configurations of parallel processing with multiple product classes, and job circulation due to random part failures. In addition, correction terms obtained via regression analysis were added to the approximations to minimize the error between the analytical approximations and the simulation models. Markovian and general-type manufacturing systems were studied, with multiple product classes, job circulation due to failures, and fork-join systems to model parallel processing. In both the Markovian and the general case, the approximations without correction terms performed quite well for one- and two-product problem instances. However, the flow time error increased as the number of products and the net traffic intensity increased. Therefore, correction terms for single and fork-join stations were developed via regression analysis to handle more than two products. The numerical comparisons showed that the approximations perform remarkably well when the correction factors are used. On average, the flow time error was reduced from 38.19% to 5.59% in the Markovian case, and from 26.39% to 7.23% in the general case.
All the equations stated in the analytical formulations were implemented as a set of MATLAB scripts. Using these scripts, operations managers of web server assembly lines, or of manufacturing and service systems with similar characteristics, can estimate different system performance measures and make judicious decisions, especially in setting delivery due dates, capacity planning, and bottleneck mitigation.
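The fork-join approximations and regression correction terms themselves are not reproduced here, but the basic building block they extend, the expected flow time through a tandem of Markovian stations, can be sketched as:

```python
def flow_time_series(lam, mus):
    """Expected flow time through a tandem of M/M/1 stations with
    Poisson arrival rate `lam` and service rates `mus` (an open Jackson
    network with no feedback): sum of 1/(mu_i - lam) over stations.
    The paper's fork-join extensions and correction terms are omitted."""
    if any(lam >= mu for mu in mus):
        raise ValueError("traffic intensity must be below 1 at every station")
    return sum(1.0 / (mu - lam) for mu in mus)
```

For example, with arrival rate 0.5 and two stations of rates 1.0 and 2.0, the expected flow time is 1/0.5 + 1/1.5 ≈ 2.67 time units; job circulation and parallel branches would add terms on top of this.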
Abstract:
This research is motivated by the need to consider lot sizing while accepting customer orders in a make-to-order (MTO) environment, in which each customer order must be delivered by its due date. The job shop is the typical operation model in an MTO operation, where the production planner must make three concurrent decisions: order selection, lot sizing, and job scheduling. These decisions are usually treated separately in the literature and are mostly addressed with heuristic solutions. The first phase of the study focuses on a formal definition of the problem. Mathematical programming techniques are applied to model the problem in terms of its objective, decision variables, and constraints. A commercial solver, CPLEX, is applied to solve the resulting mixed-integer linear programming model on small instances to validate the mathematical formulation. The computational results show that solving problems of industrial size with a commercial solver is not practical. The second phase of the study focuses on developing an effective solution approach to the large-scale problem. The proposed approach is an iterative process involving three sequential decision steps: order selection, lot sizing, and lot scheduling. A range of simple sequencing rules is identified for each of the three subproblems. Using computer simulation, an experiment is designed to evaluate their performance against a set of system parameters. For order selection, the proposed weighted most-profit rule performs best. The shifting bottleneck and the earliest operation finish time are both the best scheduling rules. For lot sizing, the proposed minimum cost increase heuristic, based on the Dixon-Silver method, performs best when the demand-to-capacity ratio at the bottleneck machine is high. The proposed minimum cost heuristic, based on the Wagner-Whitin algorithm, is the best lot-sizing heuristic for shops with a low demand-to-capacity ratio.
The proposed heuristic is applied to an industrial case to further evaluate its performance. The results show it can improve total profit by an average of 16.62%. This research contributes to the production planning research community a complete mathematical definition of the problem and an effective solution approach for problems of industrial scale.
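The Wagner-Whitin algorithm underlying the proposed minimum cost heuristic can be sketched as a textbook dynamic program for single-item uncapacitated lot sizing; this is the standard algorithm, not the paper's extended heuristic, and the cost parameters are generic.

```python
def wagner_whitin(demand, setup_cost, hold_cost):
    """Exact O(n^2) dynamic program for single-item uncapacitated lot
    sizing (Wagner-Whitin).  f[t] is the minimum cost of covering
    demand in periods 0..t-1; an order placed in period j covers
    periods j..t-1, paying one setup plus linear holding cost for
    every unit carried from j to its demand period."""
    n = len(demand)
    f = [0.0] + [float("inf")] * n
    for t in range(1, n + 1):
        for j in range(t):  # last order is placed in period j
            hold = sum(hold_cost * (k - j) * demand[k] for k in range(j, t))
            f[t] = min(f[t], f[j] + setup_cost + hold)
    return f[n]
```

With demand [10, 10], setup cost 5, and holding cost 1 per unit-period, ordering in both periods costs 10 while a single order costs 15, so the DP returns 10; raising the setup cost to 20 flips the decision to a single order at cost 30.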
Abstract:
The importance of checking the normality assumption in most statistical procedures, especially parametric tests, cannot be overemphasized, as the validity of the inferences drawn from such procedures usually depends on the validity of this assumption. Numerous methods have been proposed by different authors over the years, some popular and frequently used, others less so. This study addresses the performance of eighteen of the available tests for different sample sizes, significance levels, and a number of symmetric and asymmetric distributions, by conducting a Monte Carlo simulation. The results showed that considerable power is not achieved for symmetric distributions when the sample size is less than one hundred, and that for such distributions the kurtosis test is most powerful, provided the distribution is leptokurtic or platykurtic. The Shapiro-Wilk test remains the most powerful test for asymmetric distributions. We conclude that different tests are suitable under different characteristics of the alternative distribution.
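A Monte Carlo power comparison of this kind can be sketched with a skewness-and-kurtosis (Jarque-Bera) test, one member of the moment-based family such studies examine. The sample size, replication count, and use of the asymptotic chi-square 5% critical value below are illustrative choices, not the study's settings.

```python
import random

def jb_stat(xs):
    """Jarque-Bera statistic from sample skewness and excess kurtosis:
    JB = n/6 * (skew^2 + kurt^2/4), asymptotically chi-square(2)
    under normality."""
    n = len(xs)
    m = sum(xs) / n
    m2 = sum((x - m) ** 2 for x in xs) / n
    m3 = sum((x - m) ** 3 for x in xs) / n
    m4 = sum((x - m) ** 4 for x in xs) / n
    skew = m3 / m2 ** 1.5
    kurt = m4 / m2 ** 2 - 3.0
    return n / 6.0 * (skew ** 2 + kurt ** 2 / 4.0)

def mc_power(sampler, n=50, reps=2000, crit=5.99, seed=1):
    """Monte Carlo rejection rate of the JB test at the asymptotic
    chi-square(2) 5% critical value (5.99) when data come from
    `sampler` (a function of a random.Random instance)."""
    rng = random.Random(seed)
    hits = sum(jb_stat([sampler(rng) for _ in range(n)]) > crit
               for _ in range(reps))
    return hits / reps
```

Running `mc_power` with an exponential sampler versus a standard normal one shows high power against the skewed alternative and a rejection rate near the nominal level under the null, mirroring the kind of comparison the study performs across eighteen tests.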
Abstract:
III-Nitride materials have recently become promising candidates for applications superior to current technologies. However, certain issues, such as the lack of native substrates and high defect density, have to be overcome for further development of III-Nitride technology. This work presents research on lattice engineering of III-Nitride materials and the structural, optical, and electrical properties of their alloys, in order to approach the ideal material for various applications. We demonstrated non-destructive, quantitative characterization of the composition-modulated nanostructure in InAlN thin films with X-ray diffraction. We found that the development of the nanostructure depends on growth temperature, and that the composition modulation affects carrier recombination dynamics. We also showed that the controlled relaxation of a very thin AlN buffer (20-30 nm) or a graded-composition InGaN buffer can significantly reduce the defect density of a subsequent epitaxial layer. Finally, we synthesized InAlGaN thin films and a multi-quantum-well structure. Significant emission enhancement in the UVB range (280-320 nm) was observed compared to AlGaN thin films. The nature of the enhancement was investigated experimentally and numerically, suggesting carrier confinement in the In localization centers.