887 results for Electrical and Computer Engineering
Abstract:
Probabilistic robotics, most often applied to the problem of simultaneous localisation and mapping (SLAM), requires measures of uncertainty to accompany observations of the environment. This paper describes how uncertainty can be characterised for a vision system that locates coloured landmarks in a typical laboratory environment. The paper describes a model of the uncertainty in segmentation, the internal camera model and the mounting of the camera on the robot. It explains the implementation of the system on a laboratory robot, and provides experimental results that show the coherence of the uncertainty model.
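The paper's specific camera and segmentation models are not reproduced here, but the general mechanism can be illustrated with a minimal sketch: first-order propagation of a segmented centroid's pixel covariance through an assumed pinhole camera, tilted and mounted at a fixed height, to a covariance on the landmark's floor position. All intrinsics, mounting parameters and pixel covariances below are illustrative assumptions.

```python
# A minimal sketch of first-order (Jacobian) uncertainty propagation from a
# segmented landmark centroid to a floor-plane position estimate. The
# intrinsics, camera mounting and pixel covariance are assumed values.
import numpy as np

FX, FY, CX, CY = 400.0, 400.0, 160.0, 120.0    # assumed pinhole intrinsics (pixels)
CAM_HEIGHT = 0.30                              # assumed camera height above floor (m)
CAM_TILT = np.deg2rad(15.0)                    # assumed downward tilt of the camera

def pixel_to_ground(uv):
    """Intersect the viewing ray of pixel (u, v) with the floor plane."""
    u, v = uv
    ray_cam = np.array([(u - CX) / FX, (v - CY) / FY, 1.0])  # x right, y down, z forward
    c, s = np.cos(CAM_TILT), np.sin(CAM_TILT)
    R = np.array([[1.0, 0.0, 0.0], [0.0, c, s], [0.0, -s, c]])  # tilt about the x axis
    ray = R @ ray_cam
    t = CAM_HEIGHT / ray[1]                    # scale so the ray reaches the floor
    return np.array([t * ray[0], t * ray[2]])  # (lateral, forward) position in metres

uv = np.array([180.0, 200.0])                  # centroid reported by segmentation
sigma_px = np.diag([4.0, 4.0])                 # assumed centroid covariance (pixels^2)

# First-order propagation: numerical Jacobian J, then Sigma = J Sigma_px J^T.
eps = 0.5
J = np.column_stack([
    (pixel_to_ground(uv + eps * e) - pixel_to_ground(uv - eps * e)) / (2 * eps)
    for e in np.eye(2)
])
sigma_ground = J @ sigma_px @ J.T
print("position (m):", pixel_to_ground(uv))
print("covariance (m^2):\n", sigma_ground)
```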
Abstract:
In this paper, a novel approach is developed to evaluate the overall performance of a local area network and to detect possible intrusions. The data are obtained via the system utility 'ping', and the large volume of data is analyzed using statistical methods. Finally, an overall performance index is defined, and simulation experiments over three months demonstrated the effectiveness of the proposed index. A software package has been developed based on these ideas.
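The abstract does not give the index's definition; as a hedged illustration, the sketch below gathers round-trip times with a Unix-style 'ping', computes loss, mean RTT and jitter, and combines them into a 0-to-1 score. The weights and reference values are invented placeholders, not the paper's formula.

```python
# A minimal sketch of the ping-based measurement idea, assuming a Unix-like
# 'ping' utility on the PATH. The weighted combination of loss, mean RTT
# and jitter below is illustrative, not the paper's actual index.
import re
import statistics
import subprocess

def probe(host, count=10):
    """Ping a host and return (rtts_ms, loss_fraction)."""
    out = subprocess.run(["ping", "-c", str(count), host],
                         capture_output=True, text=True).stdout
    rtts = [float(m) for m in re.findall(r"time=([\d.]+)", out)]
    loss = 1.0 - len(rtts) / count
    return rtts, loss

def performance_index(rtts, loss, rtt_ref=50.0, jitter_ref=10.0):
    """Map loss, mean RTT and jitter to a 0..1 score (1 = healthy).
    Reference values and weights are assumptions."""
    if not rtts:
        return 0.0
    mean_rtt = statistics.mean(rtts)
    jitter = statistics.stdev(rtts) if len(rtts) > 1 else 0.0
    return (0.4 * (1.0 - loss)
            + 0.4 * min(1.0, rtt_ref / max(mean_rtt, 1e-9))
            + 0.2 * min(1.0, jitter_ref / max(jitter, 1e-9)))

rtts, loss = probe("192.0.2.1")   # TEST-NET address; replace with a real host
print(f"index = {performance_index(rtts, loss):.2f}")
```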
Abstract:
The DNA microarray is a powerful tool for measuring the levels of a mixed population of nucleic acids at one time, and it has had a great impact on many aspects of life sciences research. In order to distinguish nucleic acids of very similar composition by hybridization, it is necessary to design probes with high specificity, i.e. uniqueness, and high sensitivity, i.e. a suitable melting temperature and no secondary structure. We make use of available biology tools to obtain the necessary sequence information for human chromosome 12, combined with an evolutionary strategy (ES), to find unique subsequences representing all predicted exons. The results are presented and discussed.
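As an illustration of the approach, the sketch below runs a (1+1) evolution strategy over candidate probe windows in a toy random sequence, scoring each window for uniqueness and for closeness of its melting temperature (estimated with the Wallace rule) to a target. The sequence, fitness weights and ES parameters are assumptions, not the authors' actual design.

```python
# A minimal sketch of probe selection by a (1+1) evolution strategy: pick a
# subsequence that is unique in the target and has a melting temperature
# near a desired value. Toy data and fitness weights are assumptions.
import random

random.seed(1)
GENOME = "".join(random.choice("ACGT") for _ in range(2000))  # stand-in for real sequence data
PROBE_LEN, TM_TARGET = 20, 60.0

def tm(seq):
    """Wallace rule: Tm = 2(A+T) + 4(G+C), a rough short-oligo estimate."""
    return 2 * (seq.count("A") + seq.count("T")) + 4 * (seq.count("G") + seq.count("C"))

def fitness(start):
    probe = GENOME[start:start + PROBE_LEN]
    unique = GENOME.count(probe) == 1            # uniqueness = specificity
    return -abs(tm(probe) - TM_TARGET) - (0.0 if unique else 100.0)

# (1+1)-ES on the window start position: mutate, keep the child if not worse.
start = random.randrange(len(GENOME) - PROBE_LEN)
for _ in range(500):
    child = min(max(start + random.randint(-50, 50), 0), len(GENOME) - PROBE_LEN)
    if fitness(child) >= fitness(start):
        start = child

best = GENOME[start:start + PROBE_LEN]
print(best, "Tm =", tm(best))
```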
Abstract:
In an exploding and fluctuating construction market, managers face the challenge of managing business on a wider scale while utilizing modern developments in information technology to promote productivity. The extraordinary development of telecommunications and computer technology makes it possible to plan, lead, control, organize and manage projects from a distance, without the need to be on site on a daily basis. A modern style of management known as distance management (DM) or remote management is emerging. Physical distance no longer determines the boundary of management, since managers can now operate projects through virtual teams that organize manpower, material and production without face-to-face communication. What organizational prototype could overcome psychological and physical barriers to reengineer a successful project through information technology? What criteria distinguish the appropriate mode of communication for individual activities within a team, and assist the integration of efficient and effective communication between face-to-face and physically distant settings? The entire methodology is explained through a case application on refuse incineration plant projects in Taiwan.
Abstract:
Data Envelopment Analysis (DEA) is one of the most widely used methods for measuring the efficiency and productivity of Decision Making Units (DMUs). For a large dataset with many inputs/outputs, DEA requires huge computer resources in terms of memory and CPU time. This paper proposes a back-propagation neural network DEA to address this problem for the very large datasets now emerging in practice. The neural network's requirements for computer memory and CPU time are far lower than those of conventional DEA methods, so it can be a useful tool for measuring the efficiency of large datasets. Finally, the back-propagation DEA algorithm is applied to five large datasets and the results are compared with those obtained by conventional DEA.
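The paper's network architecture and training data are not specified here; the following minimal sketch shows the general idea behind back-propagation DEA: a small feed-forward network is trained by gradient descent to reproduce efficiency scores, after which scoring new DMUs is a cheap forward pass rather than one linear programme per unit. The synthetic single-input, single-output "efficiency" target is an illustrative stand-in for scores produced by conventional DEA on a subsample.

```python
# A minimal sketch of back-propagation DEA: a tiny network learns to map
# per-DMU (input, output) data to an efficiency score. The synthetic target
# below stands in for scores from conventional DEA.
import numpy as np

rng = np.random.default_rng(0)
raw = rng.uniform(1.0, 10.0, size=(500, 2))      # per-DMU (input, output), synthetic
ratio = raw[:, 1] / raw[:, 0]
y = (ratio / ratio.max()).reshape(-1, 1)         # toy efficiency score in (0, 1]
X = (raw - raw.mean(0)) / raw.std(0)             # normalise features for training

sig = lambda z: 1.0 / (1.0 + np.exp(-z))
W1 = rng.normal(0.0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)

for _ in range(5000):                            # plain batch back-propagation
    h = sig(X @ W1 + b1)                         # hidden layer activations
    pred = sig(h @ W2 + b2)                      # predicted efficiency
    g2 = (pred - y) * pred * (1 - pred)          # gradient at the output layer
    g1 = (g2 @ W2.T) * h * (1 - h)               # gradient at the hidden layer
    W2 -= 0.5 * h.T @ g2 / len(X); b2 -= 0.5 * g2.mean(0)
    W1 -= 0.5 * X.T @ g1 / len(X); b1 -= 0.5 * g1.mean(0)

print("mean abs error:", np.abs(sig(sig(X @ W1 + b1) @ W2 + b2) - y).mean())
```

Once trained, scoring a dataset of any size costs only matrix multiplications, which is the source of the memory and CPU savings the abstract claims.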
Abstract:
Research on production systems design has in recent years tended to concentrate on ‘software’ factors such as organisational aspects, work design, and the planning of production operations. In contrast, relatively little attention has been paid to maximising the contributions made by fixed assets, particularly machines and equipment. However, as the cost of unproductive machine time has increased, reliability, particularly of machine tools, has become ever more important. Reliability theory and research have traditionally been based mainly on electrical and electronic equipment, whereas mechanical devices, especially machine tools, have not received sufficiently objective treatment. A recently completed research project has considered the reliability of machine tools by taking sample surveys of purchasers, maintainers and manufacturers. Breakdown data were also collected from a number of engineering companies and analysed using both manual and computer techniques. The results provide an indication of those factors most likely to influence reliability, which in turn could lead to improved design and selection of machine tool systems. Statistical analysis of long-term field data has revealed patterns and trends of failure which could help in the design of more meaningful maintenance schemes.
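The project's statistical methods are not detailed in this abstract; one standard way such failure patterns are extracted, shown below as a hedged sketch, is to fit a Weibull distribution to times between failures by median-rank regression, with the shape parameter indicating early-life, random or wear-out behaviour. The breakdown data here are synthetic.

```python
# A minimal sketch of Weibull analysis of machine-tool times between
# failures via median-rank regression. Data and the choice of Weibull are
# illustrative assumptions, not the project's actual data or method.
import numpy as np

tbf = np.sort(np.array([110., 190., 240., 330., 410., 520., 640., 810.]))  # hours, synthetic
n = len(tbf)
ranks = (np.arange(1, n + 1) - 0.3) / (n + 0.4)   # Bernard's median-rank estimate of F(t)

# Weibull CDF F(t) = 1 - exp(-(t/eta)^beta) linearises as
# ln(-ln(1 - F)) = beta*ln(t) - beta*ln(eta), so fit a straight line.
x = np.log(tbf)
yv = np.log(-np.log(1.0 - ranks))
beta, intercept = np.polyfit(x, yv, 1)
eta = np.exp(-intercept / beta)

print(f"shape beta = {beta:.2f}  (beta > 1 suggests wear-out failures)")
print(f"scale eta  = {eta:.0f} h (characteristic life)")
```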
Abstract:
If product cycle time reduction is the mission, and the multifunctional team is the means of achieving the mission, what then is the modus operandi by which this means is to accomplish its mission? This paper asserts that a preferred modus operandi for the multifunctional team is to adopt a process-oriented view of the manufacturing enterprise, and for this it needs the medium of a process map [16]. The substance of this paper is a methodology which enables the creation of such maps. Specific examples of process models drawn from the product development life cycle are presented and described in order to support the methodology's integrity and value. The specific deliverables we have so far obtained are a methodology for process capture and analysis; a collection of process models spanning the product development cycle; and an engineering handbook which hosts these models and presents a computer-based means of navigating through these processes, in order to allow users a better understanding of the nature of the business, their role in it, and why the job that they do benefits the work of the company. We assert that this kind of thinking is the essence of concurrent engineering implementation, and further that the systemigram process models uniquely stimulate and organise such thinking.
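As a loose illustration (not the paper's systemigram notation), a process map can be held as a directed graph that a computer-based handbook could navigate; the activities and links below are invented placeholders.

```python
# A minimal sketch of a navigable process model: activities as nodes,
# "feeds into" relations as edges, and a depth-first walk standing in for
# the handbook's navigation. All names are hypothetical.
processes = {
    "capture requirements": ["concept design"],
    "concept design": ["detail design", "make/buy decision"],
    "detail design": ["prototype build"],
    "make/buy decision": ["prototype build"],
    "prototype build": [],
}

def walk(activity, depth=0, seen=None):
    """Print the process model as an indented tree, visiting each node once."""
    seen = set() if seen is None else seen
    if activity in seen:
        return
    seen.add(activity)
    print("  " * depth + activity)
    for nxt in processes[activity]:
        walk(nxt, depth + 1, seen)

walk("capture requirements")
```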
Abstract:
The objective of this work has been to study the behaviour and performance of a batch chromatographic column under simultaneous bioreaction and separation conditions for several carbohydrate feedstocks. Four bioreactions were chosen: the hydrolysis of sucrose to glucose and fructose using the enzyme invertase; the hydrolysis of inulin to fructose and glucose using inulinase; the hydrolysis of lactose to glucose and galactose using lactase; and the isomerization of glucose to fructose using glucose isomerase. The chromatographic columns employed were jacketed glass columns ranging from 1 m to 2 m in length, with internal diameters ranging from 0.97 cm to 1.97 cm. The stationary phase was a cation exchange resin (PUROLITE PCR-833), in the Ca2+ form for the hydrolysis reactions and the Mg2+ form for the isomerization reaction. The mobile phase was a dilute enzyme solution which was continuously pumped through the chromatographic bed. The substrate was injected at the top of the bed as a pulse. The effects of pulse size (the amount of substrate solution introduced into the system, expressed as a percentage of the total empty column volume, % TECV), pulse concentration, eluent flowrate and eluent enzyme activity were investigated.

For the sucrose-invertase system, complete conversion of the substrate was achieved for pulse sizes and pulse concentrations of up to 20% TECV and 60% w/v, respectively. Products with purities above 90% were obtained. The enzyme consumption was 45% of the amount theoretically required to produce the same amount of product in a conventional batch reactor, and a throughput of 27 kg sucrose/m3 resin/h was achieved. A systematic investigation of the factors affecting the performance of the batch chromatographic bioreactor-separator was carried out using a factorial experimental procedure; the main factors were the flowrate and the enzyme activity.

For the inulin-inulinase system, total conversion was also obtained for pulse sizes of up to 20% TECV and a pulse concentration of 10% w/v. Fructose-rich fractions with 100% purity, representing up to 99.4% of the total fructose generated, were obtained with an enzyme consumption of 32% of the amount theoretically required to produce the same amount of product in a conventional batch reactor.

The hydrolysis of lactose by lactase was studied in the glass columns and also in an SCCR-S unit adapted for batch operation, in co-operation with Dr. Shieh, a fellow researcher in the Chemical Engineering and Applied Chemistry Department at Aston University. At lactose feed concentrations of up to 30% w/v, complete conversions were obtained and the purities of the products were above 90%, with an enzyme consumption of 48% of the amount theoretically required to produce the same amount of product in a conventional batch reactor.

For the glucose-glucose isomerase system, a reversible reaction, the separation obtained with the stationary phase conditioned in the magnesium form was very poor, although the conversion was comparable with that of conventional batch reactors. By working with a mixed pulse of enzyme and substrate, up to 82.5% of the fructose generated was recovered with a purity of 100%.

The mathematical modelling and computer simulation of the batch chromatographic bioreaction-separation have been performed on a personal computer.
A finite difference method was used to solve the partial differential equations and the simulation results showed good agreement with the experimental results.
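As a hedged illustration of the finite-difference approach, the sketch below integrates a simplified single-component advection-dispersion-reaction equation, dc/dt = D d²c/dz² - u dc/dz - kc, along the column with an explicit scheme, starting from a pulse injected at the top of the bed. The parameter values and the linear kinetics are assumptions; the thesis's actual bioreactor-separator model is more elaborate.

```python
# A minimal explicit finite-difference sketch of pulse transport along a
# chromatographic column with dispersion and first-order consumption.
# Parameters, kinetics and boundary handling are illustrative assumptions.
import numpy as np

L, N = 1.0, 200                   # column length (m), grid points
dz = L / (N - 1)
u, D, k = 1e-3, 1e-6, 1e-3        # velocity (m/s), dispersion (m2/s), rate (1/s)
dt = 0.2 * min(dz / u, dz**2 / (2 * D))   # step satisfying advection/diffusion limits

c = np.zeros(N)
c[:10] = 1.0                      # substrate pulse injected at the top of the bed

for _ in range(600):
    lap = (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dz**2   # central 2nd difference
    adv = (c - np.roll(c, 1)) / dz                           # upwind 1st difference
    c = c + dt * (D * lap - u * adv - k * c)
    c[0], c[-1] = 0.0, c[-2]      # inlet washed by eluent; zero-gradient outlet

print(f"peak concentration {c.max():.3f} at z = {c.argmax() * dz:.2f} m")
```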
Abstract:
This paper reports the findings of a two-year study concerning the development and implementation of a general-purpose computer-based assessment (CBA) system at a UK university. Data gathering took place over a period of nineteen months, involving a number of formative and summative assessments. Approximately 1,000 students, drawn from undergraduate courses, were involved in the exercise. The techniques used in gathering data included questionnaires, observation, interviews and an analysis of student scores in both conventional examinations and computer-based assessments. Comparisons with conventional assessment methods suggest that the use of CBA techniques may improve the overall performance of students. However, it is clear that the technique must not be seen as a "quick fix" for problems such as rising student numbers. If one accepts that current systems test only a relatively narrow range of skills, then the hasty implementation of CBA systems will result in a distorted and inaccurate view of student performance. In turn, this may serve to reduce the overall quality of courses and, ultimately, detract from the student learning experience. On the other hand, if one adopts a considered and methodical approach to computer-based assessment, positive benefits might include increased efficiency and quality, leading to improved student learning.
Abstract:
A system-compositional approach to the construction and study of models of the information processes that take place in biological hierarchical neural networks is discussed. A computer toolbox has been developed for solving tasks in this scientific area. A series of computational experiments applying this toolbox to an olfactory bulb model has been carried out, and well-known psychophysical phenomena were reproduced in the experiments.
Abstract:
This article reviews the problems of mutual adaptation between humans and the computer environment. The features of image-intuitive and physical-mathematical modes of perception and thinking are investigated, and the problems of choosing the means and methods of differentiated education in the computerized society are considered.
Abstract:
In this study, we review various approaches that apply Artificial Neural Networks to network resource management and Internet congestion control. Through a training process, neural networks can determine nonlinear relationships in a data set by associating input patterns with their corresponding outputs. The application of these networks to Traffic Engineering can therefore help achieve its general objective: “intelligent” agents or systems capable of adapting dataflow according to available resources. In this article, we analyze the opportunity and feasibility of applying Artificial Neural Networks to a number of tasks related to Traffic Engineering. In the opening sections, we present the basics of each of these disciplines, which belong to Artificial Intelligence and Computer Networks respectively.
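As a minimal sketch of this idea (not the paper's actual system), the code below trains a tiny model by gradient descent to learn a nonlinear congestion relation from synthetic link measurements, then uses its prediction to adapt an offered dataflow rate. The features, labels and throttling rule are invented placeholders.

```python
# A minimal sketch of learning a congestion relation and adapting dataflow.
# Synthetic features (utilisation, queue fill), labels and the throttling
# factor are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
X = rng.uniform(0, 1, (400, 2))                  # features: utilisation, queue fill
y = ((0.7 * X[:, 0] + 0.5 * X[:, 1] ** 2) > 0.6).astype(float)  # "congested" label

w, b = np.zeros(2), 0.0                          # logistic model, gradient descent
for _ in range(3000):
    p = 1 / (1 + np.exp(-(X @ w + b)))
    w -= 0.5 * X.T @ (p - y) / len(X)
    b -= 0.5 * (p - y).mean()

def adapt_rate(utilisation, queue_fill, max_rate_mbps=100.0):
    """Scale the offered rate down as predicted congestion probability rises."""
    p = 1 / (1 + np.exp(-(np.array([utilisation, queue_fill]) @ w + b)))
    return max_rate_mbps * (1.0 - 0.8 * p)

print(f"busy link: {adapt_rate(0.9, 0.8):.1f} Mb/s, idle link: {adapt_rate(0.2, 0.1):.1f} Mb/s")
```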
Abstract:
Recent advances in electronic and computer technologies have led to the widespread deployment of wireless sensor networks (WSNs). WSNs have a wide range of applications, including military sensing and tracking, environment monitoring and smart environments. Many WSNs have mission-critical tasks, such as military applications, so security issues in WSNs remain at the forefront of research. Compared with other wireless networks, such as ad hoc and cellular networks, security in WSNs is more complicated due to the constrained capabilities of sensor nodes and the properties of the deployment, such as large scale and hostile environments. Security issues mainly come from attacks. In general, attacks in WSNs can be classified as external or internal. In an external attack, the attacking node is not an authorized participant of the sensor network; cryptography and other security methods can prevent some external attacks. However, node compromise, the major and unique problem that leads to internal attacks, undermines all efforts to prevent attacks. Knowing the probability of node compromise helps systems detect and defend against it. Although there are approaches that can detect and defend against node compromise, few of them can estimate its probability. Hence, we develop basic uniform, basic gradient, intelligent uniform and intelligent gradient models of node compromise distribution, based on probability theory, in order to adapt to different application environments. These models allow systems to estimate the probability of node compromise, and applying them in system security designs can improve security and decrease overheads in nearly every security area. Moreover, based on these models, we design a novel secure routing algorithm to defend against the routing security issue posed by nodes that have already been compromised but have not yet been detected by the node compromise detection mechanism. The routing paths in our algorithm detour around nodes which have already been detected as compromised or have a high probability of being compromised. Simulation results show that our algorithm is effective in protecting routing paths from node compromise, whether detected or not.
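The routing idea lends itself to a compact sketch: treat each node's estimated compromise probability as a cost so that a shortest-path search detours around detected or high-risk nodes. The toy topology, the probabilities and the cost function below are illustrative assumptions, not the authors' algorithm.

```python
# A minimal sketch of compromise-aware routing: Dijkstra-style search where
# each node costs one hop plus a -log(survival probability) risk penalty,
# and nodes already detected as compromised are skipped outright.
import heapq
import math

edges = {"S": ["A", "B"], "A": ["C"], "B": ["C"], "C": ["D"], "D": []}
p_comp = {"S": 0.0, "A": 0.6, "B": 0.1, "C": 0.05, "D": 0.0}  # estimated compromise probability
flagged = set()                                               # nodes detected as compromised

def node_cost(n):
    return 1.0 - math.log(max(1.0 - p_comp[n], 1e-9))  # hop cost + risk penalty

def safest_path(src, dst):
    heap, best = [(0.0, src, [src])], {}
    while heap:
        cost, n, path = heapq.heappop(heap)
        if n == dst:
            return path, cost
        if n in best and best[n] <= cost:
            continue
        best[n] = cost
        for m in edges[n]:
            if m not in flagged:                 # hard detour around detected nodes
                heapq.heappush(heap, (cost + node_cost(m), m, path + [m]))
    return None, math.inf

print(safest_path("S", "D"))   # prefers S-B-C-D over the riskier S-A-C-D
```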
Abstract:
Childhood lead poisoning is a major consequence of environmental contamination at the local population level as it relates to environmental health and environmental engineering. Environmental contamination is one of the pressing environmental concerns facing the world today. Current approaches often focus on large, industrial-scale contaminated sites designated by regulatory agencies for remediation. Prior to this study, there were no known published studies conducted at smaller, local scales, such as neighborhoods, where much of the contamination requiring remediation is often found. An environmental health study of local lead-poisoning data showed that Liberty City, Little Haiti and eastern Little Havana in Miami-Dade County, Florida accounted for a disproportionately high number of the county's reported childhood lead poisoning cases. An engineering system was designed around a comprehensive risk management methodology that is distinctively applicable to the geographical and environmental conditions of Miami-Dade County, Florida. Furthermore, a scientific approach for interpreting environmental health concerns, incorporating detailed environmental engineering control measures and methods for site remediation of contaminated media, was developed for implementation. Test samples were obtained from residents and sites in those specific communities in Miami-Dade County, Florida (Gasana and Chamorro 2002). Lead currently lacks an oral assessment, an inhalation assessment and an oral slope factor, the variables required to run a quantitative risk assessment. However, various institutional controls from federal agencies' standards and regulations for lead-contaminated media yield adequate maximum concentration limits (MCLs); for this study, an MCL of 0.0015 mg/L was used. A risk management approach to lead-contaminated media demonstrates that linking environmental health and environmental engineering can yield a feasible solution.
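The screening step implied by the MCL can be illustrated with a short sketch that compares measured lead concentrations against the 0.0015 mg/L limit and flags exceedances; the sample values below are synthetic, not the study's data.

```python
# A minimal sketch of MCL screening: compare measured lead concentrations
# against the 0.0015 mg/L limit used in the study. Sample values are synthetic.
MCL_MG_PER_L = 0.0015

samples_mg_per_l = [0.0004, 0.0021, 0.0009, 0.0035, 0.0012]   # hypothetical site data

for i, c in enumerate(samples_mg_per_l, start=1):
    hq = c / MCL_MG_PER_L            # simple hazard quotient against the MCL
    status = "EXCEEDS MCL - flag for remediation" if hq > 1 else "below MCL"
    print(f"sample {i}: {c:.4f} mg/L (HQ = {hq:.2f}) -> {status}")

exceed = sum(c > MCL_MG_PER_L for c in samples_mg_per_l) / len(samples_mg_per_l)
print(f"{exceed:.0%} of samples exceed the MCL")
```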