932 results for computer system emulation, multiprocessors, educational computer systems
Abstract:
Reasoning systems have reached a high degree of maturity in the last decade. However, even the most successful systems are usually not general-purpose problem solvers but are typically specialised in problems of a certain domain. The MathWeb Software Bus (MathWeb-SB) is a system for combining reasoning specialists via a common software bus. We describe the integration of the lambda-clam system, a reasoning specialist for proofs by induction, into the MathWeb-SB. Due to this integration, lambda-clam now offers its theorem-proving expertise to other systems in the MathWeb-SB. On the other hand, lambda-clam can use the services of any reasoning specialist already integrated. We focus on the latter and describe first experiments on proving theorems by induction using the computational power of the MAPLE system within lambda-clam.
Abstract:
This thesis covers the challenges of creating and maintaining an introductory engineering laboratory. The history of the University of Illinois Electrical and Computer Engineering department's introductory course, ECE 110, is recounted. The current state of the course, as of Fall 2008, is discussed along with current challenges arising from the use of a hand-wired prototyping board with logic gates. A plan for overcoming these issues using a new microcontroller-based board with a pseudo hardware description language is discussed. The new microcontroller-based system implementation is extensively detailed, along with its new accompanying description language. The new system was tried in several sections of the Fall 2008 semester alongside the old system; the students' final performances with the two approaches are compared in terms of design, performance, complexity, and enjoyment. In its first run, the system shows great promise, increasing the students' enjoyment and improving the performance of their designs.
Abstract:
'Evolution of mylonitic microfabrics' (EMM) is an interactive FileMaker Pro 3.0 application that documents a series of see-through deformation experiments on polycrystalline norcamphor. The application comprises computer animations, graphics and text explanations designed to give students and researchers insight into the interaction and dynamic nature of small-scale mylonitic processes such as intracrystalline glide, dynamic recrystallization and strain localization (microshearing). EMM shows how mylonitic steady state is achieved at different strain rates and temperatures. First, rotational mechanisms like glide-induced vorticity, subgrain rotation recrystallization and rigid-body rotation bring grains' crystal lattices into orientations that are favorable for intracrystalline glide. In a second stage, selective elimination of grains whose lattices are poorly oriented for glide involves grain boundary migration. This strengthens the texture. Temperature and strain rate affect both the relative activity of different strain accommodation mechanisms and the rate of microfabric change. Steady-state microfabrics are characterized by stable texture, grain size and shape-preferred orientations of grains and domains. This steady state involves the cyclical generation and elimination of dynamically recrystallized grains and microshear zones.
Reservoir system analysis, conservation: Hydrologic Engineering Center computer program 23-J2-L253.
Abstract:
At head of cover title: Generalized computer program.
Abstract:
Part 6: Engineering and Implementation of Collaborative Networks
Abstract:
Physiological signals, which are controlled by the autonomic nervous system (ANS), can be used to detect the affective state of computer users and therefore find applications in medicine and engineering. The Pupil Diameter (PD) seems to provide a strong indication of the affective state, as found by previous research, but it has not yet been investigated fully. In this study, new approaches based on monitoring and processing the PD signal for off-line and on-line affective assessment ("relaxation" vs. "stress") are proposed. Wavelet denoising and Kalman filtering methods are first used to remove abrupt changes in the raw PD signal. Then three features (PDmean, PDmax and PDWalsh) are extracted from the preprocessed PD signal for affective state classification. In order to select more relevant and reliable physiological data for further analysis, two types of data selection methods are applied, based on the paired t-test and on subject self-evaluation, respectively. In addition, five different kinds of classifiers are implemented on the selected data, achieving average accuracies of up to 86.43% and 87.20%, respectively. Finally, the receiver operating characteristic (ROC) curve is used to investigate the discriminating potential of each individual feature by evaluating the area under the ROC curve, which reaches values above 0.90. For the on-line affective assessment, a hard threshold is first applied to remove eye blinks from the PD signal, and a moving average window is then used to obtain a representative value, PDr, for every one-second interval of PD. The on-line affective assessment algorithm consists of three main steps: preparation, feature-based decision voting and affective determination. The final results show accuracies of 72.30% and 73.55% for the data subsets chosen with the two data selection methods (paired t-test and subject self-evaluation, respectively). To further analyze the efficiency of affective recognition through the PD signal, the Galvanic Skin Response (GSR) was also monitored and processed. The highest affective assessment classification rate obtained from GSR processing is only 63.57% (based on the off-line processing algorithm). The overall results confirm that the PD signal should be considered one of the most powerful physiological signals to include in future automated real-time affective recognition systems, especially for detecting the "relaxation" vs. "stress" states.
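As a rough illustration of the on-line preprocessing described above, the sketch below (in Python, with an assumed 60 Hz sampling rate and an assumed blink threshold, neither of which is stated in the abstract) removes blink samples with a hard threshold, interpolates over the gaps, and averages each one-second window into a representative value PDr.

import numpy as np

def preprocess_pd(pd_signal, fs=60, blink_threshold=2.0):
    """Remove blink artifacts and compute per-second representative
    values (PDr), following the on-line scheme sketched in the abstract.

    pd_signal : raw pupil diameter samples (mm)
    fs        : sampling rate in Hz (assumed; not stated in the abstract)
    blink_threshold : samples below this are treated as blinks (assumed)
    """
    pd = np.asarray(pd_signal, dtype=float)

    # Hard threshold: during blinks the tracker reports near-zero
    # diameters, so drop those samples and interpolate over the gaps.
    valid = pd > blink_threshold
    pd = np.interp(np.arange(pd.size), np.flatnonzero(valid), pd[valid])

    # Moving-average window: one representative value PDr per second.
    n_seconds = pd.size // fs
    return pd[: n_seconds * fs].reshape(n_seconds, fs).mean(axis=1)

# Example: 10 s of synthetic PD data at 60 Hz with two blink dips.
rng = np.random.default_rng(0)
signal = 4.0 + 0.1 * rng.standard_normal(600)
signal[120:130] = 0.0   # simulated blink
signal[400:412] = 0.0   # simulated blink
print(preprocess_pd(signal))  # ten PDr values, one per second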
Abstract:
Research in human computer interaction (HCI) covers both technological and human behavioural concerns. As a consequence, contributions in HCI research tend to be oriented towards either engineering or the social sciences. In HCI, the purpose of practical research contributions is to reveal previously unknown insights about human behaviour and its relationship to technology. Practical research methods normally used in HCI include formal experiments, field experiments, field studies, interviews, focus groups, surveys, usability tests, case studies, diary studies, ethnography, contextual inquiry, experience sampling, and automated data collection. In this paper, we report on our experience using focus groups, surveys and interviews as evaluation methods, and on how we adapted these methods to develop artefacts: either interface designs or information and technological systems. Four projects exemplify the application of the different methods to gather information about users' wants, habits, practices, concerns and preferences. The goal was to build an understanding of the attitudes and satisfaction of the people who might interact with a technological artefact or information system. Conversely, we intended the designs of information systems and technological applications to promote resilience in organisations (a set of routines that allows recovery from setbacks) and positive user experiences. Organisations can also be viewed here from a systems approach, meaning that system perturbations, and even failures, can be characterised and addressed. The term resilience has been applied to everything from real estate to the economy, sports, events, business, psychology, and more. In this study, we highlight that resilience is also made up of a number of different skills and abilities (self-awareness, creating meaning from experiences, self-efficacy, optimism, and building strong relationships) that are foundational ingredients people should draw on while enhancing an organisation's resilience. Resilience enhances knowledge of the resources available to people confronting existing problems.
Abstract:
One of the most visionary goals of Artificial Intelligence is to create a system able to mimic and eventually surpass the intelligence observed in biological systems, including, most ambitiously, that of humans. The main distinctive strength of humans is their ability to build a deep understanding of the world by learning continuously and drawing from their experiences. This ability, found in varying degrees in all intelligent biological beings, allows them to adapt and react properly to changes by incrementally expanding and refining their knowledge. Arguably, achieving this ability is one of the main goals of Artificial Intelligence and a cornerstone towards the creation of intelligent artificial agents. Modern Deep Learning approaches have allowed researchers and industries to achieve great advances on many long-standing problems in areas like Computer Vision and Natural Language Processing. However, while this current age of renewed interest in AI has enabled extremely useful applications, concerningly little effort is being directed towards the design of systems able to learn continuously. The biggest obstacle that hinders an AI system from learning incrementally is the catastrophic forgetting phenomenon. This phenomenon, discovered in the 1990s, naturally occurs in Deep Learning architectures when classic learning paradigms are applied to learning incrementally from a stream of experiences. This dissertation revolves around the field of Continual Learning, a sub-field of Machine Learning research that has recently made a comeback following the renewed interest in Deep Learning approaches. The work takes a comprehensive view of continual learning, considering algorithmic, benchmarking, and applicative aspects of the field. It also touches on community aspects such as the design and creation of research tools aimed at supporting Continual Learning research, and the theoretical and practical aspects of public competitions in this field.
Abstract:
Vision systems are powerful tools playing an increasingly important role in modern industry, detecting errors and maintaining product standards. With the growing availability of affordable industrial cameras, computer vision algorithms have been increasingly applied to the monitoring of industrial manufacturing processes. Until a few years ago, industrial computer vision applications relied only on ad-hoc algorithms designed for the specific object and acquisition setup being monitored, with a strong focus on co-designing the acquisition and processing pipeline. Deep learning has overcome these limits, providing greater flexibility and faster re-configuration. In this work, the process to be inspected is the formation of packs of vials entering a freeze-dryer, a common scenario in pharmaceutical active ingredient packaging lines. To ensure that the machine produces proper packs, a vision system is installed at the entrance of the freeze-dryer to detect possible anomalies, with execution times compatible with the production specifications. Other constraints come from the sterility and safety standards required in pharmaceutical manufacturing. This work presents an overview of the production line, with particular focus on the vision system designed, and of all the trials conducted to reach the final performance. Transfer learning, which alleviates the need for large amounts of training data, combined with data augmentation methods based on the generation of synthetic images, was used to effectively increase performance while reducing the cost of data acquisition and annotation. The proposed vision algorithm is composed of two main subtasks, designed respectively for vial counting and discrepancy detection. The first was trained on more than 23k vials (about 300 images) and tested on 5k more (about 75 images), whereas 60 training images and 52 testing images were used for the second.
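As a rough sketch of a transfer-learning setup for the counting subtask (the abstract names neither the framework nor the backbone; a PyTorch ResNet-18 with a frozen feature extractor and a regression head is assumed here purely for illustration, and requires a recent torchvision):

import torch
import torch.nn as nn
from torchvision import models

# Assumed setup: an ImageNet-pretrained ResNet-18 whose classification
# head is replaced by a single regression output predicting the number
# of vials in an image. Freezing the backbone is what reduces the
# amount of annotated data needed, as the abstract describes.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in model.parameters():                # freeze pretrained features
    param.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, 1)   # trainable counting head

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

def train_step(images, counts):
    """One optimisation step on a batch of (image, vial-count) pairs."""
    optimizer.zero_grad()
    preds = model(images).squeeze(1)
    loss = loss_fn(preds, counts)
    loss.backward()
    optimizer.step()
    return loss.item()

# Dummy batch: 4 RGB images, 224x224, with hypothetical vial counts.
images = torch.randn(4, 3, 224, 224)
counts = torch.tensor([310.0, 295.0, 305.0, 300.0])
print(train_step(images, counts))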
Abstract:
High-throughput screening of physical, genetic and chemical-genetic interactions brings important perspectives to the Systems Biology field, as the analysis of these interactions provides new insights into protein/gene function, cellular metabolic variations and the validation of therapeutic targets and drug design. However, such analysis depends on a pipeline connecting different tools that can automatically integrate data from diverse sources and produce a more comprehensive dataset that can be properly interpreted. We describe here the Integrated Interactome System (IIS), an integrative platform with a web-based interface for the annotation, analysis and visualization of the interaction profiles of proteins/genes, metabolites and drugs of interest. IIS works in four connected modules: (i) the Submission module, which receives raw data derived from Sanger sequencing (e.g. two-hybrid system); (ii) the Search module, which enables the user to search for the processed reads to be assembled into contigs/singlets, or for lists of proteins/genes, metabolites and drugs of interest, and add them to the project; (iii) the Annotation module, which assigns annotations from several databases to the contigs/singlets or lists of proteins/genes, generating tables with automatic annotation that can be manually curated; and (iv) the Interactome module, which maps the contigs/singlets or the uploaded lists to entries in our integrated database, building networks that gather newly identified interactions, protein and metabolite expression/concentration levels, subcellular localization, computed topological metrics, GO biological process and KEGG pathway enrichment. This module generates an XGMML file that can be imported into Cytoscape or visualized directly on the web. We developed IIS by integrating diverse databases in response to the need for appropriate tools for a systematic analysis of physical, genetic and chemical-genetic interactions. IIS was validated with yeast two-hybrid, proteomics and metabolomics datasets, but it is also extendable to other datasets. IIS is freely available online at: http://www.lge.ibi.unicamp.br/lnbio/IIS/.
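To illustrate the kind of topological metrics the Interactome module computes, the following sketch builds a small hypothetical interaction network with networkx (the gene names and edges are invented; IIS's internal schema is not public) and exports it in a Cytoscape-readable format:

import networkx as nx

# Hypothetical protein-protein interaction edges; IIS builds such
# networks from its integrated database.
interactions = [
    ("YFG1", "YFG2"), ("YFG1", "YFG3"), ("YFG2", "YFG3"),
    ("YFG3", "YFG4"), ("YFG4", "YFG5"),
]
g = nx.Graph(interactions)

# Topological metrics of the kind the Interactome module reports.
degree = dict(g.degree())
betweenness = nx.betweenness_centrality(g)
clustering = nx.clustering(g)

for node in g.nodes:
    print(node, degree[node], round(betweenness[node], 3), clustering[node])

# Export for Cytoscape. networkx writes GraphML, which Cytoscape
# imports directly; IIS itself emits XGMML.
nx.write_graphml(g, "interactome_sketch.graphml")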
Abstract:
This paper proposes an architecture for machining process and production monitoring to be applied in machine tools with open computer numerical control (CNC). A brief description of the advantages of using open CNC for machining process and production monitoring is presented, with an emphasis on the CNC architecture using a personal computer (PC)-based human-machine interface. The proposed architecture uses the CNC data and sensors to gather information about the machining process and production. It allows the development of different levels of monitoring systems with minimum investment, minimum need for sensor installation, and low intrusiveness to the process. Successful examples of the utilization of this architecture in a laboratory environment are briefly described. In conclusion, it is shown that a wide range of monitoring solutions can be implemented in production processes using the proposed architecture.
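A minimal sketch of the monitoring idea, assuming the open CNC's PC-based interface exposes a generic accessor for internal variables (the accessor and the variables polled are placeholders, not taken from the paper):

import random
import time
from dataclasses import dataclass

@dataclass
class Sample:
    timestamp: float
    spindle_load: float   # % of rated load, read from the CNC
    feed_rate: float      # mm/min, read from the CNC

def monitor(read_variable, n_samples=50, load_limit=90.0, period_s=0.1):
    """Poll CNC-internal data and flag spindle overload, illustrating a
    monitoring level that needs no extra sensor installation."""
    for _ in range(n_samples):
        s = Sample(time.time(),
                   read_variable("spindle_load"),
                   read_variable("feed_rate"))
        if s.spindle_load > load_limit:
            print(f"{s.timestamp:.2f}: spindle overload ({s.spindle_load:.1f}%)")
        time.sleep(period_s)

# Stub standing in for the real CNC accessor, for demonstration only.
monitor(lambda name: random.uniform(0, 100) if name == "spindle_load" else 1200.0)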
Abstract:
Several experimental studies have altered the phase relationship between photic and non-photic environmental 24 h cycles (zeitgebers) in order to assess their role in the synchronization of circadian rhythms. To assist in the interpretation of the complex activity patterns that emerge from these "conflicting zeitgeber" protocols, we present computer simulations of coupled circadian oscillators forced by two independent zeitgebers. This circadian system configuration was first employed by Pittendrigh and Bruce (1959) to model their studies of the light and temperature entrainment of the eclosion oscillator in Drosophila. Whereas most recent experiments have restricted conflicting zeitgeber experiments to two experimental conditions, comparing circadian oscillator phases under two distinct phase relationships between zeitgebers (usually 0 and 12 h), Pittendrigh and Bruce compared eclosion phase under 12 distinct phase relationships spanning the 24 h interval. Our simulations using non-linear differential equations replicated complex non-linear phenomena, such as "phase jumps" and sudden switches in zeitgeber preference, which had previously been difficult to interpret. They reveal that these phenomena generally arise when inter-oscillator coupling is high relative to zeitgeber strength. Manipulations of the structural symmetry of the model indicated that these results can be expected to apply to a wide range of system configurations. Finally, our studies recommend the use of the complete protocol employed by Pittendrigh and Bruce, because different system configurations can generate similar results when a "conflicting zeitgeber" experiment incorporates only two phase relationships between zeitgebers.
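A minimal sketch of such a "conflicting zeitgeber" simulation, using two coupled phase oscillators, each forced by its own 24 h zeitgeber, with the inter-zeitgeber shift swept across the full cycle as in Pittendrigh and Bruce (the phase-oscillator form is an assumption for illustration; the paper's exact non-linear equations are not given in the abstract):

import numpy as np
from scipy.integrate import solve_ivp

OMEGA = 2 * np.pi / 24.0          # zeitgeber angular frequency (rad/h)

def rhs(t, theta, k, z1, z2, phi):
    """Two mutually coupled phase oscillators (coupling k), each forced
    by its own zeitgeber; phi is the phase relation between zeitgebers."""
    th1, th2 = theta
    d1 = 2*np.pi/23.5 + k*np.sin(th2 - th1) + z1*np.sin(OMEGA*t - th1)
    d2 = 2*np.pi/24.5 + k*np.sin(th1 - th2) + z2*np.sin(OMEGA*t + phi - th2)
    return [d1, d2]

def steady_phase(phi, k=0.5, z1=0.2, z2=0.2):
    """Phase of oscillator 1 relative to zeitgeber 1 (hours) after a
    long transient. High k relative to z1, z2 favours phase jumps."""
    sol = solve_ivp(rhs, (0, 24*60), [0.0, 0.0],
                    args=(k, z1, z2, phi), max_step=0.1)
    t, th1 = sol.t[-1], sol.y[0, -1]
    return np.angle(np.exp(1j*(th1 - OMEGA*t))) * 24 / (2*np.pi)

# 12 phase relationships spanning 24 h, as in Pittendrigh and Bruce.
for shift_h in range(0, 24, 2):
    phi = 2*np.pi*shift_h/24
    print(f"zeitgeber shift {shift_h:2d} h -> phase {steady_phase(phi):+5.2f} h")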
Abstract:
In this paper, artificial neural networks are employed in a novel approach to identify harmonic components of single-phase nonlinear load currents, whose amplitude and phase angle are subject to unpredictable changes, even in steady state. The first six harmonic current components are identified through variation analysis of waveform characteristics. The effectiveness of this method is tested by applying it to the model of a single-phase active power filter dedicated to the selective compensation of harmonic current drained by an AC controller. Simulation and experimental results are presented to validate the proposed approach.
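As a reference point for what the neural network must learn, the sketch below computes the amplitude and phase of the first six harmonics of a synthetic nonlinear-load current over one fundamental cycle via the DFT (the sampling rate and the test waveform are assumptions, not taken from the paper):

import numpy as np

def harmonic_components(i_samples, f0=60.0, fs=7680.0, n_harmonics=6):
    """Amplitude and phase of the first six harmonics of a current
    waveform over one fundamental cycle, via the DFT. The paper obtains
    these targets with a neural network analysing waveform
    characteristics; the DFT here is only the reference computation."""
    n = int(round(fs / f0))                    # samples per fundamental cycle
    spectrum = np.fft.rfft(i_samples[:n]) / n * 2
    amps = np.abs(spectrum[1:n_harmonics+1])
    phases = np.angle(spectrum[1:n_harmonics+1])
    return amps, phases

# Synthetic nonlinear-load current: fundamental plus 3rd and 5th harmonics.
fs, f0 = 7680.0, 60.0
t = np.arange(0, 1/f0, 1/fs)
i = (10*np.sin(2*np.pi*f0*t)
     + 3*np.sin(2*np.pi*3*f0*t + 0.5)
     + 1.5*np.sin(2*np.pi*5*f0*t))
amps, phases = harmonic_components(i, f0, fs)
for h, (a, p) in enumerate(zip(amps, phases), start=1):
    print(f"h{h}: |I| = {a:5.2f} A, angle = {p:+5.2f} rad")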
Abstract:
A fuzzy control strategy for voltage regulation in electric power distribution systems is introduced in this article. This real-time controller acts on power transformers equipped with under-load tap changers. The fuzzy system is employed to turn the voltage-control relays into adaptive devices. The scope of the present study is limited to the power distribution substation, and both the voltage measurements and the control actions are carried out on the secondary bus. The capacity of fuzzy systems to handle approximate data, together with their unique ability to interpret qualitative information, makes it possible to design voltage control strategies that satisfy both the requirements of the Brazilian regulatory bodies and the real concerns of the electric power distribution companies. A prototype based on the proposed fuzzy control strategy was also implemented for validation purposes, and its experimental results were highly satisfactory.
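A minimal Mamdani-style sketch of the idea, mapping the secondary-bus voltage deviation to a tap-changer command (the membership functions and rule base below are illustrative placeholders, not the ones designed in the paper):

import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with vertices a <= b <= c."""
    return np.maximum(np.minimum((x - a)/(b - a), (c - x)/(c - b)), 0.0)

def tap_command(dv):
    """Map voltage deviation (pu) to a tap-changer command in [-1, 1]."""
    low  = tri(dv, -0.10, -0.05, 0.0)    # voltage below band
    ok   = tri(dv, -0.02,  0.0,  0.02)   # voltage within band
    high = tri(dv,  0.0,   0.05, 0.10)   # voltage above band
    # Rules: low -> raise tap (+1), ok -> hold (0), high -> lower tap (-1).
    weights = np.array([low, ok, high])
    outputs = np.array([+1.0, 0.0, -1.0])
    if weights.sum() == 0:
        return 0.0
    return float((weights * outputs).sum() / weights.sum())  # centroid defuzzification

for dv in (-0.06, -0.01, 0.0, 0.03):
    print(f"deviation {dv:+.2f} pu -> tap command {tap_command(dv):+.2f}")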
Abstract:
The design of supplementary damping controllers to mitigate the effects of electromechanical oscillations in power systems is a highly complex and time-consuming process, which requires a significant amount of knowledge on the part of the designer. In this study, the authors propose an automatic technique that takes the burden of tuning the controller parameters away from the power engineer and places it on the computer. Unlike other approaches that do the same based on robust control theories or evolutionary computing techniques, the proposed procedure uses an optimisation algorithm that works over a formulation of the classical tuning problem in terms of bilinear matrix inequalities. Using this formulation, it is possible to apply linear matrix inequality solvers to find a solution to the tuning problem via an iterative process, with the advantage that these solvers are widely available and have well-known convergence properties. The proposed algorithm is applied to tune the parameters of supplementary controllers for thyristor-controlled series capacitors placed in the New England/New York benchmark test system, aiming at the improvement of the damping factor of inter-area modes under several different operating conditions. The results of the linear analysis are validated by non-linear simulation and demonstrate the effectiveness of the proposed procedure.
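A minimal sketch of the kind of bilinear formulation involved (the abstract does not state the paper's exact constraints, so this is the textbook decay-rate variant): with a static output-feedback gain $K$ and closed-loop matrix $A + BKC$, one seeks

\[
P = P^{\top} \succ 0, \qquad (A + BKC)^{\top} P + P\,(A + BKC) + 2\alpha P \prec 0,
\]

which pushes every closed-loop mode to the left of $-\alpha$. The product of $P$ with $K$ makes the second inequality bilinear in the unknowns $(P, K)$: fixing $K$ turns it into a linear matrix inequality in $P$, and fixing $P$ turns it into one in $K$, so alternating between the two sub-problems lets standard LMI solvers carry out the iterative tuning the abstract refers to. A minimum damping factor, rather than a decay rate, is enforced analogously by replacing the vertical shift $\alpha$ with a conic-sector region constraint.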