868 results for Debugging in computer science.
Abstract:
Generating the automorphism group of a chemical graph is necessary in computer-aided structure elucidation. In this paper, an algorithm based on the all-path topological symmetry method is developed to build the automorphism group of a chemical graph. A comparison of several topological symmetry algorithms reveals that the all-path algorithm yields the correct symmetry classes of a chemical graph. This work lays a foundation for the ESESOC system for computer-aided structure elucidation.
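To make the idea concrete, here is a minimal Python sketch of an all-path symmetry classification, assuming the approach outlined in the abstract: atoms whose complete sets of simple-path label sequences coincide are assigned to the same topological equivalence class. The graph encoding and function names are illustrative; this is not the ESESOC implementation.

```python
def all_paths_from(graph, labels, start):
    """Enumerate element-label sequences of all simple paths starting at `start`."""
    paths = []
    def dfs(node, visited, seq):
        paths.append(tuple(seq))
        for nbr in graph[node]:
            if nbr not in visited:
                dfs(nbr, visited | {nbr}, seq + [labels[nbr]])
    dfs(start, {start}, [labels[start]])
    return tuple(sorted(paths))  # canonical all-path signature of the atom

def symmetry_classes(graph, labels):
    """Group atoms whose all-path signatures coincide."""
    classes = {}
    for v in graph:
        classes.setdefault(all_paths_from(graph, labels, v), []).append(v)
    return list(classes.values())

# Ethanol heavy-atom skeleton C-C-O: no two atoms are topologically
# equivalent, so each lands in its own class.
graph = {0: [1], 1: [0, 2], 2: [1]}
labels = {0: "C", 1: "C", 2: "O"}
print(symmetry_classes(graph, labels))  # [[0], [1], [2]]
```

On a symmetric molecule such as benzene, the same procedure would place all six carbons in a single class, from which automorphism generators can then be assembled.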
Abstract:
Control of machines that exhibit flexibility becomes important when designers attempt to push the state of the art with faster, lighter machines. Three steps are necessary for the control of a flexible plant. First, a good model of the plant must exist. Second, a good controller must be designed. Third, inputs to the controller must be constructed using knowledge of the system's dynamic response. There is a great deal of literature pertaining to modeling and control but little dealing with the shaping of system inputs. Chapter 2 examines two input shaping techniques based on frequency domain analysis. The first involves the use of the first derivative of a Gaussian exponential as a driving function template. The second, acausal filtering, involves removal of energy from the driving functions at the resonant frequencies of the system. Chapter 3 presents a linear programming technique for generating vibration-reducing driving functions for systems. Chapter 4 extends the results of the previous chapter by developing a direct solution for the new class of driving functions. A detailed analysis of the new technique is presented from five different perspectives and several extensions are presented. Chapter 5 verifies the theories of the previous two chapters with hardware experiments. Because the new technique resembles common signal filtering, Chapter 6 compares the new approach to eleven standard filters. The new technique will be shown to produce less residual vibration, better robustness to system parameter uncertainty, and lower computational cost than currently used shaping techniques.
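As an illustration of the first Chapter 2 technique, the sketch below builds a driving function from the first derivative of a Gaussian and checks how little spectral energy it leaves near an assumed 5 Hz plant resonance. The duration, time step, and sigma values are invented for the example.

```python
import numpy as np

def gaussian_derivative_input(duration, dt, sigma):
    """Velocity-like command: first derivative of a Gaussian centered mid-move."""
    t = np.arange(0.0, duration, dt)
    tc = duration / 2.0
    g = np.exp(-((t - tc) ** 2) / (2.0 * sigma ** 2))
    dg = -(t - tc) / sigma ** 2 * g       # d/dt of the Gaussian template
    return t, dg / np.max(np.abs(dg))     # normalize peak amplitude to 1

# Wider sigma in time means narrower bandwidth in frequency, so spectral
# content at the (assumed) 5 Hz resonance can be driven very low.
t, u = gaussian_derivative_input(duration=2.0, dt=0.001, sigma=0.25)
spectrum = np.abs(np.fft.rfft(u))
freqs = np.fft.rfftfreq(len(u), d=0.001)
rel = spectrum[np.argmin(np.abs(freqs - 5.0))] / spectrum.max()
print(f"relative spectral energy near 5 Hz: {rel:.3e}")
```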
Abstract:
"The Structure and Interpretation of Computer Programs" is the entry-level subject in Computer Science at the Massachusetts Institute of Technology. It is required of all students at MIT who major in Electrical Engineering or in Computer Science, as one fourth of the "common core curriculum," which also includes two subjects on circuits and linear systems and a subject on the design of digital systems. We have been involved in the development of this subject since 1978, and we have taught this material in its present form since the fall of 1980 to approximately 600 students each year. Most of these students have had little or no prior formal training in computation, although most have played with computers a bit and a few have had extensive programming or hardware design experience. Our design of this introductory Computer Science subject reflects two major concerns. First, we want to establish the idea that a computer language is not just a way of getting a computer to perform operations, but rather that it is a novel formal medium for expressing ideas about methodology. Thus, programs must be written for people to read, and only incidentally for machines to execute. Second, we believe that the essential material to be addressed by a subject at this level is not the syntax of particular programming language constructs, nor clever algorithms for computing particular functions efficiently, nor even the mathematical analysis of algorithms and the foundations of computing, but rather the techniques used to control the intellectual complexity of large software systems.
Abstract:
Meng, Q., & Lee, M. (2005). Novelty and Habituation: The Driving Forces in Early Stage Learning for Developmental Robotics. In Wermter, S., Palm, G., & Elshaw, M. (Eds.), Biomimetic Neural Learning for Intelligent Robots: Intelligent Systems, Cognitive Robotics, and Neuroscience (pp. 315-332). Lecture Notes in Computer Science. Springer Berlin Heidelberg.
Abstract:
Rowland, J. J. (2003). Generalisation and Model Selection in Supervised Learning with Evolutionary Computation. In European Workshop on Evolutionary Computation in Bioinformatics (EvoBio 2003), Lecture Notes in Computer Science, Vol. 2611 (pp. 119-130). Springer.
Abstract:
Existing work in Computer Science and Electronic Engineering demonstrates that Digital Signal Processing techniques can effectively identify the presence of stress in the speech signal. These techniques use datasets containing real or actual stress samples, i.e. real-life stress such as 911 calls. Studies that use simulated or laboratory-induced stress have been less successful and inconsistent. Pervasive, ubiquitous computing is increasingly moving towards voice-activated and voice-controlled systems and devices. Speech recognition and speaker identification algorithms will have to improve and take emotional speech into account. Modelling the influence of stress on speech and voice is of interest to researchers from many different disciplines including security, telecommunications, psychology, speech science, forensics and Human Computer Interaction (HCI). The aim of this work is to assess the impact of moderate stress on the speech signal. In order to do this, a dataset of laboratory-induced stress is required. While attempting to build this dataset it became apparent that reliably inducing measurable stress in a controlled environment, when speech is a requirement, is a challenging task. This work focuses on the use of a variety of stressors to elicit a stress response during tasks that involve speech content. Biosignal analysis (commercial Brain Computer Interfaces, eye tracking and skin resistance) is used to verify and quantify the stress response, if any. This thesis explains the basis of the author's hypotheses on the elicitation of affectively-toned speech and presents the results of several studies carried out throughout the PhD research period. These results show that the elicitation of stress, particularly the induction of affectively-toned speech, is not a simple matter and that many modulating factors influence the stress response process. A model is proposed to reflect the author's hypothesis on the emotional response pathways relating to the elicitation of stress with a required speech content. Finally, the author provides guidelines and recommendations for future research on speech under stress. Further research paths are identified and a roadmap for future research in this area is defined.
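As a flavor of the DSP side referenced above, the following sketch computes two per-frame features that the stress-detection literature commonly tracks, short-time energy and an autocorrelation-based pitch estimate; it is an illustration with synthetic input, not code from the thesis.

```python
import numpy as np

def frame_features(frame, sample_rate, fmin=60.0, fmax=400.0):
    """Return (short-time energy, estimated F0 in Hz) for one speech frame."""
    frame = frame - frame.mean()
    energy = float(np.sum(frame ** 2))
    # autocorrelation peak within the plausible pitch-lag range
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lo, hi = int(sample_rate / fmax), int(sample_rate / fmin)
    lag = lo + int(np.argmax(ac[lo:hi]))
    return energy, sample_rate / lag

# A synthetic 200 Hz "voiced" 32 ms frame at 16 kHz stands in for real speech.
sr = 16000
t = np.arange(0, 0.032, 1 / sr)
frame = np.sin(2 * np.pi * 200 * t)
print(frame_features(frame, sr))  # (energy, ~200.0 Hz)
```

Raised fundamental frequency and energy relative to a speaker's baseline are among the cues such systems correlate with stress, which is why reliable baseline and stressed recordings matter so much.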
Abstract:
The application of semantic technologies to the integration of biological data and the interoperability of bioinformatics analysis and visualization tools has been the common theme of a series of annual BioHackathons hosted in Japan for the past five years. Here we provide a review of the activities and outcomes from the BioHackathons held in 2011 in Kyoto and 2012 in Toyama. In order to efficiently implement semantic technologies in the life sciences, participants formed various sub-groups and worked on the following topics: Resource Description Framework (RDF) models for specific domains, text mining of the literature, ontology development, essential metadata for biological databases, platforms to enable efficient Semantic Web technology development and interoperability, and the development of applications for Semantic Web data. In this review, we briefly introduce the themes covered by these sub-groups. The observations made, conclusions drawn, and software development projects that emerged from these activities are discussed.
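For readers unfamiliar with the RDF models mentioned above, here is a minimal Python sketch using the rdflib library: a biological record expressed as triples and retrieved with SPARQL. The namespace URI and property names are placeholders, not vocabularies produced at the BioHackathons.

```python
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/bio/")  # placeholder namespace
g = Graph()
g.add((EX.P12345, RDF.type, EX.Protein))
g.add((EX.P12345, EX.organism, Literal("Homo sapiens")))
g.add((EX.P12345, EX.name, Literal("example kinase")))

# SPARQL gives uniform query access once heterogeneous sources share RDF.
results = g.query("""
    PREFIX ex: <http://example.org/bio/>
    SELECT ?protein ?name WHERE {
        ?protein a ex:Protein ;
                 ex:name ?name .
    }
""")
for row in results:
    print(row.protein, row.name)
```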
Abstract:
All biological phenomena depend on molecular recognition, which is either intermolecular, as in ligand binding to a macromolecule, or intramolecular, as in protein folding. As a result, understanding the relationship between the structure of proteins and the energetics of their stability and of their binding to other (bio)molecules is a central question in biochemistry and biotechnology. It is essential to the engineering of stable proteins and to the structure-based design of pharmaceutical ligands. The parameter generally used to characterize the stability of a system (the folded and unfolded states of a protein, for example) is the equilibrium constant (K) or the free energy (ΔG°), which combines an enthalpic (ΔH°) and an entropic (ΔS°) term via ΔG° = ΔH° - TΔS°. These parameters are temperature dependent through the heat capacity change (ΔCp). The thermodynamic parameters ΔH° and ΔCp can be derived from spectroscopic experiments, using the van't Hoff method, or measured directly using calorimetry. Along with isothermal titration calorimetry (ITC), differential scanning calorimetry (DSC) is a powerful method, less described than ITC, for directly measuring the thermodynamic parameters that characterize biomolecules. In this article, we summarize the principal thermodynamic parameters, describe the DSC approach and review some systems to which it has been applied. DSC is much used for the study of the stability and folding of biomolecules, but it can also be applied to the understanding of biomolecular interactions and can thus be an interesting technique in the process of drug design.
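The temperature dependence described above follows the standard relations ΔH(T) = ΔH(T₀) + ΔCp(T - T₀) and ΔS(T) = ΔS(T₀) + ΔCp ln(T/T₀). The sketch below evaluates them numerically; the protein-unfolding-like parameter values are illustrative, not taken from the article.

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def delta_g(T, T0, dH0, dS0, dCp):
    """deltaG(T) assuming a temperature-independent heat-capacity change."""
    dH = dH0 + dCp * (T - T0)            # deltaH(T)
    dS = dS0 + dCp * math.log(T / T0)    # deltaS(T)
    return dH - T * dS                   # deltaG = deltaH - T*deltaS

# Illustrative unfolding parameters at the reference temperature 298.15 K.
T0, dH0, dS0, dCp = 298.15, 400e3, 1.2e3, 8e3   # J/mol and J/(mol*K)
for T in (288.15, 298.15, 310.15, 330.15):
    dG = delta_g(T, T0, dH0, dS0, dCp)
    K = math.exp(-dG / (R * T))          # equilibrium constant from deltaG
    print(f"T={T:.2f} K  deltaG={dG / 1e3:7.1f} kJ/mol  K={K:.3g}")
```

A positive ΔG° of unfolding marks the folded state as stable; because ΔCp is large for proteins, ΔG° falls off on both sides of a maximum, which is why both heat and cold denaturation exist.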
Abstract:
In this paper, a framework is described for the modelling of granular material by employing Computational Fluid Dynamics (CFD). This is achieved through the implementation, within continuum theory, of constitutive relations that are derived in a granular dynamics framework and parametrise the particle interactions occurring at the micro-scale. The simulation of a process often encountered in industrial bulk solids handling plants involving granular matter (i.e. filling of a flat-bottomed bin with a binary material mixture through pneumatic conveying, emptying of the bin in core flow mode, and pneumatic conveying of the material coming out of the bin) is presented. The results of the presented simulation demonstrate the capability of the numerical model to successfully represent key granular processes (i.e. segregation/degradation), the prediction of which is of great importance in the process engineering industry.
Abstract:
Computational results for the microwave heating of a porous material are presented in this paper. Combined finite difference time domain and finite volume methods were used to solve equations that describe the electromagnetic field and heat and mass transfer in porous media. The coupling between the two schemes is through a change in dielectric properties which were assumed to be dependent both on temperature and moisture content. The model was able to reflect the evolution of temperature and moisture fields as the moisture in the porous medium evaporates. Moisture movement results from internal pressure gradients produced by the internal heating and phase change.
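A heavily simplified sketch of the coupling loop is given below, assuming (as the abstract states) that the dielectric loss depends on local temperature and moisture: each step converts the field intensity into volumetric heating, diffuses heat, and dries the material, which in a full model would feed back into the next electromagnetic solve. The 1D geometry, property values, and the prescribed field profile standing in for the FDTD solution are all invented for illustration.

```python
import numpy as np

nx, dt, steps = 50, 0.1, 200
T = np.full(nx, 20.0)            # temperature field, deg C
M = np.full(nx, 0.5)             # moisture content, kg/kg
E2 = np.linspace(1.0, 0.2, nx)   # prescribed |E|^2 profile (stand-in for FDTD)

def loss_factor(T, M):
    """Illustrative dielectric loss: grows with moisture, drifts with temperature."""
    return 0.1 + 0.8 * M - 0.001 * T

for _ in range(steps):
    q = 10.0 * loss_factor(T, M) * E2   # volumetric heating from the field
    # explicit heat conduction plus the microwave source term
    T[1:-1] += dt * (0.01 * (T[2:] - 2 * T[1:-1] + T[:-2]) + q[1:-1])
    # moisture evaporates where the material is hot enough
    M -= dt * 0.002 * np.clip(T - 80.0, 0.0, None) * M
    # a full model would now update the FDTD dielectric properties from (T, M)

print(f"max T = {T.max():.1f} C, min moisture = {M.min():.3f}")
```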
Abstract:
A comprehensive simulation of solidification/melting processes requires the simultaneous representation of free surface fluid flow, heat transfer, phase change, non-linear solid mechanics and, possibly, electromagnetics together with their interactions in what is now referred to as "multi-physics" simulation. A 3D computational procedure and software tool, PHYSICA, embedding the above multi-physics models using finite volume methods on unstructured meshes (FV-UM) has been developed. Multi-physics simulations are extremely compute intensive and a strategy to parallelise such codes has, therefore, been developed. This strategy has been applied to PHYSICA and evaluated on a range of challenging multi-physics problems drawn from actual industrial cases.
Abstract:
The aim of this paper is to develop a mathematical model with the ability to predict particle degradation during dilute-phase pneumatic conveying. A numerical procedure, based on a matrix representation of degradation processes, is presented to determine the particle impact degradation propensity from a small number of single-impact particle tests carried out in a newly designed laboratory-scale degradation tester. A complete model of particle degradation during dilute-phase pneumatic conveying is then described, in which the calculation of degradation propensity is coupled with a flow model of the solids and gas phases in the pipeline. Numerical results are presented for the degradation of granulated sugar in an industrial-scale pneumatic conveyor.
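The matrix representation mentioned above can be sketched compactly: a breakage matrix B maps the inlet particle-size distribution to the outlet distribution for one impact, and successive impacts along the pipeline compose as matrix powers. The three size classes and matrix entries below are invented for illustration; in the paper they would be fitted from the single-impact tests.

```python
import numpy as np

# B[i, j] = mass fraction moving from inlet size class j to outlet class i
# for one impact; each column sums to 1, so mass is conserved.
B = np.array([
    [0.80, 0.00, 0.00],   # coarse that survives intact
    [0.15, 0.85, 0.00],   # breakage into the medium class
    [0.05, 0.15, 1.00],   # fines accumulate and cannot break further
])

d_in = np.array([0.6, 0.3, 0.1])   # inlet mass fractions (coarse, medium, fine)

d_one = B @ d_in                                 # after a single impact
d_five = np.linalg.matrix_power(B, 5) @ d_in     # after five successive impacts
print("one impact:  ", np.round(d_one, 3))
print("five impacts:", np.round(d_five, 3))
```

Coupling this with the gas-solids flow model then amounts to deciding, from the predicted particle velocities at each bend, how many such impact events to apply and at what intensity.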
Abstract:
In this paper, we address the use of CBR in collaboration with numerical engineering models. This collaborative combination has a particular application in engineering domains where numerical models are used. We term this domain "Case Based Engineering" (CBE), and present the general architecture of a CBE system. We define and discuss the general characteristics of CBE and the special problems which arise. These are: the handling of engineering constraints of both continuous and nominal kinds; interpolation over both continuous and nominal variables; and conformability for interpolation. In order to illustrate the utility of the proposed method, and to provide practical examples of the general theory, the paper describes a practical application of the CBE architecture, known as CBE-CONVEYOR, which has been implemented by the authors. Pneumatic conveying is an important transportation technology in the bulk solids conveying industry. One of the major industry concerns is the attrition of powders and granules during pneumatic conveying. To minimize the fracture of particles during pneumatic conveying, engineers want to know what design parameters they should use in building a conveyor system. To do this, engineers often run simulations in a repetitive manner to find appropriate input parameters. CBE-CONVEYOR is shown to speed up conventional methods of searching for solutions, and to solve directly problems that would otherwise require considerable intervention from the engineer.
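A toy sketch of the retrieval step in such a system appears below: cases mix continuous design parameters, which can be compared by normalized distance and later interpolated, with nominal ones, which must match or incur a fixed mismatch penalty. The field names, weights, and case values are invented for the example, not taken from CBE-CONVEYOR.

```python
def distance(case, query, ranges, weights):
    """Weighted distance over mixed continuous and nominal attributes."""
    d = 0.0
    for key, w in weights.items():
        a, b = case[key], query[key]
        if isinstance(a, str):
            d += w * (0.0 if a == b else 1.0)      # nominal: 0/1 mismatch
        else:
            d += w * abs(a - b) / ranges[key]      # continuous: normalized
    return d

cases = [  # past simulation runs (hypothetical)
    {"air_velocity": 18.0, "pipe_bends": 4, "material": "sugar", "attrition": 0.12},
    {"air_velocity": 25.0, "pipe_bends": 6, "material": "sugar", "attrition": 0.31},
    {"air_velocity": 20.0, "pipe_bends": 5, "material": "alumina", "attrition": 0.22},
]
query = {"air_velocity": 19.0, "pipe_bends": 5, "material": "sugar"}
ranges = {"air_velocity": 10.0, "pipe_bends": 4}
weights = {"air_velocity": 1.0, "pipe_bends": 0.5, "material": 1.0}

best = min(cases, key=lambda c: distance(c, query, ranges, weights))
print("retrieved case:", best)   # seed for, or substitute of, a new simulation
```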
Abstract:
In this article, the representation of the merging process at the floor-stair interface is examined within a comprehensive evacuation model, and trends found in experimental data are compared with model predictions. The analysis suggests that the representation of floor-stair merging within the comprehensive model is consistent with trends observed within several published experiments on the merging process. In particular: (a) The floor flow rate onto the stairs decreases as the stair population density increases. (b) For a given stair population density, the floor population's flow rate onto the stairs can be maximized by connecting the floor to the landing adjacent to the incoming stair. (c) In situations where the floor is connected adjacent to the incoming stair, the merging process appears to be biased in favor of the floor population. It is further conjectured that when the floor is connected opposite the incoming stair, the merging process between the stair and floor streams is almost in balance at high stair population densities, with a slight bias in favor of the floor stream at low population densities. A key practical finding of this analysis is that the speed at which a floor can be emptied onto a stair can be enhanced simply by connecting the floor to the landing at a location adjacent to the incoming stair rather than opposite it. Configuring the stair in this way, while reducing the floor emptying time, results in a corresponding decrease in the descent flow rate of those already on the stairs. While this is expected to have a negligible impact on the overall time to evacuate the building, the evacuation time for those higher up in the building is extended while that for those on the lower floors is reduced. It is thus suggested that in high-rise buildings, floors should be connected to the landing on the opposite side to the incoming stair. Information of this type will allow engineers to better design stair-floor interfaces to meet specific design objectives.
Abstract:
Why a chapter on Perspectives and Integration in SOLAS Science in this book? SOLAS science by its nature deals with interactions that occur across a wide spectrum of time and space scales, that involve gases and particles exchanged between the ocean and the atmosphere, and that span many disciplines including chemistry, biology, optics, physics, mathematics, computing and socio-economics, consequently bringing together many different scientists and scientific generations. This chapter provides a guide through the remarkable diversity of cross-cutting approaches and tools in the gigantic puzzle of the SOLAS realm. Here we overview the existing prime components of atmospheric and oceanic observing systems, with the acquisition of ocean–atmosphere observables either in situ or from satellites, the rich hierarchy of models used to test our knowledge of Earth System functioning, and the tremendous efforts accomplished over the last decade within the COST Action 735 and SOLAS Integration project frameworks to understand, as best we can, the current physical and biogeochemical state of the atmosphere and ocean commons. A few SOLAS integrative studies illustrate the full meaning of interactions, paving the way for even tighter connections between thematic fields. Ultimately, SOLAS research will also develop with an enhanced consideration of societal demand while preserving fundamental research coherency.