986 results for Scilab electronics toolbox
Abstract:
In this paper, we describe the Vannotea system - an application designed to enable collaborating groups to discuss and annotate collections of high-quality images, video, audio or 3D objects. The system has been designed specifically to capture and share scholarly discourse and annotations about multimedia research data by teams of trusted colleagues within a research or academic environment. As such, it provides: authenticated access to a web browser search interface for discovering and retrieving media objects; a media replay window that can incorporate a variety of embedded plug-ins to render different scientific media formats; an annotation authoring, editing, searching and browsing tool; and session logging and replay capabilities. Annotations are personal remarks, interpretations, questions or references that can be attached to whole files, segments or regions. Vannotea enables annotations to be attached either synchronously (using Jabber message passing and audio/video conferencing) or asynchronously and stand-alone. The annotations are stored on an Annotea server, extended for multimedia content. Their access, retrieval and re-use are controlled via Shibboleth identity management and XACML access policies.
Abstract:
There are many techniques for electricity market price forecasting. However, most of them are designed for expected-price analysis rather than price spike forecasting. An effective method for predicting the occurrence of spikes has not yet appeared in the literature. In this paper, a data mining based approach is presented to give a reliable forecast of the occurrence of price spikes. Combined with the spike value prediction techniques developed by the same authors, the proposed approach aims at providing a comprehensive tool for price spike forecasting. Feature selection techniques are first described to identify the attributes relevant to the occurrence of spikes. A brief introduction to the classification techniques is given for completeness. Two algorithms, the support vector machine and a probability classifier, are chosen as the spike occurrence predictors and are discussed in detail. Realistic market data are used to test the proposed model, with promising results.
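The abstract gives no implementation; the following is a minimal, self-contained sketch of a "probability classifier" for spike occurrence in the spirit described above, using a Gaussian naive-Bayes model. The feature names and numbers are invented for illustration and are not the authors' dataset or algorithm.

```python
import math

def fit_gaussian_nb(X, y):
    """Fit a two-class Gaussian naive-Bayes 'probability classifier':
    per-class mean/variance for each feature plus class priors."""
    model = {}
    for cls in (0, 1):
        rows = [x for x, label in zip(X, y) if label == cls]
        n = len(rows)
        means = [sum(col) / n for col in zip(*rows)]
        variances = [max(sum((v - m) ** 2 for v in col) / n, 1e-9)
                     for col, m in zip(zip(*rows), means)]
        model[cls] = (means, variances, n / len(X))
    return model

def spike_probability(model, x):
    """Posterior probability that feature vector x precedes a price spike."""
    scores = {}
    for cls, (means, variances, prior) in model.items():
        log_p = math.log(prior)
        for v, m, var in zip(x, means, variances):
            log_p += -0.5 * math.log(2 * math.pi * var) - (v - m) ** 2 / (2 * var)
        scores[cls] = log_p
    peak = max(scores.values())
    exp = {c: math.exp(s - peak) for c, s in scores.items()}
    return exp[1] / sum(exp.values())

# Toy training set: [normalized demand, lagged price]; label 1 = spike occurred.
X = [[0.2, 30], [0.3, 35], [0.4, 40], [0.9, 90], [0.95, 120], [0.85, 100]]
y = [0, 0, 0, 1, 1, 1]
model = fit_gaussian_nb(X, y)
print(spike_probability(model, [0.9, 110]) > 0.5)   # high-demand hour -> True
print(spike_probability(model, [0.25, 32]) > 0.5)   # normal hour -> False
```

A real spike predictor would, as the abstract notes, first apply feature selection to pick the relevant attributes; here the two features are simply assumed.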
Abstract:
The restructuring of the power industry has brought fundamental changes to both power system operation and planning. This paper presents a new planning method using a multi-objective optimization (MOOP) technique, as well as human knowledge, to expand the transmission network in open access schemes. The method starts with a candidate pool of feasible expansion plans. Subsequent selection of the best candidates is carried out through a MOOP approach, in which multiple objectives are tackled simultaneously, aiming at integrating market operation and planning as one unified process in the context of a deregulated system. Human knowledge is applied in both stages to ensure that the selection reflects practical engineering and management concerns. The expansion plan from MOOP is assessed against reliability criteria before it is finalized. The proposed method has been tested on the IEEE 14-bus system, and relevant analyses and discussions are presented.
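As an illustrative sketch of the multi-objective selection step (not the paper's actual MOOP formulation): with two minimized objectives, say investment cost and expected unserved energy, the candidate pool can be screened for non-dominated (Pareto-optimal) plans. The plan names and numbers below are invented.

```python
def pareto_front(plans):
    """Return the non-dominated expansion plans. Each plan is
    (name, cost, expected_unserved_energy); both objectives are minimized,
    mirroring the simultaneous treatment of objectives in MOOP."""
    front = []
    for name, c, u in plans:
        dominated = any(c2 <= c and u2 <= u and (c2 < c or u2 < u)
                        for _, c2, u2 in plans)
        if not dominated:
            front.append(name)
    return front

candidates = [
    ("plan-A", 120, 0.9),   # cheap but less reliable
    ("plan-B", 150, 0.4),   # balanced
    ("plan-C", 200, 0.35),  # expensive, marginal reliability gain
    ("plan-D", 160, 0.8),   # dominated by plan-B on both objectives
]
print(pareto_front(candidates))  # → ['plan-A', 'plan-B', 'plan-C']
```

Human knowledge would then narrow the surviving front using engineering and management concerns, as the abstract describes.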
Abstract:
The BR algorithm is a novel and efficient method for finding all eigenvalues of upper Hessenberg matrices, and it has not previously been applied to eigenanalysis for power system small signal stability. This paper analyzes differences between the BR and the QR algorithms, comparing performance in terms of CPU time (based on stopping criteria) and storage requirements. The BR algorithm uses accelerating strategies to improve its performance when computing eigenvalues of narrowly banded, nearly tridiagonal upper Hessenberg matrices. These strategies significantly reduce the computation time at a reasonable level of precision. Compared with the QR algorithm, the BR algorithm requires fewer iteration steps and less storage space without sacrificing precision in solving eigenvalue problems of large-scale power systems. Numerical examples demonstrate the efficiency of the BR algorithm in eigenanalysis of 39-, 68-, 115-, 300-, and 600-bus systems. The experimental results suggest that the BR algorithm is the more efficient algorithm for large-scale power system small signal stability eigenanalysis.
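The BR iteration itself is intricate; as a hedged illustration of what such eigenanalysis produces, the sketch below computes the eigenvalues of a 2x2 linearized swing-equation state matrix analytically. The machine parameters are illustrative, not taken from the paper; small-signal stability shows up as eigenvalues with negative real parts.

```python
import cmath

def eig2(a, b, c, d):
    """Eigenvalues of a 2x2 matrix [[a, b], [c, d]] from its
    characteristic polynomial: lambda^2 - (a+d)*lambda + (a*d - b*c)."""
    tr, det = a + d, a * d - b * c
    disc = cmath.sqrt(tr * tr - 4 * det)
    return (tr + disc) / 2, (tr - disc) / 2

# Linearized swing equation of one machine against an infinite bus,
# states [delta, omega] (illustrative inertia H, damping D, stiffness Ks).
H, D, Ks, ws = 3.5, 2.0, 1.2, 2 * cmath.pi * 60
A = [[0.0, ws], [-Ks / (2 * H), -D / (2 * H)]]
lam1, lam2 = eig2(A[0][0], A[0][1], A[1][0], A[1][1])
# Negative real parts indicate a damped (small-signal stable) mode.
print(lam1.real < 0 and lam2.real < 0)  # → True
```

For realistic system sizes the state matrix is large and nearly Hessenberg after reduction, which is where algorithms such as QR and BR come in.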
Abstract:
This paper presents an overview of the MPEG-7 Description Definition Language (DDL). The DDL provides the syntactic rules for creating, combining, extending and refining MPEG-7 Descriptors (Ds) and Description Schemes (DSs). In the interests of interoperability, the W3C's XML Schema language, with the addition of certain MPEG-7-specific extensions, has been chosen as the DDL. This paper describes the background to this decision and, using examples, provides an overview of the core XML Schema features used within MPEG-7 and the extensions made in order to satisfy the MPEG-7 DDL requirements.
Abstract:
An inverse methodology is described to assist in the design of radio-frequency (RF) coils for magnetic resonance imaging (MRI) applications. The time-harmonic electromagnetic Green's functions are used to calculate the currents on the coil and shield cylinders that will generate a specified internal magnetic field. Stream function techniques and the method of moments are then used to realize this theoretical current density as an RF coil. A novel asymmetric coil for a 4.5 T MRI machine was designed and constructed using this methodology, and the results are presented.
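Inverse (target-field) design inverts a forward field calculation; as a hedged reminder of the forward side, the sketch below evaluates the textbook Biot-Savart expression for the on-axis field of a single circular current loop. The current and radius are illustrative and unrelated to the paper's coil.

```python
import math

MU0 = 4 * math.pi * 1e-7  # vacuum permeability, T*m/A

def loop_axial_field(current, radius, z):
    """On-axis magnetic flux density of a circular current loop
    (Biot-Savart result): B = mu0*I*a^2 / (2*(a^2 + z^2)^(3/2))."""
    return MU0 * current * radius ** 2 / (2 * (radius ** 2 + z ** 2) ** 1.5)

# At the loop centre (z = 0) the expression reduces to mu0*I/(2a).
b_centre = loop_axial_field(current=100.0, radius=0.3, z=0.0)
print(abs(b_centre - MU0 * 100.0 / (2 * 0.3)) < 1e-12)  # → True
```

An inverse method works the other way: it starts from a specified internal field and solves for the surface current density that produces it, which stream functions then turn into conductor paths.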
Abstract:
This paper is concerned with the use of scientific visualization methods for the analysis of feedforward neural networks (NNs). Inevitably, the kinds of data associated with the design and implementation of neural networks are of very high dimensionality, presenting a major challenge for visualization. A method is described using the well-known statistical technique of principal component analysis (PCA). This is found to be an effective and useful method of visualizing the learning trajectories of many learning algorithms such as back-propagation and can also be used to provide insight into the learning process and the nature of the error surface.
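A minimal sketch of the PCA idea, restricted to 2-D data so the covariance eigenproblem can be solved analytically: snapshots of network weights along a toy "learning trajectory" are reduced to their first principal axis. The trajectory here is synthetic, not from any real training run.

```python
import math

def principal_axis(points):
    """First principal component (unit vector) of 2-D points via the
    analytic eigen-decomposition of the 2x2 covariance matrix."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / n
    syy = sum((p[1] - my) ** 2 for p in points) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    # Largest eigenvalue of [[sxx, sxy], [sxy, syy]].
    lam = 0.5 * (sxx + syy + math.hypot(sxx - syy, 2 * sxy))
    vx, vy = sxy, lam - sxx          # eigenvector for that eigenvalue
    if vx == 0 and vy == 0:          # covariance already axis-aligned
        vx, vy = (1.0, 0.0) if sxx >= syy else (0.0, 1.0)
    norm = math.hypot(vx, vy)
    return vx / norm, vy / norm

# Toy "learning trajectory": weight snapshots drifting along y = 2x
# with a small alternating perturbation.
trajectory = [(t * 0.1, t * 0.2 + 0.01 * (-1) ** t) for t in range(20)]
ax, ay = principal_axis(trajectory)
print(abs(ay / ax - 2.0) < 0.1)  # direction close to slope 2 → True
```

For a real network the weight vectors have thousands of dimensions; PCA finds the few directions along which the trajectory actually moves, which is what makes the visualization tractable.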
Abstract:
We are currently in the midst of a second quantum revolution. The first quantum revolution gave us new rules that govern physical reality. The second quantum revolution will take these rules and use them to develop new technologies. In this review we discuss the principles upon which quantum technology is based and the tools required to develop it. We discuss a number of examples of research programs that could deliver quantum technologies in coming decades including: quantum information technology, quantum electromechanical systems, coherent quantum electronics, quantum optics and coherent matter technology.
Abstract:
We describe a method by which the decoherence time of a solid-state qubit may be measured. The qubit is coded in the orbital degree of freedom of a single electron bound to a pair of donor impurities in a semiconductor host. The qubit is manipulated by adiabatically varying an external electric field. We show that by measuring the total probability of a successful qubit rotation as a function of the control field parameters, the decoherence rate may be determined. We estimate various system parameters, including the decoherence rates due to electromagnetic fluctuations and acoustic phonons. We find that, for reasonable physical parameters, the experiment is possible with existing technology. In particular, the use of adiabatic control fields implies that the experiment can be performed with control electronics with a time resolution of tens of nanoseconds.
Magnetic Investigation of CoFe2O4 Nanoparticles Supported in Biocompatible Polymeric Microsphere
Abstract:
Magnetic investigation of spinel ferrite nanoparticles dispersed in biocompatible polymeric microspheres is reported in this study. X-ray diffraction data analysis confirms the presence of nanosized CoFe2O4 particles (mean size of ~8 nm). This finding is corroborated by transmission electron microscopy micrographs. Magnetization isotherms suggest a spin disorder likely occurring at the nanoparticle's surface. The saturation magnetization value is used to estimate a particle concentration of 1.6 x 10^18 cm^-3 dispersed in the polymeric template. A T^(1/2) dependence of the coercive field is determined in the low-temperature region (T < 30 K). The model of non-interacting mono-domains is used to estimate an effective magnetic anisotropy of K_eff = 0.6 x 10^5 J/m^3. The K_eff value we found is lower than the value reported for spherically-shaped CoFe2O4 nanoparticles, though consistent with the low coercive field observed in the investigated sample.
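As a hedged numerical illustration of the non-interacting mono-domain picture (standard Neel/Stoner-Wohlfarth relations, not the paper's fitting procedure): the blocking temperature follows T_B = K_eff*V/(25*k_B) for typical measurement times, and below T_B the coercive field follows the T^(1/2) law Hc(T) = Hc0*(1 - (T/T_B)^(1/2)). The Hc0 value and the factor 25 are illustrative assumptions.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def blocking_temperature(k_eff, diameter_nm, tau_factor=25.0):
    """Blocking temperature of a non-interacting mono-domain particle:
    T_B = K_eff * V / (tau_factor * k_B), with tau_factor ~ 25 from
    ln(t_m / tau_0) for typical measurement times."""
    volume = (math.pi / 6) * (diameter_nm * 1e-9) ** 3  # sphere, m^3
    return k_eff * volume / (tau_factor * K_B)

def coercive_field(hc0, temperature, t_blocking):
    """T^(1/2) law for the coercive field below T_B:
    Hc(T) = Hc0 * (1 - (T / T_B)^(1/2))."""
    if temperature >= t_blocking:
        return 0.0
    return hc0 * (1 - math.sqrt(temperature / t_blocking))

# Values in the spirit of the abstract: K_eff = 0.6e5 J/m^3, d ~ 8 nm.
t_b = blocking_temperature(0.6e5, 8.0)
hc_5k = coercive_field(hc0=1.0, temperature=5.0, t_blocking=t_b)
hc_20k = coercive_field(hc0=1.0, temperature=20.0, t_blocking=t_b)
print(hc_5k > hc_20k > 0)  # coercivity falls as sqrt(T) toward T_B → True
```

With these inputs T_B comes out a few tens of kelvin, consistent with the abstract's observation that the T^(1/2) regime holds below roughly 30 K.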
Abstract:
Electrical impedance tomography is a technique to estimate the impedance distribution within a domain, based on measurements on its boundary. In other words, given the mathematical model of the domain, its geometry and boundary conditions, a nonlinear inverse problem of estimating the electric impedance distribution can be solved. Several impedance estimation algorithms have been proposed to solve this problem. In this paper, we present a three-dimensional algorithm, based on the topology optimization method, as an alternative. A sequence of linear programming problems, allowing for constraints, is solved utilizing this method. In each iteration, the finite element method provides the electric potential field within the model of the domain. An electrode model is also proposed (thus, increasing the accuracy of the finite element results). The algorithm is tested using numerically simulated data and also experimental data, and absolute resistivity values are obtained. These results, corresponding to phantoms with two different conductive materials, exhibit relatively well-defined boundaries between them, and show that this is a practical and potentially useful technique to be applied to monitor lung aeration, including the possibility of imaging a pneumothorax.
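The paper's reconstruction solves a sequence of linearized subproblems; the toy sketch below shows that idea on a drastically simplified "domain" of two internal resistances, recovered from two synthetic boundary measurements by Gauss-Newton iteration. The forward model and numbers are invented stand-ins for the finite-element solve.

```python
def forward(r1, r2):
    """Toy forward model: two boundary measurements for two internal
    resistors, one series and one parallel drive pattern (a stand-in
    for the finite-element forward solve in EIT)."""
    return [r1 + r2, r1 * r2 / (r1 + r2)]

def gauss_newton(measured, guess=(1.0, 3.0), iterations=30):
    """Recover (r1, r2) by repeatedly linearizing the forward model,
    mirroring EIT's sequence of linearized subproblems."""
    r1, r2 = guess
    for _ in range(iterations):
        f = forward(r1, r2)
        res = [measured[0] - f[0], measured[1] - f[1]]
        s = r1 + r2
        # Analytic Jacobian of the forward model.
        J = [[1.0, 1.0],
             [(r2 / s) ** 2, (r1 / s) ** 2]]
        det = J[0][0] * J[1][1] - J[0][1] * J[1][0]
        if abs(det) < 1e-14:
            break
        d1 = (res[0] * J[1][1] - res[1] * J[0][1]) / det
        d2 = (J[0][0] * res[1] - J[1][0] * res[0]) / det
        r1, r2 = r1 + d1, r2 + d2
    return r1, r2

meas = forward(2.0, 5.0)  # synthetic "boundary measurements"
est = gauss_newton(meas)
print(abs(est[0] - 2.0) < 1e-6 and abs(est[1] - 5.0) < 1e-6)  # → True
```

The real algorithm replaces this two-parameter update with a linear-programming step over a full 3-D resistivity distribution, subject to constraints, with the finite element method supplying the potential field at each iteration.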
Abstract:
We have designed, built, and tested an early prototype of a novel subxiphoid access system intended to facilitate epicardial electrophysiology, but with possible applications elsewhere in the body. The present version of the system consists of a commercially available insertion needle, a miniature pressure sensor and interconnect tubing, read-out electronics to monitor the pressures measured during the access procedure, and a host computer with user-interface software. The nominal resolution of the system is <0.1 mmHg, and it has deviations from linearity of <1%. During a pilot series of human clinical studies with this system, as well as in an auxiliary study done with an independent method, we observed that the pericardial space contained pressure-frequency components related to both the heart rate and respiratory rate, while the thorax contained components related only to the respiratory rate, a previously unobserved finding that could facilitate access to the pericardial space. We present and discuss the design principles, details of construction, and performance characteristics of this system.
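As an illustrative sketch of the frequency-component observation (synthetic signals with invented rates, not the clinical data): a direct DFT separates a simulated pericardial trace, which carries both cardiac and respiratory components, from a thoracic trace carrying only respiration.

```python
import cmath
import math

def dft_magnitudes(samples):
    """Magnitude spectrum via a direct DFT (O(n^2), fine for a sketch)."""
    n = len(samples)
    return [abs(sum(x * cmath.exp(-2j * math.pi * k * t / n)
                    for t, x in enumerate(samples))) / n
            for k in range(n // 2)]

# Synthetic pressure traces sampled at 10 Hz for 40 s: the "pericardial"
# signal carries cardiac (~1.2 Hz) and respiratory (~0.25 Hz) components,
# the "thoracic" one only respiration (illustrative rates).
fs, n = 10.0, 400
t_axis = [i / fs for i in range(n)]
pericardial = [math.sin(2 * math.pi * 1.2 * t)
               + 0.8 * math.sin(2 * math.pi * 0.25 * t) for t in t_axis]
thoracic = [0.8 * math.sin(2 * math.pi * 0.25 * t) for t in t_axis]

def peak_freqs(samples, threshold=0.2):
    """Frequencies whose spectral magnitude exceeds the threshold."""
    mags = dft_magnitudes(samples)
    return [k * fs / n for k, m in enumerate(mags) if m > threshold]

print(peak_freqs(pericardial))  # → [0.25, 1.2]
print(peak_freqs(thoracic))     # → [0.25]
```

Detecting the appearance of the cardiac component in the pressure signal is, per the abstract, the cue that the needle has reached the pericardial space.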
Abstract:
In this paper, we propose a method based on association rule mining to enhance the diagnosis of medical images (mammograms). It combines low-level features automatically extracted from images and high-level knowledge from specialists to search for patterns. Our method analyzes medical images and automatically generates suggested diagnoses by mining association rules. The suggested diagnoses are used to accelerate the image analysis performed by specialists, as well as to provide them with an alternative opinion to consider. The proposed method uses two new algorithms, PreSAGe and HiCARe. The PreSAGe algorithm combines, in a single step, feature selection and discretization, and reduces the mining complexity. Experiments show that PreSAGe is highly suitable for feature selection and discretization in medical images. HiCARe is a new associative classifier. The HiCARe algorithm has an important property that makes it unique: it assigns multiple keywords per image to suggest a diagnosis with high accuracy. Our method was applied to real datasets, and the results show high sensitivity (up to 95%) and accuracy (up to 92%), allowing us to claim that association rules are a powerful means of assisting in the diagnosis task.
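A minimal sketch of the support/confidence machinery underlying association-rule mining (this is the generic technique, not the PreSAGe or HiCARe algorithms; the keywords and thresholds below are invented):

```python
from itertools import combinations

def association_rules(transactions, min_support=0.4, min_confidence=0.7):
    """Enumerate rules {A} -> {B} over single items, keeping those above
    the support and confidence thresholds (the core measures behind
    association-rule mining)."""
    n = len(transactions)

    def support(itemset):
        return sum(1 for t in transactions if itemset <= t) / n

    items = sorted({i for t in transactions for i in t})
    rules = []
    for a, b in combinations(items, 2):
        for ant, con in ((a, b), (b, a)):
            sup = support({ant, con})
            if sup >= min_support and sup / support({ant}) >= min_confidence:
                rules.append((ant, con, round(sup, 2)))
    return rules

# Toy "images" described by extracted keywords plus a diagnosis label.
data = [
    {"dense_mass", "spiculated", "malignant"},
    {"dense_mass", "spiculated", "malignant"},
    {"dense_mass", "malignant"},
    {"round_mass", "benign"},
    {"round_mass", "benign"},
]
for antecedent, consequent, sup in association_rules(data):
    print(f"{antecedent} -> {consequent} (support={sup})")
```

Rules whose consequent is a diagnosis label (e.g. spiculated -> malignant) are the ones a system like the one described above would surface to the specialist as suggestions.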
Abstract:
Encyclopedia of Nanoscience and Nanotechnology® is the world's first encyclopedia published in the field of nanotechnology. The 10-volume Encyclopedia is an unprecedented single reference source that provides an ideal introduction to, and overview of, the most recent advances and emerging aspects of nanotechnology, spanning science, engineering and medicine. Although there are many books, handbooks and journals focused on nanotechnology, no encyclopedic reference work has previously covered all aspects of nanoscale science and technology: materials synthesis, processing, fabrication, probes, spectroscopy, physical properties, electronics, optics, mechanics, biotechnology, devices, and more. The Encyclopedia fills this gap, providing basic information on all fundamental and applied aspects of nanotechnology by drawing on two decades of pioneering research. It is the only scientific work of its kind since the beginning of the field, bringing together core knowledge and the very latest advances. It is written for audiences at all levels, allowing non-scientists to understand nanotechnology while providing up-to-date information to active scientists and experts in the field. This outstanding encyclopedia is an indispensable source for research professionals, technology investors and developers seeking the most current information on nanotechnology across a wide range of disciplines, from science to engineering to medicine.