977 results for Docker, ARM, Raspberry PI, single board computer, QEMU, Sabayon Linux, Gentoo Linux
Abstract:
Solid-state quantum computer architectures with qubits encoded using single atoms are now feasible given recent advances in the atomic doping of semiconductors. Here we present a charge qubit consisting of two dopant atoms in a semiconductor crystal, one of which is singly ionized. Surface electrodes control the qubit and a radio-frequency single-electron transistor provides fast readout. The calculated single gate times, of order 50 ps or less, are much shorter than the expected decoherence time. We propose universal one- and two-qubit gate operations for this system and discuss prospects for fabrication and scale up.
Abstract:
The phosphosulfomannan 1 (PI-88) is a mixture of highly sulfated oligosaccharides that is currently undergoing clinical evaluation in cancer patients. As well as its anticancer properties, 1 displays a number of other interesting biological activities. A series of analogues of 1 were synthesized with a single (pentasaccharide) backbone to facilitate structural characterization and interpretation of biological results. In a fashion similar to 1, all compounds were able to inhibit heparanase and to bind tightly to the proangiogenic growth factors FGF-1, FGF-2, and VEGF. The compounds also inhibited the infection of cells and cell-to-cell spread of herpes simplex virus (HSV-1). Preliminary pharmacokinetic data indicated that the compounds displayed different pharmacokinetic behavior compared with 1. Of particular note was the n-octyl derivative, which was cleared 3 times less rapidly than 1 and may provide increased systemic exposure.
Abstract:
We revisit the one-unit gradient ICA algorithm derived from the kurtosis function. By carefully studying the properties of the stationary points of the discrete-time one-unit gradient ICA algorithm, convergence can be proved under a suitable condition on the learning rate. This condition helps alleviate the guesswork that accompanies the choice of a learning rate in practical computation. These results may be useful for extracting independent source signals on-line.
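As a concrete illustration of the kind of update the abstract analyses, here is a minimal sketch of a one-unit kurtosis-based gradient ICA iteration on whitened data. The fixed learning rate mu stands in for the step-size condition discussed, and the update is the textbook kurtosis gradient, not necessarily the paper's exact scheme.

```python
import numpy as np

def one_unit_kurtosis_ica(Z, mu=0.1, n_iter=500, seed=0):
    """One-unit gradient ICA on whitened data Z (d x n).

    Maximises |kurtosis(w^T z)| by gradient ascent with a small fixed
    learning rate mu (standing in for the step-size condition the
    abstract refers to). Illustrative sketch only.
    """
    rng = np.random.default_rng(seed)
    d, _ = Z.shape
    w = rng.standard_normal(d)
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        y = w @ Z                                    # current source estimate
        kurt = np.mean(y ** 4) - 3.0                 # kurtosis of the projection
        grad = (Z * y ** 3).mean(axis=1) - 3.0 * w   # kurtosis gradient for ||w|| = 1
        w = w + mu * np.sign(kurt) * grad            # ascend |kurtosis|
        w /= np.linalg.norm(w)                       # project back to the unit sphere
    return w
```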
Abstract:
Background and Purpose. Arm lymphedema following breast cancer surgery is a continuing problem. In this study, we assessed the reliability and validity of circumferential measurements and water displacement for measuring upper-limb volume. Subjects. Participants included subjects who had had breast cancer surgery, including axillary dissection (19 with and 22 without a diagnosis of arm lymphedema), and 25 control subjects. Methods. Two raters measured each subject using circumferential tape measurements, taken at specified distances from the fingertips and in relation to anatomic landmarks, and using water displacement. Interrater reliability was calculated by analysis of variance and multilevel modeling. Volumes from circumferential measurements were compared with those from water displacement by use of means and correlation coefficients. The standard error of measurement, minimum detectable change (MDC), and limits of agreement (LOA) for volumes also were calculated. Results. Arm volumes obtained with these methods had high reliability. Compared with volumes from water displacement, volumes from circumferential measurements had high validity, although these volumes were slightly larger. Expected differences between subjects with and without clinical lymphedema following breast cancer were found. The MDC of volumes, or the error associated with a single measure, was lower for data based on anatomic landmarks than for data based on distance from the fingertips. The mean LOA with water displacement were likewise lower for data based on anatomic landmarks than for data based on distance from the fingertips. Discussion and Conclusion. Volumes calculated from circumferential measurements based on anatomic landmarks are reliable, valid, and more accurate than those based on distance from the fingertips.
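Circumference-to-volume conversion is conventionally done by modelling each limb segment between adjacent girth measurements as a truncated cone (frustum); the sketch below shows that computation. The frustum model and the 4 cm measurement interval are assumptions for illustration, not necessarily the protocol used in the study.

```python
import math

def limb_volume(circumferences, segment_length=4.0):
    """Total limb volume (ml) from girths (cm) taken at fixed intervals.

    Each segment between consecutive circumferences C1 and C2 is treated
    as a truncated cone: V = h * (C1^2 + C1*C2 + C2^2) / (12 * pi).
    """
    total = 0.0
    for c1, c2 in zip(circumferences, circumferences[1:]):
        total += segment_length * (c1 ** 2 + c1 * c2 + c2 ** 2) / (12.0 * math.pi)
    return total

# Hypothetical girths (cm) taken every 4 cm along the forearm
print(round(limb_volume([15.0, 15.8, 16.5, 17.4, 18.2]), 1))  # volume in cm^3 (= ml)
```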
Abstract:
Finding single-pair shortest paths on a surface is a fundamental problem in various domains, such as Geographic Information Systems (GIS) 3D applications, robotic path planning systems, and surface nearest neighbor queries in spatial databases. Currently, to solve the problem, existing algorithms must traverse the entire polyhedral surface. With the rapid advances in areas like Global Positioning System (GPS), Computer Aided Design (CAD) systems and laser range scanners, surface models are becoming more and more complex. It is not uncommon for a surface model to contain millions of polygons, and the single-pair shortest path problem is getting harder and harder to solve. Based on the observation that the single-pair shortest path is local in nature, we propose in this paper efficient methods that exclude part of the surface model from the search process. Three novel expansion-based algorithms are proposed, namely, the Naive Algorithm, the Rectangle-based Algorithm and the Ellipse-based Algorithm. Each algorithm uses a two-step approach to find the shortest path: (1) compute an initial local path; (2) use the length of this initial path to select a search region in which the global shortest path must lie. The search process terminates once the global optimum criteria are satisfied. By reducing the search region, the performance is improved dramatically in most cases.
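The ellipse-based pruning rests on a simple bound: if an initial local path from s to t has length L, every point p on the global shortest path satisfies d(s, p) + d(p, t) <= L, and since Euclidean distance lower-bounds geodesic distance, p must lie inside the ellipse with foci s and t and major-axis length L. Below is a minimal sketch of that filtering step, under an assumed mesh layout; it is not the paper's implementation.

```python
import numpy as np

def ellipse_filter(vertices, faces, s, t, L):
    """Discard mesh faces that cannot contain the global shortest path.

    vertices: (n, 3) array of 3D points; faces: (m, 3) integer index array.
    A face is kept if any of its vertices lies inside the ellipse
    {p : |p - s| + |p - t| <= L}. A conservative variant would also
    keep faces whose edges merely cross the ellipse boundary.
    """
    focal_sum = (np.linalg.norm(vertices - s, axis=1) +
                 np.linalg.norm(vertices - t, axis=1))
    inside = focal_sum <= L               # vertex lies within the ellipse
    keep = inside[faces].any(axis=1)      # face touches the search region
    return faces[keep]
```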
Abstract:
In this paper, a channel emulator for assessing the performance of a MIMO testbed implemented in field-programmable gate array (FPGA) technology is described. The FPGA-based MIMO system includes a signal generator, modulation/demodulation and space-time coding/decoding modules. The emulator uses information about a wireless channel from computer simulations or actual measurements. In the simulations, a single-bounce scattering model for an indoor environment is applied. The generated data is stored on the FPGA board. The tests are performed for a 2×2 MIMO system that uses the Alamouti scheme for space-time coding/decoding. The performed tests show proper operation of the FPGA-implemented MIMO testbed, and good agreement between the results using measured and simulated channel data is obtained.
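For reference, the Alamouti scheme mentioned here transmits two symbols over two antennas and two symbol periods so that a simple linear combiner recovers them with full diversity. The NumPy sketch below shows the 2x1 noise-free case to make the structure explicit; it illustrates the standard scheme, not the paper's FPGA implementation.

```python
import numpy as np

def alamouti_encode(s1, s2):
    """Rows are time slots, columns are transmit antennas."""
    return np.array([[s1,           s2],
                     [-np.conj(s2), np.conj(s1)]])

def alamouti_decode(r1, r2, h1, h2):
    """Linear combining for flat-fading channel gains h1, h2."""
    s1_hat = np.conj(h1) * r1 + h2 * np.conj(r2)
    s2_hat = np.conj(h2) * r1 - h1 * np.conj(r2)
    gain = abs(h1) ** 2 + abs(h2) ** 2   # diversity combining gain
    return s1_hat / gain, s2_hat / gain

# Quick noise-free check with QPSK symbols and a random channel
rng = np.random.default_rng(1)
h1, h2 = rng.standard_normal(2) + 1j * rng.standard_normal(2)
s1, s2 = (1 + 1j) / np.sqrt(2), (1 - 1j) / np.sqrt(2)
X = alamouti_encode(s1, s2)
r1 = h1 * X[0, 0] + h2 * X[0, 1]         # received in slot 1
r2 = h1 * X[1, 0] + h2 * X[1, 1]         # received in slot 2
print(alamouti_decode(r1, r2, h1, h2))   # recovers (s1, s2)
```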
Abstract:
Cellular mobile radio systems will be of increasing importance in the future. This thesis describes research work concerned with the teletraffic capacity and the computer control requirements of such systems. The work involves theoretical analysis and experimental investigations using digital computer simulation. New formulas are derived for the congestion in single-cell systems in which there are both land-to-mobile and mobile-to-mobile calls and in which mobile-to-mobile calls go via the base station. Two approaches are used: the first yields modified forms of the familiar Erlang and Engset formulas, while the second gives more complicated but more accurate formulas. The results of computer simulations to establish the accuracy of the formulas are described. New teletraffic formulas are also derived for the congestion in multi-cell systems. Fixed, dynamic and hybrid channel assignments are considered. The formulas agree with previously published simulation results. Simulation programs are described for the evaluation of the speech traffic of mobiles and for the investigation of a possible computer network for the control of the speech traffic. The programs were developed according to the structured programming approach, leading to programs of modular construction. Two simulation methods are used for the speech traffic: the roulette method and the time-true method. The first is economical but has some restrictions, while the second is expensive but gives comprehensive answers. The proposed control network operates at three hierarchical levels performing various control functions, which include the setting-up and clearing-down of calls, the hand-over of calls between cells and the address-changing of mobiles travelling between cities. The results demonstrate the feasibility of the control network and indicate that small mini-computers inter-connected via voice-grade data channels would be capable of providing satisfactory control.
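For context, the classical Erlang B formula that the thesis's single-cell results modify gives the blocking probability for a erlangs of traffic offered to n channels, and is most safely evaluated with the standard recurrence shown below. This is the baseline formula only; the modified land-to-mobile/mobile-to-mobile forms are not reproduced here.

```python
def erlang_b(offered_traffic, channels):
    """Erlang B blocking probability via the numerically stable recurrence.

    B(0) = 1;  B(n) = a * B(n-1) / (n + a * B(n-1)),
    where a is the offered traffic in erlangs.
    """
    b = 1.0
    for n in range(1, channels + 1):
        b = offered_traffic * b / (n + offered_traffic * b)
    return b

print(f"{erlang_b(5.0, 10):.4%}")  # blocking for 5 E offered to 10 channels
```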
Abstract:
In Information Filtering (IF), a user may be interested in several topics in parallel. IF systems, however, have been built on representational models derived from Information Retrieval and Text Categorization, which assume independence between terms. The linearity of these models results in user profiles that can only represent one topic of interest. We present a methodology that takes term dependencies into account to construct a single profile representation for multiple topics, in the form of a hierarchical term network. We also introduce a series of non-linear functions for evaluating documents against the profile. Initial experiments produced positive results.
Abstract:
This preliminary report describes work carried out as part of work package 1.2 of the MUCM research project. The report is split in two parts: the first part (Sections 1 and 2) summarises the state of the art in emulation of computer models, while the second presents some initial work on the emulation of dynamic models. In the first part, we describe the basics of emulation, introduce the notation and put together the key results for the emulation of models with single and multiple outputs, with or without the use of a mean function. In the second part, we present preliminary results on the chaotic Lorenz 63 model. We look at emulation of a single time step, and repeated application of the emulator for sequential prediction. After some design considerations, the emulator is compared with the exact simulator on a number of runs to assess its performance. Several general issues related to emulating dynamic models are raised and discussed. Current work on the larger Lorenz 96 model (40 variables) is presented in the context of dimension reduction, with results to be provided in a follow-up report. The notation used in this report is summarised in the appendix.
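To make the single-time-step emulation idea concrete, the sketch below trains a Gaussian process emulator of one Lorenz 63 step and then applies it repeatedly for sequential prediction. The design, kernel and step length are illustrative assumptions using an off-the-shelf GP regressor, not the report's emulator.

```python
import numpy as np
from scipy.integrate import solve_ivp
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def lorenz63_step(x, dt=0.05, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz 63 simulator by one time step of length dt."""
    f = lambda t, s: [sigma * (s[1] - s[0]),
                      s[0] * (rho - s[2]) - s[1],
                      s[0] * s[1] - beta * s[2]]
    return solve_ivp(f, (0.0, dt), x, rtol=1e-8).y[:, -1]

# Train one GP per output dimension on a small random design
rng = np.random.default_rng(0)
X_train = rng.uniform([-20, -25, 0], [20, 25, 50], size=(200, 3))
Y_train = np.array([lorenz63_step(x) for x in X_train])
gps = [GaussianProcessRegressor(ConstantKernel() * RBF([5.0] * 3),
                                normalize_y=True).fit(X_train, Y_train[:, i])
       for i in range(3)]

# Repeated application of the one-step emulator for sequential prediction
x = np.array([1.0, 1.0, 20.0])
for _ in range(10):
    x = np.array([gp.predict(x.reshape(1, -1))[0] for gp in gps])
print(x)  # emulated state after 10 steps (posterior mean only)
```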
Abstract:
In the last few years, there has been considerable interest in using saturated magnetic objective lenses in high-resolution electron microscopes. Such lenses, in present commercial electron microscopes, are energized either by conventional or superconducting coils. Very little work, however, has been reported on the use of conventional coils in saturated magnetic electron lenses. The present investigation has been concerned with the design of high-flux-density saturated objective lenses of both single and double polepiece types which may be energized by conventional coils and in some cases by superconducting coils. Such coils have the advantage of being small and capable of carrying high current densities. The present work has been carried out with the aid of several computer programs based on the finite element method. The effect of the shape and position of the energizing coil on the electron optical parameters has been investigated. Electron optical properties such as chromatic and spherical aberration have been studied in detail for saturated single and double polepiece lenses. Several high-flux-density coils of different shapes have been investigated. The choice of the most favourable coil shape and position, subject to the operational requirements, has been studied in some detail. The focal properties of such optimised lenses have been computed and compared.
Abstract:
Computer-Based Learning systems of one sort or another have been in existence for almost 20 years, but they have yet to achieve real credibility within Commerce, Industry or Education. A variety of reasons could be postulated for this, typically: cost, complexity, inefficiency, inflexibility and tedium. Obviously different systems deserve different levels and types of criticism, but it still remains true that Computer-Based Learning (CBL) is falling significantly short of its potential. Experience of a small but highly successful CBL system within a large, geographically distributed industry (the National Coal Board) prompted an investigation into currently available packages, the original intention being to purchase the most suitable software and run it on existing computer hardware, alongside existing software systems. It became apparent that none of the available CBL packages were suitable, and a decision was taken to develop an in-house Computer-Assisted Instruction system according to the following criteria: cheap to run; easy to author course material for; easy to use; requiring no computing knowledge to use (as either an author or a student); efficient in the use of computer resources; and offering a comprehensive range of facilities at all levels. This thesis describes the initial investigation, the resultant observations and the design, development and implementation of the SCHOOL system. One of the principal characteristics of SCHOOL is that it uses a hierarchical database structure for the storage of course material, thereby inherently providing a great deal of the power, flexibility and efficiency originally required. Trials using the SCHOOL system on IBM 303X series equipment are also detailed, along with proposed and current development work on what is essentially an operational CBL system within a large-scale industrial environment.
Abstract:
Computer-integrated monitoring is a very large area in engineering where on-line, real-time data acquisition with the aid of sensors is the solution to many problems in the manufacturing industry, as opposed to the old method of data logging followed by graphical analysis. The raw data collected this way, however, is useless in the absence of a proper computerized management system. The transfer of data between the management and the shop-floor processes has been impossible in the past unless all the computers in the system were totally compatible with each other. This limits the efficiency of such systems because they are governed by the limitations of the computers. General Motors of the U.S.A. has recently started research on a new standard called the Manufacturing Automation Protocol (MAP), which is expected to allow data transfer between different types of computers. MAP is still in the early stages of development and is also currently very expensive. This research programme shows how such a shop-floor data acquisition system and a complete management system on entirely different computers can be integrated into a single system, achieving data transfer communications using a cheaper but superior alternative to MAP. Standard communication character sets and hardware such as ASCII and UARTs have been used in this method, but the technique is so powerful that totally incompatible computers are shown to run different programs (in different languages) simultaneously and yet receive data from each other and process it in their own CPUs with no human intervention.
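As an illustration of the ASCII-over-UART idea, here is a minimal pyserial sketch that frames a shop-floor reading with the ASCII STX/ETX control characters. The port name, baud rate and record layout are hypothetical, and the thesis's actual protocol will differ.

```python
import serial  # pyserial

STX, ETX = b'\x02', b'\x03'  # ASCII start-of-text / end-of-text framing

def send_record(port, fields):
    """Frame a reading as STX <comma-separated ASCII payload> ETX."""
    payload = ','.join(str(f) for f in fields).encode('ascii')
    port.write(STX + payload + ETX)

def read_record(port):
    """Read one framed record; returns its fields, or None on timeout."""
    b = port.read(1)
    while b and b != STX:        # discard line noise until a frame starts
        b = port.read(1)
    if not b:
        return None              # timed out waiting for STX
    raw = port.read_until(ETX)
    if not raw.endswith(ETX):
        return None              # timed out mid-frame
    return raw[:-1].decode('ascii').split(',')

# Hypothetical usage: any machine with a UART and an ASCII character set
link = serial.Serial('/dev/ttyS0', 9600, timeout=5)
send_record(link, ['FURNACE_3', 1450])  # e.g. a furnace id and a temperature
```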
Abstract:
The present study describes a pragmatic approach to the implementation of production planning and scheduling techniques in foundries of all types and looks at the use of `state-of-the-art' management control and information systems. Following a review of systems for the classification of manufacturing companies, a definitive statement is made which highlights the important differences between foundries (i.e. `component makers') and other manufacturing companies (i.e. `component buyers'). An investigation of the manual procedures which are used to plan and control the manufacture of components reveals the inherent problems facing foundry production management staff and suggests the unsuitability of many manufacturing techniques which have been applied to general engineering companies. From the literature it was discovered that computer-assisted systems are required which are primarily `information-based' rather than `decision-based', whilst the availability of low-cost computers and `packaged software' has enabled foundries to `get their feet wet' without the financial penalties which characterized many of the early attempts at computer assistance (i.e. pre-1980). Moreover, no evidence of a single methodology for foundry scheduling emerged from the review. A philosophy for the development of a CAPM system is presented, which details the essential information requirements and puts forward proposals for the subsequent interactions between types of information and the sub-systems of CAPM which they support. The work was oriented specifically towards the functions of production planning and scheduling and introduces the concept of `manual interaction' for effective scheduling. The techniques developed were designed to use the information which is readily available in foundries and were found to be practically successful following their implementation in a wide variety of foundries. The limitations of the techniques developed are subsequently discussed within the wider issues which form a CAPM system, prior to a presentation of the conclusions which can be drawn from the study.