920 results for "Inventory system with finite backlog"
Abstract:
When can a quantum system of finite dimension be used to simulate another quantum system of finite dimension? What restricts the capacity of one system to simulate another? In this paper we complete the program of studying what simulations can be done with entangling many-qudit Hamiltonians and local unitary control. By entangling we mean that every qudit is coupled to every other qudit, at least indirectly. We demonstrate that the only class of finite-dimensional entangling Hamiltonians that are not universal for simulation is the class of entangling Hamiltonians on qubits whose Pauli operator expansion contains only terms coupling an odd number of systems, as identified by Bremner [Phys. Rev. A 69, 012313 (2004)]. We show that in all other cases entangling many-qudit Hamiltonians are universal for simulation.
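For illustration (the notation here is ours, not taken from the paper), a three-qubit Hamiltonian lying in the exceptional class identified by Bremner et al. would be

```latex
H \;=\; X \otimes X \otimes X \;+\; Z \otimes Y \otimes X \;+\; X \otimes I \otimes I ,
```

since every term in its Pauli operator expansion acts nontrivially on an odd number of qubits (three or one). An entangling Hamiltonian whose expansion contains at least one even-weight coupling term falls within the universal class established in the paper.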
Abstract:
This letter presents an analytical model for evaluating the Bit Error Rate (BER) of a Direct Sequence Code Division Multiple Access (DS-CDMA) system with M-ary orthogonal modulation and noncoherent detection, employing an array antenna operating in a Nakagami fading environment. An expression for the Signal-to-Interference-plus-Noise Ratio (SINR) at the output of the receiver is derived, which allows the BER to be evaluated using a closed-form expression. The analytical model is validated by comparing the obtained results with simulation results.
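As a point of reference (not the letter's fading/multiuser model itself), the classical closed-form symbol-error expression for noncoherent M-ary orthogonal signalling over an AWGN channel, which SINR-based models of this kind typically extend, can be evaluated directly:

```python
from math import comb, exp

def noncoherent_mary_ser(gamma_s, M):
    """Symbol-error probability for noncoherent detection of M-ary
    orthogonal signals in AWGN, with per-symbol SNR gamma_s (linear scale).
    Classical alternating-sum form from standard digital-communications texts."""
    return sum((-1) ** (k + 1) * comb(M - 1, k) / (k + 1)
               * exp(-k * gamma_s / (k + 1))
               for k in range(1, M))

# For M = 2 this reduces to the familiar noncoherent BFSK result 0.5*exp(-gamma/2).
ser_bfsk = noncoherent_mary_ser(4.0, 2)
```

In an SINR-based analysis, the AWGN SNR argument is replaced by the derived output SINR (and averaged over the fading distribution), which is where the closed-form BER of the letter would come in.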
Abstract:
We show how to efficiently simulate a quantum many-body system with tree structure when its entanglement (Schmidt number) is small for any bipartite split along an edge of the tree. As an application, we show that any one-way quantum computation on a tree graph can be efficiently simulated with a classical computer.
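The entanglement measure in question, the Schmidt number across a bipartite split, can be read off from the singular values of the reshaped state vector. A minimal illustration (this is the basic diagnostic, not the paper's tree-network simulation algorithm):

```python
import numpy as np

def schmidt_coefficients(psi, dim_a, dim_b):
    """Schmidt coefficients of a pure state |psi> on H_A (x) H_B,
    obtained from the SVD of the amplitude matrix."""
    return np.linalg.svd(psi.reshape(dim_a, dim_b), compute_uv=False)

# Bell state (|00> + |11>)/sqrt(2): Schmidt rank 2, coefficients 1/sqrt(2) each.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
coeffs = schmidt_coefficients(bell, 2, 2)
rank = int(np.sum(coeffs > 1e-12))  # the Schmidt number
```

A small Schmidt number across every edge of the tree is exactly the condition under which the classical simulation described above stays efficient.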
Abstract:
Foreign exchange trading has emerged in recent times as a significant activity in many countries. As with most forms of trading, the activity is influenced by many random parameters so that the creation of a system that effectively emulates the trading process is very helpful. In this paper, we try to create such a system with a genetic algorithm engine to emulate trader behaviour on the foreign exchange market and to find the most profitable trading strategy.
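The genetic-algorithm engine described above can be sketched in miniature. Everything below is illustrative: the "strategy" is a single threshold parameter and the fitness is a made-up stand-in for trading profit (the paper's actual strategy encoding and fitness are not reproduced):

```python
import random

random.seed(0)

def fitness(threshold):
    # Toy profit surface peaking at 0.6; a real engine would backtest the
    # strategy against historical exchange-rate data instead.
    return -(threshold - 0.6) ** 2

population = [random.random() for _ in range(20)]
for _ in range(40):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]                                  # selection
    children = [
        (random.choice(parents) + random.choice(parents)) / 2  # crossover
        + random.gauss(0, 0.02)                                # mutation
        for _ in range(10)
    ]
    population = parents + children

best = max(population, key=fitness)  # evolved strategy, near the optimum 0.6
```

The selection/crossover/mutation loop is the generic part; the domain knowledge lives entirely in the strategy encoding and the fitness evaluation.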
Abstract:
The development of models in Earth Sciences, e.g. for earthquake prediction and for the simulation of mantle convection, is far from finalized. There is therefore a need for a modelling environment that allows scientists to implement and test new models in an easy but flexible way. Once verified, the models should be easy to apply within their scope, typically by setting input parameters through a GUI or web services. It should be possible to link certain parameters to external data sources, such as databases and other simulation codes. Moreover, as typically large-scale meshes have to be used to achieve appropriate resolutions, the computational efficiency of the underlying numerical methods is important. Conceptually, this leads to a software system with three major layers: the application layer, the mathematical layer, and the numerical algorithm layer. The latter is implemented as a C/C++ library to solve a basic, computationally intensive linear problem, such as a linear partial differential equation. The mathematical layer allows the model developer to define his model and to implement high-level solution algorithms (e.g. the Newton-Raphson scheme or the Crank-Nicolson scheme), or to choose these algorithms from an algorithm library. The kernels of the model are generic, typically linear, solvers provided through the numerical algorithm layer. Finally, to provide an easy-to-use application environment, a web interface is (semi-automatically) built to edit the XML input file for the modelling code. In the talk, we will discuss the advantages and disadvantages of this concept in more detail. We will also present the modelling environment escript, a prototype implementation of such a software system in Python (see www.python.org). Key components of escript are the Data class and the PDE class.
Objects of the Data class allow generating, holding, accessing, and manipulating data in such a way that the representation best suited to the particular context is transparent to the user. They are also the key to establishing connections with external data sources. PDE class objects describe (linear) partial differential equations to be solved by a numerical library. The current implementation of escript has been linked to the finite element code Finley to solve general linear partial differential equations. We will give a few simple examples illustrating the usage of escript. Moreover, we show the usage of escript together with Finley for the modelling of interacting fault systems and for the simulation of mantle convection.
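The three-layer separation described above can be sketched in plain Python. All names here are illustrative stand-ins, not the actual escript API; the point is only how a high-level algorithm in the mathematical layer delegates to a generic linear kernel from the numerical layer, while the application layer merely states the model:

```python
# Numerical algorithm layer: a generic linear solver kernel.
# (In escript this role is played by a C/C++ library such as Finley.)
def solve_linear(a, b):
    """Solve the scalar linear problem a*x = b."""
    return b / a

# Mathematical layer: a high-level solution algorithm built on the kernel.
def newton_raphson(f, df, x0, tol=1e-10, max_iter=50):
    """Newton-Raphson iteration; each step solves one linear problem."""
    x = x0
    for _ in range(max_iter):
        step = solve_linear(df(x), -f(x))
        x += step
        if abs(step) < tol:
            break
    return x

# Application layer: the model developer only states the model, here x^2 = 2.
root = newton_raphson(lambda x: x * x - 2.0, lambda x: 2.0 * x, 1.0)
```

In the real system the scalar kernel becomes a large sparse PDE solve and the model is expressed through Data and PDE objects, but the division of responsibility is the same.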
Abstract:
Oligo(ethylene glycol) (OEG) thiol self-assembled monolayer (SAM) decorated gold nanoparticles (AuNPs) have potential applications in bionanotechnology due to their unique property of preventing the nonspecific adsorption of protein on the colloidal surface. For colloid-protein mixtures, a previous study (Zhang et al. J. Phys. Chem. A 2007, 111, 12229) has shown that the OEG SAM-coated AuNPs become unstable upon addition of proteins (BSA) above a critical concentration, c*. This has been explained as a depletion effect in the two-component system. Adding salt (NaCl) can reduce the value of c*; that is, reduce the stability of the mixture. In the present work, we study the influence of the nature of the added salt on the stability of this two-component colloid-protein system. It is shown that the addition of various salts does not change the stability of either protein or colloid in solution under the experimental conditions of this work, except that sodium sulfate can destabilize the colloidal solutions. In the binary mixtures, however, the stability of colloid-protein mixtures shows significant dependence on the nature of the salt: chaotropic salts (NaSCN, NaClO4, NaNO3, MgCl2) stabilize the system with increasing salt concentration, while kosmotropic salts (NaCl, Na2SO4, NH4Cl) lead to the aggregation of colloids with increasing salt concentration. These observations indicate that the Hofmeister effect can be enhanced in two-component systems; that is, the modification of the colloidal interface by ions significantly changes the effective depletion interaction mediated by the proteins. Real-time SAXS measurements confirm in all cases that the aggregates are in an amorphous state.
Abstract:
In order to survive in the increasingly customer-oriented marketplace, continuous quality improvement marks the fastest-growing quality organization's success. In recent years, attention has been focused on intelligent systems, which have shown great promise in supporting quality control. However, only a small number of the currently used systems are reported to be operating effectively, because they are designed to maintain a quality level within the specified process rather than to focus on cooperation within the production workflow. This paper proposes an intelligent system with a newly designed algorithm and the universal process data exchange standard to overcome the challenges of demanding customers who seek high-quality and low-cost products. The intelligent quality management system is equipped with a "distributed process mining" feature to provide all levels of employees with the ability to understand the relationships between processes, especially when any aspect of the process is going to degrade or fail. An example of generalized fuzzy association rules is applied in the manufacturing sector to demonstrate how the proposed iterative process mining algorithm finds the relationships between distributed process parameters and the presence of quality problems.
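For orientation, the support and confidence metrics underlying association-rule mining can be computed in a few lines. This is the crisp (non-fuzzy) baseline with made-up process data; the generalized fuzzy rules used in the paper replace set membership with membership degrees:

```python
# Hypothetical process records: each transaction is the set of conditions
# observed in one production run (names are illustrative only).
transactions = [
    {"temp_high", "pressure_low", "defect"},
    {"temp_high", "defect"},
    {"temp_high", "pressure_low"},
    {"pressure_low"},
]

def support(itemset):
    """Fraction of transactions containing every item in the itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent):
    """Conditional frequency of the consequent given the antecedent."""
    return support(antecedent | consequent) / support(antecedent)

# Rule {temp_high} -> {defect}: support 0.5, confidence 2/3.
conf_rule = confidence({"temp_high"}, {"defect"})
```

Mining then amounts to searching for rules whose support and confidence exceed chosen thresholds, linking process parameters to the presence of quality problems.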
Abstract:
Results of full numerical simulations of a guiding-centre soliton system with randomly birefringent single-mode fibre (SMF) are shown and analysed. It emerges that the soliton system becomes unstable even for small amounts of polarization-mode dispersion (PMD).
Abstract:
A graphical process control language has been developed as a means of defining process control software. The user configures a block diagram describing the required control system, from a menu of functional blocks, using a graphics software system with a graphics terminal. Additions may be made to the menu of functional blocks, to extend the system capability, and a group of blocks may be defined as a composite block. This latter feature provides for segmentation of the overall system diagram and the repeated use of the same group of blocks within the system. The completed diagram is analyzed by a graphics compiler which generates the programs and data structure to realise the run-time software. The run-time software has been designed as a data-driven system which allows for modifications at the run-time level in both parameters and system configuration. Data structures have been specified to ensure efficient execution and minimal storage requirements in the final control software. Machine independence has been accommodated as far as possible using CORAL 66 as the high level language throughout the entire system; the final run-time code being generated by a CORAL 66 compiler appropriate to the target processor.
Abstract:
This work concerns the development of a proton induced X-ray emission (PIXE) analysis system and a multi-sample scattering chamber facility. The characteristics of the beam pulsing system and its counting rate capabilities were evaluated by observing the ion-induced X-ray emission from pure thick copper targets, with and without beam pulsing operation. The characteristic X-rays were detected with a high resolution Si(Li) detector coupled to a multi-channel analyser. The removal of the pile-up continuum by the use of the on-demand beam pulsing is clearly demonstrated in this work. This new on-demand pulsing system, with its counting rate capability of 25, 18 and 10 kPPS corresponding to main amplifier time constants of 2, 4 and 8 µsec respectively, enables thick targets to be analysed more readily. Reproducibility tests of the on-demand beam pulsing system operation were carried out by repeated measurements of the system throughput curves, with and without beam pulsing. The reproducibility of the analysis performed using this system was also checked by repeated measurements of the intensity ratios from a number of standard binary alloys during the experimental work. A computer programme has been developed to calculate the X-ray yields from thick targets bombarded by protons, taking into account the secondary X-ray yield production due to characteristic X-ray fluorescence from an element whose characteristic X-ray energy lies above the absorption edge energy of the other element present in the target. This effect was studied on metallic binary alloys such as Fe/Ni and Cr/Fe. The quantitative analysis of Fe/Ni and Cr/Fe alloy samples to determine their elemental composition, taking this enhancement into account, has been demonstrated in this work. Furthermore, the usefulness of the Rutherford backscattering (RBS) technique for obtaining the depth profiles of the elements in the upper micron of the sample is discussed.
Abstract:
This thesis describes work on the development of a novel digit actuator system with tactile perception feedback to a user, demonstrated as a master-slave system. For the tactile surface of the digit, contrasting sensor elements of resistive strain gauges and optical fibre Bragg grating sensors were evaluated. A distributive tactile sensing system consisting of optimised neural networking schemes was developed, resulting in a taxonomy of artificial touch. The device is suitable for use in minimally invasive surgical (MIS) procedures as a steerable tip, and a digit constructed wholly from polymers makes it suitable for use in Magnetic Resonance Imaging (MRI) environments, enabling active monitoring of the patient during a procedure. To provide a realistic template for the work, the research responded to the needs of two contrasting procedures: palpation of the prostate and endotracheal intubation in anaesthesia, where the application of touch sense can significantly assist navigation. The performance of the approach was demonstrated in phantom trials with an experimental digit constructed for use in the laboratory. The phantom unit was developed to resemble facets of the clinical applications, and the digit system is able to evaluate reactive force distributions acting over the surface of the digit as well as different descriptions of contact and motion relative to the surface of the lumen. Complete control of the digit is via an instrumented glove, such that the digit actuates in sympathy with finger gestures, and tactile information feedback is achieved by a combination of tactile and visual means.
Abstract:
Modern injection-moulding machinery which produces several pairs of plastic footwear at a time brought increased production planning problems to a factory. The demand for its footwear is seasonal, but the company's manning policy keeps a fairly constant production level, thus determining the aggregate stock. Production planning must therefore be done within the limitations of a specified total stock. The thesis proposes a new production planning system with four sub-systems. These are sales forecasting, resource planning, and two levels of production scheduling: (a) aggregate decisions concerning the 'manufacturing group' (group of products) to be produced in each machine each week, and (b) detailed decisions concerning the products within a manufacturing group to be scheduled into each mould-place. The detailed scheduling is least dependent on improvements elsewhere, so the sub-systems were tackled in reverse order. The thesis concentrates on the production scheduling sub-systems, which will provide most of the benefits. The aggregate scheduling solution depends principally on the aggregate stocks of each manufacturing group and their division into 'safety stocks' (to prevent shortages) and 'freestocks' (to permit batch production). The problem is too complex for exact solution, but a good heuristic solution, which has yet to be implemented, is provided by graphically minimising immediate plus expected future costs. The detailed problem splits into determining the optimal safety stocks and batch quantities given the appropriate aggregate stocks. It is found that the optimal safety stocks are proportional to the demand. The ideal batch quantities are based on a modified formula for the Economic Batch Quantity, and the product schedule is created week by week using a priority system which schedules to minimise expected future costs. This algorithm performs almost optimally.
The detailed scheduling solution was implemented and achieved the target savings for the whole project in favourable circumstances. Future plans include full implementation.
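For reference, the classical Economic Batch Quantity that the thesis's modified formula presumably starts from takes the familiar square-root form

```latex
Q^{*} \;=\; \sqrt{\frac{2\,D\,S}{H}}
```

where $D$ is the demand rate, $S$ the setup cost per batch, and $H$ the holding cost per unit per period. The thesis's modification (to respect the fixed aggregate stock and mould-place constraints) is not reproduced here.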
Abstract:
This paper introduces a revolutionary way to interrogate optical fiber sensors based on fiber Bragg gratings (FBGs) and to integrate the necessary driving optoelectronic components with the sensor elements. Low-cost optoelectronic chips are used to interrogate the optical fibers, creating a portable dynamic sensing system as an alternative to the traditionally bulky and expensive fiber sensor interrogation units. The possibility to embed these laser and detector chips is demonstrated, resulting in an ultra-thin flexible optoelectronic package of only 40 µm, provided with an integrated planar fiber pigtail. The result is a fully embedded flexible sensing system with a thickness of only 1 mm, based on a single Vertical-Cavity Surface-Emitting Laser (VCSEL), fiber sensor and photodetector chip. Temperature, strain and electrodynamic shaking tests have been performed on our system, going beyond static read-out measurements to dynamically reconstruct full spectral information datasets.
Abstract:
Applying direct error counting, we compare the accuracy and evaluate the validity of different available numerical approaches to the estimation of the bit-error rate (BER) in 40-Gb/s return-to-zero differential phase-shift-keying transmission. As a particular example, we consider a system with in-line semiconductor optical amplifiers. We demonstrate that none of the existing models has an absolute superiority over the others. We also reveal the impact of the duty cycle on the accuracy of the BER estimates through the differently introduced Q-factors.