Abstract:
This paper details the implementation and operational performance of a minimum-power 2.45-GHz pulse receiver and a companion on-off keyed transmitter for use in a semi-active duplex RF biomedical transponder. A 50-Ohm microstrip stub-matched zero-bias diode detector forms the heart of a body-worn receiver that has a CMOS baseband amplifier consuming 20 microamps from +3 V and achieves a tangential sensitivity of -53 dBm. The base transmitter generates 0.5 W of peak RF output power into 50 Ohms. Both linear and right-hand circularly polarized Tx-Rx antenna sets were employed in system reliability trials carried out in a hospital Coronary Care Unit. For transmitting antenna heights between 0.3 and 2.2 m above floor level, transponder interrogations were 95% reliable within the 67 m² area of the ward, falling to an average of 46% in the surrounding rooms and corridors. Overall, the circularly polarized antenna set gave the higher reliability and the lower propagation power decay index.
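As an aside on the propagation power decay index reported above: it is commonly taken to be the exponent n in the log-distance path-loss model and can be estimated from received-power measurements by linear regression. The sketch below is purely illustrative, with hypothetical measurements and reference distance; it is not the paper's procedure or data.

```python
# Illustrative sketch (not from the paper): estimating a propagation power
# decay index n from received-power measurements, assuming the usual
# log-distance model  P_rx(d) [dBm] = P_rx(d0) - 10*n*log10(d/d0).
import numpy as np

def decay_index(distances_m, rx_power_dbm, d0=1.0):
    """Least-squares fit of the path-loss exponent n."""
    x = -10.0 * np.log10(np.asarray(distances_m) / d0)   # regressor
    y = np.asarray(rx_power_dbm)
    # Fit y = p0 + n*x  ->  the slope is the decay index n.
    n, p0 = np.polyfit(x, y, 1)
    return n, p0

# Hypothetical ward measurements (distance in metres, power in dBm).
d = [1, 2, 4, 8, 12]
p = [-30, -36, -43, -50, -54]
n, p0 = decay_index(d, p)
print(f"estimated decay index n = {n:.2f}, reference power = {p0:.1f} dBm")
```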
Abstract:
What-if simulations have been identified as one solution for business-performance-related decision support. Such support is especially useful when it can be generated automatically within Business Process Management (BPM) environments from the existing business process models and from performance parameters monitored on the executed business process instances. Currently, some of the available BPM environments offer basic performance prediction capabilities. However, these functionalities are normally too limited to be generally useful for performance-related decision support at the business process level. In this paper, an approach is presented which allows the non-intrusive integration of sophisticated tooling for what-if simulations, analytic performance prediction tools, process optimizations, or a combination of such solutions into existing BPM environments. The approach abstracts from the process modelling techniques, which enables automatic decision support for processes spanning numerous BPM environments. For instance, this enables end-to-end decision support for composite processes modelled with the Business Process Modelling Notation (BPMN) on top of existing Enterprise Resource Planning (ERP) processes modelled with proprietary languages.
Abstract:
This paper describes the application of multivariate regression techniques to the Tennessee Eastman benchmark process for modelling and fault detection. Two methods are applied: linear partial least squares (PLS), and a nonlinear variant of this procedure using a radial basis function (RBF) inner relation. The performance of the RBF networks is enhanced through the use of a recently developed training algorithm which uses quasi-Newton optimization to ensure an efficient and parsimonious network; details of this algorithm are given in the paper. The PLS and PLS/RBF methods are then used to create on-line inferential models of delayed process measurements. As these measurements relate to the final product composition, these models suggest that on-line statistical quality control analysis should be possible for this plant. The generation of 'soft sensors' for these measurements has the further effect of introducing a redundant element into the system, redundancy which can then be used to generate a fault detection and isolation scheme for these sensors. This is achieved by arranging the sensors and models in a manner comparable to the dedicated estimator scheme of Clarke et al. (1975, IEEE Trans. Aerosp. Electron. Syst., AES-14, 465-473). The effectiveness of this scheme is demonstrated on a series of simulated sensor and process faults, with full detection and isolation shown to be possible for sensor malfunctions, and detection feasible in the case of process faults. Suggestions for enhancing the diagnostic capacity in the latter case are covered towards the end of the paper.
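The soft-sensor idea can be illustrated with a short sketch: a PLS model infers a delayed measurement from fast process variables, and a residual test flags the corresponding sensor when the measured value disagrees with the inference. This is a minimal illustration with synthetic data and an arbitrary 3-sigma threshold, not the authors' Tennessee Eastman implementation or the dedicated estimator scheme itself.

```python
# A minimal sketch (not the authors' code) of a PLS-based inferential
# "soft sensor" with a simple residual test, in the spirit of using model
# redundancy for sensor fault detection. Data, variable names and the
# 3-sigma threshold are illustrative assumptions.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)

# Hypothetical plant data: X = fast process measurements, y = delayed
# composition measurement inferred by the soft sensor.
X_train = rng.normal(size=(200, 10))
y_train = X_train[:, :3].sum(axis=1) + 0.1 * rng.normal(size=200)

pls = PLSRegression(n_components=3).fit(X_train, y_train)

# Residual statistics on training data define a detection threshold.
resid = y_train - pls.predict(X_train).ravel()
threshold = 3.0 * resid.std()

def sensor_fault(x_new, y_measured):
    """Flag the composition sensor if its reading disagrees with the model."""
    y_hat = pls.predict(x_new.reshape(1, -1)).ravel()[0]
    return abs(y_measured - y_hat) > threshold

# A biased sensor reading on a normal process sample should be flagged.
x_new = rng.normal(size=10)
print(sensor_fault(x_new, y_measured=x_new[:3].sum() + 5.0))  # expected: True
```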
Abstract:
Purpose – The Six Sigma approach to business improvement has emerged as a phenomenon in both the practitioner and academic literature, with potential for achieving increased competitiveness. However, there is a lack of critical reviews covering both theory and practice. Therefore, the purpose of this paper is to critically review the literature of Six Sigma using a consistent theoretical perspective, namely absorptive capacity.
Design/methodology/approach – The literature from peer-reviewed journals has been critically reviewed using the absorptive capacity framework and dimensions of acquisition, assimilation, transformation, and exploitation.
Findings – There is evidence of emerging theoretical underpinning in relation to Six Sigma borrowing from an eclectic range of organisational theories. However, this theoretical development lags behind practice in the area. The development of Six Sigma in practice is expanding mainly through more rigorous studies and applications in service-based environments (for-profit and not-for-profit). The absorptive capacity framework is found to be a useful overarching framework within which to situate existing theoretical and practice studies.
Research limitations/implications – Agendas for further research, in relation to both theory and practice, have been established from the critical review for each dimension of the absorptive capacity framework.
Practical implications – The paper shows that Six Sigma is both a strategic and an operational issue and that focussing solely on define-measure-analyse-improve-control (DMAIC) based projects can limit the strategic effectiveness of the approach within organisations.
Originality/value – Despite the increasing volume of Six Sigma literature and organisational applications, there is a paucity of critical reviews which cover both theory and practice and which suggest research agendas derived from such reviews.
Abstract:
A key issue in the design of next-generation Internet routers and switches will be the provision of traffic manager (TM) functionality in the datapaths of their high-speed switching fabrics. A new architecture that allows dynamic deployment of different TM functions is presented. By considering the processing requirements of operations such as policing, congestion handling, queuing, shaping and scheduling, a solution has been derived that is scalable and has a consistent programmable interface. Programmability is achieved using a function computation unit, which determines the action (e.g. drop, queue, re-mark, forward) from the packet attribute information and an associated memory store. Results of a Xilinx Virtex-5 FPGA reference design are presented.
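To make the programmable-action idea concrete, the following is a small software model, an assumption-laden sketch rather than the FPGA architecture: a policy function maps packet attributes plus per-flow state held in a memory store to one of the actions named above.

```python
# An illustrative software model (not the hardware design) of a programmable
# traffic-manager action unit: a policy function computes an action from
# packet attributes plus per-flow state held in a memory store. Flow IDs,
# rates and the token-bucket policer are assumptions made for the sketch.
from enum import Enum

class Action(Enum):
    FORWARD = "forward"
    QUEUE = "queue"
    REMARK = "remark"
    DROP = "drop"

# "Memory storage part": per-flow token-bucket state.
flow_state = {}  # flow_id -> available tokens (bytes)

BUCKET_DEPTH = 15000      # bytes
REFILL_PER_PKT = 1500     # bytes credited per arrival (stand-in for a timer)

def tm_action(flow_id: int, pkt_len: int, dscp: int) -> Action:
    """Programmable function unit: attributes + stored state -> action."""
    tokens = min(BUCKET_DEPTH, flow_state.get(flow_id, BUCKET_DEPTH) + REFILL_PER_PKT)
    if pkt_len <= tokens:                 # in profile
        flow_state[flow_id] = tokens - pkt_len
        return Action.QUEUE if dscp >= 46 else Action.FORWARD
    flow_state[flow_id] = tokens          # out of profile
    return Action.REMARK if dscp > 0 else Action.DROP

print(tm_action(flow_id=7, pkt_len=1200, dscp=46))    # Action.QUEUE
print(tm_action(flow_id=7, pkt_len=20000, dscp=0))    # Action.DROP
```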
Abstract:
This paper discusses the monitoring of complex nonlinear and time-varying processes. Kernel principal component analysis (KPCA) has gained significant attention as a monitoring tool for nonlinear systems in recent years, but relies on a fixed model that cannot be employed for time-varying systems. The contribution of this article is the development of a numerically efficient and memory-saving moving window KPCA (MWKPCA) monitoring approach. The proposed technique incorporates an up- and downdating procedure to adapt (i) the data mean and covariance matrix in the feature space and (ii) the approximated eigenvalues and eigenvectors of the Gram matrix. The article shows that the proposed MWKPCA algorithm has a computational complexity of O(N²), whilst batch techniques, e.g. the Lanczos method, are of O(N³). Including the adaptation of the number of retained components and an l-step-ahead application of the MWKPCA monitoring model, the paper finally demonstrates the utility of the proposed technique using a simulated nonlinear time-varying system and recorded data from an industrial distillation column.
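A compact illustration of window-based KPCA monitoring is sketched below. It recomputes the centred Gram matrix and its eigendecomposition over each window, so it deliberately omits the paper's incremental up/downdating (the source of the O(N²) complexity); the kernel, window size and data are assumptions.

```python
# A compact sketch of moving-window KPCA. For clarity this recomputes the
# centred Gram matrix over each window (batch KPCA) instead of performing
# the incremental up/downdating proposed in the paper, so it is O(N^3) per
# step. Kernel width, window size and the simulated data are assumptions.
import numpy as np

def rbf_gram(X, gamma=0.5):
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def window_kpca(X_window, n_components=2):
    """Centred-Gram-matrix KPCA over the current window."""
    N = len(X_window)
    K = rbf_gram(X_window)
    J = np.eye(N) - np.ones((N, N)) / N
    Kc = J @ K @ J                          # centre in feature space
    eigval, eigvec = np.linalg.eigh(Kc)
    idx = np.argsort(eigval)[::-1][:n_components]
    return eigval[idx], eigvec[:, idx]

# Slide a window of 100 samples over a simulated time-varying signal.
rng = np.random.default_rng(1)
data = np.cumsum(rng.normal(size=(500, 3)), axis=0) * 0.01
window = 100
for t in range(window, 103):                # a few steps of the moving window
    lam, _ = window_kpca(data[t - window:t])
    print(t, np.round(lam, 4))
```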
Abstract:
In this paper, a novel video-based multimodal biometric verification scheme using subspace-based low-level feature fusion of face and speech is developed for specific speaker recognition in perceptual human-computer interaction (HCI). In the proposed scheme, the human face is tracked and the face pose is estimated to weight the detected face-like regions in successive frames, where ill-posed faces and false-positive detections are assigned lower credit to enhance the accuracy. In the audio modality, mel-frequency cepstral coefficients (MFCCs) are extracted for voice-based biometric verification. In the fusion step, features from both modalities are projected into a nonlinear Laplacian Eigenmap subspace for multimodal speaker recognition and combined at low level. The proposed approach is tested on a video database of ten human subjects, and the results show that the proposed scheme attains better accuracy than both the conventional multimodal fusion using latent semantic analysis and the single-modality verifications. MATLAB experiments show the potential of the proposed scheme to attain real-time performance for perceptual HCI applications.
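A rough sketch of the low-level fusion step follows: synthetic stand-ins for face and voice features are concatenated, projected into a Laplacian Eigenmap subspace via spectral embedding, and matched by nearest class centroid. This is not the authors' pipeline; in particular the embedding is applied transductively because scikit-learn's SpectralEmbedding has no out-of-sample transform, and all features here are simulated.

```python
# An illustrative sketch of low-level fusion: face and voice feature vectors
# are concatenated, projected into a Laplacian Eigenmap subspace, and matched
# by nearest class centroid. Random features stand in for tracked-face
# descriptors and MFCCs; everything numeric here is an assumption.
import numpy as np
from sklearn.manifold import SpectralEmbedding
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
n_subjects, n_clips = 10, 20

# Hypothetical per-clip features: 64-D face descriptor + 13-D mean MFCC,
# offset per subject so the classes are separable.
offset = np.repeat(np.arange(n_subjects), n_clips)[:, None]
face = rng.normal(size=(n_subjects * n_clips, 64)) + offset
voice = rng.normal(size=(n_subjects * n_clips, 13)) + offset
labels = np.repeat(np.arange(n_subjects), n_clips)

fused = StandardScaler().fit_transform(np.hstack([face, voice]))   # low-level fusion
embedded = SpectralEmbedding(n_components=5, n_neighbors=10).fit_transform(fused)

# Nearest-centroid matching in the embedded subspace.
centroids = np.array([embedded[labels == c].mean(axis=0) for c in range(n_subjects)])
pred = np.argmin(((embedded[:, None, :] - centroids[None, :, :]) ** 2).sum(-1), axis=1)
print("subspace-fusion accuracy:", (pred == labels).mean())
```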
Abstract:
Value-at-risk (VaR) forecasting generally relies on a parametric density function of portfolio returns that ignores higher moments or assumes them constant. In this paper, we propose a simple approach to forecasting portfolio VaR. We employ the Gram-Charlier expansion (GCE), which augments the standard normal distribution with the first four moments, allowed to vary over time. In an extensive empirical study, we compare the GCE approach to other models of VaR forecasting and conclude that it provides accurate and robust estimates of the realized VaR. In spite of its simplicity, on our dataset the GCE outperforms other estimates generated by both constant and time-varying higher-moment models.
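For reference, a Gram-Charlier expanded distribution can be written down and its quantile obtained numerically, which is the core of such a VaR forecast. The sketch below uses fixed, illustrative skewness and excess kurtosis rather than the time-varying moments considered in the paper.

```python
# A minimal sketch of a Gram-Charlier expansion (GCE) VaR calculation:
# the standard normal density is augmented with skewness s and excess
# kurtosis k, and the alpha-quantile of the expanded distribution scales
# the portfolio volatility. The moment values below are illustrative.
import numpy as np
from scipy.optimize import brentq
from scipy.stats import norm

def gce_cdf(z, s, k):
    """CDF of the GCE density f(z) = phi(z)*[1 + s/6*He3(z) + k/24*He4(z)],
    with k taken as excess kurtosis."""
    he2 = z**2 - 1.0
    he3 = z**3 - 3.0 * z
    return norm.cdf(z) - norm.pdf(z) * (s / 6.0 * he2 + k / 24.0 * he3)

def gce_var(alpha, mu, sigma, skew, ex_kurt):
    """alpha-quantile VaR (reported as a positive loss) under the GCE."""
    z_alpha = brentq(lambda z: gce_cdf(z, skew, ex_kurt) - alpha, -10.0, 10.0)
    return -(mu + sigma * z_alpha)

# Illustrative daily portfolio moments.
print(gce_var(alpha=0.01, mu=0.0005, sigma=0.012, skew=-0.6, ex_kurt=2.0))
```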
Abstract:
We propose a simple and flexible framework for forecasting the joint density of asset returns. The multinormal distribution is augmented with a polynomial in the (time-varying) non-central co-moments of the assets. We estimate the coefficients of the polynomial via the Method of Moments for a carefully selected set of co-moments. In an extensive empirical study, we compare the proposed model with a range of other models widely used in the literature. Employing a recently proposed technique as well as standard techniques to evaluate multivariate forecasts, we conclude that the augmented joint density provides highly accurate forecasts of the “negative tail” of the joint distribution.
Abstract:
Power system islanding can improve the continuity of power supply. Synchronous islanded operation enables the islanded system to remain in phase with the main power system while not electrically connected, so avoiding out-of-synchronism re-closure. Specific consideration is required for the multiple-set scenario. In this paper a suitable island management system is proposed, with the emphasis on maximum island flexibility achieved by allowing passive islanding transitions to occur, facilitated by intelligent control. These transitions include island detection, identification, fragmentation, merging and return-to-mains. It can be challenging to detect these transitions while maintaining synchronous islanded operation. The performance of this control system in the presence of a variable wind power infeed is also examined. A MathWorks SimPowerSystems simulation is used to investigate the performance of the island management system. The benefits of, and requirements for, energy storage, communications and distribution system protection for this application are considered.
Abstract:
For any proposed software project, once the software requirements specification has been established, requirements changes may result not only in a modification of the requirements specification but also in a series of modifications of all existing artifacts produced during development. It is therefore necessary to provide effective and flexible management of requirements changes. In this paper, we present an approach to managing requirements changes based on Booth's negotiation-style framework for belief revision. Informally, we consider the current requirements specification as a belief set about the system-to-be. The requirements change request is viewed as new information about the same system-to-be. The process of executing the requirements change is then a process of revising beliefs about the system-to-be. We design a family of belief negotiation models appropriate for different processes of requirements revision, including the setting in which the requirements change request is fully accepted, the setting in which the current requirements specification is fully preserved, and the setting in which the current specification and the change request reach a compromise. In particular, the prioritization of requirements plays an important role in reaching an agreement in each belief negotiation model designed in this paper.
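The compromise setting can be illustrated with a deliberately simplified toy model, not Booth's formal framework: requirements are modelled as signed literals ordered by priority, and the two sides alternately concede their lowest-priority item until the combined set is consistent. All requirement names below are hypothetical.

```python
# A toy sketch, loosely inspired by (not reproducing) negotiation-style
# belief revision for requirements changes: the current specification and
# the change request alternately concede their lowest-priority item until
# the union is consistent. Requirements are signed literals ('+x' / '-x');
# priority is given by list order, highest first. Names are illustrative.
def consistent(reqs):
    """No requirement and its negation are both present."""
    return not any(('-' + r[1:] if r[0] == '+' else '+' + r[1:]) in reqs for r in reqs)

def negotiate(spec, change, spec_concedes_first=True):
    """spec, change: lists ordered from highest to lowest priority."""
    spec, change = list(spec), list(change)
    turn_spec = spec_concedes_first
    while not consistent(set(spec) | set(change)):
        side = spec if turn_spec else change
        if side:
            side.pop()                 # concede the lowest-priority item
        turn_spec = not turn_spec      # alternate which side weakens
    return spec + [c for c in change if c not in spec]

current_spec   = ['+login', '+local_storage', '+offline_mode']
change_request = ['+cloud_sync', '-offline_mode']
print(negotiate(current_spec, change_request))
# ['+login', '+local_storage', '+cloud_sync', '-offline_mode']
```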