902 results for supervisory control and data acquisition
Abstract:
A novel architecture for microwave/millimeter-wave signal generation and data modulation using a fiber-grating-based distributed feedback laser has been proposed in this letter. For demonstration, a 155.52-Mb/s data stream on a 16.9-GHz subcarrier has been transmitted and recovered successfully. The results indicate that this technology will benefit future microwave data transmission systems. © 2006 IEEE.
Abstract:
Contrast sensitivity improves with the area of a sine-wave grating, but why? Here we assess this phenomenon against contemporary models involving spatial summation, probability summation, uncertainty, and stochastic noise. Using a two-interval forced-choice procedure we measured contrast sensitivity for circular patches of sine-wave gratings with various diameters that were blocked or interleaved across trials to produce low and high extrinsic uncertainty, respectively. Summation curves were steep initially, becoming shallower thereafter. For the smaller stimuli, sensitivity was slightly worse for the interleaved design than for the blocked design. Neither area nor blocking affected the slope of the psychometric function. We derived model predictions for noisy mechanisms and extrinsic uncertainty that was either low or high. The contrast transducer was either linear (c^1.0) or nonlinear (c^2.0), and pooling was either linear or a MAX operation. There was either no intrinsic uncertainty, or it was fixed or proportional to stimulus size. Of these 10 canonical models, only the nonlinear transducer with linear pooling (the noisy energy model) described the main forms of the data for both experimental designs. We also show how a cross-correlator can be modified to fit our results and provide a contemporary presentation of the relation between summation and the slope of the psychometric function.
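As a rough illustration of the model class favoured by that study, the sketch below simulates a noisy energy observer for a two-interval forced-choice detection task: contrast is passed through a nonlinear transducer (c^2.0), the transduced responses are pooled linearly over the stimulated area, and additive Gaussian noise limits performance. The noise level, contrast values and number of pooled mechanisms are illustrative assumptions, not parameters fitted in the study.

import numpy as np

rng = np.random.default_rng(0)

def energy_response(contrast, n_pooled, exponent=2.0, noise_sd=1.0):
    # Nonlinear transducer applied to each mechanism, then linear pooling over
    # the stimulated area, then additive Gaussian (late) noise.
    per_mechanism = contrast ** exponent
    pooled = n_pooled * per_mechanism
    return pooled + rng.normal(0.0, noise_sd)

def percent_correct_2ifc(contrast, n_pooled, n_trials=5000):
    # Two-interval forced choice: the observer picks the interval with the
    # larger internal response.
    correct = 0
    for _ in range(n_trials):
        target = energy_response(contrast, n_pooled)
        blank = energy_response(0.0, n_pooled)
        correct += target > blank
    return 100.0 * correct / n_trials

# Illustrative area-summation effect: larger patches pool more mechanisms, so a
# lower contrast reaches the same percent correct.
for n_pooled in (1, 4, 16):
    for contrast in (0.1, 0.2, 0.4):
        print(n_pooled, contrast, percent_correct_2ifc(contrast, n_pooled))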
Abstract:
Building on the ‘law and economics’ literature, this paper analyses the corporate governance implications of debt financing in an environment where a dominant owner is able to extract ex ante ‘private benefits of control’. Ownership concentration may result in lower efficiency, measured as a ratio of a firm’s debt to investment, and this effect depends on the identity of the largest shareholder. Moreover, entrenched dominant shareholder(s) may collude with fixed-claim holders in extracting a ‘control premium’. One possible outcome is a ‘crowding out’ of entrepreneurial firms from the debt market, and this is supported by evidence from the transition economies.
Abstract:
Large prospective trials designed to assess the relationship between metabolic control and cardiovascular (CV) outcomes in type 2 diabetes have entered a new phase of scrutiny due to strict requirements imposed by the FDA to assess new anti-diabetic agents. So what have we learned from recently completed trials, and what do we expect to learn from on-going trials?
Abstract:
The open content creation process has proven itself to be a powerful and influential way of developing text-based content, as demonstrated by the success of Wikipedia and related sites. Distributed individuals independently edit, revise, or refine content, thereby creating knowledge artifacts of considerable breadth and quality. Our study explores the mechanisms that control and guide the content creation process and develops an understanding of open content governance. The repertory grid method is employed to systematically capture the experiences of individuals involved in the open content creation process and to determine the relative importance of the diverse control and guiding mechanisms. Our findings illustrate the important control and guiding mechanisms and highlight the multifaceted nature of open content governance. A range of governance mechanisms is discussed with regard to the varied levels of formality, the different loci of authority, and the diverse interaction environments involved. Limitations and opportunities for future research are provided.
Abstract:
We present the first experimental implementation of a recently designed quasi-lossless fiber span with strongly reduced signal power excursion. The resulting fiber waveguide medium can be advantageously used both in lightwave communications and in all-optical nonlinear data processing. © 2005 IEEE.
Abstract:
High-speed optical clock recovery, demultiplexing and data regeneration will be integral parts of any future photonic network based on high bit-rate OTDM. Much research has been conducted on devices that perform these functions; however, to date each process has been demonstrated only independently. A very promising method of all-optical switching is the semiconductor optical amplifier-based nonlinear optical loop mirror (SOA-NOLM). This has various advantages compared with the standard fiber NOLM, most notably low switching power, compact size and stability. We use the SOA-NOLM as an all-optical mixer in a classical phase-locked loop arrangement to achieve optical clock recovery while at the same time achieving data regeneration in a single compact device.
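The clock-recovery principle described here (a mixer driving a classical phase-locked loop) can be sketched numerically. The toy baseband model below is a minimal sketch assuming an idealised phase detector and a proportional-integral loop filter that steers a local oscillator until it tracks the phase and frequency of an incoming clock; the sample rate, detuning and gain values are arbitrary illustrative choices and do not model the SOA-NOLM device itself.

import numpy as np

fs = 1e6              # sample rate (Hz)
f_clock = 10_000.0    # incoming clock frequency (Hz)
f_vco = 9_990.0       # free-running local oscillator, slightly detuned (Hz)
kp, ki = 200.0, 5.0   # proportional / integral loop-filter gains (Hz per unit error)
n = 200_000

phi_in = 0.8          # incoming clock phase (rad), unknown to the loop
phi_vco = 0.0         # local oscillator phase (rad)
integrator = 0.0
err_trace = np.zeros(n)

for i in range(n):
    phi_in += 2 * np.pi * f_clock / fs
    err = np.sin(phi_in - phi_vco)              # idealised mixer + low-pass: phase-error signal
    integrator += ki * err / fs                  # integral branch of the loop filter
    f_control = f_vco + kp * err + integrator    # loop filter steers the oscillator frequency
    phi_vco += 2 * np.pi * f_control / fs
    err_trace[i] = err

# Once the loop locks, the residual phase error is much smaller than at start-up.
print("mean |phase error|, first 10%:", np.abs(err_trace[: n // 10]).mean())
print("mean |phase error|, last 10%: ", np.abs(err_trace[-n // 10:]).mean())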
Abstract:
Purpose – This paper aims to consider how climate change performance is measured and accounted for within the performance framework for local authority areas in England adopted in 2008. It critically evaluates the design of two mitigation and one adaptation indicators that are most relevant to climate change. Further, the potential for these performance indicators to contribute to climate change mitigation and adaptation is discussed. Design/methodology/approach – The authors begin by examining the importance of the performance framework and the related Local Area Agreements (LAAs), which were negotiated for all local areas in England between central government and Local Strategic Partnerships (LSPs). This development is located within the broader literature relating to new public management. The potential for this framework to assist in delivering the UK's climate change policy objectives is researched in a two-stage process. First, government publications and all 150 LAAs were analysed to identify the level of priority given to the climate change indicators. Second, interviews were conducted in spring 2009 with civil servants and local authority officials from the English West Midlands who were engaged in negotiating the climate change content of the LAAs. Findings – Nationally, the authors find that 97 per cent of LAAs included at least one climate change indicator as a priority. The indicators themselves, however, are perceived to be problematic – in terms of appropriateness, accuracy and timeliness. In addition, concerns were identified about the level of local control over the drivers of climate change performance and, therefore, a question is raised as to how LSPs can be held accountable for this. On a more positive note, for those concerned about climate change, the authors do find evidence that the inclusion of these indicators within the performance framework has helped to move climate change up the agenda for local authorities and their partners. However, actions by the UK's new coalition government to abolish the national performance framework and substantially reduce public expenditure potentially threaten this advance. Originality/value – This paper offers an insight into a new development for measuring climate change performance at a local level, which is relatively under-researched. It also contributes to knowledge of accountability within a local government setting and provides a reference point for further research into the potential role of local actions to address the issue of climate change.
Abstract:
Purpose – The purpose of this paper is to consider hierarchical control as a mode of governance and to analyse the extent of control exhibited by central government over local government through the best value (BV) and comprehensive performance assessment (CPA) performance regimes. Design/methodology/approach – This paper utilises Ouchi's framework and, specifically, his articulation of bureaucratic or hierarchical control in the move towards achievement of organisational objectives. Hierarchical control may be inferred from the extent of “command and control” by central government, the use of rewards and sanctions, alignment to government priorities, and discrimination of performance. Findings – CPA represents a more sophisticated performance regime than BV in the governance of local authorities by central government. In comparison to BV, CPA involved less scope for dialogue with local government prior to introduction, closer inspection of and direction of support toward poorer performing authorities, and more alignment to government priorities in the weightings attached to service blocks. Originality/value – The paper focuses upon the hierarchic/bureaucratic mode of governance as articulated by Ouchi and expands on this mode in order to analyse shifts in performance regimes in the public sector.
Abstract:
This book is very practical in its international usefulness (because current risk practice and understanding is not equal across international boundaries). For example, an accountant in Belgium would want to know what the governance regulations are in that country and what the risk issues are that he/she needs to be aware of. This book covers the international aspects of risk management systems, risk and governance, and risk and accounting. In doing so the book covers topics such as: internal control and corporate governance; risk management systems; integrating risk into performance management systems; risk and audit; governance structures; risk management of pensions; pension scheme risks, e.g. hedging derivatives, longevity bonds, etc.; risk reporting; and the role of the accountant in risk management. There are case studies throughout the book which illustrate, by way of concrete practical examples, the major themes it contains. The book includes highly topical areas such as the Sarbanes-Oxley Act and pension risk management.
Abstract:
This thesis describes advances in the characterisation, calibration and data processing of optical coherence tomography (OCT) systems. Femtosecond (fs) laser inscription was used for producing OCT-phantoms. Transparent materials are generally inert to infrared radiation, but with fs lasers material modification occurs via non-linear processes when the highly focused light source interacts with the material. This modification is confined to the focal volume and is highly reproducible. In order to select the best inscription parameters, combinations of different inscription parameters were tested, using three fs laser systems with different operating properties, on a variety of materials. This facilitated an understanding of the key characteristics of the produced structures, with the aim of producing viable OCT-phantoms. Finally, OCT-phantoms were successfully designed and fabricated in fused silica. The use of these phantoms to characterise many properties (resolution, distortion, sensitivity decay, scan linearity) of an OCT system was demonstrated. Quantitative methods were developed to support the characterisation of an OCT system collecting images from phantoms and also to improve the quality of the OCT images. Characterisation methods include the measurement of the spatially variant resolution (point spread function (PSF) and modulation transfer function (MTF)), sensitivity and distortion. Processing of OCT data is computationally intensive. Standard central processing unit (CPU) based processing might take several minutes to a few hours to process acquired data, so data processing is a significant bottleneck. An alternative is to use expensive hardware-based processing such as field programmable gate arrays (FPGAs). However, graphics processing unit (GPU) based data processing methods have recently been developed to minimise this processing and rendering time. These techniques include standard processing methods, which comprise a set of algorithms to process the raw interference data obtained by the detector and generate A-scans. The work presented here describes accelerated data processing and post-processing techniques for OCT systems. The GPU-based processing developed during the PhD was later implemented in a custom-built Fourier domain optical coherence tomography (FD-OCT) system. This system currently processes and renders data in real time, with throughput currently limited by the camera capture rate. OCT-phantoms have been heavily used for the qualitative characterisation and adjustment/fine-tuning of the operating conditions of OCT systems, and investigations are under way to characterise OCT systems using our phantoms. The work presented in this thesis demonstrates several novel techniques for fabricating OCT-phantoms and accelerating OCT data processing using GPUs. In the process of developing phantoms and quantitative methods, a thorough understanding and practical knowledge of OCT and fs laser processing systems was developed. This understanding led to several novel pieces of research that are not only relevant to OCT but have broader importance. For example, extensive understanding of the properties of fs-inscribed structures will be useful in other photonic applications, such as the making of phase masks, waveguides and microfluidic channels. Acceleration of data processing with GPUs is also useful in other fields.
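As a rough sketch of the standard FD-OCT processing chain referred to above (raw spectral interference data to A-scans), the snippet below performs background subtraction, spectral windowing and a Fourier transform per spectrum; replacing NumPy with a GPU array library such as CuPy is one common way such a pipeline is accelerated. The array shapes, window choice and the omission of wavenumber resampling and dispersion compensation are illustrative simplifications, not the thesis's actual implementation.

import numpy as np

def spectra_to_ascans(raw, background=None):
    # Convert raw FD-OCT spectra (n_ascans x n_pixels) into A-scan magnitude profiles.
    # Steps: subtract the DC/background spectrum, apply a spectral window to reduce
    # side lobes, then Fourier transform each spectrum into depth. A real pipeline
    # would usually also resample from wavelength to wavenumber and compensate
    # dispersion before the transform.
    raw = np.asarray(raw, dtype=np.float64)
    if background is None:
        background = raw.mean(axis=0)                 # estimate the DC term from the frame
    fringes = raw - background                        # keep only the interference fringes
    window = np.hanning(raw.shape[1])                 # apodisation window
    spectrum = np.fft.rfft(fringes * window, axis=1)  # spectrum -> depth profile
    return 20 * np.log10(np.abs(spectrum) + 1e-12)    # A-scans in dB

# Synthetic example: 512 spectra of 2048 pixels containing a single reflector fringe.
k = np.arange(2048)
raw = 1000 + 50 * np.cos(2 * np.pi * 0.05 * k) + np.random.randn(512, 2048)
print(spectra_to_ascans(raw).shape)   # (512, 1025)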