985 results for Digital repositories


Relevance:

20.00%

Publisher:

Abstract:

Information technology and its use in organizational transformation present unprecedented opportunities and risks. Increasingly, Governance of Enterprise Information and Technology (GEIT) competency is needed in the boardroom and the executive. Whether your organization is small or large, public, private or not-for-profit, and whether or not your industry is considered high-tech, IT is impacting your sector – no exceptions. But there is a skills shortage in boards: GEIT capability is concerningly low. This capability is urgently needed across the board, including among directors who come from finance, legal, marketing, operations and HR backgrounds. Digital disruption also affects all occupations. Putting a vision in place will help ensure that emergency responses meet technology-related duty-of-care responsibilities. When GEIT-related forward thinking and planning are carried out at the same time that you put your business strategy and plan in place, your organization has a significantly increased chance of not only surviving, but thriving into the future. Organizations that don’t build GEIT capability risk joining the growing list of once-leading firms left behind in the digital ‘cloud of smoke’. Those that do will be better placed to reap the benefits and hedge against the risks of a digital world. This chapter provides actionable, research-based considerations and processes that boards can use to build awareness, knowledge and skills in governing technology-related organization strategy, risk and value creation.

Relevance:

20.00%

Publisher:

Abstract:

Frequency response analysis is critical to understanding the steady-state and transient behavior of any electrical network. A network analyzer, or frequency response analyzer, is used to determine the frequency response of an electrical network. This paper deals with the design of an inexpensive, digitally controlled network analyzer. The frequency range of the analyzer is from 10 Hz to 50 kHz (a suitable range for system studies on most power electronics apparatus). It is composed of a microcontroller (as the central processing unit) and a personal computer (as analyzer and display). Communication between the microcontroller and the personal computer is established through one of the USB ports. The testing and evaluation of the analyzer are done with RC, RLC and multi-resonant circuits. The design steps, basis of analysis, experimental results, limitations in bandwidth, and possible techniques for performance improvement are presented.
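
To make the measurement principle concrete, the sketch below estimates gain and phase at a single test frequency by correlating the sampled excitation and response with a reference sinusoid (a single-bin DFT), which is the usual basis of a frequency response analyzer. The sampling rate, test frequency and RC example are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def frequency_response_point(u, y, f, fs):
    """Estimate the complex gain Y/U of a network at one test frequency f.

    u : sampled excitation applied to the network
    y : sampled response of the network
    fs: sampling rate in Hz
    Both records are correlated with a complex exponential at f (a single-bin
    DFT), the standard principle behind a frequency response analyzer.
    """
    n = np.arange(len(u))
    ref = np.exp(-2j * np.pi * f * n / fs)    # correlation reference
    H = np.sum(y * ref) / np.sum(u * ref)
    return abs(H), np.degrees(np.angle(H))    # magnitude and phase (degrees)

# Example: a first-order RC low-pass with fc = 1 kHz probed at 1 kHz
# should give roughly 0.707 gain and -45 degrees of phase.
fs, f, fc = 200_000, 1_000.0, 1_000.0
t = np.arange(int(fs * 0.05)) / fs            # 50 ms record, whole cycles of f
u = np.sin(2 * np.pi * f * t)
H_true = 1 / (1 + 1j * f / fc)                # ideal response, used only to make test data
y = abs(H_true) * np.sin(2 * np.pi * f * t + np.angle(H_true))
print(frequency_response_point(u, y, f, fs))
```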

Relevance:

20.00%

Publisher:

Abstract:

Inverse filters are conventionally used for resolving overlapping signals of identical waveshape. However, the inverse filtering approach is shown here to be useful for resolving overlapping signals, identical or otherwise, of unknown waveshape. Digital inverse filter design based on the autocorrelation formulation of linear prediction is known to perform optimum spectral flattening of the input signal for which the filter is designed. This property of the inverse filter is used to accomplish composite signal decomposition. The theory is presented assuming the constituent signals to be responses of all-pole filters; however, the approach may be used in more general situations.
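
As a minimal illustration of the decomposition idea, the sketch below designs a prediction-error (inverse) filter from the autocorrelation of a composite signal via the Levinson-Durbin recursion and uses it to flatten the spectrum, so that the arrival instants of two overlapping all-pole responses show up as residual peaks. The pulse shape, prediction order and amplitudes are assumptions chosen for the example.

```python
import numpy as np
from scipy.signal import lfilter

def levinson_durbin(r, order):
    """Prediction coefficients a (a[0] = 1) from the autocorrelation sequence r[0..order]."""
    a = np.zeros(order + 1)
    a[0] = 1.0
    err = r[0]
    for i in range(1, order + 1):
        acc = r[i] + np.dot(a[1:i], r[i - 1:0:-1])
        k = -acc / err
        a[1:i] = a[1:i] + k * a[i - 1:0:-1]
        a[i] = k
        err *= (1.0 - k * k)
    return a, err

# Example: two overlapping one-pole responses (waveshape treated as unknown)
n = np.arange(400)
pulse = 0.98 ** n                        # impulse response of a single-pole (all-pole) filter
x = np.zeros(400)
x[:len(pulse)] += pulse                  # first arrival at n = 0
x[60:] += 0.7 * pulse[:340]              # second, overlapping arrival at n = 60

r = np.correlate(x, x, 'full')[len(x) - 1:]      # autocorrelation, lag 0 onwards
a, _ = levinson_durbin(r, order=4)
residual = lfilter(a, [1.0], x)                  # inverse (prediction-error) filtering
print(np.argsort(np.abs(residual))[-2:])         # two largest peaks: the arrival instants 0 and 60
```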

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a detailed dynamic digital simulation for studying the phenomenon of torsional interaction between an HVDC system and turbine-generator shaft dynamics, using the novel converter model presented in [1]. The system model includes a detailed representation of the synchronous generator and the shaft dynamics, as well as the ac and dc network transients. The results of a case study indicate the various factors that influence the torsional interaction.
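
The sketch below illustrates only the shaft-dynamics portion of such a study: a lumped two-mass (turbine and generator) torsional model excited by a step change in electrical torque, integrated numerically. All parameter values and the torque disturbance are illustrative assumptions; the paper's system model (converter, ac and dc networks, multi-mass shaft) is far more detailed.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Minimal two-mass torsional shaft sketch (turbine mass coupled to generator mass).
# Parameter values are illustrative only, not taken from the paper.
H_t, H_g = 3.0, 1.5          # inertia constants (s)
K = 30.0                     # shaft stiffness (pu torque per rad)
D = 0.2                      # mutual damping (pu)
w0 = 2 * np.pi * 50          # synchronous speed (rad/s)

def shaft(t, y, T_mech, T_elec):
    d_t, w_t, d_g, w_g = y                        # angles (rad) and speed deviations (pu)
    T_shaft = K * (d_t - d_g) + D * (w_t - w_g)   # torque transmitted through the shaft
    return [w0 * w_t,
            (T_mech - T_shaft) / (2 * H_t),
            w0 * w_g,
            (T_shaft - T_elec(t)) / (2 * H_g)]

# A step change in electrical torque (e.g., a dc-side disturbance) excites the torsional mode
T_elec = lambda t: 0.8 if t < 0.1 else 1.0
y0 = [0.8 / K, 0.0, 0.0, 0.0]                     # start at the pre-disturbance equilibrium twist
sol = solve_ivp(shaft, (0, 2), y0, args=(0.8, T_elec), max_step=1e-3)
print("peak twist angle (rad):", np.max(np.abs(sol.y[0] - sol.y[2])))
```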

Relevance:

20.00%

Publisher:

Abstract:

We propose to compress weighted graphs (networks), motivated by the observation that large networks of social, biological, or other relations can be complex to handle and visualize. In the process, also known as graph simplification, nodes and (unweighted) edges are grouped into supernodes and superedges, respectively, to obtain a smaller graph. We propose models and algorithms for weighted graphs. The interpretation (i.e. decompression) of a compressed, weighted graph is that a pair of original nodes is connected by an edge if their supernodes are connected by one, and that the weight of an edge is approximated by the weight of the superedge. The compression problem then consists of choosing supernodes, superedges, and superedge weights so that the approximation error is minimized while the amount of compression is maximized. In this paper, we formulate this task as the 'simple weighted graph compression problem'. We then propose a much wider class of tasks under the name of the 'generalized weighted graph compression problem'. The generalized task extends the optimization to preserve longer-range connectivities between nodes, not just individual edge weights. We study the properties of these problems and propose a range of algorithms to solve them, with different balances between complexity and quality of the result. We evaluate the problems and algorithms experimentally on real networks. The results indicate that weighted graphs can be compressed efficiently with relatively little compression error.
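
A minimal sketch of the compression and decompression rules described above, assuming an undirected graph stored as an edge-weight dictionary and a given node-to-supernode assignment. The superedge weight is taken as the mean of the weights it replaces, one simple choice that minimizes the squared approximation error for a fixed grouping; the grouping itself and the toy graph are assumptions for illustration.

```python
from collections import defaultdict
from statistics import mean

def compress(edges, grouping):
    """Compress a weighted graph given a node -> supernode assignment.

    edges:    {(u, v): weight} for an undirected weighted graph
    grouping: {node: supernode}
    The superedge weight is the mean of the original weights it represents,
    which is the weight every represented edge receives on decompression.
    """
    buckets = defaultdict(list)
    for (u, v), w in edges.items():
        key = tuple(sorted((grouping[u], grouping[v])))
        buckets[key].append(w)
    return {key: mean(ws) for key, ws in buckets.items()}

def reconstruction_error(edges, superedges, grouping):
    """Sum of squared differences between original and decompressed edge weights."""
    err = 0.0
    for (u, v), w in edges.items():
        key = tuple(sorted((grouping[u], grouping[v])))
        err += (w - superedges[key]) ** 2
    return err

# Toy example: nodes a and b behave alike and are merged into supernode 'AB'
edges = {('a', 'c'): 1.0, ('b', 'c'): 1.2, ('a', 'd'): 2.0, ('b', 'd'): 2.2}
grouping = {'a': 'AB', 'b': 'AB', 'c': 'C', 'd': 'D'}
superedges = compress(edges, grouping)
print(superedges)                                         # {('AB','C'): 1.1, ('AB','D'): 2.1}
print(reconstruction_error(edges, superedges, grouping))  # 0.04
```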

Relevance:

20.00%

Publisher:

Abstract:

This paper describes the cost-benefit analysis of digital long-term preservation (LTP) that was carried out in the context of the Finnish National Digital Library Project (NDL) in 2010. The analysis was based on the assumption that as many as 200 archives, libraries, and museums would share an LTP system. The term ‘system’ is understood here as encompassing not only information technology, but also human resources, organizational structures, policies and funding mechanisms. The cost analysis shows that an LTP system will incur, over the first 12 years, cumulative costs of €42 million, i.e. an average of €3.5 million per annum. Human resources and investments in information technology are the major cost factors. After the initial stages, the analysis predicts annual costs of circa €4 million. The analysis compared scenarios with and without a shared LTP system. The results indicate that a shared system will have remarkable benefits. At the development and implementation stages, a shared system shows an advantage of €30 million against the alternative scenario consisting of five independent LTP solutions. During the later stages, the advantage is estimated at €10 million per annum. The cumulative cost benefit over the first 12 years would amount to circa €100 million.
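
One rough way the headline figures reconcile, assuming the €30 million development-and-implementation advantage accrues over the early years and the €10 million annual advantage applies for roughly the remaining seven years of the 12-year horizon (both stage durations are assumptions, not stated in the abstract):

```latex
% Stage durations are assumed, not stated in the abstract.
\[
  \underbrace{\text{\euro}30\,\text{M}}_{\text{development and implementation}}
  \;+\;
  \underbrace{7 \times \text{\euro}10\,\text{M/year}}_{\text{later stages}}
  \;=\; \text{\euro}100\,\text{M}
  \quad\text{(cumulative benefit over 12 years)}
\]
```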

Relevance:

20.00%

Publisher:

Abstract:

The loss and degradation of forest cover is currently a globally recognised problem. The fragmentation of forests is further affecting the biodiversity and well-being of ecosystems in Kenya as well. This study focuses on two indigenous tropical montane forests in the Taita Hills in southeastern Kenya. The study is part of the TAITA project within the Department of Geography at the University of Helsinki. The study forests, Ngangao and Chawia, are examined using remote sensing and GIS methods. The main data include black-and-white aerial photography from 1955 and true-colour digital camera data from 2004. These data are used to produce aerial mosaics of the study areas. The land cover of the study areas is analysed by visual interpretation, pixel-based supervised classification and object-oriented supervised classification. The change in forest cover is studied with GIS methods using the visual interpretations from 1955 and 2004. Furthermore, the present state of the study forests is assessed with leaf area index and canopy closure parameters retrieved from hemispherical photographs, as well as with additional, previously collected forest health monitoring data. The canopy parameters are also compared with textural parameters from the digital aerial mosaics. This study concludes that the classification of forest areas using true-colour data is not an easy task, although the digital aerial mosaics prove to be very accurate. The best classifications are still achieved with visual interpretation methods, as the accuracies of the pixel-based and object-oriented supervised classification methods are not satisfactory. According to the change detection of land cover in the study areas, the area of indigenous woodland in both forests decreased between 1955 and 2004. However, in Ngangao, the overall woodland area has grown, mainly because of plantations of exotic species. In general, the land cover of both study areas is more fragmented in 2004 than in 1955. Although the forest area has decreased, the forests seem to have a more optimistic future than before, owing to the increasing appreciation of the forest areas.
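
For readers unfamiliar with the pixel-based supervised approach mentioned above, the sketch below classifies every pixel of a true-colour tile from its RGB vector using labelled training pixels. The random forest classifier, the placeholder mosaic and the class names are assumptions for illustration; the study's actual classifier, training data and class scheme are not reproduced here.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Pixel-based supervised classification sketch for a true-colour (RGB) mosaic.
# The mosaic tile and training locations below are random placeholders.
rng = np.random.default_rng(0)
mosaic = rng.integers(0, 256, size=(100, 100, 3), dtype=np.uint8)   # stand-in RGB tile

# Training samples: (row, col) locations with known land-cover labels
train_px = [(5, 5), (10, 80), (90, 20), (50, 50)]
train_labels = ['woodland', 'plantation', 'bare', 'woodland']

X_train = np.array([mosaic[r, c] for r, c in train_px], dtype=float)
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_train, train_labels)

# Classify every pixel by its RGB vector and report class areas in pixels
flat = mosaic.reshape(-1, 3).astype(float)
land_cover = clf.predict(flat).reshape(100, 100)
print(np.unique(land_cover, return_counts=True))
```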

Relevance:

20.00%

Publisher:

Abstract:

The mode I and mode II fracture toughness and the critical strain energy release rate for different concrete-concrete jointed interfaces are experimentally determined using the Digital Image Correlation (DIC) technique. Concrete beams having materials of different compressive strength on either side of a centrally placed vertical interface are prepared and tested under three-point bending in a closed-loop servo-controlled testing machine under crack mouth opening displacement (CMOD) control. Digital images are captured before loading (undeformed state) and at different instances of loading. These images are analyzed using correlation techniques to compute the surface displacements, strain components, crack opening and sliding displacements, load-point displacement, crack length and crack tip location. It is seen that the CMOD and vertical load-point displacement computed using DIC analysis match well with those measured experimentally.
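
The core matching step of DIC can be sketched as follows: a reference subset around a point is located in the deformed image by maximizing the zero-normalised cross-correlation over a search window, giving the integer-pixel displacement. Subpixel interpolation, strain computation and the study's specific correlation software are omitted; the function names and the synthetic speckle test are assumptions.

```python
import numpy as np

def track_subset(ref_img, def_img, center, half=10, search=5):
    """Integer-pixel displacement of one subset via zero-normalised cross-correlation (ZNCC).

    This is the basic matching step of Digital Image Correlation; subpixel
    refinement and strain computation are omitted for brevity.
    """
    r, c = center
    ref = ref_img[r - half:r + half + 1, c - half:c + half + 1].astype(float)
    ref = (ref - ref.mean()) / ref.std()
    best, best_uv = -np.inf, (0, 0)
    for du in range(-search, search + 1):
        for dv in range(-search, search + 1):
            cand = def_img[r + du - half:r + du + half + 1,
                           c + dv - half:c + dv + half + 1].astype(float)
            cand = (cand - cand.mean()) / cand.std()
            score = np.mean(ref * cand)            # ZNCC similarity
            if score > best:
                best, best_uv = score, (du, dv)
    return best_uv                                  # (row, column) displacement in pixels

# Synthetic check: shift a random speckle pattern by (3, -2) pixels
rng = np.random.default_rng(1)
speckle = rng.random((200, 200))
shifted = np.roll(np.roll(speckle, 3, axis=0), -2, axis=1)
print(track_subset(speckle, shifted, center=(100, 100)))   # expect (3, -2)
```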

Relevance:

20.00%

Publisher:

Abstract:

A computerized non-linear least-squares regression procedure to analyse galvanostatic current-potential data for kinetically hindered reactions on porous gas-diffusion electrodes is reported. The simulated data fit well with the corresponding measured values. The analytical estimates of the electrode-kinetic parameters and the uncompensated resistance are found to be in good agreement with their respective values obtained from Tafel plots and the current-interrupter method. The procedure circumvents the need to collect data in the limiting-current region, where the polarization values are usually prone to error. The polarization data for two typical cases, namely methanol oxidation on a carbon-supported platinum-tin electrode and oxygen reduction on a Nafion-coated platinized carbon electrode, are successfully analysed.
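
A minimal sketch of the regression idea, assuming a simple Tafel-plus-ohmic-drop polarization model rather than the paper's detailed porous gas-diffusion electrode model: the Tafel slope and the uncompensated resistance are recovered directly from galvanostatic data by least-squares fitting, without needing limiting-current measurements. The model form, synthetic data and parameter values are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def polarization(i, a, b, Ru):
    """Stand-in polarization model: E = a - b*log10(i) - i*Ru.

    a  : lumped constant (absorbs reversible potential and exchange current)
    b  : Tafel slope (V/decade)
    Ru : uncompensated resistance (ohm cm^2)
    """
    return a - b * np.log10(i) - i * Ru

# Synthetic galvanostatic data: current density (A/cm^2) vs potential (V)
i = np.logspace(-4, -1, 25)
true = (0.60, 0.060, 0.35)                       # a, b, Ru used to generate the data
rng = np.random.default_rng(2)
E = polarization(i, *true) + rng.normal(0, 1e-3, i.size)

popt, pcov = curve_fit(polarization, i, E, p0=(0.5, 0.05, 0.1))
print("fitted a, Tafel slope b, Ru:", popt)      # should recover ~(0.60, 0.060, 0.35)
```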

Relevance:

20.00%

Publisher:

Abstract:

One of the main disturbances in EEG signals is EMG artefacts generated by muscle movements. In this paper, the use of a linear-phase FIR digital low-pass filter with finite-wordlength-precision coefficients, designed using the compensation procedure, is proposed to minimise EMG artefacts in contaminated EEG signals. To make the filtering more effective, different structures are used, i.e. cascading, twicing and sharpening (apart from simple low-pass filtering) of the designed FIR filter. Modifications are proposed to the twicing and sharpening structures to regain the linear-phase characteristics that are lost in conventional twicing and sharpening operations. The efficacy of all these transformed filters in minimising EMG artefacts is studied, using SNR improvement as a performance measure for simulated signals. Time plots of the signals are also compared. The studies show that the modified sharpening structure is superior in performance to all the other proposed methods. These algorithms have also been applied to a real recorded EMG-contaminated EEG signal. Comparison of the time plots, and also of the output SNR, shows that the proposed modified sharpened structure works better in minimising EMG artefacts than the other methods considered.
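
The conventional twicing and sharpening structures referred to above can be sketched in a few lines: with a low-pass prototype H(z), twicing realizes 2H - H^2 (Tukey) and sharpening realizes 3H^2 - 2H^3 (Kaiser-Hamming), both implemented as repeated passes through the same filter. The sketch below compares their passband and stopband behaviour; the EEG sampling rate, cutoff and filter length are assumptions, and the paper's linear-phase-preserving modifications are not reproduced here.

```python
import numpy as np
from scipy.signal import firwin, freqz

# Conventional twicing and sharpening of an FIR low-pass prototype H(z):
#   twicing   : H_t(z) = 2*H(z) - H(z)^2
#   sharpening: H_s(z) = 3*H(z)^2 - 2*H(z)^3
fs = 250.0                                   # EEG sampling rate (Hz), illustrative
h = firwin(numtaps=51, cutoff=30.0, fs=fs)   # linear-phase low-pass prototype

def pad(x, n):                               # zero-pad an impulse response to length n
    return np.concatenate([x, np.zeros(n - len(x))])

h2 = np.convolve(h, h)                       # impulse response of H^2
h3 = np.convolve(h2, h)                      # impulse response of H^3
n = len(h3)
h_twice = 2 * pad(h, n) - pad(h2, n)
h_sharp = 3 * pad(h2, n) - 2 * pad(h3, n)

for name, taps in [("prototype", h), ("twicing", h_twice), ("sharpening", h_sharp)]:
    w, H = freqz(taps, worN=2048, fs=fs)
    passband = np.max(np.abs(np.abs(H[w < 20]) - 1))    # worst passband deviation below 20 Hz
    stopband = np.max(np.abs(H[w > 45]))                # worst stopband leakage above 45 Hz
    print(f"{name:10s}  passband deviation {passband:.4f}   stopband peak {stopband:.4f}")
```

The printout illustrates the usual trade-off: twicing flattens the passband but weakens the stopband, while sharpening improves both, at the cost of more filtering passes.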

Relevance:

20.00%

Publisher:

Abstract:

We present, through the use of Petri nets, modeling techniques for digital systems realizable using FPGAs. These Petri net models are used for logic validation at the logic design phase. The technique is illustrated by modeling practical circuits. Further, the utility of the technique with respect to timing analysis of the modeled digital systems is considered. Copyright (C) 1997 Elsevier Science Ltd.
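
A toy sketch of the modeling idea: places hold tokens, and a transition fires when all of its input places are marked, consuming and producing tokens. The request/acknowledge fragment below is a made-up example for illustration, not a circuit from the paper, and timing annotations are omitted.

```python
from dataclasses import dataclass, field

@dataclass
class PetriNet:
    marking: dict                                    # place -> token count
    transitions: dict = field(default_factory=dict)  # name -> (input places, output places)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"transition {name!r} is not enabled")
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# Request/acknowledge hand-shake between two blocks of a digital system
net = PetriNet(marking={'idle': 1, 'req': 0, 'ack': 0})
net.add_transition('issue_request', ['idle'], ['req'])
net.add_transition('acknowledge',  ['req'],  ['ack'])
net.add_transition('complete',     ['ack'],  ['idle'])

for t in ['issue_request', 'acknowledge', 'complete']:
    net.fire(t)
print(net.marking)        # cycle returns to the initial marking {'idle': 1, 'req': 0, 'ack': 0}
```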

Relevance:

20.00%

Publisher:

Abstract:

We address the problem of exact complex-wave reconstruction in digital holography. We show that, by confining the object-wave modulation to one quadrant of the frequency domain, and by maintaining a reference-wave intensity higher than that of the object, one can achieve exact complex-wave reconstruction in the absence of noise. A feature of the proposed technique is that the zero-order artifact, which is commonly encountered in hologram reconstruction, can be completely suppressed in the absence of noise. The technique is noniterative and nonlinear. We also establish a connection between the reconstruction technique and homomorphic signal processing, which enables an interpretation of the technique from the perspective of deconvolution. Another key contribution of this paper is a direct link between the reconstruction technique and the two-dimensional Hilbert transform formalism proposed by Hahn. We show that this connection leads to explicit Hilbert transform relations between the magnitude and phase of the complex wave encoded in the hologram. We also provide results on simulated as well as experimental data to validate the accuracy of the reconstruction technique. (C) 2011 Optical Society of America
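
The single-quadrant idea can be illustrated with a conventional linear baseline: if the object wave's spectral support is confined to one quadrant and the reference is stronger, keeping only that quadrant of the hologram spectrum recovers the complex object wave. The sketch below shows this baseline on simulated data; it is not the paper's exact nonlinear reconstruction, and the carrier frequencies and phase modulation are assumptions.

```python
import numpy as np

# Single-quadrant Fourier filtering of a simulated hologram.
# Note: the object here has constant amplitude, so |O|^2 is a pure dc term and
# this linear filtering happens to be exact; in general it is not, which is
# what motivates the paper's nonlinear reconstruction.
N = 256
y, x = np.mgrid[0:N, 0:N]

# Object wave whose carrier lies in the (+fx, +fy) quadrant, plus a stronger reference
obj = 0.3 * np.exp(1j * (2 * np.pi * (40 * x + 30 * y) / N
                         + 0.5 * np.sin(2 * np.pi * y / N)))
ref = 1.0
hologram = np.abs(ref + obj) ** 2                 # recorded intensity

# Keep only the first quadrant of the spectrum (where the +1 order lives)
H = np.fft.fft2(hologram)
mask = np.zeros_like(H)
mask[1:N // 2, 1:N // 2] = 1.0                    # exclude the dc row and column
obj_rec = np.fft.ifft2(H * mask)

print("max reconstruction error:", np.max(np.abs(obj_rec - obj)))   # ~ machine precision
```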

Relevance:

20.00%

Publisher:

Abstract:

An in-situ power monitoring technique for Dynamic Voltage and Threshold Scaling (DVTS) systems is proposed, which measures the total power consumed by the load circuit using a sleep transistor acting as a power sensor. Design details of the power monitor are examined using a simulation framework in a UMC 90 nm CMOS process. Experimental results of a test chip fabricated in an AMS 0.35 µm CMOS process are presented. The test chip has variable activity between 0.05 and 0.5 and provides PMOS VTH control through the n-well contact. The maximum resolution obtained from the power monitor is 0.25 mV. The power-consumption overhead of the power monitor is 0.244 mW (2.2% of the total power of the load circuit). Lastly, the power monitor is used to demonstrate a closed-loop DVTS system. The DVTS algorithm shows 46.3% power savings using the in-situ power monitor.
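
The closed-loop use of such a monitor can be sketched abstractly: read the measured power, check the timing slack, and nudge VDD down and VTH up while slack remains. Everything below (the power and delay models, step sizes and limits) is a placeholder assumption; the chip's actual DVTS algorithm and its 46.3% savings figure are not reproduced by this sketch.

```python
# Closed-loop DVTS control sketch using an in-situ power reading as feedback.

def read_power_monitor(vdd, vth, activity=0.3):
    """Stand-in for the in-situ sensor: dynamic plus leakage power (arbitrary units)."""
    dynamic = activity * vdd ** 2
    leakage = 0.05 * vdd * 10 ** (-(vth - 0.2) / 0.1)   # leakage roughly exponential in VTH
    return dynamic + leakage

def timing_slack(vdd, vth, required_delay=1.6):
    """Stand-in delay model; positive slack means timing is still met."""
    delay = vdd / (vdd - vth) ** 1.3
    return required_delay - delay

def dvts_step(vdd, vth):
    """One control iteration: scale down while slack allows, back off if timing is violated."""
    if timing_slack(vdd, vth) > 0.05:        # comfortable slack: save power
        vdd, vth = vdd - 0.01, vth + 0.005
    elif timing_slack(vdd, vth) < 0.0:       # timing violated: restore margin
        vdd, vth = vdd + 0.01, vth - 0.005
    return vdd, vth

vdd, vth = 1.2, 0.30
p0 = read_power_monitor(vdd, vth)
for _ in range(200):
    vdd, vth = dvts_step(vdd, vth)
print(f"VDD={vdd:.2f} V, VTH={vth:.3f} V, power {read_power_monitor(vdd, vth) / p0:.0%} of initial")
```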