35 results for DIGITAL DATA
Abstract:
In this letter, we show how a 2.4-GHz retrodirective array operating in a multipath-rich environment can be used to spatially encrypt digital data. For the first time, we give experimental evidence that digital data carrying no mathematical encryption can be successfully recovered only when it is detected with a receiver polarization-matched to a reference continuous-wave (CW) pilot-tone signal. In addition, we show that successful detection with a low bit error rate (BER) occurs only within a tightly constrained spatial region colocated with the CW reference signal. Together, these effects mean that the signal cannot be intercepted and its modulated data recovered anywhere other than the constrained spatial region around the position from which the retrodirective communication was initiated.
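A minimal simulation sketch of the effect described above (a BPSK-over-AWGN stand-in, not the authors' 2.4-GHz testbed; all parameters are illustrative): a receiver whose polarization is rotated by an angle theta relative to the incoming wave sees the field amplitude scaled by cos(theta), so its effective SNR, and with it the BER, degrades sharply away from the matched condition.

```python
# BPSK-over-AWGN stand-in for the polarization-matching effect: a receive
# antenna rotated by `theta` from the incoming wave's polarization sees the
# field amplitude scaled by cos(theta), so the empirical BER rises steeply
# as the mismatch grows.
import numpy as np

rng = np.random.default_rng(0)

def ber_bpsk(snr_db: float, theta_rad: float, n_bits: int = 200_000) -> float:
    """Empirical BER of BPSK when the receiver is polarization-rotated."""
    bits = rng.integers(0, 2, n_bits)
    symbols = 2.0 * bits - 1.0                  # map {0,1} -> {-1,+1}
    gain = np.cos(theta_rad)                    # polarization mismatch loss
    noise_std = 10 ** (-snr_db / 20)            # unit-energy symbols
    received = gain * symbols + noise_std * rng.standard_normal(n_bits)
    return float(np.mean((received > 0).astype(int) != bits))

for deg in (0, 30, 60, 85):
    print(f"mismatch {deg:2d} deg -> BER {ber_bpsk(8.0, np.radians(deg)):.4f}")
```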
Abstract:
This paper investigates the application of complex wavelet transforms to the field of digital data hiding. Complex wavelets offer improved directional selectivity and shift invariance over their discretely sampled counterparts, allowing for better adaptation of watermark distortions to the host media. Two methods of deriving visual models for the watermarking system are adapted to the complex wavelet transforms and their performances are compared. To improve capacity, a spread transform embedding algorithm is devised; it combines the robustness of spread-spectrum methods with the high capacity of quantization-based methods. Using established information-theoretic methods, limits of watermark capacity are derived that demonstrate the superiority of complex wavelets over discretely sampled wavelets. Finally, results for the algorithm against commonly used attacks demonstrate its robustness and the improved performance offered by complex wavelet transforms.
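The spread transform idea referred to above can be illustrated with a generic spread-transform dither-modulation sketch (not the paper's complex-wavelet implementation; the spreading vector, step size, and block length are assumptions): one bit is embedded by quantizing the projection of a block of host coefficients onto a spreading vector, so the distortion is spread across the block while detection keeps the capacity of a quantization scheme.

```python
# Generic spread-transform dither modulation: hide one bit by quantizing the
# host's projection onto a spreading vector to one of two interleaved
# lattices, then moving the host only along that spreading direction.
import numpy as np

rng = np.random.default_rng(1)

def embed_bit(host: np.ndarray, bit: int, spread: np.ndarray, delta: float) -> np.ndarray:
    """Embed one bit by quantizing the host's projection onto `spread`."""
    proj = host @ spread
    offset = (delta / 2) * bit                      # lattice for this bit value
    quantized = np.round((proj - offset) / delta) * delta + offset
    return host + (quantized - proj) * spread       # distortion along `spread` only

def extract_bit(signal: np.ndarray, spread: np.ndarray, delta: float) -> int:
    """Decide which of the two lattices the projection is closer to."""
    proj = signal @ spread
    d0 = abs(proj - np.round(proj / delta) * delta)
    d1 = abs(proj - (np.round((proj - delta / 2) / delta) * delta + delta / 2))
    return int(d1 < d0)

n = 64
spread = rng.standard_normal(n)
spread /= np.linalg.norm(spread)                    # unit-norm spreading vector
host = 10 * rng.standard_normal(n)
marked = embed_bit(host, 1, spread, delta=1.0)
noisy = marked + 0.1 * rng.standard_normal(n)       # mild additive attack
print(extract_bit(noisy, spread, delta=1.0))        # expect 1
```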
Abstract:
A substantial amount of the 'critical mass' of digital data available to scholarship contains place-names, and it is now recognised that spatial and temporal data points, including place-names, are a vital part of the e-research infrastructure that supports the use, re-use, and advanced analysis of data using ICT tools and methods. Place-names can also be linked semantically to contribute to the web of data, to enrich content by linking existing data, and to identify new collections for digitization that strategically enhance existing digital collections. However, existing e-projects rely on modern gazetteers, limiting them to the modern and the near-contemporary. This workshop explored how to further integrate the wealth of historical place-name scholarship, and the resulting digital resources generated within UK academia, so enabling the integration of local knowledge over much longer periods.
Abstract:
The Irish Pavilion at the Venice Architecture Biennale 2012 charts a position for Irish architecture in a global culture where the modes of production of architecture are radically altered. Ireland is one of the most globalised countries in the world, yet it has developed a national culture of architecture derived from local place as a material construct. We now have to evolve this understanding in the light of the globalised nature of economic processes and architectural production, which is largely dependent on internationally networked flows of products, data, and knowledge. We have only just begun to represent this situation to ourselves and others. How should a global architecture be grounded culturally and philosophically? How does it position itself outside of shared national reference points?
heneghan peng architects were selected as participants because they are working across three continents on a range of competition-winning projects. Several of these are on sensitive and/or symbolic sites, among them three UNESCO World Heritage sites: the Grand Egyptian Museum in Cairo, the Giant's Causeway Visitor Centre in Northern Ireland, and the new Rhine Bridge near the Lorelei.
Our dialogue led us to discuss how the universal languages of projective geometry and number are shared by architects and related professionals. In the work of heneghan peng, the specific embodiment of these geometries is carefully calibrated by the choice of materials and the detailed design of their physical performance on site. The stone façade of the Giant's Causeway Visitor Centre takes precise measure of the properties of the volcanic basalt seams from which it is hewn. The extraction of the stone is the subject of the pavilion wall drawings, which record the cutting of stones to create the façade of the Causeway centre.
We also identified water as an element shared across the different sites. Venice is a perfect place to take the measure of this element, which suggests links to another site: the Nile Valley, enriched by the annual flooding of the River Nile. An ancient Egyptian rod for measuring the water level of the Nile inspired the design of the Nilometre, a responsive oscillating bench that invites visitors to balance their respective weights. This action embodies the ways of thinking that are evolving to operate in the globalised world, where the autonomous architectural object is dissolving into an expanded field of conceptual rules and systems. The bench constitutes a shifting ground located in the unstable field of Venice. It is about the measurement and calibration of the weight of the body in relation to other bodies, in relation to the site of the installation, and in relation to water. The exhibit is located in the Artiglierie section of the Arsenale. Its level is calibrated against the mark of the acqua alta in the adjacent brickwork of the building, which records a liminal moment in the fluctuating level of the lagoon.
The weights of bodies, the level of water, and changes over time are constant aspects of design across cultures; collectively they constitute a common ground for architecture, a ground shared with other design professionals. The movement of the bench required complex engineering design and active collaboration between the architects, engineers, and fabricators. It is a kind of prototype: a physical object produced from digital data that explores the mathematics at play. The see-saw motion invites the observer to become a participant, to give it a test drive. It shows how a simple principle can generate complex effects that are difficult to predict, and invites visitors to experiment and play with these effects.
Abstract:
Dynamic power consumption is highly dependent on interconnect, so clever mapping of digital signal processing algorithms to parallelised realisations with data locality is vital. This is a particular problem for fast algorithm implementations, where designers will typically have sacrificed circuit structure for efficiency in software implementation. This study outlines an approach for reducing the dynamic power consumption of a class of fast algorithms by minimising the index space separation; this allows the generation of field-programmable gate array (FPGA) implementations with reduced power consumption. It is shown how a 50% reduction in relative index space separation results in measured power gains of 36% and 37% over a Cooley-Tukey fast Fourier transform (FFT)-based solution, for actual power measurements of a Xilinx Virtex-II FPGA implementation and circuit measurements of a Xilinx Virtex-5 implementation respectively. The authors show the generality of the approach by applying it to a number of other fast algorithms, namely the discrete cosine, discrete Hartley, and Walsh-Hadamard transforms.
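As an illustration of the quantity being minimised (a crude proxy, not the authors' exact cost model), the sketch below totals the operand index distances over all butterflies of a radix-2 Cooley-Tukey FFT; mappings that shrink this total shorten interconnect and hence cut dynamic power.

```python
# Illustrative proxy for "index space separation": in a radix-2
# decimation-in-frequency FFT, each stage pairs indices (i, i + span), so
# summing |i - j| over every butterfly gives a crude separation figure for
# the standard Cooley-Tukey ordering.
def index_space_separation(n: int) -> int:
    """Total operand index distance over all butterflies of an n-point FFT."""
    assert n & (n - 1) == 0, "n must be a power of two"
    total = 0
    span = n // 2
    while span >= 1:
        for block_start in range(0, n, 2 * span):
            for i in range(block_start, block_start + span):
                total += span          # |i - (i + span)| = span
        span //= 2
    return total

for n in (8, 16, 1024):
    print(n, index_space_separation(n))   # grows as (n/2)(n-1)
```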
Abstract:
This article proposes that a complementary relationship exists between the formalised nature of digital loyalty card data and the informal nature of small business market orientation. A longitudinal, case-based research approach was used to analyse this relationship in small firms given access to Tesco Clubcard data. The findings reveal a new-found structure and precision in small-firm marketing planning arising from data exposure; this complemented, rather than conflicted with, an intuitive feel for markets. In addition, small-firm owners were encouraged to include employees in marketing planning.
Abstract:
Digital image analysis is at a crossroads. While the technology has made great strides over the past few decades, there is an urgent need for image analysis to inform the next wave of large-scale tissue biomarker discovery studies in cancer. Drawing parallels with the growth of next-generation sequencing, this presentation will consider the case for a common language, or standard format, for storing and communicating digital image analysis data. In this context, image analysis data comprises more than simply an image with markups and attached key-value pair metrics. The desire to objectively benchmark competing platforms, or a push for data to be deposited in public repositories much as genomics data is, may drive the need for a standard that also encompasses granular, cell-by-cell data.
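As a purely hypothetical illustration of what a granular, cell-by-cell record with attached key-value metrics might contain (every field name below is invented for illustration; no such standard is proposed here), consider:

```python
# Hypothetical sketch of a cell-by-cell interchange record: per-cell geometry
# and classification plus open-ended key-value metrics, alongside image-level
# summary metrics. All names are invented, not an existing standard.
from dataclasses import dataclass, field, asdict
import json

@dataclass
class Cell:
    cell_id: int
    x_um: float                      # centroid x, microns
    y_um: float                      # centroid y, microns
    classification: str              # e.g. "tumour", "stroma"
    metrics: dict[str, float] = field(default_factory=dict)

@dataclass
class ImageAnalysisResult:
    image_uri: str
    platform: str                    # analysis software and version
    summary_metrics: dict[str, float] = field(default_factory=dict)
    cells: list[Cell] = field(default_factory=list)

result = ImageAnalysisResult(
    image_uri="slide_001.svs",
    platform="example-analyser 1.0",
    summary_metrics={"tumour_fraction": 0.34},
    cells=[Cell(1, 120.5, 88.2, "tumour", {"dab_od_mean": 0.61})],
)
print(json.dumps(asdict(result), indent=2))
```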
Abstract:
A problem with the use of the geostatistical Kriging error for optimal sampling design is that the design does not adapt locally to the character of spatial variation, because a stationary variogram or covariance function is a parameter of the geostatistical model. The objective of this paper was to investigate the utility of non-stationary geostatistics for optimal sampling design. First, a contour data set of Wiltshire was split into 25 equal sub-regions and a local variogram was predicted for each. These variograms were fitted with models, and the coefficients were used in Kriging to select optimal sample spacings for each sub-region. Large differences existed between the designs for the whole region (based on the global variogram) and for the sub-regions (based on the local variograms). Second, a segmentation approach was used to divide a digital terrain model into separate segments. Segment-based variograms were predicted and fitted with models, and optimal sample spacings were then determined for the whole region and for the sub-regions. It was demonstrated that the global design was inadequate, grossly over-sampling some segments while under-sampling others.
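The logic of variance-constrained spacing selection can be sketched as follows (a simplified stand-in for the paper's procedure; the spherical model, the four-node neighbourhood, and all parameter values are assumptions): for each sub-region's fitted variogram, pick the coarsest grid spacing whose worst-case ordinary-Kriging variance stays below a tolerance.

```python
# Simplified stand-in for variogram-driven sampling design: fit parameters
# differ per sub-region, so the admissible grid spacing differs too.
import numpy as np

def spherical(h, nugget, sill, range_):
    """Spherical variogram model (gamma(0) = 0; gamma = sill beyond the range)."""
    h = np.asarray(h, dtype=float)
    g = nugget + (sill - nugget) * (1.5 * h / range_ - 0.5 * (h / range_) ** 3)
    return np.where(h >= range_, sill, np.where(h == 0.0, 0.0, g))

def ok_variance(spacing, nugget, sill, range_):
    """Ordinary-Kriging variance at a grid-cell centre from its 4 grid nodes."""
    nodes = np.array([[0.0, 0.0], [spacing, 0.0], [0.0, spacing], [spacing, spacing]])
    target = np.array([spacing / 2.0, spacing / 2.0])
    n = len(nodes)
    A = np.ones((n + 1, n + 1))            # Kriging system with Lagrange row
    A[n, n] = 0.0
    for i in range(n):
        for j in range(n):
            A[i, j] = spherical(np.linalg.norm(nodes[i] - nodes[j]), nugget, sill, range_)
    b = np.ones(n + 1)
    b[:n] = spherical(np.linalg.norm(nodes - target, axis=1), nugget, sill, range_)
    w = np.linalg.solve(A, b)
    return float(w @ b)                    # sigma^2 = sum(lambda_i * gamma_i0) + mu

def coarsest_spacing(tolerance, nugget, sill, range_):
    """Largest spacing (searched downward from the range) meeting the tolerance."""
    for s in np.arange(range_, 0.0, -range_ / 200.0):
        if ok_variance(s, nugget, sill, range_) <= tolerance:
            return float(s)
    return None

# Two sub-regions with different local variograms demand different designs:
print(coarsest_spacing(0.4, nugget=0.05, sill=1.0, range_=100.0))  # smooth sub-region
print(coarsest_spacing(0.4, nugget=0.30, sill=1.0, range_=30.0))   # noisier sub-region
```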
Abstract:
A family of stochastic gradient algorithms, and their behaviour in a data echo cancellation platform, are presented. The cost function adaptation algorithms use an error-exponent update strategy based on an absolute error mapping that is updated at every iteration. The quadratic and non-quadratic cost functions are special cases of the new family. Several possible realisations of these approaches are introduced. The noisy-error problem is discussed, and a digital recursive filter estimator is proposed. Simulation results confirm the effectiveness of the proposed family of algorithms.
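A generic sketch of the cost-function-adaptation idea (the exact error-to-exponent mapping is the papers' contribution; the saturating linear map used here is only an illustrative stand-in): minimise J = |e|^p / p by stochastic gradient, re-choosing the exponent p from the instantaneous absolute error at every iteration.

```python
# Adaptive-exponent stochastic gradient identification: the update
# w += mu * p * |e|^(p-1) * sign(e) * u follows from the cost |e|^p / p,
# with p remapped from |e| at each step (illustrative map, not the papers').
import numpy as np

rng = np.random.default_rng(2)

def cfa_identify(x, d, n_taps, mu, p_min=1.0, p_max=4.0, scale=1.0):
    """Adaptive FIR identification with a per-sample error exponent."""
    w = np.zeros(n_taps)
    for k in range(n_taps - 1, len(x)):
        u = x[k - n_taps + 1:k + 1][::-1]      # regressor [x_k, ..., x_{k-N+1}]
        e = d[k] - w @ u                       # a-priori error
        # Large |e| -> large p (fast convergence); small |e| -> small p
        # (low residual error). The mapping saturates at p_max.
        p = p_min + (p_max - p_min) * min(abs(e) / scale, 1.0)
        w += mu * p * np.abs(e) ** (p - 1) * np.sign(e) * u
    return w

# Toy echo-path identification: unknown 8-tap channel, white input.
h = 0.5 * rng.standard_normal(8)
x = rng.standard_normal(50_000)
d = np.convolve(x, h)[:len(x)] + 0.01 * rng.standard_normal(len(x))
w = cfa_identify(x, d, n_taps=8, mu=0.001)
print(np.round(w - h, 3))                      # should be near zero
```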
Abstract:
For a digital echo canceller it is desirable to reduce the adaptation time, during which the transmission of useful data is not possible. LMS is non-optimal in this case, as the signals involved are statistically non-Gaussian. Walach and Widrow (IEEE Trans. Inform. Theory 30(2) (March 1984) 275-283) investigated the use of a power of 4, while other research established algorithms with an arbitrary integer power (Pei and Tseng, IEEE J. Selected Areas Commun. 12(9) (December 1994) 1540-1547) or a non-quadratic power (Shah and Cowan, IEE Proc.-Vis. Image Signal Process. 142(3) (June 1995) 187-191). This paper suggests that continuous and automatic adaptation of the error exponent gives a more satisfactory result. The proposed family of cost function adaptation (CFA) stochastic gradient algorithms allows an increased convergence rate and an improved residual error. As a special case, the staircase CFA algorithm is presented first; the smooth CFA algorithm is then developed. Implementation details are also discussed. Simulation results are provided to show the properties of the proposed family of algorithms.
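The two variants named above can be contrasted schematically (the thresholds and the tanh form below are illustrative stand-ins, not the paper's update rules): the staircase variant jumps between a few fixed exponents as the error shrinks, while the smooth variant interpolates the exponent continuously from the same error measure.

```python
# Schematic contrast of the staircase and smooth CFA exponent schedules;
# threshold values and the tanh shape are illustrative only.
import numpy as np

def staircase_exponent(abs_e: float) -> float:
    """Jump between a few fixed exponents as the error shrinks."""
    if abs_e > 1.0:
        return 4.0      # far from convergence: aggressive cost
    if abs_e > 0.1:
        return 2.0      # mid-range: ordinary LMS behaviour
    return 1.0          # near convergence: sign-like, low residual error

def smooth_exponent(abs_e: float, p_min=1.0, p_max=4.0, scale=0.5) -> float:
    """Continuously interpolate the exponent from the same error measure."""
    return p_min + (p_max - p_min) * float(np.tanh(abs_e / scale))

for e in (2.0, 0.5, 0.05):
    print(f"|e|={e:4.2f}  staircase p={staircase_exponent(e):.1f}  "
          f"smooth p={smooth_exponent(e):.2f}")
```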
Abstract:
Previous research suggests that the digital cushion, a shock-absorbing structure in the claw, plays an important role in protecting cattle from lameness. This study aimed to assess the degree to which nutritional factors influence the composition of the digital cushion, by quantifying differences in lipid content and fatty acid composition in digital cushion tissue from cattle offered diets with different amounts of linseed. Forty-six bulls were allocated to 1 of 4 treatments, applied for an average of 140 ± 27 d during the finishing period. The treatments consisted of a linseed supplement offered once daily on top of the basal diet (grass silage:concentrate) at 0, 400, 800, or 1,200 g of supplement/animal per day. For each treatment, the concentrate offered was adjusted to ensure that total estimated ME intake was constant across treatments. Target BW at slaughter was 540 kg. Legs were collected in 3 batches after 120, 147, and 185 d on experiment. Six samples of the digital cushion were dissected from the right lateral hind claw of each animal. Lipids were extracted and expressed as a proportion of fresh tissue, and the fatty acid composition of the digital cushion was determined by gas chromatography. Data were analyzed by ANOVA, with diet, location within the digital cushion, and their interactions as fixed effects and fat content (grams per 100 g of tissue) as a covariate. Linear and quadratic contrasts were examined. The lipid content of digital cushion tissue differed between sampling locations (P
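A minimal sketch of the analysis model described above (the study's own software is not stated, so a statsmodels OLS fit is assumed; the file and all column names are hypothetical stand-ins):

```python
# Fixed-effects model with diet, sampling location, their interaction, and
# fat content as a covariate, as described in the abstract; implemented here
# with statsmodels OLS. Data file and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

# Assumed layout: one row per tissue sample with columns
#   diet      - linseed supplement level (0, 400, 800, 1200 g/d)
#   location  - sampling location within the digital cushion (1..6)
#   fat       - fat content, g/100 g tissue (the covariate)
#   response  - e.g. proportion of a given fatty acid
df = pd.read_csv("digital_cushion_samples.csv")    # hypothetical file

model = smf.ols("response ~ C(diet) * C(location) + fat", data=df).fit()
print(model.summary())
```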