59 results for computation- and data-intensive applications
Abstract:
The n-tuple recognition method is briefly reviewed, summarizing the main theoretical results. Large-scale experiments carried out on StatLog project datasets confirm this method as a viable competitor to more popular methods, owing to its speed, simplicity, and accuracy on the majority of a wide variety of classification problems. A further investigation into the failure of the method on certain datasets finds the problem to be largely due to a mismatch between the scales that describe generalization and data sparseness.
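For readers unfamiliar with the method, the sketch below (an illustration, not the paper's implementation) shows the core of an n-tuple classifier: each tuple samples n random bit positions of a binary input, training records the address patterns seen per class, and classification sums tuple matches across classes.

```python
import numpy as np

class NTupleClassifier:
    def __init__(self, n_bits, n_tuples=50, n=8, n_classes=2, seed=0):
        rng = np.random.default_rng(seed)
        # Each tuple is a fixed random subset of n input bit positions.
        self.tuples = [rng.choice(n_bits, size=n, replace=False)
                       for _ in range(n_tuples)]
        self.powers = 2 ** np.arange(n)
        # One Boolean lookup table per (tuple, class) pair.
        self.tables = np.zeros((n_tuples, n_classes, 2 ** n), dtype=bool)

    def _addresses(self, x):
        # Convert the sampled bits of x into one table address per tuple.
        return [int(x[t] @ self.powers) for t in self.tuples]

    def fit(self, X, y):
        # X: binary (0/1) integer arrays; y: class labels.
        for x, c in zip(X, y):
            for i, addr in enumerate(self._addresses(x)):
                self.tables[i, c, addr] = True

    def predict(self, x):
        # Score each class by how many tuples have seen its address before.
        scores = [sum(self.tables[i, c, addr]
                      for i, addr in enumerate(self._addresses(x)))
                  for c in range(self.tables.shape[1])]
        return int(np.argmax(scores))

# Toy usage: separate two 16-bit patterns (illustrative only).
X = np.array([[0] * 8 + [1] * 8, [1] * 8 + [0] * 8])
clf = NTupleClassifier(n_bits=16)
clf.fit(X, [0, 1])
print(clf.predict(X[0]), clf.predict(X[1]))   # -> 0 1
```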
Abstract:
Hierarchical visualization systems are desirable because a single two-dimensional visualization plot may not be sufficient to capture all of the interesting aspects of complex high-dimensional data sets. We extend an existing locally linear hierarchical visualization system, PhiVis [1], in several directions: (1) we allow for non-linear projection manifolds (the basic building block is the Generative Topographic Mapping, GTM), (2) we introduce a general formulation of hierarchical probabilistic models consisting of local probabilistic models organized in a hierarchical tree, and (3) we describe the folding patterns of the low-dimensional projection manifold in the high-dimensional data space by computing and visualizing the manifold's local directional curvatures. Quantities such as magnification factors [3] and directional curvatures are helpful for understanding the layout of the nonlinear projection manifold in the data space and for further refinement of the hierarchical visualization plot. Like PhiVis, our system is statistically principled and is built interactively in a top-down fashion using the EM algorithm. We demonstrate the principle of the approach on a complex 12-dimensional data set and mention possible applications in the pharmaceutical industry.
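As an illustration of one of the quantities mentioned above, the following sketch (an assumption-laden stand-in, not the PhiVis/GTM code) estimates the local magnification factor of any smooth latent-to-data mapping f as sqrt(det(J^T J)), with the Jacobian J approximated by finite differences.

```python
import numpy as np

def magnification_factor(f, z, eps=1e-5):
    # Local area magnification of f at latent point z: sqrt(det(J^T J)),
    # with the Jacobian J estimated column-by-column by finite differences.
    z = np.asarray(z, dtype=float)
    f0 = f(z)
    J = np.stack([(f(z + eps * e) - f0) / eps
                  for e in np.eye(len(z))], axis=1)
    return np.sqrt(np.linalg.det(J.T @ J))

# Toy 2-D latent space embedded in a 3-D data space (illustrative mapping).
f = lambda z: np.array([z[0], z[1], np.sin(z[0]) * np.cos(z[1])])
print(magnification_factor(f, [0.5, -0.3]))
```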
Abstract:
The development of sensing devices is one of the instrumentation fields that has grown rapidly in the last decade. Alongside the swift advance of microelectronic sensors, optical fibre sensors have been widely investigated because of their advantages over electronic sensors, such as their wavelength-multiplexing capability and high sensitivity to temperature, pressure, strain, vibration and acoustic emission. Moreover, optical fibre sensors are more attractive than electronic sensors because they can perform distributed sensing, covering a reasonably large area with a single piece of fibre. Apart from being a responsive element in the sensing field, optical fibre possesses good assets for generating, distributing, processing and transmitting signals in the future broadband information network. These assets include wide bandwidth, high capacity and low loss, which grant mobility and flexibility for wireless access systems. Among these core technologies, fibre-optic signal processing and the transmission of optical and radio-frequency signals are the subjects of study in this thesis. Based on the intrinsic properties of single-mode optical fibre, this thesis aims to exploit fibre characteristics such as thermal sensitivity, birefringence, dispersion and nonlinearity in applications of temperature sensing and radio-over-fibre systems. By exploiting the fibre's thermal sensitivity, a fully distributed temperature sensing system consisting of an apodised chirped fibre Bragg grating has been implemented. The proposed system has proven efficient in characterising the grating and providing information on the temperature variation, location and width of the heat source applied in the area under test. To exploit the fibre's birefringence, a fibre delay line filter using a single high-birefringence optical fibre structure has been presented. The proposed filter can be reconfigured and programmed by adjusting the input azimuth of the launched light, as well as the strength and direction of the applied coupling, to meet the requirements of signal processing for different purposes in microwave photonic and optical filtering applications. To exploit the fibre's dispersion and nonlinearity, experimental investigations have been carried out to study their joint effect in high-power double-sideband and single-sideband modulated links in the presence of fibre loss. The experimental results have been verified theoretically using an in-house implementation of the split-step Fourier method applied to the generalised nonlinear Schrödinger equation. A further simulation study of the inter-modulation distortion in two-tone signal transmission has also been presented to show the effect of one channel's nonlinearity on the other. In addition to the experimental work, numerical simulations have been carried out on all the proposed systems to ensure that all aspects of concern are comprehensively investigated.
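The split-step Fourier method mentioned above is standard enough to sketch. The version below solves only the basic scalar NLSE with second-order dispersion, Kerr nonlinearity and loss (typical SMF-like parameter values, all assumed here), not the thesis's generalised in-house implementation.

```python
import numpy as np

def ssfm(A, dt, L, dz, beta2=-21e-27, gamma=1.3e-3, alpha=4.6e-5):
    """Propagate field envelope A over length L (m) in steps of dz (m)."""
    w = 2 * np.pi * np.fft.fftfreq(len(A), dt)            # angular frequency grid
    D = np.exp((1j * beta2 / 2 * w**2 - alpha / 2) * dz)  # dispersion + loss
    for _ in range(int(round(L / dz))):
        A = np.fft.ifft(np.fft.fft(A) * D)          # linear step (frequency domain)
        A *= np.exp(1j * gamma * np.abs(A)**2 * dz)  # nonlinear phase (time domain)
    return A

# 10 ps Gaussian pulse, 100 mW peak power, over 10 km of fibre.
t = np.linspace(-100e-12, 100e-12, 1024)
A0 = np.sqrt(0.1) * np.exp(-t**2 / (2 * (10e-12)**2))
A_out = ssfm(A0, t[1] - t[0], L=10e3, dz=10.0)
```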
Abstract:
The use of digital communication systems is increasing rapidly. This is due to their lower implementation cost compared with analogue transmission and, at the same time, the ease with which several types of data source (data, digitised speech and video, etc.) can be mixed. The emergence of packet broadcast techniques as an efficient form of multiplexing, especially with the use of contention random multiple access protocols, has led to wide-spread application of these distributed access protocols in local area networks (LANs) and to their further extension to radio and mobile radio communication applications. In this research, a modified version of the distributed access contention protocol using the packet broadcast switching technique is proposed. Carrier sense multiple access with collision avoidance (CSMA/CA) is found to be the most appropriate protocol, with the ability to satisfy equally the operational requirements of local area networks and of radio and mobile radio applications. The suggested version of the protocol is designed so that all the desirable features of its predecessors are maintained, while the shortcomings are eliminated and additional features are added to strengthen its ability to work over radio and mobile radio channels. The operational performance of the protocol has been evaluated for the non-persistent and slotted non-persistent variants, through mathematical and simulation modelling. The results obtained from the two modelling procedures validate the accuracy of both methods, and the protocol compares favourably with its predecessor, CSMA/CD (with collision detection). A further extension of the protocol to multichannel operation has been suggested, and two multichannel systems based on CSMA/CA for medium access are proposed: the dynamic multichannel system, which is based on two types of channel selection, random choice (RC) and idle choice (IC), and the sequential multichannel system. The latter has been proposed in order to suppress the hidden-terminal effect, which is always a major problem when contention random multiple access protocols are used over radio and mobile radio channels. Their performance has been verified using mathematical modelling for the dynamic system and simulation modelling for the sequential system. Both systems are found to improve operation and fault tolerance when compared with single-channel operation.
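As a point of reference for the performance evaluation described above, the sketch below evaluates the classical Kleinrock-Tobagi throughput formula for plain non-persistent CSMA; the thesis's modified CSMA/CA model is not reproduced here.

```python
import numpy as np

def throughput_np_csma(G, a):
    # S = G e^{-aG} / (G(1 + 2a) + e^{-aG})   (Kleinrock & Tobagi, 1975)
    return G * np.exp(-a * G) / (G * (1 + 2 * a) + np.exp(-a * G))

G = np.logspace(-2, 2, 9)            # offered traffic (normalised)
for a in (0.01, 0.1):                # propagation delay / packet time
    print(f"a={a}:", np.round(throughput_np_csma(G, a), 3))
```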
Abstract:
Perception of Mach bands may be explained by spatial filtering ('lateral inhibition') that can be approximated by 2nd-derivative computation, and several alternative models have been proposed. To distinguish between them, we used a novel set of 'generalised Gaussian' images, in which the sharp ramp-plateau junction of the Mach ramp was replaced by smoother transitions. The images ranged from a slightly blurred Mach ramp to a Gaussian edge and beyond, and also included a sine-wave edge. The probability of seeing Mach bands increased with the (relative) sharpness of the junction, but was largely independent of absolute spatial scale. These data did not fit the predictions of MIRAGE, nor of 2nd-derivative computation at a single fine scale. In experiment 2, observers used a cursor to mark features on the same set of images. Data on the perceived position of Mach bands did not support the local energy model. The perceived width of Mach bands was poorly explained by a single-scale edge-detection model, despite its previous success with Mach edges (Wallis & Georgeson, 2009, Vision Research, 49, 1886-1893). A more successful model used separate (odd and even) scale-space filtering for edges and bars, local peak detection to find candidate features, and the MAX operator to compare odd- and even-filter response maps (Georgeson, VSS 2006, Journal of Vision 6(6), 191a). Mach bands are seen when there is a local peak in the even-filter (bar) response map AND that peak value exceeds the corresponding responses in the odd-filter (edge) maps.
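The final sentence describes a decision rule that is easy to sketch. The toy implementation below uses plain Gaussian-derivative filters and normalises each response map by its own maximum as a crude stand-in for the model's calibrated channel sensitivities; it is a schematic illustration, not the published model.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def mach_band_candidates(profile, sigma=4.0):
    # Odd (1st-derivative, edge) and even (2nd-derivative, bar) channels.
    odd = np.abs(gaussian_filter1d(profile, sigma, order=1))
    even = np.abs(gaussian_filter1d(profile, sigma, order=2))
    # Per-map normalisation: an assumed stand-in for channel calibration.
    odd, even = odd / odd.max(), even / even.max()
    peaks = (even[1:-1] > even[:-2]) & (even[1:-1] > even[2:])
    wins = even[1:-1] > odd[1:-1]    # MAX rule: bar channel beats edge channel
    return np.where(peaks & wins)[0] + 1

# A Mach ramp: dark plateau, linear ramp, light plateau.
x = np.concatenate([np.zeros(100), np.linspace(0.0, 1.0, 100), np.ones(100)])
print(mach_band_candidates(x))       # candidates near the two junctions
```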
Abstract:
The objective of the work described was to identify and synthesize a range of biodegradable hypercoiling or hydrophobically associating polymers to mimic natural apoproteins, such as those found in lung surfactant or plasma apolipoproteins. Stirred interfacial polymerization was used to synthesize potentially biodegradable aromatic polyamides (Mw of 12,000-26,000) based on L-lysine, L-lysine ethyl ester, L-ornithine and DL-diaminopropionic acid, by reaction with isophthaloyl chloride. A similar technique was used to synthesize aliphatic polyamides based on L-lysine ethyl ester and either adipoyl chloride or glutaryl chloride, resulting in the synthesis of poly(lysine ethyl ester adipamide) [PLETESA] or poly(lysine ethyl ester glutaramide) (Mw of 126,000 and 26,000, respectively). PLETESA was found to be soluble in both polar and non-polar solvents, and the hydrophobic/hydrophilic balance could be modified by partial saponification (66-75%) of the ethyl ester side chains. Surface or interfacial tension/pH profiles were used to assess the conformation of both the poly(isophthalamides) and partially saponified PLETESA in aqueous solution. The results demonstrated that a loss of charge from the polymer was accompanied by an initial fall in surface activity, followed by a rise in activity, and ultimately by polymer precipitation. These observations were explained by a collapse of the polymer chains into non-surface-active intramolecular coils, followed by a transition to an amphipathic conformation, and finally to a collapsed hydrophobe. Two-dimensional NMR analysis of polymer conformation in polar and non-polar solvents revealed intramolecular associations between the hydrophobic groups within partially saponified PLETESA. Unsaponified PLETESA appeared to form a coiled structure in polar solvents, in which the ethyl ester side chains were contained within the polymer coil. The implications of the secondary structure of PLETESA and potential biomedical applications are discussed.
Abstract:
INTAMAP is a web processing service for the automatic interpolation of measured point data. The requirements were (i) to use open standards for spatial data, such as those developed in the context of the Open Geospatial Consortium (OGC), (ii) to use a suitable environment for statistical modelling and computation, and (iii) to produce an open-source solution. The system couples the 52°North web processing service, which accepts data in the form of an Observations and Measurements (O&M) document, with a computing back-end realized in the R statistical environment. The probability distribution of interpolation errors is encoded with UncertML, a new markup language for encoding uncertain data. Automatic interpolation needs to be useful for a wide range of applications, and the algorithms have been designed to cope with anisotropies and extreme values. In the light of the INTAMAP experience, we discuss the lessons learnt.
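INTAMAP's back-end runs in R; as a language-neutral illustration of the same idea (automatic interpolation that also returns an error distribution), this sketch fits a Gaussian process to scattered point data, with hyperparameters estimated automatically by maximum likelihood in place of automatic variogram fitting.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern, WhiteKernel

rng = np.random.default_rng(1)
X = rng.uniform(0, 10, size=(50, 2))             # measurement locations
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=50)  # measured values (synthetic)

# Hyperparameters are fitted automatically by maximum likelihood, playing
# the role of INTAMAP's automatic variogram estimation.
gp = GaussianProcessRegressor(Matern(nu=1.5) + WhiteKernel()).fit(X, y)
grid = np.stack(np.meshgrid(np.linspace(0, 10, 25),
                            np.linspace(0, 10, 25)), axis=-1).reshape(-1, 2)
mean, std = gp.predict(grid, return_std=True)    # interpolation + uncertainty
```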
Abstract:
This research examines the role of the information management process within a process-oriented enterprise, Xerox Ltd. The research approach is based on a post-positive paradigm and has resulted in thirty-five idiographic statements. The three major outcomes are: 1. The process-oriented holistic enterprise is an organisation that requires a long-term management commitment to its development. It depends on the careful management of people, tasks, information and technology. A complex integration of business processes is required; this can be managed through consistent documentation techniques, clarity in the definition of process responsibilities, and management attention to global metrics. Centralisation of the management of the process model is critical to its success. 2. The role of the information management process within a process-oriented enterprise is to provide flexible and cost-effective application, technology and process support to the business. This is best achieved by centralising the management of information management and of the process model. A business-led approach, combined with the consolidation of the application, information, process and data architectures, is central to providing effective business- and process-focused support. 3. In a process-oriented holistic enterprise, process and information management are inextricably linked. The model of process management depends heavily on information management, whilst the model of information management is totally focused on supporting and creating the process model. The two models are mutually creating: one cannot exist without the other. Process and information management thus form a duality.
Abstract:
Substantial altimetry datasets collected by different satellites have only become available during the past five years, but the future will bring a variety of new altimetry missions, both parallel and consecutive in time. The characteristics of each dataset vary with the orbital height and inclination of the spacecraft, as well as with the technical properties of the radar instrument. An integral analysis of datasets with different properties offers advantages in terms of both data quantity and data quality. This thesis is concerned with the development of the means for such integral analysis, in particular for dynamic solutions in which precise orbits for the satellites are computed simultaneously. The first half of the thesis discusses the theory and numerical implementation of dynamic multi-satellite altimetry analysis. The most important aspect of this analysis is the application of dual-satellite altimetry crossover points as a bi-directional tracking data type in simultaneous orbit solutions. The central problem is that the spatial and temporal distributions of the crossovers conflict with the time-organised nature of traditional solution methods. Their application to the adjustment of the orbits of both satellites involved in a dual crossover therefore requires several fundamental changes to the classical least-squares prediction/correction methods. The second part of the thesis applies the developed numerical techniques to the problems of precise orbit computation and gravity field adjustment, using the altimetry datasets of ERS-1 and TOPEX/Poseidon. Although these two datasets can be considered less compatible than those of planned future satellite missions, the results obtained adequately illustrate the merits of a simultaneous solution technique. In particular, the geographically correlated orbit error is partially observable from a dataset consisting of crossover differences between two sufficiently different altimetry datasets, while being unobservable from the analysis of the altimetry data of either satellite individually. This error signal, which has a substantial gravity-induced component, can be employed advantageously in simultaneous solutions for the two satellites in which the harmonic coefficients of the gravity field model are also estimated.
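The crossover idea can be illustrated with a toy least-squares problem (not the thesis software): each dual-satellite crossover difference observes the difference of the two satellites' radial orbit errors, here modelled as simple once-per-revolution harmonics with assumed coefficients; stacking many crossovers ties both orbits together in one joint solution.

```python
import numpy as np

rng = np.random.default_rng(2)
t = rng.uniform(0, 2 * np.pi, size=(200, 2))   # crossover phases (sat A, sat B)

def design_row(ta, tb):
    # Radial error of each satellite: a*cos(t) + b*sin(t) (once per rev).
    return np.array([np.cos(ta), np.sin(ta), -np.cos(tb), -np.sin(tb)])

truth = np.array([0.30, -0.10, 0.05, 0.20])      # metres, assumed values
A = np.array([design_row(ta, tb) for ta, tb in t])
d = A @ truth + 0.02 * rng.normal(size=len(t))   # crossover differences + noise
est, *_ = np.linalg.lstsq(A, d, rcond=None)
print(np.round(est, 3))                          # recovers the four coefficients
```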
Abstract:
The aim of the research project was to gain a complete and accurate accounting of the needs and deficiencies of materials selection and design data, with particular attention given to the feasibility of a computerised materials selection system that would include application analysis, property data and screening techniques. The project also investigates and integrates the three major aspects of materials resources, materials selection and materials recycling. Consideration of the materials resource base suggests that, though our discovery potential has increased, geologic availability is the ultimate determinant, and several metals may well become scarce at the same time, thus compounding the problem of substitution. With around 2 to 20 million units of engineering materials data, the use of a computer is the only logical answer for scientific selection of materials. The system developed at Aston is used for data storage, mathematical computation and output, and enables programs to be run in batch and interactive (on-line) mode. With modification, the programs can also handle such variables as the quantity of mineral resources, the energy cost of materials, and the depletion and utilisation rates of strategic materials. The work also carries out an in-depth study of copper recycling in the U.K. and concludes that somewhere in the region of 2 million tonnes of copper is missing from the recycling cycle. It also sets out guidelines on product design and conservation policies from the recyclability point of view.
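A minimal illustration of the screening step such a system performs over its property database is sketched below; the material names and figures are invented for the example and are not the Aston system's data.

```python
import pandas as pd

materials = pd.DataFrame({
    "name":            ["mild steel", "Al 6061", "copper", "PEEK"],
    "density_kg_m3":   [7850, 2700, 8960, 1320],
    "yield_MPa":       [250, 275, 70, 100],
    "cost_gbp_per_kg": [0.8, 2.5, 8.0, 60.0],
})

# Screening: keep candidates meeting minimum strength and maximum density
# requirements, then rank the survivors by cost.
candidates = materials.query("yield_MPa >= 100 and density_kg_m3 <= 8000")
print(candidates.sort_values("cost_gbp_per_kg"))
```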
Abstract:
Recently, there has been great interest in pushing communication technologies to 100 Gb/s. However, many challenges remain in performing high-speed (>40 Gb/s) clock and data recovery and data time-division multiplexing (TDM). Here, we propose and numerically analyze an asynchronous optical packet retimer using parabolic or sinusoidal phase modulation and linear dispersion, a scheme we name pulse position locking (PPL). Numerical simulation shows that the scheme can effectively resynchronize input signals with arbitrary delays to the local clock and reduce input jitter. It can also be applied to time-division multiplex 10 Gb/s and 40 Gb/s signals to over 100 Gb/s.
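A toy numerical sketch of the retiming mechanism (not the paper's simulation): a parabolic temporal phase C*t^2 frequency-shifts a mistimed pulse in proportion to its timing offset, and a matched amount of linear dispersion (phi2 = 1/(2C)) converts that shift back into a compensating time shift. All parameter values are illustrative.

```python
import numpy as np

N, dt = 4096, 50e-15                     # grid: 4096 points, 50 fs spacing
t = (np.arange(N) - N // 2) * dt
w = 2 * np.pi * np.fft.fftfreq(N, dt)

tau, offset, C = 2e-12, 5e-12, 1e24      # pulse width, timing error, chirp rate
A = np.exp(-((t - offset) ** 2) / (2 * tau ** 2))

A = A * np.exp(1j * C * t ** 2)          # parabolic phase modulation
phi2 = 1.0 / (2 * C)                     # dispersion matched to the chirp rate
A = np.fft.ifft(np.fft.fft(A) * np.exp(1j * phi2 * w ** 2 / 2))

centroid = np.sum(t * np.abs(A) ** 2) / np.sum(np.abs(A) ** 2)
print(f"residual timing error: {centroid * 1e12:.3f} ps")   # ~0 ps
```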
Abstract:
The development of new all-optical technologies for data processing and signal manipulation is a field of growing importance with a strong potential for numerous applications in diverse areas of modern science. Nonlinear phenomena occurring in optical fibres have many attractive features and great, but not yet fully explored, potential in signal processing. Here, we review recent progress on the use of fibre nonlinearities for the generation and shaping of optical pulses and on the applications of advanced pulse shapes in all-optical signal processing. Amongst other topics, we will discuss ultrahigh repetition rate pulse sources, the generation of parabolic shaped pulses in active and passive fibres, the generation of pulses with triangular temporal profiles, and coherent supercontinuum sources. The signal processing applications will span optical regeneration, linear distortion compensation, optical decision at the receiver in optical communication systems, spectral and temporal signal doubling, and frequency conversion.
Abstract:
In this paper, we discuss some practical implications of implementing adaptable network algorithms applied to non-stationary time series problems. Two real-world data sets, containing electricity load demands and foreign exchange market prices, are used to test several different methods, ranging from linear models with fixed parameters to non-linear models which adapt both parameters and model order on-line. Training with the extended Kalman filter, we demonstrate that the dynamic model-order increment procedure of the resource-allocating RBF network (RAN) is highly sensitive to the parameters of the novelty criterion. We investigate the use of system noise for increasing the plasticity of the Kalman filter training algorithm, and discuss the consequences for on-line model-order selection. The results of our experiments show that there are advantages to be gained in tracking real-world non-stationary data through the use of more complex adaptive models.
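The novelty criterion referred to above is compactly expressible. In the sketch below, a new RBF unit is allocated only when the input is far from every existing centre AND the prediction error is large; parameter updates use plain LMS as a simple stand-in for the paper's extended Kalman filter training, and all thresholds are assumed values.

```python
import numpy as np

class RAN:
    def __init__(self, eps=0.5, e_min=0.05, width=0.7, lr=0.05):
        self.eps, self.e_min, self.width, self.lr = eps, e_min, width, lr
        self.centres, self.weights = [], []

    def predict(self, x):
        return sum(w * np.exp(-np.sum((x - c) ** 2) / (2 * self.width ** 2))
                   for c, w in zip(self.centres, self.weights))

    def observe(self, x, y):
        err = y - self.predict(x)
        dist = min((np.linalg.norm(x - c) for c in self.centres),
                   default=np.inf)
        if dist > self.eps and abs(err) > self.e_min:
            # Novelty criterion satisfied: allocate a new unit at the input.
            self.centres.append(np.array(x, dtype=float))
            self.weights.append(err)
        else:
            # Otherwise adapt existing weights (LMS stand-in for the EKF).
            for i, c in enumerate(self.centres):
                phi = np.exp(-np.sum((x - c) ** 2) / (2 * self.width ** 2))
                self.weights[i] += self.lr * err * phi

net = RAN()
for s in np.linspace(0, 6, 300):          # a slowly varying 1-D signal
    net.observe(np.array([s]), np.sin(s))
print(len(net.centres), "units allocated")
```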
Abstract:
This paper analyses the relationship between production subsidies and firms’ export performance using a very comprehensive and recent firm-level database and controlling for the endogeneity of subsidies. It documents robust evidence that production subsidies stimulate export activity at the intensive margin, although this effect is conditional on firm characteristics. In particular, the positive relationship between subsidies and the intensive margin of exports is strongest among profit-making firms, firms in capital-intensive industries, and those located in non-coastal regions. Compared to firm characteristics, the extent of heterogeneity across ownership structure (SOEs, collectives, and privately owned firms) proves to be relatively less important.
Abstract:
We introduce a flexible visual data mining framework which combines advanced projection algorithms from the machine learning domain and visual techniques developed in the information visualization domain. The advantage of such an interface is that the user is directly involved in the data mining process. We integrate principled projection algorithms, such as generative topographic mapping (GTM) and hierarchical GTM (HGTM), with powerful visual techniques, such as magnification factors, directional curvatures, parallel coordinates and billboarding, to provide a visual data mining framework. Results on a real-life chemoinformatics dataset using GTM are promising and have been analytically compared with the results from the traditional projection methods. It is also shown that the HGTM algorithm provides additional value for large datasets. The computational complexity of these algorithms is discussed to demonstrate their suitability for the visual data mining framework.