25 results for data storage concept

in Aston University Research Archive


Relevance: 90.00%

Abstract:

The increasing demand for high-capacity data storage requires decreasing the head-to-tape gap and reducing the track width. A problem very often encountered is the development of adhesive debris (stains) on the heads at low humidity and high temperatures, which can increase the spacing between head and media and thus decrease the playback signal. The influence of stains on the playback signal of read heads is studied using RAW (Read After Write) tests, and their influence on the wear of the heads by using an indentation technique. The playback signal is found to vary, and the errors to increase, as stains form a patchy pattern and grow in size to form a continuous layer. The indentation technique shows that stains reduce the wear rate of the heads. In addition, the wear tends to be more pronounced at the leading edge of the head than at the trailing edge. Chemical analysis of the stains, using ferrite samples in conjunction with MP (metal particulate) tapes, shows that stains contain iron particles and polymeric binder transferred from the MP tape. The chemical anchors in the binder that grip the iron particles also react with the ferrite surface to create strong chemical bonds. At high humidity, a thin layer of iron oxyhydroxide forms on the surface of the ferrite; this soft material increases the wear rate and so reduces the amount of stain present on the heads. The stability of the binder under high humidity and high temperature, as well as the chemical reactions that can occur on the ferrite poles of the heads, influences the dynamic behaviour of stains. A model of stain formation is proposed that takes into account the channels of binder degradation and their evolution under different environmental conditions.

Relevance: 80.00%

Abstract:

Information systems have developed to the stage that there is plenty of data available in most organisations, but there are still major problems in turning that data into information for management decision making. This thesis argues that the link between decision support information and transaction processing data should be through a common object model which reflects the real world of the organisation and encompasses the artefacts of the information system. The CORD (Collections, Objects, Roles and Domains) model is developed, which is richer in appropriate modelling abstractions than current object models. A flexible Object Prototyping tool based on a Semantic Data Storage Manager has been developed which enables a variety of models to be stored and experimented with. A statistical summary table model, COST (Collections of Objects Statistical Table), has been developed within CORD and is shown to be adequate to meet the modelling needs of Decision Support and Executive Information Systems. The COST model is supported by a statistical table creator and editor, COSTed, which is also built on top of the Object Prototyper and uses the CORD model to manage its metadata.
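
The CORD abstractions themselves are not spelled out in this abstract. Purely as an illustrative sketch of how Collections, Objects, Roles and Domains might fit together, assuming nothing beyond the expansion of the acronym (all class names, fields and example values below are hypothetical, not taken from the thesis):

```python
from dataclasses import dataclass, field

# Hypothetical sketch of CORD-style abstractions: a Domain constrains
# attribute values, an object may play several Roles, and a Collection
# groups objects. An illustration only, not the thesis's actual model.
@dataclass
class Domain:
    name: str
    allowed: set                     # permitted values for this domain

@dataclass
class Role:
    name: str                        # e.g. "customer" or "supplier"
    attributes: dict = field(default_factory=dict)

@dataclass
class CordObject:
    identity: str
    roles: dict = field(default_factory=dict)

    def play(self, role: Role) -> None:
        self.roles[role.name] = role

@dataclass
class Collection:
    name: str
    members: list = field(default_factory=list)

# One real-world organisation playing two roles within a collection.
acme = CordObject("ACME Ltd")
acme.play(Role("customer", {"credit_limit": 10000}))
acme.play(Role("supplier", {"rating": "A"}))
organisations = Collection("organisations", [acme])
print(sorted(organisations.members[0].roles))  # ['customer', 'supplier']
```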

Relevance: 80.00%

Abstract:

The aim of the research project was to gain a complete and accurate accounting of the needs and deficiencies of materials selection and design data, with particular attention given to the feasibility of a computerised materials selection system that would include application analysis, property data and screening techniques. The project also investigates and integrates the three major aspects of materials resources, materials selection and materials recycling. Consideration of the materials resource base suggests that, though our discovery potential has increased, geologic availability is the ultimate determinant, and several metals may well become scarce at the same time, thus compounding the problem of substitution. With around 2 to 20 million units of engineering materials data, the use of a computer is the only logical answer for scientific selection of materials. The system developed at Aston is used for data storage, mathematical computation and output, and enables programs to be run in batch and interactive (on-line) mode. With modification, the programs can also handle such variables as quantity of mineral resources, energy cost of materials, and depletion and utilisation rates of strategic materials. The work also carries out an in-depth study of copper recycling in the U.K. and concludes that somewhere in the region of 2 million tonnes of copper is missing from the recycling cycle. It also sets out guidelines on product design and conservation policies from the recyclability point of view.
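
As a rough sketch of the screening step such a system performs over its stored property data (the record layout, cost figures and thresholds below are illustrative assumptions, not data from the thesis):

```python
# Illustrative property-based screening: filter candidate materials
# against an application's requirements, then rank the survivors.
# Density (g/cm3) and yield strength (MPa) are typical handbook values;
# the cost figures are invented for the example.
materials = [
    {"name": "Al 6061-T6", "density": 2.70, "yield_MPa": 276, "cost_kg": 2.1},
    {"name": "Ti-6Al-4V",  "density": 4.43, "yield_MPa": 880, "cost_kg": 18.0},
    {"name": "Steel 304",  "density": 8.00, "yield_MPa": 215, "cost_kg": 1.8},
]

def screen(candidates, max_density, min_yield):
    """Keep candidates meeting the density and strength requirements."""
    return [m for m in candidates
            if m["density"] <= max_density and m["yield_MPa"] >= min_yield]

for m in sorted(screen(materials, 5.0, 250), key=lambda m: m["cost_kg"]):
    print(m["name"], m["cost_kg"])   # Al 6061-T6 first, then Ti-6Al-4V
```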

Relevance: 80.00%

Abstract:

A survey of the existing state-of-the-art of turbine blade manufacture highlights two operations that have not been automated: the loading of a turbine blade into an encapsulation die, and the removal of a machined blade from the encapsulation block. The automation of blade decapsulation has not been pursued. In order to develop a system to automate the loading of an encapsulation die, a prototype mechanical handling robot has been designed, together with a computer-controlled encapsulation die. The robot has been designed as a mechanical handling robot of cylindrical geometry, suitable for use in a circular work cell, and is the prototype for a production model to be called 'The Cybermate'. The prototype robot is mechanically complete, but due to unforeseen circumstances the robot control system is not available (the development of the control system did not form part of this project), so it has not been possible to fully test and assess the robot's mechanical design; robot loading of the encapsulation die has therefore been simulated. The research work on the encapsulation die has focused on the development of computer-controlled, hydraulically actuated location pins. Such pins compensate for the inherent positional inaccuracy of the loading robot and reproduce the dexterity of the human operator. Each pin comprises a miniature hydraulic cylinder controlled by a standard bidirectional flow control valve. Precise positional control is obtained by pulsing the valves under software control, with positional feedback from an 8-bit transducer. A test-rig comprising one hydraulic location pin together with an opposing spring-loaded pin has demonstrated that such a pin arrangement can be controlled with a repeatability of ±0.00045'. In addition, this test-rig has demonstrated that such a pin arrangement can be used to gauge, and compensate for, the dimensional error of the component held between the pins, by offsetting the pin datum positions to allow for the component error. A gauging repeatability of ±0.00015' was demonstrated. This work has led to the design and manufacture of an encapsulation die comprising ten such pins and the associated computer software. All aspects of the control software except blade gauging and positional data storage have been demonstrated. Work is now required to achieve the accuracy of control demonstrated by the single-pin test-rig with each of the ten pins in the encapsulation die; this would allow trials of the complete loading cycle to take place.
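
The pulsed-valve positioning scheme lends itself to a simple software loop. A minimal simulated sketch (the thesis's control software is not reproduced here; the one-transducer-count-per-pulse response is an illustrative assumption):

```python
# Illustrative sketch of pulsed valve control with 8-bit feedback: pulse
# the bidirectional flow control valve in short bursts until the
# transducer reading is within tolerance of the target position.
class SimulatedPin:
    """Stands in for a hydraulic pin with an 8-bit transducer (0..255)."""
    def __init__(self):
        self.position = 0

    def read(self) -> int:
        return self.position

    def pulse(self, direction: int) -> None:
        # Assume each valve pulse nudges the pin by one transducer count.
        self.position = max(0, min(255, self.position + direction))

def position_pin(pin: SimulatedPin, target: int, tolerance: int = 1) -> None:
    """Bang-bang positioning: pulse towards the target until in tolerance."""
    while abs(target - pin.read()) > tolerance:
        pin.pulse(+1 if target > pin.read() else -1)

pin = SimulatedPin()
position_pin(pin, 200)
print(pin.read())   # within one count of 200
```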

Relevance: 80.00%

Abstract:

We consider a variation of the prototype combinatorial optimization problem known as graph colouring. Our optimization goal is to colour the vertices of a graph with a fixed number of colours in a way that maximizes the number of different colours present in the set of nearest neighbours of each given vertex. This problem, which we pictorially call palette-colouring, has recently been addressed as a basic example of a problem arising in the context of distributed data storage. Even though it has not been proved to be NP-complete, random search algorithms find the problem hard to solve. Heuristics based on a naive belief propagation algorithm have been observed to work quite well under certain conditions. In this paper, we build upon that result, working out the correct belief propagation algorithm, which needs to take into account the many-body nature of the constraints present in this problem. This method improves on the naive belief propagation approach at the cost of increased computational effort. We also investigate the emergence of a satisfiable-to-unsatisfiable 'phase transition' as a function of the vertex mean degree, for different ensembles of sparse random graphs in the large-size ('thermodynamic') limit.
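
The quantity being maximised is straightforward to state in code. A minimal sketch, assuming an adjacency-list representation of the graph (the toy example is ours, not from the paper):

```python
# Palette-colouring objective: for a candidate colouring, sum over the
# vertices the number of distinct colours present among each vertex's
# neighbours; the optimization goal is to maximize this score.
def palette_score(adj, colouring):
    """adj: dict vertex -> neighbour list; colouring: dict vertex -> colour."""
    return sum(len({colouring[u] for u in adj[v]}) for v in adj)

# Toy example: a 4-cycle coloured with two colours.
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
print(palette_score(adj, {0: "r", 1: "b", 2: "r", 3: "b"}))  # 4 (1 colour per neighbourhood)
print(palette_score(adj, {0: "r", 1: "b", 2: "b", 3: "r"}))  # 8 (2 colours per neighbourhood)
```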

Relevance: 80.00%

Abstract:

Mode-locked lasers emitting a train of femtosecond pulses, called dissipative solitons, are an enabling technology for metrology, high-resolution spectroscopy, fibre-optic communications, nano-optics and many other fields of science and applications. Recently, the vector nature of dissipative solitons has been exploited to demonstrate mode-locked lasing with both locked and rapidly evolving states of polarisation. Here, for an erbium-doped fibre laser mode-locked with carbon nanotubes, we present the first experimental and theoretical evidence of a new class of slowly evolving vector solitons characterized by a double-scroll chaotic polarisation attractor, substantially different from the Lorenz, Rössler and Ikeda strange attractors. The underlying physics comprises the coherent coupling of two polarisation modes on a long time scale. Apart from their fundamental interest, the observed phenomena provide a basis for advances in secure communications, trapping and manipulation of atoms and nanoparticles, control of magnetisation in data storage devices, and many other areas. © 2014 CIOMP. All rights reserved.
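
Polarisation attractors of this kind are conventionally visualised as trajectories of the normalised Stokes vector on the Poincaré sphere. For reference, the standard mapping from the two polarisation mode amplitudes (sign conventions for S3 vary between texts):

```python
import numpy as np

# Map two complex polarisation mode amplitudes (Ex, Ey) to the normalised
# Stokes vector; plotting (s1, s2, s3) over time on the Poincare sphere is
# how slowly evolving states of polarisation and their attractors are shown.
def stokes(Ex: complex, Ey: complex) -> np.ndarray:
    S0 = abs(Ex) ** 2 + abs(Ey) ** 2          # total power
    S1 = abs(Ex) ** 2 - abs(Ey) ** 2
    S2 = 2.0 * np.real(Ex * np.conj(Ey))
    S3 = 2.0 * np.imag(Ex * np.conj(Ey))      # sign convention varies
    return np.array([S1, S2, S3]) / S0

print(stokes(1.0, 1.0))   # 45-degree linear polarisation -> [0., 1., 0.]
```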

Relevance: 80.00%

Abstract:

We report on a new vector model of an erbium-doped fibre laser mode-locked with carbon nanotubes. This model goes beyond the limitations of the previously used models based on either coupled nonlinear Schrödinger or Ginzburg-Landau equations. Unlike the previous models, it accounts for the vector nature of the interaction between an optical field and an erbium-doped active medium, the slow relaxation dynamics of erbium ions, linear birefringence in the fibre, the linear and circular birefringence of the laser cavity caused by an in-cavity polarization controller, and light-induced anisotropy caused by an elliptically polarized pump field. The interplay of these factors changes the coherent coupling of the two polarization modes on a long time scale and so results in a new family of vector solitons (VSs) with fast and slowly evolving states of polarization. The observed VSs can be of interest for secure communications, trapping and manipulation of atoms and nanoparticles, control of magnetization in data storage devices, and many other areas.
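
For context, the coupled nonlinear Schrödinger description cited here as the earlier baseline is commonly written, in one standard form that neglects loss and group-velocity mismatch (textbook background, not the new model), as

$$
\frac{\partial A_x}{\partial z} + \frac{i\beta_2}{2}\frac{\partial^2 A_x}{\partial T^2}
= i\gamma\Big(|A_x|^2 + \tfrac{2}{3}|A_y|^2\Big)A_x
+ \frac{i\gamma}{3}\,A_x^{*}A_y^{2}\,e^{-2i\,\Delta\beta\,z},
$$

with the equation for $A_y$ obtained by interchanging $x$ and $y$ (and $\Delta\beta \to -\Delta\beta$). Here $A_{x,y}$ are the slowly varying amplitudes of the two polarisation modes, $\beta_2$ is the group-velocity dispersion, $\gamma$ the Kerr coefficient and $\Delta\beta$ the linear birefringence; the last term is the coherent coupling whose long-time-scale dynamics the new model augments with gain, erbium relaxation and pump-anisotropy effects.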

Relevance: 40.00%

Abstract:

Due to copyright restrictions, this item is only available for consultation at Aston University Library with prior arrangement.

Relevance: 40.00%

Abstract:

This paper introduces a new mathematical method for improving the discrimination power of data envelopment analysis and for completely ranking the efficient decision-making units (DMUs). Fuzzy-set concepts are utilised. For this purpose, all DMUs are first evaluated with the CCR model. Thereafter, the resulting weights for each output are considered as fuzzy sets and are then converted to fuzzy numbers. The introduced model is a multi-objective linear model whose endpoints are the highest and lowest of the weighted values. An added advantage of the model is its ability to handle the infeasibility situation sometimes faced by previously introduced models.
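
For reference, the CCR evaluation that produces those output weights is the standard input-oriented multiplier linear programme: for the DMU under evaluation, with inputs $x_{io}$ and outputs $y_{ro}$,

$$
\max_{u,v}\; \sum_{r} u_r y_{ro}
\quad\text{s.t.}\quad
\sum_{i} v_i x_{io} = 1,
\qquad
\sum_{r} u_r y_{rj} - \sum_{i} v_i x_{ij} \le 0 \;\;\forall j,
\qquad
u_r,\, v_i \ge 0,
$$

and it is the optimal output weights $u_r$ from these runs that are then treated as fuzzy sets.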

Relevance: 30.00%

Abstract:

Signal integration determines cell fate on the cellular level, affects cognitive processes and affective responses on the behavioural level, and is likely to be involved in the psychoneurobiological processes underlying mood disorders. Interactions between stimuli may be subject to time effects, and such time-dependencies typically lead to complex responses at both the cellular and the behavioural level. We show that both three-factor models and time series models can be used to uncover such time-dependencies. However, we argue that for short longitudinal data the three-factor modelling approach is more suitable. In order to illustrate both approaches, we re-analysed previously published short longitudinal data sets. We found that in human embryonic kidney 293 (HEK293) cells, the interaction effect in the regulation of extracellular signal-regulated kinase (ERK) 1 signalling activation by insulin and epidermal growth factor is subject to a time effect and dramatically decays at peak values of ERK activation. In contrast, we found that the interaction effect induced by hypoxia and tumour necrosis factor-alpha on the transcriptional activity of the human cyclo-oxygenase-2 promoter in HEK293 cells is time-invariant, at least in the first 12-h time window after stimulation. Furthermore, we applied the three-factor model to previously reported animal studies in which memory storage was found to be subject to an interaction effect of the beta-adrenoceptor agonist clenbuterol and certain antagonists acting on the alpha-1-adrenoceptor/glucocorticoid-receptor system. Our model-based analysis suggests that the interaction effect is relevant only if the antagonist drug is administered within a critical time window.
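
As an illustration of the three-factor modelling approach (the data, column names and effect sizes below are synthetic; the original studies' variables differ), a model containing all main effects and interactions, including the three-way stimulus-by-stimulus-by-time term, can be fitted as follows:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic three-factor example: two stimuli (A, B) and time; the
# A:B:time coefficient captures a time-dependent interaction between
# the stimuli, here built in as an interaction that decays with time.
rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "A": rng.integers(0, 2, n),      # stimulus A absent/present
    "B": rng.integers(0, 2, n),      # stimulus B absent/present
    "time": rng.uniform(0, 12, n),   # hours after stimulation
})
df["response"] = (df.A + df.B + 0.5 * df.A * df.B * (12 - df.time)
                  + rng.normal(0, 1, n))

fit = smf.ols("response ~ A * B * time", data=df).fit()
print(fit.params["A:B:time"])   # approx. -0.5: the interaction decays
```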

Relevance: 30.00%

Abstract:

Digital image processing is exploited in many diverse applications, but the size of digital images places excessive demands on current storage and transmission technology. Image data compression is required to permit further use of digital image processing. Conventional image compression techniques based on statistical analysis have reached a saturation level, so it is necessary to explore more radical methods. This thesis is concerned with novel methods, based on the use of fractals, for achieving significant compression of image data within reasonable processing time without introducing excessive distortion. Images are modelled as fractal data and this model is exploited directly by compression schemes. The validity of this approach is demonstrated by showing that the fractal complexity measure of fractal dimension is an excellent predictor of image compressibility. A method of fractal waveform coding is developed which has low computational demands and performs better than conventional waveform coding methods such as PCM and DPCM. Fractal techniques based on the use of space-filling curves are developed as a mechanism for hierarchical application of conventional techniques. Two particular applications are highlighted: the re-ordering of data during image scanning and the mapping of multi-dimensional data to one dimension. It is shown that there are many possible space-filling curves which may be used to scan images, and that selection of an optimum curve leads to significantly improved data compression. The multi-dimensional mapping property of space-filling curves is used to substantially speed up the lookup process in vector quantisation. Iterated function systems are compared with vector quantisers, and the computational complexity of iterated function system encoding is also reduced by using the efficient matching algorithms identified for vector quantisers.
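
As an example of the space-filling-curve machinery involved, the standard conversion from pixel coordinates to position along the Hilbert curve (the generic published algorithm, shown for illustration rather than the thesis's own implementation):

```python
# Map pixel (x, y) on an n-by-n grid (n a power of two) to its distance
# d along the Hilbert curve; scanning pixels in order of d visits
# spatially adjacent pixels consecutively, which is what improves the
# statistics seen by a downstream compressor.
def hilbert_xy_to_d(n: int, x: int, y: int) -> int:
    d = 0
    s = n // 2
    while s > 0:
        rx = 1 if (x & s) > 0 else 0
        ry = 1 if (y & s) > 0 else 0
        d += s * s * ((3 * rx) ^ ry)
        if ry == 0:                          # rotate the quadrant
            if rx == 1:
                x, y = n - 1 - x, n - 1 - y
            x, y = y, x
        s //= 2
    return d

# Hilbert scan order for a 4x4 image.
order = sorted(((x, y) for x in range(4) for y in range(4)),
               key=lambda p: hilbert_xy_to_d(4, *p))
print(order[:4])   # [(0, 0), (1, 0), (1, 1), (0, 1)]
```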

Relevance: 30.00%

Abstract:

The design and implementation of databases involve, firstly, the formulation of a conceptual data model by systematic analysis of the structure and information requirements of the organisation for which the system is being designed; secondly, the logical mapping of this conceptual model onto the data structure of the target database management system (DBMS); and thirdly, the physical mapping of this structured model into the storage structures of the target DBMS. The accuracy of both the logical and the physical mapping determines the performance of the resulting system. This thesis describes research which develops software tools to facilitate the implementation of databases. A conceptual model describing the information structure of a hospital is derived using the Entity-Relationship (E-R) approach, and this model forms the basis for mapping onto the logical model. Rules are derived for automatically mapping the conceptual model onto relational and CODASYL types of data structures. Further algorithms are developed for partly automating the implementation of these models on the INGRES, MIMER and VAX-11 DBMSs.
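
A toy sketch of one such mapping rule (entity and attribute names below are hypothetical, merely in the spirit of the hospital model): each entity becomes a relation, and a 1:N relationship is implemented by posting the key of the "1" side into the relation of the "N" side as a foreign key.

```python
# Toy E-R -> relational mapping: entities become relations; for each
# 1:N relationship, the key of the "1" side is posted into the "N"
# side's relation as a foreign key.
entities = {
    "ward":    {"key": "ward_no",    "attrs": ["name", "beds"]},
    "patient": {"key": "patient_no", "attrs": ["surname", "admitted"]},
}
one_to_many = [("ward", "patient")]   # one ward houses many patients

def relational_schema(entities, one_to_many):
    tables = {name: [e["key"]] + e["attrs"] for name, e in entities.items()}
    for one, many in one_to_many:
        tables[many].append(entities[one]["key"])   # posted foreign key
    return tables

for name, columns in relational_schema(entities, one_to_many).items():
    print(name, columns)
# ward ['ward_no', 'name', 'beds']
# patient ['patient_no', 'surname', 'admitted', 'ward_no']
```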

Relevance: 30.00%

Abstract:

This thesis presents the results of an investigation into the merits of analysing magnetoencephalographic (MEG) data in the context of dynamical systems theory. MEG encompasses both the methods for measuring the minute magnetic flux variations at the scalp that result from neuro-electric activity in the neocortex, and the techniques required to process and extract useful information from these measurements. As a result of its unique mode of action - directly measuring neuronal activity via the resulting magnetic field fluctuations - MEG possesses a number of useful qualities which could potentially make it a powerful addition to any brain researcher's arsenal. Unfortunately, MEG research has so far failed to fulfil its early promise, hindered in its progress by a variety of factors.

Conventionally, the analysis of MEG has been dominated by the search for activity in certain spectral bands - the so-called alpha, delta, beta, etc. bands that are commonly referred to in both academic and lay publications. Other efforts have centred upon generating optimal fits of "equivalent current dipoles" that best explain the observed field distribution. Many of these approaches carry the implicit assumption that the dynamics which give rise to the observed time series are linear, despite a variety of reasons to suspect that nonlinearity might be present in MEG recordings. By using methods that allow for nonlinear dynamics, the research described in this thesis avoids these restrictive linearity assumptions.

A crucial concept underpinning this project is the belief that MEG recordings are mere observations of the evolution of the true underlying state, which is unobservable and is assumed to reflect some abstract brain cognitive state. Further, we maintain that it is unreasonable to expect these processes to be adequately described in the traditional way, as a linear sum of a large number of frequency generators. One of the main objectives of this thesis is to demonstrate that much more effective and powerful analysis of MEG can be achieved if one assumes the presence of both linear and nonlinear characteristics from the outset. Our position is that the combined action of a relatively small number of these generators, coupled with external and dynamic noise sources, is more than sufficient to account for the complexity observed in MEG recordings.

Another problem that has plagued MEG researchers is the extremely low signal-to-noise ratios that are obtained. Because the magnetic flux variations resulting from actual cortical processes can be extremely minute, the measuring devices used in MEG are necessarily extremely sensitive. The unfortunate side-effect is that even commonplace phenomena such as the earth's geomagnetic field can easily swamp signals of interest. This problem is commonly addressed by averaging over a large number of recordings, but this has notable drawbacks: in particular, it is difficult to synchronise high-frequency activity which might be of interest, and these signals are often cancelled out by the averaging process. Other problems are the high cost and low portability of state-of-the-art multichannel machines, with the result that the use of MEG has hitherto been restricted to large institutions able to afford the procurement and maintenance of these machines.

In this project, we seek to address these issues by working almost exclusively with single-channel, unaveraged MEG data. We demonstrate the applicability of a variety of methods originating from the fields of signal processing, dynamical systems, information theory and neural networks to the analysis of MEG data. It is noteworthy that while modern signal processing tools such as independent component analysis, topographic maps and latent variable modelling have enjoyed extensive success in a variety of research areas, from financial time series modelling to the analysis of sunspot activity, their use in MEG analysis has thus far been extremely limited. It is hoped that this work will help to remedy this oversight.
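
For a flavour of the dynamical-systems treatment of a single unaveraged channel, the customary first step is delay-coordinate (Takens) embedding of the scalar recording. A minimal sketch, with a toy signal standing in for the recording and arbitrary illustrative choices of embedding dimension and lag:

```python
import numpy as np

# Delay-coordinate embedding: reconstruct a state-space trajectory from a
# single scalar time series using lagged copies of itself, the standard
# first step when treating a recording as an observation of an underlying
# dynamical system.
def delay_embed(x: np.ndarray, dim: int, lag: int) -> np.ndarray:
    """Rows are delay vectors [x(t), x(t+lag), ..., x(t+(dim-1)*lag)]."""
    n = len(x) - (dim - 1) * lag
    return np.column_stack([x[i * lag : i * lag + n] for i in range(dim)])

# Toy noisy oscillation standing in for a single-channel recording.
t = np.linspace(0, 20 * np.pi, 5000)
x = np.sin(t) + 0.1 * np.random.default_rng(1).normal(size=t.size)
print(delay_embed(x, dim=3, lag=25).shape)   # (4950, 3)
```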

Relevance: 30.00%

Abstract:

The relationship between the religious and political fields in the Orthodox Church is defined by the concept of symphonia, which dates back to the Byzantine Empire. The concept suggests that the religious and political authorities should work together, in symphonic agreement, towards achieving the material and spiritual welfare of the faithful. This article argues that an investigation of the theory of sign and symbol offers a better understanding of symphonia and, in particular, of its relationship with the nation-building process. From this perspective, by corroborating data from the European Values Survey from 1990 to 2000 with this theory, this article demonstrates that the enlargement of the European Union represents the most significant challenge to symphonia, shifting its national focus to a supranational level. © 2011 Taylor & Francis.