855 results for Automatic Data Processing.


Relevance: 80.00%

Abstract:

Buried heat sources can be investigated by examining thermal infrared images and comparing these with the results of theoretical models which predict the thermal anomaly a given heat source may generate. Key factors influencing surface temperature include the geometry and temperature of the heat source, the surface meteorological environment, and the thermal conductivity and anisotropy of the rock. In general, a geothermal heat flux of greater than 2% of solar insolation is required to produce a detectable thermal anomaly in a thermal infrared image. A heat source of, for example, 2-300 K greater than the average surface temperature must be at a depth shallower than 50 m for the anomaly to be detectable in a thermal infrared image under typical terrestrial conditions. Atmospheric factors are of critical importance: while the mean atmospheric temperature has little significance, convection is a dominant factor and can act to swamp the thermal signature entirely. Given a steady-state heat source that produces a detectable thermal anomaly, it is possible to loosely constrain the physical properties of the heat source and surrounding rock, using the surface thermal anomaly as a basis. The success of this technique is highly dependent on the degree to which the physical properties of the host rock are known; important parameters include the surface thermal properties and the thermal conductivity of the rock. Modelling of transient thermal situations was carried out to assess the effect of time-dependent thermal fluxes. One-dimensional finite element models can be readily and accurately applied to the investigation of diurnal heat flow, as with thermal inertia models. Diurnal thermal models of environments on Earth, the Moon and Mars were constructed using finite elements and found to be consistent with published measurements. The heat flow from an injection of hot lava into a near-surface lava tube was also considered. While this approach was useful for study and for long-term monitoring in inhospitable areas, it was found to have little hazard-warning utility, as the time taken for the thermal energy to propagate to the surface in dry rock (several months) is very long. The resolution of the thermal infrared imaging system is an important factor. Presently available satellite-based systems such as Landsat (120 m resolution) are inadequate for detailed study of geothermal anomalies. Airborne systems such as TIMS (variable resolution of 3-6 m) are much more useful for discriminating small buried heat sources. Planned improvements in the resolution of satellite-based systems will broaden the potential for application of the techniques developed in this thesis. It is important to note, however, that adequate spatial resolution is a necessary but not sufficient condition for successful application of these techniques.
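As a rough illustration of the one-dimensional diurnal modelling described above, the following sketch solves the 1D heat equation with an explicit finite-difference scheme (a simpler stand-in for the finite-element models used in the thesis), a sinusoidal surface temperature and a constant basal heat flux; every parameter value is assumed for illustration only.

```python
import numpy as np

# Minimal 1D explicit finite-difference sketch of diurnal heat flow in rock.
# All parameter values are illustrative, not taken from the thesis.
kappa = 1.0e-6        # thermal diffusivity of rock, m^2/s (assumed)
k     = 2.0           # thermal conductivity, W/(m K) (assumed)
depth, nz = 2.0, 101  # model a 2 m column with 101 nodes
dz = depth / (nz - 1)
dt = 0.4 * dz**2 / kappa          # explicit stability limit with a safety factor
day = 86400.0
T_mean, T_amp = 288.0, 10.0       # mean surface temperature and diurnal amplitude, K
q_geo = 0.03 * 1000.0             # buried-source heat flux, here 3% of ~1 kW/m^2 insolation

T = np.full(nz, T_mean)           # initial temperature profile
t = 0.0
while t < 5 * day:                # run a few diurnal cycles to approach a periodic state
    T[0] = T_mean + T_amp * np.sin(2 * np.pi * t / day)   # diurnal surface forcing
    T[-1] = T[-2] + q_geo * dz / k                        # constant heat flux at the base
    T[1:-1] += kappa * dt / dz**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    t += dt

print("near-surface temperature after 5 days: %.2f K" % T[1])
```

Varying q_geo relative to the roughly 1 kW/m^2 peak insolation used here gives a feel for the 2%-of-insolation detectability threshold quoted above.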

Relevance: 80.00%

Abstract:

This work considers the static calculation of a program's average-case time. The number of systems that currently tackle this research problem is quite small due to the difficulties inherent in average-case analysis. While each of these systems makes a pertinent contribution, and each is discussed individually in this work, only one of them forms the basis of this research: the MOQA system, which consists of the MOQA language and the MOQA static analysis tool. Its technique for statically determining average-case behaviour centres on maintaining strict control over both the data structure type and the labeling distribution. This research develops and evaluates the MOQA language implementation and adds to the functions already available in this language. Furthermore, the theory that backs MOQA is generalised, and the range of data structures for which the MOQA static analysis tool can determine average-case behaviour is increased. Some of the MOQA applications and extensions suggested in other works are also examined here: for example, the accuracy of classifying the MOQA language as reversible is investigated, along with the feasibility of incorporating duplicate labels into the MOQA theory. Finally, the analyses carried out during the course of this research reveal some of MOQA's strengths and weaknesses. This thesis aims to be pragmatic when evaluating the current MOQA theory, the advancements set forth in the following work, and the benefits of MOQA compared to similar systems. Succinctly, this work's significant expansion of the MOQA theory is accompanied by a realistic assessment of MOQA's accomplishments and a serious deliberation of the opportunities available to MOQA in the future.
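MOQA's own analysis is compositional and static, but the notion of average-case cost over a labeling distribution can be illustrated by brute force: the sketch below enumerates every labeling (permutation) of a small input, assumed uniformly likely, and averages the comparison count of a plain insertion sort. It is a generic illustration of average-case counting, not the MOQA method.

```python
from itertools import permutations
from math import factorial

def insertion_sort_comparisons(seq):
    """Count comparisons made by a plain insertion sort on one input."""
    a, comps = list(seq), 0
    for i in range(1, len(a)):
        j = i
        while j > 0:
            comps += 1
            if a[j - 1] > a[j]:
                a[j - 1], a[j] = a[j], a[j - 1]
                j -= 1
            else:
                break
    return comps

# Exact average-case comparison count over all n! equally likely labelings.
n = 6
total = sum(insertion_sort_comparisons(p) for p in permutations(range(n)))
print("average comparisons for n=%d: %.3f" % (n, total / factorial(n)))
```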

Relevance: 80.00%

Abstract:

This paper describes implementations of two mobile cloud applications, file synchronisation and intensive data processing, using the Context Aware Mobile Cloud Services middleware and the Cloud Personal Assistant (CPA). Both are part of the same mobile cloud project, which is actively developed and currently in its second version. We describe recent changes to the middleware, along with our experimental results for the two application models, and discuss challenges faced during the development of the middleware and their implications. The paper includes a performance analysis of the CPA support for the two applications with respect to existing solutions.

Relevance: 80.00%

Abstract:

Long reach passive optical networks (LR-PONs), which integrate fibre-to-the-home with metro networks, have been the subject of intensive research in recent years and are considered one of the most promising candidates for the next generation of optical access networks. Such systems ideally have reaches greater than 100 km and bit rates of at least 10 Gb/s per wavelength in both the downstream and upstream directions. Due to the limited equipment sharing that is possible in access networks, the laser transmitters in the terminal units, which are usually the most expensive components, must be as cheap as possible. However, the requirement for low cost is generally incompatible with the need for a transmitter chirp characteristic that is optimised for such long reaches at 10 Gb/s, and hence dispersion compensation is required. In this thesis electronic dispersion compensation (EDC) techniques are employed to increase the chromatic dispersion tolerance and to enhance the system performance at the expense of moderate additional implementation complexity. In order to use such EDC in LR-PON architectures, a number of challenges associated with the burst-mode nature of the upstream link need to be overcome. In particular, the EDC must be made adaptive from one burst to the next (burst-mode EDC, or BM-EDC) on time scales of the order of tens to hundreds of nanoseconds. Burst-mode operation of EDC has received little attention to date. The main objective of this thesis is to demonstrate the feasibility of such a concept and to identify the key BM-EDC design parameters required for application in a 10 Gb/s burst-mode link. This is achieved through a combination of simulations and transmission experiments utilising off-line data processing. The research shows that burst-to-burst adaptation can in principle be implemented efficiently, opening the possibility of low-overhead, adaptive EDC-enabled burst-mode systems.
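The abstract does not spell out the equaliser structure, but a common building block for adaptive electronic equalisation is a feed-forward equaliser whose taps are trained by the least-mean-squares (LMS) algorithm on a short preamble at the start of each burst. The sketch below shows that idea on a toy real-valued channel; the tap count, step size, preamble length and channel are all assumptions for illustration, not the BM-EDC design developed in the thesis.

```python
import numpy as np

def lms_ffe(rx, training, n_taps=9, mu=0.01):
    """Adapt a feed-forward equalizer on a short burst preamble using LMS.

    rx       : received (dispersed, noisy) samples, one per symbol
    training : known preamble symbols at the start of the burst
    Returns the adapted tap vector.
    """
    taps = np.zeros(n_taps)
    half = n_taps // 2
    taps[half] = 1.0                             # start from a pass-through response
    padded = np.concatenate([np.zeros(half), rx, np.zeros(half)])
    for k, d in enumerate(training):
        window = padded[k:k + n_taps][::-1]      # samples currently in the filter
        y = np.dot(taps, window)                 # equalizer output for this symbol
        taps += mu * (d - y) * window            # LMS update towards the known symbol
    return taps

# Illustrative burst: a 2-tap smearing channel (stand-in for residual dispersion) plus noise.
rng = np.random.default_rng(0)
symbols = rng.choice([-1.0, 1.0], size=400)
rx = np.convolve(symbols, [0.7, 0.5], mode="same") + 0.05 * rng.standard_normal(400)
taps = lms_ffe(rx, symbols[:100])                # adapt on a 100-symbol preamble
print("centre tap after adaptation: %.3f" % taps[len(taps) // 2])
```

In a burst-mode receiver the adaptation budget is the preamble length, which is why convergence within tens to hundreds of nanoseconds is the critical design constraint mentioned above.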

Relevance: 80.00%

Abstract:

In many important high-technology markets, including software development, data processing, communications, aeronautics, and defense, suppliers learn through experience how to provide better service at lower cost. This paper examines how a buyer designs dynamic competition among rival suppliers to exploit learning economies while minimizing the costs of becoming locked in to one producer. Strategies for controlling dynamic competition include the handicapping of more efficient suppliers in procurement competitions, the protection and allocation of intellectual property, and the sharing of information among rival suppliers. (JEL C73, D44, L10).

Relevance: 80.00%

Abstract:

This paper describes work carried out in the FIRE EXIT research project. FIRE EXIT aims to develop an Evacuation Simulator capable of addressing issues of mustering, ship motions, fire and abandonment. In achieving these aims, FIRE EXIT took as its starting point the state of the art in ship evacuation simulation (the maritimeEXODUS software), fire simulation (the SMARTFIRE software) and large-scale experimental facilities (the SHEBA facility), and then significantly enhanced these capabilities. A number of new technologies have been developed in achieving these objectives. The innovations include directly linking CFD fire simulation with the evacuation and abandonment software, and automatic data transfer from concept design software, allowing rapid generation of ship simulation models. Software usability was augmented by a module for interpretation of evacuation software output. Enhancements to a ship evacuation testing rig have resulted in a unique facility capable of providing passenger movement data for realistic evacuation scenarios, and large-scale tests have provided meaningful data for the evacuation simulation.

Relevance: 80.00%

Abstract:

Regular plankton sampling off Plymouth by the Marine Biological Association (MBA) has been carried out since the early 1900s. Much of the sample analysis and description of the results was carried out by Sir Frederick Russell and Professor Alan Southward (AJS), the latter having completed the organisation and transfer of the paper records to digital files. The current authors have transferred the main data files of AJS on zooplankton and fish larvae to the MBA long-term database (including various editing and checking against original analysis records and published data) and have added the data for 2002-2009. In this report the updated time-series are reviewed in the context of earlier work, particularly with respect to the Russell Cycle; it is not intended as an exhaustive analysis. Brief details of the sampling and comments on data processing are given in an appendix.

Relevance: 80.00%

Abstract:

Coastal zones and shelf seas are important for tourism, commercial fishing and aquaculture. As a result, the importance of good water quality within these regions to support life is recognised worldwide, and a number of international directives for monitoring them now exist. This paper describes the AlgaRisk water quality monitoring demonstration service that was developed and operated for the UK Environment Agency in response to the microbiological monitoring needs within the revised European Union Bathing Waters Directive. The AlgaRisk approach used satellite Earth observation to provide near-real-time monitoring of microbiological water quality, and a series of nested operational models (atmospheric and hydrodynamic-ecosystem) provided a forecast capability. For the period of the demonstration service (2008-2013), all monitoring and forecast datasets were processed in near-real time on a daily basis and disseminated through a dedicated web portal, with extracted data automatically emailed to agency staff. Near-real-time data processing was achieved using a series of supercomputers and an Open Grid approach. The novel web portal and Java-based viewer enabled users to visualise and interrogate current and historical data. The system description, the algorithms employed and example results focusing on a case study of a bloom of the harmful alga Karenia mikimotoi are presented. Recommendations and the potential exploitation of web services for future water quality monitoring services are discussed.

Relevance: 80.00%

Abstract:

There can be wide variation in the level of oral/aural language ability that prelingually hearing-impaired children develop after cochlear implantation. Automatic perceptual processing mechanisms have come under increasing scrutiny in attempts to explain this variation. Using mismatch negativity methods, this study explored associations between auditory sensory memory mechanisms and verbal working memory function in children with cochlear implants and a group of hearing controls of similar age. Whilst clear relationships were observed in the hearing children between mismatch activation and working memory measures, this association appeared to be disrupted in the implanted children. These findings would fit with the proposal that early auditory deprivation and a degraded auditory signal can cause changes in the processes underpinning the development of oral/aural language skills in prelingually hearing-impaired children with cochlear implants, and thus alter their developmental trajectory.

Relevance: 80.00%

Abstract:

Metallographic characterisation is combined with statistical analysis to study the microstructure of a BT16 titanium alloy after different heat treatment processes. It was found that the length, width and aspect ratio of α plates in this alloy follow the three-parameter Weibull distribution. Increasing annealing temperature or time causes the probability distribution of the length and the width of α plates to tend toward a normal distribution. The phase transformation temperature of the BT16 titanium alloy was found to be 875±5°C.
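For readers wishing to reproduce this kind of analysis, a three-parameter Weibull distribution (shape, location, scale) can be fitted by maximum likelihood with scipy.stats.weibull_min; the sketch below uses synthetic placeholder data rather than the BT16 measurements.

```python
from scipy import stats

# Minimal sketch: fit a three-parameter Weibull (shape, location, scale) to
# alpha-plate length measurements. The data below are synthetic placeholders,
# not measurements from the BT16 study.
lengths_um = stats.weibull_min.rvs(c=2.0, loc=1.0, scale=5.0, size=500, random_state=1)

shape, loc, scale = stats.weibull_min.fit(lengths_um)   # maximum-likelihood fit
print("shape=%.2f  location=%.2f um  scale=%.2f um" % (shape, loc, scale))

# A Kolmogorov-Smirnov test gives a rough check of the fitted distribution.
ks = stats.kstest(lengths_um, "weibull_min", args=(shape, loc, scale))
print("KS statistic=%.3f  p-value=%.3f" % (ks.statistic, ks.pvalue))
```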

Relevance: 80.00%

Abstract:

Mass spectrometry (MS)-based metabolomics is emerging as an important field of research in many scientific areas, including chemical safety of food. A particular strength of this approach is its potential to reveal some physiological effects induced by complex mixtures of chemicals present at trace concentrations. The limitations of other analytical approaches currently employed to detect low-dose and mixture effects of chemicals make detection very problematic. Besides this basic technical challenge, numerous analytical choices have to be made at each step of a metabolomics study, and each step can have a direct impact on the final results obtained and their interpretation (i.e. sample preparation, sample introduction, ionization, signal acquisition, data processing, and data analysis). As the application of metabolomics to chemical analysis of food is still in its infancy, no consensus has yet been reached on defining many of these important parameters. In this context, the aim of the present study is to review all these aspects of MS-based approaches to metabolomics, and to give a comprehensive, critical overview of the current state of the art, possible pitfalls, and future challenges and trends linked to this emerging field. (C) 2010 Elsevier Ltd. All rights reserved.

Relevance: 80.00%

Abstract:

Wireless sensor node platforms are very diversified and very constrained, particularly in power consumption. When choosing or sizing a platform for a given application, it is necessary to be able to evaluate, at an early design stage, the impact of those choices. Applied to the computing platform implemented on the sensor node, this requires a good understanding of the workload it must perform. Nevertheless, this workload is highly application-dependent: it depends on the data sampling frequency together with application-specific data processing and management. It is thus necessary to have a model that can represent the workload of applications with various needs and characteristics. In this paper, we propose a workload model for wireless sensor node computing platforms. This model is based on a synthetic application that models the different computational tasks that the computing platform will perform to process sensor data. It allows the workload of various applications to be modelled by tuning the data sampling rate and processing. A case study is performed by modelling different applications and showing how the model can be used for workload characterization. © 2011 IEEE.
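A minimal sketch of the idea, though not the paper's synthetic application itself, is to express node utilisation as the product of the sampling rate and the per-sample processing cost of a list of tasks; all task names and cycle counts below are assumed for illustration.

```python
# Minimal sketch of a sensor-node workload estimate driven by sampling rate and
# per-sample processing cost. The task list and all numbers are illustrative;
# they are not the synthetic application defined in the paper.

def node_load(sampling_hz, tasks, cpu_hz):
    """Return CPU utilisation for a set of per-sample processing tasks."""
    cycles_per_sample = sum(cycles for _, cycles in tasks)
    return sampling_hz * cycles_per_sample / cpu_hz

tasks = [
    ("read ADC and timestamp", 400),       # cycles per sample (assumed)
    ("filter / feature extraction", 2500),
    ("aggregate and buffer", 600),
]

for rate in (1, 10, 100):                  # samples per second
    u = node_load(rate, tasks, cpu_hz=8e6) # e.g. an 8 MHz microcontroller (assumed)
    print("%5d Hz sampling -> %.4f%% CPU utilisation" % (rate, 100 * u))
```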

Relevance: 80.00%

Abstract:

A reduction in the time required to locate and restore faults on a utility's distribution network improves the customer minutes lost (CML) measurement and hence brings direct cost savings to the operating company. The traditional approach to fault location involves fault impedance determination from high-volume waveform files dispatched across a communications channel to a central location for processing and analysis. This paper examines an alternative scheme where data processing is undertaken locally within a recording instrument, thus reducing the volume of data to be transmitted. Processed event fault reports may be emailed to relevant operational staff for the timely repair and restoration of the line.
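The abstract does not specify the local processing algorithm, but a simple illustration of impedance-based fault location computed inside the instrument is the reactance method: extract fundamental voltage and current phasors from one cycle of samples, form the apparent impedance, and divide its imaginary part by the per-kilometre line reactance. The sketch below uses assumed signals and line constants.

```python
import numpy as np

# Minimal sketch of an impedance-based fault-distance estimate computed locally
# from one cycle of sampled voltage and current. A single-phase reactance method
# is shown; the line constants and signals are illustrative only.

def fundamental_phasor(samples, fs, f0=50.0):
    """Single-bin DFT at the fundamental frequency (one-cycle window assumed)."""
    n = np.arange(len(samples))
    return 2.0 / len(samples) * np.sum(samples * np.exp(-2j * np.pi * f0 * n / fs))

fs, f0 = 4000.0, 50.0
t = np.arange(int(fs / f0)) / fs                     # exactly one cycle of samples
v = 8000.0 * np.cos(2 * np.pi * f0 * t + 0.05)       # faulted-phase voltage, V (assumed)
i = 950.0 * np.cos(2 * np.pi * f0 * t - 0.95)        # fault current, A (assumed)

Z = fundamental_phasor(v, fs) / fundamental_phasor(i, fs)   # apparent impedance
x_per_km = 0.35                                      # line reactance, ohm/km (assumed)
print("apparent impedance: %.2f + j%.2f ohm" % (Z.real, Z.imag))
print("estimated distance to fault: %.1f km" % (Z.imag / x_per_km))
```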

Relevance: 80.00%

Abstract:

Purpose: The purpose of this paper is to present an artificial neural network (ANN) model that predicts the condition level of earthmoving trucks using simple predictors; the model's performance is compared to the predictive accuracy of the statistical method of discriminant analysis (DA).

Design/methodology/approach: An ANN-based predictive model is developed. The condition level predictors selected are the capacity, age, kilometers travelled and maintenance level. The relevant data set was provided by two Greek construction companies and includes the characteristics of 126 earthmoving trucks.

Findings: Data processing identifies a particularly strong connection of kilometers travelled and maintenance level with the earthmoving trucks' condition level. Moreover, the validation process reveals that the predictive efficiency of the proposed ANN model is very high. Similar findings emerge from the application of DA to the same data set using the same predictors.

Originality/value: Sound prediction of earthmoving trucks' condition level reduces downtime and its adverse impact on earthmoving duration and cost, while also enhancing the effectiveness of maintenance and replacement policies. This research proves that a sound condition level prediction for earthmoving trucks is achievable through the utilization of easy-to-collect data, and provides a comparative evaluation of the results of two widely applied predictive methods.
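A hypothetical, minimal version of the comparison described above can be sketched with scikit-learn: a small multilayer perceptron and linear discriminant analysis trained on the four predictors (capacity, age, kilometres travelled, maintenance level). The data generated below are random placeholders, not the 126-truck data set, and the network architecture is an assumption.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

# Placeholder data: 126 trucks described by capacity, age, kilometres travelled
# and maintenance level. Ranges and the label rule are assumptions for illustration.
rng = np.random.default_rng(0)
n = 126
X = np.column_stack([
    rng.uniform(10, 40, n),        # capacity, tonnes (assumed range)
    rng.uniform(1, 15, n),         # age, years
    rng.uniform(2e4, 5e5, n),      # kilometres travelled
    rng.integers(0, 3, n),         # maintenance level (0=poor, 1=fair, 2=good)
])
# Placeholder condition level driven mainly by kilometres and maintenance.
y = (X[:, 2] / 2.5e5 - X[:, 3] + rng.normal(0, 0.3, n) > 0.5).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
scaler = StandardScaler().fit(X_tr)

ann = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
ann.fit(scaler.transform(X_tr), y_tr)
da = LinearDiscriminantAnalysis().fit(scaler.transform(X_tr), y_tr)

print("ANN accuracy: %.2f" % ann.score(scaler.transform(X_te), y_te))
print("DA accuracy:  %.2f" % da.score(scaler.transform(X_te), y_te))
```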