921 results for seismic data processing


Relevance:

80.00%

Publisher:

Abstract:

Two complementary wireless sensor nodes for building two-tiered heterogeneous networks are presented. A larger node, measuring 25 mm by 25 mm, acts as the backbone of the network and can handle complex data processing. A smaller, cheaper node, measuring 10 mm by 10 mm, performs simpler sensor-interfacing tasks. The 25mm node builds on previous work at the Tyndall National Institute that created a modular wireless sensor node. In this work, a new 25mm module is developed that operates in the 433/868 MHz frequency bands with a range of 3.8 km. The 10mm node is highly miniaturised while retaining a high level of modularity. It has been designed for very energy-efficient operation in applications with low duty cycles, with a sleep current of 3.3 μA. Both nodes use commercially available components and have low manufacturing costs to allow the construction of large networks. In addition, interface boards for communicating with the nodes have been developed for both the 25mm and 10mm nodes. These interface boards provide a USB connection and support recharging of a Li-ion battery from the USB power supply. This paper discusses the design goals, the design methods, and the resulting implementation.
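
The low-duty-cycle figures quoted above translate directly into battery life. Below is a minimal sketch of that arithmetic: the 3.3 μA sleep current comes from the abstract, while the active current, wake time, period and battery capacity are illustrative assumptions, not figures from the paper.

```python
# Average current draw of a duty-cycled sensor node: a minimal sketch.
# The 3.3 uA sleep current is taken from the abstract; the active current,
# wake duration, period and battery capacity below are assumptions.

SLEEP_CURRENT_UA = 3.3        # from the abstract
ACTIVE_CURRENT_UA = 20_000.0  # assumed: ~20 mA while sensing/transmitting
ACTIVE_TIME_S = 0.05          # assumed: 50 ms awake per cycle
PERIOD_S = 60.0               # assumed: one measurement per minute

duty_cycle = ACTIVE_TIME_S / PERIOD_S
avg_current_ua = (duty_cycle * ACTIVE_CURRENT_UA
                  + (1 - duty_cycle) * SLEEP_CURRENT_UA)

# A 1000 mAh Li-ion cell, ignoring self-discharge and regulator losses.
battery_mah = 1000.0
lifetime_h = battery_mah * 1000.0 / avg_current_ua

print(f"duty cycle:      {duty_cycle:.2%}")
print(f"average current: {avg_current_ua:.1f} uA")
print(f"battery life:    {lifetime_h / 24:.0f} days")
```

At such low duty cycles the sleep current contributes a sizeable share of the average draw, which is why a 3.3 μA figure matters.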

Relevance:

80.00%

Publisher:

Abstract:

The power consumption of wireless sensor network (WSN) modules is an important practical concern in building energy management (BEM) system deployments. A set of metrics is created to assess the power profiles of WSNs under real-world conditions. The aim of this work is to understand, and eventually eliminate, the uncertainties in WSN power consumption during long-term deployments, and to assess compatibility with existing and emerging energy-harvesting technologies. This paper investigates the key metrics of data processing, wireless data transmission, data sensing and duty-cycle parameters to understand the system power profile from a practical deployment perspective. Based on the proposed analysis, the impact of each metric on power consumption in a typical BEM application is presented and the corresponding low-power solutions are investigated.
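
As a rough illustration of the kind of power-profile metrics the paper analyses, the sketch below sums per-activity energy (E = V·I·t) over one measurement cycle. All currents, durations and the supply voltage are assumed values, not measurements from the paper.

```python
# Per-cycle energy budget for a WSN node, broken down by the activities the
# paper identifies (sensing, data processing, wireless transmission, sleep).
# All numbers below are illustrative assumptions.

activities = {
    # name: (current_mA, duration_s) per measurement cycle -- assumed values
    "sensing":      (2.0,   0.010),
    "processing":   (8.0,   0.005),
    "transmission": (25.0,  0.004),
    "sleep":        (0.005, 59.981),
}
VOLTAGE = 3.0  # assumed supply voltage

total_mj = 0.0
for name, (current_ma, duration_s) in activities.items():
    energy_mj = VOLTAGE * current_ma * duration_s  # E = V * I * t
    total_mj += energy_mj
    print(f"{name:12s} {energy_mj:8.4f} mJ")

print(f"{'total':12s} {total_mj:8.4f} mJ per cycle")
# Comparing total_mj against what a harvester can deliver per cycle indicates
# whether the node is compatible with energy-harvesting supply.
```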

Relevance:

80.00%

Publisher:

Abstract:

Body Sensor Network (BSN) technology is seeing a rapid emergence in application areas such as health, fitness and sports monitoring. Current BSN wireless sensors typically operate on a single frequency band (e.g. using the IEEE 802.15.4 standard, which operates at 2.45GHz), employing a single radio transceiver for wireless communications. This allows a simple wireless architecture to be realized with low cost and power consumption. However, network congestion or failure can create potential issues in terms of reliability of data transfer, quality of service (QoS) and data throughput for the sensor. These issues can be especially critical in healthcare monitoring applications, where data availability and integrity are crucial. The addition of more than one radio has the potential to address some of these issues. For example, multi-radio implementations can allow access to more than one network, providing increased coverage and data processing as well as improved interoperability between networks. A small number of multi-radio wireless sensor solutions exist at present, but they require more than one radio transceiver device to achieve multi-band operation. This paper presents the design of a novel prototype multi-radio hardware platform that uses a single radio transceiver. The proposed design allows multi-band operation in the 433/868MHz ISM bands, and this, together with its low complexity and small form factor, makes it suitable for a wide range of BSN applications.
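
One way a single transceiver can deliver multi-band operation is to retune between bands when the current link degrades. The sketch below simulates such a retune-on-failure policy; the per-band delivery probabilities, failure threshold and function names are hypothetical, and the paper's actual band selection is realized in hardware/firmware rather than in code like this.

```python
# Single-transceiver dual-band operation, sketched as a retune-on-failure
# policy: if too many consecutive frames on the current band go
# unacknowledged, retune to the other ISM band (433 <-> 868 MHz).

import random

BANDS_MHZ = (433, 868)
MAX_CONSECUTIVE_FAILURES = 3   # assumed threshold

# Assumed per-band frame delivery probabilities, standing in for real
# congestion/interference conditions.
delivery_prob = {433: 0.95, 868: 0.40}

def transmit(band_mhz: int) -> bool:
    """Simulated transmission: True if the frame was acknowledged."""
    return random.random() < delivery_prob[band_mhz]

band = 868
failures = 0
for frame in range(50):
    if transmit(band):
        failures = 0
    else:
        failures += 1
        if failures >= MAX_CONSECUTIVE_FAILURES:
            band = BANDS_MHZ[0] if band == BANDS_MHZ[1] else BANDS_MHZ[1]
            failures = 0
            print(f"frame {frame}: retuning single transceiver to {band} MHz")
```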

Relevance:

80.00%

Publisher:

Abstract:

Buried heat sources can be investigated by examining thermal infrared images and comparing these with the results of theoretical models which predict the thermal anomaly a given heat source may generate. Key factors influencing surface temperature include the geometry and temperature of the heat source, the surface meteorological environment, and the thermal conductivity and anisotropy of the rock. In general, a geothermal heat flux of greater than 2% of solar insolation is required to produce a detectable thermal anomaly in a thermal infrared image. Under typical terrestrial conditions, a heat source of, for example, 2-300K above the average surface temperature must lie at a depth shallower than 50m for the anomaly to be detected in a thermal infrared image. Atmospheric factors are of critical importance. While the mean atmospheric temperature has little significance, convection is a dominant factor and can swamp the thermal signature entirely. Given a steady-state heat source that produces a detectable thermal anomaly, it is possible to loosely constrain the physical properties of the heat source and surrounding rock, using the surface thermal anomaly as a basis. The success of this technique is highly dependent on the degree to which the physical properties of the host rock are known; important parameters include the surface thermal properties and thermal conductivity of the rock. Modelling of transient thermal situations was carried out to assess the effect of time-dependent thermal fluxes. One-dimensional finite element models can be readily and accurately applied to the investigation of diurnal heat flow, as with thermal inertia models. Diurnal thermal models of environments on Earth, the Moon and Mars were constructed using finite elements and found to be consistent with published measurements. The heat flow from an injection of hot lava into a near-surface lava tube was also considered. While this approach was useful for study and for long-term monitoring in inhospitable areas, it was found to have little hazard-warning utility, as the time taken for the thermal energy to propagate to the surface in dry rock (several months) is very long. The resolution of the thermal infrared imaging system is an important factor. Presently available satellite-based systems such as Landsat (resolution of 120m) are inadequate for detailed study of geothermal anomalies. Airborne systems, such as TIMS (variable resolution of 3-6m), are much more useful for discriminating small buried heat sources. Planned improvements in the resolution of satellite-based systems will broaden the potential for application of the techniques developed in this thesis. It is important to note, however, that adequate spatial resolution is a necessary but not sufficient condition for successful application of these techniques.
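
The diurnal modelling described above can be illustrated with a one-dimensional heat-conduction model. The sketch below uses an explicit finite-difference scheme rather than the finite elements used in the thesis, with generic dry-rock properties and forcing values as assumptions.

```python
# One-dimensional diurnal heat flow, sketched with an explicit
# finite-difference scheme; the material properties and surface forcing
# are generic assumptions, not the thesis's values.

import math

KAPPA = 1.0e-6          # thermal diffusivity, m^2/s (assumed, typical rock)
DEPTH = 1.0             # modelled column depth, m
NZ = 51                 # grid points
DZ = DEPTH / (NZ - 1)
DT = 0.4 * DZ**2 / KAPPA   # explicit stability: dt <= dz^2 / (2*kappa)
DAY = 86400.0

T_MEAN, T_AMP = 290.0, 10.0   # assumed mean surface temperature and swing, K
T = [T_MEAN] * NZ             # initial condition: uniform at the mean

t = 0.0
while t < 5 * DAY:            # run a few days to approach a periodic state
    T[0] = T_MEAN + T_AMP * math.sin(2 * math.pi * t / DAY)  # surface forcing
    T_new = T[:]
    for i in range(1, NZ - 1):
        T_new[i] = T[i] + KAPPA * DT / DZ**2 * (T[i+1] - 2*T[i] + T[i-1])
    T_new[-1] = T_new[-2]     # insulated lower boundary
    T = T_new
    t += DT

# The diurnal wave decays with depth on the scale sqrt(2*kappa*day/pi),
# which is why slow conduction from depth gives little hazard warning.
skin_depth = math.sqrt(2 * KAPPA * DAY / math.pi)
print(f"skin depth ~ {skin_depth:.2f} m")
for i in range(0, NZ, 10):
    print(f"z = {i*DZ:4.2f} m   T = {T[i]:7.2f} K")
```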

Relevance:

80.00%

Publisher:

Abstract:

This work considers the static calculation of a program's average-case time. The number of systems that currently tackle this research problem is quite small, due to the difficulties inherent in average-case analysis. While each of these systems makes a pertinent contribution, and each is discussed individually in this work, only one of them forms the basis of this research: the system known as MOQA. The MOQA system consists of the MOQA language and the MOQA static analysis tool. Its technique for statically determining average-case behaviour centres on maintaining strict control over both the data structure type and the labelling distribution. This research develops and evaluates the MOQA language implementation and adds to the functions already available in this language. Furthermore, the theory behind MOQA is generalised, and the range of data structures for which the MOQA static analysis tool can determine average-case behaviour is increased. Some of the MOQA applications and extensions suggested in other works are also examined here; for example, the accuracy of classifying the MOQA language as reversible is investigated, along with the feasibility of incorporating duplicate labels into the MOQA theory. Finally, the analyses carried out during the course of this research reveal some of MOQA's strengths and weaknesses. This thesis aims to be pragmatic in evaluating the current MOQA theory, the advancements set forth in the following work, and the benefits of MOQA compared to similar systems. Succinctly, this work's significant expansion of the MOQA theory is accompanied by a realistic assessment of MOQA's accomplishments and a serious deliberation of the opportunities available to MOQA in the future.
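
MOQA derives average-case behaviour statically, but the quantity it computes can be illustrated empirically. The sketch below, which is not MOQA code, measures the average number of probes a linear search makes over uniformly random inputs and compares it with the exact analytic value (n + 1)/2.

```python
# Average-case cost measured empirically -- the quantity a static
# average-case analysis like MOQA's derives without running the program.
# Linear search is used because its average over a uniform input
# distribution is known exactly: (n + 1) / 2 probes.

import random

def probes_to_find(xs, target):
    """Number of comparisons a linear search makes before finding target."""
    for i, x in enumerate(xs):
        if x == target:
            return i + 1
    return len(xs)

n, trials = 100, 200_000
xs = list(range(n))
total = sum(probes_to_find(xs, random.randrange(n)) for _ in range(trials))

print(f"empirical average: {total / trials:.2f} probes")
print(f"analytic average:  {(n + 1) / 2:.2f} probes   (worst case: {n})")
```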

Relevance:

80.00%

Publisher:

Abstract:

This paper describes implementations of two mobile cloud applications, file synchronisation and intensive data processing, using the Context Aware Mobile Cloud Services middleware and the Cloud Personal Assistant (CPA). Both are part of the same mobile cloud project, which is actively developed and currently at its second version. We describe recent changes to the middleware, along with our experimental results for the two application models. We discuss challenges faced during the development of the middleware and their implications. The paper includes a performance analysis of the CPA's support for the two applications with respect to existing solutions.
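
As a concrete illustration of the file-synchronisation application model, the sketch below compares chunk hashes to decide which parts of a file need uploading. It is not the CAMCS or CPA API; the chunk size and function names are assumptions.

```python
# Chunk-hash file synchronisation, a minimal sketch of one of the two
# application models (file sync). Not the actual middleware interface.

import hashlib

CHUNK_SIZE = 64 * 1024  # assumed chunk size

def chunk_digests(data: bytes) -> list[str]:
    """Split data into fixed-size chunks and hash each one."""
    return [hashlib.sha256(data[i:i + CHUNK_SIZE]).hexdigest()
            for i in range(0, len(data), CHUNK_SIZE)]

def chunks_to_upload(local: bytes, remote_digests: list[str]) -> list[int]:
    """Indices of local chunks that differ from, or extend past, the remote copy."""
    local_digests = chunk_digests(local)
    return [i for i, d in enumerate(local_digests)
            if i >= len(remote_digests) or d != remote_digests[i]]

local_file = b"a" * 200_000
remote_file = b"a" * 130_000 + b"b" * 70_000
print(chunks_to_upload(local_file, chunk_digests(remote_file)))  # [1, 2, 3]
```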

Relevance:

80.00%

Publisher:

Abstract:

Long reach passive optical networks (LR-PONs), which integrate fibre-to-the-home with metro networks, have been the subject of intensive research in recent years and are considered one of the most promising candidates for the next generation of optical access networks. Such systems ideally have reaches greater than 100km and bit rates of at least 10Gb/s per wavelength in the downstream and upstream directions. Due to the limited equipment sharing that is possible in access networks, the laser transmitters in the terminal units, which are usually the most expensive components, must be as cheap as possible. However, the requirement for low cost is generally incompatible with the need for a transmitter chirp characteristic that is optimised for such long reaches at 10Gb/s, and hence dispersion compensation is required. In this thesis electronic dispersion compensation (EDC) techniques are employed to increase the chromatic dispersion tolerance and to enhance the system performance at the expense of moderate additional implementation complexity. In order to use such EDC in LR-PON architectures, a number of challenges associated with the burst-mode nature of the upstream link need to be overcome. In particular, the EDC must be made adaptive from one burst to the next (burst-mode EDC, or BM-EDC) in time scales on the order of tens to hundreds of nanoseconds. Burst-mode operation of EDC has received little attention to date. The main objective of this thesis is to demonstrate the feasibility of such a concept and to identify the key BM-EDC design parameters required for applications in a 10Gb/s burst-mode link. This is achieved through a combination of simulations and transmission experiments utilising off-line data processing. The research shows that burst-to-burst adaptation can in principle be implemented efficiently, opening the possibility of low overhead, adaptive EDC-enabled burst-mode systems.
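
Adaptive EDC of the kind described can be sketched with a feed-forward equaliser whose taps are trained by LMS over a known preamble at the start of each burst. The code below is a generic stand-in, not the thesis's equaliser: the toy channel, tap count, step size and preamble length are all illustrative assumptions.

```python
# Burst-adapted feed-forward equalisation: LMS-trained FIR taps converge on
# a known preamble, then payload symbols are decided with the frozen taps.
# A generic illustration of the BM-EDC concept, not the thesis's design.

import numpy as np

rng = np.random.default_rng(0)
N_TAPS, MU = 7, 0.02            # assumed equaliser length and LMS step size

# A toy dispersive channel: inter-symbol interference plus noise.
channel = np.array([0.15, 1.0, 0.35])
symbols = rng.choice([-1.0, 1.0], size=2000)     # one "burst" of 2-PAM data
received = np.convolve(symbols, channel, mode="same")
received += 0.05 * rng.standard_normal(received.size)

taps = np.zeros(N_TAPS)
taps[N_TAPS // 2] = 1.0          # centre-spike initialisation
PREAMBLE = 300                   # assumed training length at the burst start

errors = 0
for n in range(N_TAPS, received.size):
    window = received[n - N_TAPS:n][::-1]
    y = taps @ window
    desired = symbols[n - 1 - N_TAPS // 2]       # align for the filter delay
    if n < PREAMBLE:
        taps += MU * (desired - y) * window      # LMS update on the preamble
    elif np.sign(y) != desired:
        errors += 1                              # count payload decision errors

print(f"payload symbol errors after burst adaptation: {errors}")
```

In a real burst-mode receiver this adaptation must complete within tens to hundreds of nanoseconds, which is the key design constraint the thesis investigates.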

Relevance:

80.00%

Publisher:

Abstract:

In many important high-technology markets, including software development, data processing, communications, aeronautics, and defense, suppliers learn through experience how to provide better service at lower cost. This paper examines how a buyer designs dynamic competition among rival suppliers to exploit learning economies while minimizing the costs of becoming locked in to one producer. Strategies for controlling dynamic competition include the handicapping of more efficient suppliers in procurement competitions, the protection and allocation of intellectual property, and the sharing of information among rival suppliers. (JEL C73, D44, L10).
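
The tension between exploiting learning economies and avoiding lock-in can be made concrete with a standard learning curve, c(Q) = c0·Q^(−b). The sketch below is a stylised numerical illustration, not the paper's formal model; the cost parameters and handicap value are assumptions.

```python
# Learning-by-doing and lock-in, a stylised numerical sketch. Unit cost
# follows a learning curve c(Q) = c0 * Q**(-b); the buyer may handicap the
# more experienced supplier's bid in each procurement round.

C0, B = 100.0, 0.3      # assumed initial unit cost and learning elasticity

def unit_cost(cumulative_units: float) -> float:
    return C0 * max(cumulative_units, 1.0) ** (-B)

def run_procurement(handicap: float, rounds: int = 20) -> list[float]:
    experience = [1.0, 1.0]          # cumulative output of suppliers A and B
    for _ in range(rounds):
        bids = [unit_cost(q) for q in experience]
        leader = experience.index(max(experience))
        bids[leader] += handicap     # penalise the front-runner's bid
        winner = bids.index(min(bids))
        experience[winner] += 1.0    # the winner accumulates learning
    return [unit_cost(q) for q in experience]

print("no handicap  :", [round(c, 1) for c in run_procurement(0.0)])
print("handicap 25.0:", [round(c, 1) for c in run_procurement(25.0)])
# Without a handicap one supplier wins every round and its rival never
# learns; a large enough handicap splits the awards so both cost curves
# fall, at the price of forgoing some of the leader's learning economies.
```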

Relevance:

80.00%

Publisher:

Abstract:

Segmentation of anatomical and pathological structures in ophthalmic images is crucial for the diagnosis and study of ocular diseases. However, manual segmentation is often a time-consuming and subjective process. This paper presents an automatic approach for segmenting retinal layers in Spectral Domain Optical Coherence Tomography images using graph theory and dynamic programming. Results show that the method segments eight retinal layer boundaries in normal adult eyes, matching one expert grader more closely than a second expert grader does.
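
The core of such graph/dynamic-programming segmentation is finding a minimum-cost path across the image, with costs derived from intensity gradients. The sketch below implements a generic column-wise DP on a toy image; it illustrates the idea, not the paper's exact graph construction or weighting.

```python
# Boundary extraction by dynamic programming: find the minimum-cost
# left-to-right path through a cost image, where low cost marks strong
# vertical intensity transitions (a layer boundary).

import numpy as np

def min_cost_path(cost: np.ndarray) -> np.ndarray:
    """Row index of the cheapest 8-connected path through each column."""
    rows, cols = cost.shape
    acc = cost.copy()                       # accumulated path cost
    step = np.zeros((rows, cols), dtype=int)
    for c in range(1, cols):
        for r in range(rows):
            lo, hi = max(r - 1, 0), min(r + 2, rows)
            prev = acc[lo:hi, c - 1]        # reachable cells one column back
            k = int(np.argmin(prev))
            acc[r, c] += prev[k]
            step[r, c] = lo + k
    path = np.empty(cols, dtype=int)
    path[-1] = int(np.argmin(acc[:, -1]))
    for c in range(cols - 1, 0, -1):        # backtrack the cheapest path
        path[c - 1] = step[path[c], c]
    return path

# A toy "B-scan": a bright band whose lower edge we want to trace.
img = np.zeros((60, 80))
img[20:32, :] = 1.0
grad = np.diff(img, axis=0, prepend=img[:1])  # vertical gradient
boundary = min_cost_path(grad)                # most negative gradient = edge
print(boundary[:10])                          # expected: row 32 in every column
```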

Relevance:

80.00%

Publisher:

Abstract:

Regular plankton sampling off Plymouth by the Marine Biological Association (MBA) has been carried out since the early 1900s. Much of the sample analysis and description of the results was carried out by Sir Frederick Russell and Professor Alan Southward (AJS), the latter having completed the organisation and transfer of the paper records to digital files. The current authors have transferred the main data files of AJS on zooplankton and fish larvae to the MBA long-term database (including various edits and checks against original analysis records and published data) and have added the data for 2002-2009. In this report the updated time-series are reviewed in the context of earlier work, particularly with respect to the Russell Cycle. It is not intended as an exhaustive analysis. Brief details of the sampling and comments on data processing are given in an appendix.

Relevance:

80.00%

Publisher:

Abstract:

Coastal zones and shelf-seas are important for tourism, commercial fishing and aquaculture. As a result, the importance of good water quality within these regions to support life is recognised worldwide, and a number of international directives for monitoring them now exist. This paper describes the AlgaRisk water quality monitoring demonstration service that was developed and operated for the UK Environment Agency in response to the microbiological monitoring needs within the revised European Union Bathing Waters Directive. The AlgaRisk approach used satellite Earth observation to provide near-real-time monitoring of microbiological water quality, while a series of nested operational models (atmospheric and hydrodynamic-ecosystem) provided a forecast capability. For the period of the demonstration service (2008–2013), all monitoring and forecast datasets were processed in near-real time on a daily basis and disseminated through a dedicated web portal, with extracted data automatically emailed to agency staff. Near-real-time data processing was achieved using a series of supercomputers and an Open Grid approach. The novel web portal and Java-based viewer enabled users to visualise and interrogate current and historical data. The system description, the algorithms employed and example results focussing on a case study of an incidence of the harmful algal bloom Karenia mikimotoi are presented. Recommendations and the potential exploitation of web services for future water quality monitoring services are discussed.
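
The daily cycle described above (fetch, process, publish, email) can be reduced to a skeleton like the sketch below. Every function in it is a hypothetical stub, including the placeholder product value; the real service ran its processing chain on supercomputers through an Open Grid setup.

```python
# Daily near-real-time monitoring cycle, reduced to a skeleton of stubs.
# Names, recipients and the product value are hypothetical placeholders.

import datetime as dt

def fetch_satellite_scenes(day): return [f"scene-{day}-001"]
def run_processing_chain(scenes): return {"chlorophyll_max": 4.2}  # stub value
def publish_to_portal(day, products): print(f"[portal] {day}: {products}")
def email_extracts(day, products, recipients):
    for r in recipients:
        print(f"[email] {r}: {day} extract {products}")

def daily_cycle(day=None, recipients=("duty.officer@example.org",)):
    day = day or dt.date.today().isoformat()
    scenes = fetch_satellite_scenes(day)        # satellite Earth observation
    products = run_processing_chain(scenes)     # monitoring + forecast models
    publish_to_portal(day, products)            # dedicated web portal
    email_extracts(day, products, recipients)   # automatic emails to staff

daily_cycle("2012-07-14")
```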

Relevance:

80.00%

Publisher:

Abstract:

Metallographic characterisation is combined with statistical analysis to study the microstructure of a BT16 titanium alloy after different heat treatment processes. It was found that the length, width and aspect ratio of α plates in this alloy follow the three-parameter Weibull distribution. Increasing annealing temperature or time causes the probability distribution of the length and the width of α plates to tend toward a normal distribution. The phase transformation temperature of the BT16 titanium alloy was found to be 875±5°C.
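
Fitting a three-parameter Weibull distribution as described can be done with scipy's `weibull_min`, whose `loc` argument supplies the third (location) parameter. The sketch below fits synthetic plate lengths, since the study's measurements are not reproduced here; the generating parameters are arbitrary assumptions.

```python
# Three-parameter Weibull fit (shape, location, scale) to plate-length data,
# on synthetic measurements standing in for the study's alpha-plate sizes.

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic alpha-plate lengths in micrometres (assumed, for illustration).
lengths_um = stats.weibull_min.rvs(2.2, loc=1.5, scale=6.0,
                                   size=500, random_state=rng)

shape, loc, scale = stats.weibull_min.fit(lengths_um)
print(f"shape={shape:.2f}  location={loc:.2f} um  scale={scale:.2f} um")

# Goodness of fit via Kolmogorov-Smirnov against the fitted distribution.
ks = stats.kstest(lengths_um, "weibull_min", args=(shape, loc, scale))
print(f"KS statistic = {ks.statistic:.3f}, p-value = {ks.pvalue:.3f}")
```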

Relevance:

80.00%

Publisher:

Abstract:

Mass spectrometry (MS)-based metabolomics is emerging as an important field of research in many scientific areas, including the chemical safety of food. A particular strength of this approach is its potential to reveal physiological effects induced by complex mixtures of chemicals present at trace concentrations. The limitations of other analytical approaches currently employed to detect low-dose and mixture effects of chemicals make such detection very problematic. Besides this basic technical challenge, numerous analytical choices have to be made at each step of a metabolomics study, and each step can have a direct impact on the final results obtained and their interpretation (i.e. sample preparation, sample introduction, ionization, signal acquisition, data processing, and data analysis). As the application of metabolomics to the chemical analysis of food is still in its infancy, no consensus has yet been reached on defining many of these important parameters. In this context, the aim of the present study is to review all these aspects of MS-based approaches to metabolomics, and to give a comprehensive, critical overview of the current state of the art, possible pitfalls, and future challenges and trends linked to this emerging field.
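
As one small example of the data-processing choices the review highlights, the sketch below matches features across two samples by m/z and retention-time tolerances; the tolerance values and peak lists are illustrative assumptions, and real pipelines add alignment and intensity normalisation on top.

```python
# Feature matching across samples by m/z and retention-time tolerance, a
# minimal sketch of one routine MS data-processing step. The choice of
# tolerances is exactly the kind of parameter that shapes final results.

MZ_TOL = 0.01      # assumed m/z tolerance (Da)
RT_TOL = 0.10      # assumed retention-time tolerance (min)

sample_a = [(181.071, 5.21), (203.053, 7.84), (365.132, 9.10)]
sample_b = [(181.072, 5.18), (203.099, 7.86), (365.131, 9.55)]

def match_features(a, b):
    """Pair peaks whose m/z and RT both fall within tolerance."""
    pairs = []
    for mz_a, rt_a in a:
        for mz_b, rt_b in b:
            if abs(mz_a - mz_b) <= MZ_TOL and abs(rt_a - rt_b) <= RT_TOL:
                pairs.append(((mz_a, rt_a), (mz_b, rt_b)))
    return pairs

for pa, pb in match_features(sample_a, sample_b):
    print(f"matched {pa} <-> {pb}")
# Expected: only the first feature matches; the second fails on m/z,
# the third on retention time.
```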

Relevance:

80.00%

Publisher:

Abstract:

Cores from slopes east of the Great Barrier Reef (GBR) challenge traditional models for sedimentation on tropical mixed siliciclastic-carbonate margins. However, satisfactory explanations of sediment accumulation on this archetypal margin that include both hemipelagic and turbidite sedimentation remain elusive, as submarine canyons, and their role in delivering coarse-grained turbidite deposits, are poorly understood. To address this problem, we investigated the shelf and canyon system bordering the northern Ribbon Reefs and reconstructed the history of turbidite deposition since the Late Pleistocene. High-resolution bathymetric and seismic data show a large paleo-channel system that crosses the shelf before connecting with the canyons via the inter-reef passages between the Ribbon Reefs. High-resolution bathymetry of the canyon axis reveals a complex and active system of channels, sand waves and local submarine landslides. Multi-proxy examination of three cores from down the axis of the canyon system reveals 18 turbidites and debrites, interlayered with hemipelagic muds, that are derived from a mix of shallow and deep sources. Twenty radiocarbon ages indicate that siliciclastic-dominated and mixed turbidites only occur prior to 31 ka, during Marine Isotope Stage (MIS) 3, while carbonate-dominated turbidites are well established by 11 ka in MIS 1 and continue until as recently as 1.2 ka. The apparent lack of siliciclastic-dominated turbidites, and the presence of only a few carbonate-dominated turbidites, during the MIS 2 lowstand are not consistent with generic models of margin sedimentation, but might also reflect a gap in the turbidite record. These data suggest that turbidite sedimentation in the Ribbon Reef canyons probably reflects the complex relationship between the prolonged period (>25 ka) of MIS 3 millennial sea-level changes and local factors such as the shelf, inter-reef passage depth, canyon morphology and different sediment sources. On this basis we predict that the spatial and temporal patterns of turbidite sedimentation could vary considerably along the length of the GBR margin.