77 results for Church architecture - Data processing - Spain

at Queensland University of Technology - ePrints Archive


Abstract:

Real-Time Kinematic (RTK) positioning is a technique used to provide precise positioning services at the centimetre accuracy level in the context of Global Navigation Satellite Systems (GNSS). While a Network-based RTK (NRTK) system involves multiple continuously operating reference stations (CORS), the simplest form of an NRTK system is a single-base RTK. In Australia there are several NRTK services operating in different states and over 1000 single-base RTK systems supporting precise positioning applications for surveying, mining, agriculture, and civil construction in regional areas. Additionally, future-generation GNSS constellations, including modernised GPS, Galileo, GLONASS, and Compass, with multiple frequencies, have either been developed or will become fully operational in the next decade. A trend in the future development of RTK systems is to make use of the various isolated network and single-base RTK systems, together with multiple GNSS constellations, for extended service coverage and improved performance. Several computational challenges have been identified for future NRTK services, including:
• multiple GNSS constellations and multiple frequencies;
• large-scale, wide-area NRTK services with a network of networks;
• complex computation algorithms and processes;
• a greater part of the positioning process shifting from the user end to the network centre, with the ability to cope with hundreds of simultaneous user requests (reverse RTK).
Based on these four challenges, two major requirements arise for future NRTK data processing: expandable computing power and scalable data sharing and transfer capability. This research explores new approaches to addressing these future NRTK challenges and requirements using a Grid Computing facility, in particular for large data processing burdens and complex computation algorithms. A Grid Computing based NRTK framework is proposed, structured as three layers: 1) a client layer in the form of a Grid portal; 2) a service layer; and 3) an execution layer. A user's request is passed through these layers and scheduled to different Grid nodes in the network infrastructure. A proof-of-concept demonstration of the proposed framework is performed in a five-node Grid environment at QUT and on Grid Australia. The Networked Transport of RTCM via Internet Protocol (Ntrip) open-source software is adopted to download real-time RTCM data from multiple reference stations over the Internet, followed by job scheduling and simplified RTK computing. The system performance has been analysed, and the results preliminarily demonstrate the concepts and functionality of the new Grid Computing based NRTK framework, while some aspects of system performance remain to be improved in future work.
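As a minimal sketch of the data-acquisition step named above, the snippet below shows how a client might pull a real-time RTCM stream from an NTRIP caster and hand fixed-size chunks to a placeholder scheduler. The caster host, port, mountpoint, credentials and the scheduling hook are assumptions for illustration only, not the QUT or Grid Australia configuration.

```python
# NTRIP v1-style client sketch: stream raw RTCM bytes from a caster and hand
# chunks to a (placeholder) job scheduler. Host, mountpoint and credentials
# below are hypothetical.
import base64
import socket

CASTER_HOST = "caster.example.org"   # placeholder caster
CASTER_PORT = 2101                   # conventional NTRIP port
MOUNTPOINT = "MOUNT1"                # placeholder mountpoint
USER, PASSWORD = "user", "password"  # placeholder credentials


def ntrip_stream(chunk_size=1024):
    """Yield raw RTCM chunks from the caster."""
    auth = base64.b64encode(f"{USER}:{PASSWORD}".encode()).decode()
    request = (
        f"GET /{MOUNTPOINT} HTTP/1.0\r\n"
        "User-Agent: NTRIP SimpleClient/1.0\r\n"
        f"Authorization: Basic {auth}\r\n"
        "\r\n"
    )
    with socket.create_connection((CASTER_HOST, CASTER_PORT)) as sock:
        sock.sendall(request.encode())
        # Skip the response header line(s), then stream the RTCM payload.
        buffer = b""
        while b"\r\n\r\n" not in buffer:
            buffer += sock.recv(256)
        payload = buffer.split(b"\r\n\r\n", 1)[1]
        if payload:
            yield payload
        while True:
            chunk = sock.recv(chunk_size)
            if not chunk:
                break
            yield chunk


def schedule_to_grid_node(rtcm_chunk):
    """Placeholder for the service/execution layers' job scheduling."""
    print(f"dispatching {len(rtcm_chunk)} bytes of RTCM data")


if __name__ == "__main__":
    for chunk in ntrip_stream():
        schedule_to_grid_node(chunk)
```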

Abstract:

Serving as a powerful tool for extracting localized variations in non-stationary signals, wavelet transforms (WTs) have been introduced into traffic engineering applications; however, these applications lack some important theoretical fundamentals. In particular, there is little guidance on selecting an appropriate WT across potential transport applications. The research described in this paper contributes to the literature by first describing a numerical experiment that demonstrates the shortcomings of commonly used data processing techniques in traffic engineering (i.e., averaging, moving averaging, second-order difference, oblique cumulative curve, and short-time Fourier transform). It then describes mathematically the WT's ability to detect singularities in traffic data. Next, the selection of a suitable WT for a particular research topic in traffic engineering is discussed in detail by objectively and quantitatively comparing candidate wavelets' performance in a numerical experiment. Finally, based on several case studies using both loop detector data and vehicle trajectories, it is shown that selecting a suitable wavelet largely depends on the specific research topic, and that the Mexican hat wavelet generally gives satisfactory performance in detecting singularities in traffic and vehicular data.
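As an illustrative sketch only (not the paper's experiment), the snippet below applies a continuous wavelet transform with the Mexican hat wavelet to a synthetic speed series containing an abrupt drop, showing how large coefficient magnitudes concentrate around the singularity. It assumes the PyWavelets package.

```python
# Illustrative sketch: highlight an abrupt change ("singularity") in a
# synthetic traffic speed series using the Mexican hat ('mexh') wavelet.
import numpy as np
import pywt

# Synthetic speed profile: free flow at 100 km/h, abrupt drop to 40 km/h
# at t = 300 s, plus measurement noise.
t = np.arange(600)
speed = np.where(t < 300, 100.0, 40.0) + np.random.normal(0, 2.0, t.size)

# Continuous wavelet transform over a range of scales.
scales = np.arange(1, 33)
coeffs, _ = pywt.cwt(speed, scales, "mexh")   # shape: (len(scales), len(t))

# Summing coefficient magnitudes across scales peaks near the breakpoint.
energy = np.sum(np.abs(coeffs), axis=0)
print("estimated breakpoint index:", int(np.argmax(energy)))  # near 300
```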

Abstract:

Monitoring gases for environmental, industrial and agricultural applications is a demanding task that requires long periods of observation, large numbers of sensors, data management, high temporal and spatial resolution, long-term stability, recalibration procedures, computational resources, and energy availability. Wireless Sensor Networks (WSNs) and Unmanned Aerial Vehicles (UAVs) currently represent the best alternative for monitoring large, remote, and difficult-to-access areas, as these technologies can carry specialised gas sensing systems and provide geo-located, time-stamped samples. However, they are not yet fully functional for scientific and commercial applications, as their development and availability are limited by a number of factors: the cost of the sensors required to cover large areas, the sensors' stability over long periods, their power consumption, and the weight of the system to be carried by small UAVs. Energy availability is a serious challenge when WSNs are deployed in remote areas with difficult access to the grid, while small UAVs are limited by the energy in their reservoir tank or batteries. Another important challenge is the management of the data produced by the sensor nodes, which requires a large amount of resources to store, analyse and display after long periods of operation. In response to these challenges, this research proposes the following solutions to improve the availability and development of these technologies for gas sensing monitoring: first, the integration of WSNs and UAVs for environmental gas sensing, in order to monitor large volumes at ground and aerial levels with a minimum number of sensor nodes for effective 3D monitoring; second, the use of solar energy as the main power source to allow continuous monitoring; and lastly, the creation of a data management platform to store, analyse and share the information with operators and external users. The principal outcome of this research is the creation of a gas sensing system suitable for monitoring any kind of gas, which has been installed and tested for CH4 and CO2 on a wireless sensor network (WSN) and on a UAV. Using the same gas sensing system on both the WSN and the UAV significantly reduces the complexity and cost of the application as it allows: a) standardisation of signal acquisition and data processing, reducing the required computational resources; b) standardisation of calibration and operational procedures, reducing systematic errors and complexity; c) reduction of weight and energy consumption, leading to improved power management and weight balance in the case of UAVs; and d) simplification of the sensor node architecture, which is easily replicated across all nodes. I evaluated two different sensor modules through laboratory, bench, and field tests: a non-dispersive infrared (NDIR) module and a metal-oxide resistive nano-sensor (MOX nano-sensor) module. The tests revealed advantages and disadvantages of the two modules when used in static nodes at ground level and in mobile nodes on board a UAV. Commercial NDIR modules for CO2 were successfully tested and evaluated in the WSN and on board the UAV; their advantages are precision and stability, but their application is limited to a few gases. The advantages of the MOX nano-sensors are their small size, low weight, low power consumption and sensitivity to a broad range of gases. However, selectivity is still a concern that needs to be addressed in further studies.
An electronic board to interface sensors across a large range of resistivity was successfully designed, built and adapted to operate on ground nodes and on board the UAV. The WSN and UAV were powered with solar energy in order to facilitate outdoor deployment, data collection and continuous monitoring over large and remote volumes. The gas sensing, solar power, transmission and data management systems of the WSN and UAV were fully evaluated through laboratory, bench and field testing. The methodology created to design, develop, integrate and test these systems is described in detail and experimentally validated. The sampling and transmission capabilities of the WSN and UAV were successfully tested in an emulated mission involving the detection and measurement of CO2 concentrations in a field, originating from a contaminant source; the data collected during the mission were transmitted in real time to a central node for data analysis and 3D mapping of the target gas. The major outcome of this research is the accomplishment of the first flight mission, not previously reported in the literature, of a solar-powered UAV equipped with a CO2 sensing system operating in conjunction with a network of ground sensor nodes for effective 3D monitoring of the target gas. A data management platform was created using an external internet server, which manages, stores, and shares the collected data through two web pages showing statistics and static graph images for internal and external users as requested. The system was bench tested with real data produced by the sensor nodes, and the architecture of the platform is described and illustrated in detail in order to provide guidance on how to replicate the system. In conclusion, the overall results of the project provide guidance on how to create a gas sensing system integrating WSNs and UAVs, how to power the system with solar energy, and how to manage the data produced by the sensor nodes. This system can be used in a wide range of outdoor applications, especially in agriculture, bushfire monitoring, mining studies, zoology, and botanical studies, opening the way to ubiquitous low-cost environmental monitoring, which may help to decrease our carbon footprint and improve the health of the planet.
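As a minimal sketch of how a node might package a geo-located, time-stamped gas reading and push it to a data-management platform over HTTP, the snippet below is illustrative only: the endpoint URL and JSON field names are assumptions, not the platform described in the thesis.

```python
# Minimal sketch of a sensor-node record and its upload to a data-management
# platform. The endpoint URL and JSON field names are hypothetical.
import json
import time
import urllib.request
from dataclasses import dataclass, asdict

PLATFORM_URL = "http://data-platform.example.org/api/samples"  # placeholder


@dataclass
class GasSample:
    node_id: str          # WSN node or UAV identifier
    gas: str              # e.g. "CO2" or "CH4"
    concentration_ppm: float
    latitude: float       # geo-location of the sample
    longitude: float
    altitude_m: float     # altitude enables 3D mapping of the target gas
    timestamp_utc: float


def upload(sample: GasSample) -> int:
    """POST one sample as JSON; return the HTTP status code."""
    body = json.dumps(asdict(sample)).encode()
    request = urllib.request.Request(
        PLATFORM_URL, data=body,
        headers={"Content-Type": "application/json"}, method="POST")
    with urllib.request.urlopen(request) as response:
        return response.status


if __name__ == "__main__":
    sample = GasSample("uav-01", "CO2", 415.2,
                       -27.4775, 153.0285, 35.0, time.time())
    print(upload(sample))
```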

Abstract:

This paper describes a safety data recording and analysis system that has been developed to capture safety occurrences, including precursors, using high-definition forward-facing video from train cabs and data from other train-borne systems. The paper describes the data processing model and how events detected through data analysis are related to an underlying socio-technical model of accident causation. The integrated approach to safety data recording and analysis ensures that systemic factors that condition, influence or potentially contribute to an occurrence are captured for both safety occurrences and precursor events, providing a rich tapestry of antecedent causal factors that can significantly improve learning about accident causation. This can ultimately benefit railways through the development of targeted and more effective countermeasures, better risk models, and more effective use and prioritization of safety funds. Level crossing occurrences are a key focus of this paper, with data analysis scenarios describing the causal factors around near-miss occurrences. The paper concludes with a discussion of how the system can also be applied to other types of railway safety occurrences.
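To make the linkage between detected events and causal factors concrete, the sketch below shows the kind of record such a system might emit for a level-crossing near miss. The field names and the factor taxonomy are illustrative assumptions, not the paper's actual data model.

```python
# Illustrative sketch only: a record type linking a detected occurrence to
# candidate socio-technical causal factors. Fields and taxonomy are hypothetical.
from dataclasses import dataclass, field
from enum import Enum
from typing import List


class CausalFactor(Enum):
    SIGHTING_DISTANCE = "restricted sighting distance"
    ROAD_USER_BEHAVIOUR = "road user behaviour"
    WARNING_EQUIPMENT = "active warning equipment state"
    TRAIN_HANDLING = "train handling / braking profile"
    ENVIRONMENT = "weather, glare or track environment"


@dataclass
class DetectedOccurrence:
    occurrence_id: str
    occurrence_type: str                   # "near miss", "precursor", ...
    timestamp_utc: str
    location_km: float                     # track kilometrage
    video_ref: str                         # offset into the forward-facing video
    speed_kmh: float                       # from train-borne systems
    causal_factors: List[CausalFactor] = field(default_factory=list)


event = DetectedOccurrence(
    "occ-0042", "level crossing near miss", "2014-03-02T09:41:07Z",
    123.4, "cab-cam-03@09:41:02", 78.5,
    [CausalFactor.SIGHTING_DISTANCE, CausalFactor.ROAD_USER_BEHAVIOUR])
print(event.occurrence_type, [f.value for f in event.causal_factors])
```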

Abstract:

This paper introduces our dedicated authenticated encryption scheme ICEPOLE. ICEPOLE is a high-speed, hardware-oriented scheme suitable for high-throughput network nodes or, more generally, any environment where specialized hardware (such as FPGAs or ASICs) can be used to provide high data processing rates. ICEPOLE-128 (the primary ICEPOLE variant) is very fast: on the modern FPGA device Virtex 6, a basic iterative architecture of ICEPOLE reaches 41 Gbit/s, which is over 10 times faster than the equivalent implementation of AES-128-GCM. The throughput-to-area ratio is also substantially better than that of AES-128-GCM. We have carefully examined the security of the algorithm using a range of cryptanalytic techniques, and our findings indicate that ICEPOLE offers a high security level.
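Since no standard ICEPOLE library binding is assumed to be available here, the sketch below illustrates the authenticated-encryption (AEAD) interface such a scheme exposes using AES-128-GCM, the comparison baseline named in the abstract, via the Python `cryptography` package.

```python
# Sketch of the AEAD interface an authenticated encryption scheme exposes,
# demonstrated with AES-128-GCM (the baseline ICEPOLE is compared against).
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=128)   # 128-bit key, as in AES-128-GCM
aead = AESGCM(key)

nonce = os.urandom(12)                      # 96-bit nonce, never reused per key
plaintext = b"payload to protect"
associated_data = b"packet header"          # authenticated but not encrypted

ciphertext = aead.encrypt(nonce, plaintext, associated_data)  # tag appended
recovered = aead.decrypt(nonce, ciphertext, associated_data)  # raises on forgery
assert recovered == plaintext
```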

Abstract:

Increasingly large-scale applications are generating an unprecedented amount of data. However, the widening gap between computation and I/O capacity on High End Computing (HEC) machines creates a severe bottleneck for data analysis. Instead of moving data from its source to the output storage, in-situ analytics processes output data while simulations are running. However, in-situ data analysis incurs much more computing resource contention with the simulations, and such contention severely degrades simulation performance on HEC machines. Since different data processing strategies have different impacts on performance and cost, there is a consequent need for flexibility in the placement of data analytics. In this paper, we explore and analyze several potential data-analytics placement strategies along the I/O path. To find the best strategy for reducing data movement in a given situation, we propose a flexible data analytics (FlexAnalytics) framework. Based on this framework, a FlexAnalytics prototype system is developed for analytics placement. The FlexAnalytics system enhances the scalability and flexibility of the current I/O stack on HEC platforms and is useful for data pre-processing, runtime data analysis and visualization, as well as large-scale data transfer. Two use cases – scientific data compression and remote visualization – were applied in the study to verify the performance of FlexAnalytics. Experimental results demonstrate that the FlexAnalytics framework increases data transfer bandwidth and improves the application's end-to-end transfer performance.
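As a toy illustration of the placement trade-off discussed above (analyze in situ on the compute nodes, in transit on staging nodes, or offline after writing to storage), the sketch below compares rough time costs under a made-up cost model; the formulas and numbers are assumptions, not FlexAnalytics' actual decision logic.

```python
# Toy illustration of the analytics-placement trade-off. The cost model and
# numbers are assumptions for illustration, not the FlexAnalytics logic.

def placement_costs(data_gb, reduction, sim_slowdown_per_gb,
                    staging_bw_gbps, storage_bw_gbps, analysis_gbps):
    """Return rough time costs (seconds) for three placement strategies."""
    analyze = data_gb / analysis_gbps
    return {
        # In situ: analyze on the compute nodes, ship only the reduced output,
        # but steal cycles from the running simulation.
        "in_situ": analyze + data_gb * sim_slowdown_per_gb
                   + (data_gb * reduction) / staging_bw_gbps,
        # In transit: forward raw data to staging nodes, analyze there.
        "in_transit": data_gb / staging_bw_gbps + analyze,
        # Offline: write everything to storage first, analyze later.
        "offline": data_gb / storage_bw_gbps + analyze,
    }


costs = placement_costs(data_gb=500, reduction=0.1, sim_slowdown_per_gb=0.02,
                        staging_bw_gbps=5.0, storage_bw_gbps=1.0,
                        analysis_gbps=2.0)
print(min(costs, key=costs.get), costs)
```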

Abstract:

Organisations use Enterprise Architecture (EA) to reduce organisational complexity, improve communication, align business and information technology (IT), and drive organisational change. Due to the dynamic nature of environmental and organisational factors, EA descriptions need to change over time to keep providing value for their stakeholders. Emerging business and IT trends, such as Service-Oriented Architecture (SOA), may impact EA frameworks, methodologies, governance and tools. However, the phenomenon of EA evolution is still poorly understood. Using Archer's morphogenetic theory as a foundation, this research conceptualises three analytical phases of EA evolution in organisations, namely conditioning, interaction and elaboration. Based on a case study with a government agency, this paper provides new empirically and theoretically grounded insights into EA evolution, in particular in relation to the introduction of SOA, and describes the relevant generative mechanisms affecting EA evolution. In doing so, it builds a foundation for further examining the impact of other IT trends, such as mobile or cloud-based solutions, on EA evolution. At a practical level, the research delivers a model that can be used to guide professionals in managing and continually evolving EA.

Abstract:

Monitoring unused or "dark" IP addresses offers opportunities to extract useful information about both ongoing and new attack patterns. In recent years, different techniques have been used to analyze such traffic, including sequential analysis, where a change in traffic behaviour, for example a change in the mean, is used as an indication of malicious activity. Change points themselves, however, say little about the detected change; further data processing is necessary to extract useful information and to identify the exact cause of the detected change, which is difficult given the size and nature of the observed traffic. In this paper, we address the problem of analyzing a large volume of such traffic by correlating change points identified in different traffic parameters. The significance of the proposed technique is two-fold: first, information related to a change point is extracted automatically by correlating change points detected across multiple traffic parameters; second, a detected change point is validated by the simultaneous presence of another change point in a different parameter. Using a real network trace collected from unused IP addresses, we demonstrate that the proposed technique enables us not only to validate the change point but also to extract useful information about its causes.
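As a minimal sketch of the general idea (not the paper's algorithm), the snippet below detects mean shifts in two synthetic traffic parameters with a simple sliding-window test and treats change points that co-occur within a small lag as mutually validating.

```python
# Minimal sketch: detect mean shifts in two traffic parameters with a
# sliding-window test, then treat change points that co-occur within a small
# lag as mutually validating. Thresholds and data are illustrative.
import numpy as np


def mean_shift_points(series, window=50, threshold=3.0):
    """Indices where the mean of the next window departs from the previous one."""
    points = []
    for i in range(window, len(series) - window):
        before = series[i - window:i]
        after = series[i:i + window]
        noise = (before.std() + after.std()) / 2 + 1e-9
        if abs(after.mean() - before.mean()) > threshold * noise:
            points.append(i)
    return points


def correlate(points_a, points_b, max_lag=10):
    """Pairs of change points from two parameters that occur within max_lag."""
    return [(a, b) for a in points_a for b in points_b if abs(a - b) <= max_lag]


rng = np.random.default_rng(0)
packets = np.concatenate([rng.poisson(20, 500), rng.poisson(60, 500)])  # packet rate
sources = np.concatenate([rng.poisson(5, 505), rng.poisson(25, 495)])   # distinct sources

validated = correlate(mean_shift_points(packets.astype(float)),
                      mean_shift_points(sources.astype(float)))
print("co-occurring change points:", validated[:3])
```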

Abstract:

In this paper, the problems of three-carrier phase ambiguity resolution (TCAR) and position estimation (PE) are generalized as real-time GNSS data processing problems for a large-scale, continuously observing network. To describe these problems, a general linear equation system is presented that unifies the various geometry-free, geometry-based and geometry-constrained TCAR models, along with the state transition equations between observation epochs. With this general formulation, generalized TCAR solutions are given to cover different real-time GNSS data processing scenarios, together with various simplified integer solutions, such as geometry-free rounding and geometry-based LAMBDA solutions with single- and multiple-epoch measurements. In fact, the various ambiguity resolution (AR) solutions differ in their floating ambiguity estimation and integer ambiguity search processes, but their theoretical equivalence holds under the same observational system models and statistical assumptions. TCAR performance benefits, as outlined by the data analyses in the recent literature, are reviewed, showing profound implications for future GNSS development from both technology and application perspectives.
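As a minimal numerical sketch of the float-then-fix idea behind the simplified integer solutions mentioned above, the snippet below estimates float ambiguities by least squares from a generic linear observation model and then fixes them by rounding. The matrices and values are illustrative assumptions, not one of the paper's TCAR models; LAMBDA would instead search the integer lattice using the full ambiguity covariance.

```python
# Minimal sketch of float-then-fix ambiguity resolution for a generic model
#     y = A a + B b + e   (a: integer ambiguities, b: real-valued parameters).
# The matrices and values are illustrative, not one of the paper's TCAR models.
import numpy as np

rng = np.random.default_rng(1)

n_obs, n_amb, n_par = 12, 3, 2
A = rng.normal(size=(n_obs, n_amb))          # design matrix for ambiguities
B = rng.normal(size=(n_obs, n_par))          # design matrix for other parameters
a_true = np.array([7.0, -3.0, 12.0])         # true integer ambiguities (cycles)
b_true = np.array([0.4, -1.1])               # e.g. baseline components / clock
y = A @ a_true + B @ b_true + rng.normal(scale=0.01, size=n_obs)  # noisy obs

# Float solution: joint least-squares estimate of [a, b].
x_float, *_ = np.linalg.lstsq(np.hstack([A, B]), y, rcond=None)
a_float = x_float[:n_amb]

# Integer fixing by simple rounding (the geometry-free rounding idea).
a_fixed = np.rint(a_float)
print("float:", np.round(a_float, 3), "fixed:", a_fixed)
```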