793 results for Data compression (Telecommunication)
Abstract:
The objective of this chapter is to provide an overview of the traffic data collection that can and should be used for the calibration and validation of traffic simulation models. Data availability varies considerably across sources: some types, such as loop detector data, are widely available and widely used; others, such as travel time data from GPS probe vehicles, can be measured with additional effort; and some, such as trajectory data, are available only in rare situations such as research projects.
Abstract:
This project identified the lack of data analysis and travel time prediction on arterials as the main gap in the current literature. It first investigated the reliability of data gathered by Bluetooth technology as a new cost-effective method of data collection on arterial roads. Then, exploiting the similarity among daily travel time profiles on different arterial routes, it created a seasonal ARIMA (SARIMA) model to predict future travel time values. Based on this outcome, the model can be applied to online short-term travel time prediction.
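As a hedged illustration of the modelling step described above, the sketch below fits a minimal seasonal AR(1) predictor in the spirit of SARIMA to synthetic daily travel times. The model orders, the weekly period, and the data are illustrative assumptions, not the authors' specification.

```python
import numpy as np

# Minimal seasonal AR(1) sketch in the spirit of SARIMA -- effectively
# SARIMA(1,0,0)x(0,1,0,7): fit an AR(1) to the seasonally differenced
# series, assuming a weekly period. Orders and data are illustrative only.
def fit_seasonal_ar1(y, period=7):
    d = y[period:] - y[:-period]                  # seasonal differencing
    return np.dot(d[1:], d[:-1]) / np.dot(d[:-1], d[:-1])  # AR(1) least squares

def forecast_one(y, phi, period=7):
    d_last = y[-1] - y[-1 - period]               # most recent seasonal difference
    return y[-period] + phi * d_last              # AR(1) step, then undo differencing

# Synthetic daily travel times (seconds) with a weekly cycle, a stand-in
# for the Bluetooth-derived arterial data described in the abstract.
rng = np.random.default_rng(0)
t = np.arange(200)
y = 300 + 40 * np.sin(2 * np.pi * t / 7) + rng.normal(0, 2, 200)

phi = fit_seasonal_ar1(y)
print(round(float(forecast_one(y, phi)), 1))      # one-step-ahead prediction
```

A production model would estimate full (p,d,q)x(P,D,Q,s) orders from the data, e.g. with a statistical package, rather than this single-coefficient sketch.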
Abstract:
In this study, an LPG fumigation system was fitted to a Euro III compression ignition (CI) engine to explore its impact on performance and on gaseous and particulate emissions. LPG was introduced to the intake air stream (as a secondary fuel) using a low-pressure fuel injector situated upstream of the turbocharger. LPG substitutions were test-mode dependent, but varied in the range of 14-29% by energy. The engine was tested over a five-point test cycle using ultra-low sulphur diesel (ULSD), and a low and a high LPG substitution at each test mode. The results show that LPG fumigation shifts the combustion towards the pre-mixed mode, as increases in the peak combustion pressure (and the rate of pressure rise) were observed in most tests. The emissions results show decreases in nitric oxide (NO) and particulate matter (PM2.5) emissions; however, very significant increases in carbon monoxide (CO) and hydrocarbon (HC) emissions were observed. A more detailed investigation of the particulate emissions showed that the number of particles emitted was reduced with LPG fumigation at all test settings, apart from mode 6 of the ECE R49 test cycle. Furthermore, the particles emitted generally had a slightly larger median diameter with LPG fumigation, and had a smaller semi-volatile fraction relative to ULSD. Overall, the results show that, with some modifications, LPG fumigation systems could be used to extend ULSD supplies without adversely impacting engine performance and emissions.
Abstract:
This paper presents the fire performance results of light gauge steel frame (LSF) walls lined with single and double plasterboards, and externally insulated with rock fibre insulation, obtained using a finite element analysis based parametric study. A validated numerical model was used to study the influence of various fire curves developed for a range of compartment characteristics. Data from the parametric study were used to develop a simplified method for predicting the fire resistance ratings of LSF walls exposed to realistic design fire curves. This paper also presents the details of suitable fire design rules based on current cold-formed steel standards and on the modifications proposed by previous researchers. Of these, the recently developed design rules of Gunalan and Mahendran [1] were investigated to determine their applicability to predicting the axial compression strengths and fire resistance ratings (FRR) of LSF walls exposed to realistic design fires. Finally, the stud failure times obtained from the fire design rules and the finite element studies were compared for LSF walls lined with single and double plasterboards, and externally insulated with rock fibres, under realistic design fire curves.
Abstract:
This research proposes the development of interfaces to support collaborative, community-driven inquiry into data, which we refer to as Participatory Data Analytics. Since the investigation is led by local communities, it is not possible to anticipate which data will be relevant and which questions will be asked. Therefore, users must be able to construct and tailor visualisations to their own needs. The poster presents early work towards defining a suitable compositional model, which will allow users to mix, match, and manipulate data sets to obtain visual representations with little to no programming knowledge. Following a user-centred design process, we subsequently plan to identify appropriate interaction techniques and metaphors for generating such visual specifications on wall-sized, multi-touch displays.
Abstract:
We consider the following problem: a user stores encrypted documents on an untrusted server and wishes to retrieve all documents containing some keywords without any loss of data confidentiality. Conjunctive keyword searches on encrypted data have been studied by numerous researchers over the past few years, and all existing schemes use keyword fields as compulsory information. This, however, is impractical for many applications. In this paper, we propose a keyword-field-free conjunctive keyword search scheme on encrypted data, which affirmatively answers an open problem posed by Golle et al. at ACNS 2004. Furthermore, the proposed scheme is extended to the dynamic group setting. A security analysis of our constructions is given in the paper.
Abstract:
Carcinoma ex pleomorphic adenoma (Ca ex PA) is a carcinoma arising from a primary or recurrent benign pleomorphic adenoma. It often poses a diagnostic challenge to clinicians and pathologists. This study intends to review the literature and highlight current clinical and molecular perspectives on this entity. The most common clinical presentation of Ca ex PA is a firm mass in the parotid gland. The proportion of adenoma and carcinoma components determines the macroscopic features of this neoplasm. The entity is difficult to diagnose pre-operatively; pathologic assessment is the gold standard for making the diagnosis. Treatment for Ca ex PA often involves an ablative surgical procedure, which may be followed by radiotherapy. Overall, patients with Ca ex PA have a poor prognosis. Accurate diagnosis and aggressive surgical management of patients presenting with Ca ex PA can increase their survival rates. Molecular studies have revealed that the development of Ca ex PA follows a multi-step model of carcinogenesis, with progressive loss of heterozygosity at chromosomal arms 8q, then 12q and finally 17p. There are specific candidate genes in these regions that are associated with particular stages in the progression of Ca ex PA. In addition, many genes that regulate tumour suppression, cell cycle control, growth factors and cell-cell adhesion play a role in the development and progression of Ca ex PA. It is hoped that these molecular data can give clues for the diagnosis and management of the disease.
Abstract:
The ability to build high-fidelity 3D representations of the environment from sensor data is critical for autonomous robots. Multi-sensor data fusion allows for more complete and accurate representations. Furthermore, using distinct sensing modalities (i.e. sensors using a different physical process and/or operating at different electromagnetic frequencies) usually leads to more reliable perception, especially in challenging environments, as modalities may complement each other. However, they may react differently to certain materials or environmental conditions, leading to catastrophic fusion. In this paper, we propose a new method to reliably fuse data from multiple sensing modalities, including in situations where they detect different targets. We first compute distinct continuous surface representations for each sensing modality, with uncertainty, using Gaussian Process Implicit Surfaces (GPIS). Second, we perform a local consistency test between these representations, to separate consistent data (i.e. data corresponding to the detection of the same target by the sensors) from inconsistent data. The consistent data can then be fused together, using another GPIS process, and the rest of the data can be combined as appropriate. The approach is first validated using synthetic data. We then demonstrate its benefit using a mobile robot, equipped with a laser scanner and a radar, which operates in an outdoor environment in the presence of large clouds of airborne dust and smoke.
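The per-modality fit and local consistency test described above can be sketched in one dimension. The code below is a simplified surrogate, a plain Gaussian Process regression per sensor rather than a full implicit-surface (GPIS) model; the kernel parameters, the 2-sigma consistency threshold, and the synthetic laser/radar data are all assumptions for illustration.

```python
import numpy as np

# Hedged sketch of the consistency idea: fit an independent GP (RBF kernel)
# to each modality's measurements, then flag query points where the two
# posterior means disagree by more than 2 combined standard deviations.
def gp_posterior(x_train, y_train, x_query, ell=1.0, sf=1.0, noise=0.1):
    def k(a, b):
        return sf**2 * np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)
    K = k(x_train, x_train) + noise**2 * np.eye(len(x_train))
    Ks = k(x_query, x_train)
    mean = Ks @ np.linalg.solve(K, y_train)
    var = sf**2 - np.einsum('ij,ji->i', Ks, np.linalg.solve(K, Ks.T))
    return mean, np.sqrt(np.maximum(var, 1e-12))

x = np.linspace(0, 10, 30)
laser = np.sin(x)                      # laser sees the true surface
radar = np.sin(x).copy()
radar[15:20] += 2.0                    # radar returns a different target here

xq = np.linspace(0, 10, 50)
m1, s1 = gp_posterior(x, laser, xq)
m2, s2 = gp_posterior(x, radar, xq)
consistent = np.abs(m1 - m2) < 2.0 * np.sqrt(s1**2 + s2**2)
print(consistent.sum(), "of", len(xq), "query points consistent")
```

Consistent points would then be fused in a second GP; the inconsistent region is exactly where naive fusion would be catastrophic.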
Abstract:
Aim: Performance measures for Australian laboratories reporting cervical cytology are a set of quantifiable measures relating to the profile and accuracy of reporting. This study reviews aggregate data collected over the ten years in which participation in the performance measures has been mandatory. Methods: Laboratories submit annual data on performance measures relating to the profile of reporting, including reporting rates for technically unsatisfactory specimens, high grade or possible high grade abnormalities, and abnormal reports. Cytology-histology correlation data and review findings of negative smears reported from women with histological high grade disease are also collected. Suggested acceptable standards are set for each measure. This study reviews the aggregate data submitted by all laboratories for the years 1998-2008 and examines trends in reporting and the performance of laboratories against the suggested standards. Results: The performance of Australian laboratories has shown continued improvement over the study period. There has been a fall in the proportion of laboratories with data outside the acceptable standard range on all performance measures. Laboratories are reporting a greater proportion of specimens as definite or possible high grade abnormality. This is partly attributable to an increase in the proportion of abnormal results classified as high grade or possible high grade abnormality. Despite this, the positive predictive value for high grade and possible high grade abnormalities has continued to rise. Conclusion: Performance measures for cervical cytology have provided a valuable addition to external quality assurance procedures in Australia. They have documented continued improvement in aggregate performance, as well as providing benchmarking data and goals for acceptable performance for individual laboratories.
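As a small illustration of the positive predictive value tracked by these performance measures: PPV is the fraction of high grade (or possible high grade) cytology reports that are confirmed on histology. The counts below are made up for the example, not taken from the study.

```python
# Positive predictive value: confirmed high grade results among all
# high grade cytology reports. Counts are hypothetical.
def ppv(true_positives, false_positives):
    return true_positives / (true_positives + false_positives)

# 170 cytology reports of high grade abnormality confirmed on histology,
# 30 not confirmed:
print(ppv(170, 30))  # -> 0.85
```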
Abstract:
This paper describes a safety data recording and analysis system that has been developed to capture safety occurrences, including precursors, using high-definition forward-facing video from train cabs and data from other train-borne systems. The paper describes the data processing model and how events detected through data analysis are related to an underlying socio-technical model of accident causation. The integrated approach to safety data recording and analysis ensures that systemic factors that condition, influence or potentially contribute to an occurrence are captured for both safety occurrences and precursor events, providing a rich tapestry of antecedent causal factors that can significantly improve learning around accident causation. This can ultimately benefit railways through the development of targeted and more effective countermeasures, better risk models and more effective use and prioritization of safety funds. Level crossing occurrences are a key focus in this paper, with data analysis scenarios describing causal factors around near-miss occurrences. The paper concludes with a discussion of how the system can also be applied to other types of railway safety occurrence.
Abstract:
In the Australian sugar industry, sugar cane is smashed into a straw-like material by hammers before being squeezed between large rollers to extract the sugar juice. The straw-like material is initially called prepared cane, and then bagasse as it passes through successive roller milling units. The sugar cane materials are highly compressible, have high moisture content, are fibrous, and resemble some peat soils in both appearance and mechanical behaviour. A promising avenue for improving the performance of milling units, to increase throughput and juice extraction and to reduce costs, is modelling of the crushing process. To achieve this, it is believed necessary that milling models be able to reproduce measured bagasse behaviour. This investigation sought to measure the mechanical (compression, shear, and volume) behaviour of prepared cane and bagasse, to identify limitations in currently used material models, and to progress towards a material model that can predict bagasse behaviour adequately. Tests were carried out using modified direct shear test equipment and procedures over most of the large range of pressures occurring in the crushing process. The investigation included an assessment, carried out using finite element modelling, of the performance of the direct shear test for measuring bagasse behaviour. It was shown that prepared cane and bagasse exhibited critical state behaviour similar to that of soils, and the magnitudes of the material parameters were determined. The measurements were used to identify desirable features for a bagasse material model. It was shown that currently used material models had major limitations in reproducing bagasse behaviour. A model from the soil mechanics literature was modified and shown to achieve improved reproduction while using magnitudes of material parameters that better reflected the measured values. Finally, a typical three-roller mill pressure feeder configuration was modelled, and its predictions and limitations were assessed by comparison with measured data from a sugar factory.
Abstract:
Using a case study approach, this paper presents a robust methodology for assessing the compatibility of stormwater treatment performance data between two geographical regions in relation to a treatment system. The desktop analysis compared data derived from a field study undertaken in Florida, USA, with South East Queensland (SEQ) rainfall and pollutant characteristics. The analysis was based on the hypothesis that, when transposing treatment performance information from one geographical region to another, a detailed assessment of specific rainfall and stormwater quality parameters is required. Accordingly, the characteristics of measured rainfall events and stormwater quality in the Florida study were compared with typical characteristics for SEQ. Rainfall events monitored in the Florida study were found to be similar to events that occur in SEQ in terms of their primary characteristics of depth, duration and intensity. Similarities in the total suspended solids (TSS) and total nitrogen (TN) concentration ranges for Florida and SEQ suggest that TSS and TN removal performance would not be very different if the treatment system were installed in SEQ. However, further investigation is needed to evaluate the treatment performance for total phosphorus (TP). The methodology presented also allows comparison of other water quality parameters.
Abstract:
The foliage of a plant performs vital functions. As such, leaf models need to be developed for modelling plant architecture from a set of scattered data points captured using a scanning device. The leaf model can be used for purely visual purposes or as part of a further model, such as a fluid movement model or a biological process. For these reasons, an accurate mathematical representation of the surface and boundary is required. This paper compares three approaches for fitting a continuously differentiable surface through a set of scanned data points from a leaf surface with a technique already used for reconstructing leaf surfaces. The techniques considered are discrete smoothing D2-splines [R. Arcangeli, M. C. Lopez de Silanes, and J. J. Torrens, Multidimensional Minimising Splines, Springer, 2004], the thin plate spline finite element smoother [S. Roberts, M. Hegland, and I. Altas, Approximation of a Thin Plate Spline Smoother using Continuous Piecewise Polynomial Functions, SIAM, 1 (2003), pp. 208-234] and the radial basis function Clough-Tocher method [M. Oqielat, I. Turner, and J. Belward, A hybrid Clough-Tocher method for surface fitting with application to leaf data, Appl. Math. Modelling, 33 (2009), pp. 2582-2595]. Numerical results show that discrete smoothing D2-splines produce reconstructed leaf surfaces that better represent the original physical leaf.
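For context, a minimal interpolating thin plate spline, the simplest member of the spline family compared in the paper, can be sketched as below. The paper's methods are smoothing formulations, which differ from this plain interpolating variant, and the synthetic "leaf" heights are an assumption for illustration.

```python
import numpy as np

# Interpolating thin plate spline through scattered (x, y, z) points:
# f(q) = sum_i w_i * phi(|q - p_i|) + c0 + c1*x + c2*y, phi(r) = r^2 log r.
def tps_fit(xy, z):
    n = len(xy)
    r2 = np.sum((xy[:, None, :] - xy[None, :, :])**2, axis=-1)
    K = np.where(r2 > 0, 0.5 * r2 * np.log(r2 + 1e-300), 0.0)  # r^2 log r
    P = np.hstack([np.ones((n, 1)), xy])                       # affine part
    A = np.block([[K, P], [P.T, np.zeros((3, 3))]])
    sol = np.linalg.solve(A, np.concatenate([z, np.zeros(3)]))
    return sol[:n], sol[n:]                                    # weights, affine coeffs

def tps_eval(xy, w, c, q):
    r2 = np.sum((q[:, None, :] - xy[None, :, :])**2, axis=-1)
    K = np.where(r2 > 0, 0.5 * r2 * np.log(r2 + 1e-300), 0.0)
    return K @ w + c[0] + q @ c[1:]

# Synthetic stand-in for scanned leaf data: (x, y) positions and heights.
rng = np.random.default_rng(1)
pts = rng.uniform(0, 1, (40, 2))
z = 0.1 * np.sin(3 * pts[:, 0]) + 0.05 * pts[:, 1]

w, c = tps_fit(pts, z)
query = rng.uniform(0, 1, (5, 2))
print(tps_eval(pts, w, c, query))   # surface heights at new positions
```

The interpolant passes exactly through the scanned points; the smoothing variants in the paper trade that exactness for robustness to scanner noise.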
Abstract:
Background: Detection of outbreaks is an important part of disease surveillance. Although many algorithms have been designed for detecting outbreaks, few have been specifically assessed against diseases that have distinct seasonal incidence patterns, such as those caused by vector-borne pathogens. Methods: We applied five previously reported outbreak detection algorithms to Ross River virus (RRV) disease data (1991-2007) for the four local government areas (LGAs) of Brisbane, Emerald, Redland and Townsville in Queensland, Australia. The methods used were the Early Aberration Reporting System (EARS) C1, C2 and C3 methods, the negative binomial cusum (NBC), the historical limits method (HLM), the Poisson outbreak detection (POD) method and the purely temporal SaTScan analysis. Seasonally adjusted variants of the NBC and SaTScan methods were developed. Some of the algorithms were applied using a range of parameter values, resulting in 17 variants of the five algorithms. Results: The 9,188 RRV disease notifications that occurred in the four selected regions over the study period showed marked seasonality, which adversely affected the performance of some of the outbreak detection algorithms. Most of the methods examined were able to detect the same major events. The exception was the seasonally adjusted NBC methods, which detected an excess of short signals. The NBC, POD and temporal SaTScan algorithms were the only methods that consistently had high true positive rates and low false positive and false negative rates across the four study areas. The timeliness of the outbreak signals generated by each method was also compared, but there was no consistency across outbreaks and LGAs. Conclusions: This study has highlighted several issues associated with applying outbreak detection algorithms to seasonal disease data. In the absence of a true gold standard, quantitative comparison is difficult, and caution should be taken when interpreting true positives, false positives, sensitivity and specificity.
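The EARS C2 method applied in the study can be sketched as follows: each day's count is compared with the mean of a short baseline window, separated from the current day by a guard band, in baseline standard-deviation units. The 7-day baseline, 2-day guard and threshold of 3 used here are common defaults, not necessarily the parameter variants evaluated by the authors, and the case series is synthetic.

```python
import numpy as np

# EARS C2-style detector: alarm when today's count exceeds the baseline
# mean by more than `threshold` baseline standard deviations.
def ears_c2(counts, baseline=7, guard=2, threshold=3.0):
    counts = np.asarray(counts, dtype=float)
    alarms = np.zeros(len(counts), dtype=bool)
    for t in range(baseline + guard, len(counts)):
        ref = counts[t - guard - baseline : t - guard]   # baseline window
        mu, sd = ref.mean(), max(ref.std(ddof=1), 1e-6)  # guard against sd = 0
        alarms[t] = (counts[t] - mu) / sd > threshold
    return alarms

# Synthetic daily notification counts with an injected outbreak.
rng = np.random.default_rng(2)
series = rng.poisson(2.0, 120)        # endemic background
series[80:84] += 15                   # outbreak on days 80-83
alarms = ears_c2(series)
print(np.flatnonzero(alarms))         # days on which the detector fires
```

On strongly seasonal data, the short baseline makes C2 fire on every seasonal upswing, which is exactly the behaviour the study's seasonal adjustment is meant to address.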