95 results for rainfall-runoff empirical statistical model

at Queensland University of Technology - ePrints Archive


Relevance:

100.00%

Publisher:

Abstract:

IEEE 802.11p is the new standard for Inter-Vehicular Communications (IVC) using the 5.9 GHz frequency band, as part of the DSRC framework; it will enable applications based on Cooperative Systems. Simulation is widely used to estimate or verify the potential benefits of such cooperative applications, notably in terms of safety for drivers. We have developed a performance model for 802.11p that can be used by simulations of cooperative applications (e.g. collision avoidance) without requiring intricate models of the whole IVC stack. Instead, it provides a straightforward yet realistic model of IVC performance. Our model uses data from extensive field trials to infer the correlation between speed, distance and performance metrics such as maximum range, latency and frame loss. We then refine this model to limit the number of profiles that have to be generated when there are more than a few emitter-receiver pairs in a given location. Our model generates realistic performance figures for rural and suburban environments among small groups of IVC-equipped vehicles and roadside units.
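Such a profile-based abstraction can be consumed directly by an application-level simulator. As a hedged illustration (the distance/loss profile below is invented for the sketch, not the field-trial data, and the real model also conditions on speed), frame loss can be obtained by interpolating between measured profile points:

```python
# Hypothetical sketch of a profile-based 802.11p performance model:
# frame loss is looked up by linear interpolation between measured
# distance points. The profile values here are illustrative only.

PROFILE = [(0, 0.01), (100, 0.03), (300, 0.15), (600, 0.60), (800, 1.00)]

def frame_loss(distance_m):
    """Interpolated frame-loss ratio for an emitter-receiver distance."""
    if distance_m <= PROFILE[0][0]:
        return PROFILE[0][1]
    for (d0, p0), (d1, p1) in zip(PROFILE, PROFILE[1:]):
        if distance_m <= d1:
            t = (distance_m - d0) / (d1 - d0)
            return p0 + t * (p1 - p0)
    return PROFILE[-1][1]  # beyond maximum range: all frames lost
```

A simulator then only needs one such profile per environment class, rather than a full model of the IVC stack.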

Relevance:

100.00%

Publisher:

Abstract:

IEEE 802.11p is the new standard for intervehicular communications (IVC) using the 5.9 GHz frequency band; it is planned to be widely deployed to enable cooperative systems. 802.11p uses and performance have been studied theoretically and in simulations over the past years. Unfortunately, many of these results have not been confirmed by on-track experimentation. In this paper, we describe field trials of 802.11p technology with our test vehicles; metrics such as maximum range, latency and frame loss are examined. We then propose a detailed model of 802.11p that can be used to accurately simulate its performance within Cooperative Systems (CS) applications.

Relevance:

100.00%

Publisher:

Abstract:

This paper provides a first look at the acceptance of Accountable-eHealth (AeH) systems, a new genre of eHealth systems designed to manage the information privacy concerns that hinder the proliferation of eHealth. The underlying concept of AeH systems is appropriate use of information through after-the-fact accountability for intentional misuse of information by healthcare professionals. An online questionnaire survey was utilised for data collection from three educational institutions in Queensland, Australia. A total of 23 hypotheses relating to 9 constructs were tested using a structural equation modelling technique. A total of 334 valid responses were received. The cohort consisted of medical, nursing and other health-related students studying at various levels in both undergraduate and postgraduate courses. The hypothesis testing disproved 7 hypotheses. The empirical research model developed was capable of predicting 47.3% of healthcare professionals' perceived intention to use AeH systems. A validation of the model with a wider survey cohort would be useful to confirm the current findings.

Relevance:

100.00%

Publisher:

Abstract:

Introduced in this paper is a Bayesian model for isolating the resonant frequency from combustion chamber resonance. The model focuses on characterising the initial rise in the resonant frequency in order to investigate the rise of in-cylinder bulk temperature associated with combustion. By resolving the model parameters, it is possible to determine: the start of pre-mixed combustion, the start of diffusion combustion, the initial resonant frequency, the resonant frequency as a function of crank angle, the in-cylinder bulk temperature as a function of crank angle, and the trapped mass as a function of crank angle. The Bayesian method allows individual cycles to be examined without cycle-averaging, allowing inter-cycle variability studies. Results are shown for a turbocharged, common-rail compression ignition engine run at 2000 rpm and full load.
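For readers unfamiliar with the underlying physics, the link between resonant frequency and bulk temperature comes from the speed of sound in the cylinder. The sketch below uses the standard first circumferential acoustic mode relation rather than the paper's full Bayesian formulation; the gas-property constants are assumed values, not the paper's.

```python
import math

# Illustrative relation only: the (1,0) circumferential resonance of a
# cylindrical chamber links resonant frequency f to bulk gas temperature T
# via the speed of sound c:
#   f = alpha * c / (pi * B),  c = sqrt(gamma * R * T)
# so T can be recovered from an observed resonant frequency.

ALPHA = 1.841   # first Bessel mode coefficient for the (1,0) mode
GAMMA = 1.33    # assumed ratio of specific heats for combustion gases
R = 287.0       # assumed specific gas constant, J/(kg K)

def bulk_temperature(f_hz, bore_m):
    """Bulk in-cylinder temperature (K) implied by resonant frequency f_hz."""
    c = math.pi * bore_m * f_hz / ALPHA   # speed of sound from the mode relation
    return c ** 2 / (GAMMA * R)
```

Tracking f as a function of crank angle therefore yields the bulk temperature trace described in the abstract.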

Relevance:

100.00%

Publisher:

Abstract:

This paper provides a first look at the acceptance of Accountable-eHealth (AeH) systems, a new genre of eHealth systems designed to manage the information privacy concerns that hinder the proliferation of eHealth. The underlying concept of AeH systems is appropriate use of information through after-the-fact accountability for intentional misuse of information by healthcare professionals. An online questionnaire survey was utilised for data collection from three educational institutions in Queensland, Australia. A total of 23 hypotheses relating to 9 constructs were tested using a structural equation modelling technique. The moderation effects on the hypotheses were also tested, based on six moderation factors, to understand their role in the designed research model. A total of 334 valid responses were received. The cohort consisted of medical, nursing and other health-related students studying at various levels in both undergraduate and postgraduate courses. Hypothesis testing provided sufficient data to accept 7 hypotheses. The empirical research model developed was capable of predicting 47.3% of healthcare professionals' perceived intention to use AeH systems. All six moderation factors showed significant influence on the research model. A validation of this model with a wider survey cohort is recommended as a future study.

Relevance:

100.00%

Publisher:

Abstract:

Background: The quality of stormwater runoff from ports is significant as it can be an important source of pollution to the marine environment. This is also a significant issue for the Port of Brisbane as it is located in an area of high environmental value. Therefore, it is imperative to develop an in-depth understanding of stormwater runoff quality to ensure that appropriate strategies are in place for quality improvement, where necessary. To this end, the Port of Brisbane Corporation aimed to develop a port-specific stormwater model for the Fisherman Islands facility. The need has to be considered in the context of the proposed future developments of the Port area.

The Project: The research project is an outcome of the collaborative partnership between the Port of Brisbane Corporation (POBC) and Queensland University of Technology (QUT). A key feature of this partnership is that it seeks to undertake research to assist the Port in strengthening the environmental custodianship of the Port area through 'cutting edge' research and its translation into practical application.

The project was separated into two stages. The first stage developed a quantitative understanding of the pollutant load generation potential of the existing land uses. This knowledge was then used as input for the stormwater quality model developed in the subsequent stage. The aim is to expand this model across the yet-to-be-developed port expansion area, in order to predict pollutant loads associated with stormwater flows from this area, with the longer-term objective of contributing to the development of ecological risk mitigation strategies for future expansion scenarios.

Study approach: Stage 1 of the overall study confirmed that Port land uses are unique in terms of the anthropogenic activities occurring on them. This uniqueness results in distinctive stormwater quality characteristics different from those of conventional urban land uses. Therefore, it was not scientifically valid to consider the Port as belonging to a single land use category, or to consider it as being similar to any typical urban land use. The approach adopted in this study was very different from conventional modelling studies, where modelling parameters are developed using calibration. The field investigations undertaken in Stage 1 of the overall study created fundamental knowledge on pollutant build-up and wash-off in different Port land uses. This knowledge was then used in computer modelling so that the specific characteristics of pollutant build-up and wash-off could be replicated. This meant that no calibration processes were involved, due to the use of measured parameters for build-up and wash-off.

Conclusions: Stage 2 of the study was primarily undertaken using the SWMM stormwater quality model. It is a physically based model which replicates natural processes as closely as possible. The time step used and the catchment variability considered were adequate to accommodate the temporal and spatial variability of input parameters, and the parameters used in the modelling reflect the true nature of rainfall-runoff and pollutant processes to the best of currently available knowledge. In this study, the initial loss values adopted for the impervious surfaces are relatively high compared to values noted in the research literature. However, given the scientifically valid approach used for the field investigations, it is appropriate to adopt the initial losses derived from this study for future modelling of Port land uses. The relatively high initial losses will significantly reduce the runoff volume generated as well as the frequency of runoff events. Apart from the initial losses, most of the other parameters used in SWMM modelling are generic to most modelling studies.

Development of parameters for MUSIC model source nodes was one of the primary objectives of this study. MUSIC uses the mean and standard deviation of pollutant parameters based on a normal distribution. However, based on the values generated in this study, the variation of Event Mean Concentrations (EMCs) for Port land uses within the given investigation period does not fit a normal distribution. This is possibly because only one specific location was considered, namely the Port of Brisbane, unlike in the case of the MUSIC model, where a range of areas with different geographic and climatic conditions was investigated. Consequently, the assumptions used in MUSIC are not fully applicable to the analysis of water quality in Port land uses. Therefore, when using the parameters included in this report for MUSIC modelling, it is important to note that they may result in under- or over-estimation of annual pollutant loads. It is recommended that the annual pollutant load values given in the report be used as a guide to assess the accuracy of the modelling outcomes. A step-by-step guide for using the knowledge generated from this study for MUSIC modelling is given in Table 4.6.

Recommendations: The following recommendations are provided to further strengthen the cutting-edge nature of the work undertaken:
* It is important to further validate the approach recommended for stormwater quality modelling at the Port. Validation will require data collection in relation to rainfall, runoff and water quality from the selected Port land uses. Additionally, the recommended modelling approach could be applied to a soon-to-be-developed area to assess 'before' and 'after' scenarios.
* In the modelling study, TSS was adopted as the surrogate parameter for other pollutants. This approach was based on other urban water quality research undertaken at QUT. The validity of this approach should be further assessed for Port land uses.
* The adoption of TSS as a surrogate parameter for other pollutants, and the confirmation that the <150 µm particle size range was predominant in suspended solids for pollutant wash-off, give rise to a number of important considerations. The ability of the existing structural stormwater mitigation measures to remove the <150 µm particle size range needs to be assessed. The feasibility of introducing source control measures, as opposed to end-of-pipe measures, for stormwater quality improvement may also need to be considered.
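As background to the EMC discussion above, an event mean concentration is simply the flow-weighted mean concentration over a runoff event. A minimal sketch (the paired flow and concentration samples would come from event monitoring records; the numbers used here are illustrative):

```python
# Sketch of the Event Mean Concentration (EMC) computation that underlies
# MUSIC source-node parameters: EMC = sum(C*Q) / sum(Q) over paired
# flow/concentration samples taken through a runoff event.

def event_mean_concentration(flows, concentrations):
    """Flow-weighted EMC from equally spaced paired samples."""
    load = sum(c * q for c, q in zip(concentrations, flows))
    volume = sum(flows)
    return load / volume
```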

Relevance:

100.00%

Publisher:

Abstract:

Due to knowledge gaps in relation to urban stormwater quality processes, an in-depth understanding of model uncertainty can enhance decision making. Uncertainty in stormwater quality models can originate from a range of sources, such as the complexity of urban rainfall-runoff-stormwater pollutant processes and the paucity of observed data. Unfortunately, studies relating to epistemic uncertainty, which arises from the simplification of reality, are limited, and such uncertainty is often deemed mostly unquantifiable. This paper presents a statistical modelling framework for ascertaining the epistemic uncertainty associated with pollutant wash-off under a regression modelling paradigm, using Ordinary Least Squares Regression (OLSR) and Weighted Least Squares Regression (WLSR) methods with a Bayesian/Gibbs sampling statistical approach. The study results confirmed that WLSR assuming probability-distributed data provides more realistic uncertainty estimates of the observed and predicted wash-off values than OLSR modelling. It was also noted that the Bayesian/Gibbs sampling approach is superior to the classical statistical and deterministic approaches most commonly adopted in water quality modelling. The study outcomes confirmed that the prediction error associated with wash-off replication is relatively high due to limited data availability. The uncertainty analysis also highlighted the variability of the wash-off modelling coefficient k as a function of complex physical processes, which is primarily influenced by surface characteristics and rainfall intensity.
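The OLSR/WLSR contrast above reduces, at its core, to how residuals are weighted when fitting the wash-off regression. A minimal closed-form sketch of both fits for a line y = b0 + b1·x (the Bayesian/Gibbs sampling layer the paper uses for uncertainty estimation is omitted, and the data in the test are illustrative):

```python
import numpy as np

# Ordinary vs weighted least squares for a simple wash-off regression.
# With w=None all observations get unit weight (OLS); otherwise each
# observation's squared residual is scaled by its weight (WLS).

def fit(x, y, w=None):
    """Return (b0, b1) minimising the (weighted) sum of squared residuals."""
    X = np.column_stack([np.ones_like(x), x])   # design matrix [1, x]
    if w is None:
        w = np.ones_like(x)
    W = np.diag(w)
    # normal equations: (X^T W X) beta = X^T W y
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
```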

Relevance:

100.00%

Publisher:

Abstract:

Multicarrier code division multiple access (MC-CDMA) is a very promising candidate for the multiple access scheme in fourth generation wireless communication systems. During asynchronous transmission, multiple access interference (MAI) is a major challenge for MC-CDMA systems and significantly affects their performance. The main objectives of this thesis are to analyze the MAI in asynchronous MC-CDMA, and to develop robust techniques to reduce the MAI effect. Focus is first on the statistical analysis of MAI in asynchronous MC-CDMA. A new statistical model of MAI is developed. In the new model, the derivation of MAI can be applied to different distributions of timing offset, and the MAI power is modelled as a Gamma-distributed random variable. By applying the new statistical model of MAI, a new computer simulation model is proposed. This model is based on modelling a multiuser system as a single-user system followed by an additive noise component representing the MAI, which enables the new simulation model to significantly reduce the computation load during computer simulations. MAI reduction using the slow frequency hopping (SFH) technique is the topic of the second part of the thesis. Two subsystems are considered. The first subsystem involves subcarrier frequency hopping as a group, which is referred to as GSFH/MC-CDMA. In the second subsystem, the condition of group hopping is dropped, resulting in a more general system, namely individual subcarrier frequency hopping MC-CDMA (ISFH/MC-CDMA). This research found that with the introduction of SFH, both GSFH/MC-CDMA and ISFH/MC-CDMA systems generate less MAI power than the basic MC-CDMA system during asynchronous transmission. Because of this, both SFH systems are shown to outperform MC-CDMA in terms of BER. This improvement, however, is at the expense of spectral widening.

In the third part of this thesis, base station polarization diversity, as another MAI reduction technique, is introduced to asynchronous MC-CDMA. The combined system is referred to as Pol/MC-CDMA. In this part, a new optimum combining technique, namely maximal signal-to-MAI ratio combining (MSMAIRC), is proposed to combine the signals in two base station antennas. With the application of MSMAIRC, and in the absence of additive white Gaussian noise (AWGN), the resulting signal-to-MAI ratio (SMAIR) is not only maximized but also independent of the cross polarization discrimination (XPD) and antenna angle. When AWGN is present, the performance of MSMAIRC is still affected by the XPD and antenna angle, but to a much lesser degree than the traditional maximal ratio combining (MRC). Furthermore, this research found that the BER performance of Pol/MC-CDMA can be further improved by changing the angle between the two receiving antennas. Hence the optimum antenna angles for both MSMAIRC and MRC are derived and their effects on the BER performance are compared. With the derived optimum antenna angle, the Pol/MC-CDMA system is able to obtain the lowest BER for a given XPD.
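The simulation shortcut described in the first part, replacing all interfering users with a single additive noise term whose power is Gamma-distributed, can be sketched as follows. The shape and scale values below are illustrative placeholders, not the thesis's derived parameters:

```python
import random

# Sketch of the simplified simulation idea: instead of simulating every
# interfering user, draw an aggregate MAI power from a Gamma distribution
# and add a zero-mean noise sample of that power to the single-user signal.

def received_sample(signal, shape=2.0, scale=0.05, rng=random):
    """One received sample under the single-user-plus-MAI approximation."""
    mai_power = rng.gammavariate(shape, scale)   # aggregate MAI power draw
    noise = rng.gauss(0.0, mai_power ** 0.5)     # zero-mean MAI sample
    return signal + noise
```

Because only one noise draw replaces the per-user interference sum, the per-sample cost no longer grows with the number of users.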

Relevance:

100.00%

Publisher:

Abstract:

We study model selection strategies based on penalized empirical loss minimization. We point out a tight relationship between error estimation and data-based complexity penalization: any good error estimate may be converted into a data-based penalty function and the performance of the estimate is governed by the quality of the error estimate. We consider several penalty functions, involving error estimates on independent test data, empirical VC dimension, empirical VC entropy, and margin-based quantities. We also consider the maximal difference between the error on the first half of the training data and the second half, and the expected maximal discrepancy, a closely related capacity estimate that can be calculated by Monte Carlo integration. Maximal discrepancy penalty functions are appealing for pattern classification problems, since their computation is equivalent to empirical risk minimization over the training data with some labels flipped.
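The label-flipping equivalence mentioned in the last sentence can be made concrete: with 0/1 labels, running empirical risk minimization on a sample whose second half has flipped labels yields the maximal discrepancy via D = 1 - 2e, where e is the minimal error on the flipped sample. A toy sketch over an arbitrary finite classifier set:

```python
# Maximal discrepancy via ERM with flipped labels (0/1 labels assumed).
# If a classifier has error rates e1, e2 on the two halves, its error on
# the half-flipped sample is (e1 + 1 - e2) / 2, so minimising it
# maximises e2 - e1, and D = max_f (e2 - e1) = 1 - 2 * min_error.

def empirical_error(classifier, data):
    return sum(classifier(x) != y for x, y in data) / len(data)

def maximal_discrepancy(data, classifiers):
    half = len(data) // 2
    flipped = data[:half] + [(x, 1 - y) for x, y in data[half:]]
    e = min(empirical_error(c, flipped) for c in classifiers)
    return 1 - 2 * e
```

Identical halves give zero discrepancy; halves a classifier can perfectly separate give discrepancy one.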

Relevance:

100.00%

Publisher:

Abstract:

In natural estuaries, scalar diffusion and dispersion are driven by turbulence. In the present study, detailed turbulence measurements were conducted in a small subtropical estuary with semi-diurnal tides under neap tide conditions. Three acoustic Doppler velocimeters were installed mid-estuary at fixed locations close together. The units were sampled simultaneously and continuously at relatively high frequency for 50 h. The results illustrated the influence of tidal forcing in the small estuary, although low-frequency longitudinal velocity oscillations were observed and believed to be induced by external resonance. The boundary shear stress data implied that the turbulent shear in the lower flow region was one order of magnitude larger than the boundary shear itself. The observation differed from turbulence data in a laboratory channel, but a key feature of natural estuary flow was the significant three-dimensional effects associated with strong secondary currents including transverse shear events. The velocity covariances and triple correlations, as well as the backscatter intensity and covariances, were calculated for the entire field study. The covariances of the longitudinal velocity component showed some tidal trend, while the covariances of the transverse horizontal velocity component exhibited trends that reflected changes in secondary current patterns between ebb and flood tides. The triple correlation data tended to show some differences between ebb and flood tides. The acoustic backscatter intensity data were characterised by large fluctuations during the entire study, with dimensionless fluctuation intensities I′b/Ib between 0.46 and 0.54. An unusual feature of the field study was some moderate rainfall prior to and during the first part of the sampling period. Visual observations showed some surface scars and marked channels, while some mini transient fronts were observed.
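The velocity covariances reported above are computed from fluctuations about the record mean, e.g. the u'v' covariance that enters the Reynolds stress. A minimal sketch of that computation:

```python
# Covariance of two velocity records: mean of the product of the
# fluctuations u' = u - mean(u) and v' = v - mean(v).

def covariance(u, v):
    """Mean of u'v' for two equally long velocity records."""
    mu = sum(u) / len(u)
    mv = sum(v) / len(v)
    return sum((a - mu) * (b - mv) for a, b in zip(u, v)) / len(u)
```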

Relevance:

100.00%

Publisher:

Abstract:

This thesis presents the outcomes of a comprehensive research study undertaken to investigate the influence of rainfall and catchment characteristics on urban stormwater quality. The knowledge created is expected to contribute to a greater understanding of urban stormwater quality and thereby enhance the design of stormwater quality treatment systems. The research study was undertaken based on selected urban catchments in Gold Coast, Australia. The research methodology included field investigations, laboratory testing, computer modelling and data analysis. Both univariate and multivariate data analysis techniques were used to investigate the influence of rainfall and catchment characteristics on urban stormwater quality. The rainfall characteristics investigated included average rainfall intensity and rainfall duration, whilst catchment characteristics included land use, impervious area percentage, urban form and pervious area location. The catchment-scale data for the analysis was obtained from four residential catchments, including rainfall-runoff records, drainage network data, stormwater quality data and land use and land cover data. Pollutant build-up samples were collected from twelve road surfaces in residential, commercial and industrial land use areas. The relationships between rainfall characteristics, catchment characteristics and urban stormwater quality were investigated based on residential catchments and then extended to other land uses. Based on the influence rainfall characteristics exert on urban stormwater quality, rainfall events can be classified into three different types, namely, high average intensity-short duration (Type 1), high average intensity-long duration (Type 2) and low average intensity-long duration (Type 3). This is an innovative approach compared to conventional modelling, which does not commonly relate stormwater quality to rainfall characteristics.

Additionally, it was found that the threshold intensity for pollutant wash-off from urban catchments is much lower than for rural catchments. High average intensity-short duration rainfall events are cumulatively responsible for the generation of a major fraction of the annual pollutant load compared to the other rainfall event types. Additionally, rainfall events of less than 1-year ARI, such as 6-month ARI, should be considered for treatment design as they generate a significant fraction of the annual runoff volume and, by implication, a significant fraction of the pollutant load. This implies that stormwater treatment designs based on larger rainfall events would not be feasible in the context of cost-effectiveness, efficiency in treatment performance and possible savings in the land area needed. This also suggests that the simulation of long-term continuous rainfall events for stormwater treatment design may not be needed and that event-based simulations would be adequate. The investigations into the relationship between catchment characteristics and urban stormwater quality found that, other than conventional catchment characteristics such as land use and impervious area percentage, other catchment characteristics such as urban form and pervious area location also play important roles in influencing urban stormwater quality. These outcomes point to the fact that the conventional modelling approach in the design of stormwater quality treatment systems, which is commonly based on land use and impervious area percentage, would be inadequate. It was also noted that small uniformly urbanised areas within a larger mixed catchment produce relatively lower variations in stormwater quality and, as expected, lower runoff volume, with the opposite being the case for large mixed-use urbanised catchments. Therefore, a decentralised approach to water quality treatment would be more effective than an "end-of-pipe" approach.

The investigation of pollutant build-up on different land uses showed that pollutant build-up characteristics vary even within the same land use. Therefore, the conventional approach in stormwater quality modelling, which is based solely on land use, may prove to be inappropriate. Industrial land use has relatively higher variability in maximum pollutant build-up, build-up rate and particle size distribution than the other two land uses. However, commercial and residential land uses had relatively higher variations in nutrient and organic carbon build-up. Additionally, it was found that particle size distribution had a relatively higher variability for all three land uses compared to the other build-up parameters. The high variability in particle size distribution for all land uses illustrates the dissimilarities associated with the fine and coarse particle size fractions even within the same land use, and hence the variations in stormwater quality in relation to pollutants adsorbing to different sizes of particles.
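The three-way event typology above can be expressed as a simple classification rule. The intensity and duration cut-offs below are illustrative placeholders, not the thresholds derived in the thesis:

```python
# Three-way rainfall-event typology sketch. The cut-offs are assumed
# values for illustration; the thesis derives its own thresholds.

def rainfall_type(avg_intensity_mm_h, duration_h,
                  intensity_cut=20.0, duration_cut=1.0):
    high_i = avg_intensity_mm_h >= intensity_cut
    long_d = duration_h >= duration_cut
    if high_i and not long_d:
        return "Type 1"    # high average intensity, short duration
    if high_i and long_d:
        return "Type 2"    # high average intensity, long duration
    if not high_i and long_d:
        return "Type 3"    # low average intensity, long duration
    return "unclassified"  # low intensity, short duration
```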

Relevance:

100.00%

Publisher:

Abstract:

The discovery of protein variation is an important strategy in disease diagnosis within the biological sciences. The current benchmark for elucidating information from multiple biological variables is the so-called "omics" disciplines of the biological sciences. Such variability is uncovered by implementation of multivariable data mining techniques, which come under two primary categories: machine learning strategies and statistically based approaches. Typically, proteomic studies can produce hundreds or thousands of variables, p, per observation, n, depending on the analytical platform or method employed to generate the data. Many classification methods are limited by an n≪p constraint and, as such, require pre-treatment to reduce the dimensionality prior to classification. Recently, machine learning techniques have gained popularity in the field for their ability to successfully classify unknown samples. One limitation of such methods is the lack of a functional model allowing meaningful interpretation of results in terms of the features used for classification. This is a problem that might be solved using a statistical model-based approach, where not only is the importance of each individual protein explicit, but the proteins are also combined into a readily interpretable classification rule without relying on a black-box approach. Here we incorporate the statistical dimension reduction techniques Partial Least Squares (PLS) and Principal Components Analysis (PCA), followed by both statistical and machine learning classification methods, and compare them to a popular machine learning technique, Support Vector Machines (SVM). Both PLS and SVM demonstrate strong utility for proteomic classification problems.
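The dimension-reduction step described above, projecting n≪p proteomic profiles onto a few components before classification, can be sketched with PCA via the SVD (synthetic data; PLS, which also uses the class labels, is omitted for brevity):

```python
import numpy as np

# PCA score computation for an n x p matrix with p >> n: centre the
# columns, take the SVD, and keep the scores on the first k components.
# A classifier is then trained on the k-column score matrix instead of
# the original p features.

def pca_scores(X, k):
    """Scores of X (n x p) on its first k principal components."""
    Xc = X - X.mean(axis=0)                       # centre each feature
    U, s, _ = np.linalg.svd(Xc, full_matrices=False)
    return U[:, :k] * s[:k]                       # component scores
```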

Relevance:

100.00%

Publisher:

Abstract:

Chatrooms, for example Internet Relay Chat, are generally multi-user, multi-channel and multi-server chat systems which run over the Internet and provide a protocol for real-time text-based conferencing between users all over the world. While a well-trained human observer is able to understand who is chatting with whom, there are no efficient and accurate automated tools to determine the groups of users conversing with each other. A precursor to analysing evolving cyber-social phenomena is to first determine what the conversations are and which groups of chatters are involved in each conversation. We consider this problem in this paper. We propose an algorithm to discover all groups of users that are engaged in conversation. Our algorithm is based on a statistical model of a chatroom that is founded on our experience with real chatrooms. Our approach does not require any semantic analysis of the conversations; rather, it is based purely on the statistical information contained in the sequence of posts. We improve the accuracy by applying some graph algorithms to clean the statistical information. We present some experimental results which indicate that one can automatically determine the conversing groups in a chatroom, purely on the basis of statistical analysis.
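One plausible reading of the purely statistical approach, offered here only as an illustrative sketch and not as the paper's actual algorithm, is to accumulate affinity between users who post close together in the sequence, threshold the resulting graph, and read conversations off its connected components:

```python
from collections import defaultdict

# Illustrative sketch: no semantic analysis, only post order. Users whose
# posts fall within `window` positions of each other gain affinity; edges
# with weight >= threshold are kept; connected components are reported as
# conversational groups. Window/threshold values are assumptions.

def conversation_groups(posts, window=3, threshold=2):
    """posts: ordered list of user names, one per post."""
    affinity = defaultdict(int)
    for i, u in enumerate(posts):
        for v in posts[max(0, i - window):i]:
            if v != u:
                affinity[frozenset((u, v))] += 1
    adj = defaultdict(set)                 # thresholded affinity graph
    for pair, w in affinity.items():
        if w >= threshold:
            a, b = tuple(pair)
            adj[a].add(b)
            adj[b].add(a)
    seen, groups = set(), []
    for user in set(posts):                # connected components (DFS)
        if user in seen:
            continue
        stack, comp = [user], set()
        while stack:
            n = stack.pop()
            if n in comp:
                continue
            comp.add(n)
            stack.extend(adj[n] - comp)
        seen |= comp
        groups.append(comp)
    return groups
```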

Relevance:

100.00%

Publisher:

Abstract:

The Bus Rapid Transit (BRT) station is the interface between passengers and services. The station is crucial to line operation, as it is typically the only location where buses can pass each other. Congestion may occur here when buses maneuvering into and out of the platform lane interfere with bus flow, or when a queue of buses forms upstream of the platform lane, blocking the passing lane. Further, some systems include operation where express buses do not stop at the station, resulting in a proportion of non-stopping buses. It is important to understand the operation of the station under this type of operation and its effect on BRT line capacity. This study uses microscopic traffic simulation modeling to treat the BRT station operation and to analyze the relationship between station bus capacity and BRT line bus capacity. First, a simulation model is developed for the limit-state scenario, and then a statistical model is defined and calibrated for a specified range of controlled scenarios of dwell time characteristics. A field survey was conducted to verify parameters such as dwell time, clearance time and the coefficient of variation of dwell time in order to obtain the relevant station bus capacity. The proposed model for BRT bus capacity provides a better understanding of BRT line capacity and is useful to transit authorities in BRT planning, design and operation.
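For context, the conventional analytic baseline that such a simulation approach refines is the TCQSM-style loading-area capacity formula, in which the coefficient of variation of dwell time enters through an operating margin. A hedged sketch (the parameter values in the usage are illustrative, not the field-survey results):

```python
# TCQSM-style loading-area bus capacity for an unsignalised station:
#   B = 3600 / (t_c + t_d + Z * c_v * t_d)
# where t_d is mean dwell time (s), t_c clearance time (s), c_v the
# coefficient of variation of dwell time, and Z the standard normal
# value for the accepted failure (queue-forming) rate.

def station_bus_capacity(dwell_s, clearance_s, cv_dwell, z=1.28):
    """Buses per hour for one loading area; z=1.28 ~ 10% failure rate."""
    return 3600.0 / (clearance_s + dwell_s + z * cv_dwell * dwell_s)
```

Higher dwell-time variability (larger c_v) inflates the operating margin and lowers capacity, which is exactly the sensitivity the calibrated statistical model captures.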

Relevance:

100.00%

Publisher:

Abstract:

Electricity network investment and asset management require accurate estimation of future demand in energy consumption within specified service areas. For this purpose, simple models are typically developed to predict future trends in electricity consumption using various methods and assumptions. This paper presents a statistical model to predict electricity consumption in the residential sector at the Census Collection District (CCD) level over the state of New South Wales, Australia, based on spatial building and household characteristics. Residential household demographic and building data from the Australian Bureau of Statistics (ABS) and actual electricity consumption data from electricity companies are merged for 74% of the 12,000 CCDs in the state. Eighty percent of the merged dataset is randomly set aside to establish the model using regression analysis, and the remaining 20% is used to independently test the accuracy of model prediction against actual consumption. In 90% of the cases, the predicted consumption is shown to be within 5 kWh per dwelling per day of actual values, with an overall state accuracy of -1.15%. Given a future scenario with a shift in climate zone and a growth in population, the model is used to identify the geographical or service areas that are most likely to have increased electricity consumption. Such geographical representation can be of great benefit when assessing alternatives to the centralised generation of energy; having such a model gives a quantifiable method for selecting the 'most' appropriate system when a review or upgrade of the network infrastructure is required.
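The 80/20 fit-and-validate workflow described above can be sketched with ordinary least squares on synthetic stand-in data (the predictors and coefficients below are invented for illustration; the actual model uses ABS demographic and building characteristics merged with utility consumption records):

```python
import numpy as np

# Sketch of the workflow: fit consumption per dwelling against simple
# area characteristics on 80% of records, then check what fraction of
# held-out predictions fall within 5 kWh/dwelling/day of actuals.

rng = np.random.default_rng(42)
n = 500
occupants = rng.uniform(1, 5, n)        # illustrative predictor
floor_area = rng.uniform(60, 250, n)    # illustrative predictor (m^2)
# synthetic "true" relationship plus noise stands in for the merged data
kwh_per_day = 2.0 + 3.0 * occupants + 0.05 * floor_area + rng.normal(0, 1, n)

X = np.column_stack([np.ones(n), occupants, floor_area])
split = int(0.8 * n)                    # 80/20 train-test split
beta, *_ = np.linalg.lstsq(X[:split], kwh_per_day[:split], rcond=None)
pred = X[split:] @ beta
within_5 = np.mean(np.abs(pred - kwh_per_day[split:]) < 5.0)
```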