Abstract:
This thesis presents the outcomes of a comprehensive research study undertaken to investigate the influence of rainfall and catchment characteristics on urban stormwater quality. The knowledge created is expected to contribute to a greater understanding of urban stormwater quality and thereby enhance the design of stormwater quality treatment systems. The research study was based on selected urban catchments in Gold Coast, Australia. The research methodology included field investigations, laboratory testing, computer modelling and data analysis. Both univariate and multivariate data analysis techniques were used to investigate the influence of rainfall and catchment characteristics on urban stormwater quality. The rainfall characteristics investigated included average rainfall intensity and rainfall duration, whilst catchment characteristics included land use, impervious area percentage, urban form and pervious area location. The catchment-scale data for the analysis was obtained from four residential catchments, including rainfall-runoff records, drainage network data, stormwater quality data and land use and land cover data. Pollutant build-up samples were collected from twelve road surfaces in residential, commercial and industrial land use areas. The relationships between rainfall characteristics, catchment characteristics and urban stormwater quality were investigated based on residential catchments and then extended to other land uses. Based on the influence rainfall characteristics exert on urban stormwater quality, rainfall events can be classified into three different types, namely, high average intensity-short duration (Type 1), high average intensity-long duration (Type 2) and low average intensity-long duration (Type 3). This provides an innovative alternative to conventional modelling, which does not commonly relate stormwater quality to rainfall characteristics.
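The three-way event typology described above can be sketched as a simple decision rule. The intensity and duration thresholds below are illustrative assumptions for the sketch, not values taken from the thesis:

```python
def classify_rainfall_event(avg_intensity_mm_h, duration_min,
                            intensity_threshold=20.0, duration_threshold=60.0):
    """Classify a rainfall event into the three types described in the thesis.

    The thresholds separating 'high' from 'low' average intensity and
    'short' from 'long' duration are hypothetical placeholders.
    """
    high_intensity = avg_intensity_mm_h >= intensity_threshold
    long_duration = duration_min >= duration_threshold
    if high_intensity and not long_duration:
        return "Type 1"  # high average intensity, short duration
    if high_intensity and long_duration:
        return "Type 2"  # high average intensity, long duration
    if not high_intensity and long_duration:
        return "Type 3"  # low average intensity, long duration
    return "Unclassified"  # low intensity, short duration events fall outside the typology
```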
Additionally, it was found that the threshold intensity for pollutant wash-off from urban catchments is much lower than for rural catchments. High average intensity-short duration rainfall events are cumulatively responsible for the generation of a major fraction of the annual pollutant load compared to the other rainfall event types. Additionally, rainfall events of less than 1 year ARI, such as 6-month ARI, should be considered for treatment design as they generate a significant fraction of the annual runoff volume and, by implication, a significant fraction of the pollutant load. This implies that stormwater treatment designs based on larger rainfall events would not be feasible in terms of cost-effectiveness, treatment performance and possible savings in the land area needed. This also suggests that the simulation of long-term continuous rainfall events for stormwater treatment design may not be needed and that event-based simulations would be adequate. The investigations into the relationship between catchment characteristics and urban stormwater quality found that, in addition to conventional catchment characteristics such as land use and impervious area percentage, other catchment characteristics such as urban form and pervious area location also play important roles in influencing urban stormwater quality. These outcomes point to the fact that the conventional modelling approach in the design of stormwater quality treatment systems, which is commonly based on land use and impervious area percentage alone, would be inadequate. It was also noted that small, uniformly urbanised areas within a larger mixed catchment produce relatively lower variations in stormwater quality and, as expected, lower runoff volume, with the opposite being the case for large mixed-use urbanised catchments. Therefore, a decentralised approach to water quality treatment would be more effective than an "end-of-pipe" approach.
The investigation of pollutant build-up on different land uses showed that pollutant build-up characteristics vary even within the same land use. Therefore, the conventional approach in stormwater quality modelling, which is based solely on land use, may prove to be inappropriate. Industrial land use showed relatively higher variability in maximum pollutant build-up, build-up rate and particle size distribution than the other two land uses, whereas commercial and residential land uses showed relatively higher variations in nutrient and organic carbon build-up. Additionally, it was found that particle size distribution had relatively higher variability for all three land uses compared to the other build-up parameters. The high variability in particle size distribution for all land uses illustrates the dissimilarities between the fine and coarse particle size fractions even within the same land use, and hence the variations in stormwater quality arising from pollutants adsorbing to different sizes of particles.
Abstract:
Prevailing video adaptation solutions change the quality of the video uniformly throughout the whole frame in the bitrate adjustment process, while region-of-interest (ROI)-based solutions selectively retain the quality in the areas of the frame to which viewers are more likely to pay attention. ROI-based coding can improve perceptual quality and viewer satisfaction while trading off some bandwidth. However, there has so far been no comprehensive study measuring the bitrate vs. perceptual quality trade-off. This paper proposes an ROI detection scheme for videos, characterized by low computational complexity and robustness, and measures the bitrate vs. quality trade-off for ROI-based encoding using a state-of-the-art H.264/AVC encoder to justify the viability of this type of encoding method. The results from the subjective quality test reveal that ROI-based encoding achieves a significant perceptual quality improvement over encoding with uniform quality at the cost of slightly more bits. Based on the bitrate measurements and subjective quality assessments, bitrate and perceptual quality estimation models for non-scalable ROI-based video coding (AVC) are developed, which are found to be similar to the models for scalable video coding (SVC).
Abstract:
In 1999, Richards compared the accuracy of commercially available motion capture systems commonly used in biomechanics. Richards identified that in static tests the optical motion capture systems generally produced RMS errors of less than 1.0 mm. During dynamic tests, the RMS error increased to up to 4.2 mm in some systems. In the last 12 years, motion capture systems have continued to evolve and now include high-resolution CCD or CMOS image sensors, wireless communication, and high full-frame sampling frequencies. In addition to hardware advances, there have also been a number of advances in software, which include improved calibration and tracking algorithms, real-time data streaming, and the introduction of the C3D standard. These advances have allowed the system manufacturers to maintain a high retail price in the name of advancement. In areas such as gait analysis and ergonomics, many of the advanced features such as high-resolution image sensors and high sampling frequencies are not required due to the nature of the tasks often investigated. Recently, Natural Point introduced low cost cameras which, on face value, appear to be suitable as, at the very least, a high-quality teaching tool in biomechanics, and possibly even a research tool when coupled with the correct calibration and tracking software. The aim of the study was therefore to compare both the linear accuracy and the quality of angular kinematics from a typical high-end motion capture system and a low cost system during a simple task.
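The accuracy figures cited from Richards (1999) are RMS errors of measured marker positions against reference positions. A minimal sketch of the generic calculation (not the study's own analysis code):

```python
import math

def rms_error(measured, reference):
    """RMS error (same units as the inputs, e.g. mm) between a series of
    measured marker coordinates and their reference values."""
    n = len(measured)
    return math.sqrt(sum((m - r) ** 2 for m, r in zip(measured, reference)) / n)
```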
Abstract:
Background: Although physical activity is associated with health-related quality of life (HRQL), the nature of the dose-response relationship remains unclear. This study examined the concurrent and prospective dose-response relationships between total physical activity (TPA) and (only) walking with HRQL in two age cohorts of women. Methods: Participants were 10,698 women born in 1946-1951 and 7,646 born in 1921-1926, who completed three mailed surveys for the Australian Longitudinal Study on Women's Health. They reported weekly TPA minutes (sum of walking, moderate, and vigorous minutes). HRQL was measured with the Medical Outcomes Study Short-Form 36 Health Status Survey (SF-36). Linear mixed models, adjusted for socio-demographic and health-related variables, were used to examine associations between TPA level (none, very low, low, intermediate, sufficient, high, and very high) and SF-36 scores. For women who reported walking as their only physical activity, associations between walking and SF-36 scores were also examined. Results: Curvilinear trends were observed between TPA and walking with SF-36 scores. Concurrently, HRQL scores increased significantly with increasing TPA and walking in both cohorts, with increases less marked above sufficient activity levels. Prospectively, associations were attenuated, although significant and meaningful improvements in physical functioning and vitality were observed across most TPA and walking categories above the low category. Conclusion: For women in their 50s-80s without clinical depression, greater amounts of TPA are associated with better current and future HRQL, particularly physical functioning and vitality. Even when walking is their only activity, women, particularly those in their 70s-80s, have better health-related quality of life.
Abstract:
Existing secure software development principles tend to focus on coding vulnerabilities, such as buffer or integer overflows, that apply to individual program statements, or issues associated with the run-time environment, such as component isolation. Here we instead consider software security from the perspective of potential information flow through a program’s object-oriented module structure. In particular, we define a set of quantifiable "security metrics" which allow programmers to quickly and easily assess the overall security of a given source code program or object-oriented design. Although measuring quality attributes of object-oriented programs for properties such as maintainability and performance has been well covered in the literature, metrics which measure the quality of information security have received little attention. Moreover, existing security-relevant metrics assess a system either at a very high level, i.e., the whole system, or at a fine level of granularity, i.e., with respect to individual statements. These approaches make it hard and expensive to recognise a secure system at an early stage of development. Instead, our security metrics are based on well-established compositional properties of object-oriented programs (i.e., data encapsulation, cohesion, coupling, composition, extensibility, inheritance and design size), combined with data flow analysis principles that trace potential information flow between high- and low-security system variables. We first define a set of metrics to assess the security quality of a given object-oriented system based on its design artifacts, allowing defects to be detected at an early stage of development. We then extend these metrics to produce a second set applicable to object-oriented program source code.
The resulting metrics make it easy to compare the relative security of functionally equivalent system designs or source code programs so that, for instance, the security of two different revisions of the same system can be compared directly. This capability is further used to study the impact of specific refactoring rules on system security more generally, at both the design and code levels. By measuring the relative security of various programs refactored using different rules, we thus provide guidelines for the safe application of refactoring steps to security-critical programs. Finally, to make it easy and efficient to measure a system design or program’s security, we have also developed a stand-alone software tool which automatically analyses and measures the security of UML designs and Java program code. The tool’s capabilities are demonstrated by applying it to a number of security-critical system designs and Java programs. Notably, the validity of the metrics is demonstrated empirically through measurements that confirm our expectation that program security typically improves as bugs are fixed, but worsens as new functionality is added.
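As a toy illustration of the kind of design-level measurement described here (a hypothetical metric sketched for this abstract, not one of the thesis's actual metric definitions), one could count what fraction of high-security attributes a design exposes through public accessors:

```python
def attribute_exposure_ratio(classes):
    """Illustrative design-level security metric: the fraction of
    high-security ("classified") attributes exposed through public
    accessor methods. Lower is better. The real metric suite in the
    thesis also covers cohesion, coupling, inheritance and design size.

    classes: mapping of class name -> dict with 'classified_attrs' (set)
    and 'public_accessors' (set of attribute names reachable publicly).
    """
    classified = 0
    exposed = 0
    for info in classes.values():
        classified += len(info['classified_attrs'])
        # an attribute counts as exposed if a public accessor returns it
        exposed += len(info['classified_attrs'] & info['public_accessors'])
    if classified == 0:
        return 0.0  # no classified data, nothing can leak
    return exposed / classified
```

Comparing this ratio for two functionally equivalent designs would mirror the revision-to-revision comparison the abstract describes, albeit for a single simplistic metric.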
Abstract:
Voltage drop and rise at network peak and off-peak periods, along with voltage unbalance, are the major power quality problems in low voltage distribution networks. Usually, utilities adjust transformer tap changers as a solution to voltage drop, and try to distribute the loads equally as a solution to the network voltage unbalance problem. On the other hand, the ever-increasing energy demand, along with the necessity of cost reduction and higher reliability requirements, is driving modern power systems towards Distributed Generation (DG) units. These can take the form of small rooftop photovoltaic cells (PV), Plug-in Electric Vehicles (PEVs) or Micro Grids (MGs). Rooftop PVs, typically with power levels ranging from 1-5 kW and installed by householders, are gaining popularity due to their financial benefits for the householders. PEVs will also soon emerge in residential distribution networks; they behave as large residential loads while being charged, while later generations are also expected to support the network as small DG units that transfer the energy stored in their batteries back into the grid. Furthermore, the MG, a cluster of loads and several DG units such as diesel generators, PVs, fuel cells and batteries, has recently been introduced to distribution networks. Voltage unbalance in the network can be increased by the uncertainties in the random connection points of PVs and PEVs, their nominal capacities and their times of operation. Therefore, it is of high interest to investigate voltage unbalance in these networks as a result of MG, PV and PEV integration into low voltage networks. In addition, the network might experience non-standard voltage drop due to high penetration of PEVs being charged at night, or non-standard voltage rise due to high penetration of PVs and PEVs generating electricity back into the grid during network off-peak periods.
In this thesis, a voltage unbalance sensitivity analysis and stochastic evaluation are carried out for PVs installed by householders, considering their installation point, nominal capacity and penetration level as different uncertainties. A similar analysis is carried out for PEV penetration in the network in two different modes: grid-to-vehicle and vehicle-to-grid. Furthermore, conventional methods for improving voltage unbalance within these networks are discussed. This is followed by proposing new and efficient methods for voltage profile improvement at network peak and off-peak periods and for voltage unbalance reduction. In addition, voltage unbalance reduction is investigated for MGs, and new improvement methods are proposed and applied to the MG test bed planned to be established at Queensland University of Technology (QUT). MATLAB and PSCAD/EMTDC simulation software are used for verification of the analyses and the proposals.
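The quantity under study, voltage unbalance, is conventionally expressed as the voltage unbalance factor (VUF) from symmetrical components. This is a standard textbook calculation, included to make the quantity concrete; it is not code from the thesis:

```python
import cmath
import math

def voltage_unbalance_factor(va, vb, vc):
    """Voltage Unbalance Factor (VUF, %) from three phase-voltage phasors,
    using symmetrical components: |negative-sequence| / |positive-sequence|
    x 100 (the definition used in IEC-style unbalance assessment)."""
    a = cmath.exp(2j * math.pi / 3)            # 120-degree rotation operator
    v_pos = (va + a * vb + a * a * vc) / 3     # positive-sequence component
    v_neg = (va + a * a * vb + a * vc) / 3     # negative-sequence component
    return 100.0 * abs(v_neg) / abs(v_pos)
```

For a perfectly balanced three-phase set the negative-sequence component vanishes and the VUF is zero; unequal phase magnitudes (e.g. from single-phase rooftop PV or PEV charging) raise it.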
Abstract:
This Exceptional Teachers for Disadvantaged Schools (ETDS) project sets out to design a new model of Australian teacher education responding to recent demands for quality education in low SES and disadvantaged schools. The project moves teacher education away from the ‘missionary’ (Larabee, 2010) or deficit (Comber and Kamler, 2004; Flessa, 2007) approaches, towards a focus on notions of quality and academic excellence. Rice (2008, p. 1) argues for a need to place more of the “very best teachers into the most challenging schools”, yet the problem is not merely one of training more teachers, for disadvantaged schools already receive disproportionate numbers of beginning teachers (Connell, 1994; Vickers & Ferfolja, 2006). Rather, Grossman and Loeb (2010, p. 245) argue the problem centers on the common practice of “[p]lacing the least experienced teachers with the most needy students”. This paper reports on the first-year trial of the project. The ETDS project is, at present, the only mainstream Australian teacher education model that targets cohorts of academically high-achieving pre-service teachers with the overt aim of preparing graduates of the program to teach in disadvantaged schools. At the end of its first year, the ETDS program graduated 20 new teachers, each of whom had, over the previous 18 months, engaged with a specialized curriculum and carefully monitored/scaffolded practicum placements in disadvantaged schools around Brisbane, Australia.
Abstract:
In this paper we extend the ideas of Brugnano, Iavernaro and Trigiante in their development of HBVM($s,r$) methods to construct symplectic Runge-Kutta methods for all values of $s$ and $r$ with $s\geq r$. However, these methods do not see the dramatic performance improvement that HBVMs can attain. Nevertheless, in the case of additive stochastic Hamiltonian problems an extension of these ideas, which requires the simulation of an independent Wiener process at each stage of a Runge-Kutta method, leads to methods that have very favourable properties. These ideas are illustrated by some simple numerical tests for the modified midpoint rule.
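For an additive-noise Hamiltonian system $\mathrm{d}Y = f(Y)\,\mathrm{d}t + \sigma\,\mathrm{d}W(t)$, the stochastic implicit midpoint rule, a close relative of the modified midpoint rule tested here (the paper's exact scheme, with an independent Wiener process per stage, may differ), can be written as:

```latex
Y_{n+1} = Y_n + h\, f\!\left(\frac{Y_n + Y_{n+1}}{2}\right) + \sigma\, \Delta W_n,
\qquad \Delta W_n \sim \mathcal{N}(0,\, h).
```

For additive noise this one-stage implicit scheme is known to preserve the symplectic structure of the underlying Hamiltonian flow, which is the favourable property the numerical tests illustrate.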
Abstract:
Effective streaming of video can be achieved by providing more bits to the most important region in the frame at the cost of reduced bits in the less important regions. This strategy can be beneficial for delivering high quality videos to mobile devices, especially as the available bandwidth is usually low and limited. While state-of-the-art video codecs such as H.264 may have been optimised for perceived quality, it is hypothesised that users will give more attention to interesting regions or objects when watching videos. Therefore, giving a higher quality to the region of interest (ROI) while reducing the quality of other areas may improve the overall perceived quality without necessarily increasing the bitrate. In this paper, the impact of ROI-based encoded video on perceived quality is investigated by conducting a user study for various target bitrates. The results from the user study demonstrate that ROI-based video coding has superior perceived quality compared to normally encoded video at the same bitrate in the lower bitrate range.
Abstract:
The increasing popularity of video consumption on mobile devices requires an effective video coding strategy. To cope with diverse communication networks, video services often need to maintain sustainable quality when the available bandwidth is limited. One strategy for visually-optimised video adaptation is to implement region-of-interest (ROI) based scalability, whereby important regions can be encoded at a higher quality while maintaining sufficient quality for the rest of the frame. The result is an improved perceived quality at the same bit rate as normal encoding, which is particularly obvious in the lower bit rate range. However, because of the difficulty of predicting the ROI accurately, there has been limited research and development of ROI-based video coding for general videos. In this paper, the phase spectrum of quaternion Fourier transform (PQFT) method is adopted to determine the ROI. To improve the results of ROI detection, the saliency map from the PQFT is augmented with maps created from high-level knowledge of factors that are known to attract human attention. Hence, maps that locate faces and emphasise the centre of the screen are used in combination with the saliency map to determine the ROI. The contribution of this paper lies in the automatic ROI detection technique for coding low bit rate videos, which includes an ROI prioritisation technique to give different levels of encoding quality to multiple ROIs, and the evaluation of the proposed automatic ROI detection, which is shown to perform closely to human-identified ROI based on eye fixation data.
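A single-channel simplification of the phase-spectrum saliency idea can be sketched with a plain 2-D FFT. Note the PQFT method named above uses a quaternion FFT over combined colour and motion channels, and this sketch also omits the face-location and centre-weighting maps:

```python
import numpy as np

def phase_spectrum_saliency(frame):
    """Grayscale phase-spectrum saliency: reconstruct the frame from its
    Fourier phase alone; the residual energy highlights salient structure.
    A one-channel simplification of the quaternion PQFT approach."""
    f = np.fft.fft2(frame.astype(float))
    phase_only = np.exp(1j * np.angle(f))      # discard magnitude, keep phase
    recon = np.fft.ifft2(phase_only)
    saliency = np.abs(recon) ** 2              # energy of phase-only reconstruction
    # light Gaussian-like smoothing with a small separable kernel
    k = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 16.0
    pad = np.pad(saliency, 1, mode='edge')
    h, w = saliency.shape
    smoothed = sum(pad[i:i + h, j:j + w] * k[i, j]
                   for i in range(3) for j in range(3))
    return smoothed / smoothed.max()           # normalise to [0, 1]
```

Thresholding the normalised map (and merging it with face and centre-bias maps, as the paper does) would then yield candidate ROIs for the encoder to prioritise.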
Abstract:
Frequent exposure to ultrafine particles (UFP) is associated with detrimental effects on cardiopulmonary function and health. UFP dose, and therefore the associated health risk, is a function of exposure frequency, duration and magnitude (and therefore also proximity to a UFP emission source). Bicycle commuters using on-road routes during peak traffic times share a microenvironment with high levels of motorised traffic, a major UFP emission source. Inhaled particle counts were measured along popular pre-identified bicycle commute route alternatives of low (LOW) and high (HIGH) motorised traffic to the same inner-city destination at peak commute traffic times. During the commute, real-time particle number concentration (PNC; mostly in the UFP range) and particle diameter (PD), heart and respiratory rate, geographical location, and meteorological variables were measured. To determine inhaled particle counts, ventilation rate was calculated from heart-rate-ventilation associations produced from periodic exercise testing. Total mean PNC of LOW (compared to HIGH) was reduced (1.56 × 10^4 ± 0.38 × 10^4 versus 3.06 × 10^4 ± 0.53 × 10^4 ppcc; p = 0.012). Total estimated ventilation rate did not vary significantly between LOW and HIGH (43 ± 5 versus 46 ± 9 L·min^-1; p = 0.136); however, due to the difference in total mean PNC, accumulated inhaled particle counts were 48% lower in LOW compared to HIGH (7.6 × 10^8 ± 1.5 × 10^8 versus 14.6 × 10^8 ± 1.8 × 10^8; p = 0.003). For bicycle commuting at peak morning commute times, inhaled particle counts, and therefore cardiopulmonary health risk, may be substantially reduced by decreasing exposure to motorised traffic, which should be considered by both bicycle commuters and urban planners.
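The accumulated dose estimate combines the measured PNC with the estimated ventilation rate over time. A simplified sketch of that calculation follows; the study's own dose model may include additional factors (e.g. particle deposition fractions), so treat this as illustrative only:

```python
def inhaled_particle_count(pnc_ppcc, ventilation_l_min, interval_min):
    """Accumulate inhaled particle count over a commute.

    pnc_ppcc:          particle number concentrations (particles per cm^3),
                       one sample per measurement interval
    ventilation_l_min: matching ventilation-rate estimates (L/min)
    interval_min:      length of each measurement interval (minutes)

    Uses 1 L = 1000 cm^3. A simplified dose sketch, not the authors' exact model.
    """
    total = 0.0
    for pnc, vent in zip(pnc_ppcc, ventilation_l_min):
        # particles inhaled during this interval = concentration x air volume breathed
        total += pnc * vent * 1000.0 * interval_min
    return total
```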
Abstract:
A simple and effective down-sample algorithm, the Peak-Hold-Down-Sample (PHDS) algorithm, is developed in this paper to enable rapid and efficient data transfer in remote condition monitoring applications. The algorithm is particularly useful for high frequency Condition Monitoring (CM) techniques and for low speed machine applications, since the combination of high sampling frequency and low rotating speed generally leads to unwieldy data sizes. The effectiveness of the algorithm was evaluated and tested on four sets of data in the study. One set was extracted from the condition monitoring signal of a practical industry application. Another was acquired from a low speed machine test rig in the laboratory. The remaining two sets were computer-simulated bearing defect signals having either a single or multiple bearing defects. The results disclose that the PHDS algorithm can substantially reduce the size of data while preserving the critical bearing defect information for all the data sets used in this work, even when a large down-sample ratio was used (i.e., 500 times down-sampled). In contrast, the down-sample process using the existing normal down-sample technique in signal processing eliminates useful and critical information such as bearing defect frequencies in a signal when the same down-sample ratio is employed. Noise and artificial frequency components were also induced by the normal down-sample technique, thus limiting its usefulness for machine condition monitoring applications.
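The peak-hold idea can be sketched in a few lines: keep only the largest-magnitude sample from each block of `ratio` consecutive samples, so that short bearing-defect impulses survive the down-sampling instead of being discarded as they would be by ordinary decimation. This is a minimal sketch of the concept; the published PHDS algorithm may differ in detail:

```python
def peak_hold_downsample(signal, ratio):
    """Down-sample by keeping the peak (largest absolute value, sign
    preserved) of each consecutive block of `ratio` samples."""
    out = []
    for start in range(0, len(signal), ratio):
        block = signal[start:start + ratio]
        # keep the sample with the largest magnitude so short transient
        # bearing-defect impulses survive the down-sampling
        out.append(max(block, key=abs))
    return out
```

Ordinary decimation (`signal[::ratio]`) would keep one arbitrary sample per block and can miss impulses entirely at a 500:1 ratio, which is the failure mode the paper reports for the normal down-sample technique.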
Abstract:
Recent attention in education within many western contexts has focused on improved outcomes for students, with a particular focus on closing the gap between those who come from disadvantaged backgrounds and the rest of the student population. Much of this attention has supported a set of simplistic solutions to improving scores on high stakes standardized tests. The collateral damage (Nichols & Berliner, 2007) of such responses includes a narrowing of the curriculum, plateaus in gain scores on the tests, and unproductive blame games aimed by the media and politicians at teachers and communities (Nichols & Berliner, 2007; Snyder, 2008). Alternative approaches to improving the quality and equity of schooling remain viable alternatives to these measures. As an example, in a recent study of school literacy reform in low SES schools, Luke, Woods and Dooley (2011) argued for an increase in the substantive content and intellectual quality of the curriculum as a necessary means to re-engaging middle school students, improving outcomes of schooling and achieving a high quality, high equity system. The MediaClub is an afterschool program for students in years 4 to 7 (9-12 years old) at a primary school in a low SES area of a large Australian city. It is run as part of an Australian Research Council funded research project. The aim of the program has been to provide an opportunity for students to gain expertise in digital technologies and media literacies in an afterschool setting. It was hypothesized that this expertise might then be used to shift the ways of being literate that these students had to call on within classroom teaching and learning events. Each term, there is a different focus on digital media, and information and communication technology (ICT) activities in the MediaClub. The work detailed in this chapter relates to a robotics program presented as one of the modules within this afterschool setting.
As part of the program, the participants were challenged to find creative solutions to problems in a constructivist-learning environment.
Abstract:
Objective: To explore whether area-level socioeconomic position or the form of retail stream (conventional versus farmers’ market) is associated with differences in the price, availability, variety and quality of a range of fresh fruit and vegetables. Design: A multi-site cross-sectional pilot study of farmers’ markets, supermarkets and independent fruit and vegetable retailers. Each was surveyed to assess the price, availability, variety and quality of 15 fruit and 18 vegetable items. Setting: Retail outlets located in South-East Queensland. Subjects: Fifteen retail outlets were surveyed (five of each retail stream). Results: Average basket prices were not significantly different across the socioeconomic spectrum; however, prices in low socioeconomic areas were the cheapest. Availability, variety and quality did not differ across levels of socioeconomic position; however, the areas with the most socioeconomic disadvantage scored poorest for quality and variety. Supermarkets had significantly better fruit and vegetable availability than farmers’ markets; however, price, variety and quality scores were not different across retail streams. Results demonstrate a trend towards fruit and vegetable prices being more expensive at farmers’ markets, with the price of the fruit basket being significantly greater at the organic farmers’ market compared with the non-organic farmers’ markets. Conclusions: Neither area-level socioeconomic position nor the form of retail stream was significantly associated with differences in the availability, price, variety and quality of fruit and vegetables, except for availability, which was higher in supermarkets than farmers’ markets. Further research is needed to determine what role farmers’ markets can play in affecting fruit and vegetable intake.
Abstract:
Fourteen new complexes of the form cis-[RuIIX2(R2qpy2+)2]4+ (R2qpy2+ = a 4,4′:2′,2″:4″,4‴-quaterpyridinium ligand, X = Cl− or NCS−) have been prepared and isolated as their PF6− salts. Characterisation involved various techniques including 1H NMR spectroscopy and +electrospray or MALDI mass spectrometry. The UV–Vis spectra display intense intraligand π → π∗ absorptions, and also metal-to-ligand charge-transfer (MLCT) bands with two resolved maxima in the visible region. Red-shifts in the MLCT bands occur as the electron-withdrawing strength of the pyridinium groups increases, while replacing Cl− with NCS− causes blue-shifts. Cyclic voltammograms show quasi-reversible or reversible RuIII/II oxidation waves, and several ligand-based reductions that are irreversible. The variations in the redox potentials correlate with changes in the MLCT energies. A single-crystal X-ray structure has been obtained for a protonated form of a proligand salt, [(4-(CO2H)Ph)2qpyH3+][HSO4]3·3H2O. Time-dependent density functional theory calculations give adequate correlations with the experimental UV–Vis spectra for the two carboxylic acid-functionalised complexes in DMSO. Despite their attractive electronic absorption spectra, these dyes are relatively inefficient photosensitisers on electrodes coated with TiO2 or ZnO. These observations are attributed primarily to weak electronic coupling with the surfaces, since the DFT-derived LUMOs include no electron density near the carboxylic acid anchors.