Abstract:
An experimental programme in 2007 used three air-suspended heavy vehicles travelling over typical urban roads to determine whether dynamic axle-to-chassis forces could be reduced by using larger-than-standard diameter longitudinal air lines. This paper presents the methodology, interim analysis and partial results from that programme. Changes in dynamic measures derived from axle-to-chassis forces are presented and discussed for the baseline case of standard-sized longitudinal air lines versus the test case in which larger longitudinal air lines were fitted. This leads to conclusions regarding the possibility that dynamic loadings between heavy vehicle suspensions and chassis may be reduced by fitting larger longitudinal air lines to air-suspended heavy vehicles. Reductions in the shock and vibration loads on heavy vehicle suspension components could lead to lighter and more economical chassis and suspensions. This could, in turn, lead to reduced tare mass and increased payloads without an increase in gross vehicle mass.
Abstract:
For many decades, correlation and the power spectrum have been the primary tools for digital signal processing applications in the biomedical area. The information contained in the power spectrum is essentially that of the autocorrelation sequence, which is sufficient for a complete statistical description of Gaussian signals of known mean. However, there are practical situations where one needs to look beyond the autocorrelation of a signal to extract information regarding deviations from Gaussianity and the presence of phase relations. Higher order spectra, also known as polyspectra, are spectral representations of higher order statistics, i.e. moments and cumulants of third order and beyond. HOS (higher order statistics or higher order spectra) can detect deviations from linearity, stationarity or Gaussianity in a signal. Most biomedical signals are non-linear, non-stationary and non-Gaussian in nature, and it can therefore be more advantageous to analyze them with HOS than with second order correlations and power spectra. In this paper we discuss the application of HOS to different bio-signals. HOS methods of analysis are explained using a typical heart rate variability (HRV) signal, and applications to other signals are reviewed.
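As a point of reference only (not taken from the paper), the sketch below shows how a third-order cumulant and a simple direct bispectrum estimate might be computed for a zero-mean signal in Python with NumPy; the lag range, FFT length and biased normalisation are illustrative assumptions.

    import numpy as np

    def third_order_cumulant(x, max_lag):
        # Biased estimate of C3(t1, t2) = E[x(n) x(n+t1) x(n+t2)] for a zero-mean signal.
        x = np.asarray(x, dtype=float)
        x = x - x.mean()
        n = len(x)
        lags = range(-max_lag, max_lag + 1)
        c3 = np.zeros((len(lags), len(lags)))
        for i, t1 in enumerate(lags):
            for j, t2 in enumerate(lags):
                lo = max(0, -t1, -t2)
                hi = min(n, n - t1, n - t2)
                c3[i, j] = np.sum(x[lo:hi] * x[lo + t1:hi + t1] * x[lo + t2:hi + t2]) / n
        return c3

    def bispectrum(x, nfft=512):
        # Direct (single-segment) bispectrum estimate B(f1, f2) = X(f1) X(f2) conj(X(f1 + f2)).
        X = np.fft.fft(np.asarray(x, dtype=float) - np.mean(x), nfft)
        half = nfft // 2
        B = np.zeros((half, half), dtype=complex)
        for f1 in range(half):
            for f2 in range(half):
                B[f1, f2] = X[f1] * X[f2] * np.conj(X[(f1 + f2) % nfft])
        return B

    # A signal with quadratic phase coupling (components at 50, 80 and 50+80=130 bins)
    # produces a clear bispectral peak, which the power spectrum alone cannot reveal.
    n = 512
    t = np.arange(n)
    x = (np.cos(2 * np.pi * 50 * t / n) + np.cos(2 * np.pi * 80 * t / n)
         + np.cos(2 * np.pi * 130 * t / n))
    B = bispectrum(x, nfft=n)
    print(np.unravel_index(np.abs(B).argmax(), B.shape))  # peak near bins (50, 80)

In practice, segment averaging and normalisation (e.g. the bicoherence) would be used to reduce the variance of such estimates.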
Abstract:
Daylighting in tropical and sub-tropical climates presents a unique challenge that is generally not well understood by designers. In a sub-tropical region such as Brisbane, Australia, the majority of the year comprises sunny, clear skies with few overcast days, and as a consequence windows can easily become sources of overheating and glare. The main strategy for dealing with this issue is extensive shading of windows. However, this in turn prevents daylight penetration into buildings, often causing an interior to appear gloomy and dark even though there is more than sufficient daylight available. As a result, electric lighting is the main source of light, even during the day. Innovative daylighting devices which redirect light from windows offer a potential solution to this issue. These devices can potentially improve daylighting in buildings by increasing the illumination within the environment, decreasing the high contrast between the window and work regions, and deflecting potentially glare-causing sunlight away from the observer. However, the performance of such innovative daylighting devices is generally quantified under overcast skies (i.e. daylight factors) or skies without sun, which are typical of European climates and are misleading when considering these devices for tropical or sub-tropical climates. This study sought to compare four innovative window daylighting devices in RADIANCE: light shelves, laser cut panels, micro-light guides and light-redirecting blinds. These devices were simulated in RADIANCE under sub-tropical skies (for Brisbane) within the test case of a typical CBD office space. For each device, the quantity of light redirected and its distribution within the space were used as the basis for comparison. In addition, glare analysis of each device was conducted using Wienold and Christoffersen's evalglare. The analysis was conducted for selected hours of a day in each season. The majority of buildings that humans will occupy in their lifetime are already constructed, and extensive remodelling of most of these buildings is unlikely. Therefore the most effective way to improve daylighting in the near future will be through the alteration of existing window spaces. Thus it is important to understand the performance of daylighting systems with respect to the climate in which they are to be used. This type of analysis is important to determine the applicability of a daylighting strategy so that designers can achieve energy efficiency as well as the health benefits of natural daylight.
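For context, the daylight factor mentioned above is simply the ratio of interior illuminance to the simultaneous unobstructed exterior horizontal illuminance under an overcast sky; a trivial sketch with illustrative values only:

    def daylight_factor(interior_lux, exterior_horizontal_lux):
        # Daylight factor (%) under an overcast sky; it carries no information about
        # direct sun, which is why it can mislead in predominantly clear-sky climates.
        return 100.0 * interior_lux / exterior_horizontal_lux

    # Example: 200 lux at the work plane with 10,000 lux outside gives a daylight factor of 2%.
    print(daylight_factor(200.0, 10_000.0))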
Abstract:
Plant biosecurity requires statistical tools to interpret field surveillance data in order to manage pest incursions that threaten crop production and trade. Ultimately, management decisions need to be based on the probability that an area is infested or free of a pest. Current informal approaches to delimiting pest extent rely upon expert ecological interpretation of presence/absence data over space and time. Hierarchical Bayesian models provide a cohesive statistical framework that can formally integrate the available information on both pest ecology and data. The overarching method involves constructing an observation model for the surveillance data, conditional on the hidden extent of the pest and uncertain detection sensitivity. The extent of the pest is then modelled as a dynamic invasion process that includes uncertainty in ecological parameters. Modelling approaches to assimilate this information are explored through case studies on spiralling whitefly (Aleurodicus dispersus) and red banded mango caterpillar (Deanolis sublimbalis). Markov chain Monte Carlo simulation is used to estimate the probable extent of pests, given the observation and process model conditioned by surveillance data. Statistical methods, based on time-to-event models, are developed to apply hierarchical Bayesian models to early detection programs and to demonstrate area freedom from pests. The value of early detection surveillance programs is demonstrated through an application to interpret surveillance data for exotic plant pests with uncertain spread rates. The model suggests that typical early detection programs provide a moderate reduction in the probability of an area being infested but a dramatic reduction in the expected area of incursions at a given time. Estimates of spiralling whitefly extent are examined at local, district and state-wide scales. The local model estimates the rate of natural spread and the influence of host architecture, host suitability and inspector efficiency. These parameter estimates can support the development of robust surveillance programs. Hierarchical Bayesian models for the human-mediated spread of spiralling whitefly are developed for the colonisation of discrete cells connected by a modified gravity model. By estimating dispersal parameters, the model can be used to predict the extent of the pest over time. An extended model predicts the climate-restricted distribution of the pest in Queensland. These novel human-mediated movement models are well suited to demonstrating area freedom at coarse spatio-temporal scales. At finer scales, and in the presence of ecological complexity, exploratory models are developed to investigate the capacity for surveillance information to estimate the extent of red banded mango caterpillar. It is apparent that excessive uncertainty about observation and ecological parameters can impose limits on inference at the scales required for effective management of response programs. The thesis contributes novel statistical approaches to estimating the extent of pests and develops applications to assist decision-making across a range of plant biosecurity surveillance activities. Hierarchical Bayesian modelling is demonstrated as both a useful analytical tool for estimating pest extent and a natural investigative paradigm for developing and focussing biosecurity programs.
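As a toy illustration of the underlying idea only (this is not the thesis's model), the sketch below updates the probability that an area is infested after a series of negative surveys while integrating over an uncertain per-survey detection sensitivity; the prior, the Beta parameters and the assumption of no false positives are all illustrative.

    import numpy as np

    rng = np.random.default_rng(1)

    def posterior_infested(prior_infested, n_negative_surveys, sens_a, sens_b, draws=100_000):
        # Probability the area is still infested after all surveys came back negative,
        # with per-survey detection sensitivity drawn from Beta(sens_a, sens_b).
        p = rng.beta(sens_a, sens_b, size=draws)              # uncertain sensitivity
        miss_all = np.mean((1.0 - p) ** n_negative_surveys)   # P(all surveys negative | infested)
        num = prior_infested * miss_all
        return num / (num + (1.0 - prior_infested))           # negatives are certain if pest absent

    # Example: 20% prior belief of infestation, 5 negative surveys, sensitivity centred near 0.6.
    print(posterior_infested(0.20, 5, 6, 4))

The hierarchical models in the thesis extend this idea by making the hidden extent spatially and temporally dynamic and by estimating the ecological and observation parameters with Markov chain Monte Carlo rather than fixing them.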
Abstract:
Scalable high-resolution tiled display walls are becoming increasingly important to decision makers and researchers because high pixel counts in combination with large screen areas facilitate content rich, simultaneous display of computer-generated visualization information and high-definition video data from multiple sources. This tutorial is designed to cater for new users as well as researchers who are currently operating tiled display walls or 'OptiPortals'. We will discuss the current and future applications of display wall technology and explore opportunities for participants to collaborate and contribute in a growing community. Multiple tutorial streams will cover both hands-on practical development, as well as policy and method design for embedding these technologies into the research process. Attendees will be able to gain an understanding of how to get started with developing similar systems themselves, in addition to becoming familiar with typical applications and large-scale visualisation techniques. Presentations in this tutorial will describe current implementations of tiled display walls that highlight the effective usage of screen real-estate with various visualization datasets, including collaborative applications such as visualcasting, classroom learning and video conferencing. A feature presentation for this tutorial will be given by Jurgen Schulze from Calit2 at the University of California, San Diego. Jurgen is an expert in scientific visualization in virtual environments, human-computer interaction, real-time volume rendering, and graphics algorithms on programmable graphics hardware.
Abstract:
A review of the literature related to issues involved in irrigation-induced agricultural development (IIAD) reveals that: (1) the magnitude, sensitivity and distribution of the social welfare effects of IIAD have not been fully analysed; (2) the impacts of excessive pesticide use on farmers’ health are not adequately explained; (3) no analysis estimates the relationship between farm-level efficiency and overuse of agro-chemical inputs under imperfect markets; and (4) the method of incorporating groundwater extraction costs is misleading. This PhD thesis investigates these issues by using primary data, along with secondary data, from Sri Lanka. The overall findings of the thesis can be summarised as follows. First, the thesis demonstrates that Sri Lanka has gained a positive welfare change as a result of introducing new irrigation technology. The change in the consumer surplus is Rs. 48,236 million, while the change in the producer surplus is Rs. 14,274 million between 1970 and 2006. The results also show that the long-run benefits and costs of IIAD depend critically on the magnitude of the expansion of the irrigated area, as well as the competition faced by traditional farmers (agricultural crowding-out effects). The traditional sector’s ability to compete with the modern sector depends on productivity improvements, reducing production costs and future structural changes (spillover effects). Second, the thesis findings on pesticides used for agriculture show that, on average, a farmer incurs a cost of approximately Rs. 590 to 800 per month during a typical cultivation period due to exposure to pesticides. It is shown that the value of the average loss in earnings per farmer for the ‘hospitalised’ sample is Rs. 475 per month, while it is approximately Rs. 345 per month for the ‘general’ farmers group during a typical cultivation season. However, the average willingness to pay (WTP) to avoid exposure to pesticides is approximately Rs. 950 and Rs. 620 for the ‘hospitalised’ and ‘general’ farmers’ samples respectively. The estimated percentage contributions to WTP from health costs, lost earnings, mitigating expenditure and disutility are 29, 50, 5 and 16 per cent respectively for ‘hospitalised’ farmers, and 32, 55, 8 and 5 per cent respectively for ‘general’ farmers. It is also shown that, given market imperfections for most agricultural inputs, farmers are overusing pesticides with the expectation of higher future returns. This has led to an increase in inefficiency in farming practices which is not understood by the farmers. Third, it is found that various groundwater depletion studies in the economics literature have provided misleading optimal water extraction quantity levels. This is due to a failure to incorporate all production costs in the relevant models. It is only by incorporating quality changes alongside quantity deterioration that socially optimal levels can be derived. Empirical results clearly show that the benefits per hectare per month, considering both the avoided costs of deepening agro-wells by five feet from the existing average and the avoided costs of maintaining the water salinity level at 1.8 mmhos/cm, are approximately Rs. 4,350 for farmers in the Anuradhapura district and Rs. 5,600 for farmers in the Matale district.
Abstract:
The self-assembling behavior and microscopic structure of zinc oxide nanoparticle Langmuir-Blodgett monolayer films were investigated for the case of zinc oxide nanoparticles coated with a hydrophobic layer of dodecanethiol. Evolution of nanoparticle film structure as a function of surface pressure (π) at the air-water interface was monitored in situ using Brewster’s angle microscopy, where it was determined that π=16 mN/m produced near-defect-free monolayer films. Transmission electron micrographs of drop-cast and Langmuir-Schaefer deposited films of the dodecanethiol-coated zinc oxide nanoparticles revealed that the nanoparticle preparation method yielded a microscopic structure that consisted of one-dimensional rodlike assemblies of nanoparticles with typical dimensions of 25 x 400 nm, encased in the organic dodecanethiol layer. These nanoparticle-containing rodlike micelles were aligned into ordered arrangements of parallel rods using the Langmuir-Blodgett technique.
Abstract:
Dynamic load sharing can be defined as a measure of the ability of a heavy vehicle multi-axle group to equalise load across its wheels under typical travel conditions; i.e. in the dynamic sense, at the typical travel speeds and operating conditions of that vehicle. Various attempts have been made to quantify the ability of heavy vehicles to equalise the load across their wheels during travel. One of these was the concept of the load sharing coefficient (LSC). Other metrics, such as the dynamic load coefficient (DLC), peak dynamic wheel force (PDWF) and dynamic impact force (DIF), have been used to compare one heavy vehicle suspension with another in terms of potential road damage. This paper compares these metrics and determines a relationship between DLC and LSC, with a sensitivity analysis of this relationship. The shortcomings of the presently available metrics are discussed and a new metric is proposed: the dynamic load equalisation (DLE) measure.
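For orientation, and assuming the commonly used definitions rather than the exact formulations compared in the paper, the DLC of a wheel is the standard deviation of its dynamic force divided by its mean force, and the LSC of a wheel is its mean force divided by the nominal equal-share load of the axle group. A minimal sketch with synthetic wheel-force records:

    import numpy as np

    def dynamic_load_coefficient(forces):
        # DLC for one wheel: standard deviation of the wheel-force time series over its mean.
        forces = np.asarray(forces, dtype=float)
        return forces.std() / forces.mean()

    def load_sharing_coefficients(force_matrix):
        # LSC per wheel: mean force on each wheel divided by the nominal (equal-share)
        # load of the axle group, i.e. the group mean force.
        force_matrix = np.asarray(force_matrix, dtype=float)   # shape: (samples, wheels)
        mean_per_wheel = force_matrix.mean(axis=0)
        return mean_per_wheel / mean_per_wheel.mean()

    # Example with synthetic wheel-force records for a four-wheel axle group.
    rng = np.random.default_rng(0)
    static = np.array([22.0, 23.0, 21.5, 23.5])                 # kN static shares
    records = static + rng.normal(0.0, 2.0, size=(5000, 4))     # dynamic variation
    print([round(dynamic_load_coefficient(records[:, i]), 3) for i in range(4)])
    print(np.round(load_sharing_coefficients(records), 3))

Perfect load sharing would give every wheel an LSC of 1.0; the DLC, by contrast, reflects how strongly each wheel's force fluctuates about its own mean.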
Abstract:
Pathological mineralization of articular cartilage is a characteristic feature of osteoarthritis (OA); however, the underlying mechanisms, and their relevance to cartilage degeneration, are not clear. The involvement of subchondral bone changes in OA has been reported previously, with the characterization of abnormal subchondral bone mineral density (BMD), osteoid volume, altered bone mechanical parameters and an increase in bone turnover markers. A number of osteoarthritic animal models have demonstrated that subchondral bone changes often precede cartilage degeneration. In this study, site-specific localization of mineralization markers was detected in the OA cartilage. Chondrocytes and osteoblasts derived from OA cartilage and subchondral bone showed a significant increase in the mRNA expression of mineralization markers. Interestingly, osteoblasts from OA subchondral bone significantly decreased cartilage matrix expression while increasing mineralization of chondrocytes (Figure 1). Osteogenic factors, such as CBFA1, ALP, and type X collagen (Col-X), were detected in chondrocytes under mineralization conditions (Figure 2). Furthermore, chondrocyte mineralization was followed by increased mRNA and protein levels of MMP-2, MMP-9 and MMP-13, all of which are detrimental to cartilage integrity in vivo. The data reported here suggest that the upregulation of subchondral bone mineralization, typical of OA progression, causes cartilage mineralization, and that the mineralization of chondrocytes induces increased MMP levels with a subsequent degradation of the articular cartilage.
Abstract:
Introduction: Osteoarthritis (OA) is the most common musculoskeletal disorder and represents a major health burden to society. In the course of the pathological development of OA, articular cartilage chondrocytes (ACCs) undergo typical phenotypic changes characterized by the expression of hypertrophic differentiation markers. Also, the adjacent subchondral bone shows signs of abnormal mineral density and enhanced production of bone turnover markers, indicative of osteoblast dysfunction. However, the mechanism(s) by which these changes occur during OA development are not completely understood. Materials and Methods: ACCs and subchondral bone osteoblasts (SBOs) were harvested from OA and healthy patients for cross-talk studies between normal and OA ACCs and SBOs. The involvement of the mitogen-activated protein kinase (MAPK) signalling pathway during the cell-cell interactions was analysed by zymography, ELISA and western blotting methods. Results: The direct and indirect co-culture studies showed that OA cells (ACCs and SBOs) induced osteoarthritic changes in normal ACCs and SBOs. This altered cell interaction induced by OA cells significantly aggravated proteolytic activity, which resulted in cartilage degeneration. The altered cell interaction appeared to significantly activate ERK 1/2 phosphorylation, and inhibition of the MAPK-ERK 1/2 pathway reversed the osteoarthritic phenotypic changes. Discussion and Conclusion: Our study has demonstrated that the altered bi-directional communication between SBOs and ACCs is critical for the initiation and progression of OA-related changes and that this process is mediated by MAPK signalling pathways. Targeting these altered interactions by the use of MAPK inhibitors may provide the scientific rationale for the development of novel therapeutic strategies in the treatment and management of OA-related disorders.
Abstract:
A trend in the design and implementation of modern industrial automation systems is to integrate computing, communication and control into a unified framework at different levels of machine/factory operations and information processing. These distributed control systems are referred to as networked control systems (NCSs). They are composed of sensors, actuators, and controllers interconnected over communication networks. As most communication networks are not designed for NCS applications, the communication requirements of NCSs may not be satisfied. For example, traditional control systems require the data to be accurate, timely and lossless. However, because of random transmission delays and packet losses, the performance of a control system may deteriorate badly, and the control system may be rendered unstable. The main challenge of NCS design is to both maintain and improve the stable control performance of an NCS. To achieve this, communication and control methodologies have to be designed. In recent decades, Ethernet and 802.11 networks have been introduced into control networks and have even replaced traditional fieldbus products in some real-time control applications, because of their high bandwidth and good interoperability. As Ethernet and 802.11 networks are not designed for distributed control applications, two aspects of NCS research need to be addressed to make these communication networks suitable for control systems in industrial environments. From the perspective of networking, communication protocols need to be designed to satisfy communication requirements for NCSs, such as real-time communication and high-precision clock consistency requirements. From the perspective of control, methods to compensate for network-induced delays and packet losses are important for NCS design. To make Ethernet-based and 802.11 networks suitable for distributed control applications, this thesis develops a high-precision relative clock synchronisation protocol and an analytical model for analysing the real-time performance of 802.11 networks, and designs a new predictive compensation method. Firstly, a hybrid NCS simulation environment based on the NS-2 simulator is designed and implemented. Secondly, a high-precision relative clock synchronisation protocol is designed and implemented. Thirdly, transmission delays in 802.11 networks for soft real-time control applications are modelled using a Markov chain model in which real-time Quality-of-Service parameters are analysed under a periodic traffic pattern. By using a Markov chain model, we can accurately model the tradeoff between real-time performance and throughput performance. Furthermore, a cross-layer optimisation scheme, featuring application-layer flow rate adaptation, is designed to achieve a tradeoff between certain real-time and throughput performance characteristics in a typical NCS scenario with a wireless local area network. Fourthly, as a co-design approach for both the network and the controller, a new predictive compensation method for variable delay and packet loss in NCSs is designed, in which simultaneous end-to-end delays and packet losses during packet transmissions from sensors to actuators are tackled. The effectiveness of the proposed predictive compensation approach is demonstrated using our hybrid NCS simulation environment.
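To give a feel for the kind of behaviour such a delay model captures (this is only an illustrative stand-in, not the thesis's 802.11 Markov chain model), the sketch below simulates a two-state Markov channel in which periodic control samples see short delays in a "good" state and long delays or losses in a "bad" state, and reports how many samples meet a control deadline; all transition probabilities, delay statistics and the deadline value are assumptions.

    import numpy as np

    rng = np.random.default_rng(42)

    # Two-state Markov channel: GOOD (short delay) vs. BAD (long delay / likely loss).
    P = {  # per-sample transition probabilities
        "GOOD": {"GOOD": 0.95, "BAD": 0.05},
        "BAD":  {"GOOD": 0.40, "BAD": 0.60},
    }
    DELAY_MS = {"GOOD": (2.0, 0.5), "BAD": (25.0, 10.0)}   # mean, std of delay per state
    LOSS_PROB = {"GOOD": 0.001, "BAD": 0.10}

    def simulate(samples=20_000, deadline_ms=10.0):
        # Fraction of periodic samples that arrive within the control deadline.
        state, on_time = "GOOD", 0
        for _ in range(samples):
            if rng.random() >= LOSS_PROB[state]:               # packet not lost
                mu, sigma = DELAY_MS[state]
                if max(0.0, rng.normal(mu, sigma)) <= deadline_ms:
                    on_time += 1
            state = "GOOD" if rng.random() < P[state]["GOOD"] else "BAD"
        return on_time / samples

    print(f"on-time delivery ratio: {simulate():.3f}")

The analytical model developed in the thesis derives such real-time quality-of-service measures directly, rather than by simulation as in this toy example.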
Abstract:
Information overload has become a serious issue for web users. Personalisation can provide effective solutions to overcome this problem. Recommender systems are one popular personalisation tool to help users deal with this issue. As the basis of personalisation, the accuracy and efficiency of web user profiling greatly affects the performance of recommender systems and other personalisation systems. In Web 2.0, emerging user information provides new possible solutions for profiling users. Folksonomy, or tag information, is a typical kind of Web 2.0 information. Folksonomy implies users' topic interests and opinion information. It has become another important source of user information for profiling users and making recommendations. However, since tags are arbitrary words given by users, folksonomy contains a lot of noise, such as tag synonyms, semantic ambiguities and personal tags. Such noise makes it difficult to profile users accurately or to make quality recommendations. This thesis investigates the distinctive features and multiple relationships of folksonomy and explores novel approaches to solve the tag quality problem and profile users accurately. Harvesting the wisdom of crowds and experts, three new user profiling approaches are proposed: a folksonomy-based user profiling approach, a taxonomy-based user profiling approach, and a hybrid user profiling approach based on folksonomy and taxonomy. The proposed user profiling approaches are applied to recommender systems to improve their performance. Based on the generated user profiles, user- and item-based collaborative filtering approaches, combined with content filtering methods, are proposed to make recommendations. The proposed user profiling and recommendation approaches have been evaluated through extensive experiments. The effectiveness evaluation experiments were conducted on two real-world datasets collected from the Amazon.com and CiteULike websites. The experimental results demonstrate that the proposed user profiling and recommendation approaches outperform related state-of-the-art approaches. In addition, this thesis proposes a parallel, scalable user profiling implementation based on advanced cloud computing techniques such as Hadoop, MapReduce and Cascading. The scalability evaluation experiments were conducted on a large-scale dataset collected from the Del.icio.us website. This thesis contributes to the effective use of the wisdom of crowds and experts to help users solve information overload issues by providing more accurate, effective and efficient user profiling and recommendation approaches. It also contributes to better use of taxonomy information given by experts and folksonomy information contributed by users in Web 2.0.
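As a toy illustration of tag-based profiling (a minimal sketch of the general idea, not the thesis's algorithms), the code below builds a weighted tag profile from a user's tagging history and ranks items by cosine similarity between the user's and each item's tag profile; all tag and item names are hypothetical.

    import math
    from collections import Counter

    def tag_profile(tags):
        # Weighted tag profile (tag -> relative frequency) from a tagging history.
        counts = Counter(tags)
        total = sum(counts.values())
        return {tag: n / total for tag, n in counts.items()}

    def cosine(p, q):
        num = sum(p[t] * q[t] for t in set(p) & set(q))
        den = (math.sqrt(sum(v * v for v in p.values()))
               * math.sqrt(sum(v * v for v in q.values())))
        return num / den if den else 0.0

    def recommend(user_profile, item_profiles, top_n=3):
        # Rank items by similarity between the user's tag profile and each item's tag profile.
        scored = [(item, cosine(user_profile, prof)) for item, prof in item_profiles.items()]
        return sorted(scored, key=lambda pair: pair[1], reverse=True)[:top_n]

    user = tag_profile(["python", "machine-learning", "python", "recommender", "web"])
    items = {
        "book-a": tag_profile(["python", "data", "machine-learning"]),
        "book-b": tag_profile(["history", "biography"]),
        "book-c": tag_profile(["recommender", "web", "python"]),
    }
    print(recommend(user, items))

The approaches proposed in the thesis additionally address folksonomy noise (synonyms, ambiguity, personal tags) and blend in expert taxonomy information before this kind of matching is performed.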
Abstract:
Traffic oscillations are typical features of congested traffic flow that are characterized by recurring decelerations followed by accelerations (stop-and-go driving). The negative environmental impacts of these oscillations are widely accepted, but their impact on traffic safety has been debated. This paper describes the impact of freeway traffic oscillations on traffic safety. This study employs a matched case-control design using high-resolution traffic and crash data from a freeway segment. Traffic conditions prior to each crash were taken as cases, while traffic conditions during the same periods on days without crashes were taken as controls. These were also matched by presence of congestion, geometry and weather. A total of 82 cases and about 80,000 candidate controls were extracted from more than three years of data from 2004 to 2007. Conditional logistic regression models were developed based on the case-control samples. To verify consistency in the results, 20 different sets of controls were randomly extracted from the candidate pool for varying control-case ratios. The results reveal that the standard deviation of speed (thus, oscillations) is a significant variable, with an average odds ratio of about 1.08. This implies that the likelihood of a (rear-end) crash increases by about 8% with an additional unit increase in the standard deviation of speed. The average traffic states prior to crashes were less significant than the speed variations in congestion.
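To make the reported odds ratio concrete (a small numeric illustration, not the paper's model fit), the logistic regression coefficient implied by an odds ratio of 1.08 can be back-calculated and extrapolated to larger increases in the standard deviation of speed:

    import math

    # With a (conditional) logistic regression coefficient b for the standard deviation of
    # speed, the odds ratio per one-unit increase is exp(b); an OR of about 1.08 therefore
    # corresponds to roughly an 8% increase in crash odds per unit of speed variation.
    beta = math.log(1.08)                      # coefficient implied by the reported OR
    for extra_std in (1, 2, 5):
        odds_multiplier = math.exp(beta * extra_std)
        print(f"+{extra_std} unit(s) of speed std. dev. -> crash odds x {odds_multiplier:.2f}")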
Abstract:
Item folksonomy, or tag information, is a typical and prevalent kind of Web 2.0 information. Item folksonomy contains rich opinion information from users on item classifications and descriptions. It can be used as another important information source for opinion mining. On the other hand, each item is associated with taxonomy information that reflects the viewpoints of experts. In this paper, we propose to mine users' opinions on items based on item taxonomy developed by experts and folksonomy contributed by users. In addition, we explore how to make personalized item recommendations based on users' opinions. The experiments conducted on real-world datasets collected from Amazon.com and CiteULike demonstrate the effectiveness of the proposed approaches.
Abstract:
Traffic oscillations are typical features of congested traffic flow, characterized by recurring decelerations followed by accelerations. However, knowledge of this complex topic remains limited. In this research: 1) the impact of traffic oscillations on freeway crash occurrences is measured using a matched case-control design, and the results consistently reveal that oscillations have a more significant impact on freeway safety than the average traffic states; 2) the wavelet transform is adopted to locate oscillations' origins and measure their characteristics along their propagation paths using vehicle trajectory data; and 3) the impact of a lane-changing maneuver on the immediate follower is measured and modeled. The knowledge and new models generated from this study could provide a better understanding of the fundamentals of congested traffic, enable improvements to existing traffic control strategies and freeway crash countermeasures, and motivate the development of new operational strategies aimed at reducing the negative effects of oscillatory driving.
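As an illustration of the wavelet idea only (the study's actual procedure and parameters are not reproduced here), the sketch below computes a Mexican-hat wavelet energy along a synthetic vehicle speed profile; locations of high energy mark abrupt deceleration/acceleration waves, which is the kind of signature used to trace oscillation origins. The wavelet widths and the speed profile are illustrative assumptions.

    import numpy as np

    def ricker(points, a):
        # Mexican-hat (Ricker) wavelet sampled on `points` with width parameter `a`.
        t = np.arange(points) - (points - 1) / 2.0
        amp = 2.0 / (np.sqrt(3.0 * a) * np.pi ** 0.25)
        return amp * (1 - (t / a) ** 2) * np.exp(-(t ** 2) / (2 * a ** 2))

    def wavelet_energy(speed, widths=(4, 8, 16, 32)):
        # Average squared wavelet coefficient across scales; peaks mark abrupt
        # deceleration/acceleration waves in a vehicle's speed profile.
        speed = np.asarray(speed, dtype=float)
        energy = np.zeros_like(speed)
        for a in widths:
            coef = np.convolve(speed - speed.mean(), ricker(10 * a, a), mode="same")
            energy += coef ** 2
        return energy / len(widths)

    # Synthetic speed profile: steady travel interrupted by a stop-and-go wave.
    speed = np.full(600, 25.0)                                         # m/s over 600 time steps
    speed[250:300] = 25.0 - 20.0 * np.sin(np.linspace(0, np.pi, 50))   # deceleration and recovery
    energy = wavelet_energy(speed)
    print("highest wavelet energy near step", int(np.argmax(energy)))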