972 results for Multi-path mitigation
Abstract:
In order to establish the influence of the drying air characteristics on the drying performance and fluidization quality of bovine intestine for pet food, several drying tests were carried out in a laboratory-scale heat pump assisted fluid bed dryer equipped with a continuous monitoring system. Bovine intestine samples were heat pump fluidized bed dried at atmospheric pressure, at temperatures below and above the material's freezing point. The drying characteristics were investigated in the temperature range −10 to 25 °C and at airflows in the range 1.5–2.5 m/s. Some experiments were conducted as single-temperature drying experiments and others as two-stage drying experiments employing two temperatures. An Arrhenius-type equation was used to interpret the influence of the drying air temperature on the effective diffusivity, calculated with the method of slopes, in terms of activation energy, which was found to be sensitive to temperature. The effective diffusion coefficient of moisture transfer was determined by the Fickian method assuming uni-dimensional moisture movement, both for moisture removal by evaporation and for combined sublimation and evaporation. Correlations expressing the effective moisture diffusivity as a function of drying temperature are reported. Bovine particles were characterized according to the Geldart classification, and the minimum fluidization velocity was calculated using the Ergun equation and a generalized equation for all drying conditions at the beginning and end of the trials. Walli's model was used to categorize the stability of the fluidization at the beginning and end of drying for each trial. The determined Walli's values were positive at the beginning and end of all trials, indicating stable fluidization for each drying condition.
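To make the fluidization calculation concrete, the following is a minimal sketch of a minimum fluidization velocity estimate from the Ergun equation referenced in the abstract; the Arrhenius relation mentioned there would similarly be fitted to the effective diffusivities obtained at each temperature. The particle diameter, densities, sphericity and voidage in the example call are illustrative assumptions, not data from the study.

    import math

    def minimum_fluidization_velocity(d_p, rho_p, rho_g, mu, eps_mf=0.45, phi=0.8, g=9.81):
        """Minimum fluidization velocity u_mf (m/s) from the Ergun equation.

        Solves 1.75/(eps^3*phi)*Re_mf^2 + 150*(1-eps)/(eps^3*phi^2)*Re_mf = Ar
        for Re_mf, then converts to u_mf. d_p in m, densities in kg/m^3, mu in Pa*s.
        """
        ar = d_p**3 * rho_g * (rho_p - rho_g) * g / mu**2          # Archimedes number
        a = 1.75 / (eps_mf**3 * phi)
        b = 150.0 * (1.0 - eps_mf) / (eps_mf**3 * phi**2)
        re_mf = (-b + math.sqrt(b**2 + 4.0 * a * ar)) / (2.0 * a)  # positive root of the quadratic
        return re_mf * mu / (rho_g * d_p)

    # Illustrative call with assumed (not measured) particle and air properties:
    u_mf = minimum_fluidization_velocity(d_p=3e-3, rho_p=1050.0, rho_g=1.2, mu=1.8e-5)
    print(f"u_mf = {u_mf:.2f} m/s")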
Abstract:
The ability to identify and assess user engagement with transmedia productions is vital to the success of individual projects and the sustainability of this mode of media production as a whole. It is essential that industry players have access to tools and methodologies that offer the most complete and accurate picture of how audiences/users engage with their productions and which assets generate the most valuable returns on investment. Drawing upon research conducted with Hoodlum Entertainment, a Brisbane-based transmedia producer, this project involved an initial assessment of the way engagement tends to be understood, why standard web analytics tools are ill-suited to measuring it, how a customised tool could offer solutions, and why this question of measuring engagement is so vital to the future of transmedia as a sustainable industry. Working with data provided by Hoodlum Entertainment and Foxtel Marketing, the outcome of the study was a prototype for a custom data visualisation tool that allowed access, manipulation and presentation of user engagement data, both historic and predictive. The prototyped interfaces demonstrate how the visualisation tool would collect and organise data specific to multiplatform projects by aggregating data across a number of platform reporting tools. Such a tool is designed to encompass not only platforms developed by the transmedia producer but also sites developed by fans. This visualisation tool accounted for multiplatform experience projects whose top level comprises people, platforms and content. People include characters, actors, audience, distributors and creators. Platforms include television, Facebook and other relevant social networks, literature, cinema and other media that might be included in the multiplatform experience. Content refers to discrete media texts employed within the platform, such as a tweet, a YouTube video, a Facebook post, an email, a television episode, etc. Core content is produced by the creators of the multiplatform experience to advance the narrative, while complementary content generated by audience members offers further contributions to the experience. Equally important is the timing with which the components of the experience are introduced and how they interact with and impact upon each other. By combining, filtering and sorting these elements in multiple ways, we can better understand the value of certain components of a project. The tool also offers insights into the relationship between the timing of the release of components and the user activity associated with them, which further highlights the efficacy (or, indeed, failure) of assets as catalysts for engagement. In collaboration with Hoodlum we have developed a number of design scenarios experimenting with the ways in which data can be visualised and manipulated to tell a more refined story about the value of user engagement with certain project components and activities. This experimentation will serve as the basis for future research.
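As a rough illustration of the people/platforms/content hierarchy described above, the sketch below models a multiplatform project and aggregates engagement per platform. All class names, fields and the aggregation method are hypothetical and are not taken from the Hoodlum/Foxtel prototype.

    from dataclasses import dataclass, field
    from datetime import datetime
    from typing import List

    @dataclass
    class ContentItem:
        """A discrete media text (tweet, YouTube video, Facebook post, episode, ...)."""
        title: str
        platform: str            # e.g. "Facebook", "television", "YouTube"
        released_at: datetime
        is_core: bool            # core (creator-produced) vs complementary (fan-produced)
        engagement_events: int = 0

    @dataclass
    class MultiplatformProject:
        people: List[str] = field(default_factory=list)      # characters, actors, audience, ...
        platforms: List[str] = field(default_factory=list)
        content: List[ContentItem] = field(default_factory=list)

        def engagement_by_platform(self):
            """Aggregate engagement counts per platform, as a reporting tool might."""
            totals = {}
            for item in self.content:
                totals[item.platform] = totals.get(item.platform, 0) + item.engagement_events
            return totals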
Abstract:
The European Early Lung Cancer (EUELC) project aims to determine if specific genetic alterations occurring in lung carcinogenesis are detectable in the respiratory epithelium. In order to pursue this objective, non-small cell lung cancer (NSCLC) patients with a very high risk of developing progressive lung cancer were recruited from 12 centres in eight European countries: France, Germany, southern Ireland, Italy, the Netherlands, Poland, Spain and the UK. In addition, NSCLC patients were followed up every 6 months for 36 months. A European Bronchial Tissue Bank was set up at the University of Liverpool (Liverpool, UK) to optimise the use of biological specimens. The molecular-pathological investigations were subdivided into specific work packages that were delivered by EUELC Partners. The work packages encompassed mutational analysis, genetic instability, methylation profiling, expression profiling utilising immunohistochemistry and chip-based technologies, as well as in-depth analysis of the FHIT and RARβ genes, the telomerase catalytic subunit hTERT, and genotyping of susceptibility genes in specific pathways. The EUELC project engendered a tremendous collaborative effort, and it enabled the EUELC Partners to establish protocols for assessing molecular biomarkers in early lung cancer with a view to using such biomarkers for early diagnosis and as intermediate end-points in future chemopreventive programmes. Copyright © ERS Journals Ltd 2009.
Abstract:
The capacity of current and future high-data-rate wireless communications depends significantly on how well changes in the wireless channel are predicted and tracked. Generally, the channel can be estimated by transmitting known symbols; however, this increases overheads if the channel varies over time. Given today's bandwidth demand and the increased reliance on mobile wireless devices, the contributions of this research are significant. This study has developed a novel and efficient channel tracking algorithm that can recursively update the channel estimate for wireless broadband communications, reducing overheads and thereby increasing the speed of wireless communication systems.
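The abstract does not specify the tracking algorithm, so the following is only a generic sketch of recursive channel tracking using a standard least-mean-squares (LMS) update on known pilot symbols; it illustrates the idea of refining the estimate sample by sample rather than re-estimating from scratch.

    import numpy as np

    def lms_channel_tracker(pilots, received, num_taps=4, mu=0.05):
        """Track a time-varying FIR channel recursively with the LMS update.

        pilots   : known transmitted symbols (1-D complex array)
        received : corresponding received samples
        Returns the channel estimate after processing all pilots.
        Illustrative only; not the thesis's own algorithm.
        """
        h = np.zeros(num_taps, dtype=complex)      # current channel estimate
        x = np.zeros(num_taps, dtype=complex)      # sliding window of recent pilots
        for n in range(len(pilots)):
            x = np.roll(x, 1)
            x[0] = pilots[n]
            y_hat = np.dot(h, x)                   # predicted received sample
            err = received[n] - y_hat
            h += mu * err * np.conj(x)             # recursive LMS correction
        return h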
Abstract:
This thesis presents an analysis of the resource allocation problem in Orthogonal Frequency Division Multiplexing (OFDM) based multi-hop wireless communication systems. The study analysed the tractability of the problem and designed several heuristic, fairness-aware resource allocation algorithms. These algorithms are fast and efficient and can therefore significantly improve power management in wireless systems.
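As an illustration of the kind of fast, fairness-aware heuristic the thesis refers to (not the thesis's own algorithm), the sketch below assigns OFDM subcarriers greedily, always serving next the user with the lowest accumulated rate.

    def fair_subcarrier_allocation(gains):
        """Greedy max-min-fair subcarrier allocation for an OFDM(A) system.

        gains[u][k] : channel gain of user u on subcarrier k.
        Each subcarrier goes to the user with the currently lowest accumulated
        rate, picking that user's best remaining subcarrier first.
        Illustrative heuristic only.
        """
        num_users = len(gains)
        num_sub = len(gains[0])
        rates = [0.0] * num_users
        allocation = {}                     # subcarrier -> user
        remaining = set(range(num_sub))
        while remaining:
            u = min(range(num_users), key=lambda i: rates[i])   # poorest user so far
            k = max(remaining, key=lambda s: gains[u][s])        # its best free subcarrier
            allocation[k] = u
            rates[u] += gains[u][k]         # simple proxy for achievable rate
            remaining.remove(k)
        return allocation, rates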
Abstract:
Family members living with a relative diagnosed with schizophrenia have reported challenges and traumatic stressors, as well as perceived benefits and personal growth. This study explored factors associated with posttraumatic growth (PTG) within such families. Personality, stress, coping, social support and PTG were assessed in 110 family members. Results revealed that a multiplicative mediational path model with social support and emotional or instrumental coping strategies as multi-mediators had a significant indirect effect on the relationship between extraversion and PTG. Clinically relevant concepts that map onto the multi-mediator model are discussed, translating these findings into clinical practice to facilitate naturally occurring PTG processes.
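For readers unfamiliar with mediational path models, the sketch below computes a single-mediator indirect effect (the product of the x→m and m→y paths) with ordinary least squares; the study itself used a multi-mediator model, so this is only a simplified illustration of the underlying logic, with variable names chosen for this example.

    import numpy as np

    def indirect_effect(x, m, y):
        """Single-mediator indirect effect (a*b) via two OLS fits.

        x : predictor (e.g. extraversion), m : mediator (e.g. social support),
        y : outcome (e.g. posttraumatic growth).
        """
        x, m, y = map(np.asarray, (x, m, y))
        X1 = np.column_stack([np.ones_like(x), x])
        a = np.linalg.lstsq(X1, m, rcond=None)[0][1]       # x -> m path
        X2 = np.column_stack([np.ones_like(x), x, m])
        b = np.linalg.lstsq(X2, y, rcond=None)[0][2]       # m -> y path, controlling for x
        return a * b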
Abstract:
This thesis investigates condition monitoring (CM) of diesel engines using acoustic emission (AE) techniques. The AE signals recorded from a small diesel engine are mixtures of multiple sources from multiple cylinders, making it difficult to interpret the information conveyed in the signals for CM purposes. This thesis develops a series of practical signal processing techniques to overcome this problem. Various experimental studies were conducted to assess the CM capabilities of AE analysis for diesel engines, and a series of modified signal processing techniques were proposed. These techniques showed promising capability for CM of multi-cylinder diesel engines using multiple AE sensors.
Abstract:
Multi-Microgrids (MMGs) have been proposed to connect distributed generators (DG), microgrids (MG), and medium-voltage (MV) loads with the distribution system. A flexible protection scheme that enables an islanded MMG to continue operation during fault conditions is yet to be developed. In this paper, a protection scheme for an islanded MMG that utilises MG controllers and communication links is proposed. The MMG model used includes two MGs connected to the distribution system. Each MG consists of diesel, wind, and photovoltaic (PV) microsources. The effectiveness of the proposed protection scheme is evaluated by simulation.
Abstract:
A key challenge for the 21st century is to make our cities more liveable and foster economically sustainable, environmentally responsible, and socially inclusive communities. Design thinking, particularly a human-centred approach, offers a way to tackle this challenge. Findings from two recent Australian research projects highlight how facilitating sustainable, liveable communities in a humid sub-tropical environment requires an in-depth understanding of people's perspectives, experiences and practices. Project 1 ('Research House') documents the reflections of a family who lived in a 'test' sustainable house for two years, outlining their experience and evaluations of universal design and sustainable technologies. The study family was very impressed with the natural lighting, natural ventilation, spaciousness and ease of access, which contributed significantly to their comfort and the liveability of their home. Project 2 ('Inner-Urban High Density Living') explored Brisbane residents' opinions about high-density living through a survey (n=636), interviews (n=24), site observations (over 300 hours) and environmental monitoring, assessing opinions on the liveability of their individual dwelling, the multi-unit host building and the surrounding neighbourhood. Nine areas, categorised into three general domains, were identified as essential for enhancing high-density liveability. In terms of the dwelling, thermal comfort/ventilation, natural light and noise mitigation were important; shared space, good neighbour protocols, and support for environmentally sustainable behaviour were desired in the building/complex; and accessible/sustainable transport, amenities and services, and a sense of community were considered important in the surrounding neighbourhood. Combined, these findings emphasise the importance and complexity associated with designing liveable buildings, cities and communities, illustrating how adopting a design thinking, human-centred approach will help create sustainable communities that meet the needs of current and future generations.
Abstract:
This thesis presents a multi-criteria optimisation study of group replacement schedules for water pipelines, a capital-intensive and service-critical decision. A new mathematical model was developed which minimises total replacement costs while maintaining a satisfactory level of service. The research outcomes are expected to enrich the body of knowledge of multi-criteria decision optimisation where group scheduling is required. The model has the potential to optimise replacement planning for other types of linear asset networks, resulting in bottom-line benefits for end users and communities. The results of a real case study show that the new model can effectively reduce total costs and service interruptions.
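The toy sketch below only illustrates the cost/service trade-off that group replacement scheduling has to balance: grouping more segments per intervention saves repeated setup costs but increases the scale of each service interruption. The segment costs, setup cost and interruption cap are invented, and the thesis's multi-criteria model is far richer than this.

    import math

    def grouped_replacement_cost(segment_costs, setup_cost, max_segments_per_outage):
        """Total cost when segments are grouped into as few interventions as allowed.

        Each intervention pays a fixed setup cost (crew mobilisation, shutdown)
        plus the individual segment costs; the cap on segments per outage acts
        as a crude proxy for an acceptable level of service interruption.
        """
        n = len(segment_costs)
        num_interventions = math.ceil(n / max_segments_per_outage)
        return sum(segment_costs) + setup_cost * num_interventions

    # Larger groups lower total cost but concentrate the service interruption:
    for cap in (1, 3, 6):
        print(cap, grouped_replacement_cost([10, 12, 8, 15, 9, 11],
                                            setup_cost=20,
                                            max_segments_per_outage=cap))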
Abstract:
Association rule mining is a technique widely used when querying databases, especially transactional ones, to obtain useful associations or correlations among sets of items. Much work has been done focusing on efficiency, effectiveness and redundancy. There has also been a focus on the quality of rules from single-level datasets, with many interestingness measures proposed. However, with multi-level datasets now being common, there is a lack of interestingness measures developed for multi-level and cross-level rules. Single-level measures do not take into account the hierarchy found in a multi-level dataset. This leaves the Support-Confidence approach, which does not consider the hierarchy anyway and has other drawbacks, as one of the few measures available. In this chapter we propose two approaches which measure multi-level association rules to help evaluate their interestingness by considering the database's underlying taxonomy. These measures of diversity and peculiarity can be used to help identify those rules from multi-level datasets that are potentially useful.
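For reference, the Support-Confidence baseline discussed above can be computed as in the sketch below; the proposed diversity and peculiarity measures, which additionally use the dataset's taxonomy, are not reproduced here.

    def support_and_confidence(transactions, antecedent, consequent):
        """Support and confidence of the rule antecedent -> consequent.

        transactions : list of item collections; items may appear at any level
        of a taxonomy (e.g. 'bread' vs 'white-bread'), which is precisely what
        single-level measures ignore.
        """
        antecedent, consequent = set(antecedent), set(consequent)
        both = sum(1 for t in transactions if antecedent | consequent <= set(t))
        ante = sum(1 for t in transactions if antecedent <= set(t))
        n = len(transactions)
        support = both / n if n else 0.0        # fraction of transactions with both sides
        confidence = both / ante if ante else 0.0  # conditional frequency of the consequent
        return support, confidence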
Abstract:
In this paper we present a method for autonomously tuning the threshold between learning and recognizing a place in the world, based on both how the rodent brain is thought to process and calibrate multisensory data and the pivoting movement behaviour that rodents perform in doing so. The approach makes no assumptions about the number and type of sensors, the robot platform, or the environment, relying only on the ability of a robot to perform two revolutions on the spot. In addition, it self-assesses the quality of the tuning process in order to identify situations in which tuning may have failed. We demonstrate the autonomous movement-driven threshold tuning on a Pioneer 3DX robot in eight locations spread over an office environment and a building car park, and then evaluate the mapping capability of the system on journeys through these environments. The system is able to pick a place recognition threshold that enables successful environment mapping in six of the eight locations while also autonomously flagging the tuning failure in the remaining two locations. We discuss how the method, in combination with parallel work on autonomous weighting of individual sensors, moves the parameter dependent RatSLAM system significantly closer to sensor, platform and environment agnostic operation.
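A heavily simplified sketch of the underlying idea, choosing a threshold that separates the similarity scores gathered over two on-the-spot revolutions, is given below. The score inputs and the self-assessment rule are assumptions made for illustration and do not reproduce the published RatSLAM tuning procedure.

    import numpy as np

    def tune_recognition_threshold(same_place_scores, different_view_scores):
        """Choose a learning/recognition threshold from two on-the-spot revolutions.

        same_place_scores     : similarities of second-revolution views against
                                templates learned at the same headings on the
                                first revolution (should be high, robot unmoved).
        different_view_scores : similarities against templates of other headings
                                (should be low).
        Returns (threshold, tuning_ok); tuning_ok is a crude separability check
        flagging tuning failure when the two score populations overlap.
        """
        hi = np.percentile(same_place_scores, 25)       # low end of "same place" scores
        lo = np.percentile(different_view_scores, 75)   # high end of "different view" scores
        threshold = 0.5 * (hi + lo)                     # split the gap between the two
        tuning_ok = hi > lo
        return threshold, tuning_ok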
Abstract:
The selection of optimal camera configurations (camera locations, orientations, etc.) for multi-camera networks remains an unsolved problem. Previous approaches largely focus on proposing various objective functions to achieve different tasks; most of them, however, do not generalize well to large-scale networks. To tackle this, we propose a statistical formulation of the problem together with a trans-dimensional simulated annealing algorithm to solve it effectively. We compare our approach with a state-of-the-art method based on binary integer programming (BIP) and show that our approach offers similar performance on small-scale problems. However, we also demonstrate the capability of our approach in dealing with large-scale problems and show that it produces better results than two alternative heuristics designed to deal with the scalability issue of BIP. Finally, we show the versatility of our approach using a number of specific scenarios.
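The sketch below is only a generic simulated-annealing skeleton for configuration search of the kind described; the neighbour and coverage functions are placeholders supplied by the caller, and the paper's trans-dimensional moves (adding or removing cameras) and statistical framework are not reproduced.

    import math, random

    def simulated_annealing(initial, neighbour, coverage, iters=10000, t0=1.0, alpha=0.999):
        """Maximise a coverage objective over camera configurations.

        initial   : starting configuration (e.g. a list of camera poses)
        neighbour : function returning a perturbed configuration; in the
                    trans-dimensional setting it may also change the camera count
        coverage  : objective to maximise (task-dependent)
        """
        current, best = initial, initial
        f_cur = f_best = coverage(current)
        t = t0
        for _ in range(iters):
            cand = neighbour(current)
            f_cand = coverage(cand)
            # Always accept improvements; accept worse moves with Boltzmann probability.
            if f_cand >= f_cur or random.random() < math.exp((f_cand - f_cur) / t):
                current, f_cur = cand, f_cand
                if f_cur > f_best:
                    best, f_best = current, f_cur
            t *= alpha                      # geometric cooling schedule
        return best, f_best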
Abstract:
Diagnostics of rolling element bearings has traditionally been developed for constant operating conditions, and sophisticated techniques, such as Spectral Kurtosis or Envelope Analysis, have proven their effectiveness by means of experimental tests, mainly conducted on small-scale laboratory test-rigs. Algorithms have been developed for the digital signal processing of data collected at constant speed and bearing load, with a few exceptions allowing only small fluctuations of these quantities. Owing to the spread of condition-based maintenance in many industrial fields, a need for more flexible algorithms has emerged in recent years, requiring compatibility with highly variable operating conditions such as acceleration/deceleration transients. This paper analyzes the problems related to significant speed and load variability, discussing in detail the effect they have on bearing damage symptoms, and proposes solutions to adapt existing algorithms to cope with this new challenge. In particular, the paper will i) discuss the implications of variable speed for the applicability of diagnostic techniques, ii) address quantitatively the effects of load on the characteristic frequencies of damaged bearings and iii) finally present a new approach for bearing diagnostics in variable conditions, based on envelope analysis. The research is based on experimental data obtained using artificially damaged bearings installed on a full-scale test-rig, equipped with an actual train traction system and reproducing operation on a real track, including all the environmental noise due to track irregularities and the electrical disturbances of such a harsh application.
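As background for the envelope analysis mentioned above, a basic constant-speed envelope spectrum can be computed as sketched below; the handling of variable speed and load, which is the paper's contribution, is not shown.

    import numpy as np
    from scipy.signal import hilbert

    def envelope_spectrum(x, fs):
        """Envelope spectrum of a vibration signal for bearing diagnostics.

        Computes the analytic signal via the Hilbert transform, takes its
        magnitude (the envelope), removes the mean and returns the one-sided
        amplitude spectrum. Peaks at the bearing characteristic fault
        frequencies (e.g. BPFO, BPFI) indicate damage.
        """
        env = np.abs(hilbert(x))                      # signal envelope
        env -= env.mean()                             # drop the DC component
        spec = np.abs(np.fft.rfft(env)) / len(env)    # one-sided amplitude spectrum
        freqs = np.fft.rfftfreq(len(env), d=1.0 / fs)
        return freqs, spec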