156 results for Multiple-scale processing


Relevance: 30.00%

Abstract:

Our aim is to develop a set of leading performance indicators that enable managers of large projects to forecast, during project execution, how various stakeholders will perceive success months or even years into the operation of the project's output. Large projects have many stakeholders with different objectives for the project, its output, and the business objectives it will deliver. The output of a large project may have a lifetime of years or even decades, and ultimate impacts that go beyond its immediate operation. How different stakeholders perceive success can change with time, so the project manager needs leading performance indicators that go beyond the traditional triple constraint to forecast how key stakeholders will perceive success months or even years later. In this article, we develop a model of project success that identifies how project stakeholders might perceive success in the months and years following a project. We identify success and failure factors that will facilitate or militate against achievement of those success criteria, and a set of potential leading performance indicators that forecast how stakeholders will perceive success during the life of the project's output. We conducted a scale development study with 152 managers of large projects and identified two project success factor scales and seven stakeholder satisfaction scales that project managers can use to predict stakeholder satisfaction and so provide a basis for project control.

Relevance: 30.00%

Abstract:

A simple and effective down-sampling algorithm, the Peak-Hold-Down-Sample (PHDS) algorithm, is developed in this paper to enable rapid and efficient data transfer in remote condition monitoring applications. The algorithm is particularly useful for high-frequency Condition Monitoring (CM) techniques and for low-speed machine applications, since the combination of a high sampling frequency and a low rotating speed generally leads to large, unwieldy data sets. The effectiveness of the algorithm was evaluated and tested on four sets of data in the study. One set was extracted from the condition monitoring signal of a practical industrial application. Another was acquired from a low-speed machine test rig in the laboratory. The other two sets were computer-simulated bearing defect signals containing either a single defect or multiple bearing defects. The results show that the PHDS algorithm can substantially reduce the size of the data while preserving the critical bearing defect information for all data sets used in this work, even when a large down-sample ratio was used (i.e., 500 times down-sampled). In contrast, a conventional down-sampling technique from signal processing eliminates useful and critical information, such as bearing defect frequencies, when the same down-sample ratio is employed. Noise and artificial frequency components were also introduced by the conventional technique, limiting its usefulness for machine condition monitoring applications.
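The abstract does not reproduce the algorithm itself, but a minimal sketch of a peak-hold style down-sampler consistent with the description above (keep, from each block of samples, the one with the largest absolute value) might look like the following; the function name and parameters are illustrative only.

```python
import numpy as np

def peak_hold_downsample(signal: np.ndarray, ratio: int) -> np.ndarray:
    """Down-sample by keeping, from each block of `ratio` samples,
    the sample with the largest absolute value (sign preserved).

    Retaining the block peak rather than low-pass filtering and
    decimating preserves short-lived bearing-defect impulses even at
    large ratios.
    """
    n_blocks = len(signal) // ratio
    blocks = signal[:n_blocks * ratio].reshape(n_blocks, ratio)
    idx = np.argmax(np.abs(blocks), axis=1)          # peak position within each block
    return blocks[np.arange(n_blocks), idx]

# e.g. a 500-fold reduction, matching the largest ratio tested in the paper:
# y = peak_hold_downsample(x, 500)
```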

Relevance: 30.00%

Abstract:

The Multidimensional Loss Scale (MLS) represents the first instrument designed specifically to measure loss in refugee populations. Researchers developed the initial MLS items to assess Experience of Loss Events and Loss Distress in a culturally sensitive manner across multiple domains (social, material, intra-personal and cultural). A sample of 70 recently settled Burmese adult refugees completed a battery of questionnaires, including the new scale items. Analyses explored the scale's factor structure, internal consistency, convergent validity and divergent validity. Principal Axis Factoring supported a five-factor model: Loss of Symbolic Self, Loss of Interdependence, Loss of Home, Interpersonal Loss, and Loss of Intrapersonal Integrity. Cronbach's alphas indicated satisfactory internal consistency for Experience of Loss Events (.85) and Loss Distress (.92). Convergent and divergent validity of Loss Distress were supported by moderate correlations with interpersonal grief and trauma symptoms and weak correlations with depression and anxiety. The new scale was well received by people from refugee backgrounds and shows promise for application in future research and practice.
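For reference, the internal-consistency statistic reported above can be computed directly from item-level responses. The sketch below is a generic Cronbach's alpha calculation, not code from the study, and the variable names and data are hypothetical.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score).
    """
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# e.g. for hypothetical Loss Distress item scores of shape (70, n_items):
# print(cronbach_alpha(distress_items))   # the study reports .92 for Loss Distress
```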

Relevance: 30.00%

Abstract:

In this paper we propose and evaluate a speaker attribution system using a complete-linkage clustering method. Speaker attribution refers to the annotation of a collection of spoken audio based on speaker identities. This can be achieved using diarization and speaker linking. The main challenge associated with attribution is achieving computational efficiency when dealing with large audio archives. Traditional agglomerative clustering methods with model merging and retraining are not feasible for this purpose. This has motivated the use of linkage clustering methods without retraining. We first propose a diarization system using complete-linkage clustering and show that it outperforms traditional agglomerative and single-linkage clustering based diarization systems with a relative improvement of 40% and 68%, respectively. We then propose a complete-linkage speaker linking system to achieve attribution and demonstrate a 26% relative improvement in attribution error rate (AER) over the single-linkage speaker linking approach.
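The abstract gives no implementation details, but complete-linkage clustering over a precomputed matrix of segment-to-segment dissimilarities can be sketched as follows, assuming SciPy. The distance matrix, threshold and function names are placeholders; how the pairwise dissimilarities are scored is system-specific and not taken from the paper.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def complete_linkage_cluster(dist: np.ndarray, threshold: float) -> np.ndarray:
    """Group speech segments by speaker with complete-linkage clustering.

    `dist` is a symmetric (n_segments x n_segments) matrix of pairwise
    dissimilarities between segment speaker models. No model merging or
    retraining is performed: the hierarchy is built purely from the
    precomputed distances, which keeps the cost low for large archives.
    """
    condensed = squareform(dist, checks=False)      # condensed form expected by linkage
    tree = linkage(condensed, method='complete')    # merge on the *largest* in-cluster distance
    return fcluster(tree, t=threshold, criterion='distance')

# labels = complete_linkage_cluster(dist, threshold=0.5)   # threshold tuned on development data
```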

Relevance: 30.00%

Abstract:

The use of Wireless Sensor Networks (WSNs) for Structural Health Monitoring (SHM) has become a promising approach due to advantages such as low cost and fast, flexible deployment. However, inherent technical issues such as data synchronization error and data loss have prevented these systems from being used extensively. Recently, several SHM-oriented WSNs have been proposed and are believed to overcome a large number of these technical uncertainties. Nevertheless, there is limited research verifying the applicability of those WSNs to demanding SHM applications such as modal analysis and damage identification. This paper first presents a brief review of the most inherent uncertainties of SHM-oriented WSN platforms and then investigates their effects on the outcomes and performance of the most robust Output-only Modal Analysis (OMA) techniques when employing merged data from multiple tests. The two OMA families selected for this investigation are Frequency Domain Decomposition (FDD) and Data-driven Stochastic Subspace Identification (SSI-data), as both have been widely applied in the past decade. Experimental accelerations collected by a wired sensory system on a large-scale laboratory bridge model are used as clean data before being contaminated by different data pollutants in a sequential manner to simulate practical SHM-oriented WSN uncertainties. The results of this study show the robustness of FDD and the precautions needed for the SSI-data family when dealing with SHM-WSN uncertainties. Finally, the use of measurement channel projection for the time-domain OMA techniques and a preferred combination of the OMA techniques to cope with the SHM-WSN uncertainties are recommended.
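As a rough illustration of the FDD technique referred to above (not the authors' code), the first singular value of the cross-spectral density matrix can be computed per frequency line as sketched below; peaks in that spectrum indicate natural frequencies. The channel layout, window length and function names are assumptions.

```python
import numpy as np
from scipy.signal import csd

def fdd_first_singular_value(acc: np.ndarray, fs: float, nperseg: int = 1024):
    """Frequency Domain Decomposition on an (n_channels, n_samples) record.

    Builds the cross-spectral density (CSD) matrix at every frequency line
    and returns the first singular value spectrum; the corresponding first
    singular vectors approximate the mode shapes.
    """
    n_ch = acc.shape[0]
    f, _ = csd(acc[0], acc[0], fs=fs, nperseg=nperseg)
    G = np.zeros((len(f), n_ch, n_ch), dtype=complex)   # CSD matrix per frequency line
    for i in range(n_ch):
        for j in range(n_ch):
            _, G[:, i, j] = csd(acc[i], acc[j], fs=fs, nperseg=nperseg)
    s1 = np.array([np.linalg.svd(G[k], compute_uv=False)[0] for k in range(len(f))])
    return f, s1
```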

Relevance: 30.00%

Abstract:

Analysis of fossils from cave deposits at Mount Etna (eastern-central Queensland) has established that a species-rich rainforest palaeoenvironment existed in that area during the middle Pleistocene. This unexpected finding has implications for several fields (e.g., biogeography/phylogeography of rainforest-adapted taxa, and the impact of climate change on rainforest communities), but it was unknown whether the Mount Etna sites represented a small refugial patch of rainforest or whether rainforest was more widespread. In this study, numerous bone deposits in caves in north-east Queensland are analysed to reconstruct the environmental history of the area during the late Quaternary. Study sites are in the Chillagoe/Mitchell-Palmer and Broken River/Christmas Creek areas. The cave fossil records in these study areas are compared with dated (middle Pleistocene-Holocene) cave sites in the Mount Etna area. Substantial taxonomic work on the Mount Etna faunas (particularly dasyurid marsupials and murine rodents) is also presented as a prerequisite for meaningful comparison with the study sites further north. Middle Pleistocene sites at Mount Etna contain species indicative of a rainforest palaeoenvironment. Small mammal assemblages in the Mount Etna rainforest sites (>500-280 ka) are unexpectedly diverse and composed almost entirely of new species. Included in the rainforest assemblages are lineages with no extant representatives in rainforest (e.g., Leggadina), one genus previously known only from New Guinea (Abeomelomys), and forms that appear to bridge gaps between related but morphologically divergent extant taxa ('B-rat' and 'Pseudomys C'). Curiously, some taxa (e.g., Melomys spp.) are notable for their absence from the Mount Etna rainforest sites. After 280 ka, the rainforest faunas are replaced by species adapted to open, dry habitats. At that time, the extinct 'rainforest' dasyurids and rodents are replaced by species that are either extant or recently extant. By the late Pleistocene all 'rainforest' and several 'dry' taxa are locally or completely extinct, and the small mammal fauna resembles that found in the area today. The faunal/environmental changes recorded in the Mount Etna sites were interpreted by previous workers as the result of shifts in climate during the Pleistocene. Many samples from caves in the Chillagoe/Mitchell-Palmer and Broken River/Christmas Creek areas are held in the Queensland Museum's collection. These, supplemented with additional samples collected in the field as well as samples supplied by other workers, were systematically and palaeoecologically analysed for the first time. Palaeoecological interpretation of the faunal assemblages in the sites suggests that they encompass a similar array of palaeoenvironments to the Mount Etna sites. 'Rainforest' sites at the Broken River are here interpreted as being of similar age to those at Mount Etna, suggesting the possibility of extensive rainforest coverage in eastern tropical Queensland during part of the Pleistocene. Likewise, faunas suggesting open, dry palaeoenvironments are found at Chillagoe, the Broken River and Mount Etna, and may be of similar age. The 'dry' faunal assemblage at Mount Etna (Elephant Hole Cave) dates to 205-170 ka. Dating of one of the Chillagoe sites (QML1067) produced a maximum age for the deposit of approximately 200 ka, and the site is interpreted as being close to that age, supporting the interpretation of roughly contemporaneous deposition at Mount Etna and Chillagoe.
Finally, study sites interpreted as being of late Pleistocene-Holocene age show faunal similarities to sites of that age near Mount Etna. This study has several important implications for the biogeography and phylogeography of murine rodents, and represents a major advance in the study of the Australian murine fossil record. Likewise, the survey of the northern study areas is the first systematic analysis of multiple sites in those areas, and is thus a major contribution to knowledge of tropical Australian faunas during the Quaternary. This analysis suggests that climatic changes during the Pleistocene affected a large area of eastern tropical Queensland in similar ways. Further fieldwork and dating are required to properly analyse the geographical extent and timing of faunal change in eastern tropical Queensland.

Relevance: 30.00%

Abstract:

To date, the formation of deposits on heat exchanger surfaces is the least understood problem in the design of heat exchangers for processing industries. Dr East has related the structure of the deposits to solution composition and has developed predictive models for composite fouling of calcium oxalate and silica in sugar factory evaporators.

Relevance: 30.00%

Abstract:

Global Navigation Satellite Systems (GNSS)-based observation systems can provide high-precision positioning and navigation solutions in real time, of the order of a subcentimetre, if carrier phase measurements are used in the differential mode and all the bias and noise terms are dealt with well. However, these carrier phase measurements are ambiguous due to unknown, integer numbers of cycles. One key challenge in the differential carrier phase mode is to fix the integer ambiguities correctly. On the other hand, in safety-of-life or liability-critical applications, such as vehicle safety positioning and aviation, not only is high accuracy required, but the reliability requirement is also important. This PhD research studies how to achieve high reliability for ambiguity resolution (AR) in a multi-GNSS environment. GNSS ambiguity estimation and validation problems are the focus of the research effort. In particular, we study the case of multiple constellations, covering the initial to full operation of the foreseeable Galileo, GLONASS, Compass and QZSS navigation systems from the next few years to the end of the decade. Since real observation data are only available from the GPS and GLONASS systems, a simulation method named Virtual Galileo Constellation (VGC) is applied to generate observational data from another constellation in the data analysis. In addition, both full ambiguity resolution (FAR) and partial ambiguity resolution (PAR) algorithms are used in processing single- and dual-constellation data. Firstly, a brief overview of related work on AR methods and reliability theory is given. Next, a modified inverse integer Cholesky decorrelation method and its performance on AR are presented. Subsequently, a new measure of decorrelation performance called the orthogonality defect is introduced and compared with other measures. Furthermore, a new AR scheme considering the ambiguity validation requirement in the control of the search space size is proposed to improve the search efficiency. With respect to the reliability of AR, we also discuss the computation of the ambiguity success rate (ASR) and confirm that the success rate computed with the integer bootstrapping method is quite a sharp approximation to the actual integer least-squares (ILS) success rate. The advantages of multi-GNSS constellations are examined in terms of the PAR technique involving the predefined ASR. Finally, a novel satellite selection algorithm for reliable ambiguity resolution, called SARA, is developed. In summary, the study demonstrates that when the ASR is close to one, the reliability of AR can be guaranteed and the ambiguity validation is effective. The work then focuses on new strategies to improve the ASR, including a partial ambiguity resolution procedure with a predefined success rate and a novel satellite selection strategy with a high success rate. The proposed strategies bring the significant benefits of multi-GNSS signals to real-time high-precision and high-reliability positioning services.
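For context on the ambiguity success rate (ASR) discussed above, the widely used integer-bootstrapping lower bound can be computed from the covariance matrix of the decorrelated float ambiguities as sketched below. This is a generic illustration of the standard formula, not the thesis' implementation, and the function name is a placeholder.

```python
import numpy as np
from scipy.stats import norm

def bootstrapped_success_rate(Qz: np.ndarray) -> float:
    """Integer-bootstrapping ambiguity success rate.

    Qz is the variance-covariance matrix of the decorrelated float
    ambiguities. The sequential conditional standard deviations are taken
    from its triangular factorisation, and

        P = prod_i ( 2 * Phi( 1 / (2 * sigma_{i|I}) ) - 1 ).
    """
    C = np.linalg.cholesky(Qz)        # Qz = C C^T, C lower triangular
    sigma = np.diag(C)                # conditional standard deviations sigma_{i|I}
    return float(np.prod(2.0 * norm.cdf(1.0 / (2.0 * sigma)) - 1.0))
```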

Relevance: 30.00%

Abstract:

Organizations increasingly make use of social media to compete for customer awareness and to improve the quality of their goods and services. Multiple techniques of social media analysis are already in use. Nevertheless, theoretical underpinnings and a sound research agenda are still unavailable in this field. In order to contribute to setting up such an agenda, we introduce digital social signal processing (DSSP) as a new research stream in IS that requires multi-faceted investigation. Our DSSP concept is founded upon a set of four sequential activities: sensing digital social signals that are emitted by individuals on social media; decoding online social media data in order to reconstruct digital social signals; matching the signals with consumers' life events; and configuring individualized goods and service offerings tailored to the individual needs of customers. We further contribute by tying together loose ends of different research areas in order to frame DSSP as a field for further investigation. We conclude with developing a research agenda.

Relevance: 30.00%

Abstract:

CubIT is a multi-user, large-scale presentation and collaboration framework installed at the Queensland University of Technology's (QUT) Cube facility, an interactive facility made up of 48 multi-touch screens and very large projected display screens. CubIT was built to make the Cube facility accessible to QUT's academic and student population. The system allows users to upload, interact with and share media content on the Cube's very large display surfaces. CubIT implements a unique combination of features including RFID authentication, content management through multiple interfaces, multi-user shared workspace support, drag-and-drop upload and sharing, dynamic state control between different parts of the system, and execution and synchronisation of the system across multiple computing nodes.

Relevance: 30.00%

Abstract:

This project addresses the viability of lightweight, low-power-consumption, flexible, large-format LED screens. The investigation encompasses all aspects of the electrical and mechanical design, individually and as a system, and achieves a successful full-scale prototype. The prototype implements novel techniques to achieve large-displacement colour aliasing, a purely passive thermal management solution, a rapid deployment system, individual seven-bit LED current control with two-way display communication, auto-configuration and complete signal redundancy, all of which are in direct response to industry needs.

Relevance: 30.00%

Abstract:

Background: Chaperonin 10 (Cpn10) is a mitochondrial molecule involved in protein folding. The aim of this study was to determine the safety profile of Cpn10 in patients with multiple sclerosis (MS). Methods: A total of 50 patients with relapsing-remitting or secondary progressive MS were intravenously administered 5 mg or 10 mg of Cpn10 weekly for 12 weeks in a double-blind, randomized, placebo-controlled phase II trial. Clinical reviews, including the Expanded Disability Status Scale and magnetic resonance imaging (MRI) with gadolinium, were undertaken every 4 weeks. Stimulation of patient peripheral blood mononuclear cells with lipopolysaccharide ex vivo was used to measure the in vivo activity of Cpn10. Results: No significant differences in the frequency of adverse events were seen between the treatment and placebo arms. Leukocytes from both groups of Cpn10-treated patients produced significantly lower levels of critical proinflammatory cytokines. A trend toward improvement in new gadolinium-enhancing lesions on MRI was observed, but this difference was not statistically significant. No differences in clinical outcome measures were seen. Conclusions: Cpn10 is safe and well tolerated when administered to patients with MS for 3 months; however, a further, extended phase II study primarily focused on efficacy is warranted.

Relevance: 30.00%

Abstract:

A Distributed Wireless Smart Camera (DWSC) network is a special type of Wireless Sensor Network (WSN) that processes captured images in a distributed manner. While image processing on DWSCs has great potential for growth, with applications spanning a vast practical domain such as security surveillance and health care, it suffers from tremendous constraints. In addition to the limitations of conventional WSNs, image processing on DWSCs requires more computational power, bandwidth and energy, which presents significant challenges for large-scale deployments. This dissertation has developed a number of algorithms that are highly scalable, portable, energy efficient and performance efficient, with consideration of the practical constraints imposed by the hardware and the nature of WSNs. More specifically, these algorithms tackle the problems of multi-object tracking and localisation in distributed wireless smart camera networks and of optimal camera configuration determination. Addressing the first problem of multi-object tracking and localisation requires solving a large array of sub-problems. The sub-problems discussed in this dissertation are calibration of internal parameters, multi-camera calibration for localisation and object handover for tracking. These topics have been covered extensively in the computer vision literature; however, new algorithms must be invented to accommodate the various constraints introduced and required by the DWSC platform. A technique has been developed for the automatic calibration of low-cost cameras which are assumed to be restricted in their freedom of movement to either pan or tilt movements. Camera internal parameters, including focal length, principal point, lens distortion parameter and the angle and axis of rotation, can be recovered from a minimum set of two images from the camera, provided that the axis of rotation between the two images goes through the camera's optical centre and is parallel to either the vertical (panning) or horizontal (tilting) axis of the image. For object localisation, a novel approach has been developed for the calibration of a network of non-overlapping DWSCs in terms of their ground plane homographies, which can then be used for localising objects. In the proposed approach, a robot travels through the camera network while updating its position in a global coordinate frame, which it broadcasts to the cameras. The cameras use this, along with the image plane location of the robot, to compute a mapping from their image planes to the global coordinate frame. This is combined with an occupancy map generated by the robot during the mapping process to localise objects moving within the network. In addition, to deal with the problem of object handover between DWSCs with non-overlapping fields of view, a highly scalable, distributed protocol has been designed. Cameras that follow the proposed protocol transmit object descriptions to a selected set of neighbours that are determined using a predictive forwarding strategy. The received descriptions are then matched at the subsequent camera on the object's path using a probability maximisation process with locally generated descriptions. The second problem, camera placement, emerges naturally when these pervasive devices are put into real use. The locations, orientations, lens types etc. of the cameras must be chosen in a way that the utility of the network is maximised (e.g., maximum coverage) while user requirements are met.
To deal with this, a statistical formulation of the problem of determining optimal camera configurations has been introduced and a Trans-Dimensional Simulated Annealing (TDSA) algorithm has been proposed to effectively solve the problem.
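As a minimal illustration of the ground-plane homography localisation step described above (not the dissertation's code), an image-plane detection can be mapped into the global frame as follows. The matrix H, the function name and the parameters are placeholders.

```python
import numpy as np

def localise_on_ground_plane(H: np.ndarray, u: float, v: float):
    """Map an image-plane detection (u, v) to global ground-plane coordinates
    using the camera's 3x3 ground-plane homography H.

    H is assumed to have been estimated during the robot-assisted calibration
    phase, from image positions of the robot paired with the global positions
    it broadcasts.
    """
    p = H @ np.array([u, v, 1.0])
    return p[0] / p[2], p[1] / p[2]   # dehomogenise

# Estimating H itself requires at least four image/world correspondences,
# e.g. via a DLT solve or cv2.findHomography(img_pts, world_pts).
```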

Relevance: 30.00%

Abstract:

As of June 2009, 361 genome-wide association studies (GWAS) had been referenced by the HuGE database. GWAS require DNA from many thousands of individuals and rely on suitable DNA collections. We recently performed a multiple sclerosis (MS) GWAS in which a substantial proportion of the cases (24%) had DNA derived from saliva. Genotyping was done on the Illumina genotyping platform using the Infinium Hap370CNV DUO microarray. Additionally, we genotyped 10 individuals in duplicate using both saliva- and blood-derived DNA. The performance of blood- versus saliva-derived DNA was compared using the genotyping call rate, which reflects both the quantity and quality of genotyping per sample, and the “GCScore,” an Illumina genotyping quality score that is a measure of DNA quality. We also compared genotype calls and GCScores for the 10 sample pairs. Call rates were assessed for each sample individually. For the GWAS samples, we compared data according to the source of DNA and center of origin. We observed high concordance in genotyping quality and quantity between the paired samples and minimal loss of quality and quantity of DNA in the saliva samples in the large GWAS sample, with the blood samples showing greater variation between centers of origin. This large data set highlights the usefulness of saliva DNA for genotyping, especially in high-density single-nucleotide polymorphism microarray studies such as GWAS.
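As a generic illustration of the two quality measures mentioned above, a per-sample call rate and the concordance between a blood/saliva pair can be computed from coded genotype vectors as sketched below. The 0/1/2 genotype coding, the -1 missing code and the function names are assumptions for illustration, not details from the study.

```python
import numpy as np

def call_rate(genotypes: np.ndarray, missing: int = -1) -> float:
    """Fraction of SNPs successfully called for one sample
    (genotypes coded 0/1/2, missing calls coded as -1)."""
    return float(np.mean(genotypes != missing))

def concordance(g_blood: np.ndarray, g_saliva: np.ndarray, missing: int = -1) -> float:
    """Genotype concordance for a blood/saliva pair, computed only over
    SNPs called in both samples."""
    both_called = (g_blood != missing) & (g_saliva != missing)
    return float(np.mean(g_blood[both_called] == g_saliva[both_called]))
```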

Relevance: 30.00%

Abstract:

Non-periodic structural variation has been found in the high Tc cuprates, YBa2Cu3O7-x and Hg0.67Pb0.33Ba2Ca2Cu3O8+δ, by image analysis of high resolution transmission electron microscope (HRTEM) images. We use two methods for analysis of the HRTEM images. The first method is a means for measuring the bending of lattice fringes at twin planes. The second method is a low-pass filter technique which enhances information contained by diffuse-scattered electrons and reveals what appears to be an interference effect between domains of differing lattice parameter in the top and bottom of the thin foil. We believe that these methods of image analysis could be usefully applied to the many thousands of HRTEM images that have been collected by other workers in the high temperature superconductor field. This work provides direct structural evidence for phase separation in high Tc cuprates, and gives support to recent stripes models that have been proposed to explain various angle resolved photoelectron spectroscopy and nuclear magnetic resonance data. We believe that the structural variation is a response to an opening of an electronic solubility gap where holes are not uniformly distributed in the material but are confined to metallic stripes. Optimum doping may occur as a consequence of the diffuse boundaries between stripes which arise from spinodal decomposition. Theoretical ideas about the high Tc cuprates which treat the cuprates as homogeneous may need to be modified in order to take account of this type of structural variation.
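The abstract does not give the filter parameters, but a generic Fourier-space low-pass filter of the kind described (suppressing the strong lattice frequencies to emphasise the slowly varying, diffusely scattered contrast) can be sketched as follows; the cutoff value and function name are illustrative.

```python
import numpy as np

def lowpass_filter(image: np.ndarray, cutoff: float) -> np.ndarray:
    """Low-pass filter an image in Fourier space, keeping only spatial
    frequencies below `cutoff` (in cycles/pixel)."""
    F = np.fft.fftshift(np.fft.fft2(image))
    ny, nx = image.shape
    fy = np.fft.fftshift(np.fft.fftfreq(ny))[:, None]
    fx = np.fft.fftshift(np.fft.fftfreq(nx))[None, :]
    mask = (fx**2 + fy**2) <= cutoff**2              # circular pass-band
    return np.real(np.fft.ifft2(np.fft.ifftshift(F * mask)))

# e.g. filtered = lowpass_filter(hrtem_image, cutoff=0.05)
```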