77 results for decentralised data fusion framework
Abstract:
There is an implicit assumption in the UK Treasury’s publications on public-private partnerships (PPP), more commonly known in the United Kingdom as the private finance initiative (PFI), that accountability and value for money (VFM) are related concepts. While recent academic studies on PPP/PFI (hereafter PFI) have focused on VFM, there is a notable absence of studies exploring the ‘presumed’ relationships between accountability and VFM. Drawing on Dubnick’s (Dubnick and Romzek, 1991, 1993; Dubnick, 1996, 1998, 2003, 2005; Dubnick and Justice, 2002) framework for accountability and the PFI literature, we develop a research framework for exploring potential relationships between accountability and VFM in PFI projects by proposing alternative accountability cultures, processes and mechanisms for PFI. The PFI accountability model is then exposed to four criteria: warrantability, tractability, measurability and feasibility. Our preliminary interviews provide guidance in identifying some of the cultures, processes and mechanisms indicated in our model, which should enable future researchers to test not only the UK Government’s claimed relationships between accountability and VFM using more specific PFI empirical data, but also a potential relationship between accountability and performance in general.
Abstract:
The foundational concept of Network Enabled Capability relies on effective, timely information sharing. This information is used in analysis, trade and scenario studies, and ultimately decision-making. In this paper, the concept of visual analytics is explored as an enabler to facilitate rapid, defensible, and superior decision-making. By coupling analytical reasoning with the exceptional human capability to rapidly internalize and understand visual data, visual analytics allows individual and collaborative decision-making to occur in the face of vast and disparate data, time pressures, and uncertainty. An example visual analytics framework is presented in the form of a decision-making environment centered on the Lockheed C-5A and C-5M aircraft. This environment allows rapid trade studies to be conducted on design, logistics, and capability within the aircraft's operational roles. Through this example, the use of a visual analytics decision-making environment within a military environment is demonstrated.
Abstract:
The Water Framework Directive (WFD) has initiated a shift towards a targeted approach to implementation through its focus on river basin districts as management units and the natural ecological characteristics of waterbodies. Due to its role in eutrophication, phosphorus (P) has received considerable attention, resulting in a significant body of research, which now forms the evidence base for the programmes of measures (POMs) adopted in WFD River Basin Management Plans (RBMP). Targeting POMs at critical source areas (CSAs) of P could significantly improve the environmental efficiency and cost effectiveness of proposed mitigation strategies. This paper summarises the progress made towards targeting mitigation measures at CSAs in Irish catchments. A review of current research highlights that knowledge related to P export at field scale is relatively comprehensive; however, the availability of site-specific data and tools limits widespread identification of CSAs at this scale. Increasing complexity of hydrological processes at larger scales limits accurate identification of CSAs at catchment scale. Implementation of a tiered approach, using catchment scale tools in conjunction with field-by-field surveys, could decrease uncertainty and provide a more practical and cost effective method of delineating CSAs in a range of catchments. Despite scientific and practical uncertainties, development of a tiered CSA-based approach to assist in the development of supplementary measures would provide a means of developing catchment-specific and cost-effective programmes of measures for diffuse P. The paper presents a conceptual framework for such an approach, which would have particular relevance for the development of supplementary measures in High Status Waterbodies (HSW). The cost and resources necessary for implementation are justified based on HSWs’ value as undisturbed reference condition ecosystems.
Abstract:
The finite element method plays an extremely important role in forging process design as it provides a valid means to quantify forging errors and thereby govern die shape modification to improve the dimensional accuracy of the component. However, this dependency on process simulation could raise significant problems and present a major drawback if the finite element simulation results were inaccurate. This paper presents a novel approach to assess the dimensional accuracy and shape quality of aeroengine blades formed from finite element hot-forging simulation. The proposed virtual inspection system uses conventional algorithms adopted by modern coordinate measurement processes as well as the latest free-form surface evaluation techniques to provide a robust framework for virtual forging error assessment. Established techniques for the physical registration of real components have been adapted to localise virtual models in relation to a nominal Design Coordinate System. Blades are then automatically analysed using a series of intelligent routines to generate measurement data and compute dimensional errors. The results of a comparison study indicate that the virtual inspection results and actual coordinate measurement data are highly comparable, validating the approach as an effective and accurate means to quantify forging error in a virtual environment. Consequently, this provides adequate justification for the implementation of the virtual inspection system in the virtual process design, modelling and validation of forged aeroengine blades in industry.
Abstract:
In this paper we present a novel method for performing speaker recognition with very limited training data and in the presence of background noise. Similarity-based speaker recognition is considered so that speaker models can be created with limited training speech data. The proposed similarity is a form of cosine similarity used as a distance measure between speech feature vectors. Each speech frame is modelled using subband features, and into this framework, multicondition training and optimal feature selection are introduced, making the system capable of performing speaker recognition in the presence of realistic, time-varying noise, which is unknown during training. Speaker identification experiments were carried out using the SPIDRE database. The performance of the proposed new system for noise compensation is compared to that of an oracle model; the speaker identification accuracy for clean speech by the new system trained with limited training data is compared to that of a GMM trained with several minutes of speech. Both comparisons have demonstrated the effectiveness of the new model. Finally, experiments were carried out to test the new model for speaker identification given limited training data and with differing levels and types of realistic background noise. The results have demonstrated the robustness of the new system.
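The cosine-similarity distance described in this abstract can be sketched minimally as follows; the feature values are illustrative toy numbers, and the paper's actual subband features, optimal feature selection, and multicondition training are not reproduced here.

```python
import numpy as np

def cosine_similarity(x, y):
    """Cosine similarity between two feature vectors."""
    return float(np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y)))

def cosine_distance(x, y):
    """Distance form of the similarity, used to compare a test frame
    against a speaker model."""
    return 1.0 - cosine_similarity(x, y)

# Hypothetical subband feature vectors for one speech frame and one
# speaker model (placeholders, not real subband spectral values)
frame = np.array([0.2, 0.5, 0.1, 0.7])
model = np.array([0.25, 0.45, 0.15, 0.65])
distance = cosine_distance(frame, model)
```

A test frame would be assigned to the speaker model with the smallest such distance.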
Abstract:
We detail the calculations of North Sea Large Fish Indicator values for 2009-2011, demonstrating an apparent stall in recovery. Therefore, recovery to the Marine Strategy Framework Directive's good environmental status of 0.3 by the 2020 deadline now looks less certain and may take longer than was expected using data from 2006 to 2008.
Abstract:
Purpose: This paper investigates the link between two knowledge areas that have not been previously linked conceptually: stakeholder management and corporate culture. Focussing on the UK Construction Industry, the research study demonstrates the mutual dependency of each of these areas on the other and establishes a theoretical framework with real potential to impact positively upon industry.
Design/methodology/approach: The study utilises both qualitative and quantitative data collection and analysis to produce results contributing to the final framework. Semi-structured interviews were used and analysed through a cognitive mapping procedure. The result of this stage, set in the context of previous research, facilitated the development of a questionnaire, which helped gather quantitative values from a larger sample to enhance the final framework.
Findings: The data suggests that stakeholder management and corporate culture are key areas of an organisation’s success, and that this importance will only grow in future. A clearly identifiable relationship was established between the two theoretical areas and a framework developed and quantified.
Originality/value: It is evident that change is needed within the UK Construction Industry. Companies must employ ethical and social stakeholder management and manage their corporate culture like any other aspect of their business. Successfully doing this will lead to more successful projects, better reputation and survival. The findings of this project begin to show how change may occur and how companies might intentionally deploy advantageous configurations of corporate culture and stakeholder management.
Abstract:
Burkholderia cenocepacia is an opportunistic pathogen causing serious infections in patients with cystic fibrosis. The widespread distribution of this bacterium in the environment suggests that it must adapt to stress to be able to survive. We identified in B. cenocepacia K56-2 a gene predicted to encode RpoE, the extra-cytoplasmic stress response regulator. The rpoE gene is the first gene of a predicted operon encoding proteins homologous to RseA, RseB, MucD and a protein of unknown function. The genomic organization and the co-transcription of these genes were confirmed by PCR and RT-PCR. The mucD and rpoE genes were mutated, giving rise to B. cenocepacia RSF24 and RSF25, respectively. While mutant RSF24 did not demonstrate any growth defects under the conditions tested, RSF25 was compromised for growth under temperature (44 degrees C) and osmotic stress (426 mM NaCl). Expression of RpoE in trans could complement the osmotic growth defect but exacerbated temperature sensitivity in both RSF25 and wild-type K56-2. Inactivation of rpoE altered the bacterial cell surface, as indicated by increased binding of the fluorescent dye calcofluor white and by an altered outer-membrane protein profile. These cell surface changes were restored by complementation with a plasmid encoding rpoE. Macrophage infections in which bacterial colocalization with fluorescent dextran was examined demonstrated that the rpoE mutant could not delay the fusion of B. cenocepacia-containing vacuoles with lysosomes, in contrast to the parental strain K56-2. These data show that B. cenocepacia rpoE is required for bacterial growth under certain stress conditions and for the ability of intracellular bacteria to delay phagolysosomal fusion in macrophages.
Abstract:
Researchers and managers broadly agree that original equipment manufacturers (OEMs), which have opportunities to produce both new and remanufactured products, are better off by centrally controlling their manufacturing and remanufacturing activities. Thus, OEMs should not remanufacture used products until the remanufacturing cost is sufficiently low to overcome the negative impact of new product cannibalisation. In this paper, we present a contrasting view of the manufacturing–remanufacturing conflict: OEMs sometimes benefit from the decentralised control mode under which they ignore the internal cannibalisation rather than the remanufacturing option. We consider a decentralised closed-loop supply chain in which one OEM can purchase new components from one supplier to produce new products and collect used products from consumers to produce remanufactured products. The key feature of our model is that the OEM can select a centralised or decentralised control mode to manage its manufacturing and remanufacturing activities before the supplier prices the new component. In a steady state period setting, we analyse the players’ optimal decisions and compare the OEM's profits under centralised and decentralised control modes. Our analytic results reveal that the decentralised control within the OEM can outperform the centralised control when the cost structure of producing new and remanufactured products satisfies certain conditions. Finally, the key findings are distilled in a conceptual framework and its managerial implications are discussed.
Abstract:
The maintenance of biodiversity is a fundamental theme of the Marine Strategy Framework Directive. Appropriate indicators to monitor change in biodiversity, along with associated targets representing "good environmental status" (GES), are required to be in place by July 2012. A method for selecting species-specific metrics to fulfil various specified indicator roles is proposed for demersal fish communities. Available data frequently do not extend far enough back in time to allow GES to be defined empirically. In such situations, trends-based targets offer a pragmatic solution. A method is proposed for setting indicator-level targets for the number of species-specific metrics required to meet their trends-based metric-level targets. This is based on demonstrating significant departures from the binomial distribution. The procedure is trialled using North Sea demersal fish survey data. Although fisheries management in the North Sea has improved in recent decades, management goals to stop further decline in biodiversity, and to initiate recovery, are yet to be met.
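The indicator-level target described above rests on testing for significant departures from the binomial distribution. A minimal sketch of such a test follows; the metric count, the null probability of 0.5, and the significance threshold are illustrative assumptions, not values from the paper.

```python
from math import comb

def binomial_tail(n, k, p):
    """P(X >= k) for X ~ Binomial(n, p): the chance that at least k of
    n metrics meet their trends-based targets if each does so
    independently with probability p under the null."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical example: 20 species-specific metrics, each assumed to
# have a 0.5 chance of meeting its trend target under the null, with
# 16 metrics actually meeting theirs.
p_value = binomial_tail(20, 16, 0.5)
significant = p_value < 0.05  # departure from the binomial null
```

A significant result of this kind would indicate that more metrics are improving than chance alone would explain.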
Abstract:
This paper describes a data model for content representation of temporal media in an IP-based sensor network. The model is formed by introducing the idea of semantic roles from linguistics into the underlying concepts of formal event representation, with the aim of developing a common event model. The architecture of a prototype system for a multi-camera surveillance system, based on the proposed model, is described. The important aspects of the proposed model are its expressiveness, its ability to model the content of temporal media, and its suitability for use with a natural language interface. It also provides a platform for temporal information fusion, as well as organizing sensor annotations with the help of ontologies.
Abstract:
This paper presents a novel method of audio-visual feature-level fusion for person identification where both the speech and facial modalities may be corrupted, and there is a lack of prior knowledge about the corruption. Furthermore, we assume there is a limited amount of training data for each modality (e.g., a short training speech segment and a single training facial image for each person). A new multimodal feature representation and a modified cosine similarity are introduced to combine and compare bimodal features with limited training data, as well as vastly differing data rates and feature sizes. Optimal feature selection and multicondition training are used to reduce the mismatch between training and testing, thereby making the system robust to unknown bimodal corruption. Experiments have been carried out on a bimodal dataset created from the SPIDRE speaker recognition database and AR face recognition database with variable noise corruption of speech and occlusion in the face images. The system's speaker identification performance on the SPIDRE database, and facial identification performance on the AR database, is comparable with the literature. Combining both modalities using the new method of multimodal fusion leads to significantly improved accuracy over the unimodal systems, even when both modalities have been corrupted. The new method also shows improved identification accuracy compared with the bimodal systems based on multicondition model training or missing-feature decoding alone.
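A minimal sketch of feature-level fusion followed by a cosine comparison is given below. The per-modality normalisation and the weighting parameter `alpha` are illustrative assumptions to handle the differing feature sizes and scales; the paper's exact modified cosine similarity and feature representation are not reproduced.

```python
import numpy as np

def fuse_features(speech_feat, face_feat, alpha=0.5):
    """Normalise each modality's feature vector separately (to offset
    vastly different scales and dimensionalities), weight by an assumed
    factor alpha, and concatenate into a single bimodal vector."""
    s = speech_feat / np.linalg.norm(speech_feat)
    f = face_feat / np.linalg.norm(face_feat)
    return np.concatenate([np.sqrt(alpha) * s, np.sqrt(1 - alpha) * f])

def cosine_sim(a, b):
    """Cosine similarity between two fused bimodal vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical low-dimensional features: 2-D "speech", 3-D "face"
fused = fuse_features(np.array([1.0, 2.0]), np.array([3.0, 4.0, 5.0]))
```

Identification would then compare a fused test vector with each enrolled person's fused reference vector.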
Abstract:
The enzyme UDP-galactose 4'-epimerase (GALE) catalyses the reversible epimerisation of both UDP-galactose and UDP-N-acetyl-galactosamine. Deficiency of the human enzyme (hGALE) is associated with type III galactosemia. The majority of known mutations in hGALE are missense and private, thus making clinical guidance difficult. In this study a bioinformatics approach was employed to analyse the structural effects due to each mutation using both the UDP-glucose and UDP-N-acetylglucosamine bound structures of the wild-type protein. Changes to the enzyme's overall stability, substrate/cofactor binding and propensity to aggregate were also predicted. These predictions were found to be in good agreement with previous in vitro and in vivo studies where data were available, and allowed for the differentiation of those mutants that severely impair the enzyme's activity against UDP-galactose. Next, this combination of techniques was applied to another twenty-six reported variants from the NCBI dbSNP database that have yet to be studied, to predict their effects. This identified p.I14T, p.R184H and p.G302R as likely severely impairing mutations. Although severely impaired mutants were predicted to decrease the protein's stability, overall predicted stability changes only weakly correlated with residual activity against UDP-galactose. This suggests other protein functions such as changes in cofactor and substrate binding may also contribute to the mechanism of impairment. Finally, this investigation shows that this combination of different in silico approaches is useful in predicting the effects of mutations and that it could be the basis of an initial prediction of likely clinical severity when new hGALE mutants are discovered.
Abstract:
The advent of next generation sequencing technologies (NGS) has expanded the area of genomic research, offering high coverage and increased sensitivity over older microarray platforms. Although the current cost of next generation sequencing still exceeds that of microarray approaches, the rapid advances in NGS will likely make it the platform of choice for future research in differential gene expression. Connectivity mapping is a procedure for examining the connections among diseases, genes and drugs by differential gene expression, initially based on microarray technology, with which a large collection of compound-induced reference gene expression profiles has been accumulated. In this work, we aim to test the feasibility of incorporating NGS RNA-Seq data into the current connectivity mapping framework by utilizing the microarray based reference profiles and the construction of a differentially expressed gene signature from a NGS dataset. This would allow for the establishment of connections between the NGS gene signature and those microarray reference profiles, avoiding the associated cost of re-creating drug profiles with NGS technology. We examined the connectivity mapping approach on a publicly available NGS dataset with androgen stimulation of LNCaP cells in order to extract candidate compounds that could inhibit the proliferative phenotype of LNCaP cells and to elucidate their potential in a laboratory setting. In addition, we also analyzed an independent microarray dataset of similar experimental settings. We found a high level of concordance between the top compounds identified using the gene signatures from the two datasets. The nicotine derivative cotinine was returned as the top candidate among the overlapping compounds with potential to suppress this proliferative phenotype. Subsequent lab experiments validated this connectivity mapping hit, showing that cotinine inhibits cell proliferation in an androgen dependent manner.
Thus the results in this study suggest a promising prospect of integrating NGS data with connectivity mapping. © 2013 McArt et al.
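The core operation in connectivity mapping, scoring a differentially expressed gene signature against a ranked reference profile, can be illustrated with a toy example. The gene names, ranks, and scoring formula below are illustrative assumptions, not the actual statistic used by the framework in the paper.

```python
def connection_score(signature, reference_ranks):
    """Toy connection score between a gene signature and one reference
    expression profile.  signature maps gene -> +1 (up-regulated) or
    -1 (down-regulated); reference_ranks maps gene -> rank position
    (0 = most up-regulated in the reference profile)."""
    n = len(reference_ranks)
    total = 0.0
    for gene, direction in signature.items():
        if gene in reference_ranks:
            # map rank to [-1, +1]: top of the profile -> +1, bottom -> -1
            centred = 1.0 - 2.0 * reference_ranks[gene] / (n - 1)
            total += direction * centred
    return total / len(signature)

# Hypothetical four-gene reference profile and two-gene query signature
ranks = {"GENE_A": 0, "GENE_B": 1, "GENE_C": 2, "GENE_D": 3}
sig = {"GENE_A": +1, "GENE_D": -1}  # fully concordant with the profile
score = connection_score(sig, ranks)
```

Reference compounds are then ranked by such scores; a strong negative score against a disease signature suggests a compound that might reverse the phenotype.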
Abstract:
Hydrodynamic models are a powerful tool that can be used by a wide range of end users to assist in predicting the effects of both physical and biological processes on local environmental conditions. This paper describes the development of a tidal model for Strangford Lough, Northern Ireland, a body of water renowned as the location of the first grid-connected tidal turbine, SeaGen, as well as the UK’s third Marine Nature Reserve. Using MIKE 21 modelling software, the development, calibration and performance of the model are described in detail. Strangford Lough has a complex flow pattern with high flows through the Narrows (~3.5 m/s) linking the main body of the Lough to the Irish Sea and intricate flow patterns around the numerous islands. With the aid of good quality tidal and current data obtained throughout the Lough during the model development, the observed and modelled surface elevation and current magnitude were almost identical, with model skill >0.98 and >0.84 respectively. The applicability of the model is such that it can be used as an important tool for the prediction of important ecological processes as well as engineering applications within Strangford Lough.
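The reported model skill can be illustrated with Willmott's index of agreement, a statistic commonly used to validate hydrodynamic models against observations (1 = perfect agreement). Whether this is the exact formula behind the paper's skill values is an assumption, and the data below are placeholders.

```python
import numpy as np

def willmott_skill(observed, modelled):
    """Willmott index of agreement between observed and modelled
    series: 1 - sum of squared errors over a potential-error term."""
    observed = np.asarray(observed, dtype=float)
    modelled = np.asarray(modelled, dtype=float)
    obar = observed.mean()
    num = np.sum((modelled - observed) ** 2)
    den = np.sum((np.abs(modelled - obar) + np.abs(observed - obar)) ** 2)
    return 1.0 - num / den

# Hypothetical surface-elevation series (metres) at one tide gauge
observed = [1.0, 2.0, 3.0]
modelled = [1.1, 2.0, 2.9]
skill = willmott_skill(observed, modelled)
```

A skill close to 1, as reported for the Strangford Lough model, indicates near-perfect agreement between observations and simulation.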