21 results for Robust Performance

at QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast


Relevance: 70.00%

Abstract:

Background

Interaction of a drug or chemical with a biological system can result in a gene-expression profile or signature characteristic of the event. Using a suitably robust algorithm, these signatures can potentially be used to connect molecules with similar pharmacological or toxicological properties through their gene-expression profiles. Lamb et al. [Lamb et al. (2006), Science 313, 1929–1935] first proposed the Connectivity Map to make successful connections among small molecules, genes, and diseases using genomic signatures.

Results

Here we have built on the principles of the Connectivity Map to present a simpler and more robust method for constructing reference gene-expression profiles and for the connection scoring scheme, which, importantly, allows evaluation of the statistical significance of all the connections observed. We tested the new method with two randomly generated gene signatures and three experimentally derived gene signatures (for HDAC inhibitors, estrogens, and immunosuppressive drugs, respectively). Our testing indicates that the method achieves a higher level of specificity and sensitivity, and so advances the original method.

Conclusion

The method presented here not only offers more principled statistical procedures for testing connections but, more importantly, provides an effective safeguard against false connections while at the same time achieving increased sensitivity. With its robust performance, the method has potential use in the drug-development pipeline for the early recognition of pharmacological and toxicological properties in chemicals and new drug candidates, and also more broadly in other 'omics sciences.
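The scoring idea above can be sketched in a few lines of Python. This is a hypothetical simplification (a rank-based connection score plus a permutation p-value), not the authors' exact statistic; all names and the statistic itself are invented for illustration:

```python
import random

def connection_score(signature, ranked_genes):
    """Toy rank-based connection score: mean signed rank position of
    signature genes in a reference profile ranked from most up- to
    most down-regulated. Illustrative only, not the paper's statistic."""
    n = len(ranked_genes)
    terms = []
    for gene, direction in signature:  # direction is +1 (up) or -1 (down)
        r = ranked_genes.index(gene)          # 0 = most up-regulated
        centred = 1.0 - 2.0 * r / (n - 1)     # +1 at the top, -1 at the bottom
        terms.append(direction * centred)
    return sum(terms) / len(terms)

def permutation_p_value(signature, ranked_genes, n_perm=1000, seed=0):
    """Estimate significance by rescoring random signatures of the same
    size, as in a standard permutation test."""
    rng = random.Random(seed)
    observed = connection_score(signature, ranked_genes)
    dirs = [d for _, d in signature]
    count = 0
    for _ in range(n_perm):
        genes = rng.sample(ranked_genes, len(signature))
        if abs(connection_score(list(zip(genes, dirs)), ranked_genes)) >= abs(observed):
            count += 1
    return (count + 1) / (n_perm + 1)
```

A signature whose up-regulated genes sit at the top of the reference ranking and whose down-regulated genes sit at the bottom scores close to +1, and the permutation test quantifies how unlikely that is by chance.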

Relevance: 70.00%

Abstract:

The use of biosensors attached to the body for health monitoring is now readily accepted, and the merits of such systems and their potential impact on healthcare receive much attention. Wearable medical systems used in clinical applications to monitor vital signs must be comfortable to wear, yet offer robust performance to ensure reliable communications links. Additionally, and vital to the success of these innovations, such solutions must be disposable to avoid the risk of patient infection, which means they must be ultra-low cost. Antennas optimized for printing with conductive inks offer exciting new advances towards a truly disposable solution.

Relevance: 70.00%

Abstract:

A compact implantable printed meandered folded dipole antenna with a volume of 101.8 mm³ and robust performance is presented for operation in the 2.4 GHz medical ISM bands. The implant antenna is shown to maintain its return loss performance in the 2360–2400 MHz, 2400–2483.5 MHz and 2483.5–2500 MHz frequency bands, simulated in eleven different body tissue types with a broad range of electrical properties. Bandwidth and resonant frequency changes are reported for the same antenna implanted in high water content tissues such as muscle and skin as well as low water content tissues such as subcutaneous fat and bone. The antenna was also shown to maintain its return loss performance as it was moved towards a tissue boundary within a simulated phantom testbed.
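Return-loss performance of this kind is typically read off a simulated |S11| frequency sweep. The sketch below (with invented data and function names, not taken from the paper) extracts the contiguous band around the best match where |S11| stays at or below −10 dB, i.e. return loss of at least 10 dB:

```python
def matched_band(freqs_mhz, s11_db, threshold_db=-10.0):
    """Return (f_low, f_high) of the contiguous band around the deepest
    resonance where |S11| stays at or below the threshold.
    Simple post-processing for sweep data; illustrative only."""
    best = min(range(len(s11_db)), key=lambda i: s11_db[i])  # deepest dip
    lo = hi = best
    while lo > 0 and s11_db[lo - 1] <= threshold_db:
        lo -= 1
    while hi < len(s11_db) - 1 and s11_db[hi + 1] <= threshold_db:
        hi += 1
    return freqs_mhz[lo], freqs_mhz[hi]
```

Running this over sweeps simulated in different tissue types would show how the matched band shifts with tissue electrical properties, as the abstract describes.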

Relevance: 60.00%

Abstract:

The aim of this paper is to analyse the vulnerability and robustness of small and medium-sized enterprise (SME) supply chains and to consider two contextual factors that might influence the success of their disturbance management: a risky product and a risky business environment. Using an exploratory case study, it is shown how these contextual factors give rise to vulnerability sources, how they contribute to the robustness of a company's performance and to supply chain vulnerability, and how a company seeks to manage internal and external vulnerability sources. The exploratory case is based on the fresh food supply chain of a manufacturing SME operating in a developing market.
Case findings suggest that fresh food supply chains of manufacturing SMEs in developing markets are prone to disruptions of their logistics and production processes due to the 'riskiness' of fresh food products, the 'riskiness' of developing markets, and the 'riskiness' of SMEs themselves. However, this does not necessarily imply the vulnerability of an SME and its entire supply chain. Findings indicate that SMEs can be very successful in disturbance management through selective use of redesign strategies that aim to prevent or reduce the impact of disturbances. More precisely, an SME is likely to achieve robust performance by employing preventive redesign strategies to manage disturbances that result from internal, company-related vulnerability sources, while impact-reduction strategies are likely to contribute to the robust performance of an SME when used to manage disturbances that result from internal, supply-chain-related vulnerability sources as well as external vulnerability sources.

Relevance: 30.00%

Abstract:

Face recognition with unknown, partial distortion and occlusion is a practical problem, and has a wide range of applications, including security and multimedia information retrieval. The authors present a new approach to face recognition subject to unknown, partial distortion and occlusion. The new approach is based on a probabilistic decision-based neural network, enhanced by a statistical method called the posterior union model (PUM). PUM is an approach for ignoring severely mismatched local features and focusing the recognition mainly on the reliable local features. It thereby improves the robustness while assuming no prior information about the corruption. We call the new approach the posterior union decision-based neural network (PUDBNN). The new PUDBNN model has been evaluated on three face image databases (XM2VTS, AT&T and AR) using testing images subjected to various types of simulated and realistic partial distortion and occlusion. The new system has been compared to other approaches and has demonstrated improved performance.
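The core idea of ignoring severely mismatched local features can be sketched as follows. This is a loose, hypothetical simplification of the posterior union idea (keep only the best-matching fraction of per-region scores), not the published PUM or the PUDBNN model:

```python
def pum_like_score(local_log_posteriors, keep=0.6):
    """Union-style scoring sketch: sort the per-region log posteriors
    for one identity and average only the best-matching fraction, so
    severely mismatched (e.g. occluded) regions are ignored without
    prior knowledge of where the corruption is."""
    k = max(1, int(len(local_log_posteriors) * keep))
    best = sorted(local_log_posteriors, reverse=True)[:k]
    return sum(best) / k

def recognise_face(scores_by_identity, keep=0.6):
    """Pick the identity whose reliable local regions match best."""
    return max(scores_by_identity,
               key=lambda name: pum_like_score(scores_by_identity[name], keep))
```

With one heavily occluded region, the occluded region's poor score is simply dropped rather than dragging down the otherwise good match.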

Relevance: 30.00%

Abstract:

Background: Co-localisation is a widely used measurement in immunohistochemical analysis to determine whether fluorescently labelled biological entities, such as cells, proteins or molecules, share the same location. However, the measurement of co-localisation is challenging due to the complex nature of such fluorescent images, especially when multiple focal planes are captured. The current state-of-the-art co-localisation measurements of 3-dimensional (3D) image stacks are biased by noise and cross-overs from non-consecutive planes.

Method: In this study, we have developed Co-localisation Intensity Coefficients (CICs) and Co-localisation Binary Coefficients (CBCs), which use rich z-stack data from neighbouring focal planes to identify similarities between the image intensities of two, and potentially more, fluorescently labelled biological entities. The method was developed using z-stack images from murine organotypic slice cultures of central nervous system tissue, together with two sets of pseudo-data. A large number of non-specific cross-over situations are excluded by this method. The proposed method is also shown to be robust in recognising co-localisations even when images are polluted with a range of noise types.

Results: The proposed CBCs and CICs produce robust co-localisation measurements that are easy to interpret, resilient to noise and capable of removing a large number of false positives, such as non-specific cross-overs. The method's performance is significantly more accurate than existing measurements, as determined statistically using pseudo-datasets of known values. This method provides an important and reliable tool for fluorescent 3D neurobiological studies, and will benefit other biological studies that measure fluorescence co-localisation in 3D.
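A toy illustration of why neighbouring focal planes help: averaging each voxel with its counterparts in the adjacent z-planes damps single-plane cross-over spikes before an intensity correlation is computed. This sketch is illustrative only; it is not the published CIC or CBC formulae:

```python
def z_smoothed(stack, z, y, x):
    """Average a voxel with the same (y, x) position in adjacent focal
    planes: signal present across consecutive planes survives, while
    single-plane cross-over spikes are damped."""
    vals = [stack[z][y][x]]
    if z > 0:
        vals.append(stack[z - 1][y][x])
    if z < len(stack) - 1:
        vals.append(stack[z + 1][y][x])
    return sum(vals) / len(vals)

def coloc_intensity_coefficient(red, green):
    """Toy co-localisation coefficient: normalised correlation of the
    z-smoothed intensities of two channels (shape: [z][y][x])."""
    rs, gs = [], []
    for z in range(len(red)):
        for y in range(len(red[0])):
            for x in range(len(red[0][0])):
                rs.append(z_smoothed(red, z, y, x))
                gs.append(z_smoothed(green, z, y, x))
    n = len(rs)
    mr, mg = sum(rs) / n, sum(gs) / n
    num = sum((r - mr) * (g - mg) for r, g in zip(rs, gs))
    den = (sum((r - mr) ** 2 for r in rs) *
           sum((g - mg) ** 2 for g in gs)) ** 0.5
    return num / den if den else 0.0
```

Two perfectly co-localised channels score 1.0; uncorrelated or single-plane-only overlap scores lower.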

Relevance: 30.00%

Abstract:

The performance of the surface zone of concrete is acknowledged as a major factor governing the rate of deterioration of reinforced concrete structures, as it provides the only barrier to the ingress of water containing dissolved ionic species, such as chlorides, which ultimately initiate corrosion of the reinforcement. In-situ monitoring of cover-zone concrete is therefore critical in attempting to make realistic predictions of the in-service performance of the structure. To this end, this paper presents developments in a remote interrogation system to allow continuous, real-time monitoring of cover-zone concrete from an office setting. Use is made of a multi-electrode array embedded within cover-zone concrete to acquire discretized electrical resistivity and temperature measurements, with both parameters monitored spatially and temporally. On-site instrumentation, which allows remote interrogation of concrete samples placed at a marine exposure site, is detailed, together with data handling and processing procedures. Site measurements highlight the influence of temperature on electrical resistivity, and an Arrhenius-based temperature correction protocol is developed using on-site measurements to standardize resistivity data to a reference temperature; this is an advancement over the use of laboratory-based procedures. The testing methodology and interrogation system represent a robust, low-cost and high-value technique which could be deployed for intelligent monitoring of reinforced concrete structures.
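An Arrhenius-based standardisation of resistivity to a reference temperature can be sketched as follows. The activation-energy value below is a placeholder: in the paper it is fitted from the on-site measurements themselves, which is precisely the advance over laboratory-based values:

```python
import math

R_GAS = 8.314  # J/(mol*K), universal gas constant

def correct_resistivity(rho_measured, temp_c, temp_ref_c=20.0, ea=20000.0):
    """Standardise a resistivity reading to a reference temperature via
    an Arrhenius relation: rho_ref = rho * exp((Ea/R) * (1/T_ref - 1/T)).
    `ea` (activation energy, J/mol) is a placeholder value here; in
    practice it would be fitted from site data."""
    t = temp_c + 273.15          # measurement temperature, K
    t_ref = temp_ref_c + 273.15  # reference temperature, K
    return rho_measured * math.exp((ea / R_GAS) * (1.0 / t_ref - 1.0 / t))
```

Because conduction increases with temperature, a reading taken on a warm day is corrected upwards when standardised to a cooler reference temperature.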

Relevance: 30.00%

Abstract:

Handling appearance variations is a very challenging problem in visual tracking. Existing methods usually address it with an appearance model that has two properties: (1) it can discriminate the tracked target from its background, and (2) it is robust to the target's appearance variations during tracking. Instead of integrating the two requirements into one appearance model, in this paper we propose a tracking method that deals with these problems separately, based on sparse representation in a particle filter framework. Each target candidate defined by a particle is linearly represented by the target and background templates with an additive representation error. Discriminating the target from its background is achieved by activating the target templates or the background templates in the linear system in a competitive manner. The target's appearance variations are directly modeled as the representation error. An online algorithm is used to learn the basis functions that sparsely span the representation error. The linear system is solved via ℓ1 minimization. The candidate with the smallest reconstruction error using the target templates is selected as the tracking result. We test the proposed approach on four sequences with heavy occlusions, large pose variations, drastic illumination changes and low foreground-background contrast, and it shows excellent performance in comparison with two recent state-of-the-art trackers.
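The selection rule described above (solve an ℓ1-regularised linear system over target and background templates, then score each candidate by its reconstruction error using only the target templates) can be sketched at toy scale. The ISTA solver and the tiny dimensions below are illustrative, not the paper's implementation:

```python
def ista_lasso(A, y, lam=0.05, step=0.1, iters=500):
    """Minimise ||y - A c||^2 + lam * ||c||_1 by iterative
    shrinkage-thresholding (ISTA). A is a list of rows."""
    m, n = len(A), len(A[0])
    c = [0.0] * n
    for _ in range(iters):
        resid = [sum(A[i][j] * c[j] for j in range(n)) - y[i] for i in range(m)]
        grad = [2.0 * sum(A[i][j] * resid[i] for i in range(m)) for j in range(n)]
        for j in range(n):
            v = c[j] - step * grad[j]
            # soft-thresholding step enforces sparsity
            c[j] = max(abs(v) - step * lam, 0.0) * (1.0 if v >= 0 else -1.0)
    return c

def candidate_error(candidate, target_templates, background_templates):
    """Reconstruction error of a candidate using ONLY the target
    templates' coefficients, mirroring the selection rule above.
    Toy-scale sketch; real trackers run inside a particle filter."""
    templates = target_templates + background_templates
    A = [[t[i] for t in templates] for i in range(len(candidate))]  # columns = templates
    c = ista_lasso(A, candidate)
    k = len(target_templates)
    recon = [sum(A[i][j] * c[j] for j in range(k)) for i in range(len(candidate))]
    return sum((a - b) ** 2 for a, b in zip(candidate, recon))
```

A target-like candidate activates the target templates and reconstructs well from them; a background-like candidate activates the background templates instead and scores a large target-only error.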

Relevance: 30.00%

Abstract:

High effectiveness and leanness of modern supply chains (SCs) increase their vulnerability, i.e. susceptibility to disturbances reflected in non-robust SC performances. Both the SC management literature and SC professionals indicate the need for the development of SC vulnerability assessment tools. In this article, a new method for vulnerability assessment, the VULA method, is presented. The VULA method helps to identify how much a company would underperform on a specific Key Performance Indicator in the case of a disturbance, how often this would happen and how long it would last. It ultimately informs the decision about whether process redesign is appropriate and what kind of redesign strategies should be used in order to increase the SC's robustness. The applicability of the VULA method is demonstrated in the context of a meat SC using discrete-event simulation to conduct the performance analysis.
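The three questions the VULA method answers (how much a KPI underperforms, how often, and how long) can be illustrated on a KPI time series. The metric definitions below are invented for illustration; the paper's metrics are richer:

```python
def vula_summary(kpi_series, target, tolerance=0.0):
    """Toy VULA-style summary of a KPI time series: how much the KPI
    underperforms (mean shortfall), how often (fraction of periods
    below target - tolerance), and how long (longest consecutive run
    of underperformance). Illustrative metric definitions only."""
    floor = target - tolerance
    shortfalls = [floor - v for v in kpi_series if v < floor]
    how_often = len(shortfalls) / len(kpi_series)
    how_much = sum(shortfalls) / len(shortfalls) if shortfalls else 0.0
    longest = run = 0
    for v in kpi_series:
        run = run + 1 if v < floor else 0
        longest = max(longest, run)
    return {"how_much": how_much, "how_often": how_often, "how_long": longest}
```

Feeding in KPI traces from a discrete-event simulation of disturbance scenarios, as the article does for a meat SC, would then inform whether a redesign is warranted.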

Relevance: 30.00%

Abstract:

This paper presents the maximum weighted stream posterior (MWSP) model as a robust and efficient stream-integration method for audio-visual speech recognition in environments where the audio or video streams may be subjected to unknown and time-varying corruption. A significant advantage of MWSP is that it does not require any specific measurements of the signal in either stream to calculate appropriate stream weights during recognition, and as such it is modality-independent. This also means that MWSP complements, and can be used alongside, many of the other approaches that have been proposed in the literature for this problem. For evaluation we used the large XM2VTS database for speaker-independent audio-visual speech recognition. The extensive tests include both clean and corrupted utterances, with corruption added to the video and/or audio streams using a variety of types (e.g., MPEG-4 video compression) and levels of noise. The experiments show that this approach gives excellent performance in comparison with another well-known dynamic stream-weighting approach, and with any fixed-weighted integration approach, both in clean conditions and when noise is added to either stream. Furthermore, our experiments show that the MWSP approach dynamically selects suitable integration weights on a frame-by-frame basis according to the level of noise in the streams, and according to the naturally fluctuating relative reliability of the modalities even in clean conditions. The MWSP approach is shown to maintain robust recognition performance in all tested conditions, while requiring no prior knowledge about the type or level of noise.
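The frame-by-frame, measurement-free weight selection can be illustrated as follows: for each candidate stream weight, combine the two streams' per-class log-likelihoods and keep the weight that maximises the best class's posterior. This is a bare-bones hypothetical sketch of the idea, not the exact MWSP formulation:

```python
import math

def mwsp_frame(audio_loglik, video_loglik,
               weights=(0.0, 0.25, 0.5, 0.75, 1.0)):
    """For one frame, choose the stream weight maximising the top
    class's posterior under score_c = w * audio_c + (1 - w) * video_c.
    Returns (chosen_weight, winning_class_index). A corrupted stream
    yields flat likelihoods, so its weight is driven towards zero
    without any explicit noise measurement."""
    best = None
    for w in weights:
        scores = [w * a + (1.0 - w) * v
                  for a, v in zip(audio_loglik, video_loglik)]
        m = max(scores)
        posteriors = [math.exp(s - m) for s in scores]  # softmax, stabilised
        top = max(posteriors) / sum(posteriors)
        if best is None or top > best[0]:
            best = (top, w, scores.index(max(scores)))
    _, weight, winner = best
    return weight, winner
```

When the audio likelihoods are flat (uninformative) and the video stream clearly favours one class, the selected weight collapses onto the video stream, mirroring the dynamic behaviour described above.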

Relevance: 30.00%

Abstract:

The operation of supply chains (SCs) has for many years been focused on efficiency, leanness and responsiveness. This has resulted in reduced slack in operations, compressed cycle times, increased productivity and minimised inventory levels along the SC. Combined with tight tolerance settings for the realisation of logistics and production processes, this has led to SC performances that are frequently not robust. SCs are becoming increasingly vulnerable to disturbances, which can decrease the competitive power of the entire chain in the market. Moreover, in the case of food SCs non-robust performances may ultimately result in empty shelves in grocery stores and supermarkets.
The overall objective of this research is to contribute to Supply Chain Management (SCM) theory by developing a structured approach to assess SC vulnerability, so that robust performances of food SCs can be assured. We also aim to help companies in the food industry to evaluate their current state of vulnerability, and to improve their performance robustness through a better understanding of vulnerability issues. The following research questions (RQs) stem from these objectives:
RQ1: What are the main research challenges related to (food) SC robustness?
RQ2: What are the main elements that have to be considered in the design of robust SCs and what are the relationships between these elements?
RQ3: What is the relationship between the contextual factors of food SCs and the use of disturbance management principles?
RQ4: How to systematically assess the impact of disturbances in (food) SC processes on the robustness of (food) SC performances?
To answer these RQs we used different methodologies, both qualitative and quantitative. For each question, we conducted a literature survey to identify gaps in existing research and define the state of the art of knowledge on the related topics. For the second and third RQ, we conducted both exploration and testing on selected case studies. Finally, to obtain more detailed answers to the fourth question, we used simulation modelling and scenario analysis for vulnerability assessment.
Main findings are summarised as follows.
Based on an extensive literature review, we answered RQ1. The main research challenges were related to the need to define SC robustness more precisely, to identify and classify disturbances and their causes in the context of the specific characteristics of SCs and to make a systematic overview of (re)design strategies that may improve SC robustness. Also, we found that it is useful to be able to discriminate between varying degrees of SC vulnerability and to find a measure that quantifies the extent to which a company or SC shows robust performances when exposed to disturbances.
To address RQ2, we define SC robustness as the degree to which a SC shows an acceptable performance in (each of) its Key Performance Indicators (KPIs) during and after an unexpected event that caused a disturbance in one or more logistics processes. Based on the SCM literature we identified the main elements needed to achieve robust performances and structured them together to form a conceptual framework for the design of robust SCs. We then explained the logic of the framework and elaborated on each of its main elements: the SC scenario, SC disturbances, SC performance, sources of food SC vulnerability, and redesign principles and strategies.
Based on three case studies, we answered RQ3. Our major findings show that the contextual factors have a consistent relationship to Disturbance Management Principles (DMPs). The product and SC environment characteristics are contextual factors that are hard to change and these characteristics initiate the use of specific DMPs as well as constrain the use of potential response actions. The process and the SC network characteristics are contextual factors that are easier to change, and they are affected by the use of the DMPs. We also found a notable relationship between the type of DMP likely to be used and the particular combination of contextual factors present in the observed SC.
To address RQ4, we presented a new method for vulnerability assessments, the VULA method. The VULA method helps to identify how much a company is underperforming on a specific Key Performance Indicator (KPI) in the case of a disturbance, how often this would happen and how long it would last. It ultimately informs the decision maker about whether process redesign is needed and what kind of redesign strategies should be used in order to increase the SC’s robustness. The VULA method is demonstrated in the context of a meat SC using discrete-event simulation. The case findings show that performance robustness can be assessed for any KPI using the VULA method.
To sum up the project, all findings were incorporated into an integrated framework for designing robust SCs. The integrated framework consists of the following steps: 1) Description of the SC scenario and identification of its specific contextual factors; 2) Identification of disturbances that may affect KPIs; 3) Definition of the relevant KPIs and identification of the main disturbances through assessment of the SC performance robustness (i.e. application of the VULA method); 4) Identification of the sources of vulnerability that may (strongly) affect the robustness of performances and eventually increase the vulnerability of the SC; 5) Identification of appropriate preventive or disturbance impact reductive redesign strategies; 6) Alteration of SC scenario elements as required by the selected redesign strategies and repeat VULA method for KPIs, as defined in Step 3.
Contributions of this research are as follows. First, we have identified an emerging research area: SC robustness and its counterpart, vulnerability. Second, we have developed a definition of SC robustness, operationalized it, and identified and structured the relevant elements for the design of robust SCs in the form of a research framework. With this research framework, we contribute to a better understanding of the concepts of vulnerability and robustness and related issues in food SCs. Third, we identified the relationship between contextual factors of food SCs and the specific DMPs used to maintain robust SC performances: characteristics of the product and the SC environment influence the selection and use of DMPs; processes and SC networks are influenced by DMPs. Fourth, we developed specific metrics for vulnerability assessment, which serve as the basis of the VULA method. The VULA method investigates different measures of the variability of both the duration of impacts from disturbances and the fluctuations in their magnitude.
With this project, we also hope to have delivered practical insights into food SC vulnerability. First, the integrated framework for the design of robust SCs can be used to guide food companies in successful disturbance management. Second, empirical findings from case studies lead to the identification of changeable characteristics of SCs that can serve as a basis for assessing where to focus efforts to manage disturbances. Third, the VULA method can help top management to get more reliable information about the “health” of the company.
The two most important research opportunities are: First, there is a need to extend and validate our findings related to the research framework and contextual factors through further case studies related to other types of (food) products and other types of SCs. Second, there is a need to further develop and test the VULA method, e.g.: to use other indicators and statistical measures for disturbance detection and SC improvement; to define the most appropriate KPI to represent the robustness of a complete SC. We hope this thesis invites other researchers to pick up these challenges and help us further improve the robustness of (food) SCs.

Relevance: 30.00%

Abstract:

This paper presents a novel method of audio-visual feature-level fusion for person identification where both the speech and facial modalities may be corrupted and there is a lack of prior knowledge about the corruption. Furthermore, we assume there is a limited amount of training data for each modality (e.g., a short training speech segment and a single training facial image for each person). A new multimodal feature representation and a modified cosine similarity are introduced to combine and compare bimodal features with limited training data, as well as vastly differing data rates and feature sizes. Optimal feature selection and multicondition training are used to reduce the mismatch between training and testing, thereby making the system robust to unknown bimodal corruption. Experiments have been carried out on a bimodal dataset created from the SPIDRE speaker recognition database and the AR face recognition database, with variable noise corruption of the speech and occlusion in the face images. The system's speaker identification performance on the SPIDRE database and facial identification performance on the AR database are comparable with the literature. Combining both modalities using the new method of multimodal fusion leads to significantly improved accuracy over the unimodal systems, even when both modalities have been corrupted. The new method also shows improved identification accuracy compared with bimodal systems based on multicondition model training or missing-feature decoding alone.
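One simple way the differing feature sizes could be balanced before fusion is to L2-normalise each modality separately and only then concatenate, comparing fused vectors by cosine similarity. This is a hypothetical stand-in for the paper's multimodal representation and modified cosine similarity, with invented names throughout:

```python
def _norm(v):
    """L2-normalise a feature vector."""
    mag = sum(x * x for x in v) ** 0.5
    return [x / mag for x in v] if mag else list(v)

def fuse(speech_feat, face_feat):
    """Normalise each modality separately before concatenation, so a
    large feature vector cannot dominate a small one."""
    return _norm(speech_feat) + _norm(face_feat)

def cosine(u, v):
    num = sum(a * b for a, b in zip(u, v))
    den = (sum(a * a for a in u) * sum(b * b for b in v)) ** 0.5
    return num / den if den else 0.0

def identify_person(probe_speech, probe_face, gallery):
    """gallery: {name: (speech_feat, face_feat)}. Returns the enrolled
    identity whose fused vector best matches the probe."""
    probe = fuse(probe_speech, probe_face)
    return max(gallery, key=lambda n: cosine(probe, fuse(*gallery[n])))
```

Because each modality contributes a unit-norm sub-vector, a corrupted modality degrades the fused score gracefully rather than overwhelming the clean one.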

Relevance: 30.00%

Abstract:

Before a natural sound can be recognized, an auditory signature of its source must be learned through experience. Here we used random waveforms to probe the formation of new memories for arbitrary complex sounds. A behavioral measure was designed, based on the detection of repetitions embedded in noises up to 4 s long. Unbeknownst to listeners, some noise samples reoccurred randomly throughout an experimental block. Results showed that repeated exposure induced learning for otherwise totally unpredictable and meaningless sounds. The learning was unsupervised and resilient to interference from other task-relevant noises. When memories were formed, they emerged rapidly, performance became abruptly near-perfect, and multiple noises were remembered for several weeks. The acoustic transformations to which recall was tolerant suggest that the learned features were local in time. We propose that rapid sensory plasticity could explain how the auditory brain creates useful memories from the ever-changing, but sometimes repeating, acoustical world. © 2010 Elsevier Inc.

Relevance: 30.00%

Abstract:

In this paper, we explore various arithmetic units for possible use in high-speed, high-yield ALUs operated at a scaled supply voltage with adaptive clock stretching. We demonstrate that careful logic optimization of the existing arithmetic units (to create hybrid units) indeed makes them more amenable to supply-voltage scaling. Such hybrid units result from mixing the right amount of fast arithmetic into the slower ones. Simulations of different hybrid adders and multipliers in BPTM 70 nm technology show 18%-50% improvements in power compared to standard adders, with only a 2%-8% increase in die area at iso-yield. These optimized datapath units can be used to construct voltage-scalable, robust ALUs that operate at a high clock frequency with minimal performance degradation due to occasional clock stretching. © 2009 IEEE.
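The power argument rests on first-order dynamic power, P = a·C·Vdd²·f, which is quadratic in supply voltage, so even modest voltage scaling pays off if the logic still meets timing (with occasional clock stretching absorbing the slow corners). A small sketch with hypothetical numbers, not the paper's simulation data:

```python
def dynamic_power(c_eff, vdd, freq, activity=0.5):
    """First-order dynamic power: P = a * C * Vdd^2 * f.
    Purely illustrative of the quadratic Vdd dependence; all
    parameter values here are hypothetical."""
    return activity * c_eff * vdd ** 2 * freq

def savings(vdd_nominal, vdd_scaled, c_eff=1e-9, freq=1e9):
    """Fractional dynamic-power saving from voltage scaling alone,
    holding capacitance and frequency fixed."""
    p0 = dynamic_power(c_eff, vdd_nominal, freq)
    p1 = dynamic_power(c_eff, vdd_scaled, freq)
    return 1.0 - p1 / p0
```

Scaling Vdd from 1.0 V to 0.8 V already yields a 36% dynamic-power saving in this first-order model, which is the headroom the hybrid units are designed to exploit.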

Relevance: 30.00%

Abstract:

Sparse-representation-based visual tracking approaches have attracted increasing interest in recent years. The main idea is to linearly represent each target candidate using a set of target and trivial templates while imposing a sparsity constraint on the representation coefficients. After the coefficients are obtained using L1-norm minimization methods, the candidate with the lowest error when reconstructed using only the target templates and the associated coefficients is taken as the tracking result. In spite of the promising performance widely reported, it is unclear whether the performance of these trackers can be maximised. In addition, the computational complexity caused by the dimensionality of the feature space limits these algorithms in real-time applications. In this paper, we propose a real-time visual tracking method based on structurally random projection and weighted least squares techniques. In particular, to enhance the discriminative capability of the tracker, we introduce background templates into the linear representation framework. To handle appearance variations over time, we relax the sparsity constraint using a weighted least squares (WLS) method to obtain the representation coefficients. To further reduce the computational complexity, structurally random projection is used to reduce the dimensionality of the feature space while preserving the pairwise distances between data points. Experimental results show that the proposed approach outperforms several state-of-the-art tracking methods.
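Two of the ingredients can be sketched independently: a plain ±1 random projection (a simple stand-in for the structurally random projection, with similar distance-preserving intent) and a weighted least squares solve with a small ridge term as the relaxed alternative to the ℓ1 sparsity constraint. Both are illustrative, toy-scale versions rather than the paper's algorithms:

```python
import random

def random_projection_matrix(m, n, seed=0):
    """Dense +/-1 random projection, scaled by 1/sqrt(m) so pairwise
    distances are approximately preserved (Johnson-Lindenstrauss
    style). A simple stand-in for structurally random projection."""
    rng = random.Random(seed)
    s = 1.0 / m ** 0.5
    return [[s * rng.choice((-1.0, 1.0)) for _ in range(n)] for _ in range(m)]

def project(P, x):
    """Map an n-dimensional feature vector down to m dimensions."""
    return [sum(p_j * x_j for p_j, x_j in zip(row, x)) for row in P]

def wls_coefficients(A, y, weights, ridge=1e-3, step=0.05, iters=2000):
    """Weighted least squares by gradient descent on
    sum_i w_i * (y_i - (A c)_i)^2 + ridge * ||c||^2 -- the relaxation
    of the sparsity constraint described above (illustrative scale)."""
    m, n = len(A), len(A[0])
    c = [0.0] * n
    for _ in range(iters):
        resid = [sum(A[i][j] * c[j] for j in range(n)) - y[i] for i in range(m)]
        for j in range(n):
            g = 2.0 * sum(weights[i] * A[i][j] * resid[i] for i in range(m))
            c[j] -= step * (g + 2.0 * ridge * c[j])
    return c
```

In a tracker, candidates would first be projected to the low-dimensional space and the WLS solve would then run there, which is where the real-time speed-up comes from.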