76 results for MS-based methods


Relevance: 80.00%

Abstract:

Importance of the field: Survivin is a prominent anti-apoptotic molecule expressed widely in the majority of cancers. Overexpression of survivin leads to uncontrolled cancer cell growth and drug resistance. Efficient downregulation of survivin expression and function can sensitise tumour cells to various therapeutic interventions, such as chemotherapeutic agents, leading to apoptosis.

Areas covered in this review: The article thoroughly analyses up-to-date information on the knowledge generated from the survivin patents. Various key areas of research in terms of understanding survivin biology and its targeting are discussed in detail.

What the reader will gain: The article gives clear insight into the recent developments undertaken to understand the roles of survivin in cancer and to validate various treatment paradigms that suppress survivin expression in cancer cells.

Take home message: Recent developments have made it possible to downregulate survivin expression effectively using various therapeutic platforms, such as chemotherapeutic drugs, immunotechnology, antisense, dominant-negative survivin mutants, RNA interference and peptide-based methods. However, selective and specific targeting of survivin in cancer cells still poses a major challenge. Nanotechnology-based platforms are currently under development to enable site-specific targeting of survivin in tumour cells.


Aims To review current research on hidden populations of illicit drug users conducted with web-based methods, and to discuss the major advantages and disadvantages.

Methods Systematic review of 16 databases, including PubMed, PsycINFO (EBSCOhost), CSA Sociological Abstracts, Expanded Academic ASAP and Google Scholar.

Findings Substances researched were most commonly ‘party/club drugs’ (such as ecstasy) and cannabis. All of the studies reviewed concluded that the internet is a useful tool for reaching hidden populations, but is likely to impose some bias in samples. Advantages include: access to previously under-researched target groups; speed; international applications; increased ease of data entry; and improved confidentiality for respondents. The major disadvantage is a lack of representativeness of samples.

Conclusions Internet research is successful at accessing hidden populations of illicit drug users when appropriately targeted, and it provides unprecedented opportunities for research across a wide range of topics within the addictions field. Findings are unlikely to be generalisable to the general public, but are appropriate for describing the target populations.


Civil infrastructures begin to deteriorate once they are built and put into use. Detecting damage in a structure to maintain its safety has received considerable attention in the literature in recent years. In vibration-based methods, the first few modes are used to assess the location and extent of damage. However, a small number of global modes is not sufficient to reliably detect minor damage in the structure, and a common limitation of these techniques is that they require a high-fidelity model of the structure to start with, which is usually not available. Recently, guided waves (GW) have been found to be an effective and efficient way to detect incipient damage, owing to their relatively long propagation range and the flexibility to select sensitive mode-frequency combinations. In this paper, an integrated structural health monitoring test scheme is developed to detect damage in reinforced concrete (RC) beams. Each beam is progressively loaded at midspan to induce damage. During each loading step, the acoustic emission (AE) method is used as a passive monitoring technique to capture the AE signals generated by crack opening and propagation. After each loading step, vibration tests and guided wave tests are conducted as a combined active monitoring measure. The modal parameters and wave propagation results are used to derive the damage information. Experimental results show that the integrated method is effective in detecting incipient damage in RC structures.
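As a minimal illustration of the vibration-based idea described above, drops in natural frequency between a baseline and a current measurement can serve as a crude damage indicator. The frequencies and the 2% threshold in this sketch are illustrative assumptions, not values from the paper.

```python
# Hedged sketch: a frequency-shift damage indicator of the kind used in
# vibration-based monitoring. All numbers below are illustrative.

def damage_indicator(baseline_hz, current_hz, rel_threshold=0.02):
    """Relative drop of each natural frequency, plus a damage flag.

    A drop in natural frequency indicates reduced stiffness; the flag
    is raised if any mode drops by more than rel_threshold.
    """
    drops = [(b - c) / b for b, c in zip(baseline_hz, current_hz)]
    return drops, any(d > rel_threshold for d in drops)

# First three modes of an intact beam vs. after a loading step (Hz)
drops, damaged = damage_indicator([12.1, 33.4, 64.8], [11.6, 33.1, 64.5])
```

A real scheme, like the integrated one in the paper, would fuse such vibration results with the AE and guided-wave measurements rather than rely on frequency shifts alone.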


Network traffic classification is an essential component of network management and security systems. To address the limitations of traditional port-based and payload-based methods, recent studies have focused on alternative approaches. One promising direction is applying machine learning techniques to classify traffic flows based on packet- and flow-level statistics. In particular, previous papers have shown that clustering can achieve high accuracy and discover unknown application classes. In this work, we present a novel semi-supervised learning method using constrained clustering algorithms. The motivation is that in the network domain a great deal of background information is available in addition to the data instances themselves. For example, we might know that flows ƒ1 and ƒ2 use the same application protocol because they visit the same host address at the same port simultaneously. In this case, ƒ1 and ƒ2 should ideally be grouped into the same cluster. Therefore, we describe these correlations in the form of pairwise must-link constraints and incorporate them into the clustering process. We have applied three constrained variants of the K-Means algorithm, which perform hard or soft constraint satisfaction and metric learning from constraints. A number of real-world traffic traces have been used to show the availability of constraints and to test the proposed approach. The experimental results indicate that by incorporating constraints in the course of clustering, the overall accuracy and cluster purity can be significantly improved.
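The hard must-link idea can be sketched as a constrained variant of K-Means in the spirit of COP-KMeans: flows tied by a must-link constraint are merged into one unit before assignment, so they always receive the same cluster label. This is a simplified sketch, not the paper's exact algorithms; it assumes there are at least k constraint groups, and the 2-D features are illustrative.

```python
# Hedged sketch of hard must-link constrained clustering (COP-KMeans
# flavor). Points and constraints below are illustrative toy data.

def constrained_kmeans(points, k, must_links, iters=20):
    # Union-find: collapse must-linked points into groups
    parent = list(range(len(points)))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for a, b in must_links:
        parent[find(a)] = find(b)
    grouped = {}
    for i in range(len(points)):
        grouped.setdefault(find(i), []).append(i)
    groups = list(grouped.values())
    dims = len(points[0])
    # Assumes at least k groups; seed centroids from the first k groups
    centroids = [list(points[g[0]]) for g in groups[:k]]
    assign = {}
    for _ in range(iters):
        # Assign each whole group to the centroid nearest its mean,
        # so must-linked points always end up in the same cluster
        for gi, g in enumerate(groups):
            mean = [sum(points[i][d] for i in g) / len(g) for d in range(dims)]
            assign[gi] = min(range(k),
                             key=lambda c: sum((mean[d] - centroids[c][d]) ** 2
                                               for d in range(dims)))
        # Recompute centroids from their member points
        for c in range(k):
            members = [i for gi, g in enumerate(groups)
                       if assign[gi] == c for i in g]
            if members:
                centroids[c] = [sum(points[i][d] for i in members) / len(members)
                                for d in range(dims)]
    labels = [0] * len(points)
    for gi, g in enumerate(groups):
        for i in g:
            labels[i] = assign[gi]
    return labels
```

Soft constraint satisfaction and metric learning from constraints, the other two variants mentioned above, relax or generalize this hard grouping step.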


Nonnegative matrix factorization based methods provide one of the simplest and most effective approaches to text mining. However, their applicability is mainly limited to analyzing a single data source. In this paper, we propose a novel joint matrix factorization framework which can jointly analyze multiple data sources by exploiting their shared and individual structures. The proposed framework is flexible enough to handle arbitrary sharing configurations encountered in real-world data. We derive an efficient algorithm for learning the factorization and show that its convergence is theoretically guaranteed. We demonstrate the utility and effectiveness of the proposed framework in two real-world applications: improving social media retrieval using auxiliary sources, and cross-social media retrieval. Representing each social media source by its textual tags, we show for both applications that retrieval performance exceeds that of existing state-of-the-art techniques. The proposed solution provides a generic framework and is applicable in a wider data mining context wherever one needs to exploit the mutual and individual knowledge present across multiple data sources.
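The single-source building block behind such frameworks can be sketched with classical NMF and Lee-Seung multiplicative updates; the joint multi-source factorization proposed in the paper extends this idea with shared and individual factors. The rank, iteration count, and data below are illustrative assumptions.

```python
import numpy as np

# Hedged sketch: plain single-source NMF with multiplicative updates,
# the basic ingredient of NMF-based text mining (not the paper's joint
# multi-source framework).

def nmf(V, r, iters=300, seed=0):
    """Factor nonnegative V (n x m) into W (n x r) @ H (r x m)."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, r)) + 0.1
    H = rng.random((r, m)) + 0.1
    eps = 1e-9  # guards against division by zero
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)  # update H; stays nonnegative
        W *= (V @ H.T) / (W @ H @ H.T + eps)  # update W; stays nonnegative
    return W, H
```

In a text-mining setting, V would be a term-document (or tag-document) matrix, W a topic basis, and H per-document topic weights.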


In this paper, typing biometrics is applied as an additional security measure to password-based or Personal Identification Number (PIN)-based systems to authenticate the identity of computer users. In particular, keystroke pressure and latency signals are analyzed using the Fuzzy Min-Max (FMM) neural network for authentication purposes. A special pressure-sensitive keyboard is designed to collect keystroke pressure signals, in addition to latency signals, from computer users as they type their passwords. Based on the keystroke pressure and latency signals, the FMM network is employed to classify computer users into two categories: genuine users and impostors. To assess the effectiveness of the proposed approach, two sets of experiments are conducted, and the results are compared with those from statistical methods and neural network models. The experimental outcomes positively demonstrate the potential of using typing biometrics and the FMM network to provide an additional security layer for current password-based or PIN-based methods of authenticating computer users.
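A highly simplified, hypothetical stand-in for hyperbox-style matching (not Simpson's full Fuzzy Min-Max network, which grows and contracts fuzzy hyperboxes during training) gives the flavor of the approach: enrollment keystroke latencies define a per-user min-max box, and a login attempt is accepted only if it falls inside the box plus a tolerance margin. All timings and the margin are illustrative.

```python
# Hedged sketch: min-max hyperbox matching on keystroke latencies.
# Latency values (seconds) and the tolerance margin are made up.

def enroll(samples):
    """Build a per-dimension [min, max] hyperbox from latency vectors."""
    dims = len(samples[0])
    lo = [min(s[d] for s in samples) for d in range(dims)]
    hi = [max(s[d] for s in samples) for d in range(dims)]
    return lo, hi

def is_genuine(box, attempt, margin=0.03):
    """Accept the attempt only if every latency lies in its expanded box."""
    lo, hi = box
    return all(lo[d] - margin <= attempt[d] <= hi[d] + margin
               for d in range(len(lo)))
```

The real FMM network additionally produces graded fuzzy membership values rather than a hard in/out decision, which is what lets it trade off false accepts against false rejects.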


Objectives:
To provide an in-depth analysis of outcome measures used in the evaluation of chronic disease self-management programs consistent with the Stanford curricula.

Methods:
Based on a systematic review of self-management programs, effect sizes derived from the reported outcome measures are categorized according to the quality-of-life appraisal model developed by Schwartz and Rapkin, which classifies outcomes from performance-based measures (e.g., clinical outcomes) to evaluation-based measures (e.g., emotional well-being).

Results:
The majority of outcomes assessed in self-management trials are based on evaluation-based methods. Overall, effects on knowledge (the only performance-based measure observed in the selected trials) are generally medium to large. In contrast, results for both perception- and evaluation-based measures are substantially more inconsistent, mostly ranging between nil and small positive effects.

Conclusions:
The effectiveness of self-management interventions, and the resulting recommendations for health policy makers, are most frequently derived from highly variable evaluation-based measures, that is, types of outcomes that potentially carry a substantial amount of measurement error and/or bias, such as response shift. Therefore, conclusions regarding the value and efficacy of chronic disease self-management programs need to be interpreted with care. More research, especially qualitative studies, is needed to unravel the cognitive processes involved and the role of response shift bias in the measurement of change.
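The standardized effect sizes referred to above are commonly computed as Cohen's d: the difference in group means divided by the pooled standard deviation. The sketch below assumes two independent groups and illustrative data; values around 0.2, 0.5 and 0.8 are conventionally read as small, medium and large.

```python
import math

# Hedged sketch: Cohen's d with pooled standard deviation for two
# independent groups (e.g., intervention vs. control scores).

def cohens_d(group1, group2):
    n1, n2 = len(group1), len(group2)
    m1 = sum(group1) / n1
    m2 = sum(group2) / n2
    # Unbiased sample variances
    v1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd
```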


Sparse representation has been introduced to address many recognition problems in computer vision. In this paper, we propose a new framework for object categorization based on sparse representation of local features. Unlike most previous sparse-coding-based methods in object classification, which use sparse coding only to extract high-level features, the proposed method incorporates sparse representation and classification into a unified framework and therefore needs no additional classifier. Experimental results show that the proposed method achieves accuracy better than or comparable to that of the well-known bag-of-features representation with various classifiers.
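Reconstruction-error classification of the kind used in sparse-representation frameworks can be sketched as follows. For simplicity, this hypothetical stand-in fits an unconstrained least-squares code per class dictionary rather than a true L1-regularized sparse code, and the dictionaries are illustrative toy data.

```python
import numpy as np

# Hedged sketch: assign a sample to the class whose dictionary
# reconstructs it with the smallest residual (least-squares stand-in
# for L1 sparse coding).

def classify_by_reconstruction(class_dicts, x):
    errors = []
    for D in class_dicts:
        coef, *_ = np.linalg.lstsq(D, x, rcond=None)  # fit x on D's columns
        errors.append(np.linalg.norm(x - D @ coef))   # residual for class
    return int(np.argmin(errors))
```

The unified framework described above folds the classification decision directly into this reconstruction step, which is why no separate classifier is needed.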


Quantification of the uncertainties associated with wind power generation forecasts is essential for the optimal management of wind farms and their successful integration into power systems. This paper investigates two neural network-based methods for the direct and rapid construction of prediction intervals (PIs) for short-term forecasting of power generation in wind farms. The lower upper bound estimation and bootstrap methods are used to quantify the uncertainties associated with forecasts. The effectiveness and efficiency of these two general methods for uncertainty quantification are examined using twenty-four months of data from a wind farm in Australia. PIs with a confidence level of 90% are constructed for four forecasting horizons: five, ten, fifteen, and thirty minutes. Quantitative measures are applied for objective evaluation and unbiased comparison of PI quality. The results indicate that reliable PIs can be constructed in a short time without resorting to complicated computational methods or models. Quantitative comparison also reveals that bootstrap PIs are more suitable for short prediction horizons, while lower upper bound estimation PIs are more appropriate for longer forecasting horizons.
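Of the two approaches, the bootstrap idea is the simpler to sketch: past forecast errors are resampled with replacement, and their empirical quantiles bound the interval around a new point forecast. This is a crude residual-bootstrap illustration with made-up numbers, not the paper's neural-network ensemble procedure.

```python
import random

# Hedged sketch: residual-bootstrap prediction interval. The history
# pairs and the 90% nominal level are illustrative.

def bootstrap_pi(history, point_forecast, n_boot=2000, alpha=0.10, seed=1):
    """history: past (forecast, actual) pairs whose errors are resampled.

    Returns (lower, upper) bounds of a (1 - alpha) prediction interval.
    """
    rng = random.Random(seed)
    errors = [actual - fcst for fcst, actual in history]
    draws = sorted(rng.choice(errors) for _ in range(n_boot))
    lo = draws[int(n_boot * alpha / 2)]           # alpha/2 quantile
    hi = draws[int(n_boot * (1 - alpha / 2)) - 1]  # 1 - alpha/2 quantile
    return point_forecast + lo, point_forecast + hi

# Five past forecast/actual pairs, then a PI around a 20 MW forecast
lower, upper = bootstrap_pi(
    [(10.0, 9.0), (10.0, 10.5), (10.0, 10.0), (10.0, 9.5), (10.0, 11.0)],
    20.0)
```

Evaluating such intervals would use the coverage and width measures the paper applies for objective PI quality comparison.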


The problem of automatic face recognition (AFR) concerns matching a detected (roughly localized) face against a database of known faces with associated identities. Although very intuitive to humans, this task still poses a significant challenge to computer-based methods, despite the vast amount of research behind it. For reviews of the literature and the commercial state of the art, see [21, 372] and [252, 253]. Much AFR research has concentrated on the user authentication paradigm (e.g. [10, 30, 183]). In contrast, we consider the content-based multimedia retrieval setup: our aim is to retrieve, and rank by confidence, film shots based on the presence of specific actors. A query to the system consists of the user choosing the person of interest in one or more keyframes.


Classical proinflammatory eicosanoids, and more recently discovered lipid mediators with anti-inflammatory and proresolving bioactivity, exert a complex role in the initiation, control, and resolution of inflammation. Using a targeted lipidomics approach, we investigated circulating lipid mediator responses to resistance exercise and treatment with the NSAID ibuprofen. Human subjects undertook a single bout of unaccustomed resistance exercise (80% of one repetition maximum) following oral ingestion of ibuprofen (400 mg) or placebo control. Venous blood was collected during early recovery (0–3 h and 24 h postexercise), and serum lipid mediator composition was analyzed by LC-MS-based targeted lipidomics. Postexercise recovery was characterized by elevated levels of cyclooxygenase (COX)-1 and COX-2-derived prostanoids (TXB2, PGE2, PGD2, PGF2α, and PGI2), lipoxygenase (5-LOX, 12-LOX, and 15-LOX)-derived hydroxyeicosatetraenoic acids (HETEs) and leukotrienes (e.g., LTB4), and epoxygenase (CYP)-derived epoxy/dihydroxy eicosatrienoic acids (EpETrEs/DiHETrEs). Additionally, we detected elevated levels of bioactive lipid mediators with anti-inflammatory and proresolving properties, including the arachidonic acid-derived lipoxins (LXA4 and LXB4), the EPA (E-series) and DHA (D-series)-derived resolvins (RvD1 and RvE1), and protectins (PD1 isomer 10S, 17S-diHDoHE). Ibuprofen treatment blocked exercise-induced increases in COX-1 and COX-2-derived prostanoids but also resulted in off-target reductions in leukotriene biosynthesis and a diminished proresolving lipid mediator response. CYP pathway product metabolism was also altered by ibuprofen treatment, as indicated by elevated postexercise serum 5,6-DiHETrE and 8,9-DiHETrE only in those receiving ibuprofen. These findings characterize the blood inflammatory lipid mediator response to unaccustomed resistance exercise in humans and show that acute proinflammatory signals are mechanistically linked to the induction of a biologically active inflammatory resolution program, regulated by proresolving lipid mediators during postexercise recovery.


Clubroot, caused by Plasmodiophora brassicae, is one of the most important diseases of brassicas. Management of clubroot is difficult, and the best means of avoiding the disease include planting in areas where P. brassicae is not present and using plants and growing media free from pathogen inoculum. As P. brassicae is not culturable, its detection has traditionally relied on plant bioassays, which are time-consuming and require large amounts of glasshouse space. More recently, fluorescence microscopy, serology, and DNA-based methods have all been used to test soil, water, or plant samples for clubroot. The use of fluorescence microscopy to detect and count pathogen spores in the soil requires significant operator skill and is unlikely to serve as the basis for a routine diagnostic test. By contrast, serologic assays are inexpensive and amenable to high-throughput screening but need to be based on monoclonal antibodies because polyclonal antisera cannot be reproduced and are therefore of limited quantity. Several polymerase chain reaction (PCR)-based assays have also been developed; these are highly specific for P. brassicae and have been well-correlated with disease severity. As such, PCR-based diagnostic tests have been adopted to varying extents in Canada and Australia, but wide implementation has been restricted by sample processing costs. Efforts are underway to develop inexpensive serologic on-farm diagnostic kits and to improve quantification of pathogen inoculum levels through real-time PCR. Proper detection and quantification of P. brassicae will likely play an increasingly important role in the development of effective clubroot management strategies.


Texture classification is one of the most important tasks in computer vision and has been extensively investigated over the last several decades. Previous texture classification methods have mainly used template-matching-based classifiers such as Support Vector Machines and k-Nearest-Neighbours. Given enough training images, state-of-the-art texture classification methods can achieve very high classification accuracies on some benchmark databases. However, when the number of training images is limited, which usually happens in real-world applications because of the high cost of obtaining labelled data, the classification accuracies of those state-of-the-art methods deteriorate due to overfitting. In this paper we develop a novel framework that can correctly classify textural images with only a small number of training images. By taking into account the repetition and sparsity properties of textures, we propose a sparse-representation-based multi-manifold analysis framework for texture classification from few training images. A set of new training samples is generated from each training image by a scale and spatial pyramid, and the training samples belonging to each class are then modelled by a manifold based on sparse representation. We learn a dictionary of sparse representation and a projection matrix for each class and classify the test images based on the projected reconstruction errors. The framework provides a more compact model than template-matching-based texture classification methods and mitigates the overfitting effect. Experimental results show that the proposed method achieves reasonably high generalization capability even with as few as three training images, and significantly outperforms state-of-the-art texture classification approaches on three benchmark datasets.
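The sample-generation step described above can be sketched as follows: each training image is expanded into many samples by cropping patches at several scales of a pyramid. This toy operates on a plain 2-D list of grey values; the patch size and the 2x2 average-pooling downsampler are illustrative assumptions, not the paper's exact construction.

```python
# Hedged sketch: scale-and-spatial-pyramid sample generation from one
# texture image. "image" is a 2-D list of grey values; patch size is
# an illustrative choice.

def pyramid_samples(image, patch=4):
    def downsample(img):
        # Halve resolution by 2x2 average pooling
        return [[(img[r][c] + img[r][c + 1]
                  + img[r + 1][c] + img[r + 1][c + 1]) / 4
                 for c in range(0, len(img[0]) - 1, 2)]
                for r in range(0, len(img) - 1, 2)]
    samples, level = [], image
    while len(level) >= patch and len(level[0]) >= patch:
        # Non-overlapping spatial crops at this scale, flattened
        for r in range(0, len(level) - patch + 1, patch):
            for c in range(0, len(level[0]) - patch + 1, patch):
                samples.append([level[r + i][c + j]
                                for i in range(patch) for j in range(patch)])
        level = downsample(level)
    return samples
```

Each class's augmented samples would then feed the per-class sparse dictionary and projection matrix learning described above.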


This paper deals with blind separation of spatially correlated signals mixed by an instantaneous system. Taking advantage of the fact that the source signals are accessible in some man-made systems, such as wireless communication systems, we preprocess the source signals in the transmitters with a set of properly designed first-order precoders, and the coded signals are then transmitted. At the receiving side, information about the precoders is utilized to perform signal separation. Compared with existing precoder-based methods, the new method employs only the simplest first-order precoders, which reduces the delay in data transmission and is easier to implement in practical applications.


The reduction of the size of ensemble classifiers is important for various security applications. The majority of known pruning algorithms belong to three categories: ranking-based, clustering-based, and optimization-based methods. The present paper introduces and investigates a new pruning technique, called the Three-Level Pruning Technique (TLPT) because it simultaneously combines all three approaches in three levels of the process. This paper investigates the TLPT method combining the state-of-the-art ranking of Ensemble Pruning via Individual Contribution ordering (EPIC), the clustering of K-Means Pruning (KMP), and the optimisation method of Directed Hill Climbing Ensemble Pruning (DHCEP), on a phishing dataset. The new experiments presented in this paper show that TLPT is competitive in comparison with EPIC, KMP and DHCEP, and can achieve better outcomes. These experimental results demonstrate the effectiveness of the TLPT technique in this example of an information security application.
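The simplest of the three families named above, ranking-based pruning, can be sketched as scoring each base classifier on a validation set and keeping the top k. Note that EPIC's actual ordering ranks by individual contribution to the ensemble vote, which is more refined than the plain accuracy ranking used in this illustrative stand-in.

```python
# Hedged sketch: ranking-based ensemble pruning by validation accuracy.
# Classifiers are modelled as plain callables; data are illustrative.

def rank_prune(classifiers, X_val, y_val, k):
    def accuracy(clf):
        return sum(clf(x) == y for x, y in zip(X_val, y_val)) / len(y_val)
    # Keep the k individually best classifiers
    ranked = sorted(classifiers, key=accuracy, reverse=True)
    return ranked[:k]
```

Clustering-based pruning (as in KMP) would instead group similar classifiers and keep representatives, while optimization-based pruning (as in DHCEP) searches sub-ensembles directly; TLPT layers all three.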