976 results for least common subgraph algorithm


Relevance:

30.00%

Publisher:

Abstract:

This paper formulates a linear kernel support vector machine (SVM) as a regularized least-squares (RLS) problem. By defining a set of indicator variables for the errors, the solution to the RLS problem is represented as an equation that relates the error vector to the indicator variables. Through partitioning the training set, the SVM weights and bias are expressed analytically using the support vectors. It is also shown how this approach naturally extends to SVMs with nonlinear kernels whilst avoiding the need for Lagrange multipliers and duality theory. A fast iterative algorithm based on Cholesky decomposition with permutation of the support vectors is suggested as a solution method. The properties of our SVM formulation are analyzed and compared with standard SVMs using a simple example that can be illustrated graphically. The correctness and behavior of our solution (derived purely in the primal context of RLS) are demonstrated using a set of public benchmarking problems for both linear and nonlinear SVMs.
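
As a hedged illustration of the primal, regularized least-squares view of a linear SVM, the sketch below fits weights and a bias by solving a ridge-style normal-equation system with a Cholesky factorization. It is a generic RLS classifier, not the paper's indicator-variable partitioning or permuted-Cholesky scheme; the function name, the regularization parameter lam, and the toy data are assumptions.

```python
import numpy as np

def rls_linear_classifier(X, y, lam=1.0):
    """Fit w, b for a linear classifier by regularized least squares.

    Minimizes ||Xw + b - y||^2 + lam * ||w||^2 with labels y in {-1, +1}.
    A generic RLS sketch, not the paper's support-vector partitioning
    or permuted-Cholesky scheme.
    """
    n, d = X.shape
    Xa = np.hstack([X, np.ones((n, 1))])   # append a column for the bias
    reg = lam * np.eye(d + 1)
    reg[-1, -1] = 0.0                      # do not regularize the bias
    A = Xa.T @ Xa + reg                    # symmetric positive definite for lam > 0
    L = np.linalg.cholesky(A)
    z = np.linalg.solve(L, Xa.T @ y)       # forward substitution
    wb = np.linalg.solve(L.T, z)           # back substitution
    return wb[:-1], wb[-1]                 # weights, bias

# Toy usage: two linearly separable clusters.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (20, 2)), rng.normal(2, 1, (20, 2))])
y = np.hstack([-np.ones(20), np.ones(20)])
w, b = rls_linear_classifier(X, y, lam=0.1)
print("predicted signs match labels:", np.all(np.sign(X @ w + b) == y))
```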

Relevance:

30.00%

Publisher:

Abstract:

The incidence of melanoma has increased rapidly over the past 30 years, and the disease is now the sixth most common cancer among men and women in the U.K. Many patients are diagnosed with or develop metastatic disease, and survival is substantially reduced in these patients. Mutations in the BRAF gene have been identified as key drivers of melanoma cells and are found in around 50% of cutaneous melanomas. Vemurafenib (Zelboraf®; Roche Molecular Systems Inc., Pleasanton, CA, U.S.A.) is the first licensed inhibitor of mutated BRAF, and offers a new first-line option for patients with unresectable or metastatic melanoma who harbour BRAF mutations. Vemurafenib was developed in conjunction with a companion diagnostic, the cobas® 4800 BRAF V600 Mutation Test. The purpose of this paper is to make evidence-based recommendations to facilitate the implementation of BRAF mutation testing and targeted therapy in patients with metastatic melanoma in the U.K. The recommendations are the result of a meeting of an expert panel and have been reviewed by melanoma specialists and representatives of the National Cancer Research Network Clinical Study Group on behalf of the wider melanoma community. This article is intended to be a starting point for practical advice and recommendations, which will no doubt be updated as we gain further experience in personalizing therapy for patients with melanoma.

Relevance:

30.00%

Publisher:

Abstract:

The evolution of wireless communication systems leads to Dynamic Spectrum Allocation for Cognitive Radio, which requires reliable spectrum sensing techniques. Among the spectrum sensing methods proposed in the literature, those that exploit the cyclostationary characteristics of radio signals are particularly suitable for communication environments with low signal-to-noise ratios or with non-stationary noise. However, such methods have high computational complexity, which directly raises the power consumption of devices that often have very stringent low-power requirements. We propose a strategy for cyclostationary spectrum sensing with reduced energy consumption, based on the principle that p processors working at lower frequencies consume less power than a single processor for the same execution time. We derive a strict relation between the energy savings and common parallel system metrics. Simulation results show that our strategy promises significant savings in actual devices.
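
The power argument can be made concrete with a back-of-the-envelope model. Under the common assumptions that dynamic power scales roughly as the cube of the clock frequency (with supply voltage tracking frequency) and that the workload parallelizes perfectly, p cores at frequency f/p finish in the same wall-clock time while using about 1/p² of the energy. The sketch below only illustrates that scaling argument; the cubic power law and perfect speed-up are assumptions, not figures from the paper.

```python
# Back-of-the-envelope model of the "p slower processors" argument.
# Assumptions (not from the paper): dynamic power P ~ C * V^2 * f, supply
# voltage scales roughly linearly with clock frequency (so P ~ k * f^3),
# and the workload parallelizes perfectly, so p cores at f/p finish in the
# same wall-clock time as one core at f.

def energy_single(f, t, k=1.0):
    """Energy of one core at frequency f running for time t."""
    return k * f**3 * t

def energy_parallel(p, f, t, k=1.0):
    """Energy of p cores, each at frequency f/p, running for the same time t."""
    return p * k * (f / p)**3 * t

f, t = 1.0, 1.0
for p in (1, 2, 4, 8):
    ratio = energy_parallel(p, f, t) / energy_single(f, t)
    print(f"p={p}: parallel/serial energy ratio = {ratio:.4f} (ideal 1/p^2 = {1 / p**2:.4f})")
```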

Relevance:

30.00%

Publisher:

Abstract:

We surveyed macroinvertebrate communities in 31 hill streams in the Vouga River and Mondego River catchments in central Portugal. Despite applying a "least-impacted" criterion, channel and bank management was common, with 38% of streams demonstrating channel modification (damming) and 80% with evidence of bank modification. Principal component analysis (PCA) at the family and species level related the macroinvertebrates to habitat variables derived at three spatial scales: site (20 m), reach (200 m), and catchment. Variation in community structure between sites was similar at the species and family level and was statistically related to pH, conductivity, temperature, flow, shade, and substrate size at the site scale; channel and bank habitat, riparian vegetation, and land-use at the reach scale; and altitude and slope at the catchment scale. While the effects of river management were apparent in various ecologically important habitat features at the site and reach scale, a direct relationship with macroinvertebrate assemblages was apparent only between the extent of walled banks and the secondary PCA axis described by species data. The strong relationship between catchment-scale variables and descriptors of physical structure at the reach and site scale suggests that catchment-scale parameters are valuable predictors of macroinvertebrate community structure in these streams despite the anthropogenic modifications of the natural habitat.
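
For readers unfamiliar with the ordination step, the sketch below shows, on made-up data, how PCA axis scores computed from a site-by-taxa abundance matrix can be correlated with site-scale habitat variables such as pH and conductivity. All names and values are illustrative assumptions; this is not the survey data or the authors' exact analysis.

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical community matrix: rows = sites, columns = taxa abundances,
# plus two site-scale habitat variables. Values are illustrative only.
rng = np.random.default_rng(1)
abundances = rng.poisson(lam=5, size=(31, 12)).astype(float)   # 31 sites, 12 taxa
ph = rng.uniform(5.5, 7.5, 31)
conductivity = rng.uniform(20, 200, 31)

# Ordinate the log-transformed community data, then correlate the first two
# PCA axes with the habitat variables.
pca = PCA(n_components=2)
scores = pca.fit_transform(np.log1p(abundances))
for name, var in [("pH", ph), ("conductivity", conductivity)]:
    r1 = np.corrcoef(scores[:, 0], var)[0, 1]
    r2 = np.corrcoef(scores[:, 1], var)[0, 1]
    print(f"{name}: r(PC1)={r1:.2f}, r(PC2)={r2:.2f}")
```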

Relevance:

30.00%

Publisher:

Abstract:

Close similarities have been found between the otoliths of sea-caught and laboratory-reared larvae of the common sole Solea solea (L.), given appropriate temperatures and nourishment of the latter. However, from hatching to mouth formation, and during metamorphosis, sole otoliths have proven difficult to read because the increments may be less regular and of low contrast. In this study, the growth increments in otoliths of larvae reared at 12 degrees C were counted by light microscopy to test the hypothesis of daily deposition, with some results verified using scanning electron microscopy (SEM), and by image analysis in order to compare the reliability of the 2 methods in age estimation. Age was first estimated (in days posthatch) from light micrographs of whole mounted otoliths. Counts were initiated from the increment formed at the time of mouth opening (Day 4). The average incremental deposition rate was consistent with the daily hypothesis. However, the light-micrograph readings tended to underestimate the mean ages of the larvae. Errors were probably associated with the low-contrast increments: those deposited after mouth formation during the transition to first feeding, and those deposited from the onset of eye migration (about 20 d posthatch) during metamorphosis. SEM failed to resolve these low-contrast areas accurately because of poor etching. A method using image analysis was applied to a subsample of micrograph-counted otoliths. The image analysis was supported by a pattern-recognition algorithm (Growth Demodulation Algorithm, GDA). On each otolith, the GDA method integrated the growth pattern of these larval otoliths into averaged data from different radial profiles, in order to demodulate the exponential trend of the signal before spectral analysis (Fast Fourier Transform, FFT). This second method allowed both more precise designation of increments, particularly in low-contrast areas, and more accurate readings, but it increased the error in mean age estimation. The variability is probably due to the GDA method's still rough perception of otolith increments, with counting achieved through a theoretical exponential pattern and mean estimates given by the FFT. Although this error variability was greater than expected, the method offers improvements in both the speed and accuracy of otolith readings.
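
The GDA itself is the authors' method and is not reproduced here, but the underlying idea, removing the roughly exponential growth trend from a radial intensity profile before spectral analysis so that the FFT peak reveals the number of increments, can be illustrated on synthetic data. The profile shape, trend model, and increment count below are assumptions.

```python
import numpy as np

# Generic sketch of trend demodulation before spectral analysis: increment
# intensity along a radial profile grows roughly exponentially, so we fit and
# remove that trend, then use an FFT to pick out the dominant periodicity
# (one cycle per increment). This is an illustration, not the published GDA.

n = 512
x = np.arange(n)
true_increments = 25
profile = np.exp(0.004 * x) * (1.0 + 0.3 * np.sin(2 * np.pi * true_increments * x / n))

# Fit the exponential trend in log space and demodulate by dividing it out.
coeffs = np.polyfit(x, np.log(profile), 1)
trend = np.exp(np.polyval(coeffs, x))
demodulated = profile / trend - 1.0

# FFT: the index of the strongest non-zero frequency estimates the increment count.
spectrum = np.abs(np.fft.rfft(demodulated))
estimated = int(np.argmax(spectrum[1:]) + 1)
print("estimated increments:", estimated)   # expected: 25
```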

Relevance:

30.00%

Publisher:

Abstract:

In this dissertation, we apply mathematical programming techniques (i.e., integer programming and polyhedral combinatorics) to develop exact approaches for influence maximization on social networks. We study four combinatorial optimization problems that deal with maximizing influence at minimum cost over a social network. To our knowledge, all previous work to date involving influence maximization problems has focused on heuristics and approximation. We start with the following viral marketing problem that has attracted a significant amount of interest from the computer science literature. Given a social network, find a target set of customers to seed with a product. Then, a cascade is triggered by these initial adopters, and other people start to adopt the product due to the influence they receive from earlier adopters. The idea is to find the minimum cost that results in the entire network adopting the product. We first study a problem called the Weighted Target Set Selection (WTSS) Problem. In the WTSS problem, the diffusion can take place over as many time periods as needed and a free product is given out to the individuals in the target set. Restricting the number of time periods that the diffusion takes place over to be one, we obtain a problem called the Positive Influence Dominating Set (PIDS) problem. Next, incorporating partial incentives, we consider a problem called the Least Cost Influence Problem (LCIP). The fourth problem studied is the One Time Period Least Cost Influence Problem (1TPLCIP), which is identical to the LCIP except that we restrict the number of time periods that the diffusion takes place over to be one. We apply a common research paradigm to each of these four problems. First, we work on special graphs: trees and cycles. Based on the insights we obtain from special graphs, we develop efficient methods for general graphs. On trees, we first propose a polynomial time algorithm. More importantly, we present a tight and compact extended formulation. We also project the extended formulation onto the space of the natural variables, which gives the polytope on trees. Next, building upon the result for trees, we derive the polytope on cycles for the WTSS problem, as well as a polynomial time algorithm on cycles. This leads to our contribution on general graphs. For the WTSS problem and the LCIP, using the observation that the influence propagation network must be a directed acyclic graph (DAG), the strong formulation for trees can be embedded into a formulation on general graphs. We use this to design and implement a branch-and-cut approach for the WTSS problem and the LCIP. In our computational study, we are able to obtain high quality solutions for random graph instances with up to 10,000 nodes and 20,000 edges (40,000 arcs) within a reasonable amount of time.
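
The optimization models above all reason over a threshold-style diffusion process. As a point of reference, the sketch below simulates a simple cascade in which a node activates once enough of its neighbours are active; it illustrates the diffusion model only, not the dissertation's extended formulations or branch-and-cut algorithm, and the unweighted activation rule and toy graph are assumptions.

```python
from collections import deque

def cascade(adj, thresholds, seeds):
    """Simulate a simple threshold cascade.

    adj: dict node -> set of neighbours; thresholds: dict node -> number of
    active neighbours needed; seeds: initially active nodes. Returns the set
    of nodes active when the diffusion stops. Illustrates the diffusion model
    only, not the optimization algorithms in the dissertation.
    """
    active = set(seeds)
    influenced = {v: 0 for v in adj}
    queue = deque(active)
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v in active:
                continue
            influenced[v] += 1
            if influenced[v] >= thresholds[v]:
                active.add(v)
                queue.append(v)
    return active

# Toy example: a path 1-2-3-4 where each node needs one active neighbour.
adj = {1: {2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}
thresholds = {1: 1, 2: 1, 3: 1, 4: 1}
print(cascade(adj, thresholds, seeds={1}))   # {1, 2, 3, 4}: seeding node 1 activates the path
```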

Relevance:

30.00%

Publisher:

Abstract:

Continuous sensor stream data are often recorded as a series of discrete points in a database, from which knowledge can be retrieved through queries. Two classes of uncertainty inevitably arise in sensor streams. The first is Uncertainty due to Discrete Sampling (DS Uncertainty): even if every discrete point is correct, the discrete sensor stream is uncertain, that is, it is not exactly like the continuous stream, since some critical points are missing due to the limited capabilities of the sensing equipment and the database server. The second is Uncertainty due to Sampling Error (SE Uncertainty): sensor readings for the same situation cannot be repeated exactly when we record them at different times or use different sensors, since different sampling errors exist. These two uncertainties reduce the efficiency and accuracy of querying common patterns, yet existing algorithms generally resolve only SE Uncertainty. In this paper, we propose a novel method of Correcting Imprecise Readings and Compressing Excrescent (CIRCE) points. In particular, to resolve DS Uncertainty, a novel CIRCE core algorithm is developed within the CIRCE method to correct the missing critical points while compressing the original sensor streams. An experimental study on sensor stream datasets of various sizes validates that the CIRCE core algorithm is more efficient and more accurate than a counterpart algorithm for compressing sensor streams. We also resolve the SE Uncertainty problem within the CIRCE method. An application to querying longest common route patterns validates the effectiveness of the CIRCE method.
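
DS Uncertainty can be seen in a few lines of code: when a continuous signal is sampled at a coarse interval, its true peak can fall between samples, so no recorded point captures the critical value. The sketch below only illustrates this effect on a synthetic signal; it is not the CIRCE algorithm, and the signal and sampling interval are assumptions.

```python
import numpy as np

# Illustration of DS Uncertainty only (not the CIRCE algorithm): the true
# peak of a continuous signal falls between coarse samples, so the recorded
# stream never contains the critical point.

t_fine = np.linspace(0.0, 10.0, 10001)          # dense "continuous" reference
signal = np.sin(t_fine) * np.exp(-0.1 * t_fine)

t_coarse = np.arange(0.0, 10.0, 1.0)            # coarse sensor sampling interval
samples = np.sin(t_coarse) * np.exp(-0.1 * t_coarse)

print("true maximum      :", signal.max())
print("sampled maximum   :", samples.max())
print("missed peak error :", signal.max() - samples.max())
```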

Relevance:

30.00%

Publisher:

Abstract:

The popularity of online location services provides opportunities to discover useful knowledge from the trajectories of moving objects. This paper addresses the problem of mining longest common route (LCR) patterns. As the trajectory of a moving object is generally represented by a sequence of discrete locations sampled at a given interval, different trajectory instances along the same route may be denoted by different sequences of (location, timestamp) points. Thus, the most challenging task in the mining process is to abstract trajectories by the right points. We propose a novel mining algorithm for LCR patterns based on turning regions (LCRTurning), which discovers a sequence of turning regions to abstract a trajectory and then maps the problem into the traditional problem of mining longest common subsequences (LCS). The effectiveness of the LCRTurning algorithm is validated by an experimental study based on various sizes of simulated moving objects datasets. © 2011 Springer-Verlag.
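
Once trajectories are abstracted into sequences of turning regions, the matching step reduces to the classical longest common subsequence problem. The sketch below is the standard textbook LCS dynamic program applied to two hypothetical sequences of turning-region IDs; the IDs are assumptions, and the trajectory-abstraction step of LCRTurning is not shown.

```python
def longest_common_subsequence(a, b):
    """Classic O(len(a) * len(b)) dynamic program for the LCS of two sequences."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    # Backtrack to recover one longest common subsequence.
    out, i, j = [], m, n
    while i > 0 and j > 0:
        if a[i - 1] == b[j - 1]:
            out.append(a[i - 1]); i -= 1; j -= 1
        elif dp[i - 1][j] >= dp[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return out[::-1]

# Hypothetical turning-region IDs for two trajectories along partly shared roads.
route_a = ["R1", "R4", "R7", "R9", "R12"]
route_b = ["R2", "R4", "R7", "R11", "R12"]
print(longest_common_subsequence(route_a, route_b))   # ['R4', 'R7', 'R12']
```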

Relevance:

30.00%

Publisher:

Abstract:

Background Androgen-deprivation therapy is offered to men with prostate cancer who have a rising prostate-specific antigen after curative therapy (PSA relapse) or who are considered not suitable for curative treatment; however, the optimal timing for its introduction is uncertain. We aimed to assess whether immediate androgen-deprivation therapy improves overall survival compared with delayed therapy. Methods In this randomised, multicentre, phase 3, non-blinded trial, we recruited men through 29 oncology centres in Australia, New Zealand, and Canada. Men with prostate cancer were eligible if they had a PSA relapse after previous attempted curative therapy (radiotherapy or surgery, with or without postoperative radiotherapy) or if they were not considered suitable for curative treatment (because of age, comorbidity, or locally advanced disease). We used a database-embedded, dynamically balanced, randomisation algorithm, coordinated by the Cancer Council Victoria, to randomly assign participants (1:1) to immediate androgen-deprivation therapy (immediate therapy arm) or to delayed androgen-deprivation therapy (delayed therapy arm) with a recommended interval of at least 2 years unless clinically contraindicated. Randomisation for participants with PSA relapse was stratified by type of previous therapy, relapse-free interval, and PSA doubling time; randomisation for those with non-curative disease was stratified by metastatic status; and randomisation in both groups was stratified by planned treatment schedule (continuous or intermittent) and treatment centre. Clinicians could prescribe any form and schedule of androgen-deprivation therapy and group assignment was not masked. The primary outcome was overall survival in the intention-to-treat population. The trial closed to accrual in 2012 after review by the independent data monitoring committee, but data collection continued for 18 months until Feb 26, 2014. It is registered with the Australian New Zealand Clinical Trials Registry (ACTRN12606000301561) and ClinicalTrials.gov (NCT00110162). Findings Between Sept 3, 2004, and July 13, 2012, we recruited 293 men (261 with PSA relapse and 32 with non-curable disease). We randomly assigned 142 men to the immediate therapy arm and 151 to the delayed therapy arm. Median follow-up was 5 years (IQR 3·3–6·2) from the date of randomisation. 16 (11%) men died in the immediate therapy arm and 30 (20%) died in the delayed therapy arm. 5-year overall survival was 86·4% (95% CI 78·5–91·5) in the delayed therapy arm versus 91·2% (84·2–95·2) in the immediate therapy arm (log-rank p=0·047). After Cox regression, the unadjusted HR for overall survival for immediate versus delayed arm assignment was 0·55 (95% CI 0·30–1·00; p=0·050). 23 patients had grade 3 treatment-related adverse events. 105 (36%) men had adverse events requiring hospital admission; none of these events were attributable to treatment or differed between treatment-timing groups. The most common serious adverse events were cardiovascular, which occurred in nine (6%) patients in the delayed therapy arm and 13 (9%) in the immediate therapy arm. Interpretation Immediate receipt of androgen-deprivation therapy significantly improved overall survival compared with delayed intervention in men with PSA-relapsed or non-curable prostate cancer. The results provide benchmark evidence of survival rates and morbidity to discuss with men when considering their treatment options. 
Funding Australian National Health and Medical Research Council and Cancer Councils, The Royal Australian and New Zealand College of Radiologists, Mayne Pharma Australia.

Relevance:

30.00%

Publisher:

Abstract:

Two novelties are introduced: (i) a finite-strain semi-implicit integration algorithm compatible with current element technologies and (ii) its application to assumed-strain hexahedra. The Löwdin algorithm is adopted to obtain evolving frames applicable to finite-strain anisotropy, and a weighted least-squares algorithm is used to determine the mixed strain. Löwdin frames are very convenient for modelling anisotropic materials, and weighted least-squares circumvents the use of internal degrees-of-freedom. Heterogeneity of element technologies introduces apparently incompatible constitutive requirements: assumed-strain and enhanced-strain elements can be formulated in terms of either the deformation gradient or the Green–Lagrange strain; many of the high-performance shell formulations are corotational; and constitutive constraints (such as incompressibility, plane stress and zero normal stress in shells) also depend on the specific element formulation. We propose a unified integration algorithm compatible with possibly all element technologies. To assess its validity, a least-squares based hexahedral element is implemented and tested in depth. Basic linear problems as well as 5 finite-strain examples are inspected for correctness and competitive accuracy.
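
Löwdin (symmetric) orthogonalization returns the orthonormal frame closest to a given set of frame vectors, which is why it is convenient for tracking evolving material frames. The sketch below computes it via the SVD identity F(FᵀF)^(-1/2) = UVᵀ; it shows only the orthogonalization step on an illustrative 3x3 frame, not the paper's full semi-implicit integration algorithm.

```python
import numpy as np

def lowdin_frame(F):
    """Closest orthonormal frame to the columns of F.

    Lowdin / symmetric orthogonalization: R = F (F^T F)^(-1/2), computed via
    the SVD as R = U V^T. A generic sketch of the orthogonalization step, not
    the full finite-strain integration algorithm of the paper.
    """
    U, _, Vt = np.linalg.svd(F)
    return U @ Vt

# Example: a mildly sheared and stretched 3x3 frame (illustrative values).
F = np.array([[1.0, 0.3, 0.0],
              [0.0, 1.1, 0.2],
              [0.1, 0.0, 0.9]])
R = lowdin_frame(F)
print(np.allclose(R.T @ R, np.eye(3)))   # True: the recovered frame is orthonormal
```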

Relevance:

30.00%

Publisher:

Abstract:

This paper compares the performance of the complex nonlinear least squares algorithm implemented in the LEVM/LEVMW software with the performance of a genetic algorithm in the characterization of an electrical impedance of known topology. The effect of the number of measured frequency points and of measurement uncertainty on the estimation of circuit parameters is presented. The analysis is performed on the equivalent circuit impedance of a humidity sensor.
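
As a rough illustration of the complex nonlinear least-squares side of such a comparison, the sketch below fits a simple parallel-RC equivalent circuit to noisy synthetic impedance data with scipy.optimize.least_squares, stacking real and imaginary residuals. The circuit topology, parameter values, frequency range, and noise level are assumptions and do not reproduce the LEVM/LEVMW or genetic-algorithm setups of the paper.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical equivalent circuit: a resistor R in parallel with a capacitor C.
# This topology and the synthetic "measured" data below are illustrative
# assumptions, not the humidity sensor model used in the paper.

def impedance(params, w):
    R, C = params
    return R / (1.0 + 1j * w * R * C)

def residuals(params, w, z_meas):
    z = impedance(params, w)
    # Stack real and imaginary parts so the solver sees real-valued residuals.
    return np.concatenate([(z - z_meas).real, (z - z_meas).imag])

rng = np.random.default_rng(7)
w = 2 * np.pi * np.logspace(1, 5, 40)                 # 10 Hz .. 100 kHz
true = (1.0e4, 2.2e-8)                                # R = 10 kOhm, C = 22 nF
z_meas = impedance(true, w) * (1 + 0.01 * rng.standard_normal(w.size))

fit = least_squares(residuals, x0=(1.0e3, 1.0e-9), args=(w, z_meas),
                    bounds=([1.0, 1e-12], [1e7, 1e-3]), x_scale=(1e4, 1e-8))
print("estimated R, C:", fit.x)                       # should be close to (1e4, 2.2e-8)
```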

Relevance:

20.00%

Publisher:

Abstract:

An unstructured mesh finite volume discretisation method for simulating diffusion in anisotropic media in two-dimensional space is discussed. This technique is considered as an extension of the fully implicit hybrid control-volume finite-element method and it retains the local continuity of the flux at the control volume faces. A least squares function reconstruction technique together with a new flux decomposition strategy is used to obtain an accurate flux approximation at the control volume face, ensuring that the overall accuracy of the spatial discretisation maintains second order. This paper highlights that the new technique coincides with the traditional shape function technique when the correction term is neglected and that it significantly increases the accuracy of the previous linear scheme on coarse meshes when applied to media that exhibit very strong to extreme anisotropy ratios. It is concluded that the method can be used on both regular and irregular meshes, and appears independent of the mesh quality.
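
The least-squares reconstruction idea can be shown in isolation: the gradient at a control volume is estimated by fitting the differences between its value and those of its neighbours to the centre-to-centre displacement vectors. The sketch below is that generic building block on illustrative 2-D geometry; it is not the paper's flux-decomposition scheme or the correction term it describes.

```python
import numpy as np

def least_squares_gradient(xc, uc, xn, un):
    """Least-squares reconstruction of the gradient at a cell centre.

    xc, uc: centre coordinates (2,) and value of the cell of interest.
    xn, un: (k, 2) coordinates and (k,) values of neighbouring cell centres.
    Solves min ||d @ grad - (un - uc)||^2, where d holds the centre-to-centre
    displacement vectors. A generic building block, not the paper's full
    flux-decomposition scheme.
    """
    d = xn - xc                       # (k, 2) displacement matrix
    rhs = un - uc                     # (k,) value differences
    grad, *_ = np.linalg.lstsq(d, rhs, rcond=None)
    return grad

# Verify on a linear field u(x, y) = 2x + 3y, whose exact gradient is (2, 3).
xc = np.array([0.0, 0.0]); uc = 0.0
xn = np.array([[1.0, 0.2], [-0.4, 1.0], [0.3, -0.8], [-1.0, -0.1]])
un = 2 * xn[:, 0] + 3 * xn[:, 1]
print(least_squares_gradient(xc, uc, xn, un))   # approximately [2. 3.]
```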