951 results for Selection techniques


Relevance: 30.00%

Abstract:

This research proposes a method for extracting technology intelligence (TI) systematically from a large set of document data. To do this, the internal and external sources in the form of documents, which might be valuable for TI, are first identified. Then the existing techniques and software systems applicable to document analysis are examined. Finally, based on the reviews, a document-mining framework designed for TI is suggested and guidelines for software selection are proposed. The research output is expected to support intelligence operatives in finding suitable techniques and software systems for getting value from document-mining and thus facilitate effective knowledge management. Copyright © 2012 Inderscience Enterprises Ltd.

Relevance: 30.00%

Abstract:

We present a new technique called ‘Tilt Menu’ for extending the selection capabilities of pen-based interfaces. The Tilt Menu is implemented using the 3D orientation information of pen devices while performing selection tasks. The Tilt Menu has the potential to aid traditional one-handed techniques, as it simultaneously generates a secondary input (e.g., a command or parameter selection) while drawing/interacting with the pen tip, without having to use the second hand or another device. We conduct two experiments to explore the performance of the Tilt Menu. In the first experiment, we analyze the effect of the Tilt Menu's parameters, such as menu size and item orientation, on its usability. Results of the first experiment suggest some design guidelines for the Tilt Menu. In the second experiment, the Tilt Menu is compared to two other techniques while performing connect-the-dot tasks using a freeform drawing mechanism. Results of the second experiment show that the Tilt Menu performs better than the Tool Palette and is as good as the Toolglass.
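The abstract does not give the mapping itself, but the core idea — turning the pen's lean direction into a radial menu selection — can be sketched as below. The function name, the sector-centring choice and the 10° dead zone are assumptions for illustration, not the authors' implementation.

```python
def tilt_menu_item(azimuth_deg, n_items=8, tilt_deg=0.0, min_tilt_deg=10.0):
    """Map a pen's 3D orientation to a Tilt Menu item (hypothetical API).

    azimuth_deg  -- direction the pen barrel leans toward (0-360 degrees)
    tilt_deg     -- how far the pen leans away from vertical
    min_tilt_deg -- dead zone: below this tilt nothing is selected
    Returns an item index in [0, n_items) or None.
    """
    if tilt_deg < min_tilt_deg:
        return None  # pen is near vertical: no menu selection
    sector = 360.0 / n_items
    # centre the first item on azimuth 0 so sectors straddle the axes
    return int(((azimuth_deg + sector / 2.0) % 360.0) // sector)

print(tilt_menu_item(95.0, n_items=8, tilt_deg=25.0))  # azimuth 95 falls in item 2
```

Because the azimuth is read continuously while the tip keeps drawing, the selection costs no extra pen stroke — which is the one-handed benefit the paper describes.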

Relevance: 30.00%

Abstract:

One of the most attractive features of derivative spectrometry is its higher resolving power. In the present paper, numerical derivative techniques are evaluated from the viewpoint of the increase in selectivity, the latter being expressed in terms of the interferent equivalent concentration (IEC). Typical spectral interferences are covered, including flat background, sloped background, simple curved background and various types of line overlap with different overlapping degrees, defined as the ratio of the net interfering signal at the analysis wavelength to the peak signal of the interfering line. The IECs in the derivative spectra are decreased by one to two orders of magnitude compared to those in the original spectra and, in most cases, assume values below the conventional detection limits. The overlapping degree is the dominant factor that determines whether an analysis line can be resolved from an interfering line with the derivative techniques. Generally, the second derivative technique is effective only for line overlap with an overlapping degree of less than 0.8. The effects of other factors, such as line shape, data smoothing, step size and the intensity ratio of analyte to interferent, on the performance of the derivative techniques are also discussed. All results are illustrated with practical examples.
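The central-difference second derivative underlying such techniques is easy to reproduce. The sketch below is an illustration, not the paper's procedure: it shows why differentiation removes flat and sloped backgrounds exactly, and why a narrow analyte line dominates a broad interferent (the second derivative scales roughly with the inverse square of the line width). The peak positions, widths and amplitudes are invented for the example.

```python
import math

def second_derivative(y, h=1.0):
    """Central-difference second derivative; the two endpoints are dropped."""
    return [(y[i - 1] - 2 * y[i] + y[i + 1]) / h ** 2
            for i in range(1, len(y) - 1)]

# Narrow analyte line at x = 50 on a sloped background, overlapped by a
# broad interfering band centred at x = 70 (all values hypothetical).
spectrum = [0.02 * x + 1.0                           # sloped background
            + math.exp(-((x - 50) / 3.0) ** 2)       # narrow analyte line
            + 0.8 * math.exp(-((x - 70) / 15.0) ** 2)  # broad interferent
            for x in range(101)]

d2 = second_derivative(spectrum)
# Constant and linear terms differentiate to zero, and the narrow line's
# second derivative (~2A/w^2) far exceeds the broad band's, so the most
# negative lobe sits at the analyte line centre.
peak_index = min(range(len(d2)), key=lambda i: d2[i]) + 1  # shift for dropped endpoint
print(peak_index)
```

Repeating the differentiation gives higher orders, at the cost of amplifying noise — which is why the abstract's discussion of data smoothing and step size matters in practice.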

Relevance: 30.00%

Abstract:

R. Jensen and Q. Shen. Fuzzy-Rough Sets Assisted Attribute Selection. IEEE Transactions on Fuzzy Systems, vol. 15, no. 1, pp. 73-89, 2007.

Relevance: 30.00%

Abstract:

Computational Intelligence and Feature Selection provides a high-level audience with both the background and the fundamental ideas behind feature selection, with an emphasis on techniques based on rough and fuzzy sets, including their hybridizations. It introduces set theory, fuzzy set theory, rough set theory, and fuzzy-rough set theory, and illustrates the power and efficacy of the feature selection approaches described through real-world applications and worked examples. Program files implementing the major algorithms covered, together with the necessary instructions and datasets, are available on the Web.
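As a taste of the rough-set side of the book, the classical crisp (non-fuzzy) QuickReduct algorithm can be sketched in a few lines: greedily add the attribute that most increases the dependency degree — the fraction of objects whose equivalence class under the chosen attributes is consistent with a single decision label — until it matches that of the full attribute set. This is a minimal sketch of the textbook algorithm, not code from the book's companion files.

```python
from collections import defaultdict

def partition(rows, attrs):
    """Group row indices into equivalence classes over the given attributes."""
    blocks = defaultdict(list)
    for i, row in enumerate(rows):
        blocks[tuple(row[a] for a in attrs)].append(i)
    return list(blocks.values())

def gamma(rows, labels, attrs):
    """Rough-set dependency degree: fraction of rows whose equivalence
    class (under attrs) maps to a single decision label."""
    if not attrs:
        return 0.0
    pos = sum(len(b) for b in partition(rows, attrs)
              if len({labels[i] for i in b}) == 1)
    return pos / len(rows)

def quickreduct(rows, labels, n_attrs):
    """Greedily grow a reduct until its dependency degree equals that of
    the full attribute set."""
    full = gamma(rows, labels, list(range(n_attrs)))
    reduct = []
    while gamma(rows, labels, reduct) < full:
        best = max((a for a in range(n_attrs) if a not in reduct),
                   key=lambda a: gamma(rows, labels, reduct + [a]))
        reduct.append(best)
    return sorted(reduct)

# Toy decision table: attribute 1 alone determines the label.
rows = [(0, 0), (0, 1), (1, 0), (1, 1)]
labels = [0, 1, 0, 1]
print(quickreduct(rows, labels, 2))
```

The fuzzy-rough extension the book emphasizes replaces the crisp equivalence classes with fuzzy similarity classes, but the greedy skeleton is the same.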

Relevance: 30.00%

Abstract:

As distributed information services like the World Wide Web become increasingly popular on the Internet, problems of scale are clearly evident. A promising technique that addresses many of these problems is service (or document) replication. However, when a service is replicated, clients then need the additional ability to find a "good" provider of that service. In this paper we report on techniques for finding good service providers without a priori knowledge of server location or network topology. We consider the use of two principal metrics for measuring distance in the Internet: hops and round-trip latency. We show that these two metrics yield very different results in practice. Surprisingly, we show data indicating that the number of hops between two hosts in the Internet is not strongly correlated with round-trip latency. Thus, the distance in hops between two hosts is not necessarily a good predictor of the expected latency of a document transfer. Instead of using known or measured distances in hops, we show that the extra runtime cost incurred by dynamic latency measurement is well justified by the resulting improvement in performance. In addition, we show that selection based on dynamic latency measurement performs much better in practice than any static selection scheme. Finally, the difference between the distributions of hops and latencies is fundamental enough to suggest differences in algorithms for server replication. We show that conclusions drawn about service replication based on the distribution of hops need to be revised when the distribution of latencies is considered instead.
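The dynamic selection policy the paper argues for can be sketched as: probe every replica at request time and keep the one with the lowest round-trip latency. The sketch below measures RTT as TCP connect time (one reasonable proxy, not necessarily the paper's probe) and makes the probe injectable so the policy can be exercised with canned measurements; the mirror hostnames are hypothetical.

```python
import socket
import time

def probe_rtt(host, port=80, timeout=1.0):
    """Measure one round trip as the time to complete a TCP handshake.
    Returns seconds, or None if the host is unreachable."""
    start = time.monotonic()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return time.monotonic() - start
    except OSError:
        return None

def pick_server(hosts, probe=probe_rtt):
    """Dynamic selection: probe each replica, drop the unreachable ones,
    and return the host with the smallest measured latency."""
    timed = [(rtt, h) for h in hosts if (rtt := probe(h)) is not None]
    return min(timed)[1] if timed else None

# Canned RTTs stand in for live measurements (hypothetical replicas).
fake_rtts = {"mirror-a.example": 0.180,
             "mirror-b.example": 0.045,
             "mirror-c.example": None}  # c is down
print(pick_server(fake_rtts, probe=fake_rtts.get))
```

A static scheme would rank the mirrors once (e.g. by hop count) and never revisit the choice; the paper's data suggest that ranking would often disagree with the latencies actually observed at request time.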

Relevance: 30.00%

Abstract:

BACKGROUND: Few educational resources have been developed to inform patients' renal replacement therapy (RRT) selection decisions. Patients progressing toward end-stage renal disease (ESRD) must decide among multiple treatment options with varying characteristics. Complex information about treatments must be adequately conveyed to patients with different educational backgrounds and informational needs. Decisions about treatment options also require family input, as families often participate in patients' treatment and support patients' decisions. We describe the development, design, and preliminary evaluation of an informational, evidence-based, and patient- and family-centered decision aid for patients with ESRD and varying levels of health literacy, health numeracy, and cognitive function. METHODS: We designed a decision aid comprising a complementary video and informational handbook. We based our development process on data previously obtained from qualitative focus groups and systematic literature reviews. We simultaneously developed the video and handbook in "stages." For the video, stages included (1) directed interviews with culturally appropriate patients and families and preliminary script development, (2) video production, and (3) screening the video with patients and their families. For the handbook, stages comprised (1) preliminary content design, (2) a mixed-methods pilot study among diverse patients to assess comprehension of handbook material, and (3) screening the handbook with patients and their families. RESULTS: The video and handbook both addressed potential benefits and trade-offs of treatment selections. The 50-minute video consisted of demographically diverse patients and their families describing their positive and negative experiences with selecting a treatment option. The video also incorporated health professionals' testimonials regarding various considerations that might influence patients' and families' treatment selections.
The handbook comprised written text, pictures of patients and health care providers, and diagrams describing the findings and quality of scientific studies comparing treatments. The handbook text was written at a 4th to 6th grade reading level. Pilot study results demonstrated that a majority of patients could understand the information presented in the handbook. Patients and families screening the nearly completed video and handbook reviewed the materials favorably. CONCLUSIONS: This rigorously designed decision aid may help patients and families make informed decisions about their treatment options for RRT that are well aligned with their values.

Relevance: 30.00%

Abstract:

Agglomerative cluster analyses encompass many techniques, which have been widely used in various fields of science. In biology, and specifically ecology, datasets are generally highly variable and may contain outliers, which makes it more difficult to identify the number of clusters. Here we present a new criterion to statistically determine the optimal partition level in a classification tree. The criterion's robustness is tested against perturbed data (outliers), using an observation or variable with randomly generated values. The technique, called the Random Simulation Test (RST), is tested on (1) the well-known Iris dataset [Fisher, R.A., 1936. The use of multiple measurements in taxonomic problems. Ann. Eugenic. 7, 179–188], (2) simulated data with predetermined numbers of clusters following Milligan and Cooper [Milligan, G.W., Cooper, M.C., 1985. An examination of procedures for determining the number of clusters in a data set. Psychometrika 50, 159–179] and finally (3) applied to real copepod community data previously analyzed in Beaugrand et al. [Beaugrand, G., Ibanez, F., Lindley, J.A., Reid, P.C., 2002. Diversity of calanoid copepods in the North Atlantic and adjacent seas: species associations and biogeography. Mar. Ecol. Prog. Ser. 232, 179–195]. The technique is compared to several standard techniques. RST generally performed better than existing algorithms on simulated data and proved to be especially efficient with highly variable datasets.

Relevance: 30.00%

Abstract:

Feature selection and feature weighting are useful techniques for improving the classification accuracy of the K-nearest-neighbor (K-NN) rule. The term feature selection refers to algorithms that select the best subset of the input feature set. In feature weighting, each feature is multiplied by a weight value proportional to the ability of the feature to distinguish pattern classes. In this paper, a novel hybrid approach is proposed for simultaneous feature selection and feature weighting of the K-NN rule, based on the Tabu Search (TS) heuristic. The proposed TS heuristic in combination with the K-NN classifier is compared with several classifiers on various available data sets. The results indicate a significant improvement in classification accuracy. The proposed TS heuristic is also compared with various feature selection algorithms. The experiments performed revealed that the proposed hybrid TS heuristic is superior to both simple TS and sequential search algorithms. We also present results for the classification of prostate cancer using multispectral images, an important problem in biomedicine.
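A minimal sketch of the selection half of such an approach (feature weighting omitted for brevity): Tabu Search over feature-subset bit vectors, scored by leave-one-out 1-NN accuracy. The single-bit-flip neighbourhood, tabu tenure and aspiration rule below are generic TS choices, not the paper's exact configuration.

```python
import random

def knn_accuracy(data, labels, mask):
    """Leave-one-out 1-NN accuracy using only the features set in `mask`."""
    if not any(mask):
        return 0.0
    def dist(a, b):
        return sum((x - y) ** 2 for x, y, m in zip(a, b, mask) if m)
    hits = 0
    for i in range(len(data)):
        j = min((k for k in range(len(data)) if k != i),
                key=lambda k: dist(data[i], data[k]))
        hits += labels[j] == labels[i]
    return hits / len(data)

def tabu_feature_select(data, labels, n_iter=20, tabu_len=3):
    """Tabu Search over feature subsets: flip one feature bit per move,
    keep recently flipped features tabu, and allow a tabu move only when
    it beats the best subset found so far (aspiration criterion)."""
    n = len(data[0])
    cur = [1] * n
    best, best_acc = cur[:], knn_accuracy(data, labels, cur)
    tabu = []
    for _ in range(n_iter):
        moves = []
        for f in range(n):
            cand = cur[:]
            cand[f] ^= 1
            acc = knn_accuracy(data, labels, cand)
            if f not in tabu or acc > best_acc:   # aspiration
                moves.append((acc, f))
        if not moves:
            continue
        acc, f = max(moves)       # best admissible neighbour
        cur[f] ^= 1
        tabu = (tabu + [f])[-tabu_len:]
        if acc > best_acc:
            best, best_acc = cur[:], acc
    return best, best_acc

# Toy data: feature 0 separates the classes, feature 1 is loud noise.
random.seed(0)
data = [(c, 10 * random.random()) for c in [0] * 10 + [1] * 10]
labels = [0] * 10 + [1] * 10
best, best_acc = tabu_feature_select(data, labels)
print(best, best_acc)
```

Extending this to simultaneous weighting would enlarge each solution from a bit vector to a vector of discrete weights, with the same tabu machinery around it.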

Relevance: 30.00%

Abstract:

Gabor features have been recognized as one of the most successful face representations. Encouraged by the results of this approach, other kinds of facial representations, based on steerable Gaussian first-order kernels and the Harris corner detector, are proposed in this paper. In order to reduce the high-dimensional feature space, PCA and LDA techniques are employed. Once the features have been extracted, the AdaBoost learning algorithm is used to select and combine the most representative features. The experimental results on the XM2VTS database show an encouraging recognition rate, with an important improvement over face descriptors based only on Gabor filters.

Relevance: 30.00%

Abstract:

Statistical downscaling (SD) methods have become a popular, low-cost and accessible means of bridging the gap between the coarse spatial resolution at which climate models output climate scenarios and the finer spatial scale at which impact modellers require these scenarios, with various SD techniques used for a wide range of applications across the world. This paper compares the Generator for Point Climate Change (GPCC) model and the Statistical DownScaling Model (SDSM)—two contrasting SD methods—in terms of their ability to generate precipitation series under non-stationary conditions across ten contrasting global climates. The mean, maximum and a selection of distribution statistics, as well as the cumulative frequencies of dry and wet spells for four different temporal resolutions, were compared between the models and the observed series for a validation period. Results indicate that both methods can generate daily precipitation series that generally closely mirror observed series for a wide range of non-stationary climates. However, GPCC tends to overestimate higher precipitation amounts, whilst SDSM tends to underestimate these. This implies that GPCC is more likely to overestimate the effects of precipitation on a given impact sector, whilst SDSM is likely to underestimate the effects. GPCC performs better than SDSM in reproducing wet and dry day frequency, which is a key advantage for many impact sectors. Overall, the mixed performance of the two methods illustrates the importance of users performing a thorough validation in order to determine the influence of simulated precipitation on their chosen impact sector.
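The wet- and dry-spell statistics used in such validations are straightforward to compute from a daily series. The sketch below assumes the common convention that a day with at least 1 mm of precipitation counts as wet, which may differ from the threshold the paper used.

```python
from itertools import groupby

def spell_lengths(precip, wet_threshold=1.0):
    """Split a daily precipitation series (mm) into consecutive runs of
    wet and dry days; returns (wet_run_lengths, dry_run_lengths)."""
    wet_runs, dry_runs = [], []
    for is_wet, run in groupby(precip, key=lambda p: p >= wet_threshold):
        (wet_runs if is_wet else dry_runs).append(len(list(run)))
    return wet_runs, dry_runs

# Ten days of made-up precipitation totals (mm)
series = [0.0, 0.2, 5.1, 3.0, 0.0, 0.0, 12.4, 0.0, 1.1, 2.2]
wet, dry = spell_lengths(series)
print(wet, dry)  # wet spells of 2, 1 and 2 days; dry spells of 2, 2 and 1
```

Comparing the empirical distributions of these run lengths between a downscaled series and observations is exactly the kind of check that separates GPCC's and SDSM's behaviour in the paper.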

Relevance: 30.00%

Abstract:

We present a novel method for the light-curve characterization of Pan-STARRS1 Medium Deep Survey (PS1 MDS) extragalactic sources into stochastic variables (SVs) and burst-like (BL) transients, using multi-band image-differencing time-series data. We select detections in difference images associated with galaxy hosts using a star/galaxy catalog extracted from the deep PS1 MDS stacked images, and adopt a maximum a posteriori formulation to model their difference-flux time-series in four Pan-STARRS1 photometric bands gP1, rP1, iP1, and zP1. We use three deterministic light-curve models to fit BL transients; a Gaussian, a Gamma distribution, and an analytic supernova (SN) model, and one stochastic light-curve model, the Ornstein-Uhlenbeck process, in order to fit variability that is characteristic of active galactic nuclei (AGNs). We assess the quality of fit of the models band-wise and source-wise, using their estimated leave-one-out cross-validation likelihoods and corrected Akaike information criteria. We then apply a K-means clustering algorithm on these statistics, to determine the source classification in each band. The final source classification is derived as a combination of the individual filter classifications, resulting in two measures of classification quality, from the averages across the photometric filters of (1) the classifications determined from the closest K-means cluster centers, and (2) the square distances from the clustering centers in the K-means clustering spaces. For a verification set of AGNs and SNe, we show that SV and BL occupy distinct regions in the plane constituted by these measures. We use our clustering method to characterize 4361 extragalactic image difference detected sources, in the first 2.5 yr of the PS1 MDS, into 1529 BL and 2262 SV, with a purity of 95.00% for AGNs and 90.97% for SNe based on our verification sets.
We combine our light-curve classifications with their nuclear or off-nuclear host galaxy offsets, to define a robust photometric sample of 1233 AGNs and 812 SNe. With these two samples, we characterize their variability and host galaxy properties, and identify simple photometric priors that would enable their real-time identification in future wide-field synoptic surveys.
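The clustering step — K-means applied to per-band fit statistics — can be sketched with plain Lloyd's algorithm. The two-dimensional points below are mock stand-ins for the paper's (goodness-of-fit, cluster-distance) statistics, and the deterministic farthest-first seeding is an implementation choice of this sketch, not the paper's.

```python
def sqdist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans(points, k=2, n_iter=20):
    """Plain Lloyd's algorithm with deterministic farthest-first seeding:
    alternate nearest-centre assignment and centre recomputation."""
    centres = [points[0]]
    while len(centres) < k:
        centres.append(max(points,
                           key=lambda p: min(sqdist(p, c) for c in centres)))
    for _ in range(n_iter):
        labels = [min(range(k), key=lambda j: sqdist(p, centres[j]))
                  for p in points]
        for j in range(k):
            members = [p for p, l in zip(points, labels) if l == j]
            if members:  # keep the old centre if a cluster empties out
                centres[j] = tuple(sum(xs) / len(members)
                                   for xs in zip(*members))
    return centres, labels

# Two mock groups in a statistics plane, standing in for the BL-like and
# SV-like regions the paper finds (values illustrative only).
pts = [(0.1, 0.2), (0.2, 0.1), (0.15, 0.15),
       (0.9, 0.8), (0.8, 0.9), (0.85, 0.85)]
centres, labels = kmeans(pts, k=2)
print(labels)
```

In the paper this runs band-wise, and the per-filter cluster assignments and square distances are then averaged to give the two final classification-quality measures.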

Relevance: 30.00%

Abstract:

Background
Low patient adherence to treatment is associated with poorer health outcomes in bronchiectasis. We sought to use the Theoretical Domains Framework (TDF) (a framework derived from 33 psychological theories) and behavioural change techniques (BCTs) to define the content of an intervention to change patients' adherence in bronchiectasis (Stages 1 and 2) and stakeholder expert panels to define its delivery (Stage 3).

Methods
We conducted semi-structured interviews with patients with bronchiectasis about barriers and motivators to adherence to treatment, and focus groups or interviews with bronchiectasis healthcare professionals (HCPs) about their ability to change patients' adherence to treatment. We coded these data to the 12-domain TDF to identify relevant domains for patients and HCPs (Stage 1). Three researchers independently mapped relevant domains for patients and HCPs to a list of 35 BCTs to identify two lists (patient and HCP) of potential BCTs for inclusion (Stage 2). We presented these lists to three expert panels (two with patients and one with HCPs/academics from across the UK). We asked the panels who the intervention should target, who should deliver it, at what intensity, in what format and setting, and using which outcome measures (Stage 3).

Results
Eight TDF domains were perceived to influence patients’ and HCPs’ behaviours: Knowledge, Skills, Beliefs about capability, Beliefs about consequences, Motivation, Social influences, Behavioural regulation and Nature of behaviours (Stage 1). Twelve BCTs common to patients and HCPs were included in the intervention: Monitoring, Self-monitoring, Feedback, Action planning, Problem solving, Persuasive communication, Goal/target specified:behaviour/outcome, Information regarding behaviour/outcome, Role play, Social support and Cognitive restructuring (Stage 2). Participants thought that an individualised combination of these BCTs should be delivered to all patients, by a member of staff, over several one-to-one and/or group visits in secondary care. Efficacy should be measured using pulmonary exacerbations, hospital admissions and quality of life (Stage 3).

Conclusions
Twelve BCTs form the intervention content. An individualised selection from these 12 BCTs will be delivered to all patients over several face-to-face visits in secondary care. Future research should focus on developing physical materials to aid delivery of the intervention prior to feasibility and pilot testing. If effective, this intervention may improve adherence and health outcomes for those with bronchiectasis in the future.

Relevance: 30.00%

Abstract:

High gene flow is considered the norm for most marine organisms and is expected to limit their ability to adapt to local environments. Few studies have directly compared the patterns of differentiation at neutral and selected gene loci in marine organisms. We analysed a transcriptome-derived panel of 281 SNPs in Atlantic herring (Clupea harengus), a highly migratory small pelagic fish, for elucidating neutral and selected genetic variation among populations and to identify candidate genes for environmental adaptation. We analysed 607 individuals from 18 spawning locations in the northeast Atlantic, including two temperature clines (5-12 °C) and two salinity clines (5-35‰). By combining genome scan and landscape genetic analyses, four genetically distinct groups of herring were identified: Baltic Sea, Baltic-North Sea transition area, North Sea/British Isles and North Atlantic; notably, samples exhibited divergent clustering patterns for neutral and selected loci. We found statistically strong evidence for divergent selection at 16 outlier loci on a global scale, and significant correlations with temperature and salinity at nine loci. On regional scales, we identified two outlier loci with parallel patterns across temperature clines and five loci associated with temperature in the North Sea/North Atlantic. Likewise, we found seven replicated outliers, of which five were significantly associated with low salinity across both salinity clines. Our results reveal a complex pattern of varying spatial genetic variation among outlier loci, likely reflecting adaptations to local environments. In addition to disclosing the fine scale of local adaptation in a highly vagile species, our data emphasize the need to preserve functionally important biodiversity.

Relevance: 30.00%

Abstract:

This paper investigates the potential improvement in signal reliability for indoor off-body communications channels operating at 5.8 GHz using switched diversity techniques. In particular, we investigate the performance of switch-and-stay combining (SSC), switch-and-examine combining (SEC) and switch-and-examine combining with post-examining selection (SECps) schemes which utilize multiple spatially separated antennas at the base station. During the measurements a test subject, wearing an antenna on his chest, performed a number of walking movements towards and then away from a uniform linear array. It was found that all of the considered diversity schemes provided a worthwhile signal improvement. However, the performance of the diversity systems varied according to the switching threshold that was adopted. To model the fading envelope observed at the output of each of the combiners, we have applied diversity-specific equations developed under the assumption of Nakagami-$m$ fading. As a measure of the goodness-of-fit, the Kullback-Leibler divergence between the empirical and theoretical probability density functions (PDFs) was calculated and found to be close to 0. To assist with the interpretation of the goodness-of-fit achieved in this study, the standard deviation, $\sigma$, of a zero-mean, $\sigma^2$ variance Gaussian PDF used to approximate a zero-mean, unit variance Gaussian PDF is also presented. These were generally quite close to 1, indicating that the theoretical models provided an adequate fit to the measured data.
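Of the schemes compared, switch-and-stay combining is the simplest to sketch: the receiver stays on its current branch until that branch's envelope falls below the switching threshold, then switches to the other branch regardless of the other branch's level. The per-sample simulation below uses Rayleigh envelopes (the Nakagami-$m$ model the authors fit, in the special case $m = 1$) and an arbitrary threshold of 0.5; none of the numbers correspond to the measured channel.

```python
import random

def ssc_combine(branch_a, branch_b, threshold):
    """Switch-and-stay combining over two sampled signal envelopes: stay
    on the current branch until its level drops below `threshold`, then
    switch to the other branch unconditionally."""
    branches = (branch_a, branch_b)
    out, current = [], 0
    for i in range(len(branch_a)):
        if branches[current][i] < threshold:
            current ^= 1          # switch regardless of the other level
        out.append(branches[current][i])
    return out

# Rayleigh envelopes: the square root of an exponential power sample.
random.seed(4)
a = [random.expovariate(1.0) ** 0.5 for _ in range(2000)]
b = [random.expovariate(1.0) ** 0.5 for _ in range(2000)]
combined = ssc_combine(a, b, threshold=0.5)
print(sum(combined) / len(combined) > sum(a) / len(a))
```

SEC differs only in that the switch inspects the other branch's level before accepting it, and SECps additionally falls back to the best branch examined, which is why the paper needs distinct theoretical envelope PDFs for each combiner.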