989 results for adaptive operator selection
Abstract:
Given $n$ independent replicates of a jointly distributed pair $(X,Y)\in {\cal R}^d \times {\cal R}$, we wish to select from a fixed sequence of model classes ${\cal F}_1, {\cal F}_2, \ldots$ a deterministic prediction rule $f: {\cal R}^d \to {\cal R}$ whose risk is small. We investigate the possibility of empirically assessing the {\em complexity} of each model class, that is, the actual difficulty of the estimation problem within each class. The estimated complexities are in turn used to define an adaptive model selection procedure, which is based on complexity penalized empirical risk. The available data are divided into two parts. The first is used to form an empirical cover of each model class, and the second is used to select a candidate rule from each cover based on empirical risk. The covering radii are determined empirically to optimize a tight upper bound on the estimation error. An estimate is chosen from the list of candidates in order to minimize the sum of class complexity and empirical risk. A distinguishing feature of the approach is that the complexity of each model class is assessed empirically, based on the size of its empirical cover. Finite sample performance bounds are established for the estimates, and these bounds are applied to several non-parametric estimation problems. The estimates are shown to achieve a favorable tradeoff between approximation and estimation error, and to perform as well as if the distribution-dependent complexities of the model classes were known beforehand. In addition, it is shown that the estimate can be consistent, and even possess near optimal rates of convergence, when each model class has an infinite VC or pseudo dimension. For regression estimation with squared loss we modify our estimate to achieve a faster rate of convergence.
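As an illustration of the selection rule described above (the abstract does not give the exact penalty, so the specific form of $\hat C_k$ below is an assumption for concreteness only): split the sample into $D_1$ and $D_2$; from $D_1$ form an empirical cover $\hat{\cal F}_k$ of each class ${\cal F}_k$; on $D_2$ take $\hat f_k = \arg\min_{f \in \hat{\cal F}_k} \hat L_{D_2}(f)$, where $\hat L_{D_2}$ denotes empirical risk; and finally return $\hat f_{\hat k}$ with
$$\hat k = \arg\min_k \big\{ \hat L_{D_2}(\hat f_k) + \hat C_k \big\}, \qquad \hat C_k \propto \sqrt{\log |\hat{\cal F}_k| \,/\, |D_2|},$$
so that the penalty grows with the size of the empirical cover, standing in for the paper's empirically assessed complexity.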
Abstract:
This paper presents automated segmentation of structures in the Head and Neck (H\&N) region, using an active contour-based joint registration and segmentation model. A new atlas selection strategy is also used. Segmentation is performed based on the dense deformation field computed from the registration of selected structures in the atlas image that have distinct boundaries, onto the patient's image. This approach results in robust segmentation of the structures of interest, even in the presence of tumors, or anatomical differences between the atlas and the patient image. For each patient, an atlas image is selected from the available atlas-database, based on the similarity metric value, computed after performing an affine registration between each image in the atlas-database and the patient's image. Unlike many of the previous approaches in the literature, the similarity metric is not computed over the entire image region; rather, it is computed only in the regions of soft tissue structures to be segmented. Qualitative and quantitative evaluation of the results is presented.
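The abstract does not specify which similarity metric is used; the sketch below (all function names hypothetical) illustrates the atlas-selection step with normalized cross-correlation as a stand-in metric, evaluated only inside a soft-tissue mask, assuming the affine registrations have already been performed elsewhere.

```python
import numpy as np

def masked_ncc(a, b, mask):
    """Normalized cross-correlation restricted to a region-of-interest mask."""
    a, b = a[mask].astype(float), b[mask].astype(float)
    a -= a.mean()
    b -= b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b) / denom if denom > 0 else 0.0

def select_atlas(patient_img, registered_atlases, soft_tissue_mask):
    """Pick the atlas whose affinely registered image is most similar to the
    patient image, with similarity computed only over the soft-tissue ROI."""
    scores = [masked_ncc(atlas, patient_img, soft_tissue_mask)
              for atlas in registered_atlases]
    return int(np.argmax(scores)), scores
```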
Abstract:
To solve multi-objective problems, multiple reward signals are often scalarized into a single value and further processed using established single-objective problem solving techniques. While the field of multi-objective optimization has made many advances in applying scalarization techniques to obtain good solution trade-offs, the utility of applying these techniques in the multi-objective multi-agent learning domain has not yet been thoroughly investigated. Agents learn the value of their decisions by linearly scalarizing their reward signals at the local level, and acceptable system-wide behaviour emerges. However, the non-linear relationship between the weighting parameters of the scalarization function and the learned policy makes the discovery of system-wide trade-offs time consuming. Our first contribution is a thorough analysis of well-known scalarization schemes within the multi-objective multi-agent reinforcement learning setting. The analysed approaches intelligently explore the weight space in order to find a wider range of system trade-offs. As our second contribution, we propose a novel adaptive weight algorithm which interacts with the underlying local multi-objective solvers and allows for a better coverage of the Pareto front. Our third contribution is the experimental validation of our approach by learning bi-objective policies in self-organising smart camera networks. We note that our algorithm (i) explores the objective space faster on many problem instances, (ii) obtains solutions that exhibit a larger hypervolume, and (iii) acquires a greater spread in the objective space.
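As a concrete illustration of linear scalarization with weight adaptation (a generic sketch under assumed names, not the paper's adaptive weight algorithm):

```python
import numpy as np

def scalarize(reward_vec, weights):
    """Linear scalarization: collapse a reward vector into a single scalar."""
    return float(np.dot(reward_vec, weights))

def adapt_weights(archive, n_objectives, step=0.1):
    """Toy adaptive-weight rule (hypothetical): increase the weight of
    objectives whose best achieved value in the solution archive is lowest,
    so the next scalarized run is steered towards less-covered regions."""
    if not archive:
        w = np.ones(n_objectives)
    else:
        best_per_obj = np.max(np.array(archive), axis=0)
        w = np.ones(n_objectives) + step * (best_per_obj.max() - best_per_obj)
    return w / w.sum()
```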
Abstract:
Accuracy and mesh generation are key issues for the high-resolution hydrodynamic modelling of the whole Great Barrier Reef. Our objective is to generate suitable unstructured grids that can resolve topological and dynamical features like tidal jets and recirculation eddies in the wake of islands. A new strategy is suggested to refine the mesh in areas of interest taking into account the bathymetric field and an approximated distance to islands and reefs. Such a distance is obtained by solving an elliptic differential operator, with specific boundary conditions. The meshes produced illustrate both the validity and the efficiency of the adaptive strategy. Selection of refinement and geometrical parameters is discussed.
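The abstract only mentions an elliptic differential operator; one standard construction of an approximate distance field $d$ to islands and reefs, assumed here purely for illustration, is to solve $\nabla^2 \phi = -1$ in the ocean domain with $\phi = 0$ on island and reef boundaries and a homogeneous Neumann condition on the remaining boundaries, and then recover $d \approx -|\nabla\phi| + \sqrt{|\nabla\phi|^2 + 2\phi}$, which can be fed, together with the bathymetry, into the mesh-size field.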
Abstract:
The identification of signatures of natural selection in genomic surveys has become an area of intense research, stimulated by the increasing ease with which genetic markers can be typed. Loci identified as subject to selection may be functionally important, and hence (weak) candidates for involvement in disease causation. They can also be useful in determining the adaptive differentiation of populations, and exploring hypotheses about speciation. Adaptive differentiation has traditionally been identified from differences in allele frequencies among different populations, summarised by an estimate of F_ST. Low outliers relative to an appropriate neutral population-genetics model indicate loci subject to balancing selection, whereas high outliers suggest adaptive (directional) selection. However, the problem of identifying statistically significant departures from neutrality is complicated by confounding effects on the distribution of F_ST estimates, and current methods have not yet been tested in large-scale simulation experiments. Here, we simulate data from a structured population at many unlinked, diallelic loci that are predominantly neutral but with some loci subject to adaptive or balancing selection. We develop a hierarchical-Bayesian method, implemented via Markov chain Monte Carlo (MCMC), and assess its performance in distinguishing the loci simulated under selection from the neutral loci. We also compare this performance with that of a frequentist method, based on moment-based estimates of F_ST. We find that both methods can identify loci subject to adaptive selection when the selection coefficient is at least five times the migration rate. Neither method could reliably distinguish loci under balancing selection in our simulations, even when the selection coefficient is twenty times the migration rate.
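For reference (the abstract does not give the estimator used), a common moment-style definition is $F_{ST} = (H_T - H_S)/H_T$, where $H_T$ is the expected heterozygosity of the pooled population and $H_S$ is the average expected heterozygosity within subpopulations; loci whose estimated $F_{ST}$ is an unusually high outlier under the neutral model are candidates for directional selection, while unusually low outliers point to balancing selection.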
Abstract:
As a consequence of selective pressure exerted by the immune response during hepatitis C virus (HCV) infection, a high rate of nucleotide mutations in the viral genome is observed, which leads to the emergence of viral escape mutants. The aim of this study was to evaluate the evolution of the amino acid (aa) sequence of the HCV nonstructural protein 3 (NS3) in viral isolates after liver transplantation. Six patients with HCV-induced liver disease undergoing liver transplantation (LT) were followed up for sequence analysis. Hepatitis C recurrence was observed in all patients after LT. The rate of synonymous (dS) nucleotide substitutions was much higher than that of nonsynonymous (dN) ones in the NS3 encoding region. The high values of the dS/dN ratios suggest no sustained adaptive (positive) selection pressure and, therefore, the absence of specific NS3 viral populations. Clinical genotype assignments were supported by phylogenetic analysis. Serial samples from each patient showed lower mean nucleotide genetic distance when compared with samples of the same HCV genotype and subtype. The NS3 samples studied had an N-terminal aa sequence with several differences compared with reference sequences, mainly in genotype 1b-infected patients. Compared with the pre-LT sequences, a few reverted aa substitutions and several established aa substitutions were observed at the N-terminus of NS3 after LT. Sites described to be involved in important functions of NS3, notably those of the catalytic triad and zinc binding, remained unaltered in terms of aa sequence. Rare or frequent aa substitutions occurred indiscriminately in different positions. Several cytotoxic T lymphocyte epitopes described for HCV were present in our 1b samples. Nevertheless, the deduced secondary structure of the NS3 protease showed a few alterations in samples from genotype 3a patients, but none were seen in 1b cases. Our data, obtained from patients under strong selective pressure during LT, show that the NS3 protease remains well conserved, mainly in HCV 3a patients. This reinforces its potential use as an antigenic candidate for further studies aiming at the development of a protective immune response.
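For context (standard interpretation, not specific to this study's estimator): writing $\omega = d_N/d_S$, values $\omega < 1$ (i.e., high $d_S/d_N$) indicate purifying selection, $\omega \approx 1$ is consistent with neutral evolution, and $\omega > 1$ suggests positive (adaptive) selection; the high $d_S/d_N$ ratios reported here are therefore read as the absence of sustained adaptive selection on NS3.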
Abstract:
Overview of the Passenger Service
- Connects Iowa City, Quad Cities and Chicago, 219.5 miles
- Twice-daily service each way, 4 hours and 15 minutes travel time
- 246,800 passengers in the first year (676 per day)
- Project construction cost $310 million (80% federal, 14.5% Illinois, 5.4% Iowa)
- On-time performance 90% or better (trains arrive within 10 minutes of schedule)
- Competitive passenger rail service operator selection
- Iowa's annual share of operating cost support averages $3 million
Abstract:
This report addresses the problem of achieving cooperation within small- to medium-sized teams of heterogeneous mobile robots. I describe a software architecture I have developed, called ALLIANCE, that facilitates robust, fault-tolerant, reliable, and adaptive cooperative control. In addition, an extended version of ALLIANCE, called L-ALLIANCE, is described, which incorporates a dynamic parameter update mechanism that allows teams of mobile robots to improve the efficiency of their mission performance through learning. A number of experimental results of implementing these architectures on both physical and simulated mobile robot teams are described. In addition, this report presents the results of studies of a number of issues in mobile robot cooperation, including fault-tolerant cooperative control, adaptive action selection, distributed control, robot awareness of team member actions, improving efficiency through learning, inter-robot communication, action recognition, and local versus global control.
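As a generic illustration of threshold-based adaptive action selection of the kind mentioned above (hypothetical names and parameters; this is not the ALLIANCE mechanism itself):

```python
import random

def select_task(tasks, motivation, threshold=1.0, impatience=0.05):
    """Generic motivation-based action selection sketch: a robot accumulates
    'impatience' toward unfinished, unclaimed tasks and activates a task once
    its motivation crosses a threshold."""
    for task in tasks:
        if not task["done"] and not task["claimed_by_teammate"]:
            motivation[task["name"]] += impatience
    ready = [t for t in tasks
             if not t["done"] and motivation[t["name"]] >= threshold]
    return random.choice(ready) if ready else None
```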
Abstract:
The transcriptome of an organism is its set of gene transcripts (mRNAs) at a defined spatial and temporal locus. Because gene expression is affected markedly by environmental and developmental perturbations, it is widely assumed that transcriptome divergence among taxa represents adaptive phenotypic selection. This assumption has been challenged by neutral theories which propose that stochastic processes drive transcriptome evolution. To test for evidence of neutral transcriptome evolution in plants, we quantified 18 494 gene transcripts in nonsenescent leaves of 14 taxa of Brassicaceae using robust cross-species transcriptomics which includes a two-step physical and in silico-based normalization procedure based on DNA similarity among taxa. Transcriptome divergence correlates positively with evolutionary distance between taxa and with variation in gene expression among samples. Results are similar for pseudogenes and chloroplast genes evolving at different rates. Remarkably, variation in transcript abundance among root-cell samples correlates positively with transcriptome divergence among root tissues and among taxa. Because neutral processes affect transcriptome evolution in plants, many differences in gene expression among or within taxa may be nonfunctional, reflecting ancestral plasticity and founder effects. Appropriate null models are required when comparing transcriptomes in space and time.
Abstract:
In this thesis, I study skin lesion detection and its applications to skin cancer diagnosis. A skin lesion detection algorithm is proposed. The proposed algorithm is based on color information and thresholding. For the proposed algorithm, several color spaces are studied and the detection results are compared. Experimental results show that the YUV color space achieves the best performance. In addition, I develop a distance-histogram-based threshold selection method, which is shown to outperform other adaptive threshold selection methods for color detection. Beyond the detection algorithms, I also investigate GPU speed-up techniques for skin lesion extraction, and the results show that GPUs have potential for speeding up skin lesion extraction. Based on the proposed skin lesion detection algorithms, I developed a mobile skin cancer diagnosis application. With this application installed on an iPhone, the user can use the phone as a diagnosis tool to find potential skin lesions on a person's skin and compare the lesions detected by the iPhone with the skin lesions stored in a database on a remote server.
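A minimal sketch of the thresholding pipeline in the YUV color space (the thesis's distance-histogram threshold rule is not reproduced here, so Otsu's method stands in for the adaptive threshold step, and the choice of the V channel is an assumption):

```python
import cv2
import numpy as np

def detect_lesions(bgr_image):
    """Illustrative threshold-based lesion detection in YUV color space."""
    yuv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2YUV)
    # Lesions often differ from surrounding skin in chrominance; the V channel
    # is used here as an assumption for illustration.
    v = yuv[:, :, 2]
    _, mask = cv2.threshold(v, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # Keep reasonably sized connected components as candidate lesions.
    n, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
    candidates = [stats[i] for i in range(1, n)
                  if stats[i, cv2.CC_STAT_AREA] > 100]
    return mask, candidates
```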
Abstract:
Knowledge about the quality characteristics (QoS) of service compositions is crucial for determining their usability and economic value. Service quality is usually regulated using Service Level Agreements (SLA). While end-to-end SLAs are well suited for request-reply interactions, more complex, decentralized, multiparticipant compositions (service choreographies) typically involve multiple message exchanges between stateful parties, and the corresponding SLAs thus encompass several cooperating parties with interdependent QoS. The usual approaches to determining QoS ranges structurally (which are by construction easily composable) are not applicable in this scenario. Additionally, the intervening SLAs may depend on the exchanged data. We present an approach to data-aware QoS assurance in choreographies through the automatic derivation of composable QoS models from participant descriptions. Such models are based on a message typing system with size constraints and are derived using abstract interpretation. The models obtained have multiple uses including run-time prediction, adaptive participant selection, or design-time compliance checking. We also present an experimental evaluation and discuss the benefits of the proposed approach.
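As an illustration of how size-parametric bounds compose (a simplified sketch, not the paper's actual QoS model): if participant $A$ guarantees a response time $T_A(s) \le \alpha_A + \beta_A s$ for an input message of size $s$ and bounds the size of its output by $g_A(s)$, then the sequential composition of $A$ followed by $B$ admits the data-aware bound $T(s) \le \alpha_A + \beta_A s + \alpha_B + \beta_B\, g_A(s)$, which remains a closed-form function of the initial message size and can be composed further in the same way.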
Abstract:
This paper presents a mechanism to generate virtual buildings considering designer constraints and guidelines. The mechanism is implemented as a pipeline of Variable Neighborhood Search (VNS) optimization processes in which several subproblems are tackled: (1) room locations, (2) connectivity graph, and (3) element placement. The core VNS algorithm includes some variants to improve its performance, such as constraint handling and biased operator selection. The optimization process uses a toolkit of construction primitives implemented as "smart objects" providing basic elements such as rooms, doors, staircases and other connectors. The paper also shows experimental results of the application of different designer constraints to a wide range of buildings, from small houses to a large castle with several underground levels.
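A minimal sketch of biased operator selection inside a VNS-style loop (illustrative only, with assumed scoring and decay parameters; not the paper's exact scheme):

```python
import random

def biased_operator_choice(operators, scores):
    """Roulette-wheel choice biased by each operator's accumulated score."""
    total = sum(scores[op] for op in operators)
    if total <= 0:
        return random.choice(operators)
    r, acc = random.uniform(0, total), 0.0
    for op in operators:
        acc += scores[op]
        if r <= acc:
            return op
    return operators[-1]

def vns_step(solution, operators, scores, evaluate, reward=1.0, decay=0.95):
    """One shaking/improvement move: pick a neighbourhood operator with
    probability proportional to its past success, apply it, and reinforce
    its score only if it improves the incumbent solution."""
    op = biased_operator_choice(operators, scores)
    candidate = op(solution)
    improved = evaluate(candidate) < evaluate(solution)
    scores[op] = decay * scores[op] + (reward if improved else 0.0)
    return (candidate if improved else solution), scores
```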
Abstract:
Interfaces designed according to ecological interface design (EID) display higher-order relations and properties of a work domain so that adaptive operator problem solving can be better supported under unanticipated system conditions. Previous empirical studies of EID have assumed that the raw data required to derive and communicate higher-order information would be available and reliable. The present research examines the relative advantages of an EID interface over a conventional piping-and-instrumentation diagram (PID) when instrumentation is maximally or only minimally adequate. Results show an interaction between interface and the adequacy of the instrumentation. Failure diagnosis performance with the EID interface with maximally adequate instrumentation is best overall. Performance with the EID interface drops more drastically from maximally to minimally adequate instrumentation than does performance with the PID interface, to the point where the EID interface with minimally adequate instrumentation supports performance that is not significantly worse than that of the equivalent PID interface. Actual or potential applications of this research include the design of instrumentation and displays for complex industrial processes.