982 results for Mixed integer problems
Abstract:
We describe the odorant binding proteins (OBPs) of the red imported fire ant, Solenopsis invicta, obtained from analyses of an EST library and separate 454 sequencing runs of two normalized cDNA libraries. We identified a total of 18 putative functional OBPs in this ant. A third of the fire ant OBPs are orthologs of honey bee OBPs. Another third of the OBPs belong to a lineage-specific expansion, which is a common feature of insect OBP evolution. Like other OBPs, the different fire ant OBPs share little sequence similarity (∼20%), rendering evolutionary analyses difficult. We discuss the resulting problems with sequence alignment, phylogenetic analysis, and tests of selection. As previously suggested, our results underscore the importance of carefully exploring the sensitivity of such analyses to the choice of alignment method for data comprising widely divergent sequences.
Abstract:
In this paper, mixed spectral-structural kernel machines are proposed for the classification of very high resolution images. The simultaneous use of multispectral and structural features (computed using morphological filters) allows a significant increase in the classification accuracy of remote sensing images. Subsequently, weighted summation kernel support vector machines are proposed and applied in order to take into account the multiscale nature of the scene considered. Such classifiers use the Mercer property of kernel matrices to compute a new kernel matrix accounting simultaneously for two scale parameters. Tests on a Zurich QuickBird image show the relevance of the proposed method: using the mixed spectral-structural features, the classification accuracy increases by about 5%, achieving a Kappa index of 0.97. The proposed multikernel approach provides an overall accuracy of 98.90% with a corresponding Kappa index of 0.985.
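The weighted summation rests on the Mercer property that a non-negative combination of valid kernels is itself a valid kernel. As a minimal sketch of that idea (not the authors' implementation; the feature split, scale parameters, weight mu and toy data are placeholders), using scikit-learn's precomputed-kernel SVM:

```python
# Minimal sketch of a weighted-summation (composite) kernel SVM.
# Illustrative only: scale parameters, weight mu and toy data are
# placeholders, not the configuration used in the paper.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

def mixed_kernel(X_a, X_b, gamma_spectral=0.1, gamma_structural=1.0, mu=0.5):
    """Convex combination of two RBF kernels with different scale parameters.
    By the Mercer property, a non-negative sum of valid kernels is again a
    valid (positive semi-definite) kernel."""
    K_spec = rbf_kernel(X_a, X_b, gamma=gamma_spectral)
    K_struct = rbf_kernel(X_a, X_b, gamma=gamma_structural)
    return mu * K_spec + (1.0 - mu) * K_struct

# Toy data standing in for per-pixel spectral-structural feature vectors.
rng = np.random.default_rng(0)
X_train, y_train = rng.normal(size=(100, 8)), rng.integers(0, 2, size=100)
X_test = rng.normal(size=(20, 8))

clf = SVC(kernel="precomputed")
clf.fit(mixed_kernel(X_train, X_train), y_train)     # Gram matrix: train x train
y_pred = clf.predict(mixed_kernel(X_test, X_train))  # Gram matrix: test x train
```

In the paper, the two kernels would act on spectral and structural (morphological) features, or on two scales of the same feature set; the sketch uses a single toy feature matrix purely for brevity.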
Abstract:
This paper presents the use of our multimodal mixed reality telecommunication system to support remote acting rehearsal. The rehearsals involved two actors, located in London and Barcelona, and a director in another location in London. This triadic audiovisual telecommunication was performed in a spatial and multimodal collaborative mixed reality environment based on the 'destination-visitor' paradigm, which we define and put into use. We detail our heterogeneous system architecture, which spans the three distributed and technologically asymmetric sites, and features a range of capture, display, and transmission technologies. The actors' and director's experiences of rehearsing a scene via the system are then discussed, exploring successes and failures of this heterogeneous form of telecollaboration. Overall, the common spatial frame of reference presented by the system to all parties was highly conducive to theatrical acting and directing, allowing blocking, gross gesture, and unambiguous instruction to be issued. The relative inexpressivity of the actors' embodiments was identified as the central limitation of the telecommunication, meaning that moments relying on performing and reacting to consequential facial expression and subtle gesture were less successful.
Abstract:
Abstract:
This paper deals with the goodness of the Gaussian assumption when designing second-order blind estimation methods in the context of digital communications. The low- and high-signal-to-noise ratio (SNR) asymptotic performance of the maximum likelihood estimator (derived assuming Gaussian transmitted symbols) is compared with the performance of the optimal second-order estimator, which exploits the actual distribution of the discrete constellation. The asymptotic study concludes that the Gaussian assumption leads to the optimal second-order solution if the SNR is very low or if the symbols belong to a multilevel constellation such as quadrature-amplitude modulation (QAM) or amplitude-phase-shift keying (APSK). On the other hand, the Gaussian assumption can yield important losses at high SNR if the transmitted symbols are drawn from a constant modulus constellation such as phase-shift keying (PSK) or continuous-phase modulations (CPM). These conclusions are illustrated for the problem of direction-of-arrival (DOA) estimation of multiple digitally-modulated signals.
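For orientation, a sketch of the conventional formulation (not necessarily the exact model used in the paper): with array snapshots \(\mathbf{y}(t) = \mathbf{A}(\boldsymbol{\theta})\mathbf{s}(t) + \mathbf{n}(t)\), second-order methods operate on the sample covariance, and the ML estimator obtained by treating the transmitted symbols \(\mathbf{s}(t)\) as Gaussian reduces to the well-known stochastic ML criterion:

\[
\hat{\mathbf{R}} = \frac{1}{N}\sum_{t=1}^{N} \mathbf{y}(t)\,\mathbf{y}^{H}(t),
\qquad
\hat{\boldsymbol{\theta}} = \arg\min_{\boldsymbol{\theta}}
\left\{ \ln\det\mathbf{R}(\boldsymbol{\theta})
+ \operatorname{tr}\!\big[\mathbf{R}^{-1}(\boldsymbol{\theta})\,\hat{\mathbf{R}}\big] \right\},
\]

where \(\mathbf{R}(\boldsymbol{\theta})\) is the covariance implied by the Gaussian signal assumption. The abstract's point is that an estimator exploiting the true discrete constellation can outperform such a Gaussian-based criterion at high SNR for constant-modulus signals, while matching it at low SNR or for multilevel constellations.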
Abstract:
A regularization method based on the non-extensive maximum entropy principle is devised. Special emphasis is given to the q = 1/2 case. We show that, when the residual principle is considered as a constraint, the q = 1/2 generalized distribution of Tsallis yields a regularized solution for ill-conditioned problems. The regularized distribution devised in this way contains a component that corresponds to the well-known regularized solution of Tikhonov (1977).
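As a point of reference (a standard formulation, not the paper's derivation), Tikhonov regularization of an ill-conditioned linear problem \(Ax = b\), with the residual (discrepancy) principle imposed as a constraint, can be written as:

\[
x_{\lambda} = \arg\min_{x}\left\{ \|Ax - b\|^{2} + \lambda\,\|x\|^{2} \right\},
\qquad \|Ax_{\lambda} - b\| = \delta,
\]

where \(\delta\) is the assumed noise level in the data and \(\lambda > 0\) is the regularization parameter fixed by the constraint. The abstract's claim is that the q = 1/2 Tsallis distribution obtained under the same residual constraint contains a component of exactly this Tikhonov form.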
Abstract:
Blood glucose regulation is a complex bodily function involving multiple mechanisms. During food intake, one of the mechanisms involved in glucose homeostasis, notably in insulin secretion, is the entero-insular axis. Contact of nutrients with specialized cells distributed along the digestive tract triggers the secretion of hormones called incretins, such as GLP-1 or GIP. These gastrointestinal hormones potentiate insulin secretion (the incretin effect) and are responsible for a large part of the insulin response to oral glucose intake.

The importance of these hormones is particularly highlighted by observations made in obese subjects who have undergone bariatric surgery. After the operation, insulin sensitivity and secretion are improved in obese patients who are diabetic or glucose intolerant, while the secretion pattern of GI hormones is markedly altered, notably with an increase in GLP-1 secretion. The increased secretion of these hormones could contribute to the improvement in glucose tolerance by increasing insulin secretion in response to nutrient intake. This exaggerated activation of the entero-insular axis could also contribute to the pathogenesis of the postprandial hypoglycaemia that sometimes occurs after gastric bypass.

Nevertheless, while the role of gastrointestinal hormones is beyond doubt, few data indicate the respective roles of the various macronutrients making up a standard meal in activating the entero-insular axis. In this work, we sought to clarify the specific role of the lipid and protein components of a standard meal.

After confirming the existence of an incretin effect during consumption of a test meal in the form of a sandwich, our results show that ingesting lipids in an amount corresponding to that of a standard meal increases insulin secretion, thereby contributing to the incretin effect, whereas, conversely, ingesting protein does not raise insulinaemia and thus does not contribute to the incretin effect.

These observations could be of practical interest. Demonstrating the predominant role of one macronutrient in the incretin effect following a standard meal could lead to dietary prescriptions aimed at improving glycaemic control in diabetic patients or at reducing post-meal hypoglycaemia in certain patients who have undergone gastric bypass. Likewise, a better understanding of the role of incretin hormones has already opened new therapeutic perspectives in the treatment of type 2 diabetes, with the development of new drug classes such as GLP-1 analogues or inhibitors of its degradation.
Abstract:
While equal political representation of all citizens is a fundamental democratic goal, it is hampered empirically in a multitude of ways. This study examines how the societal level of economic inequality affects the representation of relatively poor citizens by parties and governments. Using CSES survey data for citizens' policy preferences and expert placements of political parties, empirical evidence is found that in economically more unequal societies, the party system represents the preferences of relatively poor citizens worse than in more equal societies. This moderating effect of economic equality is also found for policy congruence between citizens and governments, albeit slightly less clear-cut.
Abstract:
The number of patients treated by haemodialysis (HD) is continuously increasing. Complications associated with vascular accesses are the leading cause of hospitalisation in these patients. Since 2001, nephrologists, surgeons, angiologists and radiologists at the CHUV have been working to develop a multidisciplinary model that includes the planning and monitoring of HD accesses. In this setting, echo-Doppler imaging is an important investigative tool. Every patient is discussed and decisions are taken during a weekly multidisciplinary meeting. A network has been created with nephrologists of peripheral centres and other specialists. This model makes it possible to centralize investigational information and coordinate patient care while maintaining, and even developing, some investigational activities and treatments in peripheral centres.
Abstract:
Biological scaling analyses employing the widely used bivariate allometric model are beset by at least four interacting problems: (1) choice of an appropriate best-fit line with due attention to the influence of outliers; (2) objective recognition of divergent subsets in the data (allometric grades); (3) potential restrictions on statistical independence resulting from phylogenetic inertia; and (4) the need for extreme caution in inferring causation from correlation. A new non-parametric line-fitting technique has been developed that eliminates requirements for normality of distribution, greatly reduces the influence of outliers and permits objective recognition of grade shifts in substantial datasets. This technique is applied in scaling analyses of mammalian gestation periods and of neonatal body mass in primates. These analyses feed into a re-examination, conducted with partial correlation analysis, of the maternal energy hypothesis relating to mammalian brain evolution, which suggests links between body size and brain size in neonates and adults, gestation period and basal metabolic rate. Much has been made of the potential problem of phylogenetic inertia as a confounding factor in scaling analyses. However, this problem may be less severe than previously suspected because nested analyses of variance conducted on residual variation (rather than on raw values) reveal that there is considerable variance at low taxonomic levels. In fact, limited divergence in body size between closely related species is one of the prime examples of phylogenetic inertia. One common approach to eliminating perceived problems of phylogenetic inertia in allometric analyses has been the calculation of 'independent contrast values'. It is demonstrated that the reasoning behind this approach is flawed in several ways. Calculation of contrast values for closely related species of similar body size is, in fact, highly questionable, particularly when there are major deviations from the best-fit line for the scaling relationship under scrutiny.
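For context, the bivariate allometric model referred to above is conventionally written as a power law that becomes linear on logarithmic axes (a standard textbook form, not a result specific to this study):

\[
Y = a\,M^{b} \quad\Longleftrightarrow\quad \log Y = \log a + b\,\log M,
\]

where \(Y\) is the trait of interest (e.g. gestation period or neonatal body mass), \(M\) is body mass, \(b\) is the scaling exponent estimated by the line-fitting procedure, and allometric grades appear as parallel lines with different intercepts \(\log a\).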
Abstract:
In this paper, we consider a discrete-time risk process allowing for delay in claim settlement, which introduces a certain type of dependence in the process. From martingale theory, an expression for the ultimate ruin probability is obtained, and Lundberg-type inequalities are derived. The impact of delay in claim settlement is then investigated. To this end, a convex order comparison of the aggregate claim amounts is performed with the corresponding non-delayed risk model, and numerical simulations are carried out with Belgian market data.
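For orientation, in the classical non-delayed discrete-time risk model the Lundberg-type bound mentioned here takes the following standard textbook form (a reference point only; the paper derives analogous inequalities for the delayed, dependent model):

\[
\psi(u) \;=\; \Pr\!\Big[\inf_{n \ge 0} U_n < 0 \,\Big|\, U_0 = u\Big] \;\le\; e^{-R u},
\]

where \(U_n = u + nc - S_n\) is the insurer's surplus after period \(n\) (premium income \(c\) per period, aggregate claims \(S_n\)), \(\psi(u)\) is the ultimate ruin probability, and the adjustment coefficient \(R > 0\) is the positive root of \(\mathbb{E}\big[e^{R(X - c)}\big] = 1\), with \(X\) the claim amount in one period.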
Abstract:
The solvability of the problem of fair exchange in a synchronous system subject to Byzantine failures is investigated in this work. The fair exchange problem arises when a group of processes is required to exchange digital items in a fair manner, which means that either each process obtains the item it was expecting or no process obtains any information on the inputs of others. After introducing a novel specification of fair exchange that clearly separates safety and liveness, we give an overview of the difficulty of solving such a problem in the context of a fully-connected topology. On the one hand, we show that no solution to fair exchange exists in the absence of an identified process that every process can trust a priori; on the other, a well-known solution to fair exchange relying on a trusted third party is recalled. These two results lead us to complete our system model with a flexible representation of the notion of trust. We then show that fair exchange is solvable if and only if a connectivity condition, named the reachable majority condition, is satisfied. The necessity of the condition is proven by an impossibility result and its sufficiency by presenting a general solution to fair exchange relying on a set of trusted processes. The focus is then turned towards a specific network topology in order to provide a fully decentralized, yet realistic, solution to fair exchange. The general solution mentioned above is optimized by reducing the computational load assumed by trusted processes as far as possible. Accordingly, our fair exchange protocol relies on trusted tamperproof modules that have limited communication abilities and are only required in key steps of the algorithm. This modular solution is then implemented in the context of a pedagogical application developed for illustrating and apprehending the complexity of fair exchange. This application, which also includes the implementation of a wide range of Byzantine behaviors, allows executions of the algorithm to be set up and monitored through a graphical display. Surprisingly, some of our results on fair exchange seem to contradict those found in the literature of secure multiparty computation, a problem from the field of modern cryptography, although the two problems have much in common. Both problems are closely related to the notion of trusted third party, but their approaches and descriptions differ greatly. By introducing a common specification framework, a comparison is proposed in order to clarify their differences and the possible origins of the confusion between them. This leads us to introduce the problem of generalized fair computation, a generalization of fair exchange. Finally, a solution to this new problem is given by generalizing our modular solution to fair exchange.
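To make the well-known trusted-third-party baseline recalled above concrete, here is a minimal sketch (illustrative only; the class and message handling are placeholders, not the thesis's protocol) of an exchange mediated by a single trusted process that either delivers every item or delivers nothing:

```python
# Minimal sketch of fair exchange through a trusted third party (TTP):
# the TTP releases items only once every participant has deposited its own,
# so either all parties receive what they expect or none learns anything
# about the others' inputs. Illustrative only.
class TrustedThirdParty:
    def __init__(self, participants):
        self.expected = set(participants)   # who must deposit an item
        self.deposits = {}                  # participant -> deposited item

    def deposit(self, participant, item):
        if participant in self.expected:
            self.deposits[participant] = item

    def release(self):
        # Safety: nothing is revealed unless every deposit is present.
        if set(self.deposits) != self.expected:
            return None
        # Each participant receives the items of all the others.
        return {p: {q: it for q, it in self.deposits.items() if q != p}
                for p in self.expected}

ttp = TrustedThirdParty({"alice", "bob"})
ttp.deposit("alice", "item-A")
ttp.deposit("bob", "item-B")
print(ttp.release())  # each party obtains the other's item; before both deposits, release() returns None
```

The thesis's contribution, as the abstract explains, is to relax this single-TTP assumption: trust is distributed over a set of processes (or tamperproof modules), and fairness remains achievable exactly when the reachable majority condition holds.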