974 results for election observers


Relevance:

10.00%

Publisher:

Abstract:

Amblyopia is a developmental disorder of binocular vision. It is typically characterized by deficits in visual acuity and stereoscopy. However, a growing number of studies indicate the presence of more extensive impairments, such as difficulties with visual attention or reading. Amblyopia is generally explained by interocular suppression at the cortical level, considered chronic or permanent outside the developmental period. Yet an increasing number of studies suggest that normal binocular interactions are present in adult amblyopes. In a first study, we attempted to identify an electrophysiological marker of binocular vision. We recorded visual evoked potentials in normal observers in whom a binocular dysfunction had been induced. Binocular interactions were characterized using patterns (facilitation, averaging, and suppression) by comparing monocular and binocular responses. In addition, these interactions were quantified with continuous integration indices obtained by subtracting the sum of the monocular responses from the binocular response. The results indicated that the interaction patterns were not optimal for estimating stereoscopic performance. The latter was, in contrast, better explained by our binocular integration index. This study thus suggests that electrophysiology is a good predictor of binocular vision. In a second study, we examined the neural and behavioural correlates of interocular suppression in adult amblyopes and normal observers. Steady-state visual evoked potentials were recorded using a flash suppression paradigm.
Suppression was modulated by changing the contrast of the flashed stimulus (10, 20, 30, or 100%), the suppressor, which was presented either to the dominant or to the non-dominant (or amblyopic) eye. Behaviourally, interocular suppression was observed in controls regardless of which eye received the flash. In amblyopes, by contrast, suppression was asymmetric (that is, stronger when it originated from the dominant eye), suggesting chronic suppression. Interestingly, the amblyopic eye suppressed the dominant eye at high contrast. Electrophysiologically, the interocular suppression effect observed over the occipital region was equivalent in both groups. However, the electrophysiological responses over the frontal region in amblyopes were not modulated as in controls; suppression of the amblyopic eye was evident even at low contrast. Our results thus support the existence of functional binocular interactions in adult amblyopes, as well as the involvement of an extended cortical network in interocular suppression. In sum, amblyopia is a complex condition whose cortical impairments and functional deficits appear to be global. Amblyopia should no longer be considered a dysfunction limited to the primary visual area. Interocular suppression appears central to this problem, but many more studies will be needed to determine the full set of mechanisms involved.
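The continuous integration index described above (binocular response minus the sum of the monocular responses) can be sketched in a few lines. The amplitude values below are illustrative only, not data from the study.

```python
# Sketch of the binocular integration index: the summed monocular VEP
# responses are subtracted from the binocular response. Positive values
# suggest facilitation, values near zero additivity, and negative values
# suppression. The amplitudes below are hypothetical.

def integration_index(binocular, left_eye, right_eye):
    """Binocular response minus the sum of the monocular responses."""
    return binocular - (left_eye + right_eye)

# Hypothetical VEP amplitudes (microvolts)
idx = integration_index(binocular=9.0, left_eye=5.0, right_eye=5.5)
print(idx)  # -1.5: the binocular response falls short of the monocular sum
```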

Relevance:

10.00%

Publisher:

Abstract:

This thesis aims to study, by situating them within the political context of their time, two important texts of Justinianic political thought: the Ekthesis of Agapetus the Deacon, and the Dialogue on Political Science. These two texts represent the viewpoints of two of Constantinople's groups of acceptance, that is, groups that could take part in the elevation or deposition of an emperor: the clergy, and the elite of senators and high officials. Within this conceptual framework, the thesis sets out to define the issues concerning the function and conception of imperial power, and the form of the State, as presented in these two texts. Ultimately, the interpretive framework that emerges is the confrontation of two tendencies: on the one hand, a Christianized political Hellenism (associated with the clergy), and, on the other, a Roman identity combined with a Neoplatonic worldview (associated with the senatorial and official elite). These two traditions raise different questions. On one side, that of the emperor's orthodoxy and of the need for him to follow the precepts of Christian ethics and to show himself worthy of God, whose servant he is; on the other, that of safeguarding the Roman heritage, concerning in particular the role of the Senate and the importance of the law, as well as the link between emperor and philosopher.

Relevance:

10.00%

Publisher:

Abstract:

Metallic artifacts cause artifactual thickening of stent walls on computed tomography (CT), with apparent narrowing of the stent lumen. This prospective cross-sectional study, with a repeated-measures design and blinded observers, of 24 consecutive patients with 71 coronary stents, aimed to compare stent wall thickness on CT after reconstruction with an edge-enhancing algorithm versus a standard algorithm. A 256-slice CT coronary angiography was performed, with images reconstructed using both the edge-enhancing and the standard algorithms. Stent wall thickness was measured with orthogonal (diameters) and circumferential (circumferences) methods. Stent image quality was rated on an ordinal scale, and the data were analyzed with linear mixed models and proportional-odds logistic regression. Stent wall thickness was lower with the edge-enhancing algorithm than with the standard algorithm, with both the orthogonal (0.97±0.02 vs 1.09±0.03 mm, respectively; p<0.001) and circumferential (1.13±0.02 vs 1.21±0.02 mm, respectively; p<0.001) methods. The former caused less overestimation relative to nominal thickness than the latter, with both the orthogonal (0.89±0.19 vs 1.00±0.26 mm, respectively; p<0.001) and circumferential (1.06±0.26 vs 1.13±0.31 mm, respectively; p=0.005) methods, reducing overestimation by 6%. Quality scores were better with the edge-enhancing algorithm (OR 3.71; 95% CI 2.33–5.92; p<0.001). In conclusion, image reconstruction with the edge-enhancing algorithm yields thinner stent walls, less overestimation, and better image quality scores than the standard algorithm.

Relevance:

10.00%

Publisher:

Abstract:

Sharing information with those in need of it has always been an idealistic goal of networked environments. With the proliferation of computer networks, information is so widely distributed among systems that it is imperative to have well-organized schemes for retrieval and discovery. This thesis investigates the problems associated with such schemes and suggests a software architecture aimed at achieving meaningful discovery. The use of information elements as a modelling base for efficient information discovery in distributed systems is demonstrated with the aid of a novel conceptual entity called the infotron. The investigations focus on distributed systems and their associated problems. The study was directed towards identifying a suitable software architecture and incorporating it in an environment where information growth is phenomenal and a proper mechanism for carrying out information discovery becomes feasible. An empirical study, undertaken with the aid of an election database of geographically distributed constituencies, provided the insights required. This is manifested in the Election Counting and Reporting Software (ECRS) System. The ECRS system is essentially distributed in nature, designed to prepare reports for district administrators about the election counting process and to generate other miscellaneous statutory reports. Most distributed systems of the nature of ECRS possess a "fragile architecture" that makes them liable to collapse upon the occurrence of minor faults. This is resolved with the help of the proposed penta-tier architecture, which places five different technologies at the different tiers of the architecture. The results of the experiments conducted, and their analysis, show that such an architecture helps to keep the different components of the software intact and insulated from internal or external faults.
The architecture thus evolved needed a mechanism to support information processing and discovery. This necessitated the introduction of the novel concept of infotrons. Further, when a computing machine has to perform any meaningful extraction of information, it is guided by what is termed an infotron dictionary. Another empirical study was undertaken to find which of the two prominent markup languages, HTML and XML, is better suited for the incorporation of infotrons. A comparative study of 200 documents in HTML and XML was carried out; the result favored XML. The concepts of the infotron and the infotron dictionary were then applied to implement an Information Discovery System (IDS). IDS is essentially a system that starts with the infotron(s) supplied as clue(s) and distills the information required to satisfy the needs of the information discoverer from the documents available at its disposal (the information space). The various components of the system and their interactions follow the penta-tier architectural model and can therefore be considered fault-tolerant. IDS is generic in nature, and its characteristics and specifications were drawn up accordingly. Many subsystems interact with the multiple infotron dictionaries maintained in the system. In order to demonstrate the working of IDS, and to provide information discovery without modifying a typical Library Information System (LIS), an Information Discovery in Library Information System (IDLIS) application was developed. IDLIS is essentially a wrapper for the LIS, which maintains all the databases of the library. The purpose was to demonstrate that the functionality of a legacy system can be enhanced by augmenting it with IDS, yielding an information discovery service. IDLIS demonstrates IDS in action and proves that any legacy system can be effectively augmented with IDS to provide the additional functionality of an information discovery service. Possible applications of IDS and the scope for further research in the field are also covered.
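As a rough illustration of an infotron dictionary guiding discovery, the sketch below is entirely hypothetical: the thesis does not specify the data structures, so the class, the field names, and the election-style records are invented for illustration only.

```python
# Hypothetical model of an "infotron dictionary": it maps an infotron
# (a clue supplied by the information discoverer) to the document fields
# that satisfy it, so discovery can proceed without knowing the layout
# of the underlying information space. All names here are invented.

class InfotronDictionary:
    def __init__(self):
        self._entries = {}

    def register(self, infotron, fields):
        """Associate an infotron name with the fields it resolves to."""
        self._entries[infotron] = fields

    def discover(self, infotron, documents):
        """Return, per document, the values of the fields the infotron maps to."""
        fields = self._entries.get(infotron, [])
        return [{f: doc[f] for f in fields if f in doc} for doc in documents]

# Illustrative election-style information space
docs = [{"constituency": "North", "votes_counted": 10423},
        {"constituency": "South", "votes_counted": 9871}]
d = InfotronDictionary()
d.register("count-progress", ["constituency", "votes_counted"])
print(d.discover("count-progress", docs))
```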

Relevance:

10.00%

Publisher:

Abstract:

The theme of the thesis centres on one important aspect of wireless sensor networks: energy efficiency. The limited energy source of the sensor nodes calls for the design of energy-efficient routing protocols, which should try to minimize the number of communications among the nodes to save energy. Cluster-based techniques have been found to be energy-efficient: clusters are formed, data from the nodes of each cluster are collected by its cluster head, and the cluster head forwards them to the base station. An appropriate cluster head selection process and a desirable distribution of the clusters can reduce the energy consumption of the network and prolong the network lifetime. In this work, two such schemes were developed for static wireless sensor networks. The first scheme addresses the energy wasted by rebuilding clusters across all the nodes. A tree-based scheme is presented that alleviates this problem by rebuilding only sub-clusters of the network. An analytical model of the energy consumption of the proposed scheme is developed, and the scheme is compared with an existing cluster-based scheme. A simulation study confirmed the energy savings. The second scheme concentrates on building load-balanced, energy-efficient clusters to prolong the lifetime of the network. A voting-based approach is proposed that uses neighbour-node information in the cluster head selection process. The number of nodes joining a cluster is restricted so as to obtain equal-sized, optimal clusters. Multi-hop communication among the cluster heads is also introduced to reduce energy consumption. The simulation study showed that the scheme results in balanced clusters and that the network achieves a reduction in energy consumption. The main conclusion of the study is that a routing scheme should pay attention to successful data delivery from node to base station in addition to energy efficiency.
Cluster-based protocols have been extended from the static to the mobile scenario by various authors, but none of the proposals addresses cluster head election appropriately in view of mobility. An elegant scheme for electing cluster heads is presented to meet the challenge of maintaining cluster durability when all the nodes in the network are moving. The scheme has been simulated and compared with a similar approach. The proliferation of sensor networks provides users with large sets of sensor information to utilise in various applications. Sensor network programming is inherently difficult for various reasons, so there should be an elegant way to collect the data gathered by sensor networks without worrying about the underlying structure of the network. The final work presented addresses a way to collect data from a sensor network and present it to users in a flexible way. A service-oriented application is built, and the data collection task is exposed as a web service. This enables the composition of sensor data from different sensor networks to build interesting applications. The main objective of the thesis was to design energy-efficient routing schemes for both static and mobile sensor networks; a progressive approach was followed to achieve this goal.
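The voting-based cluster head selection can be sketched as below. The abstract does not state the voting criterion, so the choice of residual energy as the criterion, and all names and values, are assumptions for illustration.

```python
# Hedged sketch of voting-based cluster head selection: each node votes
# for the best head candidate among itself and its neighbours (here,
# the node with the most residual energy); nodes that accumulate votes
# become cluster-head candidates. The energy criterion is an assumption.

def vote_for_heads(nodes, neighbours, energy):
    """Return the number of votes each node receives from its neighbourhood."""
    votes = {n: 0 for n in nodes}
    for n in nodes:
        best = max(neighbours[n] + [n], key=lambda c: energy[c])
        votes[best] += 1
    return votes

# Toy 4-node network in a line: a - b - c - d
nodes = ["a", "b", "c", "d"]
neighbours = {"a": ["b"], "b": ["a", "c"], "c": ["b", "d"], "d": ["c"]}
energy = {"a": 0.9, "b": 0.7, "c": 0.8, "d": 0.4}
print(vote_for_heads(nodes, neighbours, energy))
# → {'a': 2, 'b': 0, 'c': 2, 'd': 0}: a and c emerge as head candidates
```

Capping the number of members that may join each candidate (as the scheme restricts cluster size) would then yield the equal-sized clusters described above.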

Relevance:

10.00%

Publisher:

Abstract:

This thesis is entitled “The Right to Freedom of Information in India”. In a democracy, where the citizens choose their own governors, the right to obtain information from the Government is a pre-condition for a properly evaluated election. Freedom of speech and expression, one of the repositories of self-government, forms the basis for the right to know on a wider scale. The functions that free speech rights serve in a society also emphasize the need for more openness in the functioning of a democracy. Maintenance of law and order and the investigation of crimes are highly important in a country like India, where no risk may be taken on account of the public's right to know. Indian situations relating to terrorist activities and to riots based on language, region, religion and caste are important in this respect, and the citizens' right to know may be regulated in the interests of the secrecy required in these areas. On the basis of the conclusions reached in this study, a draft Bill for an Access to Public Documents Act has been proposed; this Bill is appended to the thesis.

Relevance:

10.00%

Publisher:

Abstract:

This thesis investigates the problems associated with information-sharing schemes in networked environments and suggests a software architecture aimed at achieving meaningful discovery. The use of information elements as a modelling base for efficient information discovery in distributed systems is demonstrated with the aid of a novel conceptual entity called the infotron. The investigations focus on distributed systems and their associated problems. The study was directed towards identifying a suitable software architecture and incorporating it in an environment where information growth is phenomenal and a proper mechanism for carrying out information discovery becomes feasible. An empirical study, undertaken with the aid of an election database of geographically distributed constituencies, provided the insights required. This is manifested in the Election Counting and Reporting Software (ECRS) System. The ECRS system is essentially distributed in nature, designed to prepare reports for district administrators about the election counting process and to generate other miscellaneous statutory reports.

Relevance:

10.00%

Publisher:

Abstract:

Cluster-based protocols like LEACH have been found to be well suited for routing in wireless sensor networks, and several improvements to the basic scheme have been suggested for mobility-centric environments; LEACH-Mobile is one such protocol. It improves the basic LEACH protocol in the mobile scenario by checking whether a sensor node is able to communicate with its cluster head. Since all the nodes, including the cluster head, are moving, it is better to elect as cluster head a node with low mobility relative to its neighbours. In this paper, the LEACH-Mobile protocol is enhanced with a mobility metric, “remoteness”, for cluster head election. This ensures a high success rate in data transfer between the cluster head and the collector nodes even while the nodes are moving. We have simulated our LEACH-Mobile-Enhanced protocol and compared it with LEACH-Mobile. The results show that including neighbouring-node information improves the routing protocol.
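The abstract names a mobility metric called "remoteness" but does not define it, so the sketch below is an assumption: remoteness is taken as the mean distance from a node to its neighbours across successive position samples, so that a node staying close to its neighbours scores low and makes a better cluster-head candidate.

```python
# Hypothetical "remoteness" metric: mean distance from a node to its
# neighbours, averaged over successive position samples. The formula is
# an assumption; the paper's actual definition is not given here.

import math

def remoteness(node_positions, neighbour_positions):
    """node_positions: [(x, y), ...] for the node at successive instants.
    neighbour_positions: {name: [(x, y), ...]} sampled at the same instants.
    Returns the mean node-to-neighbour distance over all samples."""
    total, count = 0.0, 0
    for t, (x, y) in enumerate(node_positions):
        for positions in neighbour_positions.values():
            nx, ny = positions[t]
            total += math.hypot(x - nx, y - ny)
            count += 1
    return total / count

# A stationary node 5 units from its single neighbour at both instants:
r = remoteness([(0, 0), (0, 0)], {"n1": [(3, 4), (3, 4)]})
print(r)  # 5.0 — low remoteness would favour this node as cluster head
```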


Relevance:

10.00%

Publisher:

Abstract:

In the texts and illustrations of medieval mappae mundi, women are represented in three different roles: as biblical or Christian characters (such as Eve in the Garden of Eden, Lot's wife and female saints), as legendary or mythical figures (e.g. the Queen of Sheba, mermaids), and as 'other' creatures whose behaviour deviates from the European norm (e.g. Amazons). The principal goal of this study is to analyse the different strategies that male European observers developed to project a filtered image of woman in the selective medium of cartography from the tenth to the fifteenth century.

Relevance:

10.00%

Publisher:

Abstract:

Genetic programming is known to provide good solutions for many problems, such as the evolution of network protocols and distributed algorithms. In such cases it is usually a hard-wired module of a design framework that assists the engineer in optimizing specific aspects of the system to be developed, and it provides its results in a fixed format through an internal interface. In this paper we show how the utility of genetic programming can be increased remarkably by isolating it as a component and integrating it into the model-driven software development process. Our genetic programming framework produces XMI-encoded UML models that can easily be loaded into widely available modeling tools, which in turn possess code generation as well as additional analysis and test capabilities. We use the evolution of a distributed election algorithm as an example to illustrate how genetic programming can be combined with model-driven development. This example clearly illustrates the advantage of our approach: the generation of source code in different programming languages.
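To make the model-export step concrete, here is a minimal sketch of serialising an evolved model as XMI-style XML with the Python standard library. The tag and attribute names are simplified placeholders, not the real XMI/UML schema, and the framework's actual interface is an assumption.

```python
# Illustrative only: serialise a toy "evolved" model as XMI-style XML.
# Real XMI uses namespaced tags and a full UML metamodel; the simplified
# tags below only sketch the idea of exchanging models with UML tools.

import xml.etree.ElementTree as ET

def model_to_xmi(classes):
    """classes: {class_name: [operation_name, ...]} -> XMI-style XML string."""
    root = ET.Element("XMI", version="2.1")
    model = ET.SubElement(root, "uml.Model", name="EvolvedElection")
    for cls, ops in classes.items():
        c = ET.SubElement(model, "uml.Class", name=cls)
        for op in ops:
            ET.SubElement(c, "uml.Operation", name=op)
    return ET.tostring(root, encoding="unicode")

# A hypothetical evolved election algorithm as a one-class model
xmi = model_to_xmi({"Node": ["startElection", "announceLeader"]})
print(xmi)
```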

Relevance:

10.00%

Publisher:

Abstract:

Genetic Programming can be used effectively to create emergent behavior for a group of autonomous agents. In the process we call Offline Emergence Engineering, the behavior is first bred in a Genetic Programming environment and then deployed to the agents in the real environment. In this article we briefly describe our approach, introduce an extended behavioral rule syntax, and discuss the impact of the expressiveness of the behavioral description on the generation success, comparing two scenarios: the election problem and the distributed critical-section problem. We evaluate the results and formulate criteria for the applicability of our approach.

Relevance:

10.00%

Publisher:

Abstract:

Distributed systems are one of the most vital components of the economy. The most prominent example is probably the internet, a constituent element of our knowledge society. During the recent years, the number of novel network types has steadily increased. Amongst others, sensor networks, distributed systems composed of tiny computational devices with scarce resources, have emerged. The further development and heterogeneous connection of such systems imposes new requirements on the software development process. Mobile and wireless networks, for instance, have to organize themselves autonomously and must be able to react to changes in the environment and to failing nodes alike. Researching new approaches for the design of distributed algorithms may lead to methods with which these requirements can be met efficiently. In this thesis, one such method is developed, tested, and discussed in respect of its practical utility. Our new design approach for distributed algorithms is based on Genetic Programming, a member of the family of evolutionary algorithms. Evolutionary algorithms are metaheuristic optimization methods which copy principles from natural evolution. They use a population of solution candidates which they try to refine step by step in order to attain optimal values for predefined objective functions. The synthesis of an algorithm with our approach starts with an analysis step in which the wanted global behavior of the distributed system is specified. From this specification, objective functions are derived which steer a Genetic Programming process where the solution candidates are distributed programs. The objective functions rate how close these programs approximate the goal behavior in multiple randomized network simulations. The evolutionary process step by step selects the most promising solution candidates and modifies and combines them with mutation and crossover operators. 
This way, a description of the global behavior of a distributed system is translated automatically into programs which, if executed locally on the nodes of the system, exhibit this behavior. In our work, we test six different ways of representing distributed programs, comprising adaptations and extensions of well-known Genetic Programming methods (SGP, eSGP, and LGP), one bio-inspired approach (Fraglets), and two new program representations designed by us, called Rule-based Genetic Programming (RBGP, eRBGP). We breed programs in these representations for three well-known example problems in distributed systems: election algorithms, distributed mutual exclusion at a critical section, and the distributed computation of the greatest common divisor of a set of numbers. Synthesizing distributed programs the evolutionary way does not necessarily lead to the envisaged results. In a detailed analysis, we discuss the problematic features that make this form of Genetic Programming particularly hard. The two Rule-based Genetic Programming approaches were developed specifically to mitigate these difficulties; in our experiments, at least one of them (eRBGP) turned out to be a very efficient approach and was in most cases superior to the other representations.
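The select-mutate-recombine loop described above can be sketched generically. The bit-string representation and the one-max objective below are toy stand-ins, not the thesis's program representations or its simulation-based objective functions.

```python
# Generic evolutionary loop: rate candidates with an objective function,
# keep the best half, and refill the population via one-point crossover
# and point mutation. Bit strings stand in for distributed programs, and
# the objective stands in for a score from randomized network simulations.

import random

def evolve(fitness, length=16, pop_size=20, generations=30, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]          # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, length)        # one-point crossover
            child = a[:cut] + b[cut:]
            child[rng.randrange(length)] ^= 1     # point mutation
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

# Toy objective: maximise the number of ones in the bit string.
best = evolve(fitness=sum)
print(sum(best))
```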

Relevance:

10.00%

Publisher:

Abstract:

The design, reformulation, and final signing of Plan Colombia by the then US President, Bill Clinton, on 13 July 2000 ushered in a new era of the US State's involvement in supposedly sovereign territorial issues of Colombian politics. The implementation of Plan Colombia thereafter brought about a major realignment of political-military scales and terrains of conflict that has renewed discourses concerning the contemporary imperialist interests of key US-based but transnationally projected social forces, leading to arguments that stress the invigorated geopolitical dimension of present-day strategies of capitalist accumulation. With the election of Álvaro Uribe Vélez as Colombian President in May 2002 and his pledge to strengthen the national military campaign against the region's longest-surviving insurgent guerrilla group, Las FARC-EP, as well as other guerrilla factions, combined with a new focus on establishing the State project of “Democratic Security”, the military realm of governance and attempts to ensure property security and expanding capitalist investment have attained precedence in Colombia's national political domains. This working paper examines the interrelated nature of Plan Colombia, as a binational and indeed regional security strategy, and Uribe's Democratic Security project, showing the manner in which they have paved the way for the implementation of a new “total market” regime of accumulation, based on large-scale agro-industrial investment and accelerated through processes of accumulation via dispossession. The political and social reconfigurations involved thus manifest the multifarious scales of governance that become intertwined in incorporating neoliberalism into specific regions of the world economy. Furthermore, the militarisation-securitisation of such policies also illustrates the explicit contradictions of neoliberalism in a peripheral context, where coercion seems to prevail, leading to a profound questioning of the extent to which neoliberalism can be thought of as a hegemonic politico-economic project.

Relevance:

10.00%

Publisher:

Abstract:

Low perceptual familiarity with relatively rare left-handed individuals, as opposed to more common right-handed individuals, may result in athletes' poorer ability to anticipate the former's action intentions. Part of this left-right asymmetry in visual anticipation could be due to an inefficient gaze strategy when confronting left-handed individuals. For example, observers may not mirror their gaze when viewing left- vs. right-handed actions but may preferentially fixate on an opponent's right body side irrespective of the opponent's handedness, owing to their predominant exposure to right-handed actions. So far, however, empirical verification of this assumption has been lacking. Here we report on an experiment in which team-handball goalkeepers' and non-goalkeepers' gaze behavior was recorded while they predicted the throw direction of left- and right-handed 7-m penalties shown as videos on a computer monitor. As expected, goalkeepers were considerably more accurate than non-goalkeepers, and prediction was better against right- than left-handed penalties. However, there was no indication of differences in gaze measures (i.e., number of fixations, overall and final fixation duration, time course of horizontal or vertical fixation deviation) as a function of skill group or the penalty-takers' handedness. The findings suggest that inferior anticipation of left-handed individuals' action intentions may not be associated with misaligned gaze behavior. Rather, albeit looking similarly, accuracy differences could be due to observers' differential ability to pick up and interpret the visual information provided by left- vs. right-handed movements.