989 results for Literature as object
Object-Oriented Genetic Programming for the Automatic Inference of Graph Models for Complex Networks
Abstract:
Complex networks are systems of entities that are interconnected through meaningful relationships. The relations between entities form a structure whose statistical complexity is not the product of random chance. In the study of complex networks, many graph models have been proposed to capture the behaviours observed. However, constructing graph models manually is tedious and problematic, and many of the models proposed in the literature have been cited as inaccurate with respect to the complex networks they represent. Recently, an approach that automates the inference of graph models was proposed by Bailey [10]. The proposed methodology employs genetic programming (GP) to produce graph models that approximate various properties of an exemplar graph of a targeted complex network. However, a great deal is already known about complex networks in general, and often specific knowledge is held about the network being modelled. This knowledge, albeit incomplete, is important in constructing a graph model, yet it is difficult to incorporate using existing GP techniques. This thesis therefore proposes a novel GP system that can incorporate incomplete expert knowledge to assist the evolution of a graph model. Inspired by existing graph models, an abstract graph model was developed to serve as an embryo for inferring graph models of some complex networks. The GP system and abstract model were used to reproduce well-known graph models. The results indicate that the system was able to evolve models whose generated networks were structurally similar to the networks generated by the respective target models.
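To illustrate the general idea (a sketch under stated assumptions, not the thesis's actual GP system), a fitness function of this kind compares structural properties of a network generated by a candidate model against the same properties of the exemplar network. The Python sketch below assumes networkx and a hypothetical, arbitrary choice of properties.

```python
# Illustrative sketch only: scoring a candidate graph against an exemplar
# network by comparing a few structural properties (assumed property set;
# the thesis's actual GP fitness may differ).
import networkx as nx

def property_vector(G):
    """Collect simple structural properties of a graph."""
    degrees = [d for _, d in G.degree()]
    return {
        "avg_degree": sum(degrees) / len(degrees),
        "clustering": nx.average_clustering(G),
        "density": nx.density(G),
    }

def fitness(candidate_graph, exemplar_graph):
    """Lower is better: summed relative error over the chosen properties."""
    cand = property_vector(candidate_graph)
    exem = property_vector(exemplar_graph)
    return sum(abs(cand[k] - exem[k]) / (abs(exem[k]) + 1e-9) for k in exem)

if __name__ == "__main__":
    exemplar = nx.watts_strogatz_graph(200, 6, 0.1, seed=1)   # stand-in for a real network
    candidate = nx.erdos_renyi_graph(200, 0.03, seed=2)       # network from a candidate model
    print(f"fitness = {fitness(candidate, exemplar):.3f}")
```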
Abstract:
Over the last ten years, the cost of maintaining object-oriented systems has grown to account for more than 70% of the total cost of these systems. This situation is due to several factors, the most important of which are: imprecise user specifications, a rapidly changing execution environment, and the poor internal quality of the systems. Of all these factors, the only one over which we have real control is the internal quality of the systems. Many quality models have been proposed in the literature to help control quality. However, most of these models use class metrics (for example, the number of methods of a class) or metrics of relations between classes (for example, the coupling between two classes) to measure the internal attributes of systems. Yet the quality of object-oriented systems does not depend solely on the structure of their classes, which is what these metrics measure, but also on the way the classes are organized, that is, on their design, which generally manifests itself through design patterns and anti-patterns. In this thesis we propose the DEQUALITE method, which makes it possible to systematically build quality models that take into account not only the internal attributes of systems (through metrics) but also their design (through design patterns and anti-patterns). This method uses a learning approach based on Bayesian networks and relies on the results of a series of experiments evaluating the impact of design patterns and anti-patterns on system quality. These experiments, carried out on 9 large open-source object-oriented systems, allow us to draw the following conclusions: • Counter-intuitively, design patterns do not always improve system quality; highly coupled implementations of design patterns, for example, affect the structure of classes and have a negative impact on their change- and fault-proneness. • Classes participating in anti-patterns are much more likely to change and to be involved in fault fixes than the other classes of a system. • A non-negligible percentage of classes are involved simultaneously in design patterns and in anti-patterns. Design patterns have a positive effect in the sense that they mitigate anti-patterns. We apply and validate our method on three open-source object-oriented systems in order to demonstrate the contribution of system design to quality evaluation.
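As a loose illustration of the kind of evidence such experiments provide (this is not DEQUALITE itself and not a Bayesian network, only the underlying conditional-probability idea; the counts are invented), the toy Python sketch below estimates how much more fault-prone anti-pattern participants are from labelled class data.

```python
# Toy sketch only: estimating the relative fault-proneness of classes that
# participate in anti-patterns, from invented labelled data of the kind the
# experiments above collect.

# Each record: (participates_in_antipattern, was_involved_in_a_fault_fix)
classes = [(True, True)] * 40 + [(True, False)] * 60 + \
          [(False, True)] * 50 + [(False, False)] * 450

def p_fault_given(antipattern_flag, data):
    """Estimate P(fault fix | anti-pattern participation = flag) from counts."""
    subset = [fault for ap, fault in data if ap == antipattern_flag]
    return sum(subset) / len(subset)

p_ap = p_fault_given(True, classes)
p_no_ap = p_fault_given(False, classes)
print(f"P(fault | anti-pattern) = {p_ap:.2f}, P(fault | no anti-pattern) = {p_no_ap:.2f}")
print(f"risk ratio ~ {p_ap / p_no_ap:.1f}")
```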
Abstract:
This dissertation explores the career of the Lacanian missed encounter in canonical nineteenth-century American literature through the prisms of psychoanalysis, deconstruction, postmodernism and postcolonialism. I focus particularly on Hawthorne's The Scarlet Letter and Melville's Moby-Dick, showing how they are invested in the narrative economy of the missed encounter, the economy of that which is beyond symbolization and assimilation. The introduction examines the historical, philosophical and theoretical contours and detours of the concept of the missed encounter. The dissertation thus has two aims: on the one hand, it attempts to examine the status and function of the missed encounter in nineteenth-century American literature; on the other, it explores how theorizing the missed encounter might help us move beyond the binary theorizing that characterizes current geopolitical scenes. My first chapter, on Hawthorne's The Scarlet Letter, attempts to trace the career of the signifier as a shuttle between the archive and the future, between subject and object, between signifier and signified. The aim of this chapter is to account for the temporality of the signifier and the temporality of subjectivity, and to explain how they respond to the temporality of the tuché. Exploring the crypto-temporal dimension of the missed encounter, the chapter studies the excess of crypts through poetics (chiefly prosopopoeia, anasemia, and tropes of exhumation). The second chapter elaborates on the contours of the missed encounter. Adopting psychoanalytic and deconstructive approaches, it negotiates the temporality of the missed encounter (the temporality of the automaton and of repetition). Exploring narrative temporality (prolepsis and analepsis) together with the psycho-poetics of the double, this chapter attempts to unveil the vicissitudes of melancholy and "narcissistic depression" in Moby-Dick (in particular Ahab's repetition of his originary, denarrated or never-told encounter with the white whale, and his melancholic position with respect to the object he has lost). By exposing the nature of trauma as a missed encounter whose residues manifest themselves symptomatically through repetition (and doubling), the chapter explains the slippage of the letter (by way of the supplement and of différance). The third chapter broadens the scope of the missed encounter to include America's Others. Its main purpose is to assess the political, cultural, imaginary and libidinal investitures of the missed encounter in the Real, in the national Symbolic of the United States and in the current geopolitical reality. It also addresses the ambiguous relationship between jouissance and the Symbolic: the way jouissance animates and governs the Symbolic while blurring the distinction between the Real and reality and shielding its excessive manoeuvres.
Abstract:
Physical, psychological and sexual violence among couples of adolescents and young adults who are neither married nor cohabiting (generally known as "dating violence") has been the object of a vast number of investigations over the last two decades, which show a high prevalence within the adolescent and young adult population. The objective of this work was to carry out an analysis of the literature on the prevalence, risk factors and difficulties associated with this type of partner violence. This analysis allowed us to outline the factors that could favour acts of violence, including previous experiences of victimization inside and outside the family, the acceptance of violence toward the partner, and relationships with peers who have exercised this form of violence.
Abstract:
In this essay Alison Donnell returns to the material object of Edward Baugh's essay, published in the pages of the Trinidadian little magazine Tapia in 1977, in order to re-read the force of its arguments in the context of its own politicocultural history and to assess the significance of its publication venue. Donnell attends to Baugh's own standing in the highly charged field of Caribbean literary criticism as a critic of both Walcott and Naipaul, and acknowledges his creative contribution to this field as a poet. She also considers how, in the years between the original publication of Baugh's article and its republication, the questions of historical invisibility have entered newly disputed territories that demand attention to how gender, indigeneity, spirituality, and sexuality shape ideas of historical and literary legitimacy, in addition to those foundational questions around a politics of race and class.
Abstract:
Based on a literature review, electronic systems design largely employs a top-down methodology, which is vital for success in the synthesis and implementation of electronic systems. In this context, this paper presents a new computational tool, named BD2XML, to support electronic systems design. From a mixed-signal system block diagram it generates object code in the XML markup language; XML is attractive because of its flexibility and readability. BD2XML was developed under the object-oriented paradigm. The AD7528 converter, modelled in MATLAB/Simulink, was used as a case study; MATLAB/Simulink was chosen as the target because of its wide dissemination in academia and industry. The case study demonstrates the functionality of BD2XML and prompts a reflection on the design challenges. An automatic tool for electronic systems design therefore reduces design time and cost.
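As a rough illustration of the translation step (hypothetical classes and element names, not BD2XML's actual object model or output schema), the Python sketch below serializes a simple block-diagram description to XML using only the standard library.

```python
# Minimal sketch (hypothetical object model, not BD2XML's actual classes or
# schema): serializing a block-diagram description to XML.
import xml.etree.ElementTree as ET

class Block:
    def __init__(self, name, kind, params=None):
        self.name, self.kind, self.params = name, kind, params or {}

class Diagram:
    def __init__(self, name):
        self.name, self.blocks, self.connections = name, [], []

    def connect(self, src, dst):
        self.connections.append((src, dst))

    def to_xml(self):
        root = ET.Element("diagram", name=self.name)
        for b in self.blocks:
            blk = ET.SubElement(root, "block", name=b.name, kind=b.kind)
            for k, v in b.params.items():
                ET.SubElement(blk, "param", name=k, value=str(v))
        for src, dst in self.connections:
            ET.SubElement(root, "connection", source=src, target=dst)
        return ET.tostring(root, encoding="unicode")

d = Diagram("ad7528_testbench")
d.blocks += [Block("dac", "AD7528", {"bits": 8}), Block("scope", "Scope")]
d.connect("dac", "scope")
print(d.to_xml())
```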
Abstract:
Visual working memory (VWM) involves maintaining and processing visual information, often for the purpose of making immediate decisions. Neuroimaging experiments on VWM provide evidence in support of a neural system mainly involving a fronto-parietal neuronal network, but the role of specific brain areas is less clear. A proposal that has recently generated considerable debate suggests that a dissociation of location and object VWM occurs within the prefrontal cortex, in dorsal and ventral regions, respectively. However, re-examination of the relevant literature presents a more robust distribution suggestive of a general caudal-rostral dissociation from occipital and parietal structures, caudally, to prefrontal regions, rostrally, corresponding to location and object memory, respectively. The purpose of the present study was to identify a dissociation of location and object VWM across two imaging methods (magnetoencephalography, MEG, and functional magnetic resonance imaging, fMRI). These two techniques provide complementary results due to the high temporal resolution of MEG and the high spatial resolution of fMRI. Identical location and object change-detection tasks were employed across techniques and reported for the first time. Moreover, this study is the first to use matched stimulus displays across location and object VWM conditions. The results from these two imaging methods provided convergent evidence of a location and object VWM dissociation favouring a general caudal-rostral rather than the more common prefrontal dorsal-ventral view. Moreover, neural activity across techniques was correlated with behavioural performance for the first time and provided convergent results. This novel approach of combining imaging tools to study memory resulted in robust evidence suggesting a new interpretation of location and object memory. Accordingly, this study presents a novel context within which to explore the neural substrates of WM across imaging techniques and populations.
Abstract:
With the availability of lower cost but highly skilled software development labor from offshore regions, entrepreneurs from developed countries who do not have software development experience can utilize this workforce to develop innovative software products. In order to succeed in offshored innovation projects, the often extreme knowledge boundaries between the onsite entrepreneur and the offshore software development team have to be overcome. Prior research has proposed that boundary objects are critical for bridging such boundaries – if they are appropriately used. Our longitudinal, revelatory case study of a software innovation project is one of the first to explore the role of the software prototype as a digital boundary object. Our study empirically unpacks five use practices that transform the software prototype into a boundary object such that knowledge boundaries are bridged. Our findings provide new theoretical insights for literature on software innovation and boundary objects, and have implications for practice.
Abstract:
This thesis presents an in-depth analysis of how two kinds of direct methods, Lucas-Kanade and Inverse Compositional, should be applied to RGB-D images, and analyzes their capability and accuracy in a series of synthetic experiments. These experiments simulate RGB images, depth (D) images and RGB-D images in order to evaluate how the methods behave under each combination. Moreover, the methods are analyzed without any additional technique that modifies the original algorithm or aids it in its search for a global optimum, unlike most of the articles found in the literature. The goal is to understand when and why these methods converge or diverge, so that in the future any interested reader can apply the knowledge gathered in this thesis in practice. The thesis should help a prospective implementer decide which algorithm suits a particular situation best, and understand which problems these algorithms can present so that the most appropriate remedy can be applied. The additional techniques that remedy these problems are outside the scope of this thesis; however, they are reviewed from the literature.
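For orientation, the sketch below shows a minimal translation-only, forward-additive Lucas-Kanade estimator on a synthetic grayscale image using numpy and scipy. It is only a toy stand-in for the RGB-D formulations analyzed in the thesis and omits the Inverse Compositional variant entirely.

```python
# Minimal sketch: translation-only, forward-additive Lucas-Kanade on a
# synthetic grayscale image (not the thesis's RGB-D formulation).
import numpy as np
from scipy.ndimage import shift as warp_shift, gaussian_filter

rng = np.random.default_rng(0)
template = gaussian_filter(rng.random((80, 80)), 3)        # smooth synthetic "image"
true_p = np.array([1.7, -2.3])                             # ground-truth (dy, dx)
image = warp_shift(template, true_p, order=1, mode="nearest")

p = np.zeros(2)                                            # current estimate of (dy, dx)
for _ in range(30):
    warped = warp_shift(image, -p, order=1, mode="nearest")  # I(x + p)
    residual = (warped - template).ravel()
    gy, gx = np.gradient(warped)                           # image gradients at the warp
    J = np.stack([gy.ravel(), gx.ravel()], axis=1)         # Jacobian w.r.t. (dy, dx)
    delta = np.linalg.lstsq(J, -residual, rcond=None)[0]   # Gauss-Newton step
    p += delta
    if np.linalg.norm(delta) < 1e-4:
        break

print("estimated shift:", p, "true shift:", true_p)
```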
Abstract:
This dissertation examines the corpse as an object in and of American hardboiled detective fiction written between 1920 and 1950. I deploy several theoretical frames, including narratology, body-as-text theory, object relations theory, and genre theory, in order to demonstrate the significance of objects, symbols, and things primarily in the clever and crafty work of Dashiell Hammett (1894-1961) and Raymond Chandler (1888-1959), but also touching on the writings of their lesser known accomplices. I construct a literary genealogy of American hardboiled detective fiction originating in the writings of Edgar Allan Poe, compare the contributions of classic or Golden Age detective fiction in England, and describe the socio-economic contexts, particularly the predominance of the “pulps,” that gave birth to the realism of the Hardboiled School. Taking seriously Chandler’s obsession with the art of murder, I engage with how authors pre-empt their readers’ knowledge of the tricks of the trade and manipulate their expectations, as well as discuss the characteristics and effect of the inimitable hardboiled style, its sharpshooting language and deadpan humour. Critical scholarship has rarely addressed the body and figure of the corpse, preferring to focus instead on the machinations of the femme fatale, the performance of masculinity, or the prevalence of violence. I cast new light on the world of hardboiled detective fiction by dissecting the corpse as the object that both motivates and de-composes (or rots away from) the narrative that makes it signify. I treat the corpse as an inanimate object, indifferent to representation, that destabilizes the integrity and self-possession, as well as the ratiocination, of the detective who authors the narrative of how the corpse came to be. The corpse is all deceptive and dangerous surface rather than the container of hidden depths of life and meaning that the detective hopes to uncover and reconstruct. I conclude with a chapter that is both critical denouement and creative writing experiment to reveal the self-reflexive (and at times metafictional) dimensions of hardboiled fiction. My dissertation, too, in the manner of hardboiled fiction, hopes to incriminate my readers as much as enlighten them.
Abstract:
Play is the primary occupation of childhood and provides a potentially powerful means of assessing and treating children with autistic disorder. This study utilized a cross-sectional comparison design to investigate the nature of play engagement in children with AD (n = 24), relative to typically developing children (n = 34) matched for chronological age. Play behaviours were recorded in a clinical play environment. Videotapes comprising 15 minutes of the children's spontaneous play behaviour were analysed using time-interval analysis. The particular play behaviours observed and play objects used were coded. Differences in play behaviours (p < 0.0001) and play object preferences (p < 0.0001) were identified between the groups. Findings regarding play behaviour contribute to contention in the literature surrounding functional and symbolic play. Explanations for play object preferences are postulated. Recommendations are made regarding clinical application of findings in terms of enhancing assessment and intervention by augmenting motivation.
Abstract:
Most object-based approaches to Geographical Information Systems (GIS) have concentrated on the representation of geometric properties of objects in terms of fixed geometry. In our road traffic marking application domain we have a requirement to represent the static locations of the road markings but also to enforce the associated regulations, which are typically geometric in nature. For example, a give-way line of a pedestrian crossing in the UK must be within 1100-3000 mm of the edge of the crossing pattern. In previous studies of the application of spatial rules (often called 'business logic') in GIS, emphasis has been placed on the representation of topological constraints and data integrity checks. There is very little GIS literature that describes models for geometric rules, although there are some examples in the Computer Aided Design (CAD) literature. This paper introduces some of the ideas from so-called variational CAD models to the GIS application domain, and extends these using a Geography Markup Language (GML) based representation. In our application we have an additional requirement: the geometric rules change often and vary from country to country, so they should be represented in a flexible manner. In this paper we describe an elegant solution to the representation of geometric rules, such as requiring lines to be offset from other objects. The method uses a feature-property model embraced in GML 3.1 and extends the possible relationships in feature collections to permit the application of parameterized geometric constraints to sub-features. We show the parametric rule model we have developed and discuss the advantage of using simple parametric expressions in the rule base. We discuss the possibilities and limitations of our approach and relate our data model to GML 3.1.
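To make the flavour of such a parameterized offset rule concrete (a sketch only, using plain Python with Shapely rather than the paper's GML 3.1 feature-property model; coordinates and feature names are hypothetical), the give-way-line rule quoted above could be checked as follows.

```python
# Illustrative sketch only (plain Python with Shapely, not the paper's GML 3.1
# representation): a parameterized offset rule stating that one feature must
# lie within a given distance band of another. Values below are in millimetres.
from shapely.geometry import LineString

def check_offset_rule(feature, reference, min_offset, max_offset):
    """Check the separation between two features against an offset band.
    Uses the minimum distance; a fuller rule engine would also bound the
    maximum separation of the feature from the reference."""
    d = feature.distance(reference)
    return min_offset <= d <= max_offset

# Give-way line and the edge of a (hypothetical) pedestrian crossing pattern.
give_way_line = LineString([(0, 2000), (3500, 2000)])
crossing_edge = LineString([(0, 0), (3500, 0)])

# UK rule quoted in the text: 1100-3000 mm from the edge of the crossing pattern.
print(check_offset_rule(give_way_line, crossing_edge, 1100, 3000))  # True: 2000 mm apart
```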
Abstract:
We present a novel analysis of the state of the art in object tracking with respect to the diversity found in its main component, an ensemble classifier that is updated in an online manner. We employ established measures of diversity and performance from the rich literature on ensemble classification and online learning, and present a detailed evaluation of diversity and performance on benchmark sequences in order to gain insight into how tracking performance can be improved.
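For reference, two of the standard pairwise diversity measures from that literature, the disagreement measure and the Q-statistic, can be computed as in the sketch below (hypothetical correctness vectors, not the paper's trackers or benchmark data).

```python
# Sketch: two standard pairwise diversity measures from the ensemble
# literature (disagreement and the Q-statistic), computed on hypothetical
# per-sample correctness vectors of two ensemble members.
import numpy as np

def pairwise_diversity(correct_i, correct_j):
    """correct_i, correct_j: boolean arrays, True where each classifier is right."""
    a = np.asarray(correct_i, bool)
    b = np.asarray(correct_j, bool)
    n11 = np.sum(a & b)          # both correct
    n00 = np.sum(~a & ~b)        # both wrong
    n10 = np.sum(a & ~b)         # only i correct
    n01 = np.sum(~a & b)         # only j correct
    disagreement = (n10 + n01) / len(a)
    q_stat = (n11 * n00 - n01 * n10) / (n11 * n00 + n01 * n10 + 1e-12)
    return disagreement, q_stat

rng = np.random.default_rng(0)
c1 = rng.random(500) < 0.8       # classifier 1 correct on ~80% of samples
c2 = rng.random(500) < 0.7       # classifier 2 correct on ~70% (independent here)
print(pairwise_diversity(c1, c2))
```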
Abstract:
In the visual perception literature, the recognition of faces has often been contrasted with that of non-face objects, in terms of differences with regard to the role of parts, part relations and holistic processing. However, recent evidence from developmental studies has begun to blur this sharp distinction. We review evidence for a protracted development of object recognition that is reminiscent of the well-documented slow maturation observed for faces. The prolonged development manifests itself in a retarded processing of metric part relations as opposed to that of individual parts and offers surprising parallels to developmental accounts of face recognition, even though the interpretation of the data is less clear with regard to holistic processing. We conclude that such results might indicate functional commonalities between the mechanisms underlying the recognition of faces and non-face objects, which are modulated by different task requirements in the two stimulus domains.