915 results for variance shadow maps
Abstract:
This paper is devoted to the analysis of career paths and employability. The state of the art on this topic is methodologically rather poor. Some authors propose distances well adapted to the data, but limit their analysis to hierarchical clustering. Other authors apply sophisticated methods, but only after paying the price of transforming the categorical data into continuous data via a factorial analysis. The latter approach has an important drawback, since it imposes a linearity assumption on the data. We propose a new methodology, inspired by biology and adapted to career paths, that combines optimal matching and self-organizing maps. A complete study on real-life data illustrates our proposal.
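A minimal sketch of the optimal-matching step such a methodology rests on: a dynamic-programming edit distance between two categorical career sequences, whose pairwise values would then feed a dissimilarity-based self-organizing map. The state labels, indel cost and default substitution cost below are illustrative assumptions, not values from the paper.

```python
# Sketch: optimal matching (edit) distance between two career sequences,
# i.e. the core dissimilarity computed before clustering with a self-organizing map.
# Costs and state labels are illustrative assumptions, not taken from the paper.

def optimal_matching(seq_a, seq_b, indel=1.0, subst=None):
    """Dynamic-programming edit distance between two categorical sequences."""
    subst = subst or {}
    n, m = len(seq_a), len(seq_b)
    d = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        d[i][0] = i * indel
    for j in range(1, m + 1):
        d[0][j] = j * indel
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            a, b = seq_a[i - 1], seq_b[j - 1]
            s = 0.0 if a == b else subst.get((a, b), 2.0)  # default substitution cost
            d[i][j] = min(d[i - 1][j] + indel,      # deletion
                          d[i][j - 1] + indel,      # insertion
                          d[i - 1][j - 1] + s)      # substitution / match
    return d[n][m]

# Example: two monthly career paths coded as E(mployed), U(nemployed), S(tudy)
print(optimal_matching("EEEUUS", "EEUUSS"))
```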
Abstract:
This paper presents a mapping and navigation system for a mobile robot, which uses vision as its sole sensor modality. The system enables the robot to navigate autonomously, plan paths and avoid obstacles using a vision-based topometric map of its environment. The map consists of a globally consistent pose graph with a local 3D point cloud attached to each of its nodes. These point clouds are used for direction-independent loop closure and to dynamically generate 2D metric maps for locally optimal path planning. Using this locally semi-continuous metric space, the robot performs shortest-path planning instead of simply following the nodes of the graph, as is done in most other vision-only navigation approaches. The system exploits the local accuracy of visual odometry in creating local metric maps, and uses pose-graph SLAM, visual appearance-based place recognition and point cloud registration to create the topometric map. The ability of the framework to sustain vision-only navigation is validated experimentally, and the system is provided as open-source software.
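A rough sketch of the topometric-map structure the abstract describes: pose-graph nodes that each carry a local 3D point cloud, from which a 2D metric grid can be rendered around a node for local path planning. The field names, grid parameters and projection below are illustrative assumptions, not the authors' data structures.

```python
# Sketch of a topometric map: a pose graph whose nodes each carry a local 3D
# point cloud, plus a helper that projects nearby clouds into a local 2D grid.
from dataclasses import dataclass, field
import numpy as np

@dataclass
class Node:
    node_id: int
    pose: np.ndarray        # 4x4 homogeneous transform in the global frame
    cloud: np.ndarray       # (N, 3) local 3D points, used for loop closure
                            # and for rendering local 2D metric maps

@dataclass
class TopometricMap:
    nodes: dict = field(default_factory=dict)   # node_id -> Node
    edges: list = field(default_factory=list)   # (id_a, id_b, relative 4x4 transform)

    def add_node(self, node):
        self.nodes[node.node_id] = node

    def add_edge(self, id_a, id_b, rel_transform):
        self.edges.append((id_a, id_b, rel_transform))

    def local_metric_map(self, node_id, radius=10.0, cell=0.1):
        """Project clouds near a node into a 2D occupancy grid for local planning."""
        centre = self.nodes[node_id].pose[:3, 3]
        points = []
        for n in self.nodes.values():
            if np.linalg.norm(n.pose[:3, 3] - centre) <= radius:
                # transform the node's local cloud into the global frame
                pts = (n.pose[:3, :3] @ n.cloud.T).T + n.pose[:3, 3]
                points.append(pts)
        pts = np.vstack(points)
        ij = np.floor((pts[:, :2] - centre[:2] + radius) / cell).astype(int)
        size = int(2 * radius / cell)
        grid = np.zeros((size, size), dtype=bool)
        ok = (ij >= 0).all(axis=1) & (ij < size).all(axis=1)
        grid[ij[ok, 1], ij[ok, 0]] = True
        return grid
```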
Abstract:
On 9 January 1927 Le Corbusier materialised on the front cover of Le Nouveau Siècle, the Faisceau journal edited by Georges Valois, which printed the single-point perspective of Le Corbusier’s Plan Voisin and an extract from the architect’s discourse in Urbanisme. In May Le Corbusier presented slides of his urban designs at a fascist rally. These facts have been known since the late 1980s, when studies emerged in art history that situated Le Corbusier’s philosophy in relation to the birth of twentieth-century fascism in France, an elision in the dominant reading of Le Corbusier’s philosophy as a project of social utopianism whose received genealogy is Saint-Simon and Charles Fourier. Le Corbusier participated in the first group in France to call itself fascist, Valois’s militant Faisceau des Combattants et Producteurs, the “Blue Shirts,” inspired by the Italian “Fasci” of Mussolini. Thanks to Mark Antliff, we know the Faisceau did not misappropriate Le Corbusier’s plans in some remote, quasi-symbolic sense; rather, Valois’s organisation was premised on the redesign of Paris based on Le Corbusier’s schematic designs. Le Corbusier’s Urbanisme was considered the “prodigious” model for the fascist state Valois called La Cité Française, after his mentor, the anarcho-syndicalist Georges Sorel. Valois stated that Le Corbusier’s architectural concepts were “an expression of our profoundest thoughts” for the Faisceau, who “saw their own thought materialized” on the pages of Le Corbusier’s plans. The question I pose is: in what sense is Le Corbusier’s plan a complete representation of La Cité? For Valois, the fascist city “represents the collective will of La Cité”, invoking the Enlightenment philosophy operative in Sorel, namely Rousseau, for whom the notion of “collective will” is linked to the idea of political representation: to ‘stand in’ for someone or a group of subjects, i.e. the majority vote. The figures in Voisin are not empty abstractions but the result of “the will” of the “combatant-producers” who build the town. Yet the paradox in anarcho-syndicalist anti-Enlightenment thought, and one that became a problem for Le Corbusier, is precisely that of authority and representation. In Le Corbusier’s plan, the “morality of the producers” and “the master” (the transcendent authority that hovers above La Cité) are flattened into a single picture plane, thereby abolishing representation. I argue that La Cité, pushed to the limits of formal abstraction by Le Corbusier, thereby reverts to the Enlightenment myth it first opposed, what Theodor Adorno would call the dialectic of enlightenment.
Abstract:
In our rejoinder to Don Weatherburn's paper, "Law and Order Blues", we do not take issue with his advocacy of the need to take crime seriously and to foster a more rational approach to the problems it poses. Where differences do emerge is (1) with his claim that he is willing to do so whilst we (in our different ways) are not; and (2) on the question of what this involves. Of particular concern is the way in which his argument proceeds by a combination of simple misrepresentation of the positions it seeks to disparage, and silence concerning issues of real substance where intellectual debate and exchange would be welcome and useful. Our paper challenges, in turn, the misrepresentation of Indermaur's analysis of trends in violent crime, the misrepresentation of Hogg and Brown's Rethinking Law and Order, the misrepresentation of the findings of some of the research into the effectiveness of punitive policies and the silence on sexual assault in "Law and Order Blues". We suggest that his silence on sexual assault reflects a more widespread unwillingness to acknowledge the methodological problems that arise in the measurement of crime because such problems severely limit the extent to which confident assertions can be made about prevalence and trends.
Abstract:
Robust descriptor matching across varying lighting conditions is important for vision-based robotics. We present a novel strategy for quantifying the lighting variance of descriptors. The strategy works by recovering low-dimensional mappings with Isomap and measuring the lighting variance of each of these mappings. The resulting metric allows different descriptors to be compared given a dataset and a set of keypoints. We demonstrate that the SIFT descriptor typically has lower lighting variance than other descriptors, although the result depends on semantic class and lighting conditions.
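A minimal sketch of one way to realise such a measure, assuming descriptors of the same keypoints observed under several lighting conditions: embed them with Isomap and score a descriptor by how much the embedded coordinates of each keypoint spread across conditions. The scoring rule below is an illustrative assumption, not the paper's exact metric.

```python
# Sketch: embed descriptors with Isomap, then score a descriptor type by how much
# the embedded coordinates of the same keypoint move across lighting conditions.
import numpy as np
from sklearn.manifold import Isomap

def lighting_variance(descriptors, keypoint_ids, n_components=2):
    """descriptors: (N, D) array, one row per (keypoint, lighting) observation.
    keypoint_ids: length-N array saying which keypoint each row belongs to."""
    embedded = Isomap(n_components=n_components).fit_transform(descriptors)
    scores = []
    for kp in np.unique(keypoint_ids):
        pts = embedded[keypoint_ids == kp]
        # variance of the low-dimensional coordinates across lighting conditions
        scores.append(pts.var(axis=0).sum())
    return float(np.mean(scores))  # lower means more lighting-invariant
```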
Abstract:
This paper proposes techniques to improve the performance of i-vector based speaker verification systems when only short utterances are available. Short-utterance i-vectors vary with the speaker, session variations, and the phonetic content of the utterance. Well-established methods such as linear discriminant analysis (LDA), source-normalized LDA (SN-LDA) and within-class covariance normalisation (WCCN) exist for compensating for session variation, but we have identified the variability introduced by phonetic content, due to utterance variation, as an additional source of degradation when short-duration utterances are used. To compensate for utterance variations in short i-vector speaker verification systems using cosine similarity scoring (CSS), we introduce a short utterance variance normalization (SUVN) technique and a short utterance variance (SUV) modelling approach at the i-vector feature level. A combination of SUVN with LDA and SN-LDA is proposed to compensate for the session and utterance variations and is shown to improve performance over the traditional approach of using LDA and/or SN-LDA followed by WCCN. An alternative approach is also introduced, using probabilistic linear discriminant analysis (PLDA) to directly model the SUV. The combination of SUVN, LDA and SN-LDA followed by SUV PLDA modelling provides an improvement over the baseline PLDA approach. We also show that, for this combination of techniques, the utterance variation information needs to be artificially added to full-length i-vectors for PLDA modelling.
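A minimal sketch of the cosine similarity scoring (CSS) baseline the abstract builds on: score an enrolment i-vector against a test i-vector, optionally after an LDA/SN-LDA style projection. The dimensions and random stand-in vectors below are assumptions for illustration, not trained models.

```python
# Sketch: cosine similarity scoring between two i-vectors after an optional
# LDA-style projection (columns of `projection` = retained discriminant directions).
import numpy as np

def cosine_score(w_enrol, w_test, projection=None):
    """Cosine similarity between enrolment and test i-vectors."""
    if projection is not None:
        w_enrol = projection.T @ w_enrol
        w_test = projection.T @ w_test
    return float(w_enrol @ w_test /
                 (np.linalg.norm(w_enrol) * np.linalg.norm(w_test)))

# Accept the trial if the score exceeds a tuned threshold.
rng = np.random.default_rng(0)
A = rng.standard_normal((400, 200))   # hypothetical 400-dim -> 200-dim projection
score = cosine_score(rng.standard_normal(400), rng.standard_normal(400), A)
```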
Abstract:
Operating in vegetated environments is a major challenge for autonomous robots. Obstacle detection based only on geometric features causes the robot to consider foliage, for example small grass tussocks that could easily be driven through, as obstacles. Classifying vegetation does not solve this problem, since there might be an obstacle hidden behind the vegetation; in addition, dense vegetation typically needs to be considered an obstacle. This paper addresses the problem by augmenting a probabilistic traversability map, constructed from laser data, with ultra-wideband radar measurements. An adaptive detection threshold and a probabilistic sensor model are developed to convert the radar data to occupancy probabilities. The resulting map retains the fine resolution of the laser map but clears areas of the traversability map that were induced by obstacle-free foliage. Experimental results validate that this method is able to improve the accuracy of traversability maps in vegetated environments.
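A small sketch of the kind of radar-to-occupancy conversion the abstract describes: declare a detection when the return exceeds an adaptive threshold over the local noise floor, map the outcome to a hit/miss probability, and fuse it into a cell with the standard log-odds update. All parameter values and the inverse sensor model are assumptions, not the paper's calibrated model.

```python
# Sketch: adaptive radar detection threshold -> occupancy probability -> log-odds fusion.
import math

def radar_occupancy_probability(power, noise_floor, k=3.0, p_hit=0.75, p_miss=0.35):
    """A detection is declared when the return power exceeds k times the locally
    estimated noise floor (adaptive threshold); map it to a hit/miss probability."""
    return p_hit if power > k * noise_floor else p_miss

def fuse_log_odds(cell_log_odds, p):
    """Standard log-odds occupancy update for a single grid cell."""
    return cell_log_odds + math.log(p / (1.0 - p))

# Example: a weak radar return over grass lowers the occupancy of a cell the
# laser had marked as an obstacle.
l = fuse_log_odds(0.8, radar_occupancy_probability(power=0.9, noise_floor=0.5))
occupancy = 1.0 - 1.0 / (1.0 + math.exp(l))
```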
Abstract:
In a total solar eclipse, the Moon completely covers the Sun, casting a shadow several hundred km wide across the face of the Earth. This paper describes observations of the 14 November 2012 total eclipse of the Sun visible from north Queensland, Australia. The edge of the umbra was captured on video during totality, and this video is provided for teaching purposes. A series of simple 'kitchen' experiments is described which demonstrates the 'sunset' effect seen on the horizon during a total solar eclipse and also the curved umbra seen in the sky when the eclipsed Sun is relatively close to the horizon.
Abstract:
BACKGROUND: The increasing number of assembled mammalian genomes makes it possible to compare genome organisation across mammalian lineages and reconstruct the chromosomes of the ancestral marsupial and therian (marsupial and eutherian) mammals. However, the reconstruction of ancestral genomes requires genome assemblies to be anchored to chromosomes. The recently sequenced tammar wallaby (Macropus eugenii) genome was assembled into over 300,000 contigs. We previously devised an efficient strategy for mapping large evolutionarily conserved blocks in non-model mammals, and applied it here to determine the arrangement of conserved blocks on all wallaby chromosomes, thereby permitting comparative maps to be constructed and resolving the long-debated issue of whether the ancestral marsupial karyotype was 2n=14 or 2n=22. RESULTS: We identified large blocks of genes conserved between human and opossum, and mapped genes corresponding to the ends of these blocks by fluorescence in situ hybridization (FISH). A total of 242 genes was assigned to wallaby chromosomes in the present study, bringing the total number of genes mapped to 554 and making this the most densely cytogenetically mapped marsupial genome. We used these gene assignments to construct comparative maps between wallaby and opossum, which uncovered many intrachromosomal rearrangements, particularly for genes found on wallaby chromosomes X and 3. Expanding the comparisons to include chicken and human permitted the putative ancestral marsupial (2n=14) and therian mammal (2n=19) karyotypes to be reconstructed. CONCLUSIONS: Our physical mapping data for the tammar wallaby have uncovered the events shaping marsupial genomes and enabled us to predict the ancestral marsupial karyotype, supporting a 2n=14 ancestor. Furthermore, our predicted therian ancestral karyotype has helped us to understand the evolution of the ancestral eutherian genome.
Abstract:
Recently, mean-variance analysis has been proposed as a novel paradigm for modelling document ranking in Information Retrieval. The main merit of this approach is that it diversifies the ranking of retrieved documents. In its original formulation, the strategy considers both the mean of the relevance estimates of retrieved documents and their variance. However, when this strategy has been empirically instantiated, the concepts of mean and variance are discarded in favour of a point-wise estimation of relevance (to replace the mean) and of a parameter to be tuned or, alternatively, a quantity dependent upon the document length (to replace the variance). In this paper we revisit this ranking strategy by going back to its roots: mean and variance. For each retrieved document, we infer a relevance distribution from a series of point-wise relevance estimations provided by a number of different systems. This is used to compute the mean and the variance of the document relevance estimates. On the TREC ClueWeb collection, we show that this approach improves retrieval performance. This development could lead to new strategies for addressing the fusion of relevance estimates provided by different systems.
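A minimal sketch of the mean-variance idea described above: for each retrieved document, take relevance estimates from several systems, compute their mean and variance, and rank by a risk-adjusted score. The rule mean - b * variance and the risk parameter b follow the usual portfolio-style formulation and are assumptions here, not the paper's exact instantiation.

```python
# Sketch: mean-variance ranking over per-system relevance estimates.
import numpy as np

def mean_variance_rank(scores, b=1.0):
    """scores: (n_docs, n_systems) relevance estimates, one column per system.
    Returns document indices ranked best-first."""
    mean = scores.mean(axis=1)
    var = scores.var(axis=1)          # disagreement between systems
    adjusted = mean - b * var         # penalise uncertain (high-variance) documents
    return np.argsort(-adjusted)

# Example: three documents scored by four systems
scores = np.array([[0.9, 0.8, 0.7, 0.9],
                   [0.9, 0.2, 0.9, 0.1],
                   [0.6, 0.6, 0.6, 0.6]])
print(mean_variance_rank(scores, b=2.0))   # -> [0 2 1]
```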
Abstract:
A defining characteristic of contemporary welfare governance in many western countries has been a reduced role for governments in the direct provision of welfare, including housing, education, health and income support. One of the unintended consequences of devolutionary trends in social welfare is the development of a ‘shadow welfare state’ (Fairbanks, 2009; Gottschalk, 2000), a term used to describe the complex partnerships between state-based social protection, voluntarism and marketised forms of welfare. Coupled with this development, conditional workfare schemes in countries such as the United States, Canada, the UK and Australia are pushing more people into informal and semi-formal means of poverty survival (Karger, 2005). These transformations are actively reshaping welfare subjectivities and the role of the state in urban governance. As in other countries such as the US, Canada and the UK, the fringe lending sector in Australia has experienced considerable growth over the last decade. Large numbers of people on low incomes in Australia are turning to non-mainstream financial services, such as payday lenders, for the provision of credit to make ends meet. In this paper, we argue that the use of fringe lenders by people on low incomes reveals important theoretical and practical insights into the relationship between the mixed economy of welfare and the mixed economy of credit in poverty survival.
Abstract:
We have analyzed segregation patterns of markers among the late generation progeny of several crosses of pea. From the patterns of association of these markers we have deduced linkage orders. Salient features of these linkages are discussed, as is the relationship between the data presented here and previously published genetic and cytogenetic data.
Abstract:
We examine some variations of standard probability designs that preferentially sample sites based on how easy they are to access. Preferential sampling designs deliver unbiased estimates of the mean and sampling variance and will ease the burden of data collection, but at what cost to design efficiency? Preferential sampling has the potential to either increase or decrease sampling variance, depending on the application. We carry out a simulation study to gauge what effect it has when sampling Soil Organic Carbon (SOC) values in a large agricultural region in south-eastern Australia. Preferential sampling in this region can reduce the distance to travel by up to 16%. Our study is based on a dataset of predicted SOC values produced from a data-mining exercise. We consider three designs and two ways to determine ease of access. The overall conclusion is that sampling performance deteriorates as the strength of preferential sampling increases, because the regions of high SOC are harder to access, so our designs inadvertently target regions of low SOC value. The good news, however, is that Generalised Random Tessellation Stratification (GRTS) sampling designs are not as badly affected as others, and GRTS remains an efficient design compared to competitors.
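A small sketch of the idea of accessibility-weighted (preferential) sampling, with an estimator of the regional mean that stays unbiased under the weighting even though efficiency can change. The synthetic SOC values, the distance-based ease-of-access measure and the inverse-distance weighting rule are all illustrative assumptions, not choices from the paper.

```python
# Sketch: sites closer to a track are more likely to be drawn (preferential
# sampling); a Hansen-Hurwitz style estimator keeps the mean estimate unbiased.
import numpy as np

rng = np.random.default_rng(1)
n_sites, n_draws = 1000, 50
soc = rng.gamma(shape=2.0, scale=1.5, size=n_sites)     # synthetic SOC values
dist = rng.uniform(1.0, 50.0, size=n_sites)             # distance to nearest track

def preferential_mean_estimate(strength):
    w = dist ** (-strength)            # easier (closer) sites get more weight
    p = w / w.sum()                    # per-draw selection probabilities
    chosen = rng.choice(n_sites, size=n_draws, replace=True, p=p)
    # Hansen-Hurwitz estimator of the regional mean: unbiased for any strength
    return np.mean(soc[chosen] / (n_sites * p[chosen]))

print(soc.mean(), preferential_mean_estimate(0.0), preferential_mean_estimate(1.5))
```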
Abstract:
In Crypto ’95, Micali and Sidney proposed a method for the shared generation of a pseudo-random function f(·) among n players in such a way that for all inputs x, any u players can compute f(x) while t or fewer players cannot, where 0 ≤ t < u ≤ n. The idea behind the Micali-Sidney scheme is to generate and distribute secret seeds S = {s1, ..., sd} of a poly-random collection of functions among the n players, each player receiving a subset of S, in such a way that any u players together hold all the secret seeds in S while any t or fewer players lack at least one element of S. The pseudo-random function is then computed as f(x) = f_{s1}(x) ⊕ ··· ⊕ f_{sd}(x), where the f_{si}(·)'s are poly-random functions. One question raised by Micali and Sidney is how to distribute the secret seeds satisfying the above condition such that the number of seeds, d, is as small as possible. In this paper, we continue the work of Micali and Sidney. We first provide a general framework for the shared generation of pseudo-random functions using cumulative maps, and demonstrate that the Micali-Sidney scheme is a special case of this general construction. We then derive an upper and a lower bound for d. Finally, we give a simple yet efficient greedy approximation algorithm for generating the secret seeds S, in which d is within a factor of at most u ln 2 of the optimum.
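A minimal sketch of a cumulative-map style seed distribution for the threshold condition stated in the abstract: one seed per t-subset of players, withheld exactly from that subset, so any t or fewer players miss at least one seed while any u > t players jointly hold them all. This is the generic construction (d = C(n, t)), not the paper's optimised distribution, and HMAC stands in for a poly-random function.

```python
# Sketch: seed distribution via t-subsets, plus the XOR-of-PRFs evaluation.
import hashlib, hmac, itertools, secrets
from functools import reduce

def distribute_seeds(n, t):
    """Return (seeds, shares) where shares[p] is the set of seed indices held by player p."""
    seeds, shares = [], {p: set() for p in range(n)}
    for idx, bad_set in enumerate(itertools.combinations(range(n), t)):
        seeds.append(secrets.token_bytes(16))
        for p in range(n):
            if p not in bad_set:              # withhold seed idx exactly from bad_set
                shares[p].add(idx)
    return seeds, shares

def shared_prf(seeds, held_indices, x):
    """f(x) = XOR over all seeds of HMAC(seed, x); requires holding every seed."""
    assert held_indices == set(range(len(seeds))), "coalition too small to evaluate f"
    parts = [hmac.new(seeds[i], x, hashlib.sha256).digest() for i in sorted(held_indices)]
    return reduce(lambda a, b: bytes(p ^ q for p, q in zip(a, b)), parts)

seeds, shares = distribute_seeds(n=5, t=2)        # d = C(5, 2) = 10 seeds
coalition = shares[0] | shares[1] | shares[2]     # any u = 3 players hold all seeds
print(shared_prf(seeds, coalition, b"input").hex())
```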