764 results for Labels.
An external field prior for the hidden Potts model with application to cone-beam computed tomography
Abstract:
In images with low contrast-to-noise ratio (CNR), the information gain from the observed pixel values can be insufficient to distinguish foreground objects. A Bayesian approach to this problem is to incorporate prior information about the objects into a statistical model. A method for representing spatial prior information as an external field in a hidden Potts model is introduced. This prior distribution over the latent pixel labels is a mixture of Gaussian fields, centred on the positions of the objects at a previous point in time. It is particularly applicable in longitudinal imaging studies, where the manual segmentation of one image can be used as a prior for automatic segmentation of subsequent images. The method is demonstrated by application to cone-beam computed tomography (CT), an imaging modality that exhibits distortions in pixel values due to X-ray scatter. The external field prior results in a substantial improvement in segmentation accuracy, reducing the mean pixel misclassification rate for an electron density phantom from 87% to 6%. The method is also applied to radiotherapy patient data, demonstrating how to derive the external field prior in a clinical context.
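The prior described here lends itself to a compact implementation. Below is a minimal NumPy sketch of one plausible reading, a mixture of isotropic Gaussian fields centred on the previous object positions; the array shapes, background weight, and all parameter values are assumptions, not taken from the paper.

```python
import numpy as np

def external_field_prior(shape, centers, sigma, n_labels, bg_weight=0.1):
    """Sketch only: mixture-of-Gaussians external field for a hidden
    Potts model. `centers` maps a label to the object's position in a
    previous image; all names and defaults are assumptions."""
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    field = np.full((n_labels,) + shape, bg_weight)  # flat background mass
    for label, (cy, cx) in centers.items():
        d2 = (yy - cy) ** 2 + (xx - cx) ** 2
        field[label] += np.exp(-d2 / (2.0 * sigma ** 2))
    # Normalise so each pixel carries a prior distribution over labels.
    return field / field.sum(axis=0, keepdims=True)

# Two objects whose positions come from a manual segmentation of an
# earlier image in the longitudinal series (coordinates illustrative).
prior = external_field_prior((64, 64), {1: (20, 20), 2: (40, 45)},
                             sigma=5.0, n_labels=3)
```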
Abstract:
In this paper we present for the first time a complete symbolic navigation system that performs goal-directed exploration in unfamiliar environments on a physical robot. We introduce a novel construct called the abstract map to link provided symbolic spatial information with observed symbolic information and actual places in the real world. Symbolic information is observed using a text recognition system that has been developed specifically for the application of reading door labels. In the study described in this paper, the robot was provided with a floor plan and a destination. The destination was specified by a room number, used both in the floor plan and on the door to the room. The robot autonomously navigated to the destination using its text recognition, abstract map, mapping, and path planning systems. The robot used the symbolic navigation system to determine an efficient path to the destination, and reached the goal in two different real-world environments. Simulation results show that the system reduces the time required to navigate to a goal when compared to random exploration.
Abstract:
This paper presents a technique for the automated removal of noise from process execution logs. Noise is the result of data quality issues such as logging errors and manifests itself in the form of infrequent process behavior. The proposed technique generates an abstract representation of an event log as an automaton capturing the directly-follows relations between event labels. Arcs with low relative frequency are then pruned from this automaton, and events that do not fit the pruned automaton are identified as outliers and removed from the log. The technique has been extensively evaluated on top of various automated process discovery algorithms, using both artificial logs with different levels of noise and a variety of real-life logs. The results show that the technique significantly improves the quality of the discovered process model in terms of fitness, appropriateness and simplicity, without negative effects on generalization. Further, the technique scales well to large and complex logs.
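A minimal sketch of the kind of filter the abstract describes, assuming a simple reading of "relative frequency" (arc count over total arc count); the threshold value and the log encoding are illustrative, not the paper's.

```python
from collections import Counter

def filter_log(log, threshold=0.2):
    """Sketch: build a directly-follows automaton over event labels,
    prune low-frequency arcs, drop events that no longer fit."""
    arcs = Counter((a, b) for trace in log for a, b in zip(trace, trace[1:]))
    total = sum(arcs.values())
    kept = {arc for arc, n in arcs.items() if n / total >= threshold}

    filtered = []
    for trace in log:
        clean = trace[:1]
        for event in trace[1:]:
            if (clean[-1], event) in kept:
                clean.append(event)      # event fits the pruned automaton
            # otherwise the event is treated as an outlier and skipped
        filtered.append(clean)
    return filtered

log = [["a", "b", "c"], ["a", "b", "c"], ["a", "b", "x", "c"]]
print(filter_log(log))  # the infrequent "x" event is removed
```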
Abstract:
In the United States, there has been fierce debate over state, federal and international efforts to engage in genetically modified food labelling (GM food labelling). A grassroots coalition of consumers, environmentalists, organic farmers, and the food movement has pushed for law reform in respect of GM food labelling. The Just Label It campaign has encouraged United States consumers to send comments to the United States Food and Drug Administration to label genetically modified foods. This Chapter explores the various justifications made in respect of genetically modified food labelling. There has been a considerable effort to portray the issue of GM food labelling as one of consumer rights, as part of 'the right to know'. There has been a significant battle amongst farmers over GM food labelling, with organic farmers and biotechnology companies fighting for precedence. There has also been a significant discussion about the use of GM food labelling as a form of environmental legislation. The prescriptions in GM food labelling regulations may serve to promote eco-labelling and deter greenwashing. There has been a significant debate over whether GM food labelling may serve to regulate corporations – particularly from the food, agriculture, and biotechnology industries. There are significant issues about the interaction between intellectual property laws – particularly in respect of trade mark law and consumer protection – and regulatory proposals focused upon biotechnology. There has been a lack of international harmonization in respect of GM food labelling. As such, there has been a major use of comparative arguments about regulatory models in respect of food labelling. There has also been a discussion about international law, particularly with the emergence of sweeping regional trade proposals, such as the Trans-Pacific Partnership and the Trans-Atlantic Trade and Investment Partnership. This Chapter considers the United States debates over genetically modified food labelling – at state, federal, and international levels. The battles often involved the use of citizen-initiated referenda. The policy conflicts have been policy-centric disputes – pitting organic farmers, consumers, and environmentalists against the food industry and biotechnology industry. Such battles have raised questions about consumer rights, public health, freedom of speech, and corporate rights. The disputes highlighted larger issues about lobbying, fund-raising, and political influence. The role of money in the United States has been a prominent concern of Lawrence Lessig in his recent academic and policy work with the group Rootstrikers. Part 1 considers the debate in California over Proposition 37. Part 2 explores other key state initiatives in respect of GM food labelling. Part 3 examines the Federal debate in the United States over GM food labelling. Part 4 explores whether regional trade agreements – such as the Trans-Pacific Partnership (TPP) and the Trans-Atlantic Trade and Investment Partnership (TTIP) – will impact upon GM food labelling.
Abstract:
This paper addresses the problem of predicting the outcome of an ongoing case of a business process based on event logs. In this setting, the outcome of a case may refer, for example, to the achievement of a performance objective or the fulfillment of a compliance rule upon completion of the case. Given a log consisting of traces of completed cases, a trace of an ongoing case, and two or more possible outcomes (e.g., a positive and a negative outcome), the paper addresses the problem of determining the most likely outcome for the case in question. Previous approaches to this problem are largely based on simple symbolic sequence classification, meaning that they extract features from traces seen as sequences of event labels, and use these features to construct a classifier for runtime prediction. In doing so, these approaches ignore the data payload associated with each event. This paper approaches the problem from a different angle by treating traces as complex symbolic sequences, that is, sequences of events each carrying a data payload. In this context, the paper outlines different feature encodings of complex symbolic sequences and compares their predictive accuracy on real-life business process event logs.
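To make the contrast concrete, here is a small sketch of the complex-symbolic-sequence view of a trace; the event dictionaries, attribute names, and prefix-length scheme are assumptions for illustration rather than the paper's actual encodings.

```python
def index_based_encoding(trace, prefix_len, attrs):
    """Sketch: one feature per prefix position, keeping each event's
    data payload. A simple-symbolic-sequence baseline would keep only
    the label features and discard the payload columns."""
    feats = {}
    for i, event in enumerate(trace[:prefix_len]):
        feats[f"label_{i}"] = event["label"]
        for a in attrs:                      # the data payload
            feats[f"{a}_{i}"] = event.get(a)
    return feats

# Here the payload attribute "amount" travels with each event.
trace = [{"label": "register", "amount": 120.0},
         {"label": "check",    "amount": 120.0}]
print(index_based_encoding(trace, prefix_len=2, attrs=["amount"]))
```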
Abstract:
Similar to most other creative industries, the evolution of the music industry is heavily shaped by media technologies. This was equally true in 1999, when the global recorded music industry had experienced two decades of continuous growth, largely driven by the rapid transition from vinyl records to Compact Discs. The transition encouraged avid music listeners to purchase much of their music collections all over again in order to listen to their favourite music with 'digital sound'. As a consequence of this successful product innovation, recorded music sales (unit measure) more than doubled between the early 1980s and the end of the 1990s. It was against this backdrop that the first peer-to-peer file sharing service was developed and released to the mainstream music market in 1999 by the college student Shawn Fanning. The service was named Napster, and it marks the beginning of an era that is now a classic example of how an innovation is able to disrupt an entire industry and make large swathes of existing industry competences obsolete. File sharing services such as Napster, followed by a range of similar services in its path, reduced physical unit sales in the music industry to levels that had not been seen since the 1970s. The severe impact of the internet on physical sales shocked many music industry executives, who spent much of the 2000s vigorously trying to reverse the decline and make the disruptive technologies go away. In the end, they learned that their efforts were to no avail: the impact on the music industry proved to be transformative, irreversible and, to many music industry professionals, also devastating. Thousands of people lost their livelihoods, and large and small music companies have folded or been forced into mergers or acquisitions. But as always during periods of disruption, the past 15 years have also been very innovative, spurring a plethora of new music business models. These new business models have mainly emerged outside the music industry, and the innovators have often been required to be both persuasive and persistent in order to gain acceptance from the risk-averse and cash-poor music industry establishment. Apple was one such change agent: in 2003 it was the first company to open up a functioning and legal market for online music. iTunes Music Store was the first online retail outlet able to offer the music catalogues of all the major music companies; it used an entirely novel pricing model, and it allowed consumers to de-bundle the music album and buy only the songs that they actually liked. Songs had previously been bundled by physical necessity as discs or cassettes, but with iTunes Music Store, the institutionalized album bundle slowly started to fall apart. The consequences had an immediate impact on music retailing, and within just a few years many brick-and-mortar record stores were forced out of business in markets across the world. The transformation also had disruptive consequences beyond music retailing and redefined music companies' organizational structures, work processes and routines, as well as professional roles. iTunes Music Store was in one sense a disruptive innovation, but it was at the same time relatively incremental, since the major labels' positions and power structures remained largely unscathed. The rights holders still controlled their intellectual properties, and the structures that governed the royalties paid per song sold were predictable, transparent and in line with established music industry practices.
Abstract:
High-end network security applications demand high-speed operation and large rule set support. Packet classification is the core functionality that demands high throughput in such applications. This paper proposes a packet classification architecture to meet such high throughput. We have implemented a Firewall with this architecture in reconfigurable hardware. We propose an extension to the Distributed Crossproducting of Field Labels (DCFL) technique to achieve a scalable and high-performance architecture. The implemented Firewall takes advantage of the inherent structure and redundancy of the rule set by using our DCFL Extended (DCFLE) algorithm. The use of the DCFLE algorithm results in both speed and area improvements when implemented in hardware. Although we restrict ourselves to standard 5-tuple matching, the architecture supports additional fields. High-throughput classification invariably uses Ternary Content Addressable Memory (TCAM) for prefix matching, though TCAM fares poorly in terms of area and power efficiency. Use of TCAM for port range matching is expensive, as the range-to-prefix conversion results in a large number of prefixes, leading to storage inefficiency. Extended TCAM (ETCAM) is fast and the most storage-efficient solution for range matching. We present for the first time a reconfigurable hardware implementation of ETCAM. We have implemented our Firewall as an embedded system on a Virtex-II Pro FPGA based platform, running Linux with the packet classification in hardware. The Firewall was tested in real time with a 1 Gbps Ethernet link and 128 sample rules. The packet classification hardware uses a quarter of the logic resources and slightly over one third of the memory resources of the XC2VP30 FPGA. It achieves a maximum classification throughput of 50 million packets/s, corresponding to a 16 Gbps link rate for the worst-case packet size. A Firewall rule update involves only memory re-initialization in software, without any hardware change.
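The crossproducting idea can be illustrated in software, although the paper's contribution is a hardware architecture. Below is a much-reduced sketch in which exact-match tables stand in for the prefix and range matchers, and all table contents are invented for illustration.

```python
def classify(packet, field_tables, rule_table, default="deny"):
    """Reduced sketch of crossproducting of field labels: match each
    header field independently to a small label, then look the label
    tuple up in a precomputed rule table. Real DCFL (and the DCFLE
    extension above) aggregates labels in stages in hardware."""
    labels = tuple(table.get(packet[field], 0)   # 0 = "no specific match"
                   for field, table in field_tables.items())
    return rule_table.get(labels, default)

# Two fields of the 5-tuple, contents illustrative only.
field_tables = {"src": {"10.0.0.1": 1}, "dport": {80: 1, 443: 2}}
rule_table = {(1, 1): "accept", (1, 2): "accept"}
print(classify({"src": "10.0.0.1", "dport": 80}, field_tables, rule_table))
```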
Abstract:
We examined whether homophobic epithets (e.g., faggot) function as labels of deviance for homosexuals that contribute to their dehumanization and physical distance. Across two studies, participants were supraliminally (Study 1) and subliminally (Study 2) exposed to a homophobic epithet, a category label, or a generic insult. Participants were then asked to associate human-related and animal-related words with homosexuals and heterosexuals. Results showed that after exposure to a homophobic epithet, compared with a category label or a generic insult, participants associated fewer human-related words with homosexuals, indicating dehumanization. In Study 2, we also assessed the effect of a homophobic epithet on physical distance from a target group member and found that homophobic epithets led to greater physical distancing of a gay man. These findings indicate that homophobic epithets foster dehumanization and avoidance of gay people in ways that other insults or labels do not.
Abstract:
The minimum cost classifier, when general cost functions are associated with the tasks of feature measurement and classification, is formulated as a decision graph which does not reject class labels at intermediate stages. Noting its complexities, a heuristic procedure to simplify this scheme to a binary decision tree is presented. The optimization of the binary tree in this context is carried out using dynamic programming. This technique is applied to the voiced-unvoiced-silence classification in speech processing.
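The stop-or-measure trade-off behind such a tree can be written as a short dynamic program. Here is a toy sketch assuming class-conditionally independent binary features, a scalar misclassification cost, and invented probabilities; the paper's actual formulation may differ.

```python
from functools import lru_cache

# Toy problem, all numbers invented: two classes, two binary features,
# per-feature measurement costs, scalar misclassification cost.
PRIOR = {"voiced": 0.5, "unvoiced": 0.5}
P_FEAT = {"zero_cross": {"voiced": 0.2, "unvoiced": 0.8},   # P(f=1 | class)
          "energy":     {"voiced": 0.9, "unvoiced": 0.3}}
MEASURE = {"zero_cross": 0.1, "energy": 0.3}
MISCLASS = 1.0

def posterior(obs):
    w = dict(PRIOR)
    for f, v in obs:
        for c in w:
            w[c] *= P_FEAT[f][c] if v else 1 - P_FEAT[f][c]
    z = sum(w.values())
    return {c: w[c] / z for c in w}

@lru_cache(maxsize=None)
def value(obs=()):
    """Either classify now (Bayes decision) or pay to measure one more
    feature and recurse; dynamic programming over observation tuples."""
    post = posterior(obs)
    best = min(MISCLASS * (1 - post[c]) for c in post)   # stop and decide
    for f in P_FEAT:
        if any(f == g for g, _ in obs):
            continue                                     # already measured
        p1 = sum(post[c] * P_FEAT[f][c] for c in post)   # P(f=1 | obs)
        best = min(best, MEASURE[f] + p1 * value(obs + ((f, 1),))
                                    + (1 - p1) * value(obs + ((f, 0),)))
    return best

print(value())   # expected cost of the optimal measure-or-stop policy
```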
Abstract:
This dissertation analyzes the interrelationship between death, the conditions of (wo)man's social being, and the notion of value as it emerges in the fiction of the American novelist Thomas Pynchon (1937– ). Pynchon's work to date includes six novels – V. (1963), The Crying of Lot 49 (1966), Gravity's Rainbow (1973), Vineland (1990), Mason & Dixon (1997), Against the Day (2006) – and several short stories. Death constitutes a central thematic in Pynchon's work, and it emerges through recurrent questions of mortality, suicide, mass destruction, sacrifice, afterlife, entropy, the relationship between the animate and the inanimate, and the limits of representation. In Pynchon, death is never a mere biological given (or event); it is always determined within a certain historical, cultural, and ideological context. Throughout his work, Pynchon questions the strict ontological separation of life and death by showing the relationship between this separation and social power. Conceptual divisions also reflect the relationship between society and its others, and death becomes that through which lines of social demarcation are articulated. Determined as a conceptual and social "other side", death in Pynchon forms a challenge to modern culture, and makes an unexpected return: the dead return to haunt the living, the inanimate and the animate fuse, and technoscientific attempts at overcoming and controlling death result in its re-emergence in mass destruction and ecological damage. The questioning of the ontological line also affects the structuration of Pynchon's prose, where the recurrent narrated and narrative desire to reach the limits of representation is openly associated with death. Textualized, death appears in Pynchon's writing as a sudden rupture within the textual functioning, when the "other side", that is, the bare materiality of the signifier, is foregrounded. In this study, Pynchon's cultural criticism and his poetics come together, and I analyze the subversive role of death in his fiction through Jean Baudrillard's genealogy of the modern notion of death from L'échange symbolique et la mort (1976). Baudrillard sees an intrinsic bond between the social repression of death in modernity and the emergence of modern political economy, and in his analysis economy and language appear as parallel systems for generating value (exchange value/sign-value). For Baudrillard, the modern notion of death as negativity in relation to the positivity of life, and the fact that death cannot be given a proper meaning, betray an antagonistic relation between death and the notion of value. As a mode of negativity (that is, non-value), death becomes a moment of rupture in relation to value-based thinking – in short, rationalism. Through this rupture emerges a form of thinking Baudrillard labels the symbolic, characterized by ambivalence and the subversion of conceptual opposites.
Abstract:
The statistical minimum risk pattern recognition problem, when the classification costs are random variables of unknown statistics, is considered. Using medical diagnosis as a possible application, the problem of learning the optimal decision scheme is studied for a two-class, two-action case, as a first step. This reduces to the problem of learning the optimum threshold (for taking the appropriate action) on the a posteriori probability of one class. A recursive procedure for updating an estimate of the threshold is proposed. The estimation procedure does not require knowledge of the actual class labels of the sample patterns in the design set. The adaptive scheme of using the present threshold estimate for taking action on the next sample is shown to converge, in probability, to the optimum. The results of a computer simulation study of three learning schemes demonstrate the theoretically predictable salient features of the adaptive scheme.
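The paper's exact recursion is not reproduced in the abstract. As an illustration of learning a cost-optimal threshold from observed random costs alone, here is a Kiefer-Wolfowitz-style stochastic-approximation sketch with an invented cost model; it is a stand-in, not the paper's procedure.

```python
import random

def observed_cost(posterior, threshold):
    """One interaction: act with the current threshold on a pattern and
    return the random cost incurred. The true class stays hidden from
    the learner; the cost distributions are invented."""
    true_class = 1 if random.random() < posterior else 0
    action = 1 if posterior > threshold else 0
    return random.gauss(0.2, 0.05) if action == true_class \
        else random.gauss(1.0, 0.1)

def learn_threshold(n_steps=20000, t=0.9):
    """Finite-difference estimate of the cost gradient in the threshold,
    with decaying step sizes, no class labels required."""
    for n in range(1, n_steps + 1):
        a_n, c_n = 1.0 / n, 1.0 / n ** 0.25
        p = random.random()                  # posterior of class 1
        grad = (observed_cost(p, t + c_n) - observed_cost(p, t - c_n)) / (2 * c_n)
        t = min(1.0, max(0.0, t - a_n * grad))
    return t

print(learn_threshold())   # drifts toward the cost-optimal threshold (0.5 here)
```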
Abstract:
Relaxation labeling processes are a class of mechanisms that solve the problem of assigning labels to objects in a manner that is consistent with respect to some domain-specific constraints. We reformulate this using the model of a team of learning automata interacting with an environment or a high-level critic that gives noisy responses as to the consistency of a tentative labeling selected by the automata. This results in an iterative linear algorithm that is itself probabilistic. Using an explicit definition of consistency we give a complete analysis of this probabilistic relaxation process using weak convergence results for stochastic algorithms. Our model can accommodate a range of uncertainties in the compatibility functions. We prove a local convergence result and show that the point of convergence depends both on the initial labeling and the constraints. The algorithm is implementable in a highly parallel fashion.
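A small sketch of the automata-team formulation described above, using a linear reward-inaction update and a toy noisy consistency signal; the constraint model, learning rate, and response scaling are assumptions, not the paper's specification.

```python
import random

def consistency(labeling):
    """Noisy critic response in [0, 1]; the toy constraint rewards
    neighbouring objects taking different labels (invented)."""
    ok = sum(a != b for a, b in zip(labeling, labeling[1:]))
    base = ok / max(1, len(labeling) - 1)
    return min(1.0, max(0.0, base + random.gauss(0.0, 0.1)))

def relax(n_objects=5, n_labels=3, steps=3000, lam=0.05):
    """One automaton per object holds a probability vector over labels;
    a linear reward-inaction update is scaled by the noisy response."""
    probs = [[1.0 / n_labels] * n_labels for _ in range(n_objects)]
    for _ in range(steps):
        labeling = [random.choices(range(n_labels), p)[0] for p in probs]
        beta = consistency(labeling)          # noisy critic response
        for p, chosen in zip(probs, labeling):
            for j in range(n_labels):         # reward-inaction update
                p[j] += lam * beta * ((1 - p[j]) if j == chosen else -p[j])
    return probs

print(relax())   # per-object label probabilities after relaxation
```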
Abstract:
The use of paramagnetic probes in membrane research is reviewed. Electron paramagnetic resonance studies on model and biological membranes doped with covalent and non-covalent spin-labels have been discussed with special emphasis on the methodology and the type of information obtainable on several important phenomena like membrane fluidity, lipid flip-flop, lateral diffusion of lipids, lipid phase separation, lipid bilayer phase transitions, lipid-protein interactions and membrane permeability. Nuclear magnetic resonance spectroscopy has also been effectively used to study the conformations of cation mediators across membranes and to analyse in detail the transmembrane ionic motions at the mechanistic level.
Abstract:
Australian shoppers have inadvertently invited global discount grocers to our shores by demonstrating their readiness to adopt private labels. In 2001, German discounter Aldi opened its first store in Sydney. The impact this business format would have on the Australian grocery sector was underestimated.