998 results for false set


Relevance: 60.00%

Abstract:

An examination of Australian media reports over the last twelve months on the subject of Indigenous arts suggests a number of significant contradictions. Indigenous Affairs Minister Amanda Vanstone called Aboriginal arts ‘Australia’s greatest cultural gift to the world’ (Australian, 24 January 2006), while the always-controversial expatriate Germaine Greer argued that much Indigenous art was in fact poor quality and ‘a big con’ (West Australian, 13 December 2005). Curators at France’s Musée du Quai Branly dedicated a wing of the new gallery to Aboriginal art. Yet many Indigenous leaders – including David Ross from the Central Land Council and Hetti Perkins, curator of Indigenous Arts at the Art Gallery of NSW – continue to publicise the widespread exploitation of Aboriginal artists in Central Australia by unscrupulous art dealers (Northern Territory News, 22 December 2005). Former head of the Northern Land Council and former Australian of the Year Galarrwuy Yunupingu, who twenty years ago presented Bob Hawke with the painting Barunga Statement in celebration of the government’s commitment to a treaty, recently threatened to take the painting back from Parliament House in protest against successive governments’ neglect of Indigenous policy (Sydney Morning Herald, 21 January 2006). And in the performing arts, Richard Walley drew attention to the lack of professional recognition of Indigenous performing artists (Australian, 24 January 2006).

Such contradictions within the management and marketing of Indigenous arts have persisted for several years, and it was in response to them that this special issue of the Asia Pacific Journal of Arts and Cultural Management was initiated. As guest editors, we sought to present research that examines, more deeply and constructively, the marketing of Indigenous arts in Australia, both historically and in the present. What emerges from this collection of five papers is a familiar scholarly theme: a tension between the ‘periphery’ and the ‘centre’ – between outback and city, between larger and smaller Australian states, and between Australia and other nations.

Jonathan Sweet’s ‘UNESCO and cultural heritage practice in Australia in the 1950s’ looks at the evolving relationship between Australia and the United Nations through an analysis of a significant touring exhibition: Australian Aboriginal Culture. Sweet pinpoints the 1950s as a period in which Australian museology’s approach to Indigenous cultures gradually changed, and in which Australian participation in UNESCO through the exhibition helped shape the ideological position UNESCO advocated. His article provides a useful historical contrast against which the following four articles may be read.

Chapman, Cardamone, Manahan and Rentschler look at local and contemporary issues in Indigenous arts marketing. Katrina Chapman’s ‘Positioning urban Aboriginal art in the Australian Indigenous art market’ investigates perceptions about contemporary urban Aboriginal art, concluding that the estrangement – and indeed stereotyping – of urban and traditional art creates a false set of values that urban artists are challenging. Similarly, Megan Cardamone, Esmai Manahan and Ruth Rentschler contrast perceptions of Aboriginal arts from the northern and south-eastern states, identifying crucial misconceptions that contribute to the value system applied to these arts. As Ruth Rentschler is a joint editor of this issue, the review process for this article has been managed by Katya Johanson as co-editor.

Two case studies of marketing the arts – which look at different artforms on opposite sides of the country – then follow. Jennifer Radbourne, Janet Campbell and Vera Ding’s ‘Building audiences for Indigenous theatre’ analyses research on audiences and potential audiences for Kooemba Jdarra – Brisbane’s Indigenous performing arts company – to identify the ways in which audience attendance may be encouraged.

Finally, Jacqui Healy’s ‘Balgo 4-04’ provides a close examination of a unique art exhibition: a major commercial exhibition of the kind usually seen in Sydney and Melbourne, held in an arts centre in the middle of the Tanami Desert and retailing directly to collectors.

The editors are grateful to Warlayirti Artists Art Centre for permission to use the photographs that accompany Jacqui Healy’s article. We would also like to thank the contributors, Jo Caust for the opportunity to present this special issue, and Pearl Field for her assistance in putting it all together.

Relevance: 60.00%

Abstract:

When hydrated cement compositions are analyzed on the usual initial-mass basis of TG curves to calculate mass losses, the higher the amount of additive added, or the higher the combined water content, the greater the ‘dilution’ of cement in the initial mass of the sample. In such cases, smaller mass changes are obtained in the different mass-loss steps, because the actual cement content in the initial composition is smaller. To compare samples on the same mass basis, and to avoid the erroneous estimates of initial component contents that would otherwise result, thermal analysis data and curves have to be transformed to a cement calcined basis, i.e. to the basis of the mass of cement oxides present in the calcined sample, or to the basis of the initial mass of cement in the sample. The paper presents and discusses the fundamentals of these bases of calculation, with examples on free and combined water analysis, on calcium sulfate hydration during false cement set, and on the quantitative evaluation and comparison of the activity of pozzolanic materials.
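As a back-of-the-envelope illustration of the rebasing described here, the Python sketch below converts one TG mass-loss step from the initial-mass basis to the calcined basis by dividing by the calcined residue fraction; the function name and all figures are invented for illustration, not taken from the paper.

```python
# Hedged sketch: re-express a TG mass-loss step, measured as % of the initial
# sample mass, on the calcined (cement-oxides) basis, so that samples with
# different additive or combined-water contents share a common denominator.
# All names and figures are illustrative assumptions, not data from the paper.

def to_calcined_basis(step_pct_initial: float, calcined_residue_pct: float) -> float:
    """step_pct_initial: one mass-loss step, as % of the initial sample mass.
    calcined_residue_pct: mass remaining after calcination, as % of the initial mass.
    Returns the same step expressed as % of the calcined residue mass."""
    return 100.0 * step_pct_initial / calcined_residue_pct

# Example: a 4.2% combined-water loss in a sample whose calcined residue is
# 71.5% of the initial mass is ~5.9% on the calcined basis.
print(to_calcined_basis(4.2, 71.5))
```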

Relevance: 30.00%

Abstract:

Ever since Cox et al. published their paper “A Secure, Robust Watermark for Multimedia” in 1996 [6], there has been tremendous progress in multimedia watermarking. The same pattern re-emerged when Agrawal and Kiernan published their work “Watermarking Relational Databases” in 2001 [1]. However, little attention has been given to primitive data collections, with only a handful of research works known to the authors [11, 10]. This is primarily due to the absence of an attribute that differentiates marked items from unmarked items during the insertion and detection processes. This paper presents a distribution-independent watermarking model that is secure against secondary watermarking in addition to conventional attacks such as data addition, deletion and distortion. Low false positives and high capacity provide additional strength to the scheme. These claims are backed by the experimental results provided in the paper.
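The obstacle named above – no attribute to separate marked from unmarked items – can be made concrete with a generic keyed-selection sketch in Python. This illustrates the general technique only, not the authors’ model; the key, the selection fraction and the toy embedding are all assumptions.

```python
# Generic illustration, not the scheme from the paper: a keyed hash selects
# which items of a primitive data collection carry a mark, so detection can
# re-derive the selection without any extra differentiating attribute.
import hmac, hashlib

SECRET_KEY = b"owner-secret"   # hypothetical watermarking key
SELECT_MOD = 8                 # mark roughly 1 item in 8

def is_candidate(value: float) -> bool:
    # Hash only the integer part so the small mark embedded in the fraction
    # cannot flip the selection decision between embedding and detection.
    tag = hmac.new(SECRET_KEY, str(int(value)).encode(), hashlib.sha256).digest()
    return tag[0] % SELECT_MOD == 0

def embed(data: list) -> list:
    # Toy embedding: nudge the fractional part of each selected item.
    # Detection would re-run is_candidate over the suspect data and test
    # the selected items' fractions for the expected nudge.
    return [round(v + 0.001, 3) if is_candidate(v) else v for v in data]
```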

Relevance: 30.00%

Abstract:

The objective of the study was to determine, through meta-analysis, the rate of confirmed false reports of sexual assault to police. The meta-analysis began with a search for relevant articles, which revealed seven studies in which researchers or their trained helpers evaluated reported sexual assault cases to determine the rate of confirmed false reports. The meta-analysis calculated an overall rate and tested for possible moderators of effect size. The meta-analytic rate of false reports of sexual assault was .052 (95% CI .030, .089). The rates for the individual studies were heterogeneous, suggesting the possibility of moderators of the rate. However, none of the four possible moderators examined was significant: year of publication, whether the data set included information in addition to police reports, whether the study was conducted in the U.S. or elsewhere, and whether inter-rater reliabilities were reported. The meta-analysis of seven relevant studies shows that confirmed false allegations of sexual assault made to police occur at a significant rate. The total false reporting rate, including both confirmed and equivocal cases, would be greater than the 5 percent rate found here.
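For context, the Python sketch below shows the standard inverse-variance pooling of proportions on the logit scale that typically underlies such a meta-analytic rate. The study counts are invented, and a fixed-effect pool is used for brevity; the heterogeneity reported above would argue for a random-effects model.

```python
# Fixed-effect, inverse-variance pooling of proportions on the logit scale.
# The (false_reports, total_reports) pairs are invented for illustration;
# they are not the seven studies analysed in the paper.
import math

def expit(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

studies = [(8, 120), (5, 150), (11, 210)]

num = den = 0.0
for k, n in studies:
    p = (k + 0.5) / (n + 1.0)                      # continuity-corrected rate
    logit = math.log(p / (1.0 - p))
    var = 1.0 / (k + 0.5) + 1.0 / (n - k + 0.5)    # approx. variance of the logit
    w = 1.0 / var                                  # inverse-variance weight
    num += w * logit
    den += w

pooled, se = num / den, math.sqrt(1.0 / den)
print(f"pooled rate = {expit(pooled):.3f}")
print(f"95% CI = ({expit(pooled - 1.96 * se):.3f}, {expit(pooled + 1.96 * se):.3f})")
```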

Relevance: 30.00%

Abstract:

The impact of erroneous genotypes that have passed standard quality control (QC) can be severe in genome-wide association studies, genotype imputation, and the estimation of heritability and prediction of genetic risk based on single nucleotide polymorphisms (SNPs). To detect such genotyping errors, a simple two-locus QC method was developed and applied, based on the difference in the test statistic of association between single SNPs and pairs of SNPs. The proposed approach could detect many problematic SNPs with statistical significance in real data, even when standard single-SNP QC analyses failed to detect them. Depending on the data set used, the number of erroneous SNPs that were not filtered out by standard single-SNP QC but were detected by the proposed approach varied from a few hundred to thousands. Using simulated data, it was shown that the proposed method was powerful and performed better than the other existing methods tested. The power of the proposed approach to detect erroneous genotypes was approximately 80% for a 3% error rate per SNP. This novel QC approach is easy to implement and computationally efficient, and can lead to better-quality genotypes for subsequent genotype-phenotype investigations.
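A loose reading of the idea – contrasting single-SNP and two-SNP association statistics – might look like the Python sketch below; the trend statistic and the discrepancy measure are stand-ins, not the authors’ exact method.

```python
# Loose sketch, not the authors' exact statistic: contrast each SNP's
# single-locus association statistic with a joint two-locus one; a
# genotyping error at one SNP can surface as a large discrepancy.
import numpy as np

def trend_chi2(g: np.ndarray, y: np.ndarray) -> float:
    """1-df score-type statistic (n * r^2) for an additively coded SNP
    (0/1/2) against a binary phenotype; ~chi2(1) under no association."""
    r = np.corrcoef(g, y)[0, 1]
    return len(y) * r * r

def pair_discrepancy(g1: np.ndarray, g2: np.ndarray, y: np.ndarray) -> float:
    """Difference between the pair statistic (sum-coded genotypes) and the
    sum of the two single-SNP statistics; unusually large values would
    flag a SNP pair for closer inspection."""
    return trend_chi2(g1 + g2, y) - (trend_chi2(g1, y) + trend_chi2(g2, y))
```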

Relevance: 30.00%

Abstract:

The Bloom filter is a space-efficient randomized data structure for representing a set and supporting membership queries. Bloom filters intrinsically allow false positives; however, the space savings they offer outweigh this disadvantage if the false-positive rate is kept sufficiently low. Inspired by the recent application of the Bloom filter in a novel multicast forwarding fabric, this paper proposes a variant of the Bloom filter, the optihash. The optihash optimises the false-positive rate at the stage of Bloom filter formation, using the same amount of space at the cost of slightly more processing than the classic Bloom filter. Bloom filters are often used in situations where a fixed amount of space is the primary constraint. We present the optihash as a good alternative to the Bloom filter, since the amount of space is the same and the improvement in false positives can justify the additional processing. Specifically, we show via simulations and numerical analysis that with the optihash the occurrence of false positives can be reduced and controlled at the cost of a small amount of additional processing. The simulations are carried out for in-packet forwarding. In this framework, the Bloom filter is used as a compact link/route identifier and is placed in the packet header to encode the route. At each node, the Bloom filter is queried for membership in order to make forwarding decisions. A false positive in a forwarding decision translates into packets forwarded along an unintended outgoing link. By using the optihash, these false positives can be reduced. The optimization processing is carried out in an entity termed the Topology Manager, which is part of the control plane of the multicast forwarding fabric; this processing is carried out only on a per-session basis, not for every packet. The aim of this paper is to present the optihash and to evaluate its false-positive performance via simulations, in order to measure the influence of different parameters on the false-positive rate. The false-positive rate of the optihash is then compared with the false-positive probability of the classic Bloom filter.
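For reference, a minimal classic Bloom filter is sketched below in Python; the optihash itself is not reproduced. With m bits, k hash functions and n inserted elements, the false-positive probability is approximately (1 - e^(-kn/m))^k, which is the quantity the optihash optimises at formation time.

```python
# Minimal classic Bloom filter (the optihash variant is not reproduced here).
import hashlib

class BloomFilter:
    def __init__(self, m_bits: int, k_hashes: int):
        self.m, self.k = m_bits, k_hashes
        self.bits = bytearray(m_bits)          # m-bit array, all zeros

    def _positions(self, item: str):
        # k positions derived from salted SHA-256 digests of the item.
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.m

    def add(self, item: str) -> None:
        for pos in self._positions(item):
            self.bits[pos] = 1

    def __contains__(self, item: str) -> bool:
        # May answer True for an item never added (a false positive) but
        # never answers False for an item that was added.
        return all(self.bits[pos] for pos in self._positions(item))

# In-packet forwarding flavour: encode a route's outgoing links, then query.
bf = BloomFilter(m_bits=1024, k_hashes=5)
bf.add("link-3")
print("link-3" in bf)   # True: no false negatives
print("link-7" in bf)   # False with high probability; True would be a false
                        # positive, i.e. a packet sent out an unintended link
```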

Relevance: 30.00%

Abstract:

Decentralisation, provincial government, and regional autonomy continue as influential factors in Papua New Guinea’s political economy.  The role played in creating PNG’s provincial government system by separatist movements in East New Britain, Bougainville and elsewhere is acknowledged.  However, as the Constitutional Planning Committee (CPC) discovered during its program of consultations with the Papua New Guinean people from 1972 to 1974, there was a strong groundswell around the country for district-level governments.  This article investigates how the CPC stimulated discussion of this issue through its own activities, and how the people in their discussion groups responded to the CPC’s ‘Discussion Paper on Relations Between the Central Government and Other Levels of Government’.

Relevance: 30.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance: 20.00%

Abstract:

Faces are complex patterns that often differ in only subtle ways. Face recognition algorithms have difficulty in coping with differences in lighting, cameras, pose, expression, etc. We propose a novel approach for facial recognition based on a new feature extraction method called fractal image-set encoding. This feature extraction method is a specialized fractal image coding technique that makes fractal codes more suitable for object and face recognition. A fractal code of a gray-scale image can be divided into two parts – geometrical parameters and luminance parameters. We show that fractal codes for an image are not unique and that we can change the set of fractal parameters without significant change in the quality of the reconstructed image. Fractal image-set coding keeps the geometrical parameters the same for all images in the database. Differences between images are captured in the non-geometrical or luminance parameters – which are faster to compute. Results on a subset of the XM2VTS database are presented.
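The split between geometrical and luminance parameters can be illustrated with the standard least-squares luminance fit used in fractal block coding; this Python sketch is generic, not the paper’s implementation.

```python
# Generic sketch of the luminance half of a fractal code (not the paper's
# implementation): with the geometric mapping of a domain block D onto a
# range block R held fixed, the luminance parameters are the contrast s and
# brightness o minimising ||s*D + o - R||^2. Fractal image-set coding shares
# the geometry across all images, so only (s, o) per block vary, and these
# are cheap to compute by least squares.
import numpy as np

def luminance_params(domain_block: np.ndarray, range_block: np.ndarray):
    d = domain_block.astype(float).ravel()
    r = range_block.astype(float).ravel()
    s, o = np.polyfit(d, r, 1)   # least-squares fit of r ~ s*d + o
    return s, o
```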

Relevance: 20.00%

Abstract:

An information filtering (IF) system monitors an incoming document stream to find the documents that match the information needs specified by the user profiles. Learning to use the user profiles effectively is one of the most challenging tasks in developing an IF system. With the document selection criteria better defined on the basis of users’ needs, filtering large streams of information can be more efficient and effective. To learn the user profiles, term-based approaches have been widely used in the IF community because of their simplicity and directness. Term-based approaches are relatively well established; however, they have problems when dealing with polysemy and synonymy, which often lead to an information overload problem. Recently, pattern-based approaches (or Pattern Taxonomy Models (PTM) [160]) have been proposed for IF by the data mining community. These approaches are better at capturing semantic information and have shown encouraging results for improving the effectiveness of IF systems. On the other hand, pattern discovery from large data streams is not computationally efficient, and these approaches have had to deal with low-frequency pattern issues. The measures used by the data mining techniques to learn the profile (for example, “support” and “confidence”) have turned out to be unsuitable for filtering and can lead to a mismatch problem.

This thesis uses rough set-based (term-based) reasoning and pattern mining as a unified framework for information filtering to overcome the aforementioned problems. The system consists of two stages: topic filtering and pattern mining. The topic filtering stage is intended to minimize information overload by filtering out the most likely irrelevant information based on the user profiles; a novel user-profile learning method and a theoretical model of the threshold setting were developed using rough set decision theory. The second stage (pattern mining) aims to solve the problem of information mismatch and is precision-oriented: a new document-ranking function was derived by exploiting the patterns in the pattern taxonomy, assigning higher scores to the most likely relevant documents. Because relatively few documents remain after the first stage, the computational cost is markedly reduced; at the same time, pattern discovery yields more accurate results. The overall performance of the system was improved significantly.

The new two-stage information filtering model was evaluated by extensive experiments based on well-known IR benchmarking processes, using the latest version of the Reuters dataset, the Reuters Corpus Volume 1 (RCV1). The performance of the new two-stage model was compared with both term-based and data mining-based IF models. The results demonstrate that the proposed information filtering system significantly outperforms the other IF systems, such as the traditional Rocchio IF model, state-of-the-art term-based models including BM25 and Support Vector Machines (SVM), and the Pattern Taxonomy Model (PTM).
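Schematically, the two-stage design reads as in the Python sketch below; both scoring functions are placeholders, since the rough-set threshold model and the pattern-taxonomy ranking are the thesis’s own contributions and are not reproduced here.

```python
# Schematic of the two-stage pipeline described in the abstract. The term
# and pattern scoring functions are stand-ins, not the thesis's models.
from typing import Callable, Iterable, List

def two_stage_filter(docs: Iterable[str],
                     term_score: Callable[[str], float],
                     pattern_score: Callable[[str], float],
                     threshold: float) -> List[str]:
    # Stage 1 (topic filtering): a cheap term-based pass discards the most
    # likely irrelevant documents, easing the load on the next stage.
    survivors = [d for d in docs if term_score(d) >= threshold]
    # Stage 2 (pattern mining): a precision-oriented re-ranking of the much
    # smaller surviving set by a pattern-based relevance function.
    return sorted(survivors, key=pattern_score, reverse=True)
```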