801 results for Tate Gallery


Relevance: 10.00%

Abstract:

Australian research and technological solutions are now being applied throughout the world.

Relevance: 10.00%

Abstract:

Robust facial expression recognition (FER) under occluded face conditions is challenging: it requires robust feature-extraction algorithms and investigation of how different types of occlusion affect recognition performance. Previous FER studies in this area have been limited: they have focused on recovery strategies for the loss of local texture information, tested only a few types of occlusion, and predominantly used a matched train-test strategy. This paper proposes a robust approach that employs a Monte Carlo algorithm to extract a set of Gabor-based part-face templates from gallery images and converts these templates into template match distance features. The resulting feature vectors are robust to occlusion because occluded parts are covered by some, but not all, of the random templates. The method is evaluated using facial images with occluded regions around the eyes and the mouth, randomly placed occlusion patches of different sizes, and near-realistic occlusion of the eyes with clear and solid glasses. Both matched and mismatched train-test strategies are adopted to analyze the effects of such occlusion. Overall recognition performance and the performance for each facial expression are investigated. Experimental results on the Cohn-Kanade and JAFFE databases demonstrate the high robustness and fast processing speed of our approach, and provide useful insight into the effects of occlusion on FER. The parameter sensitivity results demonstrate a degree of robustness to changes in the orientation and scale of the Gabor filters, the size of the templates, and the occlusion ratios. Performance comparisons with previous approaches show that the proposed method is more robust to occlusion, with smaller reductions in accuracy from occlusion of the eyes or mouth.
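The template-based feature idea can be sketched in a few lines. This is a hedged illustration only: it samples raw-pixel part-face templates at random (the Monte Carlo step) and uses each template's best match distance within a probe image as one feature. The actual method applies Gabor filtering first, and all function names here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def extract_templates(gallery_img, n_templates=8, size=6):
    """Randomly sample square part-face templates (Monte Carlo step)."""
    h, w = gallery_img.shape
    templates = []
    for _ in range(n_templates):
        y = rng.integers(0, h - size)
        x = rng.integers(0, w - size)
        templates.append(gallery_img[y:y + size, x:x + size])
    return templates

def template_match_features(probe_img, templates, stride=2):
    """One feature per template: its minimum match distance anywhere in
    the probe image. Occlusion affects only the templates that happen
    to overlap the occluded region, so the full vector stays usable."""
    h, w = probe_img.shape
    feats = []
    for t in templates:
        s = t.shape[0]
        best = np.inf
        for y in range(0, h - s + 1, stride):
            for x in range(0, w - s + 1, stride):
                d = np.linalg.norm(probe_img[y:y + s, x:x + s] - t)
                best = min(best, d)
        feats.append(best)
    return np.array(feats)
```

When the probe is the gallery image itself, every template finds an exact match and the feature vector is all zeros; occlusion raises only the entries whose templates overlap the occluded patch.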

Relevance: 10.00%

Abstract:

Objectives: Concentrations of troponin measured with high-sensitivity troponin assays are raised in a number of emergency department (ED) patients; however, many of these patients are not diagnosed with acute myocardial infarction (AMI). Clinical comparisons between the early use (2 h after presentation) of high-sensitivity cardiac troponin T (hs-cTnT) and I (hs-cTnI) assays for the diagnosis of AMI have not been reported. Design and methods: Early (0 h and 2 h) hs-cTnT and hs-cTnI assay results in 1571 ED patients with potential acute coronary syndrome (ACS) without ST elevation on the electrocardiogram (ECG) were evaluated. The primary outcome was a diagnosis of index MI adjudicated by cardiologists using the local cTnI assay results taken ≥6 h after presentation, ECGs and clinical information. Stored samples were later analysed with the hs-cTnT and hs-cTnI assays. Results: The area under the ROC curve for AMI (204 patients; 13.0%) for hs-cTnT and hs-cTnI after 2 h was 0.95 (95% CI: 0.94–0.97) and 0.98 (95% CI: 0.97–0.99) respectively. The sensitivity, specificity, PLR, and NLR of hs-cTnT and hs-cTnI for AMI after 2 h were 94.1% (95% CI: 90.0–96.6) and 95.6% (95% CI: 91.8–97.7), 79.0% (95% CI: 76.8–81.1) and 92.5% (95% CI: 90.9–93.7), 4.48 (95% CI: 4.02–5.00) and 12.86 (95% CI: 10.51–15.31), and 0.07 (95% CI: 0.04–0.13) and 0.05 (95% CI: 0.03–0.09) respectively. Conclusions: Exclusion of AMI 2 h after presentation in emergency patients with possible ACS can be achieved using hs-cTnT or hs-cTnI assays. The significant difference in specificity between these assays is clinically relevant: if the hs-cTnT assay is used, further clinical assessment would be required in a larger proportion of patients.
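The PLR and NLR figures follow directly from the reported sensitivity and specificity. A minimal check using the 2 h hs-cTnT point estimates from the abstract (the function name is mine):

```python
def likelihood_ratios(sensitivity, specificity):
    """PLR = sens / (1 - spec); NLR = (1 - sens) / spec."""
    plr = sensitivity / (1.0 - specificity)
    nlr = (1.0 - sensitivity) / specificity
    return plr, nlr

# 2 h hs-cTnT point estimates: sensitivity 94.1%, specificity 79.0%
plr, nlr = likelihood_ratios(0.941, 0.790)
print(round(plr, 2), round(nlr, 2))  # 4.48 0.07, matching the reported values
```

(The hs-cTnI ratios reproduce slightly less exactly from the rounded point estimates, which is expected when recomputing from two-decimal inputs.)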

Relevance: 10.00%

Abstract:

Traditional nearest points methods use all the samples in an image set to construct a single convex or affine hull model for classification. However, strong artificial features and noisy data may be generated from combinations of training samples when significant intra-class variations and/or noise occur in the image set. Existing multi-model approaches extract local models by clustering each image set individually only once, with fixed clusters used for matching with various image sets. This may not be optimal for discrimination, as undesirable environmental conditions (e.g., illumination and pose variations) may result in the two closest clusters representing different characteristics of an object (e.g., a frontal face being compared to a non-frontal face). To address this problem, we propose a novel approach that enhances nearest points based methods by integrating affine/convex hull classification with an adapted multi-model approach. We first extract multiple local convex hulls from a query image set via maximum margin clustering to diminish the artificial variations and constrain the noise in local convex hulls. We then propose adaptive reference clustering (ARC) to constrain the clustering of each gallery image set by forcing the clusters to resemble the clusters in the query image set. By applying ARC, noisy clusters in the query set can be discarded. Experiments on the Honda, MoBo and ETH-80 datasets show that the proposed method outperforms single-model approaches and other recent techniques, such as Sparse Approximated Nearest Points, the Mutual Subspace Method and Manifold Discriminant Analysis.
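For context, the nearest-points distance between two affine hulls that such methods build on reduces to a least-squares problem. A minimal numpy sketch under my own naming; the paper's contribution is the multi-model machinery (maximum margin clustering plus ARC) layered on top of this kind of hull distance, not this formula itself:

```python
import numpy as np

def affine_hull_distance(X, Y, tol=1e-10):
    """Nearest-point distance between the affine hulls of two image sets.
    Columns of X and Y are vectorised images."""
    mu_x, mu_y = X.mean(axis=1), Y.mean(axis=1)
    # Orthonormal bases for the directions each set spans about its mean,
    # truncated to numerical rank
    Ux, sx, _ = np.linalg.svd(X - mu_x[:, None], full_matrices=False)
    Uy, sy, _ = np.linalg.svd(Y - mu_y[:, None], full_matrices=False)
    Ux, Uy = Ux[:, sx > tol], Uy[:, sy > tol]
    # Solve min_{a,b} ||(mu_x + Ux a) - (mu_y + Uy b)|| as one LS system
    A = np.hstack([Ux, -Uy])
    coef, *_ = np.linalg.lstsq(A, mu_y - mu_x, rcond=None)
    return np.linalg.norm(A @ coef - (mu_y - mu_x))
```

Two hulls that intersect give distance zero; two parallel planar hulls separated along a common normal give exactly the separation.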

Relevance: 10.00%

Abstract:

Existing multi-model approaches for image set classification extract local models by clustering each image set individually only once, with fixed clusters used for matching with other image sets. However, this may result in the two closest clusters representing different characteristics of an object, due to different undesirable environmental conditions (such as variations in illumination and pose). To address this problem, we propose to constrain the clustering of each query image set by forcing the clusters to resemble the clusters in the gallery image sets. We first define a Frobenius norm distance between subspaces over Grassmann manifolds based on reconstruction error. We then extract local linear subspaces from a gallery image set via sparse representation. For each local linear subspace, we adaptively construct the corresponding closest subspace from the samples of a probe image set by joint sparse representation. We show that by minimising the sparse representation reconstruction error, we approach the nearest point on a Grassmann manifold. Experiments on the Honda, ETH-80 and Cambridge-Gesture datasets show that the proposed method consistently outperforms several other recent techniques, such as Affine Hull based Image Set Distance (AHISD), Sparse Approximated Nearest Points (SANP) and Manifold Discriminant Analysis (MDA).
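One standard Frobenius-norm distance between subspaces on a Grassmann manifold is the projection metric, sketched below. This is a common textbook choice used here for illustration; it is not necessarily identical to the paper's reconstruction-error formulation:

```python
import numpy as np

def projection_fnorm(U1, U2):
    """Frobenius-norm distance between the subspaces spanned by the
    orthonormal columns of U1 and U2: ||U1 U1^T - U2 U2^T||_F."""
    return np.linalg.norm(U1 @ U1.T - U2 @ U2.T)

u = np.array([[1.0], [0.0], [0.0]])
v = np.array([[0.0], [1.0], [0.0]])
print(projection_fnorm(u, u))  # 0.0 for identical subspaces
print(projection_fnorm(u, v))  # sqrt(2) for orthogonal lines
```

Because it depends only on the projection matrices, the metric is invariant to the choice of basis within each subspace, which is what makes it a distance on the Grassmann manifold rather than on matrices.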

Relevance: 10.00%

Abstract:

This article summarizes a panel held at the 15th Pacific Asia Conference on Information Systems (PACIS) in Brisbane, Australia, in 2011. The panelists proposed a new research agenda for information systems success research. The DeLone and McLean IS Success Model has been one of the most influential models in information systems research. However, the nature of information systems continues to change. Information systems are increasingly implemented across layers of infrastructure and application architecture. The diffusion of information systems into many spheres of life means that information systems success needs to be considered in multiple contexts. Services play a much more prominent role in the economies of countries, making the “service” context of information systems increasingly important. Further, improved understandings of theory and measurement offer new opportunities for novel approaches and new research questions about information systems success.

Relevance: 10.00%

Abstract:

With specific reference to the writing of Dan Graham and the experiences of creative practice, this paper will elaborate an account of studio practice as a topology - a theory drawn from mathematics in which space is understood not as a static field but in terms of properties of connectedness, movement and differentiation. This paper will trace a brief sequence of topological formulations to draw together the expression of topology as form and its structural dimension as a methodology in the specific context of the author’s studio practice. In so doing, this paper seeks to expand the notion of topology in art beyond its association with Conceptual Art of the 1960s and 70s to propose that topology provides a dynamic theoretical model for apprehending the generative ‘logic’ that gives direction and continuity to the art-making process.

Relevance: 10.00%

Abstract:

Dodecylamine was successfully intercalated into the layer space of kaolinite by utilizing the methanol treated kaolinite–dimethyl sulfoxide (DMSO) intercalation complex as an intermediate. The basal spacing of kaolinite, measured by X-ray diffraction (XRD), increased from 0.72 nm to 4.29 nm after the intercalation of dodecylamine. Also, the significant variation observed in the Fourier Transform Infrared Spectroscopy (FTIR) spectra of kaolinite when intercalated with dodecylamine verified the feasibility of intercalation of dodecylamine into kaolinite. Isothermal-isobaric (NPT) molecular dynamics simulation with the use of Dreiding force field was performed to probe into the layering behavior and structure of nanoconfined dodecylamine in the kaolinite gallery. The concentration profiles of the nitrogen atom, methyl group and methylene group of intercalated dodecylamine molecules in the direction perpendicular to the kaolinite basal surface indicated that the alkyl chains within the interlayer space of kaolinite exhibited an obvious layering structure. However, the unified bilayer, pseudo-trilayer, or paraffin-type arrangements of alkyl chains deduced based on their chain length combined with the measured basal spacing of organoclays were not found in this study. The alkyl chains aggregated to a mixture of ordered paraffin-type-like structure and disordered gauche conformation in the middle interlayer space of kaolinite, and some alkyl chains arranged in two bilayer structures, in which one was close to the silica tetrahedron surface, and the other was close to the alumina octahedron surface with their alkyl chains parallel to the kaolinite basal surface.
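The XRD basal spacings map to diffraction angles through Bragg's law, nλ = 2d sin θ. A quick check, assuming a first-order reflection and Cu Kα radiation (λ ≈ 0.15406 nm; the source does not state the radiation used, so this wavelength is an assumption):

```python
import math

def two_theta_deg(d_spacing_nm, wavelength_nm=0.15406):
    """2-theta (degrees) for basal spacing d, first-order Bragg reflection:
    lambda = 2 d sin(theta)."""
    return 2.0 * math.degrees(math.asin(wavelength_nm / (2.0 * d_spacing_nm)))

print(round(two_theta_deg(0.72), 1))   # 12.3 -- pristine kaolinite (001) peak
print(round(two_theta_deg(4.29), 2))   # 2.06 -- low-angle peak after intercalation
```

The shift from about 12.3° to about 2° of 2θ is why such large-spacing intercalation complexes are read off the low-angle region of the pattern.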

Relevance: 10.00%

Abstract:

An increasing range of services are now offered via online applications and e-commerce websites. However, problems with online services still occur at times, even for the best service providers, due to technical failures, informational failures, or a lack of required website functionality. Moreover, the widespread and increasing implementation of web services means that service failures are both more likely to occur and more likely to have serious consequences. In this paper we first develop a digital service value chain framework based on existing service delivery models adapted for digital services. We then review the current literature on service failure prevention and provide a typology of technologies and approaches that can be used to prevent failures of different types (functional, informational, system) that can occur at different stages of web service delivery. This contributes to theory by relating specific technologies and technological approaches to the point in the value chain framework where they will have the maximum impact. Our typology can also be used to guide the planning, justification and design of robust, reliable web services.
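The three failure types in the typology can be captured as a simple enumeration. A hypothetical sketch: the category names below follow the abstract, but the example prevention approaches are illustrative placeholders, not the paper's actual stage-by-stage mappings:

```python
from enum import Enum

class FailureType(Enum):
    """The three web service failure types named in the typology."""
    FUNCTIONAL = "functional"        # missing or broken site functionality
    INFORMATIONAL = "informational"  # wrong or inadequate content
    SYSTEM = "system"                # technical/infrastructure failure

# Illustrative placeholders only -- the paper ties prevention technologies
# to specific stages of the digital service value chain.
example_prevention = {
    FailureType.FUNCTIONAL: "pre-release functional testing",
    FailureType.INFORMATIONAL: "content validation and review",
    FailureType.SYSTEM: "redundancy, monitoring and failover",
}
```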

Relevance: 10.00%

Abstract:

The nature of services and service delivery has been changing rapidly since the 1980s, when many seminal papers in services research were published. Services are increasingly digital, or have a digital component. Further, a large and heterogeneous literature with competing and overlapping definitions, many of which are dated and inappropriate to contemporary digital service offerings, is impeding progress in digital services research. In this conceptual paper, we offer a critical review of some existing conceptualizations of services and digital services. We argue that an inductive approach to understanding cognition about digital services is required to develop a taxonomy of digital services and a new vocabulary. We argue that this is a prerequisite to theorizing about digital services, including understanding quality drivers, value propositions, and quality determinants for different digital service types. We propose a research approach for reconceptualising digital services and service quality, and outline methodological approaches and outcomes.

Relevance: 10.00%

Abstract:

Anuradha Mathur and Dilip da Cunha theorise in their work on cities and flooding that it is not the floodwaters themselves that threaten lives and homes; the real cause of danger in natural disaster is the fixity of modern civilisation. Their work traces the fluidity of the boundaries between 'dry' and 'wet' land, challenging the deficiencies of traditional cartography in representing the extents of bodies of water. Mathur and da Cunha propose a process of unthinking to address the redevelopment of communities in the aftermath of natural disaster. By documenting the path of floodwaters in non-Euclidean space, they propose a more appropriate response to flooding. This research focuses on the documentation of flooding in the interior of dwellings: an extreme condition of damage by external forces in an environment designed to protect from those very elements. Because floodwaters do not discriminate between interior and exterior, they move between structures with disregard for the systems of space we have in place. With the rapid clean-up that follows flood damage, little material evidence is left for post-mortem examination. This is especially the case for the flood-damaged interior: piles of materials susceptible to the elements, furniture, joinery and personal objects line curbsides awaiting disposal. There is a missed opportunity in examining the interior in the aftermath of flood; in the way that Mathur and da Cunha investigate floods and the design of cities, the flooded interior proffers an undesigned interior to study. In the absence of an intact flood-damaged interior, this research relies on two artists' documentation of the flooded interior. The first case study is the mimetic scenographic interior of a flood-damaged office exhibited in a Bangkok art gallery by the group _Proxy in 2011.
The second case study is Robert Polidori's photographic exhibition in New Orleans, described by Julianna Preston as 'a series of interiors undetected by satellite imaging or storm radar. More telling, more dramatic, more unnerving, more alarming, they force a disturbance of what is familiar'.

Relevance: 10.00%

Abstract:

The interactive art system +-now has a tangible interface augmented with real-time computer graphics elements. It concerns creative audience experiences facilitated through perceptual emergence.

Relevance: 10.00%

Abstract:

A single-channel video projection with image, text and sound components, projected so as to entirely fill a 3 x 3.5 metre wall in a 6 x 3.5 metre gallery space. The work deals with the role of humour and the fictocritical in exploring the relationship between politics and art.

Relevance: 10.00%

Abstract:

A collaborative, participatory denim craft station installed by the LEVEL feminist collective in the Q[ARI] Project - Artist-Run Initiatives exhibition at the Griffith University Art Gallery in 2013. The exhibition was funded by the Queensland Government through Arts Queensland, and featured seven artist-run initiatives from Brisbane.

Relevance: 10.00%

Abstract:

This paper examines whether managers strategically time their earnings forecasts (MEFs) as litigation risk increases. We find that as litigation risk increases, the propensity to delay a forecast until after the market closes (AMC) or until a Friday decreases, but not proportionally more for bad news than for good news. How costly this behaviour is to investors is questionable, as share price returns do not reveal any under-reaction to strategically timed bad-news MEFs released AMC. We also find evidence consistent with managers timing their MEFs during a natural no-trading period to better disseminate information.