997 results for Conference interpreting


Relevance:

20.00%

Publisher:

Abstract:

“Who are you? How do you define yourself, your identity?” With these words Allan Moore opens his exhaustive new work proposing a more comprehensive approach to the musicological analysis of popular song. The last three decades have seen a huge expansion of the literature on the sociological and cultural meanings of pop, but Moore’s book is not another exploration of this field, although some of these ideas are incorporated in his work. Rather, he addresses the limitations of conventional musicology when dealing particularly with songs: “I address popular song rather than popular music. The defining feature of popular song lies in the interaction of everyday words and music… it is how they interact that produces significance in the experience of song”.

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a graph-based method for weighting medical concepts in documents for the purposes of information retrieval. Medical concepts are extracted from free-text documents using a state-of-the-art technique that maps n-grams to concepts from the SNOMED CT medical ontology. In our graph-based concept representation, concepts are vertices in a graph built from a document, and edges represent associations between concepts. This representation naturally captures dependencies between concepts, an important requirement for interpreting medical text and a feature lacking in bag-of-words representations. We apply existing graph-based term weighting methods to weight medical concepts. Using concepts rather than terms addresses vocabulary mismatch and encapsulates terms belonging to a single medical entity within a single concept. In addition, we extend previous graph-based approaches by injecting domain knowledge that estimates the importance of a concept within the global medical domain. Retrieval experiments on the TREC Medical Records collection show our method outperforms both term and concept baselines. More generally, this work provides a means of integrating background knowledge contained in medical ontologies into data-driven information retrieval approaches.
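
The graph construction and weighting step can be sketched in a few lines. Below is a minimal illustration using networkx, assuming concept associations have already been extracted (the SNOMED CT mapping step is represented by a hypothetical pre-computed list of concept pairs). PageRank with a personalization vector stands in for one of the existing graph-based weighting methods and for the injected domain knowledge; it is not necessarily the exact method used in the paper.

```python
# A minimal sketch of graph-based concept weighting. Concept extraction is
# assumed to have happened upstream; the concept names below are invented.
import networkx as nx

def weight_concepts(cooccurring_pairs, domain_importance=None):
    """Weight medical concepts by centrality in a document-level graph.

    cooccurring_pairs : iterable of (concept_a, concept_b) tuples, one per
        observed association (e.g. co-occurrence within a sentence window).
    domain_importance : optional dict mapping concept -> prior importance in
        the global medical domain, used to bias the random walk.
    """
    graph = nx.Graph()
    graph.add_edges_from(cooccurring_pairs)

    # PageRank over the concept graph; the personalization vector is one way
    # to inject domain knowledge about globally important concepts.
    return nx.pagerank(graph, alpha=0.85, personalization=domain_importance)

# Example: three concepts from a hypothetical discharge summary.
pairs = [("diabetes_mellitus", "insulin"), ("insulin", "hypoglycaemia"),
         ("diabetes_mellitus", "hypoglycaemia")]
weights = weight_concepts(pairs, {"diabetes_mellitus": 0.6,
                                  "insulin": 0.3, "hypoglycaemia": 0.1})
print(weights)
```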

Relevance:

20.00%

Publisher:

Abstract:

Introduction: QC and EQA are integral to good pathology laboratory practice. Medical Laboratory Science students undertake a project exploring the internal QC and EQA procedures used in chemical pathology laboratories. Each student represents an individual lab, and the class group represents a peer group of labs performing the same assay using the same method.

Methods: Using a manual BCG assay for serum albumin, normal and abnormal controls are run with a patient sample over 7 weeks. The QC results are assessed each week using calculated z-scores and both 2S and 3S control rules to determine whether a run is ‘in control’. At the end of the 7 weeks, a completed LJ chart is assessed using the Westgard multirules. Students investigate causes of error and the implications for both lab practice and patient care if runs are not ‘in control’. Twice in the 7 weeks, two EQA samples (with target values unknown) are assayed alongside the weekly QC and patient samples.

Results: Results from each student are collated and form the basis of an EQA program. ALP are provided, and students complete a Youden plot, which is used to analyse the performance of each ‘lab’ and of the method, to identify bias. Students explore the possible clinical implications of a biased method and address the actions that should be taken if a lab is not in consensus with its peer group.

Conclusion: This project is a model of ‘real world’ practice in which students demonstrate an understanding of the importance of QC procedures in a pathology laboratory; apply and interpret statistics, QC rules and charts; and apply critical thinking and analytical skills to quality performance data, making recommendations for further practice and improving their technical competence and confidence.
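
As a rough illustration of the weekly run assessment, the sketch below computes z-scores and applies simple 2S (warning) and 3S (rejection) checks. The target means, SDs and control results are illustrative values, not data from the actual class exercise.

```python
# A minimal sketch of the weekly QC run assessment: z-scores per control
# level, with 2S treated as a warning and 3S as a rejection. All numbers
# below are illustrative, not real assay targets.

def assess_run(controls):
    """controls: list of (result, target_mean, target_sd) per QC level.

    Returns the z-scores and a run status based on simple 2S/3S rules.
    """
    zs = [(result - mean) / sd for result, mean, sd in controls]
    if any(abs(z) > 3 for z in zs):
        return zs, "reject"      # a control exceeds 3S: run out of control
    if any(abs(z) > 2 for z in zs):
        return zs, "warning"     # a control exceeds 2S: inspect the LJ chart
    return zs, "in control"

# Example week: normal and abnormal albumin controls (g/L, illustrative).
controls = [(37.1, 38.0, 1.2),   # normal control: result, target mean, SD
            (21.5, 24.0, 1.0)]   # abnormal control
zs, status = assess_run(controls)
print([round(z, 2) for z in zs], status)  # [-0.75, -2.5] warning
```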

Relevance:

20.00%

Publisher:

Abstract:

CTAC2012 was the 16th biennial Computational Techniques and Applications Conference, held at Queensland University of Technology from 23 to 26 September 2012. The ANZIAM Special Interest Group in Computational Techniques and Applications is responsible for the CTAC meetings, the first of which was held in 1981.

Relevance:

20.00%

Publisher:

Abstract:

May was a particularly busy month with lots of exciting architectural things happening in Brisbane, including the sell-out 2012 National Architecture Conference. The total number of conference attendees was 1,625, the largest attendance at any Australian National Architecture Conference to date. This was the first time the National Architecture Conference had been held in Brisbane in over 20 years, and the enormous turnout of 947 Queenslanders was testament to the positive decision to include Brisbane as a conference venue. The theme of this year’s conference was ‘experience’. Building on ideas introduced in the recent ‘natural artifice’ conference, creative directors Shane Thompson, Michael Rayner and Peter Skinner focused closely on the real, sensed experience of architecture within its natural and constructed settings, and on the experience of designing and making architecture. The conference attracted a variety of high-profile international speakers, including architect and professor Wang Shu, the 2012 Pritzker Architecture Prize Laureate and co-founder of the Amateur Architecture Studio in China. Other highlights included presentations from Peter Rich [South Africa], Kathryn Findlay [United Kingdom], Rachel Neeson [Australia], Anuradha Mathur & Dilip da Cunha [United States] and Kjetil Thorsen [Norway]. QUT had a strong presence at the conference. In addition to pleasing attendance from QUT School of Design students and staff, our Head of School, Professor Paul Sanders, was given the honourable task of introducing keynote speaker Peter Rich and facilitating the Q&A session after his presentation, which received a standing ovation. Many events were organised for students and young architects by QUT’s SONA reps, including a masterclass, the opening party, the collaborative design and construction of the SONA Pavilion and, finally, the all-important SONA Hangover Breakfast the morning after the closing party. The 2012 National Architecture Conference was truly memorable and an experience not to be missed. I encourage anyone with a passion for architecture and a desire to be completely inspired by current and emerging leaders in our exciting profession to start making plans to attend next year’s conference.

Relevance:

20.00%

Publisher:

Abstract:

Emerging sciences, such as conceptual cost estimating, seem to have to go through two phases. The first phase involves reducing the field of study down to its basic ingredients: from systems development to technological development (techniques) to theoretical development. The second phase operates in the opposite direction, building up techniques from theories, and systems from techniques. Cost estimating is clearly and distinctly still in the first phase. A great deal of effort has been put into the development of both manual and computer-based cost estimating systems during this first phase and, to a lesser extent, into the development of a range of techniques that can be used (see, for instance, Ashworth & Skitmore, 1986). Theoretical developments have not, as yet, been forthcoming. All theories need the support of some observational data, and cost estimating is not likely to be an exception. These data do not need to be complete in order to build theories. Just as it is possible to construct an image of a prehistoric animal such as the brontosaurus from only a few key bones and relics, so a theory of cost estimating may possibly be founded on a few factual details. The eternal argument of empiricists and deductionists is that, as theories need factual support, so we need theories in order to know which facts to collect. In cost estimating, the basic facts of interest concern accuracy, the cost of achieving this accuracy, and the trade-off between the two. When cost estimating theories do begin to emerge, it is highly likely that these relationships will be central features. This paper presents some of the facts we have been able to acquire regarding one part of this relationship: accuracy and its influencing factors. Although some of these factors, such as the amount of information used in preparing the estimate, will have cost consequences, we have not yet reached the stage of quantifying these costs. Indeed, as will be seen, many of the factors do not involve any substantial cost considerations. The absence of any theory is reflected in the arbitrary manner in which the factors are presented. Rather, the emphasis here is on the consideration of purely empirical data concerning estimating accuracy. The essence of good empirical research is to minimize the role of the researcher in interpreting the results of the study. Whilst space does not allow a full treatment of the material in this manner, the principle has been adopted as closely as possible to present results in an uncleaned and unbiased way. In most cases the evidence speaks for itself. The first part of the paper reviews most of the empirical evidence we have located to date; knowledge of any work done but omitted here would be most welcome. The second part presents an analysis of some recently acquired data pertaining to this growing subject.

Relevance:

20.00%

Publisher:

Abstract:

In this presentation, renowned arts practitioner Sean Mee and Nigel Lavender, Executive Director of the Queensland Music Festival, talk about how community arts practice can be used to build cultural capital in communities. Using examples such as the large-scale musicals The Road We're On (Charleville) and Behind the Cane (Bowen), Mee and Lavender highlight the importance of community-driven narrative and participation.

Relevance:

20.00%

Publisher:

Abstract:

This paper offers a satirical reflection on participant experiences at an accounting conference.

Relevance:

20.00%

Publisher:

Abstract:

Understanding the effects of design interventions on the meanings people associate with landscapes is important to critical and ethical practice in landscape architecture. Case study research has become a common way researchers evaluate design interventions and related issues, with a standardised method promoted by the Landscape Architecture Foundation (LAF). However, the method is somewhat undeveloped for interpreting landscape meanings – something most commonly undertaken as historic landscape studies, but not as studies of design effect. This research proposes a new method for such interpretation, using a case study of Richard Haag’s radical 1971 proposal for a new kind of park on the site of the former Seattle gas works.

Relevance:

20.00%

Publisher:

Abstract:

Purpose: Flat-detector, cone-beam computed tomography (CBCT) has enormous potential to improve the accuracy of treatment delivery in image-guided radiotherapy (IGRT). To assist radiotherapists in interpreting these images, we use a Bayesian statistical model to label each voxel according to its tissue type. Methods: The rich sources of prior information in IGRT are incorporated into a hidden Markov random field (MRF) model of the 3D image lattice. Tissue densities in the reference CT scan are estimated using inverse regression and then rescaled to approximate the corresponding CBCT intensity values. The treatment planning contours are combined with published studies of physiological variability to produce a spatial prior distribution for changes in the size, shape and position of the tumour volume and organs at risk (OAR). The voxel labels are estimated using the iterated conditional modes (ICM) algorithm. Results: The accuracy of the method has been evaluated using 27 CBCT scans of an electron density phantom (CIRS, Inc. model 062). The mean voxel-wise misclassification rate was 6.2%, with a Dice similarity coefficient of 0.73 for liver, muscle, breast and adipose tissue. Conclusions: By incorporating prior information, we are able to successfully segment CBCT images. This could be a viable approach for automated, online image analysis in radiotherapy.
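
For a sense of how the ICM labelling step works, here is a heavily simplified sketch on a synthetic volume: Gaussian intensity likelihoods per tissue class plus a Potts smoothing prior, with a synchronous update in place of ICM's usual sequential sweep. The class means, noise SD and smoothing weight beta are illustrative, and the paper's spatial prior from planning contours is omitted.

```python
# A simplified sketch of MRF-based voxel labelling via (synchronous) ICM:
# unary term = Gaussian negative log-likelihood of each tissue class,
# pairwise term = Potts penalty for disagreeing 6-connected neighbours.
# Parameters are illustrative, not the values used in the paper.
import numpy as np

def icm_segment(image, class_means, noise_sd=25.0, beta=1.5, n_iter=5):
    """Label each voxel of a 3D image by a synchronous ICM approximation."""
    # Initialise with the maximum-likelihood label at each voxel.
    labels = np.argmin(np.abs(image[..., None] - class_means), axis=-1)
    offsets = [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
               (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    for _ in range(n_iter):
        energies = np.empty(image.shape + (len(class_means),))
        for k, mu in enumerate(class_means):
            unary = (image - mu) ** 2 / (2 * noise_sd ** 2)
            disagree = np.zeros(image.shape)
            for shift in offsets:  # np.roll wraps at edges; fine for a sketch
                disagree += (np.roll(labels, shift, axis=(0, 1, 2)) != k)
            energies[..., k] = unary + beta * disagree
        labels = np.argmin(energies, axis=-1)
    return labels

# Example: tiny synthetic volume with two "tissues" plus Gaussian noise.
rng = np.random.default_rng(0)
truth = np.zeros((8, 8, 8), dtype=int)
truth[:, :, 4:] = 1
image = np.where(truth == 1, 180.0, 60.0) + rng.normal(0, 25, truth.shape)
print((icm_segment(image, np.array([60.0, 180.0])) == truth).mean())
```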

Relevance:

20.00%

Publisher:

Abstract:

Cone-beam computed tomography (CBCT) has enormous potential to improve the accuracy of treatment delivery in image-guided radiotherapy (IGRT). To assist radiotherapists in interpreting these images, we use a Bayesian statistical model to label each voxel according to its tissue type. The rich sources of prior information in IGRT are incorporated into a hidden Markov random field model of the 3D image lattice. Tissue densities in the reference CT scan are estimated using inverse regression and then rescaled to approximate the corresponding CBCT intensity values. The treatment planning contours are combined with published studies of physiological variability to produce a spatial prior distribution for changes in the size, shape and position of the tumour volume and organs at risk. The voxel labels are estimated using iterated conditional modes. The accuracy of the method has been evaluated using 27 CBCT scans of an electron density phantom. The mean voxel-wise misclassification rate was 6.2%, with a Dice similarity coefficient of 0.73 for liver, muscle, breast and adipose tissue. By incorporating prior information, we are able to successfully segment CBCT images. This could be a viable approach for automated, online image analysis in radiotherapy.

Relevance:

20.00%

Publisher:

Abstract:

This paper describes a method for analysing videogames based on game activities, and examines the impact of these activities on the player experience. The research approach applies heuristic checklists that deconstruct games in terms of the cognitive processes players engage in during gameplay (e.g., addressing goals, interpreting feedback). For this study we examined three puzzle games: Portal 2, I-Fluid and Braid. The Player Experience of Need Satisfaction (PENS) survey was used to measure player experience following gameplay. The cognitive actions afforded within each game are examined in light of reported player experiences to determine the extent to which these activities influence players’ feelings of competence, autonomy, intuitive control and presence. Findings indicate that positive experiences are directly influenced by game activity design. Our study also demonstrates the value of expert review in deconstructing gameplay activity as a means of providing direction for game design that enhances the player experience.

Relevance:

20.00%

Publisher:

Abstract:

In this paper I will explore some experience-based perspectives on information literacy research and practice. The research-based understanding of what information literacy looks like to those experiencing it is very different from the standard interpretations of information literacy as involving largely text-based information searching, interpretation, evaluation and use. It also involves particular understandings of the interrelation between information and learning experiences. In following this thread of the history of information literacy I will reflect on aspects of the past, present and future of information literacy research. In each of these areas I explore experiential, especially phenomenographic, approaches to information literacy and information literacy education, to reveal the unfolding understanding of people’s experience of information literacy stemming from this orientation.

In addressing the past, I will look in particular at the contribution of the seven faces of information literacy and some lessons learned from attending to variation in experience. I will explore important directions and insights that this history may help us to retain, including the value of understanding people’s information literacy experience.

In addressing the present, I will introduce more recent work that adopts the key ideas of informed learning by attending to both information and learning experiences in specific contexts. I will look at some contemporary directions and key issues, including the reinvention of the phenomenographic, or relational, approach to information literacy as informed learning, or using information to learn. I will also provide some examples of the contribution of experiential approaches to information literacy research and practice. The evolution and development of the phenomenographic approach to information literacy, and the growing attention to a dual focus on information and learning experiences within it, will be highlighted.

Finally, in addressing the future, I will return to advocacy and the recognition and pursuit of the transforming and empowering heart of information literacy, and suggest that for information literacy research, including the experiential, a turn towards the emancipatory has much to offer.

Relevance:

20.00%

Publisher:

Abstract:

Scores of well-researched individual papers and posters specifically or indirectly addressing the occurrence, measurement or exposure impacts of chemicals in buildings were presented at the 2012 Healthy Buildings Conference. Many of these presentations offered advances in the sampling and characterisation of chemical pollutants, while others extended the frontiers of knowledge on the emission, adsorption, risk, fate and compositional levels of chemicals in indoor and outdoor microenvironments. Several modelled or monitored indoor chemistry, including processes that generate secondary pollutants. This article provides an overview of the state of knowledge on healthy buildings based on the papers presented in the chemistry sessions at the Healthy Buildings 2012 (HB2012) Conference, and suggests future directions for healthy buildings research.