421 results for Philips, Ambrose


Relevance:

10.00%

Publisher:

Abstract:

Data quality has become a major concern for organisations. The rapid growth in the size and technology of databases and data warehouses has brought significant advantages in accessing, storing, and retrieving information. At the same time, rapid data throughput and heterogeneous access pose great challenges for maintaining high data quality. Yet, despite the importance of data quality, the literature has usually reduced data quality to detecting and correcting poor data, such as outliers and incomplete or inaccurate values. As a result, organisations are unable to assess data quality efficiently and effectively. An accurate and proper data quality assessment method will enable users to benchmark their systems and monitor their improvement. This paper introduces a granule-mining approach for measuring the degree of randomness in erroneous data, which will enable decision makers to conduct accurate quality assessments and locate the most severely affected data, thereby providing an accurate estimation of the human and financial resources needed for quality improvement tasks.

Relevance:

10.00%

Publisher:

Abstract:

Decision tables and decision rules play an important role in rough set based data analysis, compressing databases into granules and describing the associations between granules. Granule mining was also proposed to interpret decision rules in terms of association rules and a multi-tier structure. In this paper, we further extend granule mining to describe the relationships between granules not only by traditional support and confidence, but by diversity and condition diversity as well. Diversity measures how diversely a granule is associated with other granules; it provides a novel kind of knowledge in databases. Experiments are conducted to test the proposed concepts by describing the characteristics of a real network-traffic data collection. The results show that the proposed concepts are promising.
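
The abstract does not give formal definitions, but the flavour of these granule measures can be sketched in a few lines of Python. In this hypothetical sketch, a record is a (condition granule, decision granule) pair, support and confidence take their usual association-rule meanings, and diversity is assumed, for illustration only, to be the number of distinct decision granules a condition granule co-occurs with:

```python
from collections import Counter, defaultdict

# Toy records: (condition granule, decision granule) per transaction.
records = [
    (("tcp", "port80"), "normal"),
    (("tcp", "port80"), "normal"),
    (("udp", "port53"), "normal"),
    (("tcp", "port22"), "attack"),
    (("tcp", "port22"), "normal"),
]

n = len(records)
pair_counts = Counter(records)                   # count of (cg, dg) pairs
cond_counts = Counter(cg for cg, _ in records)   # count of each condition granule
assoc = defaultdict(set)                         # decision granules seen per cg
for cg, dg in records:
    assoc[cg].add(dg)

for (cg, dg), c in sorted(pair_counts.items()):
    support = c / n                    # fraction of all records
    confidence = c / cond_counts[cg]   # fraction within the condition granule
    diversity = len(assoc[cg])         # assumed proxy: distinct associated dgs
    print(f"{cg} -> {dg}: sup={support:.2f} conf={confidence:.2f} div={diversity}")
```

A condition granule with high diversity (here, ("tcp", "port22")) is associated with many different decision granules, which is the kind of pattern the paper's diversity measures are designed to surface.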

Relevance:

10.00%

Publisher:

Abstract:

This work was motivated by the limited knowledge on personal exposure to ultrafine (UF) particles, and it quantifies school children’s personal exposure to UF particles, in terms of particle number, using Philips Aerasense NanoTracers (NTs). The study is being conducted in conjunction with the “Ultrafine Particles from Traffic Emissions and Children’s Health (UPTECH)” project, which aims to determine the relationship between exposure to traffic-related UF particles and children’s health (http://www.ilaqh.qut.edu.au/Misc/UPTECH%20Home.htm). To achieve this, air quality and some health data are being collected at 25 schools within the Brisbane Metropolitan Area in Australia over two years. School children’s personal exposure to UF particles at the first 17 schools, tested between October 2010 and December 2011, is presented here. Data collection is expected to be complete by mid-2012.

Relevance:

10.00%

Publisher:

Abstract:

Recent advances in the planning and delivery of radiotherapy treatments have resulted in improvements in the accuracy and precision with which therapeutic radiation can be administered. As the complexity of the treatments increases, it becomes more difficult to predict the dose distribution in the patient accurately. Monte Carlo methods have the potential to improve the accuracy of the dose calculations and are increasingly being recognised as the “gold standard” for predicting dose deposition in the patient. In this study, software has been developed that enables the transfer of treatment plan information from the treatment planning system to a Monte Carlo dose calculation engine. A database of commissioned linear accelerator models (Elekta Precise and Varian 2100CD at various energies) has been developed using the EGSnrc/BEAMnrc Monte Carlo suite. Planned beam descriptions and CT images can be exported from the treatment planning system using the DICOM framework. The information in these files is combined with an appropriate linear accelerator model to allow the accurate calculation of the radiation field incident on a modelled patient geometry. The Monte Carlo dose calculation results are combined according to the monitor units specified in the exported plan. The result is a 3D dose distribution that can be used to verify treatment planning system calculations. The software, MCDTK (Monte Carlo Dicom ToolKit), has been developed in the Java programming language and produces BEAMnrc and DOSXYZnrc input files, ready for submission to a high-performance computing cluster. The code has been tested with the Eclipse (Varian Medical Systems), Oncentra MasterPlan (Nucletron B.V.) and Pinnacle3 (Philips Medical Systems) planning systems. In this study the software was validated against measurements in homogeneous and heterogeneous phantoms. Monte Carlo models are commissioned through comparison with quality assurance measurements made using a large square field incident on a homogeneous volume of water. This study aims to provide a valuable confirmation that Monte Carlo calculations match experimental measurements for complex fields and heterogeneous media.
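
MCDTK itself is Java code that is not reproduced in the abstract, but the kind of plan information it pulls out of a DICOM RT Plan export can be illustrated with a short, hypothetical pydicom sketch (Python is used here purely for brevity; the file name is a placeholder):

```python
import pydicom

# Read an RT Plan exported from the treatment planning system.
plan = pydicom.dcmread("rtplan.dcm")  # placeholder path

# Monitor units are stored per beam in the fraction group.
mu = {
    rb.ReferencedBeamNumber: float(rb.BeamMeterset)
    for rb in plan.FractionGroupSequence[0].ReferencedBeamSequence
}

for beam in plan.BeamSequence:
    cp0 = beam.ControlPointSequence[0]  # first control point (static fields)
    print(
        f"Beam {beam.BeamNumber} ({beam.BeamName}): "
        f"gantry={float(cp0.GantryAngle):.1f} deg, "
        f"energy={cp0.NominalBeamEnergy} MV, "
        f"MU={mu.get(beam.BeamNumber)}"
    )
    # Jaw and MLC positions, keyed by device type (e.g. ASYMX, ASYMY, MLCX).
    for dev in cp0.BeamLimitingDevicePositionSequence:
        print(f"  {dev.RTBeamLimitingDeviceType}: {list(dev.LeafJawPositions)}")
```

Fields such as these (gantry angle, jaw and MLC positions, monitor units) are what the toolkit combines with a commissioned accelerator model to build BEAMnrc/DOSXYZnrc input files.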

Relevance:

10.00%

Publisher:

Abstract:

Total Dik! is a collaborative project between the Queensland University of Technology (QUT) and Queensland Theatre Company (QTC). Total Dik! explores transmedia storytelling in live performance from concept development to delivery, and builds on works such as By the Way, Meet Vera Stark (Forrester, 2012), Hotel Modern’s Kamp (2005) and God’s Beard (2012) that use visual art, puppetry, music and film. The project’s first iteration enabled an interrogation of the integration of media-rich elements with live performers in a theatrical environment. Performative transmedia storytelling draws on the tenets of convergent media theory developed by Jenkins (2007, 2012), Dena (2010) and Philips (2012). This exploratory work, juxtaposing transmedia storytelling techniques with live performance, draws on Samuel Beckett’s challenges to theatre orthodoxy and touches on Brechtian notions of alienation through ‘sleight-of-hand’, or processual unpacking and deconstruction during performance. Total Dik! blends technologies, models, green-screen capture and live dimensions of performance in one narrative, allowing the work’s creators to test new combinations of transmedia storytelling techniques on a traditional performance platform.

Relevance:

10.00%

Publisher:

Abstract:

Introduction: Recent advances in the planning and delivery of radiotherapy treatments have resulted in improvements in the accuracy and precision with which therapeutic radiation can be administered. As the complexity of the treatments increases, it becomes more difficult to predict the dose distribution in the patient accurately. Monte Carlo (MC) methods have the potential to improve the accuracy of the dose calculations and are increasingly being recognised as the ‘gold standard’ for predicting dose deposition in the patient [1]. This project has three main aims: 1. To develop tools that enable the transfer of treatment plan information from the treatment planning system (TPS) to a MC dose calculation engine. 2. To develop tools for comparing the 3D dose distributions calculated by the TPS and the MC dose engine. 3. To investigate the radiobiological significance of any differences between the TPS patient dose distribution and the MC dose distribution in terms of Tumour Control Probability (TCP) and Normal Tissue Complication Probabilities (NTCP). The work presented here addresses the first two aims.

Methods: (1a) Plan Importing: A database of commissioned accelerator models (Elekta Precise and Varian 2100CD) has been developed for treatment simulations in the MC system (EGSnrc/BEAMnrc). Beam descriptions can be exported from the TPS using the widespread DICOM framework, and the resultant files are parsed with the assistance of a software library (PixelMed Java DICOM Toolkit). The information in these files (such as the monitor units, the jaw positions and the gantry orientation) is used to construct a plan-specific accelerator model which allows an accurate simulation of the patient treatment field. (1b) Dose Simulation: The calculation of a dose distribution requires patient CT images, which are prepared for the MC simulation using a tool (CTCREATE) packaged with the system. Beam simulation results are converted to absolute dose per MU using calibration factors recorded during the commissioning process and treatment simulation. These distributions are combined according to the MU meter settings stored in the exported plan to produce an accurate description of the prescribed dose to the patient. (2) Dose Comparison: TPS dose calculations can be obtained using either a DICOM export or direct retrieval of binary dose files from the file system. Dose difference, gamma evaluation and normalised dose difference algorithms [2] were employed for the comparison of the TPS dose distribution and the MC dose distribution. These implementations are independent of spatial resolution and able to interpolate for comparisons.

Results and Discussion: The tools successfully produced Monte Carlo input files for a variety of plans exported from the Eclipse (Varian Medical Systems) and Pinnacle (Philips Medical Systems) planning systems, ranging in complexity from a single uniform square field to a five-field step-and-shoot IMRT treatment. The simulation of collimated beams has been verified geometrically, and validation of dose distributions in a simple body phantom (QUASAR) will follow. The developed dose comparison algorithms have also been tested with controlled dose distribution changes.

Conclusion: The capability of the developed code to independently process treatment plans has been demonstrated. A number of limitations exist: only static fields are currently supported (dynamic wedges and dynamic IMRT will require further development), and the process has not been tested for planning systems other than Eclipse and Pinnacle. The tools will be used to independently assess the accuracy of the current treatment planning system dose calculation algorithms for complex treatment deliveries such as IMRT, in treatment sites where patient inhomogeneities are expected to be significant.

Acknowledgements: Computational resources and services used in this work were provided by the HPC and Research Support Group, Queensland University of Technology, Brisbane, Australia. Pinnacle dose parsing was made possible with the help of Paul Reich, North Coast Cancer Institute, North Coast, New South Wales.
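
Of the comparison metrics named above, the gamma evaluation [2] is the least obvious; a minimal brute-force 1D sketch is given below, assuming a global dose-difference criterion (the toolkit's actual implementation is resolution independent and interpolating, which this sketch is not):

```python
import numpy as np

def gamma_index(dose_ref, dose_eval, coords, dd=0.03, dta=3.0):
    """Brute-force 1D gamma evaluation.

    dose_ref, dose_eval -- dose profiles on the same coordinate grid (Gy)
    coords -- positions in mm; dd -- dose criterion as a fraction of the
    maximum reference dose; dta -- distance-to-agreement criterion in mm.
    """
    dd_abs = dd * dose_ref.max()  # global dose criterion in Gy
    gammas = np.empty_like(dose_ref)
    for i, (x, d) in enumerate(zip(coords, dose_ref)):
        dist_term = ((coords - x) / dta) ** 2
        dose_term = ((dose_eval - d) / dd_abs) ** 2
        gammas[i] = np.sqrt((dist_term + dose_term).min())
    return gammas  # gamma <= 1 means the point passes

# Toy check: a 1 mm shift between "TPS" and "MC" Gaussian profiles.
x = np.linspace(-50, 50, 201)                 # mm
tps = 2.0 * np.exp(-(x / 20.0) ** 2)          # Gy
mc = 2.0 * np.exp(-((x - 1.0) / 20.0) ** 2)   # shifted by 1 mm
g = gamma_index(tps, mc, x)
print(f"3%/3mm pass rate: {100 * (g <= 1).mean():.1f}%")
```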

Relevance:

10.00%

Publisher:

Abstract:

Instances of parallel ecotypic divergence, where adaptation to similar conditions repeatedly causes similar phenotypic changes in closely related organisms, are useful for studying the role of ecological selection in speciation. Here we used a combination of traditional and next-generation genotyping techniques to test for the parallel divergence of plants from the Senecio lautus complex, a phenotypically variable groundsel that has adapted to disparate environments in the South Pacific. Phylogenetic analysis of a broad selection of Senecio species showed that members of the S. lautus complex form a distinct lineage that has diversified recently in Australasia. An inspection of thousands of polymorphisms in the genomes of 27 natural populations from the S. lautus complex in Australia revealed a signal of strong genetic structure independent of habitat and phenotype. Additionally, genetic differentiation between populations was correlated with the geographical distance separating them, and the genetic diversity of populations strongly depended on geographical location. Importantly, coastal forms appeared in several independent phylogenetic clades, a pattern consistent with the parallel evolution of these forms. Analyses of the patterns of genomic differentiation between populations further revealed that adjacent populations displayed greater genomic heterogeneity than allopatric populations and are differentiated according to variation in soil composition. These results are consistent with a process of parallel ecotypic divergence in the face of gene flow.
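
The reported correlation between genetic differentiation and geographical distance (isolation by distance) is conventionally assessed with a Mantel permutation test; the sketch below is a generic numpy implementation on made-up distance matrices, not the study's data or code:

```python
import numpy as np

rng = np.random.default_rng(0)

def mantel(d1, d2, n_perm=9999):
    """Correlation between two distance matrices, with a permutation p-value."""
    iu = np.triu_indices_from(d1, k=1)  # upper triangle, no diagonal
    r_obs = np.corrcoef(d1[iu], d2[iu])[0, 1]
    hits = 0
    for _ in range(n_perm):
        p = rng.permutation(d1.shape[0])     # relabel populations at random
        hits += np.corrcoef(d1[p][:, p][iu], d2[iu])[0, 1] >= r_obs
    return r_obs, (hits + 1) / (n_perm + 1)

# Toy symmetric matrices for six populations (zero diagonal).
geo = rng.random((6, 6)); geo = (geo + geo.T) / 2; np.fill_diagonal(geo, 0)
gen = geo + rng.normal(0, 0.1, geo.shape); gen = (gen + gen.T) / 2
np.fill_diagonal(gen, 0)

r, p = mantel(gen, geo)
print(f"Mantel r = {r:.2f}, p = {p:.4f}")
```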

Relevance:

10.00%

Publisher:

Abstract:

Airborne particles have been shown to be associated with a wide range of adverse health effects, which has led to a recent increase in medical research seeking better insight into these effects. However, accurate evaluation of the exposure-dose-response relationship is highly dependent on the ability to track people's actual exposure to airborne particles. This is quite a complex task, particularly in relation to submicrometre and ultrafine particles, whose surface area and number concentrations can vary quite significantly. Therefore, suitable wearable monitors for measuring personal exposure to these particles are needed. This paper presents an evaluation of the metrological performance of six diffusion charger sensors, NanoTracer (Philips Aerasense) monitors, when measuring particle number and surface area concentrations, as well as the particle number distribution mean, compared with reference instruments. Tests in the laboratory (by generating monodisperse and polydisperse aerosols) and in the field (using natural ambient particles) were designed to evaluate the response of these devices under both steady-state and dynamic conditions. Results showed that the NanoTracers performed well when measuring steady-state aerosols; however, they strongly underestimated actual concentrations during dynamic response testing. The field experiments also showed that, when the majority of the particles were smaller than 20 nm, as occurs during particle formation events in the atmosphere, the NanoTracer underestimated number concentration quite significantly. Even though the NanoTracer can be used for personal monitoring of exposure to ultrafine particles, it has limitations which need to be considered in order to obtain meaningful results.
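
A sketch of the sort of monitor-versus-reference comparison such an evaluation relies on, with invented numbers rather than the study's data:

```python
import numpy as np

# Paired 1-minute averages of particle number concentration (particles/cm^3):
# NanoTracer readings versus a reference instrument. Toy values only.
nanotracer = np.array([8200.0, 9100.0, 15300.0, 22000.0, 4800.0])
reference = np.array([9000.0, 9800.0, 21000.0, 35000.0, 5000.0])

ratio = nanotracer / reference                    # monitor-to-reference ratio
bias_pct = 100 * (nanotracer - reference) / reference

print(f"mean ratio: {ratio.mean():.2f}")
print(f"mean relative bias: {bias_pct.mean():+.1f}%")
# A mean ratio well below 1 corresponds to the underestimation reported
# above for dynamic conditions and sub-20 nm aerosols.
```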

Relevance:

10.00%

Publisher:

Abstract:

The Acacia Light Wall is a permanent public artwork within the three-stage Eden on the Yarra, a residential/commercial development on Victoria Street, Abbotsford, Melbourne. The work was commissioned by the Hampton Group for Acacia Place, the first building in the development. The stylised screen was inspired by tangled wattle trees (Australia’s most common acacia). The work consists of two walls made from laser-cut aluminium screens, acrylic ‘windows’, Philips Color Kinetics controllable LEDs (1250 nodes) and Philips Color Kinetics ‘iPlayer’ controllers. One wall is 10 m long by 3 to 5 m high, and the second is 12 m by 3 m. The windows are lit by an array of 600+ LEDs in each wall. These lights change colour from week to week, marking the progress of the seasons. We worked with the project horticulturalist to develop a palette of colours for each week’s ‘light show’, drawn from local flowers and foliage likely to be in bloom that week. The lighting display is not static but rather a very slowly morphing light show. It isn’t fast and flashy; instead it’s restful and profound.

Relevance:

10.00%

Publisher:

Abstract:

Background: The vegetative phenotype of the pea mutant unifoliata (uni) is a simplification of the wild-type compound leaf to a single leaflet. Mutant uni plants are also self-sterile, and the flowers resemble known floral meristem and organ identity mutants. In Antirrhinum and Arabidopsis, mutations in the floral meristem identity gene FLORICAULA/LEAFY (FLO/LFY) affect flower development alone, whereas the tobacco FLO/LFY homologue, NFL, is expressed in vegetative tissues, suggesting that NFL specifies determinacy in the progenitor cells for both flowers and leaves. In this paper, we characterised the pea homologue of FLO/LFY.

Results: The pea cDNA homologue of FLO/LFY, PEAFLO, mapped to the uni locus in recombinant-inbred mapping populations, and markers based on PEAFLO cosegregated with uni in segregating sibling populations. The characterisation of two spontaneous uni mutant alleles, one containing a deletion and the other a point mutation in the PEAFLO coding sequences, predicted that PEAFLO corresponds to UNI and that the mutant vegetative phenotype was conferred by the defective PEAFLO gene.

Conclusions: The uni mutant demonstrates that there are shared regulatory processes in the morphogenesis of leaves and flowers and that floral meristem identity genes have an extended role in plant development. Pleiotropic regulatory genes such as UNI support the hypothesis that leaves and flowers derive from a common ancestral sporophyll-like structure. The regulation of indeterminacy during leaf and flower morphogenesis by UNI may reflect a primitive function for the gene in the pre-angiosperm era.

Relevance:

10.00%

Publisher:

Abstract:

The textual turn is a good friend of expert spectating, where it assumes the role of writing-productive apparatus, but no friend at all of expert practices or practitioners (Melrose, 2003).

Introduction: The challenge of time-based embodied performance when the artefact is unstable

As a former full-time professional practitioner with an embodied dance practice as performer, choreographer and artistic director for three decades, I somewhat unexpectedly entered the world of academia in 2000 after completing a practice-based PhD, which was described by its examiners as ‘pioneering’. Like many artists, my intention was to deepen and extend my practice through formal research into my work and its context (which was intercultural) and to privilege the artist’s voice in a research world where it was too often silent. Practice as research, practice-based research and practice-led research were not yet fully named; the field was in its infancy, and my biggest challenge was to find a serviceable methodology which did not betray my intention to keep practice at the centre of the research. Over the last 15 years, practice-led doctoral research, where examinable creative work is placed alongside an accompanying (exegetical) written component, has come a long way. It has been extensively debated, with a range of theories and models proposed (Barrett & Bolt, 2007; Pakes, 2003, 2004; Piccini, 2005; Philips, Stock & Vincs, 2009; Stock, 2009, 2010; Riley & Hunter, 2009; Haseman, 2006; Hecq, 2012). Much of this writing is based around epistemological concerns, where the research methodologies proposed normally incorporate a contextualisation of the creative work in its field of practice and, more importantly, validation and interrogation of the processes of the practice as the central ‘data gathering’ method. It is now widely accepted, at least in the Australian creative arts context, that knowledge claims in creative practice research arise from the material activities of the practice itself (Carter, 2004). The creative work explicated as the tangible outcome of that practice is sometimes referred to as the ‘artefact’. Although the making of the artefact, according to Colbert (2009, p. 7), is influenced by “personal, experiential and iterative processes”, mapping them through a research pathway is “difficult to predict [for] the adjustments made to the artefact in the light of emerging knowledge and insights cannot be foreshadowed”. Linking the process and the practice outcome most often occurs through the textual intervention of an exegesis which builds, and/or builds on, theoretical concerns arising in and from the work. This linking produces what Barrett (2007) refers to as “situated knowledge… that operates in relation to established knowledge” (p. 145). But what if those material forms or ‘artefacts’ are not objects or code or digitised forms, but live within the bodies of artist/researchers, where the nature of the practice itself is live, ephemeral and constantly transforming, as in dance and physical performance? Even more unsettling is when the ‘artefact’ is literally embedded and embodied in the work and in the maker/researcher; when subject and object are merged. To complicate matters, the performing arts are necessarily collaborative, relying not only on technical mastery and creative/interpretive processes, but on social and artistic relationships which collectively make up the ‘artefact’.

This chapter explores issues surrounding live dance and physical performance when placed in a research setting, specifically the complexities of being required to translate embodied dance findings into textual form. Exploring how embodied knowledge can be shared in a research context with those who have no experiential knowledge of communicating through and in dance, I draw on theories of “dance enaction” (Warburton, 2011) together with notions of “affective intensities” and “performance mastery” (Melrose, 2003), “intentional activity” (Pakes, 2004) and the place of memory. In seeking ways to capture in another form the knowledge residing in live dance practice, thus making implicit knowledge explicit, I further propose that there is a process of triple translation as the performance (the living ‘artefact’) is documented in multifaceted ways to produce something durable which can be revisited. This translation becomes more complex if the embodied knowledge resides in culturally specific practices, formed by world views and processes quite different from the accepted norms and conventions (even radical ones) of international doctoral research inquiry. But whatever the combination of cultural, virtual and genre-related dance practices being researched, embodiment is central to the process, outcome and findings, and the question remains of how we will use text and what forms that text might take.

Relevance:

10.00%

Publisher:

Abstract:

Current educational reform, policy and public discourse emphasise standardisation of testing, curricula and professional practice, yet the landscape of literacy practices today is fluid, interactive, multimodal, ever-changing, adaptive and collaborative. How, then, can English and literacy educators negotiate these conflicting terrains? The nature of today’s literacy practices is reflected in the concept of living texts, which refers to experienced events and encounters that offer meaning-making that is fluid, interactive and changing. Literacy learning possibilities with living texts are described and discussed by the authors, who independently investigated the place of living texts across two distinctly different learning contexts: a young people’s community arts project and a co-taught multiliteracies project in a high school. In the community arts project, young people created living texts as guided walks of urban spaces that adapt and change for varying audiences. In the multiliteracies project, two parents and a teacher created interactive spaces through co-teaching and cogenerative dialoguing. These spaces generate living texts that yield a purposefully connected curriculum rich in community-relevant and culturally significant texts. These two studies are shared with a view to bringing living texts into literacy education to loosen the rigidity of standardisation.

Relevance:

10.00%

Publisher:

Abstract:

Relational elements of language (e.g. spatial prepositions) act to direct attention to aspects of an incoming message. The listener or reader must be able to use these elements to focus and refocus attention on the mental representation being constructed. Research has shown that this type of attention control is specific to language and can be distinguished from attention control for non-relational (semantic or content) elements. Twenty-two monolinguals (18–30 years) and nineteen bilinguals (18–30 years) completed two conditions of an alternating-runs task-switching paradigm in their first language. The relational condition involved processing spatial prepositions, and the non-relational condition involved processing concrete nouns and adjectives. Overall, monolinguals had significantly larger shift costs (i.e. greater attention control burden) in the relational condition than the non-relational condition, whereas bilinguals performed similarly in both conditions. This suggests that proficiency in a second language has a positive impact on linguistic attention control in one's native language.
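
The shift cost referred to here is simply the mean reaction-time penalty on task-switch trials relative to task-repeat trials; a minimal sketch with invented trials (in an alternating-runs design, tasks run in pairs, so every second trial is a switch trial):

```python
# Toy reaction times (ms) from an alternating-runs block (AABB ordering:
# every second trial switches task, the others repeat it).
trials = [
    {"switch": False, "rt": 642}, {"switch": True, "rt": 741},
    {"switch": False, "rt": 630}, {"switch": True, "rt": 752},
    {"switch": False, "rt": 655}, {"switch": True, "rt": 738},
]

repeat_rts = [t["rt"] for t in trials if not t["switch"]]
switch_rts = [t["rt"] for t in trials if t["switch"]]

shift_cost = sum(switch_rts) / len(switch_rts) - sum(repeat_rts) / len(repeat_rts)
print(f"shift cost: {shift_cost:.0f} ms")  # larger = heavier attention-control burden
```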

Relevance:

10.00%

Publisher:

Abstract:

Background: Catheter ablation procedures for atrial fibrillation (AF) may frequently require long fluoroscopy times. We sought to undertake a review of radiation safety practice in our Cardiac Electrophysiology Laboratory and implement changes to minimise fluoroscopic doses. We also sought to compare the results with radiation doses for percutaneous coronary intervention (PCI) cases performed in our hospital.

Methods: Fluoroscopy times and doses for AF ablation procedures performed by a single operator on a Philips Integris H3000 image intensifier were analysed for an 11-month period. Results were compared with all PCI procedures performed over a similar period by multiple operators on a Philips Integris Allura FD system. A comprehensive review of radiation practice in the Electrophysiology Laboratory identified the potential to reduce pulse frame rates and doses, and to narrow the field of interest, without impacting the performance of the procedure. These changes were implemented and the results analysed after a further 11 months.

Results: In the pre-intervention period, 50 AF catheter ablations had a mean fluoroscopy time of 86.4 min and a mean fluoroscopic dose (dose-area product) of 68.4 Gy·cm². Post-intervention, 75 procedures had a mean fluoroscopy time of 68.9 min (p < 0.0001) and a mean dose of 14.3 Gy·cm² (p < 0.0001). The 128 PCI procedures had a mean combined fluoroscopy and image acquisition time of 10.0 min and a mean total dose of 38.8 Gy·cm².

Conclusions: Catheter ablation procedures for AF may require lengthy use of fluoroscopy, but simple modifications to radiation practice can result in marked reductions in radiation dose that compare favourably with PCI case doses.
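
For readers scanning the numbers, the relative improvements are easy to make explicit (a simple recomputation from the means reported above):

```python
# Pre- versus post-intervention means from the abstract.
time_pre, time_post = 86.4, 68.9   # fluoroscopy time, minutes
dose_pre, dose_post = 68.4, 14.3   # dose-area product, Gy*cm^2

print(f"time reduction: {100 * (time_pre - time_post) / time_pre:.0f}%")  # ~20%
print(f"dose reduction: {100 * (dose_pre - dose_post) / dose_pre:.0f}%")  # ~79%
```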

Relevance:

10.00%

Publisher:

Abstract:

Collection contains materials pertaining to the life and work of Stone.