679 results for Produce


Relevance: 10.00%

Abstract:

Introduction: Recent advances in the planning and delivery of radiotherapy treatments have resulted in improvements in the accuracy and precision with which therapeutic radiation can be administered. As the complexity of the treatments increases it becomes more difficult to predict the dose distribution in the patient accurately. Monte Carlo (MC) methods have the potential to improve the accuracy of the dose calculations and are increasingly being recognised as the ‘gold standard’ for predicting dose deposition in the patient [1]. This project has three main aims: 1. To develop tools that enable the transfer of treatment plan information from the treatment planning system (TPS) to a MC dose calculation engine. 2. To develop tools for comparing the 3D dose distributions calculated by the TPS and the MC dose engine. 3. To investigate the radiobiological significance of any errors between the TPS patient dose distribution and the MC dose distribution in terms of Tumour Control Probability (TCP) and Normal Tissue Complication Probabilities (NTCP). The work presented here addresses the first two aims.

Methods: (1a) Plan Importing: A database of commissioned accelerator models (Elekta Precise and Varian 2100CD) has been developed for treatment simulations in the MC system (EGSnrc/BEAMnrc). Beam descriptions can be exported from the TPS using the widespread DICOM framework, and the resultant files are parsed with the assistance of a software library (PixelMed Java DICOM Toolkit). The information in these files (such as the monitor units, the jaw positions and the gantry orientation) is used to construct a plan-specific accelerator model which allows an accurate simulation of the patient treatment field. (1b) Dose Simulation: The calculation of a dose distribution requires patient CT images, which are prepared for the MC simulation using a tool (CTCREATE) packaged with the system. Beam simulation results are converted to absolute dose per MU using calibration factors recorded during the commissioning process and treatment simulation. These distributions are combined according to the MU meter settings stored in the exported plan to produce an accurate description of the prescribed dose to the patient. (2) Dose Comparison: TPS dose calculations can be obtained using either a DICOM export or direct retrieval of binary dose files from the file system. Dose difference, gamma evaluation and normalised dose difference algorithms [2] were employed for the comparison of the TPS dose distribution and the MC dose distribution. These implementations are independent of spatial resolution and are able to interpolate between dose grids for comparison.

Results and Discussion: The tools successfully produced Monte Carlo input files for a variety of plans exported from the Eclipse (Varian Medical Systems) and Pinnacle (Philips Medical Systems) planning systems, ranging in complexity from a single uniform square field to a five-field step-and-shoot IMRT treatment. The simulation of collimated beams has been verified geometrically, and validation of dose distributions in a simple body phantom (QUASAR) will follow. The developed dose comparison algorithms have also been tested with controlled dose distribution changes.

Conclusion: The capability of the developed code to independently process treatment plans has been demonstrated.
A number of limitations exist: only static fields are currently supported (dynamic wedges and dynamic IMRT will require further development), and the process has not been tested for planning systems other than Eclipse and Pinnacle. The tools will be used to independently assess the accuracy of the current treatment planning system dose calculation algorithms for complex treatment deliveries such as IMRT in treatment sites where patient inhomogeneities are expected to be significant. Acknowledgements: Computational resources and services used in this work were provided by the HPC and Research Support Group, Queensland University of Technology, Brisbane, Australia. Pinnacle dose parsing made possible with the help of Paul Reich, North Coast Cancer Institute, North Coast, New South Wales.
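
As an illustration of the gamma evaluation named above, the following minimal sketch computes a global gamma index for two dose grids that share the same resolution; the project's actual tools are resolution independent and interpolate between grids, which is omitted here. The 3%/3 mm criteria, the grid spacing and the synthetic test data are assumptions for illustration only.

```python
import numpy as np

def gamma_index(dose_ref, dose_eval, spacing_mm, dose_tol=0.03, dist_tol_mm=3.0):
    """Brute-force global gamma evaluation for two 2D dose grids with the same
    spacing. dose_tol is a fraction of the reference maximum dose."""
    ny, nx = dose_ref.shape
    yy, xx = np.meshgrid(np.arange(ny) * spacing_mm,
                         np.arange(nx) * spacing_mm, indexing="ij")
    dose_norm = dose_tol * dose_ref.max()
    search = int(np.ceil(dist_tol_mm / spacing_mm)) + 1
    gamma = np.zeros_like(dose_ref, dtype=float)
    for j in range(ny):
        for i in range(nx):
            j0, j1 = max(0, j - search), min(ny, j + search + 1)
            i0, i1 = max(0, i - search), min(nx, i + search + 1)
            # Dose-difference and distance-to-agreement terms over a local window
            dd = (dose_eval[j0:j1, i0:i1] - dose_ref[j, i]) / dose_norm
            dr = np.hypot(yy[j0:j1, i0:i1] - yy[j, i],
                          xx[j0:j1, i0:i1] - xx[j, i]) / dist_tol_mm
            gamma[j, i] = np.sqrt(dd ** 2 + dr ** 2).min()
    return gamma

# Two nearly identical synthetic distributions should give a high pass rate.
rng = np.random.default_rng(0)
ref = rng.random((40, 40)) + 10.0
ev = 1.01 * ref
print((gamma_index(ref, ev, spacing_mm=2.0) <= 1.0).mean())
```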

Relevance: 10.00%

Abstract:

This research introduces the proposition that Electronic Dance Music’s beat-mixing function could be implemented to create immediacy in other musical genres. The inclusion of rhythmic sections at the beginning and end of each musical work created a ‘DJ friendly’ environment. The term used in this thesis to refer to the application of beat-mixing in Rock music is ‘ClubRock’. A collaboration between a number of DJs and Rock music professionals applied the process of beat-mixing to blend Rock tracks to produce a continuous ClubRock set. The DJ technique of beat-mixing Rock music transformed static renditions into a fluid creative work. The hybridisation of the two genres, EDM and Rock, resulted in a contribution to Rock music compositional approaches and the production of a unique Rock album, Manarays: Get Lucky.

Relevance: 10.00%

Abstract:

Accuracy of dose delivery in external beam radiotherapy is usually verified with electronic portal imaging (EPI), in which the treatment beam is used to check the positioning of the patient. However, the resulting megavoltage x-ray images suffer from poor quality. The image quality can be improved by developing a special operating mode in the linear accelerator, in which the existing treatment beam is modified such that it produces enough low-energy photons for imaging. In this work the problem of optimizing the beam/detector combination to achieve optimal electronic portal image quality is addressed. The linac used for this study was modified to produce two experimental photon beams. These beams, named Al6 and Al10, were non-flat and were produced by 4 MeV electrons hitting aluminum targets 6 and 10 mm thick, respectively. The images produced by a conventional EPI system (a 6 MV treatment beam and a camera-based EPID with a Cu plate and Gd2O2S screen) were compared with the images produced by the experimental beams and various screens used with the same camera. The contrast of 0.8 cm of bone-equivalent material in 5 cm of water increased from 1.5% for the conventional system to 11% for the combination of the Al6 beam with a 200 mg/cm2 Gd2O2S screen. The signal-to-noise ratio calculated for 1 cGy flood-field images increased by about a factor of two for the same EPI systems. The spatial resolution of the two imaging systems was comparable. This work demonstrates that significant improvements in portal image contrast can be obtained by simultaneous optimization of the linac spectrum and the EPI detector.
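
As a rough illustration of the two figures of merit quoted above (object contrast and flood-field signal-to-noise ratio), the sketch below computes them from synthetic images; the region definitions, the contrast convention and the pixel statistics are assumptions, not the study's measurement protocol.

```python
import numpy as np

def contrast(image, object_mask, background_mask):
    """Relative contrast of an object region against its background."""
    obj = image[object_mask].mean()
    bkg = image[background_mask].mean()
    return abs(bkg - obj) / bkg

def flood_field_snr(flood_image):
    """Signal-to-noise ratio of a uniform (flood) exposure."""
    return flood_image.mean() / flood_image.std()

rng = np.random.default_rng(0)
portal = rng.normal(1000.0, 20.0, size=(128, 128))   # synthetic open-field image
insert = np.zeros((128, 128), dtype=bool)
insert[60:68, 60:68] = True                          # bone-equivalent insert region
portal[insert] -= 110.0                              # roughly 11% attenuation
print("contrast:", contrast(portal, insert, ~insert))
print("flood-field SNR:", flood_field_snr(rng.normal(500.0, 5.0, size=(128, 128))))
```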

Relevance: 10.00%

Abstract:

Evidence-based practice (EBP) focuses on solving ‘tame’ problems, where the literature supports question construction toward determining a solution. What happens when there is no existing evidence, or when the need for agility precludes a full EBP implementation? How might we build a more agile and innovative practice that facilitates the design of solutions to complex and wicked problems, particularly in cases where there is no existing literature? As problem-solving and innovation methods, EBP and design thinking overlap considerably. The literature indicates the potential benefits to be gained for evidence-based practice from adopting a human-centred rather than literature-focused foundation. The design thinking process is social and collaborative by nature, which enables it to be more agile and produce more innovative results than evidence-based practice. This paper recommends a hybrid approach to maximise the strengths and benefits of the two methods for designing solutions to wicked problems. Incorporating design thinking principles and tools into EBP has the potential to move its applicability beyond tame problems and continuous improvement, and toward wicked problem solving and innovation. The potential of this hybrid approach in practice is yet to be explored.

Relevance: 10.00%

Abstract:

In this paper we propose a method to generate a large-scale and accurate dense 3D semantic map of street scenes. A dense 3D semantic model of the environment can significantly improve a number of robotic applications such as autonomous driving, navigation or localisation. Instead of using offline-trained classifiers for semantic segmentation, our approach employs a data-driven, nonparametric method to parse scenes which easily scales to large environments and generalises to different scenes. We use stereo image pairs collected from cameras mounted on a moving car to produce dense depth maps, which are combined into a global 3D reconstruction using camera poses from stereo visual odometry. Simultaneously, 2D automatic semantic segmentation using a nonparametric scene parsing method is fused into the 3D model. Furthermore, the resultant 3D semantic model is improved by taking into account moving objects in the scene. We demonstrate our method on the publicly available KITTI dataset and evaluate the performance against manually generated ground truth.
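
The geometric core of the pipeline described above (disparity to depth, back-projection into 3D, then transformation into a global frame using a visual-odometry pose) can be sketched as follows; the camera intrinsics, baseline and identity pose are placeholder values roughly in the range of the KITTI setup, not calibration data from the paper, and a single focal length is assumed for both axes.

```python
import numpy as np

def disparity_to_points(disparity, fx, cx, cy, baseline, pose):
    """Back-project a rectified disparity map to 3D points in the world frame.
    'pose' is a 4x4 camera-to-world transform, e.g. from stereo visual odometry."""
    h, w = disparity.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    valid = disparity > 0
    z = fx * baseline / disparity[valid]          # depth from disparity
    x = (u[valid] - cx) * z / fx
    y = (v[valid] - cy) * z / fx                  # assumes fy == fx
    pts_cam = np.stack([x, y, z, np.ones_like(z)], axis=0)   # homogeneous coords
    return (pose @ pts_cam)[:3].T                 # (N, 3) points in the world frame

disp = np.full((375, 1242), 32.0)                 # dummy KITTI-sized disparity map
pts = disparity_to_points(disp, fx=721.5, cx=609.6, cy=172.9,
                          baseline=0.54, pose=np.eye(4))
print(pts.shape, pts[:, 2].mean())                # mean depth of about 12 m
```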

Relevance: 10.00%

Abstract:

The emergence of pseudo-marginal algorithms has led to improved computational efficiency for dealing with complex Bayesian models with latent variables. Here an unbiased estimator of the likelihood replaces the true likelihood in order to produce a Bayesian algorithm that remains on the marginal space of the model parameter (with latent variables integrated out), with a target distribution that is still the correct posterior distribution. Very efficient proposal distributions can be developed on the marginal space relative to the joint space of model parameter and latent variables, so pseudo-marginal algorithms tend to have substantially better mixing properties. However, for pseudo-marginal approaches to perform well, the likelihood has to be estimated rather precisely, which can be difficult to achieve in complex applications. In this paper we propose to take advantage of the multiple central processing units (CPUs) that are readily available on most standard desktop computers. Here the likelihood is estimated independently on each of the multiple CPUs, and the ultimate estimate of the likelihood is the average of the estimates obtained from the multiple CPUs. The estimate remains unbiased, but its variability is reduced. We compare and contrast two different technologies that allow the implementation of this idea, both of which require a negligible amount of extra programming effort. The superior performance of this idea over the standard approach is demonstrated on simulated data from a stochastic volatility model.
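
A minimal sketch of the proposed scheme, averaging independent unbiased likelihood estimates computed on several CPUs inside a pseudo-marginal Metropolis-Hastings step, is given below for a toy latent-variable model; the model, the particle counts and the use of Python's multiprocessing module (the two technologies compared in the paper are not named in the abstract) are assumptions.

```python
import numpy as np
from multiprocessing import Pool

def likelihood_estimate(args):
    """Unbiased Monte Carlo estimate of p(y | theta) for a toy model
    y = theta + z + noise with z ~ N(0, 1) integrated out by simulation."""
    theta, y, n_particles, seed = args
    rng = np.random.default_rng(seed)
    z = rng.normal(0.0, 1.0, size=n_particles)
    dens = np.exp(-0.5 * (y - theta - z) ** 2) / np.sqrt(2.0 * np.pi)
    return dens.mean()

def averaged_estimate(pool, theta, y, n_particles, n_workers, rng):
    """One independent estimate per CPU; the average stays unbiased
    but has lower variance than a single estimate."""
    seeds = rng.integers(0, 2**31 - 1, size=n_workers)
    jobs = [(theta, y, n_particles, int(s)) for s in seeds]
    return float(np.mean(pool.map(likelihood_estimate, jobs)))

def pseudo_marginal_mh(y, n_iter=500, n_particles=200, n_workers=4, step=0.5):
    rng = np.random.default_rng(0)
    theta, samples = 0.0, []
    with Pool(n_workers) as pool:
        like = averaged_estimate(pool, theta, y, n_particles, n_workers, rng)
        for _ in range(n_iter):
            prop = theta + step * rng.normal()
            prop_like = averaged_estimate(pool, prop, y, n_particles, n_workers, rng)
            # Flat prior, symmetric proposal: accept with the usual MH ratio, with
            # estimated likelihoods in place of the true likelihood. The estimate
            # for the current state is recycled, never recomputed.
            if rng.random() < prop_like / max(like, 1e-300):
                theta, like = prop, prop_like
            samples.append(theta)
    return np.array(samples)

if __name__ == "__main__":
    draws = pseudo_marginal_mh(y=1.3)
    print(draws.mean(), draws.std())
```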

Relevance: 10.00%

Abstract:

Invasion waves of cells play an important role in development, disease and repair. Standard discrete models of such processes typically involve simulating cell motility, cell proliferation and cell-to-cell crowding effects in a lattice-based framework. The continuum-limit description is often given by a reaction–diffusion equation that is related to the Fisher–Kolmogorov equation. One of the limitations of a standard lattice-based approach is that real cells move and proliferate in continuous space and are not restricted to a predefined lattice structure. We present a lattice-free model of cell motility and proliferation, with cell-to-cell crowding effects, and we use the model to replicate invasion wave-type behaviour. The continuum-limit description of the discrete model is a reaction–diffusion equation with a proliferation term that is different from lattice-based models. Comparing lattice-based and lattice-free simulations indicates that both models lead to invasion fronts that are similar at the leading edge, where the cell density is low. Conversely, the two models make different predictions in the high-density region of the domain, well behind the leading edge. We analyse the continuum-limit description of the lattice-based and lattice-free models to show that both give rise to invasion wave-type solutions that move with the same speed but have very different shapes. We explore the significance of these differences by calibrating the parameters in the standard Fisher–Kolmogorov equation using data from the lattice-free model. We conclude that estimating parameters using this kind of standard procedure can produce misleading results.
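
A purely illustrative lattice-free update of the kind described above might look like the following; the hard-core crowding rule, the parameter values and the boundary handling are assumptions and do not reproduce the published model or its continuum limit.

```python
import numpy as np

def lattice_free_step(x, L, step, move_prob, prolif_prob, radius, rng):
    """One step of a 1D lattice-free motility/proliferation model with hard-core
    crowding: moves and daughter placements are aborted if they would land
    within 'radius' of another agent."""
    x = x.copy()
    for i in rng.permutation(len(x)):                    # motility attempts
        if rng.random() < move_prob:
            target = x[i] + rng.choice([-step, step])
            others = np.delete(x, i)
            if 0.0 <= target <= L and np.all(np.abs(others - target) >= radius):
                x[i] = target
    daughters = []
    for i in range(len(x)):                              # proliferation attempts
        if rng.random() < prolif_prob:
            target = x[i] + rng.uniform(-step, step)
            if (0.0 <= target <= L
                    and np.all(np.abs(x - target) >= radius)
                    and all(abs(d - target) >= radius for d in daughters)):
                daughters.append(target)
    return np.concatenate([x, np.array(daughters)]) if daughters else x

rng = np.random.default_rng(1)
cells = np.sort(rng.uniform(0.0, 10.0, size=15))         # seed the left of the domain
for _ in range(200):
    cells = lattice_free_step(cells, L=200.0, step=1.0, move_prob=0.5,
                              prolif_prob=0.01, radius=0.5, rng=rng)
print(len(cells), cells.max())                           # population size and front position
```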

Relevance: 10.00%

Abstract:

Cell trajectory data is often reported in the experimental cell biology literature to distinguish between different types of cell migration. Unfortunately, there is no accepted protocol for designing or interpreting such experiments, and this makes it difficult to quantitatively compare different published data sets and to understand how changes in experimental design influence our ability to interpret different experiments. Here, we use an individual-based mathematical model to simulate the key features of a cell trajectory experiment. This shows that our ability to correctly interpret trajectory data is extremely sensitive to the geometry and timing of the experiment, the degree of motility bias and the number of experimental replicates. We show that cell trajectory experiments produce data that is most reliable when the experiment is performed in a quasi-1D geometry with a large number of identically prepared experiments conducted over a relatively short time interval, rather than a few trajectories recorded over particularly long time intervals.
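
The sensitivity of trajectory-based estimates to experiment design can be illustrated with a toy biased random walk; the walk model, the bias parameter and the simple end-point estimator below are assumptions, not the individual-based model used in the study.

```python
import numpy as np

def simulate_trajectories(n_cells, n_steps, dt, speed, bias, rng):
    """Toy 2D random walk with a directional bias along +x, standing in for a
    cell trajectory experiment; returns positions of shape (n_cells, n_steps, 2)."""
    angles = rng.uniform(0.0, 2.0 * np.pi, size=(n_cells, n_steps))
    steps = speed * dt * np.stack([np.cos(angles), np.sin(angles)], axis=-1)
    steps[..., 0] += bias * speed * dt             # motility bias along x
    return np.cumsum(steps, axis=1)

rng = np.random.default_rng(0)
n_steps, dt = 100, 1.0
paths = simulate_trajectories(n_cells=20, n_steps=n_steps, dt=dt,
                              speed=1.0, bias=0.2, rng=rng)
# Estimate the bias from the mean x-displacement per unit time; the estimate is
# noisy when few trajectories or short recording times are used.
print(paths[:, -1, 0].mean() / (n_steps * dt))
```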

Relevance: 10.00%

Abstract:

This chapter investigates the relationship between technical and operational skills and the development of conceptual knowledge and literacy in Media Arts learning. It argues that there is a relationship between the stories, expressions and ideas that students aim to produce with communications media, and their ability to realise these in material form through technical processes in specific material contexts. Our claim is that there is a relationship between the technical and the operational, along with material relations and the development of conceptual knowledge and literacy in media arts learning. We place more emphasis on the material aspects of literacy than is usually the case in socio-cultural accounts of media literacy. We provide examples from a current project to demonstrate that it is just as important to address the material as it is the discursive and conceptual when considering how students develop media literacy in classroom spaces.

Relevance: 10.00%

Abstract:

Practice-led journalism research techniques were used in this study to produce a ‘first draft of history’ recording the human experience of survivors and rescuers during the January 2011 flash flood disaster in Toowoomba and the Lockyer Valley in Queensland, Australia. The study aimed to discover what can be learnt from engaging in journalistic reporting of natural disasters. This exegesis demonstrates that journalism can be both a creative practice and a research methodology. About 120 survivors, rescuers and family members of victims participated in extended interviews about what happened to them and how they survived. Their stories are the basis for two creative outputs of the study: a radio documentary and a non-fiction book, which document how and why people died, or survived, or were rescued. Listeners and readers are taken "into the flood", where they feel anxious for those in peril, relieved when people are saved, and devastated when babies, children and adults are swept away to their deaths. In undertaking reporting about the human experience of the floods, several significant elements of journalistic reportage of disasters were exposed. The first related to the vital role that online social media played during the disaster for individuals, citizen reporters, journalists and emergency services organisations. Online social media offer reporters powerful new tools for both gathering and disseminating news. The second related to the performance of journalists in covering events involving traumatic experiences. Journalists are often required to cover trauma and are often among the first responders to disasters. This study found that almost all of the disaster survivors who were approached were willing to talk in detail about their traumatic experiences. A finding of this project is that journalists who interview trauma survivors can develop techniques for improving their ability to interview people who have experienced traumatic events. These include being flexible with interview timing and location; empowering interviewees to understand that they don’t have to answer every question they are asked; providing emotional security for interviewees; and being committed to accuracy. Survivors may exhibit posttraumatic stress symptoms, but some exhibit and report posttraumatic growth. The willingness of a high proportion of the flood survivors to participate in the flood research made it possible to document a relatively unstudied question within the literature about journalism and trauma: when and why disaster survivors will want to speak to reporters. The study sheds light on the reasons why a group of traumatised people chose to speak about their experiences. Their reasons fell into six categories: lessons need to be learned from the disaster; a desire for the public to know what had happened; a sense of duty to make sure warning systems and disaster responses are improved in future; personal recovery; the financial disinterest of reporters in listening to survivors; and the timing of the request for an interview. Feedback on the creative-practice component of this thesis, the book and radio documentary, shows that these issues are not purely matters of ethics. By following appropriate protocols, it is possible to produce stories that engender strong audience responses, such as that the program was "amazing and deeply emotional" and "community storytelling at its most important".
Participants reported that the experience of the interview process was "healing" and that the creative outcome resulted in "a very precious record of an afternoon of tragedy and triumph and the bitter-sweetness of survival".

Relevance: 10.00%

Abstract:

Classifier selection is a problem encountered by multi-biometric systems that aim to improve performance through fusion of decisions. A particular decision fusion architecture that combines multiple instances (n classifiers) and multiple samples (m attempts at each classifier) has been proposed in previous work to achieve a controlled trade-off between false alarms and false rejects. Although analysis on text-dependent speaker verification has demonstrated better performance for fusion of decisions with favourable dependence compared to statistically independent decisions, the performance is not always optimal. Given a pool of instances, the best performance with this architecture is obtained for certain combinations of instances. Heuristic rules and diversity measures have been commonly used for classifier selection, but it is shown that optimal performance is achieved with the ‘best combination performance’ rule. As the search complexity of this rule increases exponentially with the addition of classifiers, a measure, the sequential error ratio (SER), is proposed in this work that is specifically adapted to the characteristics of the sequential fusion architecture. The proposed measure can be used to select the classifier that is most likely to produce a correct decision at each stage. Error rates for fusion of text-dependent HMM-based speaker models using SER are compared with other classifier selection methodologies. SER is shown to achieve near-optimal performance for sequential fusion of multiple instances with or without the use of multiple samples. The methodology applies to multiple speech utterances for telephone- or internet-based access control and to other systems such as multiple fingerprint and multiple handwriting-sample based identity verification systems.
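
The exponential cost of the ‘best combination performance’ rule can be seen in a small sketch that searches every k-subset of a synthetic classifier pool and scores each subset by its fused error; the binary decision model and the majority-vote fusion rule are assumptions, and the SER measure itself is not implemented here.

```python
import numpy as np
from itertools import combinations

def fused_error(decisions, labels):
    """decisions: (n_classifiers, n_trials) binary accept/reject decisions.
    Fuses by simple majority vote and returns the error rate."""
    votes = decisions.mean(axis=0) >= 0.5
    return float(np.mean(votes != labels))

def best_combination(decisions, labels, k):
    """Exhaustive 'best combination performance' search over all C(n, k)
    subsets of the classifier pool; cost grows exponentially with n."""
    best, best_err = None, np.inf
    for combo in combinations(range(decisions.shape[0]), k):
        err = fused_error(decisions[list(combo)], labels)
        if err < best_err:
            best, best_err = combo, err
    return best, best_err

rng = np.random.default_rng(0)
labels = rng.integers(0, 2, size=500)
# Ten synthetic classifier instances, each flipping the true label with its own error rate
decisions = np.array([np.where(rng.random(500) < p, 1 - labels, labels)
                      for p in rng.uniform(0.05, 0.3, size=10)])
print(best_combination(decisions, labels, k=3))
```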

Relevance: 10.00%

Abstract:

During the last several decades, the quality of natural resources and their services has been exposed to significant degradation from increased urban populations combined with the sprawl of settlements, the development of transportation networks and industrial activities (Dorsey, 2003; Pauleit et al., 2005). As a result of this environmental degradation, a sustainable framework for urban development is required to provide for the resilience of natural resources and ecosystems. Sustainable urban development refers to the management of cities with adequate infrastructure to support the needs of their populations for present and future generations as well as to maintain the sustainability of their ecosystems (UNEP/IETC, 2002; Yigitcanlar, 2010). One of the important strategic approaches for planning sustainable cities is ‘ecological planning’. Ecological planning is a multi-dimensional concept that aims to preserve biodiversity richness and ecosystem productivity through the sustainable management of natural resources (Barnes et al., 2005). As stated by Baldwin (1985, p. 4), ecological planning is the initiation and operation of activities to direct and control the acquisition, transformation, disruption and disposal of resources in a manner capable of sustaining human activities with a minimum disruption of ecosystem processes. Ecological planning is therefore a powerful method for creating sustainable urban ecosystems. In order to explore the city as an ecosystem and investigate the interaction between the urban ecosystem and human activities, a holistic urban ecosystem sustainability assessment approach is required. Urban ecosystem sustainability assessment serves as a tool that helps policy- and decision-makers improve their actions towards sustainable urban development. There are several methods used in urban ecosystem sustainability assessment, among which sustainability indicators and composite indices are the most commonly used tools for assessing progress towards sustainable land use and urban management. Currently, a variety of composite indices are available to measure sustainability at the local, national and international levels. However, the main conclusion drawn from the literature review is that they are too broad to be applied to assess local and micro-level sustainability, and that benchmark values for most of the indicators do not exist due to limited data availability and non-comparable data across countries. Mayer (2008, p. 280) makes this point by stating that "as different as the indices may seem, many of them incorporate the same underlying data because of the small number of available sustainability datasets". Mori and Christodoulou (2011) also argue that this relative evaluation and comparison brings biased assessments, as data only exist for some entities, which also means excluding many nations from evaluation and comparison. Thus, there is a need to develop an accurate and comprehensive micro-level urban ecosystem sustainability assessment method. In order to develop such a model, it is practical to adopt an approach that uses indicators to collect data, designates certain threshold values or ranges, performs a comparative sustainability assessment via indices at the micro-level, and aggregates these assessment findings to the local level.
Hereby, through this approach and model, it is possible to produce sufficient and reliable data to enable comparison at the local level, and to provide useful results to inform the local planning, conservation and development decision-making process in order to secure sustainable ecosystems and urban futures. To advance research in this area, this study investigated the environmental impacts of an existing urban context by using a composite index, with the aim of identifying the interaction between urban ecosystems and human activities in the context of environmental sustainability. In this respect, this study developed a new comprehensive urban ecosystem sustainability assessment tool entitled the ‘Micro-level Urban-ecosystem Sustainability IndeX’ (MUSIX). The MUSIX model is an indicator-based indexing model that investigates the factors affecting urban sustainability in a local context. The model outputs provide local and micro-level sustainability reporting guidance to help policy-making concerning environmental issues. A multi-method research approach, based on both quantitative and qualitative analysis, was employed in the construction of the MUSIX model. First, qualitative research was conducted through an interpretive and critical literature review to develop a theoretical framework and select indicators. Afterwards, quantitative research was conducted through statistical and spatial analyses for data collection, processing and model application. The MUSIX model was tested in four pilot study sites selected from the Gold Coast City, Queensland, Australia. The model results detected the sustainability performance of current urban settings with reference to six main issues of urban development: (1) hydrology, (2) ecology, (3) pollution, (4) location, (5) design and (6) efficiency. For each category, a set of core indicators was assigned, which are intended to: (1) benchmark the current situation, strengths and weaknesses; (2) evaluate the efficiency of implemented plans; and (3) measure the progress towards sustainable development. While the indicator set of the model provided specific information about the environmental impacts in the area at the parcel scale, the composite index score provided general information about the sustainability of the area at the neighbourhood scale. Finally, in light of the model findings, integrated ecological planning strategies were developed to guide the preparation and assessment of development and local area plans in conjunction with the Gold Coast Planning Scheme, which establishes regulatory provisions to achieve ecological sustainability through the formulation of place codes, development codes, constraint codes and other assessment criteria that provide guidance for best-practice development solutions.
These relevant strategies can be summarised as follows:
• Establishing hydrological conservation through sustainable stormwater management in order to preserve the Earth’s water cycle and aquatic ecosystems;
• Providing ecological conservation through sustainable ecosystem management in order to protect biological diversity and maintain the integrity of natural ecosystems;
• Improving environmental quality through developing pollution prevention regulations and policies in order to promote high-quality water resources, clean air and enhanced ecosystem health;
• Creating sustainable mobility and accessibility through designing better local services and walkable neighbourhoods in order to promote safe environments and healthy communities;
• Sustainable design of the urban environment through climate-responsive design in order to increase the efficient use of solar energy to provide thermal comfort; and
• Use of renewable resources through creating efficient communities in order to provide long-term management of natural resources for the sustainability of future generations.
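
To make the indicator-to-index aggregation concrete, the following is a minimal sketch of an indicator-based composite index in the spirit of MUSIX, scoring parcel-level indicators against threshold ranges and averaging up to a neighbourhood value; the indicator names, threshold ranges, weights and linear scoring scheme are all illustrative assumptions, not the published model.

```python
import numpy as np

# Hypothetical indicators: each maps to a (worst, best) threshold pair and a weight.
INDICATORS = {
    "impervious_fraction":    ((1.0, 0.0), 0.25),    # hydrology
    "tree_canopy_fraction":   ((0.0, 0.5), 0.25),    # ecology
    "distance_to_transit_m":  ((1600.0, 0.0), 0.25), # location
    "dwelling_energy_rating": ((0.0, 10.0), 0.25),   # efficiency
}

def score_indicator(value, worst, best):
    """Linear score in [0, 1]: 0 at the 'worst' threshold, 1 at the 'best'."""
    return float(np.clip((value - worst) / (best - worst), 0.0, 1.0))

def parcel_index(parcel):
    """Weighted sum of indicator scores for one parcel."""
    return sum(weight * score_indicator(parcel[name], worst, best)
               for name, ((worst, best), weight) in INDICATORS.items())

def neighbourhood_index(parcels):
    """Aggregate parcel scores to a neighbourhood-scale composite index."""
    return float(np.mean([parcel_index(p) for p in parcels]))

parcels = [
    {"impervious_fraction": 0.7, "tree_canopy_fraction": 0.1,
     "distance_to_transit_m": 400.0, "dwelling_energy_rating": 6.0},
    {"impervious_fraction": 0.4, "tree_canopy_fraction": 0.3,
     "distance_to_transit_m": 900.0, "dwelling_energy_rating": 7.5},
]
print(neighbourhood_index(parcels))
```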

Relevance: 10.00%

Abstract:

Currently there is a lack of choice when selecting synthetic materials with the cell-instructive properties demanded by modern biomaterials. The purpose of this study was to investigate the attachment of cells onto hydrogels prepared from poly(2-oxazoline)s selectively-functionalized with cell adhesion motifs. A water-soluble macromer based on the microwave-assisted cationic ring-opening polymerization of 2-methyl-2-oxazoline and 2-(dec-9-enyl)-2-oxazoline was functionalized with the peptide CRGDSG or controls using thiol-ene photochemistry followed by facile crosslinking in the presence of a dithiol crosslinker. The growth of human fibroblasts on the hydrogel surfaces was dictated by the structure and amount of incorporated peptide. Controls without any peptide showed resistance to cellular attachment. The benignity of the crosslinking conditions was demonstrated by the incorporation of fibroblasts within the hydrogels to produce three-dimensional cell-polymer constructs.

Relevance: 10.00%

Abstract:

Detailed mechanisms for the formation of hydroxyl or alkoxyl radicals in the reactions between tetrachloro-p-benzoquinone (TCBQ) and organic hydroperoxides are crucial for better understanding the potential carcinogenicity of polyhalogenated quinones. Herein, the mechanism of the reaction between TCBQ and H2O2 has been systematically investigated at the B3LYP/6-311++G** level of theory in the presence of different numbers of water molecules. We report that the whole reaction can easily take place with the assistance of explicit water molecules. Namely, an initial intermediate is formed first. After that, a nucleophilic attack of H2O2 onto TCBQ occurs, which results in the formation of a second intermediate that contains an OOH group. Subsequently, this second intermediate decomposes homolytically through cleavage of the O-O bond to produce a hydroxyl radical. Energy analyses suggest that the nucleophilic attack is the rate-determining step in the whole reaction. The participation of explicit water molecules promotes the reaction significantly, which can be used to explain the experimental phenomena. In addition, the effects of F, Br, and CH3 substituents on this reaction have also been studied.
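
The abstract does not name the quantum chemistry package used; purely as an illustration of the stated level of theory, the sketch below runs a single-point B3LYP/6-311++G** calculation on hydrogen peroxide with the open-source psi4 package, using an approximate geometry.

```python
import psi4

psi4.set_memory("2 GB")

# Approximate H2O2 geometry (Z-matrix; bond lengths in Angstrom, angles in degrees)
h2o2 = psi4.geometry("""
0 1
O
O 1 1.45
H 1 0.97 2 100.0
H 2 0.97 1 100.0 3 115.0
""")

psi4.set_options({"basis": "6-311++G**"})
energy = psi4.energy("b3lyp")
print("B3LYP/6-311++G** energy (Hartree):", energy)
```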

Relevance: 10.00%

Abstract:

This PhD practice-led research inquiry sets out to examine and describe how the fluid interactions between memory and time can be rendered via the remediation of my painting and the construction of a digital image archive. My abstract digital art and handcrafted practice is informed by Deleuze and Guattari’s rhizomics of becoming. I aim to show that the technological mobility of my creative strategies produces new conditions of artistic possibility through the mobile principles of rhizomic interconnection, multiplicity and diversity. Subsequently, through the ongoing modification of past painting, I map how emergent forms and ideas open up new and incisive engagements with the experience of a ‘continual present’. The deployment of new media and cross-media processes in my art also deterritorialises the modernist notion of painting as a static and two-dimensional spatial object. Instead, it shows painting in a postmodern field of dynamic and transformative intermediality through digital formats of still and moving images that re-imagines the relationship between memory, time and creative practice.