115 results for room set up


Relevance:

100.00%

Publisher:

Abstract:

Adam Cass's I Love You, Bro is an engaging portrayal of just how far some young people can go in constructing fantasy worlds online. The play is, according to Cass, based on the case of two teenage boys in Britain in the early 2000s. Troubled teen Johnny lives at home with his mother and her new partner. Lurking in an online chat room one day, he strikes up a conversation with MarkyMark, a slightly older soccer-playing boy from the popular crowd in his own local town, who mistakes him for a girl. The plot unfolds from this one moment of mistaken identity. Johnny concocts an increasingly tenuous series of characters, plot twists and intrigues to try to maintain his relationship with MarkyMark and deal with the lie at the heart of his first love, eventually conspiring - as he tells us from the first moment - to cause his own murder.

Relevance:

100.00%

Publisher:

Abstract:

There’s a diagram that does the rounds online that neatly sums up the difference between the quality of equipment used in the studio to produce music, and the quality of the listening equipment used by the consumer...

Relevance:

100.00%

Publisher:

Abstract:

The Commission has been asked to identify appropriate options for reducing entry and exit barriers including advice on the potential impacts of the personal/corporate insolvency regimes on business exits...

Relevance:

100.00%

Publisher:

Abstract:

The Commission has released a Draft Report on Business Set-Up, Transfer and Closure for public consultation and input. It is pleasing to note that three chapters of the Draft Report address aspects of personal and corporate insolvency. Nevertheless, we continue to submit to national policy inquiries and discussions that a comprehensive review of the regulation of insolvency and restructuring in Australia should be undertaken. The last comprehensive review of the insolvency system, by the Australian Law Reform Commission (the Harmer Report), was handed down in 1988. While aspects of our insolvency laws have been reviewed since that time, none of those reviews has provided the clear and comprehensive analysis that a more considered review can deliver. Such a review ought to be conducted by the Australian Law Reform Commission or a similar independent panel set up for the task. We also suggest that there is a lack of data available to assist with addressing the questions raised by the Draft Report. There is a need to invest in finding out, in a rigorous and informed way, how the current law operates. Until there is a willingness to make a public investment in such research, with less reliance on anecdotal evidence (often from well-meaning but ultimately inadequately informed participants and others), the government cannot be sure that the current insolvency regime is the most effective one to underpin Australia's commercial and financial dealings, nor that any change is justified. We also submit that there would be benefits in a serious investigation of a merged regulatory architecture for personal and corporate insolvency and a combined personal and corporate insolvency regulator.

Relevance:

100.00%

Publisher:

Abstract:

The microbially mediated production of nitrous oxide (N2O) and its reduction to dinitrogen (N2) via denitrification represents a loss of nitrogen (N) from fertilised agro-ecosystems to the atmosphere. Although denitrification has received great interest from biogeochemists in recent decades, the magnitude of N2 losses and the related N2:N2O ratios from soils are still largely unknown due to methodological constraints. We present a novel 15N tracer approach, based on a previously developed tracer method for studying denitrification in pure bacterial cultures, which was modified for use in soil incubations in a completely automated laboratory set-up. The method replaces the background air in the incubation vessels with a helium-oxygen gas mixture with a 50-fold reduced N2 background (2% v/v). This allows a direct and sensitive quantification of N2 and N2O emissions from the soil with isotope-ratio mass spectrometry after 15N labelling of denitrification N substrates, while minimising the sensitivity to the intrusion of atmospheric N2. The incubation set-up was used to determine the influence of different soil moisture levels on N2 and N2O emissions from a sub-tropical pasture soil in Queensland, Australia. The soil was labelled with an equivalent of 50 μg N per gram of dry soil by broadcast application of KNO3 solution (4 at.% 15N) and incubated for 3 days at 80% and 100% water-filled pore space (WFPS), respectively. The headspace of the incubation vessel was sampled automatically over 12 hrs each day, and 3 samples (0, 6, and 12 hrs after incubation start) of headspace gas were analysed for N2 and N2O with an isotope-ratio mass spectrometer (DELTA V Plus, Thermo Fisher Scientific, Bremen, Germany). In addition, the soil was analysed for 15N NO3- and NH4+ using the 15N diffusion method, which enabled us to obtain a complete N balance. The method proved to be highly sensitive, detecting N2O emissions ranging from 20 to 627 μg N kg-1 soil hr-1 and N2 emissions ranging from 4.2 to 43 μg N kg-1 soil hr-1 for the different treatments. The main end-product of denitrification was N2O for both water contents, with N2 accounting for 9% and 13% of the total denitrification losses at 80% and 100% WFPS, respectively. Between 95% and 100% of the added 15N fertiliser could be recovered. Gross nitrification over the 3 days amounted to 8.6 and 4.7 μg N g-1 soil, and denitrification to 4.1 and 11.8 μg N g-1 soil, at 80% and 100% WFPS, respectively. The results confirm that the tested method allows a direct and highly sensitive detection of N2 and N2O fluxes from soils and hence offers a sensitive tool to study denitrification and N turnover in terrestrial agro-ecosystems.
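As a quick illustration of the flux bookkeeping behind figures like those above, the sketch below computes the total denitrification loss, the N2 fraction and the N2:N2O product ratio from a pair of measured emission rates. The input rates are hypothetical, not the study's data.

```python
# Minimal sketch: partition denitrification losses between N2O and N2.
# The flux values passed in below are hypothetical, not measurements from the study.

def denitrification_partition(n2o_flux, n2_flux):
    """Both fluxes in the same units, e.g. ug N per kg soil per hour."""
    total = n2o_flux + n2_flux
    n2_fraction = n2_flux / total      # share of denitrified N escaping as N2
    n2_to_n2o = n2_flux / n2o_flux     # product ratio
    return total, n2_fraction, n2_to_n2o

total, frac, ratio = denitrification_partition(n2o_flux=300.0, n2_flux=40.0)
print(f"total loss {total:.0f} ug N/kg/hr, N2 fraction {frac:.0%}, N2:N2O = {ratio:.2f}")
```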

Relevance:

100.00%

Publisher:

Abstract:

Purpose: This study evaluated the impact of patient set-up errors on the probability of pulmonary and cardiac complications in the irradiation of left-sided breast cancer. Methods and Materials: Using the NTCP algorithm of the CMS XiO Version 4.6 (CMS Inc., St Louis, MO) radiotherapy planning system and the Lyman-Kutcher-Burman (LKB) model, we calculated the DVH indices for the ipsilateral lung and heart and the resultant normal tissue complication probabilities (NTCP) for radiation-induced pneumonitis and excess cardiac mortality in 12 left-sided breast cancer patients. Results: Isocenter shifts in the posterior direction had the greatest effect on the lung V20, heart V25, and mean and maximum doses to the lung and the heart. Dose volume histogram (DVH) results show that the ipsilateral lung V20 tolerance was exceeded in 58% of the patients after 1 cm posterior shifts. Similarly, the heart V25 tolerance was exceeded after 1 cm antero-posterior and left-right isocentric shifts in 70% of the patients. The baseline NTCPs for radiation-induced pneumonitis ranged from 0.73% to 3.4%, with a mean value of 1.7%. The maximum reported NTCP for radiation-induced pneumonitis was 5.8% (mean 2.6%) after a 1 cm posterior isocentric shift. The NTCP for excess cardiac mortality was 0% in all patients (n = 12) before and after set-up error simulations. Conclusions: Set-up errors in left-sided breast cancer patients have a statistically significant impact on the lung NTCPs and DVH indices. However, with a central lung distance of 3 cm or less (CLD < 3 cm) and a maximum heart distance of 1.5 cm or less (MHD < 1.5 cm), the treatment plans could tolerate set-up errors of up to 1 cm without any change in the NTCP to the heart.
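For readers unfamiliar with the Lyman-Kutcher-Burman model cited above, the sketch below shows the standard gEUD-based form of the calculation. The DVH bins are hypothetical, and the n, m and TD50 values are a commonly published lung parameter set, not necessarily those used by the XiO planning system in this study.

```python
import math

def lkb_ntcp(dose_bins_gy, frac_volumes, n, m, td50_gy):
    """Lyman-Kutcher-Burman NTCP from a differential DVH.

    dose_bins_gy  -- dose of each DVH bin (Gy)
    frac_volumes  -- fractional organ volume in each bin (sums to 1)
    n, m, td50_gy -- LKB volume-effect, slope and tolerance parameters
    """
    # Generalised equivalent uniform dose (Kutcher-Burman DVH reduction)
    geud = sum(v * d ** (1.0 / n) for d, v in zip(dose_bins_gy, frac_volumes)) ** n
    # Lyman probit function
    t = (geud - td50_gy) / (m * td50_gy)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# Hypothetical lung DVH: 30% of volume at 5 Gy, 40% at 15 Gy, 30% at 45 Gy
ntcp_pneumonitis = lkb_ntcp([5.0, 15.0, 45.0], [0.3, 0.4, 0.3],
                            n=0.87, m=0.18, td50_gy=24.5)  # published lung parameter set
print(f"NTCP (radiation pneumonitis): {ntcp_pneumonitis:.1%}")
```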

Relevance:

100.00%

Publisher:

Abstract:

The potential to cultivate new relationships with spectators has long been cited as a primary motivator for those using digital technologies to construct networked or telematic performances or para-performance encounters in which performers and spectators come together in virtual – or at least virtually augmented – spaces and places. Today, with Web 2.0 technologies such as social media platforms becoming increasingly ubiquitous, and increasingly easy to use, more and more theatre makers are developing digitally mediated relationships with spectators. Sometimes this is for the purpose of an aesthetic encounter, sometimes for a critical encounter, and sometimes as part of an audience politicisation, development or engagement agenda. Sometimes it is because this is genuinely an interest, and sometimes because spectators or funding bodies expect at least some engagement via Facebook, Twitter or Instagram. In this paper, I examine peculiarities and paradoxes emerging in some of these efforts to engage spectators via networked performance or para-performance encounters. I use examples ranging from theatre, to performance art, to political activism – from 'cyberformances' on Helen Varley Jamieson's Upstage Avatar Performance Platform, to Wafaa Bilal's Domestic Tension installation, where spectators around the world could use a webcam in a chat room to target him with paintballs while he was in residence in a living room set up in a gallery for a week, as a comment on the use of drone technology in war, to Liz Crow's Bedding Out, where she invited people to physically and virtually join her in her bedroom to discuss the impact of an anti-disabled austerity politics emerging in her country, to Dislife's use of holograms of disabled people popping up in disabled parking spaces when able-bodied drivers attempted to pull into them, amongst others. I note the frequency with which these performance practices deploy discourses of democratisation, participation, power and agency to argue that these technologies assist in positioning spectators as co-creators actively engaged in the evolution of a performance (and, in politicised pieces that point to racism, sexism, or ableism, push spectators to reflect on their agency in that dramatic or daily-cum-dramatic performance of prejudice). I investigate how a range of issues – from the scenographic challenges in deploying networked technologies for both participant and bystander audiences that others have already noted, to the siloisation of aesthetic, critical and audience activation activities on networked technologies, to conventionalised dramaturgies of response informed by power, politics and impression management that play out in online as much as offline performances, to the high personal, social and professional stakes involved in participating in a form where spectators' responses are almost always documented, recorded and re-represented to secondary and tertiary sets of spectators via the circulation into new networks that social media platforms so readily facilitate – complicates discourses of democratic co-creativity associated with networked performance and para-performance activities.

Relevance:

90.00%

Publisher:

Abstract:

This thesis develops a detailed conceptual design method and a system software architecture defined with a parametric and generative evolutionary design system to support an integrated interdisciplinary building design approach. The research recognises the need to shift design effort toward the earliest phases of the design process to support crucial design decisions that have a substantial cost implication for the overall project budget. The overall motivation of the research is to improve the quality of designs produced at the author's employer, the General Directorate of Major Works (GDMW) of the Saudi Arabian Armed Forces. GDMW produces many buildings that have standard requirements across a wide range of environmental and social circumstances. A rapid means of customising designs for local circumstances would have significant benefits. The research considers the use of evolutionary genetic algorithms in the design process and the ability to generate and assess a wider range of potential design solutions than a human could manage. This wider-ranging assessment, during the early stages of the design process, means that the generated solutions will be more appropriate for the defined design problem. The research proposes a design method and system that promote a collaborative relationship between human creativity and computer capability. The tectonic design approach is adopted as a process-oriented design that values the process of design as much as the product. The aim is to connect the evolutionary systems to performance assessment applications, which are used as prioritised fitness functions. This produces design solutions that respond to their environmental and functional requirements. This integrated, interdisciplinary approach to design produces solutions through a design process that considers and balances the requirements of all aspects of the design. Since this thesis covers a wide area of research material, a 'methodological pluralism' approach was used, incorporating both prescriptive and descriptive research methods. Multiple models of research were combined, and the overall research was undertaken in three main stages: conceptualisation, development and evaluation. The first two stages lay the foundations for the specification of the proposed system, where key aspects of the system that have not previously been proven in the literature were implemented to test the feasibility of the system. As a result of combining the existing knowledge in the area with the newly verified key aspects of the proposed system, this research can form the basis for a future software development project. The evaluation stage, which includes building the prototype system to test and evaluate system performance against the criteria defined in the earlier stage, is not within the scope of this thesis. The research results in a conceptual design method and a proposed system software architecture. The proposed system is called the 'Hierarchical Evolutionary Algorithmic Design (HEAD) System'. The HEAD system has been shown to be feasible through an initial illustrative paper-based simulation. The HEAD system consists of two main components: the 'Design Schema' and the 'Synthesis Algorithms'. The HEAD system reflects the major research contribution in the way it is conceptualised, while secondary contributions are achieved within the system components.
The design schema provides constraints on the generation of designs, thus enabling the designer to create a wide range of potential designs that can then be analysed for desirable characteristics. The design schema supports the digital representation of the designer's human creativity within a dynamic design framework that can be encoded and then executed through the use of evolutionary genetic algorithms. The design schema incorporates 2D and 3D geometry and graph theory for space layout planning and building formation, using the Lowest Common Design Denominator (LCDD) of a parameterised 2D module and a 3D structural module. This provides a bridge between the standard adjacency requirements and the evolutionary system. The use of graphs as an input to the evolutionary algorithm supports the introduction of constraints in a way that is not supported by standard evolutionary techniques. The process of design synthesis is guided by a higher-level description of the building that supports geometrical constraints. The Synthesis Algorithms component analyses designs at four levels: 'Room', 'Layout', 'Building' and 'Optimisation'. At each level, multiple fitness functions are embedded in the genetic algorithm to target the specific requirements of the relevant decomposed part of the design problem. Decomposing the design problem so that the design requirements of each level are dealt with separately, and then reassembling them in a bottom-up approach, reduces the generation of non-viable solutions by constraining the options available at the next higher level. The iterative approach, in which the range of design solutions is explored through modification of the design schema as the understanding of the design problem improves, assists in identifying conflicts in the design requirements. Additionally, the hierarchical set-up allows the embedding of multiple fitness functions in the genetic algorithm, each relevant to a specific level. This supports an integrated multi-level, multi-disciplinary approach. The HEAD system promotes a collaborative relationship between human creativity and computer capability. The design schema component, as the input to the procedural algorithms, enables the encoding of certain aspects of the designer's subjective creativity. By focusing on finding solutions to the relevant sub-problems at the appropriate levels of detail, the hierarchical nature of the system assists in the design decision-making process.
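To make the idea of embedding multiple weighted fitness functions in a genetic algorithm concrete, here is a minimal toy sketch. The genome (a room's width and depth), the two fitness functions and their weights are hypothetical illustrations of the principle, not the HEAD system's actual Room-level encoding.

```python
import random

# Toy sketch of the multi-fitness idea: a genome is evolved against several
# weighted fitness functions. All names, targets and weights are hypothetical.

def fitness_area(genome):        # penalise deviation from a target room area of 20 m2
    width, depth = genome
    return -abs(width * depth - 20.0)

def fitness_proportion(genome):  # prefer aspect ratios close to 1:1.6
    width, depth = genome
    return -abs(width / depth - 1.6)

FITNESS_FUNCS = [(fitness_area, 0.7), (fitness_proportion, 0.3)]

def combined_fitness(genome):
    return sum(w * f(genome) for f, w in FITNESS_FUNCS)

def evolve(pop_size=50, generations=100):
    pop = [(random.uniform(2, 10), random.uniform(2, 10)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=combined_fitness, reverse=True)
        parents = pop[: pop_size // 2]                  # truncation selection
        children = [(max(0.5, w + random.gauss(0, 0.2)),
                     max(0.5, d + random.gauss(0, 0.2)))  # Gaussian mutation, clamped
                    for w, d in random.choices(parents, k=pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=combined_fitness)

print(evolve())  # best (width, depth) found
```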

Relevance:

90.00%

Publisher:

Abstract:

Providing help with research degree writing within a formal structure is difficult because research students come into their degree with widely varying needs and levels of experience. Providing writing assistance within a less structured learning context is an approach which has been trialled in higher education with promising results (Boud, Cohen & Sampson, 2001; Stracke, 2010; Devendish et al., 2009). While semi-structured approaches have been the subject of study, little attention has been paid to the processes of informal learning which exist within doctoral education. In this paper we explore a 'writing movement' which has started to be taken up at various locations in Australia through the auspices of social media (Twitter and Facebook). 'Shut Up and Write' is a concept first used in the cafe scene in San Francisco, where writers converge at a specific time and place and write together, without showing each other the outcomes, temporarily transforming writing from a solitary practice into a social one. In this paper we compare the experience of facilitating Shut Up and Write sessions in two locations: RMIT University and Queensland University of Technology. The authors describe the set-up and functioning of the different groups and report on feedback from regular participants, both physical and virtual. We suggest that informal learning practices can be exploited to assist research students to orient themselves to the university environment and share vital technical skills, with very minimal input from academic staff. This experience suggests there is untapped potential within these kinds of activities to promote learning within the research degree experience which is sustainable and builds a stronger sense of community.

Relevance:

90.00%

Publisher:

Abstract:

Introduction: The motivation for developing megavoltage (and kilovoltage) cone beam CT (MV CBCT) capabilities in the radiotherapy treatment room was primarily based on the need to improve patient set-up accuracy. There has recently been interest in using the cone beam CT data for treatment planning. Accurate treatment planning, however, requires knowledge of the electron density of the tissues receiving radiation in order to calculate dose distributions. This is obtained from CT, utilising a conversion between CT number and the electron density of various tissues. The use of MV CBCT has particular advantages compared to treatment planning with kilovoltage CT in the presence of high atomic number materials, and requires the conversion of pixel values from the image sets to electron density. Therefore, a study was undertaken to characterise the pixel value to electron density (ED) relationship for the Siemens MV CBCT system, MVision, and to determine the effect, if any, of differing the number of monitor units (MU) used for acquisition. If a significant difference with the number of monitor units were seen, then pixel value to ED conversions might be required for each of the clinical settings. The calibration of the MV CT images for electron density offers the possibility of a daily recalculation of the dose distribution and the introduction of new adaptive radiotherapy treatment strategies. Methods: A Gammex Electron Density CT Phantom was imaged with the MV CBCT system. The pixel value for each of the sixteen inserts, which ranged from 0.292 to 1.707 in relative electron density to the background solid water, was determined by taking the mean value from within a region of interest centred on the insert, over 5 slices within the centre of the phantom. These results were averaged and plotted against the relative electron densities of each insert, and a linear least squares fit was performed. This procedure was performed for images acquired with 5, 8, 15 and 60 monitor units. Results: The linear relationship between MV CT pixel value and ED was demonstrated for all monitor unit settings and over a range of electron densities. The number of monitor units utilised was found to have no significant impact on this relationship. Discussion: It was found that the number of MU utilised does not significantly alter the pixel value obtained for different ED materials. However, to ensure the most accurate and reproducible MV to ED calibration, one MU setting should be chosen and used routinely. To ensure accuracy for the clinical situation, this MU setting should correspond to that which is used clinically. If more than one MU setting is used clinically, then an average of the CT values acquired with different numbers of MU could be utilised without loss of accuracy. Conclusions: No significant differences have been shown in the pixel value to ED conversion for the Siemens MV CT cone beam unit with change in monitor units. Thus a single conversion curve could be utilised for MV CT treatment planning. To fully utilise MV CT imaging for radiotherapy treatment planning, further work will be undertaken to ensure all corrections have been made and dose calculations verified. These dose calculations may be either for treatment planning purposes or for reconstructing the delivered dose distribution from transit dosimetry measurements made using electronic portal imaging devices. This will potentially allow the cumulative dose distribution to be determined through the patient's multi-fraction treatment, and adaptive treatment strategies developed to optimise the tumour response.
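The calibration described above reduces to a linear least-squares fit of pixel value against known relative electron density, repeated per monitor-unit setting; a minimal sketch follows. The pixel values and the subset of insert densities below are hypothetical (only the 0.292 to 1.707 range comes from the abstract).

```python
import numpy as np

# Sketch of the calibration step: mean ROI pixel values for phantom inserts are
# fitted against known relative electron densities (RED), once per MU setting.
# All numbers are hypothetical illustrations, not the study's measurements.

relative_ed = np.array([0.292, 0.68, 1.0, 1.28, 1.707])   # known insert REDs (subset)
pixel_values = {                                           # mean ROI pixel value per MU setting
    5:  np.array([-705, -320, 2, 280, 713]),
    60: np.array([-698, -317, 5, 284, 709]),
}

for mu, pv in pixel_values.items():
    slope, intercept = np.polyfit(pv, relative_ed, 1)      # RED = slope * pixel + intercept
    print(f"{mu:>2} MU: RED = {slope:.3e} * pixel + {intercept:.3f}")
```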

Relevance:

90.00%

Publisher:

Abstract:

Textual document sets have become an important and rapidly growing information source on the web. Text classification is one of the crucial technologies for information organisation and management, and it has attracted wide attention from researchers in different fields. In this paper, feature selection methods, implementation algorithms and applications of text classification are first introduced. However, there is much noise in the knowledge extracted by current data-mining techniques for text classification, which leads to uncertainty in both knowledge extraction and knowledge usage; therefore, more innovative techniques and methods are needed to improve the performance of text classification. Further improving the process of knowledge extraction and the effective utilisation of the extracted knowledge remains a critical and challenging step. A Rough Set decision-making approach is proposed, using Rough Set decision techniques to more precisely classify the textual documents that are difficult to separate with classic text classification methods. The purpose of this paper is to give an overview of existing text classification technologies; to demonstrate the Rough Set concepts and the decision-making approach based on Rough Set theory for building a more reliable and effective text classification framework with higher precision; to set up an innovative evaluation metric named CEI, which is effective for the performance assessment of similar research; and to propose a promising research direction for addressing the challenging problems in text classification, text mining and other related fields.
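As a reminder of the Rough Set machinery the paper draws on, the sketch below computes the lower and upper approximations of a category from an indiscernibility relation over document features; documents in the boundary region are exactly the "difficult to separate" cases. The documents and features are invented for illustration, and the paper's CEI metric is not reproduced here.

```python
from collections import defaultdict

# Rough Set approximations: documents indiscernible on the chosen features form
# equivalence classes; a category is described by the classes certainly inside it
# (lower approximation) and those possibly inside it (upper approximation).
# The data below are hypothetical.

docs = {                         # doc id -> (feature tuple, labelled category)
    "d1": (("sport", "win"),  "sport"),
    "d2": (("sport", "win"),  "sport"),
    "d3": (("sport", "vote"), "politics"),
    "d4": (("sport", "vote"), "sport"),   # indiscernible from d3 but labelled differently
    "d5": (("vote",  "law"),  "politics"),
}

def approximations(docs, category):
    classes = defaultdict(set)                       # equivalence classes by feature tuple
    for doc_id, (features, _) in docs.items():
        classes[features].add(doc_id)
    target = {d for d, (_, label) in docs.items() if label == category}
    lower, upper = set(), set()
    for cls in classes.values():
        if cls & target:
            upper |= cls
            if cls <= target:
                lower |= cls
    return lower, upper, upper - lower               # boundary = uncertain documents

print(approximations(docs, "sport"))
```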

Relevance:

90.00%

Publisher:

Abstract:

As part of YANQ's decentralisation across the state, YANQ has set up 10 Networks across Queensland, with Facilitators based in each of the regions. We encourage you to get in contact with your local Facilitator if you would like to have input on workforce development or youth policy issues. CPLANs aim to create an ongoing and sustainable structure across ten regions in Queensland to support a consistent focus on: policy issues relevant to young people; and workforce development strategies for the youth sector from a local, regional and state perspective. The ten CPLANs fall under the existing structure of YANQ and utilise and leverage the comprehensive network of youth inter-agencies and networks across the state. The ten CPLANs are made up of representatives from the youth sector in each region who have an interest in contributing to policy development and workforce issues.

Relevance:

80.00%

Publisher:

Abstract:

Australian mosquitoes from which Japanese encephalitis virus (JEV) has been recovered (Culex annulirostris, Culex gelidus, and Aedes vigilax) were assessed for their ability to be infected with the ChimeriVax-JE vaccine, with the yellow fever vaccine virus 17D (YF 17D), from which the backbone of the ChimeriVax-JE vaccine is derived, and with JEV-Nakayama. None of the mosquitoes became infected after being fed orally with 6.1 log10 plaque-forming units (PFU)/mL of ChimeriVax-JE vaccine, which is greater than the peak viremia in vaccinees (mean peak viremia = 4.8 PFU/mL, range = 0-30 PFU/mL; mean duration 0.9 days, range = 0-11 days). Some members of all three species of mosquito became infected when fed on JEV-Nakayama, but only Ae. vigilax was infected when fed on YF 17D. The results suggest that none of these three species of mosquito is likely to set up secondary cycles of transmission of ChimeriVax-JE in Australia after feeding on a viremic vaccinee.

Relevance:

80.00%

Publisher:

Abstract:

Knowledge of the particle emission characteristics associated with forest fires and, in general, biomass burning is becoming increasingly important due to the impact of these emissions on human health. Of particular importance is developing a better understanding of the size distribution of particles generated from forest combustion under different environmental conditions, as well as the provision of emission factors for different particle size ranges. This study was aimed at quantifying particle emission factors from four types of wood found in South East Queensland forests: Spotted Gum (Corymbia citriodora), Red Gum (Eucalypt tereticornis), Blood Gum (Eucalypt intermedia), and Iron bark (Eucalypt decorticans), under controlled laboratory conditions. The experimental set-up included a modified commercial stove connected to a dilution system designed for the conditions of the study. Measurements of the particle number size distribution and concentration resulting from the burning of woods with a relatively homogeneous moisture content (in the range of 15 to 26%) and at different rates of burning were performed using a TSI Scanning Mobility Particle Sizer (SMPS) in the size range from 10 to 600 nm and a TSI Dust Trak for PM2.5. The results, in terms of the relationship between particle number size distribution and the different burning conditions for the different species, show that particle number emission factors and PM2.5 mass emission factors depend on the type of wood and the burning rate (fast or slow burning). The average particle number emission factors for fast burning conditions are in the range of 3.3 x 10^15 to 5.7 x 10^15 particles/kg, and for PM2.5 are in the range of 139 to 217 mg/kg.
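As a rough illustration of how emission factors of the kind reported above are commonly derived, the sketch below integrates a mean diluted concentration over a burn, scales by the dilution ratio and flow, and divides by the mass of wood burned. The formula is a generic textbook form and every number in it is hypothetical; it is not the study's measurement protocol.

```python
# Generic emission-factor bookkeeping: emitted quantity per kg of fuel burned.
# Flow, dilution ratio, burn time, fuel mass and concentrations are made-up values.

def emission_factor(mean_conc, flow_m3_per_min, burn_minutes, dilution_ratio, fuel_kg):
    """Emitted quantity per kg of fuel; units follow mean_conc (per m3)."""
    total_emitted = mean_conc * dilution_ratio * flow_m3_per_min * burn_minutes
    return total_emitted / fuel_kg

# Particle number EF: hypothetical mean diluted concentration of 2.0e13 particles/m3
ef_number = emission_factor(2.0e13, flow_m3_per_min=0.5, burn_minutes=45,
                            dilution_ratio=10, fuel_kg=1.2)
# PM2.5 mass EF: hypothetical mean diluted concentration of 1.0 mg/m3
ef_pm25 = emission_factor(1.0, flow_m3_per_min=0.5, burn_minutes=45,
                          dilution_ratio=10, fuel_kg=1.2)
print(f"{ef_number:.2e} particles/kg, {ef_pm25:.1f} mg/kg")
```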