988 results for Scatter plot

Relevance: 10.00%

Publisher:

Abstract:

This paper reports on the early stages of a design experiment in educational assessment that challenges the dichotomous legacy evident in many assessment activities. Combining social networking technologies with the sociology of education, the paper proposes that assessment activities are best understood as a negotiable field of exchange. In this design experiment, students, peers and experts engage in explicit, "front-end" assessment (Wyatt-Smith, 2008) to translate holistic judgments into institutional, and potentially economic, capital without adhering to long lists of pre-set criteria. This approach invites participants to use social networking technologies to judge creative works using scatter graphs, keywords and tag clouds. In doing so, assessors will refine their evaluative expertise and negotiate the characteristics of creative works from which criteria will emerge (Sadler, 2008). The real-time advantages of web-based technologies will aggregate, externalise and democratise this transparent method of assessment for most, if not all, creative works that can be represented in a digital format.

The most interesting questions that arise in patent law are the ones that test the boundaries of patentable subject matter. One of those questions has been put forward recently in the United States in an argument in favour of patenting the plots of fictional stories. United States attorney Andrew F Knight has claimed that storylines are patentable subject matter and should be recognised as such. What he claims is patentable is not the copyrightable expression of a written story or even a written outline of a plot but the underlying plot of a story itself. The commercial application of ‘storyline patents’, as he describes them, is said to be their exclusive use in books and movies. This article analyses the claims made and argues that storylines are not patentable subject matter under Australian law. It also contends that policy considerations, as well as the very nature of creative works, weigh against recognition of ‘storyline patents’.

We developed orthogonal least-squares techniques for fitting crystalline lens shapes, and used the bootstrap method to determine the uncertainties associated with the estimated vertex radii of curvature and asphericities of five different models. Three existing models were investigated, including one that uses two separate conics for the anterior and posterior surfaces, and two whole-lens models based on a modulated hyperbolic cosine function and on a generalized conic function. Two new models were proposed: one that uses two interdependent conics, and a polynomial-based whole-lens model. The models were used to describe the in vitro shape for a data set of twenty human lenses aged 7–82 years. The two-conic-surface model (7 mm zone diameter) and the interdependent-surfaces model had significantly lower merit functions than the other three models for the data set, indicating that they most likely describe human lens shape over a wide age range better than the other models (although the two-conic-surface model is unable to describe the lens equatorial region). Considerable differences were found between some models regarding estimates of radii of curvature and surface asphericities. The hyperbolic cosine model and the new polynomial-based whole-lens model had the best precision in determining the radii of curvature and surface asphericities of the five models considered. Most models found a significant increase in anterior, but not posterior, radius of curvature with age. Most models found a wide scatter of asphericities, with the asphericities usually positive and not significantly related to age. As the interdependent-surfaces model had a lower merit function than the three whole-lens models, there is further scope to develop an accurate model of the complete shape of human lenses of all ages. The results highlight the continued difficulty in selecting an appropriate model for the crystalline lens shape.
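The fitting-and-bootstrap procedure can be sketched as follows. A conic surface can be linearised as r² = 2Rz − (1+Q)z², where R is the vertex radius of curvature and Q the asphericity, and fitted by ordinary least squares (a simplified stand-in for the orthogonal least squares actually used). All data and parameter values below are synthetic, not the study's lenses:

```python
import numpy as np

def fit_conic(z, r2):
    # Linearised conic section: r^2 = 2*R*z - (1+Q)*z^2, where R is the
    # vertex radius of curvature and Q the asphericity.
    A = np.column_stack([z, z**2])
    coef, *_ = np.linalg.lstsq(A, r2, rcond=None)
    R = coef[0] / 2.0
    Q = -coef[1] - 1.0
    return R, Q

def bootstrap_se(z, r2, n_boot=500, seed=0):
    # Resample the data points with replacement and refit, giving
    # bootstrap standard errors for (R, Q).
    rng = np.random.default_rng(seed)
    n = len(z)
    fits = [fit_conic(z[idx], r2[idx])
            for idx in rng.integers(0, n, size=(n_boot, n))]
    return np.std(fits, axis=0)

# Synthetic anterior-surface profile: R = 10 mm, Q = -0.5 (illustrative)
rng = np.random.default_rng(1)
z = np.linspace(0.05, 1.0, 50)           # sag along the optical axis (mm)
r2 = 2*10.0*z - (1 - 0.5)*z**2           # squared semi-chord (mm^2)
r2 += rng.normal(0.0, 0.01, z.size)      # measurement noise

R_hat, Q_hat = fit_conic(z, r2)
se_R, se_Q = bootstrap_se(z, r2)
```

The bootstrap step mirrors the uncertainty analysis described above: the spread of refitted parameters across resamples estimates the standard error of each shape parameter.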

The effects of radiation backscattered from the secondary collimators into the monitor chamber in an Elekta linac (producing 6 and 10 MV photon beams) are investigated using BEAMnrc Monte Carlo simulations. The degree and effects of this backscattered radiation are assessed by evaluating the changes to the calculated dose in the monitor chamber, and by determining a correction factor for those changes. Additionally, the fluence and energy characteristics of particles entering the monitor chamber from the downstream direction are evaluated by examining BEAMnrc phase-space data. It is shown that the proportion of particles backscattered into the monitor chamber is small (<0.35%) for all field sizes studied. However, when the backscatter plate is removed from the model linac, these backscattered particles generate a noticeable increase in dose to the monitor chamber (up to approximately 2.4% for the 6 MV beam and up to 4.4% for the 10 MV beam). With its backscatter plate in place, the Elekta linac (operating at 6 and 10 MV) is subject to negligible variation of monitor chamber dose with field size. At these energies, output variations in photon beams produced by the clinical Elekta linear accelerator can be attributed to head scatter alone. Corrections for the field-size dependence of monitor chamber dose are therefore not necessary when running Monte Carlo simulations of the Elekta linac operating at 6 and 10 MV.
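The correction factor for monitor-chamber dose changes can be sketched as a simple normalisation to the reference field. The backscatter fractions below are illustrative stand-ins echoing the magnitude reported for the no-backscatter-plate case (~2.4% at 6 MV), not simulated values:

```python
import numpy as np

# Illustrative monitor-chamber doses: a constant forward component plus
# a backscatter component that grows as the collimators close
# (hypothetical values, in the spirit of the no-plate 6 MV results).
field_cm = np.array([3.0, 10.0, 20.0, 40.0])          # square field side
backscatter_frac = np.array([0.024, 0.015, 0.006, 0.0])

d_monitor = 1.0 + backscatter_frac    # dose per MU, relative units
cf = d_monitor[-1] / d_monitor        # correction vs the 40x40 reference
# cf < 1 for small fields: the monitor over-reads, so the simulated
# output per MU must be scaled down accordingly.
```

With the backscatter plate in place, the backscatter fractions are negligible and cf is effectively unity for all field sizes, which is the text's conclusion.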

Multivariate methods are required to assess the interrelationships among multiple, concurrent symptoms. We examined the conceptual and contextual appropriateness of commonly used multivariate methods for cancer symptom cluster identification. From 178 publications identified in an online database search of Medline, CINAHL, and PsycINFO, limited to articles published in English in the 10 years prior to March 2007, 13 cross-sectional studies met the inclusion criteria. Conceptually, common factor analysis (FA) and hierarchical cluster analysis (HCA) are appropriate for symptom cluster identification, but principal component analysis is not. As a basis for new directions in symptom management, FA methods are more appropriate than HCA. Principal axis factoring or maximum likelihood factoring, the scree plot, oblique rotation, and clinical interpretation are the recommended approaches to symptom cluster identification.
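The scree criterion recommended above amounts to inspecting the ordered eigenvalues of the symptom correlation matrix. A toy computation with synthetic data (two hypothetical latent clusters driving six observed symptoms; all values illustrative) shows the expected sharp drop after the true number of factors:

```python
import numpy as np

def scree_eigenvalues(X):
    # Eigenvalues of the correlation matrix of X (patients x symptoms),
    # sorted descending; the 'elbow' in this sequence guides how many
    # factors to retain.
    R = np.corrcoef(X, rowvar=False)
    return np.sort(np.linalg.eigvalsh(R))[::-1]

# Synthetic ratings: two latent factors, six symptoms, simple structure
rng = np.random.default_rng(0)
factors = rng.normal(size=(200, 2))
loadings = np.array([[1, 0], [1, 0], [1, 0],
                     [0, 1], [0, 1], [0, 1]], dtype=float)
X = factors @ loadings.T + 0.5 * rng.normal(size=(200, 6))

ev = scree_eigenvalues(X)   # two eigenvalues stand well above 1
```

Here the first two eigenvalues dominate and the rest fall well below one, so both the scree elbow and the eigenvalue-greater-than-one rule point to two clusters.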

The Tide Lords series of fantasy novels set out to examine the issue of immortality. Its purpose was to look at the desirability of immortality, specifically why people actively seek it. It was meant to examine the practicality of immortality: having got there, what does one do to pass the time with eternity to fill? I also wished to examine the notion of true immortality, that is, immortals who could not be killed. What I did not anticipate when embarking upon this series, and what did not become apparent until after the series had been sold to two major publishing houses in Australia and the US, was the strength of the immortality tropes. This series was intended to fly in the face of these tropes, but confronted with the reality of such a work, the Australian publishers baulked at the ideas presented, requesting that the series be rewritten with the tropes taken into consideration. They wanted immortals who could die, mortals who wanted to be immortal, and a hero with a sense of humour. This exegesis aims to explore where these tropes originated. It also discusses how I negotiated a way around the tropes and was eventually able to please the publishers by appearing to adhere to the tropes while still staying true to the story I wanted to tell. As such, this discussion is, in part, an analysis of how an author negotiates the tensions around writing within a genre while trying to innovate within it.

An experimental investigation has been made of a round, non-buoyant plume of nitric oxide, NO, in a turbulent grid flow of ozone, O3, using the Turbulent Smog Chamber at the University of Sydney. The measurements have been made at a resolution not previously reported in the literature. The reaction is conducted at non-equilibrium, so there is significant interaction between turbulent mixing and chemical reaction. The plume has been characterized by a set of constant initial reactant concentration measurements consisting of radial profiles at various axial locations. Whole-plume behaviour can thus be characterized, and parameters are selected for a second set of fixed physical location measurements in which the effects of varying the initial reactant concentrations are investigated. Careful experiment design and specially developed chemiluminescent analysers, which measure fluctuating concentrations of reactive scalars, ensure that spatial and temporal resolutions are adequate to measure the quantities of interest. Conserved scalar theory is used to define a conserved scalar from the measured reactive scalars and to define frozen, equilibrium and reaction-dominated cases for the reactive scalars. Reactive scalar means and the mean reaction rate are bounded by the frozen and equilibrium limits, but this is not always the case for the reactant variances and covariances. The plume reactant statistics are closer to the equilibrium limit than those for the ambient reactant. The covariance term in the mean reaction rate is found to be negative and significant for all measurements made. The Toor closure was found to overestimate the mean reaction rate by 15 to 65%. Gradient-model turbulent diffusivities had significant scatter and were not observed to be affected by reaction. The ratio of turbulent diffusivities for the conserved scalar mean and that for the r.m.s. was found to be approximately 1.
Estimates of the ratio of the dissipation timescales of around 2 were found downstream. Estimates of the correlation coefficient between the conserved scalar and its dissipation (parallel to the mean flow) were found to be between 0.25 and the significant value of 0.5. Scalar dissipations for non-reactive and reactive scalars were found to be significantly different. Conditional statistics are found to be a useful way of investigating the reactive behaviour of the plume, effectively decoupling the interaction of chemical reaction and turbulent mixing. It is found that conditional reactive scalar means lack significant transverse dependence as has previously been found theoretically by Klimenko (1995). It is also found that conditional variance around the conditional reactive scalar means is relatively small, simplifying the closure for the conditional reaction rate. These properties are important for the Conditional Moment Closure (CMC) model for turbulent reacting flows recently proposed by Klimenko (1990) and Bilger (1993). Preliminary CMC model calculations are carried out for this flow using a simple model for the conditional scalar dissipation. Model predictions and measured conditional reactive scalar means compare favorably. The reaction dominated limit is found to indicate the maximum reactedness of a reactive scalar and is a limiting case of the CMC model. Conventional (unconditional) reactive scalar means obtained from the preliminary CMC predictions using the conserved scalar p.d.f. compare favorably with those found from experiment except where measuring position is relatively far upstream of the stoichiometric distance. Recommendations include applying a full CMC model to the flow and investigations both of the less significant terms in the conditional mean species equation and the small variation of the conditional mean with radius. 
Forms for the p.d.f.s, in addition to those found from experiments, could be useful for extending the CMC model to reactive flows in the atmosphere.
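The negative covariance term in the mean reaction rate can be made concrete with synthetic concentrations. For the second-order NO + O3 reaction, the exact decomposition is ⟨c1·c2⟩ = ⟨c1⟩⟨c2⟩ + cov(c1, c2); segregation of the reactants makes the covariance negative, so the product of the means alone overestimates the true mean rate. The mapping from a uniform "mixing" variable below is purely illustrative:

```python
import numpy as np

# Synthetic demonstration that reactant segregation gives a negative
# covariance, so <c1*c2> < <c1><c2> (rate constant k omitted throughout).
rng = np.random.default_rng(0)
mix = rng.uniform(size=100_000)          # stand-in conserved scalar
c_no = np.clip(mix - 0.3, 0.0, None)     # plume reactant (high-mix side)
c_o3 = np.clip(0.7 - mix, 0.0, None)     # ambient reactant (low-mix side)

mean_rate = np.mean(c_no * c_o3)         # true mean of the product
product_of_means = c_no.mean() * c_o3.mean()
cov = np.cov(c_no, c_o3)[0, 1]           # negative: reactants segregated
# mean_rate == product_of_means + cov (up to the sampling convention)
```

The overestimate from dropping the covariance term is the same qualitative effect the experiments quantify for the Toor closure.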

Advances in symptom management strategies through a better understanding of cancer symptom clusters depend on the identification of symptom clusters that are valid and reliable. The purpose of this exploratory research was to investigate alternative analytical approaches to identifying symptom clusters for patients with cancer, using readily accessible statistical methods, and to justify which methods of identification may be appropriate for this context. Three studies were undertaken: (1) a systematic review of the literature, to identify analytical methods commonly used for symptom cluster identification for cancer patients; (2) a secondary data analysis to identify symptom clusters and compare alternative methods, as a guide to best-practice approaches in cross-sectional studies; and (3) a secondary data analysis to investigate the stability of symptom clusters over time. The systematic literature review identified 13 cross-sectional studies, published in the 10 years prior to March 2007, that implemented multivariate methods to identify cancer-related symptom clusters. The methods commonly used to group symptoms were exploratory factor analysis, hierarchical cluster analysis and principal components analysis. Common factor analysis methods were recommended as the best-practice cross-sectional methods for cancer symptom cluster identification. A comparison of alternative common factor analysis methods was conducted in a secondary analysis of a sample of 219 ambulatory cancer patients with mixed diagnoses, assessed within one month of commencing chemotherapy treatment. Principal axis factoring, unweighted least squares and image factor analysis identified five consistent symptom clusters, based on patient self-reported distress ratings of 42 physical symptoms. Extraction of an additional cluster was necessary when using alpha factor analysis to determine clinically relevant symptom clusters.
The recommended approaches for symptom cluster identification using data that are not multivariate normal were: principal axis factoring or unweighted least squares for factor extraction, followed by oblique rotation; and use of the scree plot and the Minimum Average Partial procedure to determine the number of factors. In contrast to other studies, which typically interpret pattern coefficients alone, in these studies symptom clusters were determined on the basis of structure coefficients. This approach was adopted for the stability of the results, as structure coefficients are correlations between factors and symptoms unaffected by the correlations between factors. Symptoms could be associated with multiple clusters as a foundation for investigating potential interventions. The stability of these five symptom clusters was investigated in separate common factor analyses 6 and 12 months after chemotherapy commenced. Five qualitatively consistent symptom clusters were identified over time (Musculoskeletal-discomforts/lethargy, Oral-discomforts, Gastrointestinal-discomforts, Vasomotor-symptoms, Gastrointestinal-toxicities), but at 12 months two additional clusters were determined (Lethargy and Gastrointestinal/digestive symptoms). Future studies should include physical, psychological, and cognitive symptoms. Further investigation of the identified symptom clusters is required for validation, to examine causality, and potentially to suggest interventions for symptom management. Future studies should use longitudinal analyses to investigate change in symptom clusters, the influence of patient-related factors, and the impact on outcomes (e.g., daily functioning) over time.
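The pattern/structure distinction rests on a standard factor-analytic identity: under an oblique rotation, the structure matrix S of factor-symptom correlations equals the pattern matrix P post-multiplied by the factor correlation matrix Phi, S = P·Phi. A toy example with hypothetical loadings (not the thesis's data):

```python
import numpy as np

# Hypothetical pattern loadings for four symptoms on two oblique factors
P = np.array([[0.70, 0.05],
              [0.65, 0.10],
              [0.05, 0.72],
              [0.00, 0.68]])
Phi = np.array([[1.0, 0.4],     # hypothetical factor correlation
                [0.4, 1.0]])

S = P @ Phi   # structure coefficients: symptom-factor correlations
# Each structure coefficient absorbs the factor correlation, which is
# why clusters read from S remain stable when factors are correlated.
```

Because S folds the factor correlations in, a symptom can show non-trivial structure coefficients on more than one factor, matching the text's point that symptoms may belong to multiple clusters.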

Dangerous Places is a novel about the gap between mythological (or 'dreamed') constructions of reality and actual life. The story centres on Ven, a married woman with two young children. Her love for her children is fiercely protective and encompassing, but she feels alienated from her husband and, to a certain extent, her society; so when her first love, Yanni, re-enters her life, she is strongly tempted to resume her affair with him. She is, however, seduced more by the memories she has 'mythologized' about him than by his physical reality; in the course of the novel she is forced to come to terms with her own delusions. The subplot of the novel involves other characters who are caught between illusion and reality as well, and who deal with 'truth' in differing ways. The themes of the book are explored using a number of structures which underlie and support the surface story. The Greek myths of Adonis/Aphrodite and Hades/Persephone are framing agents for the plot, and the setting in contemporary Brisbane and North Stradbroke Island is symbolic.

This thesis applies Monte Carlo techniques to the study of X-ray absorptiometric methods of bone mineral measurement. These studies seek to obtain information that can be used in efforts to improve the accuracy of the bone mineral measurements. A Monte Carlo computer code for X-ray photon transport at diagnostic energies has been developed from first principles. This development was undertaken as there was no readily available code which included electron binding energy corrections for incoherent scattering and one of the objectives of the project was to study the effects of inclusion of these corrections in Monte Carlo models. The code includes the main Monte Carlo program plus utilities for dealing with input data. A number of geometrical subroutines which can be used to construct complex geometries have also been written. The accuracy of the Monte Carlo code has been evaluated against the predictions of theory and the results of experiments. The results show a high correlation with theoretical predictions. In comparisons of model results with those of direct experimental measurements, agreement to within the model and experimental variances is obtained. The code is an accurate and valid modelling tool. A study of the significance of inclusion of electron binding energy corrections for incoherent scatter in the Monte Carlo code has been made. The results show this significance to be very dependent upon the type of application. The most significant effect is a reduction of low angle scatter flux for high atomic number scatterers. To effectively apply the Monte Carlo code to the study of bone mineral density measurement by photon absorptiometry the results must be considered in the context of a theoretical framework for the extraction of energy dependent information from planar X-ray beams. Such a theoretical framework is developed and the two-dimensional nature of tissue decomposition based on attenuation measurements alone is explained. 
This theoretical framework forms the basis for analytical models of bone mineral measurement by dual-energy X-ray photon absorptiometry techniques. Monte Carlo models of dual-energy X-ray absorptiometry (DEXA) have been established. These models have been used to study the contribution of scattered radiation to the measurements. It has been demonstrated that the measurement geometry has a significant effect upon the scatter contribution to the detected signal. For the geometry of the models studied in this work, the scatter has no significant effect upon the results of the measurements. The model has also been used to study a proposed technique which involves dual-energy X-ray transmission measurements plus a linear measurement of the distance along the ray path. This is designated the DPA(+) technique. The addition of the linear measurement enables the tissue decomposition to be extended to three components; bone mineral, fat and lean soft tissue are the components considered here. The results of the model demonstrate that the measurement of bone mineral using this technique is stable over a wide range of soft tissue compositions, and hence it has the potential to overcome a major problem of the two-component DEXA technique. However, the results also show that the accuracy of the DPA(+) technique is highly dependent upon the composition of the non-mineral components of bone, and that it has poorer precision (approximately twice the coefficient of variation) than standard DEXA measurements. These factors may limit the usefulness of the technique. These studies illustrate the value of Monte Carlo computer modelling of quantitative X-ray measurement techniques. The Monte Carlo models of bone densitometry measurement have: 1. demonstrated the significant effects of the measurement geometry upon the contribution of scattered radiation to the measurements; 2.
demonstrated that the statistical precision of the proposed DPA(+) three-tissue-component technique is poorer than that of the standard DEXA two-tissue-component technique; 3. demonstrated that the proposed DPA(+) technique has difficulty providing accurate simultaneous measurement of body composition in terms of a three-component model of fat, lean soft tissue and bone mineral; 4. provided a knowledge base for input to decisions about development (or otherwise) of a physical prototype DPA(+) imaging system. The Monte Carlo computer code, data, utilities and associated models represent a set of significant, accurate and valid modelling tools for quantitative studies of physical problems in the fields of diagnostic radiology and radiography.
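The two-component decomposition underlying the DEXA models reduces, per ray, to a 2x2 linear solve: the log-attenuation measured at each of the two beam energies is linear in the bone-mineral and soft-tissue area densities. A minimal sketch with placeholder attenuation coefficients (illustrative values, not tabulated data):

```python
import numpy as np

# Mass attenuation coefficients, rows = [low energy, high energy],
# columns = [bone mineral, soft tissue].  Values are illustrative
# placeholders, not tabulated coefficients.
mu = np.array([[0.573, 0.264],
               [0.183, 0.173]])    # cm^2/g

def dexa_decompose(logI_low, logI_high):
    """Return (bone, soft) area densities in g/cm^2 from the measured
    log-attenuations -ln(I/I0) at the two energies."""
    return np.linalg.solve(mu, np.array([logI_low, logI_high]))

# Forward-project a known composition, then recover it
t = np.array([1.2, 18.0])          # bone, soft area densities (g/cm^2)
logs = mu @ t
bone, soft = dexa_decompose(*logs)
```

The DPA(+) extension described above adds the ray-path length as a third measurement, turning this into a 3x3 system for bone mineral, fat and lean soft tissue in the same fashion.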

This work is focussed on developing a commissioning procedure so that a Monte Carlo model, which uses BEAMnrc’s standard VARMLC component module, can be adapted to match a specific BrainLAB m3 micro-multileaf collimator (μMLC). A set of measurements is recommended for use as a reference against which the model can be tested and optimised. These include radiochromic film measurements of dose from small and offset fields, as well as measurements of μMLC transmission and interleaf leakage. Simulations and measurements to obtain μMLC scatter factors are shown to be insensitive to the relevant model parameters and are therefore not recommended, unless the output of the linear accelerator model is in doubt. Ultimately, this note provides detailed instructions for those intending to optimise a VARMLC model to match the dose delivered by their local BrainLAB m3 μMLC device.
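As a sketch of how the recommended transmission and interleaf-leakage measurements might be reduced, the following treats a film dose profile sampled across closed leaves, with valleys under the leaf bodies and spikes between leaves. The profile values and the min/max reduction are hypothetical, not the note's procedure:

```python
import numpy as np

def transmission_and_leakage(profile, open_dose):
    """Estimate mean leaf transmission (valley level) and peak interleaf
    leakage (spike level), both relative to the open-field dose, from a
    film profile across closed leaves.  Simplified reduction: assumes
    the profile alternates leaf body and interleaf gap."""
    rel = np.asarray(profile, dtype=float) / open_dose
    return rel.min(), rel.max()

# Hypothetical profile: ~1.5% under leaves, ~2.5% interleaf spikes (cGy)
profile = np.array([1.5, 1.6, 2.5, 1.5, 1.4, 2.6, 1.5])
transmission, leakage = transmission_and_leakage(profile, open_dose=100.0)
```

Values like these would then be compared against the equivalent quantities extracted from the VARMLC simulation while its leaf-gap and leaf-density parameters are tuned.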

This paper reads season 1 of the critically acclaimed Canadian television series “Slings & Arrows” (2003). This six-episode series is set in a fictionalised version of the Stratford Festival, and tells the story of a plagued production of Shakespeare’s Hamlet. It follows the play’s rehearsal after the death of the festival’s artistic director; Geoffrey Tennant (himself a plagued Hamlet) takes over the role of director, and must face his past in order to produce a Hamlet that will save the festival, redeem his reputation, and repair his interpersonal relationships. Drawing on popular and theatrical understandings of Shakespeare’s play, the series negotiates tropes of metatheatre, filiality, cultural production and consumption, in order to demonstrate the ongoing relevance and legitimacy of “Shakespeare” in the twenty-first century. The “Slings & Arrows” narrative revolves around the doubled plot of Hamlet and the experiences of the company mounting Hamlet. In quite obvious ways, the show thus thematises ways in which Shakespeare can be used to read one’s own life and world. In the broader sense, however, the show also offers theatre/performance as a catalyst for affect. In doing so, the show functions as both a relatively straight adaptation of Hamlet and a metatheatrical/metafictional commentary on the functions of Hamlet within contemporary culture. In Shakespeare’s play, the production of “The Mouse-Trap” proves, both to Hamlet and the audience, the legitimacy of the ghost’s claims. Similarly, in “Slings & Arrows”, the successful performance of Hamlet legitimises Geoffrey’s position as artistic director of the festival, and affirms for the viewer the value of Shakespearean production in contemporary culture. In each text, theatre/performance enables and legitimises a son carrying out a dead father’s wishes in order to restore or reproduce socio-cultural order.
The metatheatrics of these gestures engage the reader/viewer in a self-reflexive process whereby the ‘value’ of theatre is thematised and performed, and the consumer is positioned as the arbiter and agent of that value: complicit in its production even as they are the site of its consumption.